Sample records for dice design environment

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Desai, M. I.; McComas, D. J.; Allegrini, F.

    We have developed a novel concept for a Compact Dual Ion Composition Experiment (CoDICE) that simultaneously provides high quality plasma and energetic ion composition measurements over 6 decades in ion energy in a wide variety of space plasma environments. CoDICE measures the two critical ion populations in space plasmas: (1) mass and ionic charge state composition and 3D velocity and angular distributions of ∼10 eV/q–40 keV/q plasma ions—CoDICE-Lo and (2) mass composition, energy spectra, and angular distributions of ∼30 keV–10 MeV energetic ions—CoDICE-Hi. CoDICE uses a common, integrated Time-of-Flight (TOF) versus residual energy (E) subsystem for measuring the two distinct ion populations. This paper describes the CoDICE design concept, and presents results of the laboratory tests of the TOF portion of the TOF vs. E subsystem, focusing specifically on (1) investigation of spill-over and contamination rates on the start and stop microchannel plate (MCP) anodes vs. secondary electron steering and focusing voltages, scanned around their corresponding model-optimized values, (2) TOF measurements and resolution and angular resolution, and (3) cross-contamination of the start and stop MCPs’ singles rates from CoDICE-Lo and -Hi, and (4) energy resolution of avalanche photodiodes near the lower end of the CoDICE-Lo energy range. We also discuss physical effects that could impact the performance of the TOF vs. E subsystem in a flight instrument. Finally, we discuss advantages of the CoDICE design concept by comparing with capabilities and resources of existing flight instruments.

  2. An integrated time-of-flight versus residual energy subsystem for a compact dual ion composition experiment for space plasmas

    NASA Astrophysics Data System (ADS)

    Desai, M. I.; Ogasawara, K.; Ebert, R. W.; McComas, D. J.; Allegrini, F.; Weidner, S. E.; Alexander, N.; Livi, S. A.

    2015-05-01

    We have developed a novel concept for a Compact Dual Ion Composition Experiment (CoDICE) that simultaneously provides high quality plasma and energetic ion composition measurements over 6 decades in ion energy in a wide variety of space plasma environments. CoDICE measures the two critical ion populations in space plasmas: (1) mass and ionic charge state composition and 3D velocity and angular distributions of ˜10 eV/q-40 keV/q plasma ions—CoDICE-Lo and (2) mass composition, energy spectra, and angular distributions of ˜30 keV-10 MeV energetic ions—CoDICE-Hi. CoDICE uses a common, integrated Time-of-Flight (TOF) versus residual energy (E) subsystem for measuring the two distinct ion populations. This paper describes the CoDICE design concept, and presents results of the laboratory tests of the TOF portion of the TOF vs. E subsystem, focusing specifically on (1) investigation of spill-over and contamination rates on the start and stop microchannel plate (MCP) anodes vs. secondary electron steering and focusing voltages, scanned around their corresponding model-optimized values, (2) TOF measurements and resolution and angular resolution, and (3) cross-contamination of the start and stop MCPs' singles rates from CoDICE-Lo and -Hi, and (4) energy resolution of avalanche photodiodes near the lower end of the CoDICE-Lo energy range. We also discuss physical effects that could impact the performance of the TOF vs. E subsystem in a flight instrument. Finally, we discuss advantages of the CoDICE design concept by comparing with capabilities and resources of existing flight instruments.

  3. An integrated time-of-flight versus residual energy subsystem for a compact dual ion composition experiment for space plasmas.

    PubMed

    Desai, M I; Ogasawara, K; Ebert, R W; McComas, D J; Allegrini, F; Weidner, S E; Alexander, N; Livi, S A

    2015-05-01

    We have developed a novel concept for a Compact Dual Ion Composition Experiment (CoDICE) that simultaneously provides high quality plasma and energetic ion composition measurements over 6 decades in ion energy in a wide variety of space plasma environments. CoDICE measures the two critical ion populations in space plasmas: (1) mass and ionic charge state composition and 3D velocity and angular distributions of ∼10 eV/q-40 keV/q plasma ions—CoDICE-Lo and (2) mass composition, energy spectra, and angular distributions of ∼30 keV-10 MeV energetic ions—CoDICE-Hi. CoDICE uses a common, integrated Time-of-Flight (TOF) versus residual energy (E) subsystem for measuring the two distinct ion populations. This paper describes the CoDICE design concept, and presents results of the laboratory tests of the TOF portion of the TOF vs. E subsystem, focusing specifically on (1) investigation of spill-over and contamination rates on the start and stop microchannel plate (MCP) anodes vs. secondary electron steering and focusing voltages, scanned around their corresponding model-optimized values, (2) TOF measurements and resolution and angular resolution, and (3) cross-contamination of the start and stop MCPs' singles rates from CoDICE-Lo and -Hi, and (4) energy resolution of avalanche photodiodes near the lower end of the CoDICE-Lo energy range. We also discuss physical effects that could impact the performance of the TOF vs. E subsystem in a flight instrument. Finally, we discuss advantages of the CoDICE design concept by comparing with capabilities and resources of existing flight instruments.

  4. Development of n+-in-p large-area silicon microstrip sensors for very high radiation environments - ATLAS12 design and initial results

    NASA Astrophysics Data System (ADS)

    Unno, Y.; Edwards, S. O.; Pyatt, S.; Thomas, J. P.; Wilson, J. A.; Kierstead, J.; Lynn, D.; Carter, J. R.; Hommels, L. B. A.; Robinson, D.; Bloch, I.; Gregor, I. M.; Tackmann, K.; Betancourt, C.; Jakobs, K.; Kuehn, S.; Mori, R.; Parzefall, U.; Wiik-Fucks, L.; Clark, A.; Ferrere, D.; Gonzalez Sevilla, S.; Ashby, J.; Blue, A.; Bates, R.; Buttar, C.; Doherty, F.; Eklund, L.; McMullen, T.; McEwan, F.; O`Shea, V.; Kamada, S.; Yamamura, K.; Ikegami, Y.; Nakamura, K.; Takubo, Y.; Nishimura, R.; Takashima, R.; Chilingarov, A.; Fox, H.; Affolder, A. A.; Allport, P. P.; Casse, G.; Dervan, P.; Forshaw, D.; Greenall, A.; Wonsak, S.; Wormald, M.; Cindro, V.; Kramberger, G.; Mandic, I.; Mikuz, M.; Gorelov, I.; Hoeferkamp, M.; Palni, P.; Seidel, S.; Taylor, A.; Toms, K.; Wang, R.; Hessey, N. P.; Valencic, N.; Arai, Y.; Hanagaki, K.; Dolezal, Z.; Kodys, P.; Bohm, J.; Mikestikova, M.; Bevan, A.; Beck, G.; Ely, S.; Fadeyev, V.; Galloway, Z.; Grillo, A. A.; Martinez-McKinney, F.; Ngo, J.; Parker, C.; Sadrozinski, H. F.-W.; Schumacher, D.; Seiden, A.; French, R.; Hodgson, P.; Marin-Reyes, H.; Parker, K.; Paganis, S.; Jinnouchi, O.; Motohashi, K.; Todome, K.; Yamaguchi, D.; Hara, K.; Hagihara, M.; Garcia, C.; Jimenez, J.; Lacasta, C.; Marti i Garcia, S.; Soldevila, U.

    2014-11-01

    We have been developing a novel radiation-tolerant n+-in-p silicon microstrip sensor for very high radiation environments, aiming for application in the high luminosity large hadron collider. The sensors are fabricated in 6 in., p-type, float-zone wafers, where large-area strip sensor designs are laid out together with a number of miniature sensors. Radiation tolerance has been studied with ATLAS07 sensors and with independent structures. The ATLAS07 design was developed into new ATLAS12 designs. The ATLAS12A large-area sensor is made towards an axial strip sensor and the ATLAS12M towards a stereo strip sensor. New features to the ATLAS12 sensors are two dicing lines: standard edge space of 910 μm and slim edge space of 450 μm, a gated punch-through protection structure, and connection of orphan strips in a triangular corner of stereo strips. We report the design of the ATLAS12 layouts and initial measurements of the leakage current after dicing and the resistivity of the wafers.

  5. DICE/ColDICE: 6D collisionless phase space hydrodynamics using a Lagrangian tessellation

    NASA Astrophysics Data System (ADS)

    Sousbie, Thierry

    2018-01-01

    DICE is a C++ template library designed to solve collisionless fluid dynamics in 6D phase space using massively parallel supercomputers via a hybrid OpenMP/MPI parallelization. ColDICE, based on DICE, implements a cosmological and physical Vlasov-Poisson solver for cold systems such as cold dark matter (CDM) dynamics.

  6. DICE: An Object Oriented Programming Environment for Cooperative Engineering Design

    DTIC Science & Technology

    1989-03-20

    environment called PARMENIDES/FRULEKIT; PARMENIDES/FRULEKIT supports programming in frames and rules and was developed in LISP at Carnegie-Mellon...the domain of building design and construction. The Blackboard in DICEY-BUILDER is represented as frames in PARMENIDES, while the KMs are implemented...

  7. DARPA Initiative in Concurrent Engineering (DICE). Phase 2

    DTIC Science & Technology

    1990-07-31

    XS spreadsheet tool; Q-Calc spreadsheet tool; TAE+ outer wrapper for XS; Framemaker-based formal EDN (Electronic Design Notebook); Data...shared global object space and object persistence. Technical Results - Module Development - XS Integration Environment: A prototype of the wrapper concepts...for a spreadsheet integration environment, using an X-Windows based extensible Lotus 1-2-3 emulation called XS, and was (initially) targeted for

  8. Innovation for integrated command environments

    NASA Astrophysics Data System (ADS)

    Perry, Amie A.; McKneely, Jennifer A.

    2000-11-01

    Command environments have rarely been able to easily accommodate rapid changes in technology and mission. Yet, command personnel, by their selection criteria, experience, and very nature, tend to be extremely adaptive and flexible, and able to learn new missions and address new challenges fairly easily. Instead, the hardware and software components of the systems do not provide the needed flexibility and scalability for command personnel. How do we solve this problem? In order to even dream of keeping pace with a rapidly changing world, we must begin to think differently about the command environment and its systems. What is the correct definition of the integrated command environment system? What types of tasks must be performed in this environment, and how might they change in the next five to twenty-five years? How should the command environment be developed, maintained, and evolved to provide needed flexibility and scalability? This paper presents the issues and concepts to be considered as new Integrated Command/Control Environments (ICEs) are designed following a human-centered process. A futuristic model, the Dream Integrated Command Environment (DICE), will be described, which demonstrates specific ICE innovations. The major paradigm shift required to be able to think differently about this problem is to center the DICE around the command personnel from its inception. Conference participants may not agree with every concept or idea presented, but will hopefully come away with a clear understanding that to radically improve future systems, designers must focus on the end users.

  9. Permanence of diced cartilage, bone dust and diced cartilage/bone dust mixture in experimental design in twelve weeks.

    PubMed

    Islamoglu, Kemal; Dikici, Mustafa Bahadir; Ozgentas, Halil Ege

    2006-09-01

    Bone dust and diced cartilage are used for contour restoration because of their minimal donor site morbidity. The purpose of this study is to investigate the permanence of bone dust, diced cartilage and a bone dust/diced cartilage mixture in rabbits over 12 weeks. New Zealand white rabbits were used for this study. There were three groups in the study: Group I: 1 mL bone dust. Group II: 1 mL diced cartilage. Group III: 0.5 mL bone dust + 0.5 mL diced cartilage mixture. They were placed into the subcutaneous tissue of rabbits and removed 12 weeks later. The mean volumes of the groups were 0.23 +/- 0.08 mL in group I, 0.60 +/- 0.12 mL in group II and 0.36 +/- 0.10 mL in group III. The differences between groups were found to be statistically significant. In conclusion, diced cartilage was found to be more reliable than bone dust in terms of preserving its volume over a long period in this study.

  10. Compact Dual Ion Composition Experiment for space plasmas—CoDICE

    NASA Astrophysics Data System (ADS)

    Desai, M. I.; Ogasawara, K.; Ebert, R. W.; Allegrini, F.; McComas, D. J.; Livi, S.; Weidner, S. E.

    2016-07-01

    The Compact Dual Ion Composition Experiment—CoDICE—simultaneously provides high-quality plasma and energetic ion composition measurements over six decades in energy in a wide variety of space plasma environments. CoDICE measures two critical ion populations in space plasmas: (1) elemental and charge state composition, and 3-D velocity distributions of ˜10 eV/q-40 keV/q plasma ions; and (2) elemental composition, energy spectra, and angular distributions of ˜30 keV to >10 MeV energetic ions. CoDICE uses a novel, integrated, common time-of-flight subsystem that provides several advantages over the commonly used separate plasma and energetic ion sensors currently flying on several space missions. These advantages include reduced mass and volume compared to two separate instruments, reduced shielding in high-radiation environments, and simplified spacecraft interface and accommodation requirements. This paper describes the operation principles and electro-optic simulation results, and applies the CoDICE concept to measuring plasma and energetic ion populations in Jupiter's magnetosphere.

  11. Injection of Compressed Diced Cartilage in the Correction of Secondary and Primary Rhinoplasty: A New Technique with 12 Years' Experience.

    PubMed

    Erol, O Onur

    2017-11-01

    There are instances where small or large pockets are filled with diced cartilage in the nose, without use of wrapping materials. For this purpose, 1-cc commercial syringes were used. The obtained results were partial and incomplete. For better and improved results, the author designed new syringes, with two different sizes, which compress the diced cartilage for injection. The author presents his experience accrued over the past 12 years with 2366 primary, 749 secondary, 67 cleft lip and nose, and a total of 3182 rhinoplasties, using his new syringe design, which compresses diced cartilage and injects the diced cartilages as a conglutinate mass, simulating carved costal cartilage, but a malleable one. In 3125 patients, the take of cartilage graft was complete (98.2 percent) and a smooth surface was obtained, giving them a natural appearance. In 21 patients (0.65 percent), there was partial resorption of cartilage. Correction was performed with touch-up surgery by reinjection of a small amount of diced cartilage. In 36 patients (1.13 percent), there was overcorrection that, 1 year later, was treated by simple rasping. Compared with diced cartilage wrapped with Surgicel or fascia, the amount of injected cartilage graft is predictable because it consists purely of cartilage. The injected diced cartilage, because it is compressed and becomes a conglutinated mass, resembles a wood chip and simulates carved cartilage. It is superior to carved cartilage in that it is moldable, time saving, and gives a good result with no late show or warping. The injection takes only a few minutes.

  12. High Aspect-Ratio Neural Probes using Conventional Blade Dicing

    NASA Astrophysics Data System (ADS)

    Goncalves, S. B.; Ribeiro, J. F.; Silva, A. F.; Correia, J. H.

    2016-10-01

    Exploring deep neural circuits has triggered the development of long penetrating neural probes. Moreover, driven by brain displacement, long neural probes also require a high aspect-ratio shaft design. In this paper, a simple and reproducible method of manufacturing long-shaft neural probes using blade dicing technology is presented. Results show shafts up to 8 mm long and 200 µm wide, features competitive with the current state of the art, with the shaft outline accomplished by a single blade dicing program. Therefore, conventional blade dicing presents itself as a viable option for manufacturing long neural probes.

  13. Validation of a DICE Simulation Against a Discrete Event Simulation Implemented Entirely in Code.

    PubMed

    Möller, Jörgen; Davis, Sarah; Stevenson, Matt; Caro, J Jaime

    2017-10-01

    Modeling is an essential tool for health technology assessment, and various techniques for conceptualizing and implementing such models have been described. Recently, a new method has been proposed (the discretely integrated condition event or DICE simulation) that enables frequently employed approaches to be specified using a common, simple structure that can be entirely contained and executed within widely available spreadsheet software. To assess if a DICE simulation provides equivalent results to an existing discrete event simulation, a comparison was undertaken. A model of osteoporosis and its management programmed entirely in Visual Basic for Applications and made public by the National Institute for Health and Care Excellence (NICE) Decision Support Unit was downloaded and used to guide construction of its DICE version in Microsoft Excel®. The DICE model was then run using the same inputs and settings, and the results were compared. The DICE version produced results that are nearly identical to the original ones, with differences that would not affect the decision direction of the incremental cost-effectiveness ratios (<1% discrepancy), despite the stochastic nature of the models. The main limitation of the simple DICE version is its slow execution speed. DICE simulation did not alter the results and, thus, should provide a valid way to design and implement decision-analytic models without requiring specialized software or custom programming. Additional efforts need to be made to speed up execution.

  14. Adequate Hand Washing and Glove Use Are Necessary To Reduce Cross-Contamination from Hands with High Bacterial Loads.

    PubMed

    Robinson, Andrew L; Lee, Hyun Jung; Kwon, Junehee; Todd, Ewen; Rodriguez, Fernando Perez; Ryu, Dojin

    2016-02-01

    Hand washing and glove use are the main methods for reducing bacterial cross-contamination from hands to ready-to-eat food in a food service setting. However, bacterial transfer from hands to gloves is poorly understood, as is the effect of different durations of soap rubbing on bacterial reduction. To assess bacterial transfer from hands to gloves and to compare bacterial transfer rates to food after different soap washing times and glove use, participants' hands were artificially contaminated with Enterobacter aerogenes B199A at ∼9 log CFU. Different soap rubbing times (0, 3, and 20 s), glove use, and tomato dicing activities followed. The bacterial counts in diced tomatoes and on participants' hands and gloves were then analyzed. Different soap rubbing times did not significantly change the amount of bacteria recovered from participants' hands. Dicing tomatoes with bare hands after 20 s of soap rubbing transferred significantly less bacteria (P < 0.01) to tomatoes than did dicing with bare hands after 0 s of soap rubbing. Wearing gloves while dicing greatly reduced the incidence of contaminated tomato samples compared with dicing with bare hands. Increasing soap washing time decreased the incidence of bacteria recovered from outside glove surfaces (P < 0.05). These results highlight that both glove use and adequate hand washing are necessary to reduce bacterial cross-contamination in food service environments.

  15. (Sn)DICE: A Calibration System Designed for Wide Field Imagers

    NASA Astrophysics Data System (ADS)

    Regnault, N.; Barrelet, E.; Guyonnet, A.; Juramy, C.; Rocci, P.-F.; Le Guillou, L.; Schahmanèche, K.; Villa, F.

    2016-05-01

    Dark Energy studies with type Ia supernovae set very tight constraints on the photometric calibration of the imagers used to detect the supernovae and follow up their flux variations. Among the key challenges is the measurement of the shape and normalization of the instrumental throughput. The DICE system was developed by members of the Supernova Legacy Survey (SNLS), building upon the lessons learnt working with the MegaCam imager. It consists of a very stable light source, placed in the telescope enclosure, generating compact, conical beams that yield an almost flat illumination of the imager focal plane. The calibration light is generated by narrow-spectrum LEDs selected to cover the entire wavelength range of the imager. It is monitored in real time using control photodiodes. A first DICE demonstrator, SnDICE, has been installed at CFHT. A second-generation instrument (SkyDICE) has been installed in the enclosure of the SkyMapper telescope. We present the main goals of the project. We discuss the main difficulties encountered when trying to calibrate a wide field imager, such as MegaCam (or SkyMapper), using such a calibrated light source.

  16. Constellation of CubeSats for Realtime Ionospheric E-field Measurements for Global Space Weather

    NASA Astrophysics Data System (ADS)

    Crowley, G.; Swenson, C.; Pilinski, M.; Fish, C. S.; Neilsen, T. L.; Stromberg, E. M.; Azeem, I.; Barjatya, A.

    2014-12-01

    Inexpensive and robust space-weather monitoring instruments are needed to fill upcoming gaps in the Nation's ability to meet requirements for space weather specification and forecasting. Foremost among the needed data are electric fields, since they drive global ionospheric and thermospheric behavior, and because there are relatively few ground-based measurements. We envisage a constellation of CubeSats to provide global coverage of the electric field and its variability. The DICE (Dynamic Ionosphere CubeSat Experiment) mission was a step towards this goal, with two identical 1.5U CubeSats, each carrying three space weather instruments: (1) double probe instruments to measure AC and DC electric fields; (2) Langmuir probes to measure ionospheric electron density; and (3) a magnetometer to measure field-aligned currents. DICE launched in October 2011. DICE was the first CubeSat mission to observe a Storm Enhanced Density event, fulfilling a major goal of the mission. Due to attitude control anomalies encountered in orbit, the DICE electric field booms have not yet been deployed. Important lessons have been learned for the implementation of a spin-stabilized CubeSat, and the design and performance of the Attitude Determination & Control System (ADCS). These lessons are now being applied to the DIME SensorSat, a risk-reduction mission that is capable of deploying flexible electric field booms up to a distance of 10-m tip-to-tip from a 1.5U CubeSat. DIME will measure AC and DC electric fields, and will exceed several IORD-2 threshold requirements. Ion densities and magnetic fields will also be measured to characterize the performance of the sensor in different plasma environments. We show the utility of a constellation of electric field measurements, describe the DIME SensorSat, and demonstrate how the measurement will meet or exceed IORD requirements. The reduced cost of these sensors will enable constellations that can, for the first time, adequately resolve the spatial and temporal variability in ionospheric electrodynamics. DICE and DIME are collaborations between ASTRA and Space Dynamics Lab/Utah State University.

  17. Discretely Integrated Condition Event (DICE) Simulation for Pharmacoeconomics.

    PubMed

    Caro, J Jaime

    2016-07-01

    Several decision-analytic modeling techniques are in use for pharmacoeconomic analyses. Discretely integrated condition event (DICE) simulation is proposed as a unifying approach that has been deliberately designed to meet the modeling requirements in a straightforward transparent way, without forcing assumptions (e.g., only one transition per cycle) or unnecessary complexity. At the core of DICE are conditions that represent aspects that persist over time. They have levels that can change and many may coexist. Events reflect instantaneous occurrences that may modify some conditions or the timing of other events. The conditions are discretely integrated with events by updating their levels at those times. Profiles of determinant values allow for differences among patients in the predictors of the disease course. Any number of valuations (e.g., utility, cost, willingness-to-pay) of conditions and events can be applied concurrently in a single run. A DICE model is conveniently specified in a series of tables that follow a consistent format and the simulation can be implemented fully in MS Excel, facilitating review and validation. DICE incorporates both state-transition (Markov) models and non-resource-constrained discrete event simulation in a single formulation; it can be executed as a cohort or a microsimulation; and deterministically or stochastically.
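
    A minimal sketch of the idea described above may help: conditions persist with levels, events occur at discrete times and update those levels, and valuations accrue over the intervals between events. This is an illustration only, not the published DICE tooling; the condition names, event times, costs, and utilities below are all hypothetical.

    ```python
    import heapq

    # Minimal sketch of a DICE-style ("discretely integrated condition event")
    # simulation: conditions persist with levels, events occur at discrete times
    # and update those levels, and valuations (cost, QALYs) accrue between events.
    # Illustration only; all names and numbers are hypothetical.

    def run_dice(horizon=10.0):
        conditions = {"alive": True, "progressed": False}
        totals = {"cost": 0.0, "qaly": 0.0}
        events = [(2.0, "progression"), (8.0, "death")]   # hypothetical event times
        heapq.heapify(events)

        t = 0.0
        while events and conditions["alive"]:
            t_next, name = heapq.heappop(events)
            t_next = min(t_next, horizon)
            # integrate valuations over the interval since the previous event
            utility = 0.7 if conditions["progressed"] else 0.9
            totals["qaly"] += (t_next - t) * utility
            t = t_next
            if t >= horizon:
                break
            if name == "progression":        # events discretely update conditions
                conditions["progressed"] = True
                totals["cost"] += 5000.0
            elif name == "death":
                conditions["alive"] = False
        return totals

    print(run_dice())   # {'cost': 5000.0, 'qaly': 6.0} for these hypothetical inputs
    ```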

  18. The design and implementation of the Dynamic Ionosphere Cubesat Experiment (DICE) science instruments

    NASA Astrophysics Data System (ADS)

    Burr, Steven Reed

    The Dynamic Ionosphere Cubesat Experiment (DICE) is a satellite project funded by the National Science Foundation (NSF) to study the ionosphere, more particularly Storm Enhanced Densities (SED), with a payload consisting of plasma diagnostic instrumentation. The three instruments onboard DICE are an Electric Field Probe (EFP), an Ion Langmuir Probe (ILP), and a Three Axis Magnetometer (TAM). The EFP measures electric fields of +/-8 V and consists of three channels: a DC to 40 Hz channel, a Floating Potential Probe (FPP), and a spectrographic channel with four bands from 16 Hz to 512 Hz. The ILP measures plasma densities from 1x10^4 cm^-3 to 2x10^7 cm^-3. The TAM measures magnetic field strength over a range of +/-0.5 Gauss with a sensitivity of 2 nT. Achieving the desired mission requirements demanded careful selection of instrument requirements and planning of the instrumentation design. The analog design of each instrument is described in addition to the digital framework required to sample the science data at a 70 Hz rate and prepare the data for the Command and Data Handling (C&DH) system. Calibration results are also presented and show fulfillment of the mission and instrumentation requirements.

  19. Gain-of-function mutation of AtDICE1, encoding a putative endoplasmic reticulum-localized membrane protein, causes defects in anisotropic cell elongation by disturbing cell wall integrity in Arabidopsis.

    PubMed

    Le, Phi-Yen; Jeon, Hyung-Woo; Kim, Min-Ha; Park, Eung-Jun; Lee, Hyoshin; Hwang, Indeok; Han, Kyung-Hwan; Ko, Jae-Heung

    2018-04-05

    Anisotropic cell elongation depends on cell wall relaxation and cellulose microfibril arrangement. The aim of this study was to characterize the molecular function of AtDICE1 encoding a novel transmembrane protein involved in anisotropic cell elongation in Arabidopsis. Phenotypic characterizations of transgenic Arabidopsis plants mis-regulating AtDICE1 expression with different pharmacological treatments were made, and biochemical, cell biological and transcriptome analyses were performed. Upregulation of AtDICE1 in Arabidopsis (35S::AtDICE1) resulted in severe dwarfism, probably caused by defects in anisotropic cell elongation. Epidermal cell swelling was evident in all tissues, and abnormal secondary wall thickenings were observed in pith cells of stems. These phenotypes were reproduced not only by inducible expression of AtDICE1 but also by overexpression of its poplar homologue in Arabidopsis. RNA interference suppression lines of AtDICE1 resulted in no observable phenotypic changes. Interestingly, wild-type plants treated with isoxaben, a cellulose biosynthesis inhibitor, phenocopied the 35S::AtDICE1 plants, suggesting that cellulose biosynthesis was compromised in the 35S::AtDICE1 plants. Indeed, disturbed cortical microtubule arrangements in 35S::AtDICE1/GFP-TuA6 plants were observed, and the cellulose content was significantly reduced in 35S::AtDICE1 plants. A promoter::GUS analysis showed that AtDICE1 is mainly expressed in vascular tissue, and transient expression of GFP:AtDICE1 in tobacco suggests that AtDICE1 is probably localized in the endoplasmic reticulum (ER). In addition, the external N-terminal conserved domain of AtDICE1 was found to be necessary for AtDICE1 function. Whole transcriptome analyses of 35S::AtDICE1 revealed that many genes involved in cell wall modification and stress/defence responses were mis-regulated. AtDICE1, a novel ER-localized transmembrane protein, may contribute to anisotropic cell elongation in the formation of vascular tissue by affecting cellulose biosynthesis.

  20. Investigation of Probability Distributions Using Dice Rolling Simulation

    ERIC Educational Resources Information Center

    Lukac, Stanislav; Engel, Radovan

    2010-01-01

    Dice are considered one of the oldest gambling devices and thus many mathematicians have been interested in various dice gambling games in the past. Dice have been used to teach probability, and dice rolls can be effectively simulated using technology. The National Council of Teachers of Mathematics (NCTM) recommends that teachers use simulations…
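
    The kind of simulation recommended above takes only a few lines. The sketch below rolls a fair die repeatedly and compares the observed relative frequencies with the theoretical probability of 1/6; the number of rolls is an arbitrary choice.

    ```python
    import random
    from collections import Counter

    # Roll a fair die many times and compare observed frequencies with 1/6.
    random.seed(0)
    rolls = [random.randint(1, 6) for _ in range(10_000)]
    freq = Counter(rolls)
    for face in range(1, 7):
        print(face, round(freq[face] / len(rolls), 3), "theoretical:", round(1 / 6, 3))
    ```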

  1. Viability and Biomechanics of Diced Cartilage Blended With Platelet-Rich Plasma and Wrapped With Poly (Lactic-Co-Glycolic) Acid Membrane.

    PubMed

    Liao, Jun-Lin; Chen, Jia; He, Bin; Chen, Yong; Xu, Jia-Qun; Xie, Hong-Ju; Hu, Feng; Wang, Ai-Jun; Luo, ChengQun; Li, Qing-Feng; Zhou, Jian-Da

    2017-09-01

    The objective of this study was to investigate the viability and biomechanics of diced cartilage blended with platelet-rich plasma (PRP) and wrapped with poly (lactic-co-glycolic) acid (PLGA) membrane in a rabbit model. A total of 10 New Zealand rabbits were used for the study. Cartilage grafts were harvested from 1 side ear. The grafts were divided into 3 groups for comparison: bare diced cartilage, diced cartilage wrapped with PLGA membrane, and diced cartilage blended with PRP and wrapped with PLGA membrane. Platelet-rich plasma was prepared using 8 mL of auricular blood. Three subcutaneous pockets were made in the backs of the rabbits, and the grafts were placed in these pockets. The subcutaneous implant tests were conducted for safety assessment of the PLGA membrane in vivo. All of the rabbits were sacrificed at the end of 3 months, and the specimens were collected. The sections were stained with hematoxylin and eosin, toluidin blue, and collagen II immunohistochemical. Simultaneously, biomechanical properties of grafts were assessed. This sample of PLGA membrane was conformed to the current standard of biological evaluation of medical devices. Moderate resorption was seen at the end of 3 months in the gross assessment in diced cartilage wrapped with PLGA membrane, while diced cartilage blended with PRP had no apparent resorption macroscopically and favorable viability in vivo after 3 months, and the histological parameters supported this. Stress-strain curves for the compression test indicated that the modulus of elasticity of bare diced cartilage was 7.65 ± 0.59 MPa; diced cartilage wrapped with PLGA membrane was 5.98 ± 0.45 MPa; and diced cartilage blended with PRP and wrapped with PLGA membrane was 7.48 ± 0.55 MPa, respectively. Diced cartilage wrapped with PLGA membrane had moderate resorption macroscopically after 3 months. However, blending with PRP has beneficial effects in improving the viability of diced cartilages. Additionally, the compression modulus of diced cartilage blended with PRP and wrapped with PLGA membrane was similar to bare diced cartilage.

  2. An investigation on die crack detection using Temperature Sensitive Parameter for high speed LED mass production

    NASA Astrophysics Data System (ADS)

    Annaniah, Luruthudass; Devarajan, Mutharasu; San, Teoh Kok

    To ensure the highest quality and long-term reliability of LED components, it is necessary to examine LED dice that have sustained mechanical damage during the manufacturing process. This paper demonstrates that detection of die cracks in mass-manufactured LEDs can be achieved by measuring Temperature Sensitive Parameters (TSPs) during final testing. A newly designed apparatus and microcontroller were used for this investigation in order to achieve the millisecond switching time needed for detecting thermal transient effects and at the same time meet the expected speed for mass manufacturing. Evaluations conducted at lab scale show that the thermal transient behaviour of a cracked die is significantly different from that of an undamaged die. Having established test limits to differentiate cracked dice, large-volume tests in a production environment were used to confirm the effectiveness of this test method. Failure Bin Analysis (FBA) of this high-volume experiment confirmed that all the cracked-die LEDs were detected and the undamaged LEDs passed this test without over-rejection. The work verifies that tests based on TSPs are effective in identifying die cracks, and it is believed that the method could be extended to other types of rejects that have thermal transient signatures, such as die delamination.

  3. Slice&Dice: Recognizing Food Preparation Activities Using Embedded Accelerometers

    NASA Astrophysics Data System (ADS)

    Pham, Cuong; Olivier, Patrick

    Within the context of an endeavor to provide situated support for people with cognitive impairments in the kitchen, we developed and evaluated classifiers for recognizing 11 actions involved in food preparation. Data was collected from 20 lay subjects using four specially designed kitchen utensils incorporating embedded 3-axis accelerometers. Subjects were asked to prepare a mixed salad in our laboratory-based instrumented kitchen environment. Video of each subject's food preparation activities was independently annotated by three different coders. Several classifiers were trained and tested using features extracted from the accelerometer data. With an overall accuracy of 82.9%, our investigation demonstrated that a broad set of food preparation actions can be reliably recognized using sensors embedded in kitchen utensils.
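
    As a rough illustration of the general approach (windowed statistical features from 3-axis accelerometer streams fed to a classifier), the sketch below uses synthetic signals and a nearest-centroid rule; the paper's actual features, utensil data, and classifiers are not reproduced here.

    ```python
    import numpy as np

    # Windowed accelerometer features plus a nearest-centroid classifier.
    # Synthetic "chopping" (bursty) and "stirring" (smooth) signals stand in
    # for real utensil data; this is not the paper's pipeline.
    rng = np.random.default_rng(0)

    def windows(signal, size=64, step=32):
        return [signal[i:i + size] for i in range(0, len(signal) - size + 1, step)]

    def features(win):
        # mean, standard deviation, and mean absolute difference per axis
        return np.concatenate([win.mean(0), win.std(0),
                               np.abs(np.diff(win, axis=0)).mean(0)])

    chop = rng.normal(0, 2.0, (1000, 3))
    stir = np.cumsum(rng.normal(0, 0.05, (1000, 3)), axis=0)
    X = np.array([features(w) for w in windows(chop)] +
                 [features(w) for w in windows(stir)])
    y = np.array([0] * len(windows(chop)) + [1] * len(windows(stir)))

    centroids = np.array([X[y == c].mean(0) for c in (0, 1)])
    test = features(windows(rng.normal(0, 2.0, (200, 3)))[0])
    pred = int(np.argmin(np.linalg.norm(centroids - test, axis=1)))
    print("predicted class:", ["chopping", "stirring"][pred])
    ```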

  4. Are Stupid Dice Necessary?

    ERIC Educational Resources Information Center

    Bermudez, Frank; Medina, Anthony; Rosin, Amber; Scott, Eren

    2013-01-01

    A pair of 6-sided dice cannot be relabeled to make the sums 2, 3, ..., 12 equally likely. It is possible to label seven 10-sided dice so that the sums 7, 8, ..., 70 occur equally often. We investigate such relabelings for "pq"-sided dice, where "p" and "q" are distinct primes, and show that these relabelings usually…
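
    One labeling with this property (not necessarily the article's construction) gives one die ten faces all labeled 1 and gives each of the other six dice five faces labeled 1 and five faces labeled 1 + 2^k for k = 0..5; the check below convolves the face counts and confirms that every sum from 7 to 70 occurs in exactly 156,250 of the 10^7 equally likely rolls.

    ```python
    from collections import Counter

    # Verify one uniform-sum labeling of seven 10-sided dice (an illustrative
    # construction, not necessarily the one in the article above).
    dice = [[1] * 10] + [[1] * 5 + [1 + 2**k] * 5 for k in range(6)]

    dist = Counter({0: 1})                 # distribution of the running sum
    for die in dice:
        new = Counter()
        for total, ways in dist.items():
            for face in die:
                new[total + face] += ways
        dist = new

    print(min(dist), max(dist))            # 7 70
    print(len(dist), set(dist.values()))   # 64 {156250}
    ```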

  5. Markov Chain Analysis of Musical Dice Games

    NASA Astrophysics Data System (ADS)

    Volchenkov, D.; Dawin, J. R.

    2012-07-01

    A system for using dice to compose music randomly is known as the musical dice game. The discrete time MIDI models of 804 pieces of classical music written by 29 composers have been encoded into the transition matrices and studied by Markov chains. Contrary to human languages, entropy dominates over redundancy, in the musical dice games based on the compositions of classical music. The maximum complexity is achieved on the blocks consisting of just a few notes (8 notes, for the musical dice games generated over Bach's compositions). First passage times to notes can be used to resolve tonality and feature a composer.
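
    A toy version of the encoding step reads as follows: estimate a first-order transition matrix from a note sequence, then compute its stationary distribution and entropy rate. The short melody used here is invented; the study itself worked from MIDI encodings of 804 classical pieces.

    ```python
    import numpy as np

    # Estimate a Markov transition matrix from a (made-up) note sequence and
    # compute its stationary distribution and entropy rate in bits per note.
    notes = list("CDECDEGFEDCDEC")
    alphabet = sorted(set(notes))
    idx = {n: i for i, n in enumerate(alphabet)}

    counts = np.zeros((len(alphabet), len(alphabet)))
    for a, b in zip(notes[:-1], notes[1:]):
        counts[idx[a], idx[b]] += 1
    P = counts / counts.sum(axis=1, keepdims=True)     # row-stochastic matrix

    w, v = np.linalg.eig(P.T)                          # stationary distribution:
    pi = np.real(v[:, np.argmin(np.abs(w - 1))])       # left eigenvector, eigenvalue 1
    pi = pi / pi.sum()

    logP = np.log2(P, where=P > 0, out=np.zeros_like(P))
    entropy_rate = -np.sum(pi[:, None] * P * logP)
    print(np.round(P, 2))
    print(dict(zip(alphabet, np.round(pi, 2))))
    print("entropy rate (bits/note):", round(float(entropy_rate), 3))
    ```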

  6. Free Diced Cartilage: A New Application of Diced Cartilage Grafts in Primary and Secondary Rhinoplasty.

    PubMed

    Kreutzer, Christian; Hoehne, Julius; Gubisch, Wolfgang; Rezaeian, Farid; Haack, Sebastian

    2017-09-01

    Irregularities or deformities of the nasal dorsum after hump reduction account for a significant number of revision rhinoplasties. The authors therefore developed a technique of meticulously dicing and exactly placing free diced cartilage grafts, harvested from septum, rib, or ear cartilage. The cartilage paste is used for smoothening, augmentation, or camouflaging of the nasal dorsum in primary or revision rhinoplasties. A retrospective analysis of multisurgeon consecutive open approach rhinoplasties from January to December of 2014 was conducted at a single center. The authors compared the outcome of three different techniques to augment or cover the nasal dorsum after an observation period of 7 months. In group I, 325 patients with free diced cartilage grafts as the only onlay were included. In group II, consisting of 73 patients, the dorsal onlay was either fascia alone or in combination with free diced cartilage grafts. Forty-eight patients in group III received a dorsal augmentation with the classic diced cartilage in fascia technique. Four hundred forty-six patients undergoing primary and secondary rhinoplasties in which one of the above-mentioned diced cartilage techniques was used were included in the study. The authors found revision rates for dorsal irregularities within the 7-month postoperative observation period of 5.2, 8.2, and 25 percent for groups I, II, and III, respectively. The authors' findings strongly support their clinical experience that the free diced cartilage graft technique presents an effective and easily reproducible method for camouflage and augmentation in aesthetic and reconstructive rhinoplasty.

  7. Digital Image Correlation Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, Dan; Crozier, Paul; Reu, Phil

    DICe is an open source digital image correlation (DIC) tool intended for use as a module in an external application or as a standalone analysis code. Its primary capability is computing full-field displacements and strains from sequences of digital images. These images are typically of a material sample undergoing a materials characterization experiment, but DICe is also useful for other applications (for example, trajectory tracking). DICe is machine portable (Windows, Linux and Mac) and can be effectively deployed on a high performance computing platform. Capabilities from DICe can be invoked through a library interface, via source code integration of DICe classes or through a graphical user interface.
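
    Since DICe itself is only named here, the fragment below does not use its API; it just illustrates the core operation of subset-based digital image correlation: locating a reference subset in a deformed image by maximizing the zero-normalized cross-correlation over integer-pixel shifts (production codes refine this to sub-pixel accuracy).

    ```python
    import numpy as np

    # Locate a reference subset in a "deformed" image by maximizing the
    # zero-normalized cross-correlation over integer shifts. Illustration of
    # the DIC principle only; not the DICe library interface.
    rng = np.random.default_rng(1)
    ref = rng.random((64, 64))
    deformed = np.roll(ref, (3, -2), axis=(0, 1))      # known rigid translation

    def zncc(a, b):
        a = a - a.mean()
        b = b - b.mean()
        return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

    r0, c0, half = 32, 32, 8                           # subset centre and half-width
    subset = ref[r0 - half:r0 + half, c0 - half:c0 + half]

    best = max(((dr, dc) for dr in range(-5, 6) for dc in range(-5, 6)),
               key=lambda d: zncc(subset,
                                  deformed[r0 + d[0] - half:r0 + d[0] + half,
                                           c0 + d[1] - half:c0 + d[1] + half]))
    print("estimated displacement (rows, cols):", best)   # (3, -2)
    ```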

  8. Two-Dice Horse Race

    ERIC Educational Resources Information Center

    Foster, Colin; Martin, David

    2016-01-01

    We analyse the "two-dice horse race" task often used in lower secondary school, in which two ordinary dice are thrown repeatedly and each time the sum of the scores determines which horse (numbered 1 to 12) moves forwards one space.
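
    A quick simulation makes the point of the task concrete: horse 7 wins most often because the sum of two dice is most likely to be 7, while horse 1 can never move. The track length and number of races below are arbitrary.

    ```python
    import random
    from collections import Counter

    # Simulate the two-dice horse race: the horse whose number equals the sum
    # of the two dice moves one space; first to the finish line wins.
    random.seed(0)

    def race(track_length=10):
        position = Counter()
        while True:
            horse = random.randint(1, 6) + random.randint(1, 6)
            position[horse] += 1
            if position[horse] == track_length:
                return horse

    wins = Counter(race() for _ in range(5000))
    print([(h, wins[h]) for h in range(2, 13)])   # horse 7 dominates
    ```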

  9. A Touch of...Class!

    ERIC Educational Resources Information Center

    Netten, Joan W., Ed.

    1984-01-01

    A collection of ideas for class activities in elementary and secondary language classes includes a vocabulary review exercise and games of memory, counting, vocabulary, flashcard tic-tac-toe, dice, trashcans, questioning, and spelling. Some are designed specifically for French. (MSE)

  10. Healthy Blood Pressure "It's worth the effort!" | NIH MedlinePlus the Magazine

    MedlinePlus

    ... drain excess liquid. Add onion, green and red peppers – stir until tender Add kidney beans, diced tomatoes ... 16 oz) – rinse and drain 1 red bell pepper, diced 1 green bell pepper, diced 1 medium ...

  11. MicroSEQ® Salmonella spp. Detection Kit Using the Pathatrix® 10-Pooling Salmonella spp. Kit Linked Protocol Method Modification.

    PubMed

    Wall, Jason; Conrad, Rick; Latham, Kathy; Liu, Eric

    2014-03-01

    Real-time PCR methods for detecting foodborne pathogens offer the advantages of simplicity and quick time to results compared to traditional culture methods. The addition of a recirculating pooled immunomagnetic separation method prior to real-time PCR analysis increases processing output while reducing both cost and labor. This AOAC Research Institute method modification study validates the MicroSEQ® Salmonella spp. Detection Kit [AOAC Performance Tested Method (PTM) 031001] linked with the Pathatrix® 10-Pooling Salmonella spp. Kit (AOAC PTM 090203C) in diced tomatoes, chocolate, and deli ham. The Pathatrix 10-Pooling protocol represents a method modification of the enrichment portion of the MicroSEQ Salmonella spp. method. The results of the method modification were compared to standard cultural reference methods for diced tomatoes, chocolate, and deli ham. All three matrixes were analyzed in a paired study design. An additional set of chocolate test portions was analyzed using an alternative enrichment medium in an unpaired study design. For all matrixes tested, there were no statistically significant differences in the number of positive test portions detected by the modified candidate method compared to the appropriate reference method. The MicroSEQ Salmonella spp. protocol linked with the Pathatrix individual or 10-Pooling procedure demonstrated reliability as a rapid, simplified method for the preparation of samples and subsequent detection of Salmonella in diced tomatoes, chocolate, and deli ham.

  12. Contouring Variability of the Penile Bulb on CT Images: Quantitative Assessment Using a Generalized Concordance Index

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carillo, Viviana; Cozzarini, Cesare; Perna, Lucia

    2012-11-01

    Purpose: Within a multicenter study (DUE-01) focused on the search for predictors of erectile dysfunction and urinary toxicity after radiotherapy for prostate cancer, a dummy run exercise on penile bulb (PB) contouring on computed tomography (CT) images was carried out. The aim of this study was to quantitatively assess interobserver contouring variability by the application of the generalized DICE index. Methods and Materials: Fifteen physicians from different institutes drew the PB on CT images of 10 patients. The spread of DICE values was used to objectively select those observers who significantly disagreed with the others. The analyses were performed with a dedicated module in the VODCA software package. Results: DICE values were found to change significantly among observers and patients. The mean DICE value was 0.67, ranging between 0.43 and 0.80. The statistics of DICE coefficients identified 4 of 15 observers who systematically showed a value below the average (p value range, 0.013 - 0.059): mean DICE values were 0.62 for the 4 'bad' observers compared to 0.69 for the 11 'good' observers. For all bad observers, the main cause of the disagreement was identified. Average DICE values were significantly worse than the average in 2 of 10 patients (0.60 vs. 0.70, p < 0.05) because of the limited visibility of the PB. Excluding the 'bad' observers and the 'bad' patients, the mean DICE value increased from 0.67 to 0.70; interobserver variability, expressed in terms of the standard deviation of the DICE spread, was also reduced. Conclusions: The obtained DICE values of around 0.7 show an acceptable agreement, considering the small dimension of the PB. Additional strategies to improve this agreement are under consideration and include an additional tutorial for the so-called bad observers with a recontouring procedure, or the recontouring of the PB for all patients included in the DUE-01 study by a single observer.
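
    For reference, the DICE agreement index used in this study is the Sorensen-Dice coefficient between two delineations A and B, 2|A ∩ B| / (|A| + |B|); the minimal example below computes it for two overlapping binary masks standing in for contoured volumes.

    ```python
    import numpy as np

    # Sorensen-Dice coefficient between two binary masks.
    def dice(a, b):
        a, b = a.astype(bool), b.astype(bool)
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    mask_a = np.zeros((20, 20), bool); mask_a[5:15, 5:15] = True   # 10x10 square
    mask_b = np.zeros((20, 20), bool); mask_b[7:17, 7:17] = True   # shifted copy
    print(round(dice(mask_a, mask_b), 3))   # 2*64 / (100 + 100) = 0.64
    ```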

  13. PREFACE: 7th International Workshop DICE2014 Spacetime - Matter - Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Elze, H. T.; Diósi, L.; Fronzoni, L.; Halliwell, J. J.; Kiefer, C.; Prati, E.; Vitiello, G.

    2015-07-01

    Presented in this volume are the Invited Lectures and the Contributed Papers of the Seventh International Workshop on Decoherence, Information, Complexity and Entropy - DICE 2014, held at Castello Pasquini, Castiglioncello (Tuscany), September 15-19, 2014. These proceedings are intended to reflect the lively exchange of ideas during the meeting for the interested public and the wider scientific community, as well as to provide a document of the scientific works presented. The number of participants has continued to grow, which may correspond to an increasing attraction, if not need, of such conference: Our very intention has always been to bring together leading researchers, advanced students, and renowned scholars from various areas, in order to stimulate new ideas and their exchange across the borders of specialization. In this way, the series of meetings successfully continued from the beginning with DICE 2002, followed by DICE 2004, DICE 2006, DICE 2008, DICE 2010, and DICE 2012. This time, DICE 2014 brought together more than 120 participants representing more than 30 countries. It has been a great honour and inspiration that we had with us Nobel Prize laureate Gerard 't Hooft (Utrecht - Keynote Lecture ''The Cellular Automaton Interpretation and Bell's Theorem''), Fields Medal winner Alain Connes (Paris - Keynote Lecture ''Quanta of geometry''), Professor Avshalom Elitzur (Rehovot - Keynote Lecture ''Voices of silence, novelties of noise: on some quantum hairsplitting methods with nontrivial consequences'', in this volume) and Professor Mario Rasetti (Torino - Keynote Lecture ''The topological field theory of data: a possible new venue for data mining'', in this volume). The opening Keynote Lecture ''History of electroweak symmetry breaking'' was presented by Sir Tom Kibble (London), co-discoverer of the Higgs mechanism, Sakurai Prize laureate and winner of, i.a., Dirac and Einstein Medals.

  14. Efficacy of platelet-rich fibrin matrix on viability of diced cartilage grafts in a rabbit model.

    PubMed

    Güler, İsmail; Billur, Deniz; Aydin, Sevim; Kocatürk, Sinan

    2015-03-01

    The objective of this study was to compare the viability of cartilage grafts embedded in platelet-rich fibrin matrix (PRFM) with that of grafts wrapped with no material (bare diced cartilage grafts), oxidized methylcellulose (Surgicel), or acellular dermal tissue (AlloDerm). Experimental study. In this study, six New Zealand rabbits were used. Cartilage grafts including perichondrium were excised from each ear and diced into 2-mm-by 2-mm pieces. There were four comparison groups: 1) group A, diced cartilage (not wrapped with any material); 2) group B, diced cartilage wrapped with AlloDerm; 3) group C, diced cartilage grafts wrapped with Surgicel; and 4) group D, diced cartilage wrapped with PRFM. Four cartilage grafts were implanted under the skin at the back of each rabbit. All rabbits were sacrificed at the end of 10 weeks. The cartilages were stained with hematoxylin-eosin, Masson's Trichrome, and Orcein. After that, they were evaluated for the viability of chondrocytes, collagen content, fibrillar structure of matrix, and changes in peripheral tissues. When the viability of chondrocytes, the content of fiber in the matrix, and the changes in peripheral tissues were compared, the scores for cartilage embedded in PRFM were statistically significantly higher than in the other groups (P < 0.05). We concluded that PRFM has significant advantages in ensuring the chondrocyte viability of diced cartilage grafts. It is also biocompatible, with relatively less inflammation and fibrosis. © 2014 The American Laryngological, Rhinological and Otological Society, Inc.

  15. The diced cartilage glue graft for nasal augmentation. Morphometric evidence of longevity.

    PubMed

    Tasman, Abel-Jan; Diener, Pierre-André; Litschel, Ralph

    2013-03-01

    A grafting technique that uses diced cartilage without fascia, which improves formability while maintaining long-term stability, would be a welcome addition to the rhinoplasty armamentarium. A diced cartilage glue graft was recently introduced as the Tasman technique. The technique has been used by one of us (A.-J.T.) in 28 patients who were monitored clinically for 4 to 26 months. Sonographic morphometry of the graft was used in 10 patients with a maximum follow-up of 15 months, and 2 biopsies were obtained for histologic examination. Fashioning the diced cartilage glue graft reduced operating time compared with the diced cartilage fascia graft and allowed for a wide variety of transplant shapes and sizes, depending on the mold used. All grafts were used for augmentation of the nasal dorsum or radix and healed uneventfully. Sonographic cross-section measures of the grafts changed between 6% and –29% (median, –5%) in the early postoperative phase and 8% and –7% (median, –2%) between 3 and 15 months after insertion. Histologic examination of the graft biopsies revealed viable cartilage with signs of regeneration. The diced cartilage glue graft may become an attractive alternative to accepted methods for dorsal augmentation, the diced cartilage fascia graft in particular.

  16. Expectation and Variation with a Virtual Die

    ERIC Educational Resources Information Center

    Watson, Jane; English, Lyn

    2015-01-01

    By the time students reach the middle years they have experienced many chance activities based on dice. Common among these are rolling one die to explore the relationship of frequency and theoretical probability, and rolling two dice and summing the outcomes to consider their probabilities. Although dice may be considered overused by some, the…

  17. Musical Markov Chains

    NASA Astrophysics Data System (ADS)

    Volchenkov, Dima; Dawin, Jean René

    A system for using dice to compose music randomly is known as the musical dice game. The discrete time MIDI models of 804 pieces of classical music written by 29 composers have been encoded into the transition matrices and studied by Markov chains. Contrary to human languages, entropy dominates over redundancy, in the musical dice games based on the compositions of classical music. The maximum complexity is achieved on the blocks consisting of just a few notes (8 notes, for the musical dice games generated over Bach's compositions). First passage times to notes can be used to resolve tonality and feature a composer.

  18. Using Role-Playing Game Dice to Teach the Concepts of Symmetry

    ERIC Educational Resources Information Center

    Grafton, Anthony K.

    2011-01-01

    Finding and describing the location of symmetry elements in complex objects is often a difficult skill to learn. Introducing the concepts of symmetry using high-symmetry game dice is one way of helping students overcome this difficulty in introductory physical chemistry classes. The dice are inexpensive, reusable, and come in a variety of shapes…

  19. Lake Wobegon Dice

    ERIC Educational Resources Information Center

    Moraleda, Jorge; Stork, David G.

    2012-01-01

    We introduce Lake Wobegon dice, where each die is "better than the set average." Specifically, these dice have the paradoxical property that on every roll, each die is more likely to roll greater than the set average on the roll, than less than this set average. We also show how to construct minimal optimal Lake Wobegon sets for all "n" [greater…
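
    The defining property is easy to state as a check: for every die in the set, a roll of that die must be strictly more likely to exceed the set average than to fall below it. The brute-force checker below returns False for three ordinary dice (by symmetry the two probabilities are equal); constructing sets for which it returns True is the subject of the article, and no specific Lake Wobegon set is asserted here.

    ```python
    from itertools import product

    # Check the Lake Wobegon property: each die is strictly more likely to roll
    # above the set average than below it, over all rolls of the whole set.
    def is_lake_wobegon(dice):
        for i in range(len(dice)):
            above = below = 0
            for roll in product(*dice):
                avg = sum(roll) / len(roll)
                if roll[i] > avg:
                    above += 1
                elif roll[i] < avg:
                    below += 1
            if above <= below:
                return False
        return True

    print(is_lake_wobegon([range(1, 7)] * 3))   # three standard dice -> False
    ```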

  20. Using Dice Games to Teach Hazards, Risk, and Outcomes in HACCP Classes

    ERIC Educational Resources Information Center

    Oyarzabal, Omar A.

    2015-01-01

    This article describes the incorporation of a dice game (piggy) to teach food safety hazards and risk in an engaging way in HACCP classes. Each player accumulates points by rolling two dice, but loses points in a turn when rolling a 7, or all accumulated points when rolling two consecutive doubles. This game helps explain the difference between a…
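
    A small simulation shows the risk/reward trade-off the game is meant to convey. The rules are taken from the description above (a 7 forfeits the turn's points, two consecutive doubles forfeit everything accumulated); the fixed "stop after k rolls" policy is an assumption introduced only for this sketch, since in class the players choose when to stop.

    ```python
    import random

    # Simulate "piggy" turns under a fixed stopping policy to compare strategies.
    random.seed(0)

    def play_game(turns=10, max_rolls=5):
        banked = 0
        for _ in range(turns):
            turn_points, last_double = 0, False
            for _ in range(max_rolls):
                a, b = random.randint(1, 6), random.randint(1, 6)
                if a + b == 7:                 # a 7 ends the turn with nothing
                    turn_points = 0
                    break
                if a == b and last_double:     # two consecutive doubles: lose it all
                    banked, turn_points = 0, 0
                    break
                last_double = (a == b)
                turn_points += a + b
            banked += turn_points
        return banked

    for k in (2, 5, 8):
        sample = [play_game(max_rolls=k) for _ in range(2000)]
        print(f"stop after {k} rolls: mean score {sum(sample) / len(sample):.1f}")
    ```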

  1. 21 CFR 102.39 - Onion rings made from diced onion.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Onion rings made from diced onion. 102.39 Section... Nonstandardized Foods § 102.39 Onion rings made from diced onion. (a) The common or usual name of the food product that resembles and is of the same composition as onion rings, except that it is composed of comminuted...

  2. 21 CFR 102.39 - Onion rings made from diced onion.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Onion rings made from diced onion. 102.39 Section... Nonstandardized Foods § 102.39 Onion rings made from diced onion. (a) The common or usual name of the food product that resembles and is of the same composition as onion rings, except that it is composed of comminuted...

  3. 21 CFR 102.39 - Onion rings made from diced onion.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Onion rings made from diced onion. 102.39 Section... Nonstandardized Foods § 102.39 Onion rings made from diced onion. (a) The common or usual name of the food product that resembles and is of the same composition as onion rings, except that it is composed of comminuted...

  4. 21 CFR 102.39 - Onion rings made from diced onion.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Onion rings made from diced onion. 102.39 Section... Nonstandardized Foods § 102.39 Onion rings made from diced onion. (a) The common or usual name of the food product that resembles and is of the same composition as onion rings, except that it is composed of comminuted...

  5. 21 CFR 102.39 - Onion rings made from diced onion.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Onion rings made from diced onion. 102.39 Section... Nonstandardized Foods § 102.39 Onion rings made from diced onion. (a) The common or usual name of the food product that resembles and is of the same composition as onion rings, except that it is composed of comminuted...

  6. MIDDLE NORTH Series Pre-DICE THROW I, II and DICE THROW Test Execution Report

    DTIC Science & Technology

    1978-04-01

    Table-of-contents fragments: site layout, 5-ton events (Pre-DICE THROW I); test bed layout (Pre-DICE THROW I); airblast gages and instrumentation; stacking data; charge weights; ANFO charge summary. ...described in Table 1-1 and in Figures 1-5 and 1-6 (these were detonated on 30 April, 14 May and 31 May 1975). Figures 1-7 and 1-8 show photographs of the ANFO

  7. Cross Comparison of Electron Density and Electron Temperature Observations from the DICE CubeSat Langmuir Probes and the Millstone Hill Incoherent Scatter Radar.

    NASA Astrophysics Data System (ADS)

    Swenson, C.; Erickson, P. J.; Crowley, G.; Pilinski, M.; Barjatya, A.; Fish, C. S.

    2014-12-01

    The Dynamic Ionosphere CubeSat Experiment (DICE) consists of two identical 1.5U CubeSats deployed simultaneously from a single P-POD (Poly Picosatellite Orbital Deployer) into the same orbit. Several observational campaigns were planned between the DICE CubeSats and the mid-latitude Millstone Hill Incoherent Scatter Radar (ISR) in order to calibrate the DICE measurements of electron density and electron temperature. In this presentation, we compare in-situ observations from the Dynamic Ionosphere CubeSat Experiment (DICE) and from the Millstone Hill ISR. Both measurements are cross-calibrated against an assimilative model of the global ionospheric electron density. The electron density and electron temperature were obtained for three Millstone Hill DICE overflights (2013-03-12, 2013-03-15, 2013-03-17). We compare the data during quiet and geomagnetically disturbed conditions and find evidence of a storm enhanced density (SED) plume in the topside ionosphere on 2013-03-17 at around 19 UTC. During this disturbed interval, high-density plasma in the American longitude sector was convected near 15 SLT towards the noontime cusp. DICE was selected for flight under the NSF "CubeSat-based Science Mission for Space Weather and Atmospheric Research" program. The DICE twin satellites were launched on a Delta II rocket on October 28, 2011. The satellites are flying in a "leader-follower" formation in an elliptical orbit which ranges from 820 to 400 km in altitude. Each satellite carries a fixed-bias DC Langmuir Probe (DCP) to measure in-situ ionospheric plasma densities and a science grade magnetometer to measure DC and AC geomagnetic fields. The purpose of these measurements was to permit accurate identification of storm-time features such as the SED bulge and plume. The mission team combines expertise from ASTRA, Utah State University/Space Dynamics Laboratory (USU/SDL), and Embry-Riddle Aeronautical University. In this paper we present a comparison of data from DICE and Millstone Hill ISR during quiet and magnetically disturbed conditions.

  8. Automated detection and classification of dice

    NASA Astrophysics Data System (ADS)

    Correia, Bento A. B.; Silva, Jeronimo A.; Carvalho, Fernando D.; Guilherme, Rui; Rodrigues, Fernando C.; de Silva Ferreira, Antonio M.

    1995-03-01

    This paper describes a typical machine vision system in an unusual application, the automated visual inspection of a Casino's playing tables. The SORTE computer vision system was developed at INETI under a contract with the Portuguese Gaming Inspection Authorities IGJ. It aims to automate the tasks of detection and classification of the dice's scores on the playing tables of the game `Banca Francesa' (which means French Banking) in Casinos. The system is based on the on-line analysis of the images captured by a monochrome CCD camera placed over the playing tables, in order to extract relevant information concerning the score indicated by the dice. Image processing algorithms for real time automatic throwing detection and dice classification were developed and implemented.

  9. The Brandeis Dice Problem and Statistical Mechanics

    NASA Astrophysics Data System (ADS)

    van Enk, Steven J.

    2014-11-01

    Jaynes invented the Brandeis Dice Problem as a simple illustration of the MaxEnt (Maximum Entropy) procedure that he had demonstrated to work so well in Statistical Mechanics. I construct here two alternative solutions to his toy problem. One, like Jaynes' solution, uses MaxEnt and yields an analog of the canonical ensemble, but at a different level of description. The other uses Bayesian updating and yields an analog of the micro-canonical ensemble. Both, unlike Jaynes' solution, yield error bars, whose operational merits I discuss. These two alternative solutions are not equivalent for the original Brandeis Dice Problem, but become so in what must, therefore, count as the analog of the thermodynamic limit, M-sided dice with M → ∞. Whereas the mathematical analogies between the dice problem and Stat Mech are quite close, there are physical properties that the former lacks but that are crucial to the workings of the latter. Stat Mech is more than just MaxEnt.
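    For readers who want to see the calculation behind this summary, Jaynes' toy problem supplies only the constraint that the long-run average roll of a six-sided die is 4.5, and MaxEnt then assigns face probabilities of exponential (canonical-ensemble-like) form. The short Python sketch below is an illustration of that calculation, not code from the paper.

```python
import numpy as np
from scipy.optimize import brentq

faces = np.arange(1, 7)

def maxent_mean(lam):
    """Mean of the MaxEnt distribution p_i proportional to exp(-lam * i) over faces 1..6."""
    w = np.exp(-lam * faces)
    return (faces * w).sum() / w.sum()

# Jaynes' constraint: the reported long-run average roll is 4.5.
lam = brentq(lambda l: maxent_mean(l) - 4.5, -5.0, 5.0)
p = np.exp(-lam * faces)
p /= p.sum()

print(lam)        # negative Lagrange multiplier: high faces are favoured
print(p)          # the MaxEnt ("canonical") face probabilities
print(p @ faces)  # approximately 4.5, recovering the constraint
```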

  10. Quantum dice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sassoli de Bianchi, Massimiliano, E-mail: autoricerca@gmail.com

    In a letter to Born, Einstein wrote [42]: “Quantum mechanics is certainly imposing. But an inner voice tells me that it is not yet the real thing. The theory says a lot, but does not really bring us any closer to the secret of the ‘old one.’ I, at any rate, am convinced that He does not throw dice.” In this paper we take seriously Einstein’s famous metaphor, and show that we can gain considerable insight into quantum mechanics by doing something as simple as rolling dice. More precisely, we show how to perform measurements on a single die, to create typical quantum interference effects, and how to connect (entangle) two identical dice, to maximally violate Bell’s inequality. -- Highlights: •Rolling a die is a quantum process admitting a Hilbert space representation. •Rolling experiments with a single die can produce interference effects. •Two connected dice can violate Bell’s inequality. •Correlations need to be created by the measurement, to violate Bell’s inequality.

  11. Contouring variability of human- and deformable-generated contours in radiotherapy for prostate cancer

    NASA Astrophysics Data System (ADS)

    Gardner, Stephen J.; Wen, Ning; Kim, Jinkoo; Liu, Chang; Pradhan, Deepak; Aref, Ibrahim; Cattaneo, Richard, II; Vance, Sean; Movsas, Benjamin; Chetty, Indrin J.; Elshaikh, Mohamed A.

    2015-06-01

    This study was designed to evaluate contouring variability of human- and deformable-generated contours on planning CT (PCT) and CBCT for ten patients with low- or intermediate-risk prostate cancer. For each patient in this study, five radiation oncologists contoured the prostate, bladder, and rectum on one PCT dataset and five CBCT datasets. Consensus contours were generated using the STAPLE method in the CERR software package. Observer contours were compared to the consensus contour, and contour metrics (Dice coefficient, Hausdorff distance, Contour Distance, Center-of-Mass [COM] Deviation) were calculated. In addition, the first day CBCT was registered to subsequent CBCT fractions (CBCTn: CBCT2-CBCT5) via B-spline Deformable Image Registration (DIR). Contours were transferred from CBCT1 to CBCTn via the deformation field, and contour metrics were calculated through comparison with consensus contours generated from the human contour set. The average contour metrics for prostate contours on PCT and CBCT were as follows: Dice coefficient—0.892 (PCT), 0.872 (CBCT-Human), 0.824 (CBCT-Deformed); Hausdorff distance—4.75 mm (PCT), 5.22 mm (CBCT-Human), 5.94 mm (CBCT-Deformed); Contour Distance (overall contour)—1.41 mm (PCT), 1.66 mm (CBCT-Human), 2.30 mm (CBCT-Deformed); COM Deviation—2.01 mm (PCT), 2.78 mm (CBCT-Human), 3.45 mm (CBCT-Deformed). For human contours on PCT and CBCT, the difference in average Dice coefficient between PCT and CBCT (approx. 2%) and Hausdorff distance (approx. 0.5 mm) was small compared to the variation between observers for each patient (standard deviation in Dice coefficient of 5% and Hausdorff distance of 2.0 mm). However, additional contouring variation was found for the deformable-generated contours (approximately 5.0% decrease in Dice coefficient and 0.7 mm increase in Hausdorff distance relative to human-generated contours on CBCT). Though deformable contours provide a reasonable starting point for contouring on CBCT, we conclude that contours generated with B-Spline DIR require physician review and editing if they are to be used in the clinic.
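    The overlap and distance metrics reported above are standard and straightforward to reproduce. The following Python sketch, which is illustrative and not the authors' analysis code, computes a Dice coefficient, a symmetric Hausdorff distance and a center-of-mass deviation for two binary masks; the 64x64 toy discs are hypothetical stand-ins for an observer and a consensus contour, and in practice the distances would be scaled by the CT voxel spacing to obtain millimetres.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff
from scipy.ndimage import center_of_mass

def dice_coefficient(a, b):
    """2|A ∩ B| / (|A| + |B|) for two boolean masks."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def hausdorff_distance(a, b):
    """Symmetric Hausdorff distance between the voxel coordinates of two masks."""
    pa, pb = np.argwhere(a), np.argwhere(b)
    return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])

def com_deviation(a, b):
    """Euclidean distance between the centers of mass of two masks."""
    return float(np.linalg.norm(np.subtract(center_of_mass(a), center_of_mass(b))))

# Toy 2-D example: two overlapping discs standing in for observer and consensus contours.
yy, xx = np.mgrid[0:64, 0:64]
observer  = (yy - 32) ** 2 + (xx - 30) ** 2 < 15 ** 2
consensus = (yy - 32) ** 2 + (xx - 34) ** 2 < 15 ** 2
print(dice_coefficient(observer, consensus),
      hausdorff_distance(observer, consensus),
      com_deviation(observer, consensus))
```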

  12. Rib Diced Cartilage-Fascia Grafting in Dorsal Nasal Reconstruction: A Randomized Clinical Trial of Wrapping With Rectus Muscle Fascia vs Deep Temporal Fascia.

    PubMed

    As'adi, Kamran; Salehi, Seyed Hamid; Shoar, Saeed

    2014-08-01

    Rib cartilage is an abundant source for cartilage grafts when significant dorsal nasal augmentation or structural support is indicated. Diced cartilage wrapped in fascia was developed to counteract warping, visibility, and displacement of rib cartilage as a dorsal solid graft. The technique for wrapping diced cartilage has evolved during the past several years. The authors compared 2 distinct fascial sleeves for wrapping rib diced cartilage in the treatment of patients who required major dorsal nasal augmentation. Thirty-six patients who planned to undergo major dorsal nasal reconstruction with diced costal rib cartilage were assigned randomly to 1 of 2 groups: the intervention group, which received grafts wrapped with rectus muscle fascia from the rib cartilage harvesting site, or the control group, which received deep temporal fascia harvested separately. Outcomes were compared between the groups. Patients in the intervention group had significantly shorter operating times, significantly higher average satisfaction scores, and significantly shorter postoperative hospital stays than did patients in the control group. Harvesting rectus muscle fascia for wrapping diced rib cartilage is a feasible and reliable technique in dorsal nasal reconstruction surgery. It is associated with favorable outcomes and a high level of patient satisfaction. © 2014 The American Society for Aesthetic Plastic Surgery, Inc.

  13. Dice in space

    NASA Image and Video Library

    2014-05-31

    ISS040-E-006093 (31 May 2014) --- With a few explanatory words attached to a message to Earth, Expedition 40 Flight Engineer Reid Wiseman of NASA sent down this image of a single die floating in front of one of the windows in the Cupola of the Earth-orbiting International Space Station. Wiseman commented, "This one is just for us board game players, table top strategy gamers, (etc.) whose dice collection behaviour borders on hoarding."

  14. Hacienda. Technical Note No. 3.

    ERIC Educational Resources Information Center

    Hoxeng, James

    This paper describes a simulation game, "Hacienda," designed to replicate the economic and social realities of the peasants' situation in rural Ecuador. The game involves three to 15 players (and often more), one of whom, by a roll of the dice, takes the role of "hacendado," or hacienda owner, who gains title to all the…

  15. A novel autologous scaffold for diced-cartilage grafts in dorsal augmentation rhinoplasty.

    PubMed

    Bullocks, Jamal M; Echo, Anthony; Guerra, Gerardo; Stal, Samuel; Yuksel, Eser

    2011-08-01

    Diced-cartilage grafts have been used for dorsal nasal augmentation for several years with good results. However, compounds such as Surgicel and temporalis fascia used as a wrap have inherent problems associated with them, predominantly inflammation and graft resorption. An autologous carrier could provide stabilization of cartilage grafts while avoiding the complications seen with earlier techniques. In our patients, a malleable construct was used for dorsal nasal augmentation in which autologous diced-cartilage grafts were stabilized with autologous tissue glue (ATG) created from platelet-rich plasma (platelet gel) and platelet-poor plasma (fibrin glue). Sixty-eight patients who underwent dorsal nasal augmentation utilizing ATG and diced-cartilage grafts between 2005 and 2008 were included in this prospective analysis. Dorsal height was notably maintained, and no complications requiring explantation occurred over a mean follow-up of 15 months. The use of ATG to stabilize diced-cartilage grafts is a safe, reliable technique for dorsal nasal augmentation. The platelet gel provides growth factors while the fibrin glue creates a scaffold that allows stabilization and diffusion of nutrients to the cartilage graft.

  16. Low optical-loss facet preparation for silica-on-silicon photonics using the ductile dicing regime

    NASA Astrophysics Data System (ADS)

    Carpenter, Lewis G.; Rogers, Helen L.; Cooper, Peter A.; Holmes, Christopher; Gates, James C.; Smith, Peter G. R.

    2013-11-01

    The efficient production of high-quality facets for low-loss coupling is a significant production issue in integrated optics, usually requiring time consuming and manually intensive lapping and polishing steps, which add considerably to device fabrication costs. The development of precision dicing saws with diamond impregnated blades has allowed optical grade surfaces to be machined in crystalline materials such as lithium niobate and garnets. In this report we investigate the optimization of dicing machine parameters to obtain optical quality surfaces in a silica-on-silicon planar device demonstrating high optical quality in a commercially important glassy material. We achieve a surface roughness of 4.9 nm (Sa) using the optimized dicing conditions. By machining a groove across a waveguide, using the optimized dicing parameters, a grating based loss measurement technique is used to measure precisely the average free space interface loss per facet caused by scattering as a consequence of surface roughness. The average interface loss per facet was calculated to be: -0.63 dB and -0.76 dB for the TE and TM polarizations, respectively.

  17. Parallel processing for scientific computations

    NASA Technical Reports Server (NTRS)

    Alkhatib, Hasan S.

    1995-01-01

    The scope of this project dealt with the investigation of the requirements to support distributed computing of scientific computations over a cluster of cooperative workstations. Various experiments on computations for the solution of simultaneous linear equations were performed in the early phase of the project to gain experience in the general nature and requirements of scientific applications. A specification of a distributed integrated computing environment, DICE, based on a distributed shared memory communication paradigm has been developed and evaluated. The distributed shared memory model facilitates porting existing parallel algorithms that have been designed for shared memory multiprocessor systems to the new environment. The potential of this new environment is to provide supercomputing capability through the utilization of the aggregate power of workstations cooperating in a cluster interconnected via a local area network. Workstations, generally, do not have the computing power to tackle complex scientific applications, making them primarily useful for visualization, data reduction, and filtering as far as complex scientific applications are concerned. There is a tremendous amount of computing power that is left unused in a network of workstations. Very often a workstation is simply sitting idle on a desk. A set of tools can be developed to take advantage of this potential computing power to create a platform suitable for large scientific computations. The integration of several workstations into a logical cluster of distributed, cooperative, computing stations presents an alternative to shared memory multiprocessor systems. In this project we designed and evaluated such a system.
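    As a rough, small-scale illustration of the kind of workload the report describes (simultaneous linear equations solved by splitting the work across cooperating processors), the Python sketch below distributes the update step of a Jacobi iteration over worker processes. It mimics the idea of farming blocks of unknowns out to cluster nodes, but on a single machine; it is not a reconstruction of the DICE distributed-shared-memory system itself, and the matrix is synthetic.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def jacobi_chunk(args):
    """Update one block of unknowns using the previous global iterate."""
    A_rows, b_rows, x_old, offset = args
    x_new = np.empty(len(b_rows))
    for i, row in enumerate(A_rows):
        gi = offset + i
        sigma = row @ x_old - row[gi] * x_old[gi]   # off-diagonal contribution
        x_new[i] = (b_rows[i] - sigma) / row[gi]
    return offset, x_new

def parallel_jacobi(A, b, n_workers=4, iters=200):
    n = len(b)
    x = np.zeros(n)
    bounds = np.linspace(0, n, n_workers + 1, dtype=int)
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        for _ in range(iters):
            tasks = [(A[lo:hi], b[lo:hi], x, lo)
                     for lo, hi in zip(bounds[:-1], bounds[1:])]
            x_next = np.empty(n)
            for offset, chunk in pool.map(jacobi_chunk, tasks):
                x_next[offset:offset + len(chunk)] = chunk
            x = x_next
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.normal(size=(200, 200)) + 200 * np.eye(200)   # diagonally dominant system
    b = rng.normal(size=200)
    x = parallel_jacobi(A, b)
    print(np.allclose(A @ x, b, atol=1e-6))               # True if the iteration converged
```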

  18. Random Variables: Simulations and Surprising Connections.

    ERIC Educational Resources Information Center

    Quinn, Robert J.; Tomlinson, Stephen

    1999-01-01

    Features activities for advanced second-year algebra students in grades 11 and 12. Introduces three random variables and considers an empirical and theoretical probability for each. Uses coins, regular dice, decahedral dice, and calculators. (ASK)

  19. multi-dice: r package for comparative population genomic inference under hierarchical co-demographic models of independent single-population size changes.

    PubMed

    Xue, Alexander T; Hickerson, Michael J

    2017-11-01

    Population genetic data from multiple taxa can address comparative phylogeographic questions about community-scale response to environmental shifts, and a useful strategy to this end is to employ hierarchical co-demographic models that directly test multi-taxa hypotheses within a single, unified analysis. This approach has been applied to classical phylogeographic data sets such as mitochondrial barcodes as well as reduced-genome polymorphism data sets that can yield 10,000s of SNPs, produced by emergent technologies such as RAD-seq and GBS. A strategy for the latter had been accomplished by adapting the site frequency spectrum to a novel summarization of population genomic data across multiple taxa called the aggregate site frequency spectrum (aSFS), which potentially can be deployed under various inferential frameworks including approximate Bayesian computation, random forest and composite likelihood optimization. Here, we introduce the r package multi-dice, a wrapper program that exploits existing simulation software for flexible execution of hierarchical model-based inference using the aSFS, which is derived from reduced genome data, as well as mitochondrial data. We validate several novel software features such as applying alternative inferential frameworks, enforcing a minimal threshold of time surrounding co-demographic pulses and specifying flexible hyperprior distributions. In sum, multi-dice provides comparative analysis within the familiar R environment while allowing a high degree of user customization, and will thus serve as a tool for comparative phylogeography and population genomics. © 2017 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.
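    The aggregate site frequency spectrum (aSFS) mentioned above pools SFS information from many taxa into a single summary vector. The sketch below shows one plausible reading of that construction: normalize each taxon's SFS and then, within each frequency class, sort the values across taxa before concatenating. Treat it as an assumption-laden illustration of the idea rather than the package's actual implementation; the input vectors are hypothetical, and the exact definition should be taken from the package documentation.

```python
import numpy as np

def aggregate_sfs(sfs_list):
    """Build an aggregate SFS (illustrative only): normalize each taxon's SFS to
    proportions, then, within every frequency class, sort values across taxa in
    descending order and concatenate the sorted classes into one vector."""
    sfs = np.array([s / np.sum(s) for s in sfs_list], dtype=float)  # taxa x classes
    ranked = -np.sort(-sfs, axis=0)   # sort each class (column) across taxa
    return ranked.T.ravel()           # concatenate class by class

# Hypothetical folded SFS vectors for three co-distributed taxa (same projection size).
taxa = [np.array([120, 40, 22, 10, 8]),
        np.array([300, 60, 25, 12, 6]),
        np.array([90, 55, 30, 18, 12])]
print(aggregate_sfs(taxa))
```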

  20. [Comparison of reproducibility measurements for calibration of dental caries epidemiological surveys].

    PubMed

    Assaf, Andréa Videira; Zanin, Luciane; Meneghim, Marcelo de Castro; Pereira, Antonio Carlos; Ambrosano, Gláucia Maria Bovi

    2006-09-01

    This study compares three measurements (Kappa, general agreement percentage, or GAP, and the Dice index) used to determine the reproducibility of caries diagnosis in epidemiological surveys under different clinical diagnostic thresholds. Eleven examiners with previous experience in epidemiological surveys were submitted to a theoretical and clinical calibration process. Data analysis used two caries detection thresholds: World Health Organization (WHO) and WHO with the inclusion of initial enamel lesions (WHO + IL). Twenty-three children 6-7 years of age were examined, with and without caries. Mean values for the Kappa index, GAP, and Dice index were considered high (> 0.90), except for the Dice index for the WHO + IL threshold (0.69). Since Kappa is an adjusted agreement index, it can be considered the instrument of choice for calibration of examiners. However, when it cannot be used, the GAP is recommended together with the Dice index in order to orient and improve examiners when examining caries lesions.
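    The three agreement measures compared in the study are simple to reproduce for a pair of examiners scoring the same surfaces. The Python sketch below (the scores are hypothetical; 1 = carious, 0 = sound) computes Cohen's Kappa, the general agreement percentage (agreement over all calls) and the Dice index (agreement restricted to positive calls only).

```python
import numpy as np

def agreement_measures(x, y):
    """Cohen's Kappa, general agreement percentage (GAP) and Dice index
    for two examiners' binary caries calls."""
    x, y = np.asarray(x, bool), np.asarray(y, bool)
    n = len(x)
    a = np.sum(x & y)        # both scored carious
    d = np.sum(~x & ~y)      # both scored sound
    gap = (a + d) / n                                            # observed agreement
    pe = x.mean() * y.mean() + (1 - x.mean()) * (1 - y.mean())   # chance agreement
    kappa = (gap - pe) / (1 - pe)
    dice = 2 * a / (x.sum() + y.sum())                           # positive calls only
    return kappa, gap, dice

examiner_a = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
examiner_b = [1, 0, 1, 1, 0, 0, 0, 0, 1, 0]
print(agreement_measures(examiner_a, examiner_b))   # (0.583..., 0.8, 0.75)
```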

  1. Deformable image registration based automatic CT-to-CT contour propagation for head and neck adaptive radiotherapy in the routine clinical setting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumarasiri, Akila, E-mail: akumara1@hfhs.org; Siddiqui, Farzan; Liu, Chang

    2014-12-15

    Purpose: To evaluate the clinical potential of deformable image registration (DIR)-based automatic propagation of physician-drawn contours from a planning CT to midtreatment CT images for head and neck (H and N) adaptive radiotherapy. Methods: Ten H and N patients, each with a planning CT (CT1) and a subsequent CT (CT2) taken approximately 3–4 weeks into treatment, were considered retrospectively. Clinically relevant organs and targets were manually delineated by a radiation oncologist on both sets of images. Four commercial DIR algorithms, two B-spline-based and two Demons-based, were used to deform CT1 and the relevant contour sets onto corresponding CT2 images. Agreement of the propagated contours with manually drawn contours on CT2 was visually rated by four radiation oncologists on a scale from 1 to 5, the volume overlap was quantified using Dice coefficients, and a distance analysis was done using center of mass (CoM) displacements and Hausdorff distances (HDs). Performance of these four commercial algorithms was validated using a parameter-optimized Elastix DIR algorithm. Results: All algorithms attained Dice coefficients of >0.85 for organs with clear boundaries and those with volumes >9 cm³. Organs with volumes <3 cm³ and/or those with poorly defined boundaries showed Dice coefficients of ∼0.5–0.6. For the propagation of small organs (<3 cm³), the B-spline-based algorithms showed higher mean Dice values (Dice = 0.60) than the Demons-based algorithms (Dice = 0.54). For the gross and planning target volumes, the respective mean Dice coefficients were 0.8 and 0.9. There was no statistically significant difference in the Dice coefficients, CoM, or HD among investigated DIR algorithms. The mean radiation oncologist visual scores of the four algorithms ranged from 3.2 to 3.8, which indicated that the quality of transferred contours was “clinically acceptable with minor modification or major modification in a small number of contours.” Conclusions: Use of DIR-based contour propagation in the routine clinical setting is expected to increase the efficiency of H and N replanning, reducing the amount of time needed for manual target and organ delineations.

  2. A Case Study of Collaboration with Multi-Robots and Its Effect on Children's Interaction

    ERIC Educational Resources Information Center

    Hwang, Wu-Yuin; Wu, Sheng-Yi

    2014-01-01

    Learning how to carry out collaborative tasks is critical to the development of a student's capacity for social interaction. In this study, a multi-robot system was designed for students. In three different scenarios, students controlled robots in order to move dice; we then examined their collaborative strategies and their behavioral…

  3. An overview of the stereo correlation and triangulation formulations used in DICe.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, Daniel Z.

    This document provides a detailed overview of the stereo correlation algorithm and triangulation formulation used in the Digital Image Correlation Engine (DICe) to triangulate three dimensional motion in space given the image coordinates and camera calibration parameters.
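    The report documents DICe's specific stereo formulation; as a generic reminder of what triangulation from calibrated image coordinates involves, the sketch below implements standard linear (DLT) triangulation of a single point from two 3x4 projection matrices. The camera intrinsics, rotation and baseline are invented for the example and are not DICe calibration values.

```python
import numpy as np

def triangulate(P_left, P_right, uv_left, uv_right):
    """Linear (DLT) triangulation of one 3-D point from two calibrated views.
    P_* are 3x4 projection matrices, uv_* are pixel coordinates (u, v)."""
    (u1, v1), (u2, v2) = uv_left, uv_right
    A = np.vstack([u1 * P_left[2] - P_left[0],
                   v1 * P_left[2] - P_left[1],
                   u2 * P_right[2] - P_right[0],
                   v2 * P_right[2] - P_right[1]])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]   # de-homogenize

# Synthetic check: project a known 3-D point with two hypothetical cameras.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
R = np.array([[0.9998, 0, 0.02], [0, 1, 0], [-0.02, 0, 0.9998]])   # small yaw
P2 = K @ np.hstack([R, np.array([[-0.1], [0.0], [0.0]])])          # 10 cm baseline
X_true = np.array([0.05, -0.02, 1.0, 1.0])
uv1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]
uv2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate(P1, P2, uv1, uv2))   # approximately [0.05, -0.02, 1.0]
```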

  4. A 45° saw-dicing process applied to a glass substrate for wafer-level optical splitter fabrication for optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Maciel, M. J.; Costa, C. G.; Silva, M. F.; Gonçalves, S. B.; Peixoto, A. C.; Ribeiro, A. Fernando; Wolffenbuttel, R. F.; Correia, J. H.

    2016-08-01

    This paper reports on the development of a technology for the wafer-level fabrication of an optical Michelson interferometer, which is an essential component in a micro opto-electromechanical system (MOEMS) for a miniaturized optical coherence tomography (OCT) system. The MOEMS consists of a titanium dioxide/silicon dioxide dielectric beam splitter and chromium/gold micro-mirrors. These optical components are deposited on 45° tilted surfaces to allow the horizontal/vertical separation of the incident beam in the final micro-integrated system. The fabrication process consists of 45° saw dicing of a glass substrate and the subsequent deposition of dielectric multilayers and metal layers. The 45° saw dicing is fully characterized in this paper, which also includes an analysis of the roughness. The optimum process results in surfaces with a roughness of 19.76 nm (rms). The saw-dicing recipe that yields a high-quality final surface is a compromise between the dicing blade’s grit size (#1200) and the cutting speed (0.3 mm s⁻¹). The proposed wafer-level fabrication allows rapid and low-cost processing, high compactness and the possibility of wafer-level alignment/assembly with other optical micro components for OCT integrated imaging.

  5. Air Quality Impact of Diffuse and Inefficient Combustion Emissions in Africa (DICE-Africa).

    PubMed

    Marais, Eloise A; Wiedinmyer, Christine

    2016-10-04

    Anthropogenic pollution in Africa is dominated by diffuse and inefficient combustion sources, as electricity access is low and motorcycles and outdated cars proliferate. These sources are missing, out-of-date, or misrepresented in state-of-the-science emission inventories. We address these deficiencies with a detailed inventory of Diffuse and Inefficient Combustion Emissions in Africa (DICE-Africa) for 2006 and 2013. Fuelwood for energy is the largest emission source in DICE-Africa, but grows from 2006 to 2013 at a slower rate than charcoal production and use, and gasoline and diesel for motorcycles, cars, and generators. Only kerosene use and gas flaring decline. Increase in emissions from 2006 to 2013 in this work is consistent with trends in satellite observations of formaldehyde and NO₂, but much slower than the explosive growth projected with a fuel consumption model. Seasonal biomass burning is considered a large pollution source in Africa, but we estimate comparable emissions of black carbon and higher emissions of nonmethane volatile organic compounds from DICE-Africa. Nitrogen oxide (NOₓ ≡ NO + NO₂) emissions are much lower than from biomass burning. We use GEOS-Chem to estimate that the largest contribution of DICE-Africa to annual mean surface fine particulate matter (PM₂.₅) is >5 μg m⁻³ in populous Nigeria.

  6. Unders and Overs: Using a Dice Game to Illustrate Basic Probability Concepts

    ERIC Educational Resources Information Center

    McPherson, Sandra Hanson

    2015-01-01

    In this paper, the dice game "Unders and Overs" is described and presented as an active learning exercise to introduce basic probability concepts. The implementation of the exercise is outlined and the resulting presentation of various probability concepts is described.
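    The probabilities that drive Unders and Overs follow directly from the 36 equally likely outcomes for the sum of two dice: 15/36 for a sum under 7, 6/36 for exactly 7 and 15/36 for over 7. The short simulation below, written independently of the paper, confirms these values empirically.

```python
import numpy as np

rng = np.random.default_rng(42)
rolls = rng.integers(1, 7, size=(100_000, 2)).sum(axis=1)   # sums of two fair dice

for label, mask, theory in [("under 7", rolls < 7, 15 / 36),
                            ("exactly 7", rolls == 7, 6 / 36),
                            ("over 7", rolls > 7, 15 / 36)]:
    print(f"{label:9s}  simulated {mask.mean():.4f}   theoretical {theory:.4f}")
```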

  7. DICE: A novel tumor surveillance mechanism-a new therapy for cancer?

    PubMed

    Peter, Marcus E

    2014-01-01

    The conventional view of CD95 (Fas/APO-1) is that it is a dedicated apoptosis-inducing receptor with important functions in immune cell homeostasis and in viral and tumor defense. There is an emerging recognition, however, that CD95 also has multiple non-apoptotic activities. In the context of cancer, CD95 was shown to have tumor-promoting activities, and the concept of this new function of CD95 in cancer is gaining traction. Recently, we showed that not only is CD95 a growth promoter for cancer cells, but, paradoxically, when either CD95 or CD95 ligand (CD95L) is removed, that virtually all cancer cells die through a process we have named DICE (death induced by CD95R/L elimination). In this perspective, I outline a hypothesis regarding the physiological function of DICE, and why it may be possible to use induction of DICE to treat many, if not most, cancers.

  8. Expressing gambling-related cognitive biases in motor behaviour: rolling dice to win prizes.

    PubMed

    Lim, Matthew S M; Bowden-Jones, Henrietta; Rogers, Robert D

    2014-09-01

    Cognitive perspectives on gambling propose that biased thinking plays a significant role in sustaining gambling participation and, in vulnerable individuals, gambling problems. One prominent set of cognitive biases includes illusions of control involving beliefs that it is possible to influence random gaming events. Sociologists have reported that (some) gamblers believe that it is possible to throw dice in different ways to achieve gaming outcomes (e.g., 'dice-setting' in craps). However, experimental demonstrations of these phenomena are lacking. Here, we asked regular gamblers to roll a computer-simulated but fair 6-sided die for monetary prizes. Gamblers allowed the die to roll for longer when attempting to win higher-value bets, and when attempting to hit high winning numbers. This behaviour was exaggerated in gamblers motivated to keep gambling following the experience of almost-winning in gambling games. These results suggest that gambling cognitive biases find expression in the motor behaviour of rolling dice for monetary prizes, possibly reflecting embodied substrates.

  9. Initiative in Concurrent Engineering (DICE). Phase 1.

    DTIC Science & Technology

    1990-02-09

    and power of commercial and military electronics systems. The continual evolution of HDE technology offers far greater flexibility in circuit design... powerful magnetic field of the permanent magnets in the Sawyer motors. This makes it possible to have multiple robots in the workcell and to have them... Controller. The Adept IC was chosen because of its extensive processing power, integrated grayscale vision, standard 28 industrial I/O control

  10. Quantum systems as embarrassed colleagues: what do tax evasion and state tomography have in common?

    NASA Astrophysics Data System (ADS)

    Ferrie, Chris; Blume-Kohout, Robin

    2011-03-01

    Quantum state estimation (a.k.a. "tomography") plays a key role in designing quantum information processors. As a problem, it resembles probability estimation - e.g. for classical coins or dice - but with some subtle and important discrepancies. We demonstrate an improved classical analogue that captures many of these differences: the "noisy coin." Observations on noisy coins are unreliable - much like soliciting sensitive information such as one's tax preparation habits. So, like a quantum system, it cannot be sampled directly. Unlike standard coins or dice, whose worst-case estimation risk scales as 1/N for all states, noisy coins (and quantum states) have a worst-case risk that scales as 1/√N and is overwhelmingly dominated by nearly-pure states. The resulting optimal estimation strategies for noisy coins are surprising and counterintuitive. We demonstrate some important consequences for quantum state estimation - in particular, that adaptive tomography can recover the 1/N risk scaling of classical probability estimation.

  11. Variations on a Simple Dice Game

    ERIC Educational Resources Information Center

    Heafner, Joe

    2018-01-01

    I begin my introductory astronomy course with a unit on critical thinking that focuses on, among other things, the differences between the "scientific method" as frequently presented in textbooks and actual scientific practice. One particular classroom activity uses a simple dice game to simulate observation of a natural phenomenon and…

  12. Prism-type holographic optical element design and verification for the blue-light small-form-factor optical pickup head.

    PubMed

    Shih, Hsi-Fu; Chiu, Yi; Cheng, Stone; Lee, Yuan-Chin; Lu, Chun-Shin; Chen, Yung-Chih; Chiou, Jin-Chern

    2012-08-20

    This paper presents the prism-type holographic optical element (PT-HOE) design for a small-form-factor (SFF) optical pickup head (OPH). The surface of the PT-HOE was simulated by three steps of optimization and generated by binary optics. Its grating pattern was fabricated on the inclined plane of a microprism by using the standard photolithography and specific dicing procedures. The optical characteristics of the device were verified. Based on the virtual image method, the SFF-OPH with the device was assembled and realized.

  13. Psychophysics of Complex Auditory and Speech Stimuli

    DTIC Science & Technology

    1993-10-31

    unexpected, and does not seem to have a direct counterpart in the extensive research on pitch perception. Experiment 2 was designed to quantify our... project is to use different procedures to provide converging evidence on the nature of perceptual spaces for speech categories. Completed research... prior speech research on classification procedures may have led to errors. Thus, the opposite (falling F2 & F3) transitions lead to somewhat ambiguous

  14. 21 CFR 155.200 - Certain other canned vegetables.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... of the sweet pepper plant Whole; halves or halved; pieces; dice or diced; strips; chopped. Red sweet peppers Red-ripe pods of the sweet pepper plant Do. Pimientos or pimentos Red-ripe pods of the pimiento.... (v) Spice. (vi) A vinegar. (vii) Green peppers or red peppers which may be dried. (viii) Mint leaves...

  15. Sensory and chemical changes in tomato sauces during storage.

    PubMed

    Landy, Pascale; Boucon, Claire; Kooyman, Gonnie M; Musters, Pieter A D; Rosing, Ed A E; De Joode, Teun; Laan, Jan; Haring, Peter G M

    2002-05-22

    The present work aimed to identify the key odorants of tomato sauces responsible for the flavor change during storage. Products made from paste or canned tomatoes were stored at 25 and 40 degrees C. Sensory properties and quantification of the key odorants were measured and correlated. Significant sensory changes appeared after 1 and 3 months at 25 degrees C in the respective dice and paste sauces (p < 0.01). The dice sauce was characterized by a steep loss of the sensory quality during the early storage and then by identical changes within the same time span at 25 and 40 degrees C. In the paste sauce the sensory deterioration was slower than for the dice sauce and occurred more extensively at 40 degrees C than at 25 degrees C. Correlation between sensory and instrumental data revealed that the source of sensory changes should be (E,E)-deca-2,4-dienal in the dice sauce. The sensory change in the paste sauce could be due to acetaldehyde, methylpropanal, 3-methylbutanal, oct-1-en-3-one, 3-methylbutanoic acid, deca-2,4-dienal, 2-methoxyphenol, and beta-damascenone.

  16. The Star Schema Benchmark and Augmented Fact Table Indexing

    NASA Astrophysics Data System (ADS)

    O'Neil, Patrick; O'Neil, Elizabeth; Chen, Xuedong; Revilak, Stephen

    We provide a benchmark measuring star schema queries retrieving data from a fact table with Where clause column restrictions on dimension tables. Clustering is crucial to performance with modern disk technology, since retrievals with filter factors down to 0.0005 are now performed most efficiently by sequential table search rather than by indexed access. DB2’s Multi-Dimensional Clustering (MDC) provides methods to "dice" the fact table along a number of orthogonal "dimensions", but only when these dimensions are columns in the fact table. The diced cells cluster fact rows on several of these "dimensions" at once so queries restricting several such columns can access crucially localized data, with much faster query response. Unfortunately, columns of dimension tables of a star schema are not usually represented in the fact table. In this paper, we show a simple way to adjoin physical copies of dimension columns to the fact table, dicing data to effectively cluster query retrieval, and explain how such dicing can be achieved on database products other than DB2. We provide benchmark measurements to show successful use of this methodology on three commercial database products.

  17. A Combined Desorption Ionization by Charge Exchange (DICE) and Desorption Electrospray Ionization (DESI) Source for Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Chan, Chang-Ching; Bolgar, Mark S.; Miller, Scott A.; Attygalle, Athula B.

    2011-01-01

    A source that couples the desorption ionization by charge exchange (DICE) and desorption electrospray ionization (DESI) techniques together was demonstrated to broaden the range of compounds that can be analyzed in a single mass spectrometric experiment under ambient conditions. A tee union was used to mix the spray reagents into a partially immiscible blend before this mixture was passed through a conventional electrospray (ES) probe capillary. Using this technique, compounds that are ionized more efficiently by the DICE method and those that are ionized better with the DESI procedure could be analyzed simultaneously. For example, hydroquinone, which is not detected when subjected to DESI-MS in the positive-ion generation mode, or the sodium adduct of guaifenesin, which is not detected when examined by DICE-MS, could both be detected in one experiment when the two techniques were combined. The combined technique was able to generate the molecular ion, proton and metal adduct from the same compound. When coupled to a tandem mass spectrometer, the combined source enabled the generation of product ion spectra from the molecular ion and the [M + H]+ or [M + metal]+ ions of the same compound without the need to physically change the source from DICE to DESI. The ability to record CID spectra of both the molecular ion and adduct ions in a single mass spectrometric experiment adds a new dimension to the array of mass spectrometric methods available for structural studies.

  18. Development of a novel cell sorting method that samples population diversity in flow cytometry.

    PubMed

    Osborne, Geoffrey W; Andersen, Stacey B; Battye, Francis L

    2015-11-01

    Flow cytometry based electrostatic cell sorting is an important tool in the separation of cell populations. Existing instruments can sort single cells into multi-well collection plates, and keep track of cell of origin and sorted well location. However, current single-cell sorting results reflect the population distribution and fail to capture the population diversity. Software was designed that implements a novel sorting approach, "Slice and Dice Sorting," that links a graphical representation of a multi-well plate to logic that ensures that single cells are sampled and sorted from all areas defined by the sort region/s. Therefore the diversity of the total population is captured, and the more frequently occurring or rarer cell types are all sampled. The sorting approach was tested computationally and using functional cell-based assays. Computationally we demonstrate that conventional single-cell sorting can sample as little as 50% of the population diversity, depending on the population distribution, and that Slice and Dice sorting samples much more of the variety present within a cell population. We then show by sorting single cells into wells using the Slice and Dice sorting method that there are cells sorted using this method that would be either rarely sorted, or not sorted at all, using conventional single-cell sorting approaches. The present study demonstrates a novel single-cell sorting method that samples much more of the population diversity than current methods. It has implications in clonal selection, stem cell sorting, single cell sequencing and any areas where population heterogeneity is of importance. © 2015 International Society for Advancement of Cytometry.
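    The contrast between conventional and "Slice and Dice" sorting can be sketched numerically: conventional single-cell sorting draws cells in proportion to their density, whereas slicing the sort gate into bins and drawing one cell per bin spreads the sorted cells across the full range of the measured parameter. The Python sketch below is a conceptual illustration with hypothetical fluorescence values, not the instrument software described in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical fluorescence values inside one sort gate:
# a dominant population plus a rare, brighter shoulder.
cells = np.concatenate([rng.normal(100, 8, 9500), rng.normal(160, 10, 500)])

n_slices = n_sorted = 20
edges = np.linspace(cells.min(), cells.max(), n_slices + 1)

# Conventional single-cell sorting: picks follow the density of the population.
conventional = rng.choice(cells, size=n_sorted, replace=False)

# "Slice and Dice"-style sorting: one cell drawn from every non-empty slice.
slice_and_dice = np.array([rng.choice(cells[(cells >= lo) & (cells < hi)])
                           for lo, hi in zip(edges[:-1], edges[1:])
                           if np.any((cells >= lo) & (cells < hi))])

def slices_covered(sample):
    idx = np.clip(np.digitize(sample, edges) - 1, 0, n_slices - 1)
    return len(np.unique(idx))

print("slices covered, conventional:   ", slices_covered(conventional))
print("slices covered, slice-and-dice: ", slices_covered(slice_and_dice))
```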

  19. Survival or growth of inoculated Escherichia coli O157:H7 and Salmonella on yellow onions (Allium cepa) under conditions simulating food service and consumer handling and storage.

    PubMed

    Lieberman, Vanessa M; Zhao, Irene Y; Schaffner, Donald W; Danyluk, Michelle D; Harris, Linda J

    2015-01-01

    Whole and diced yellow onions (Allium cepa) were inoculated with five-strain cocktails of rifampin-resistant Escherichia coli O157:H7 or Salmonella and stored under conditions to simulate food service or consumer handling. The inoculum was grown in broth (for both whole and diced onion experiments) or on agar plates (for whole onion experiments). Marked circles (3.3 cm in diameter) on the outer papery skin of whole onions were spot inoculated (10 μl in 10 drops) at 7 log CFU per circle, and onions were stored at 4°C, 30 to 50 % relative humidity, or at ambient conditions (23°C, 30 to 50 % relative humidity). Diced onions were inoculated at 3 log CFU/g and then stored in open or closed containers at 4°C or ambient conditions. Previously inoculated and ambient-stored diced onions were also mixed 1:9 (wt/wt) with refrigerated uninoculated freshly diced onions and stored in closed containers at ambient conditions. Inoculated pathogens were recovered in 0.1 % peptone and plated onto selective and nonselective media supplemented with 50 μg/ml rifampin. Both E. coli O157:H7 and Salmonella populations declined more rapidly on onion skins when the inoculum was prepared in broth rather than on agar. Agar-prepared E. coli O157:H7 and Salmonella declined by 0.4 and 0.3 log CFU per sample per day, respectively, at ambient conditions; at 4°C the rates of reduction were 0.08 and 0.06 log CFU per sample per day for E. coli O157:H7 and Salmonella, respectively. Populations of E. coli O157:H7 and Salmonella did not change over 6 days of storage at 4°C in diced onions. Lag times of 6 to 9 h were observed with freshly inoculated onion at ambient conditions; no lag was observed when previously inoculated and uninoculated onions were mixed. Growth rates at ambient conditions were 0.2 to 0.3 log CFU/g/h for E. coli O157:H7 and Salmonella in freshly inoculated onion and 0.2 log CFU/g/h in mixed product. Diced onions support pathogen growth and should be kept refrigerated.

  20. Psychometric Evaluation of the Demographic Index of Cultural Exposure (DICE) in Two Mexican-Origin Community Samples

    ERIC Educational Resources Information Center

    Cruz, Rick A.; Wilkinson, Anna V.; Bondy, Melissa L.; Koehly, Laura M.

    2012-01-01

    Reliability and validity evidence is provided for the Demographic Index of Cultural Exposure (DICE), consisting of six demographic proxy indicators of acculturation, within two community samples of Mexican-origin adults (N= 497 for each sample). Factor analytic procedures were used to examine the common variance shared between the six demographic…

  1. Gaming the Law of Large Numbers

    ERIC Educational Resources Information Center

    Hoffman, Thomas R.; Snapp, Bart

    2012-01-01

    Many view mathematics as a rich and wonderfully elaborate game. In turn, games can be used to illustrate mathematical ideas. Fibber's Dice, an adaptation of the game Liar's Dice, is a fast-paced game that rewards gutsy moves and favors the underdog. It also brings to life concepts arising in the study of probability. In particular, Fibber's Dice…

  2. A digitally implemented communications experiment utilizing the communications technology satellite, Hermes

    NASA Technical Reports Server (NTRS)

    Jackson, H. D.; Fiala, J.

    1980-01-01

    Developments which will reduce the costs associated with the distribution of satellite services are considered, with emphasis on digital communication link implementation. A digitally implemented communications experiment (DICE), which demonstrates the flexibility and efficiency of digital transmission of television video and audio, telephone voice, and high-bit-rate data, is described. The utilization of the DICE system in a full duplex teleconferencing mode is addressed. Demonstration teleconferencing results obtained during the conduct of two sessions of the 7th AIAA Communication Satellite Systems Conference are discussed. Finally, the results of link characterization tests conducted to determine (1) relationships between the Hermes channel 1 EIRP and DICE model performance and (2) channel spacing criteria for acceptable multichannel operation are presented.

  3. Klein tunneling in the α -T3 model

    NASA Astrophysics Data System (ADS)

    Illes, E.; Nicol, E. J.

    2017-06-01

    We investigate Klein tunneling for the α-T3 model, which interpolates between graphene and the dice lattice via the parameter α. We study transmission across two types of electrostatic interfaces: sharp potential steps and sharp potential barriers. We find both interfaces to be perfectly transparent at normal incidence over the full range of the parameter α. For other angles of incidence, we find that transmission is enhanced with increasing α. For the dice lattice, we find perfect, all-angle transmission across a potential step for incoming electrons with energy equal to half of the height of the potential step. This is analogous to the "super", all-angle transmission reported for the dice lattice for Klein tunneling across a potential barrier.

  4. Platelet-Rich Fibrin Improves the Viability of Diced Cartilage Grafts in a Rabbit Model.

    PubMed

    Göral, Ali; Aslan, Cem; Bolat Küçükzeybek, Betül; Işık, Dağhan; Hoşnuter, Mübin; Durgun, Mustafa

    2016-04-01

    Diced cartilage may be wrapped with synthetic or biological materials before grafting to a recipient site. These materials have unique advantages and disadvantages, and a gold standard is not available. The authors investigated the effects of platelet-rich fibrin (PRF) on the survival of cartilage grafts in a rabbit model. In this experimental study, diced cartilage pieces from the ears of 9 male rabbits were left unwrapped or were wrapped with PRF, oxidized regenerated cellulose, or fascia. Specimens then were placed into subcutaneous pockets prepared on the backs of the rabbits. The animals were sacrificed 2 months after the procedure, and the grafts were excised for macroscopic and histopathologic examination. The cartilage graft wrapped with PRF showed superior viability compared with the cartilage graft wrapped with oxidized regenerated cellulose. No significant differences were found among the other groups. The groups were not significantly different in terms of rates of inflammation, fibrosis, or vascularization. PRF enhances the viability of diced cartilage grafts and should be considered an appropriate biological wrapping material for cartilage grafting. © 2016 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com.

  5. Applying 4-H Judging Strategies to Board, Dice, and Card Games: Developing Skills in Urban and Suburban Youths

    ERIC Educational Resources Information Center

    Brandt, Brian; Stowe, James

    2017-01-01

    Most 4-H judging events involve livestock or other traditional 4-H projects. Consequently, many urban and suburban youths miss out on building life skills developed through judging. In a nontraditional approach to 4-H judging, such youths play board, dice, and card games and then judge the games using the practice of giving oral reasons. The…

  6. A Framework for Robust Multivariable Optimization of Integrated Circuits in Space Applications

    NASA Technical Reports Server (NTRS)

    DuMonthier, Jeffrey; Suarez, George

    2013-01-01

    Application Specific Integrated Circuit (ASIC) design for space applications involves multiple challenges of maximizing performance, minimizing power and ensuring reliable operation in extreme environments. This is a complex multidimensional optimization problem which must be solved early in the development cycle of a system, because the time required for testing and qualification severely limits opportunities to modify and iterate. Manual design techniques, which generally involve simulation at one or a small number of corners with a very limited set of simultaneously variable parameters in order to make the problem tractable, are inefficient and not guaranteed to achieve the best possible results within the performance envelope defined by the process and environmental requirements. What is required is a means to automate design parameter variation, allow the designer to specify operational constraints and performance goals, and to analyze the results in a way which facilitates identifying the tradeoffs defining the performance envelope over the full set of process and environmental corner cases. The system developed by the Mixed Signal ASIC Group (MSAG) at the Goddard Space Flight Center is implemented as a framework of software modules, templates and function libraries. It integrates CAD tools and a mathematical computing environment, and can be customized for new circuit designs with only a modest amount of effort as most common tasks are already encapsulated. Customization is required for simulation test benches to determine performance metrics and for cost function computation. Templates provide a starting point for both while toolbox functions minimize the code required. Once a test bench has been coded to optimize a particular circuit, it is also used to verify the final design. The combination of test bench and cost function can then serve as a template for similar circuits or be re-used to migrate the design to different processes by re-running it with the new process-specific device models. The system has been used in the design of time-to-digital converters for laser ranging and time-of-flight mass spectrometry to optimize analog, mixed signal and digital circuits such as charge sensitive amplifiers, comparators, delay elements, radiation-tolerant dual interlocked cell (DICE) flip-flops and two-of-three voter gates.
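    The workflow described here (sweep design parameters, evaluate a cost or performance metric at every process and environmental corner, and keep only designs that meet their constraints at the worst corner) can be caricatured in a few lines. In the Python sketch below the design knobs, corner names and "simulation" are placeholders invented for illustration; they do not represent the MSAG framework, its CAD-tool integration or any real device model.

```python
import itertools

bias_ua  = [5, 10, 20]          # hypothetical design knobs
width_um = [1.0, 2.0, 4.0]
corners  = ["tt_25C", "ss_125C", "ff_-55C"]   # process/temperature corners

def simulate(bias, width, corner):
    """Stand-in for a circuit simulation; returns (delay_ns, power_uW)."""
    derate = {"tt_25C": 1.0, "ss_125C": 1.4, "ff_-55C": 0.8}[corner]
    return derate * 10.0 / (bias * width), bias * width * 0.5

survivors = []
for bias, width in itertools.product(bias_ua, width_um):
    worst_delay = max(simulate(bias, width, c)[0] for c in corners)
    power = simulate(bias, width, "tt_25C")[1]
    if worst_delay < 2.0:                    # constraint: meet timing at the worst corner
        survivors.append((power, worst_delay, bias, width))

for power, delay, bias, width in sorted(survivors):   # rank feasible designs by power
    print(f"bias={bias} uA  width={width} um  worst delay={delay:.2f} ns  power={power:.1f} uW")
```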

  7. Case Study 2: Using Games Based on Giant Dice and Time Restrictions to Enable Creativity When Teaching Artistic or Creative Subjects

    ERIC Educational Resources Information Center

    Barnard, Dan

    2017-01-01

    This case study draws on some experiments I have been doing in the use of dice in the ideas generation phase of a creative project. It draws on workshops I have run with creative technology students at Goldsmiths, with a range of adults at the Counterplay Conference in Aarhus (Denmark) and the Playful Learning Conference at Manchester Metropolitan…

  8. Integrated Assessment of Carbon Dioxide Removal

    NASA Astrophysics Data System (ADS)

    Rickels, W.; Reith, F.; Keller, D.; Oschlies, A.; Quaas, M. F.

    2018-03-01

    To maintain the chance of keeping the average global temperature increase below 2°C and to limit long-term climate change, removing carbon dioxide from the atmosphere (carbon dioxide removal, CDR) is becoming increasingly necessary. We analyze optimal and cost-effective climate policies in the dynamic integrated assessment model (IAM) of climate and the economy (DICE2016R) and investigate (1) the utilization of (ocean) CDR under different climate objectives, (2) the sensitivity of policies with respect to carbon cycle feedbacks, and (3) how well carbon cycle feedbacks are captured in the carbon cycle models used in state-of-the-art IAMs. Overall, the carbon cycle model in DICE2016R shows clear improvements compared to its predecessor, DICE2013R, capturing much better long-term dynamics and also oceanic carbon outgassing due to excess oceanic storage of carbon from CDR. However, this comes at the cost of a (too) tight short-term remaining emission budget, limiting the model's suitability for accurately analyzing low-emission scenarios. With DICE2016R, compliance with the 2°C goal is no longer feasible without negative emissions via CDR. Overall, the optimal amount of CDR has to take into account (1) the emission substitution effect and (2) compensation for carbon cycle feedbacks.
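    The carbon-cycle component discussed here is a linear three-reservoir box model (atmosphere, upper ocean/biosphere, deep ocean). The sketch below uses that structure with illustrative placeholder coefficients, not the calibrated DICE2016R values, to show the outgassing effect the authors highlight: once net emissions turn negative under CDR, the ocean reservoirs partially release carbon back to the atmosphere, so atmospheric carbon falls by less than the cumulative amount removed.

```python
import numpy as np

# Columns sum to 1, so carbon is conserved; the coefficients are placeholders.
phi = np.array([[0.880, 0.196, 0.0000],
                [0.120, 0.797, 0.0015],
                [0.000, 0.007, 0.9985]])

def step(M, emissions):
    """Advance reservoir stocks M (GtC) one period with net emissions to the atmosphere."""
    return phi @ M + np.array([emissions, 0.0, 0.0])

M = np.array([850.0, 460.0, 1740.0])   # illustrative initial stocks
for e in [40.0] * 10 + [-20.0] * 10:   # emissions, then net removal (CDR)
    M = step(M, e)
    print(f"atmosphere {M[0]:7.1f}   upper ocean {M[1]:7.1f}   deep ocean {M[2]:8.1f}")
```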

  9. Final-Approach Spacing Aids (FASA) evaluation for terminal-area, time-based air traffic control

    NASA Technical Reports Server (NTRS)

    Credeur, Leonard; Capron, William R.; Lohr, Gary W.; Crawford, Daniel J.; Tang, Dershuen A.; Rodgers, William G., Jr.

    1993-01-01

    A jointly funded (NASA/FAA) real-time simulation study was conducted at NASA Langley Research Center to gather comparative performance data among three candidate final-approach spacing aid (FASA) display formats. Several objective measures of controller performance and their display eye-scan behavior, as well as subjective workload and rating questionnaires, were used. For each of two representative pattern-speed procedures (a 170-knot procedure and a 210-knot procedure with speed control aiding), data were gathered from twelve FAA controllers using four final-controller display format conditions (manual/ARTS 3, graphic marker, DICE countdown, and centerline slot marker). Measured runway separations were more precise with both the graphic marker and DICE countdown formats than with the centerline slot marker, and both (graphic and DICE) improved precision relative to the manual/ARTS 3 format. For three separate rating criteria, the subject controllers ranked the FASA formats in the same order: graphic marker, DICE countdown, and centerline slot marker. The increased precision measured with the 210-knot pattern-speed procedure may indicate the potential for the application of speed-control aiding where higher pattern speeds are practical after the base-to-final turn. Also presented are key FASA issues, a rationale for the formats selected for testing, and their description.

  10. Capillary-Driven Microfluidic Chips for Miniaturized Immunoassays: Efficient Fabrication and Sealing of Chips Using a "Chip-Olate" Process.

    PubMed

    Temiz, Yuksel; Delamarche, Emmanuel

    2017-01-01

    The fabrication of silicon-based microfluidic chips is invaluable in supporting the development of many microfluidic concepts for research in the life sciences and in vitro diagnostic applications such as the realization of miniaturized immunoassays using capillary-driven chips. Although extremely abundant, the literature covering microfluidic chip fabrication and assay development might not have properly addressed the challenge of fabricating microfluidic chips at the wafer level or the need to dice wafers to release chips that must then be further processed, cleaned, rinsed, and dried one by one. Here, we describe the "chip-olate" process wherein microfluidic structures are formed on a silicon wafer, followed by partial dicing, cleaning, and drying steps. Then, integration of reagents (if any) can be done, followed by lamination of a sealing cover. Breaking the partially diced wafer by hand yields individual chips ready for use.

  11. Utility of Cartilage Grafts Wrapped With Amniotic Membrane in Dorsal Nasal Augmentation.

    PubMed

    Atespare, Altay; Kara, Hakan; Ilter, Erdin; Boyaci, Zerrin; Çelik, Öner; Midi, Ahmet

    2016-06-01

    The success of rhinoplasty may be compromised by postoperative problems such as a rough and rigid nasal dorsum. Biological grafts or alloplastic materials are required to overcome and correct nasal dorsal deformities and irregularities. The purpose of this experimental study was to compare pure cartilage graft, cartilage graft wrapped in amniotic membrane, and diced cartilage grafts wrapped in amniotic membrane for soft tissue augmentation. All grafts were transplanted through a subcutaneous tunnel created in the nasal dorsum of 18 rats, 6 in each group. After 3 months of follow-up, the histopathological changes in all groups were evaluated by light microscopy and volumetric measurements. With regard to cartilage viability, cartilage wrapped in amniotic membrane had a higher success rate than the pure cartilage graft. A further increased success rate was found in the diced group. In soft tissue augmentation after rhinoplasty surgery, diced cartilage wrapped in amniotic membrane in particular keeps the graft viable and in place.

  12. Splenomegaly Segmentation using Global Convolutional Kernels and Conditional Generative Adversarial Networks

    PubMed Central

    Huo, Yuankai; Xu, Zhoubing; Bao, Shunxing; Bermudez, Camilo; Plassard, Andrew J.; Liu, Jiaqi; Yao, Yuang; Assad, Albert; Abramson, Richard G.; Landman, Bennett A.

    2018-01-01

    Spleen volume estimation using automated image segmentation techniques may be used to detect splenomegaly (abnormally enlarged spleen) on Magnetic Resonance Imaging (MRI) scans. In recent years, Deep Convolutional Neural Network (DCNN) segmentation methods have demonstrated advantages for abdominal organ segmentation. However, variations in both size and shape of the spleen on MRI images may result in large false positive and false negative labeling when deploying DCNN-based methods. In this paper, we propose the Splenomegaly Segmentation Network (SSNet) to address spatial variations when segmenting extraordinarily large spleens. SSNet was designed based on the framework of image-to-image conditional generative adversarial networks (cGAN). Specifically, the Global Convolutional Network (GCN) was used as the generator to reduce false negatives, while the Markovian discriminator (PatchGAN) was used to alleviate false positives. A cohort of clinically acquired 3D MRI scans (both T1 weighted and T2 weighted) from patients with splenomegaly were used to train and test the networks. The experimental results demonstrated a mean Dice coefficient of 0.9260 and a median Dice coefficient of 0.9262 using SSNet on independently tested MRI volumes of patients with splenomegaly.
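    For readers unfamiliar with the Markovian discriminator mentioned above, a PatchGAN judges overlapping image patches as real or fake rather than producing a single score per image. The PyTorch sketch below is a generic minimal version conditioned on an image/mask pair; the channel counts and layer depths are assumptions chosen for illustration and do not reproduce the SSNet architecture.

```python
import torch
import torch.nn as nn

class PatchDiscriminator(nn.Module):
    """Minimal Markovian (PatchGAN-style) discriminator: outputs a grid of
    real/fake scores, one per receptive-field patch, instead of a single scalar."""
    def __init__(self, in_channels=2, base=64):
        super().__init__()
        def block(cin, cout, norm=True):
            layers = [nn.Conv2d(cin, cout, 4, stride=2, padding=1)]
            if norm:
                layers.append(nn.InstanceNorm2d(cout))
            layers.append(nn.LeakyReLU(0.2, inplace=True))
            return layers
        self.net = nn.Sequential(
            *block(in_channels, base, norm=False),
            *block(base, base * 2),
            *block(base * 2, base * 4),
            nn.Conv2d(base * 4, 1, 4, stride=1, padding=1),  # patch score map
        )

    def forward(self, image, mask):
        # Conditional input: MRI slice concatenated with a candidate spleen mask.
        return self.net(torch.cat([image, mask], dim=1))

d = PatchDiscriminator()
scores = d(torch.randn(1, 1, 128, 128), torch.randn(1, 1, 128, 128))
print(scores.shape)   # torch.Size([1, 1, 15, 15]) for 128x128 inputs
```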

  13. PZT Thin-Film Micro Probe Device with Dual Top Electrodes

    NASA Astrophysics Data System (ADS)

    Luo, Chuan

    Lead zirconate titanate (PZT) thin-film actuators have been studied intensively for years because of their potential applications in many fields. In this dissertation, a PZT thin-film micro probe device is designed, fabricated, studied, and proven to be acceptable as an intracochlear acoustic actuator. The micro probe device takes the form of a cantilever with a PZT thin-film diaphragm at the tip of the probe. The tip portion of the probe will be implanted in the cochlea later in animal tests to prove its feasibility in hearing rehabilitation. The contribution of the dissertation is three-fold. First, a dual top electrodes design, consisting of a center electrode and an outer electrode, is developed to improve actuation displacement of the PZT thin-film diaphragm. The improvement by the dual top electrodes design is studied via a finite element model. When the dimensions of the dual electrodes are optimized, the displacement of the PZT thin-film diaphragm increases about 30%. A PZT thin-film diaphragm with dual top electrodes is fabricated to prove the concept, and experimental results confirm the predictions from the finite element analyses. Moreover, the dual electrode design can accommodate the presence of significant residual stresses in the PZT thin-film diaphragm by changing the phase difference between the two electrodes. Second, a PZT thin-film micro probe device is fabricated and tested. The fabrication process consists of PZT thin-film deposition and deep reactive ion etching (DRIE). The uniqueness of the fabrication process is an automatic dicing mechanism that allows a large number of probes to be released easily from the wafer. Moreover, the fabrication is very efficient, because the DRIE process will form the PZT thin-film diaphragm and the special dicing mechanism simultaneously. After the probes are fabricated, they are tested with various possible implantation depths (i.e., boundary conditions). Experimental results show that future implantation depths should be less than 3 mm in order to guarantee the first resonant frequency above 60 kHz. Finally, a package for the PZT thin-film micro probe device is developed to ensure its proper function in an aqueous environment, such as the inside of the cochlea. The package is an insulation layer of parylene coating on the probe. A finite element analysis indicates that a coating thickness of less than 1 μm will reduce the PZT diaphragm displacement by less than 10%. A special fixture is designed to hold a large number of probes for parylene deposition of a thickness of 250 nm. A packaged probe is then submerged in deionized water and functions properly for at least 55 hours. Displacement and impedance of the probe are measured via a laser Doppler vibrometer and an impedance analyzer, respectively. Experimental results show that displacement of the PZT diaphragm increases about 30% in two hours after the probe is submerged in the deionized water. The impedance measurement shows consistent trends. A hypothesis to explain this unusual phenomenon is diffusion of water molecules into the PZT thin film. High-resolution SEM images of the probe indicate the presence of numerous nano-pores in the surface of the PZT thin film, indirectly confirming the hypothesis. Keywords: PZT, Thin-Film, Dual Electrodes, Parylene Coating, Aqueous Environment, Cochlear Implant

  14. High-energy-resolution diced spherical quartz analyzers for resonant inelastic X-ray scattering

    DOE PAGES

    Said, Ayman H.; Gog, Thomas; Wieczorek, Michael; ...

    2018-02-15

    A novel diced spherical quartz analyzer for use in resonant inelastic X-ray scattering (RIXS) is introduced, achieving an unprecedented energy resolution of 10.53 meV at the Ir L3 absorption edge (11.215 keV). In this work the fabrication process and the characterization of the analyzer are presented, and an example of a RIXS spectrum of magnetic excitations in a Sr3Ir2O7 sample is shown.

  15. How LO Can You GO? Using the Dice-Based Golf Game GOLO to Illustrate Inferences on Proportions and Discrete Probability Distributions

    ERIC Educational Resources Information Center

    Stephenson, Paul; Richardson, Mary; Gabrosek, John; Reischman, Diann

    2009-01-01

    This paper describes an interactive activity that revolves around the dice-based golf game GOLO. The GOLO game can be purchased at various retail locations or online at igolo.com. In addition, the game may be played online free of charge at igolo.com. The activity is completed in four parts. The four parts can be used in a sequence or they can be…

  16. Do Amnesic Patients with Korsakoff's Syndrome Use Feedback when Making Decisions under Risky Conditions? An Experimental Investigation with the Game of Dice Task with and without Feedback

    ERIC Educational Resources Information Center

    Brand, Matthias; Pawlikowski, Mirko; Labudda, Kirsten; Laier, Christian; von Rothkirch, Nadine; Markowitsch, Hans J.

    2009-01-01

    We investigated the role of feedback processing in decision making under risk conditions in 50 patients with amnesia in the course of alcoholic Korsakoff's syndrome (KS). Half of the patients were administered the Game of Dice Task (GDT) and the remaining 25 patients were examined with a modified version of the GDT in which no feedback was…

  17. A novel fully automatic multilevel thresholding technique based on optimized intuitionistic fuzzy sets and tsallis entropy for MR brain tumor image segmentation.

    PubMed

    Kaur, Taranjit; Saini, Barjinder Singh; Gupta, Savita

    2018-03-01

    In the present paper, a hybrid multilevel thresholding technique that combines intuitionistic fuzzy sets and Tsallis entropy has been proposed for the automatic delineation of the tumor from magnetic resonance images having vague boundaries and poor contrast. This novel technique takes into account both the image histogram and the uncertainty information for the computation of multiple thresholds. The benefit of the methodology is that it provides fast and improved segmentation for complex tumorous images with imprecise gray levels. To further boost the computational speed, a mutation-based particle swarm optimization is used to select the most optimal threshold combination. The accuracy of the proposed segmentation approach has been validated on simulated and real low-grade glioma tumor volumes taken from the MICCAI brain tumor segmentation (BRATS) challenge 2012 dataset and on clinical tumor images, so as to corroborate its generality and novelty. The designed technique achieves an average Dice overlap equal to 0.82010, 0.78610 and 0.94170 for the three datasets. Further, a comparative analysis has also been made with eight existing multilevel thresholding implementations so as to show the superiority of the designed technique. The results indicate a mean improvement in Dice of 4.00% (p < 0.005), 9.60% (p < 0.005) and 3.58% (p < 0.005), respectively, in contrast to the fuzzy Tsallis approach.
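    For orientation, the sketch below implements only the basic single-threshold Tsallis entropy criterion on a gray-level histogram; it is a simplified stand-in, not the authors' hybrid intuitionistic-fuzzy, multilevel, PSO-optimized method, and the entropic index q and the synthetic image are assumptions.

      import numpy as np

      def tsallis_threshold(image, q=0.8, bins=256):
          # Exhaustively search for the threshold t maximizing
          # S_A(t) + S_B(t) + (1 - q) * S_A(t) * S_B(t),
          # where S_A and S_B are the Tsallis entropies of the two classes.
          hist, _ = np.histogram(image, bins=bins, range=(0, bins))
          p = hist / hist.sum()
          best_t, best_score = 0, -np.inf
          for t in range(1, bins - 1):
              pa, pb = p[:t].sum(), p[t:].sum()
              if pa == 0 or pb == 0:
                  continue
              sa = (1.0 - np.sum((p[:t] / pa) ** q)) / (q - 1.0)
              sb = (1.0 - np.sum((p[t:] / pb) ** q)) / (q - 1.0)
              score = sa + sb + (1.0 - q) * sa * sb
              if score > best_score:
                  best_t, best_score = t, score
          return best_t

      # Synthetic 8-bit image with two intensity populations (hypothetical data)
      rng = np.random.default_rng(0)
      img = np.concatenate([rng.normal(60, 10, 5000), rng.normal(170, 15, 2000)]).clip(0, 255)
      print(tsallis_threshold(img))  # lands between the two populations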

  18. Interactions between sanitizers and packaging gas compositions and their effects on the safety and quality of fresh-cut onions (Allium cepa L.).

    PubMed

    Page, Natalie; González-Buesa, Jaime; Ryser, Elliot T; Harte, Janice; Almenar, Eva

    2016-02-02

    Onions are one of the most widely utilized vegetables worldwide, with demand for fresh-cut onions steadily increasing. Due to heightened safety concerns and consumer demand, the implications of sanitizing and packaging on fresh-cut onion safety and quality need to be better understood. The objective of this study was to investigate the effect of produce sanitizers, in-package atmospheres, and their interactions on the growth of Salmonella Typhimurium, mesophilic aerobic bacteria, yeast and mold, and the physico-chemical quality of diced onions to determine the best sanitizer and in-package atmosphere combination for both safety and quality. Diced onions were inoculated or not with S. Typhimurium, sanitized in sodium hypochlorite, peroxyacetic acid, or liquid chlorine dioxide, and then packaged in either polylactic acid bags containing superatmospheric O2, elevated CO2/reduced O2, or air, or in polyethylene terephthalate snap-fit containers. Throughout 14 days of storage at 7 °C, packaged diced onions were assessed for their safety (S. Typhimurium), and quality (mesophilic aerobic bacteria, yeasts and molds, physico-chemical analyses, and descriptive and consumer acceptance sensory panels). While sanitizer affected (P<0.05) fewer parameters (S. Typhimurium, mesophiles, yeasts and molds, headspace CO2, weight loss, and pH), in-package atmosphere had a significant (P<0.05) effect on all parameters evaluated. Two-way interactions between sanitizer and atmosphere that affected S. Typhimurium and pH were identified whereas 3-way interactions (sanitizer, atmosphere and time) were only observed for headspace CO2. Sodium hypochlorite and elevated CO2/reduced O2 was the best sanitizer and in-package atmosphere combination for enhancing the safety and quality of packaged diced onions. In addition, this combination led to diced onions acceptable for purchase after 2 weeks of storage by trained and consumer panels. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamichhane, N; Johnson, P; Chinea, F

    Purpose: To evaluate the correlation between image features and the accuracy of manually drawn target contours on synthetic PET images. Methods: A digital PET phantom was used in combination with Monte Carlo simulation to create a set of 26 simulated PET images featuring a variety of tumor shapes and activity heterogeneity. These tumor volumes were used as a gold standard in comparisons with manual contours delineated by 10 radiation oncologists on the simulated PET images. Metrics used to evaluate segmentation accuracy included the Dice coefficient, false positive Dice, false negative Dice, symmetric mean absolute surface distance, and absolute volumetric difference. Image features extracted from the simulated tumors consisted of volume, shape complexity, mean curvature, and intensity contrast along with five texture features derived from the gray-level neighborhood difference matrices, including contrast, coarseness, busyness, strength, and complexity. Correlations between these features and contouring accuracy were examined. Results: Contour accuracy was reasonably well correlated with a variety of image features. The Dice coefficient ranged from 0.7 to 0.90 and was correlated closely with contrast (r=0.43, p=0.02) and complexity (r=0.5, p<0.001). False negative Dice ranged from 0.10 to 0.50 and was correlated closely with contrast (r=0.68, p<0.001) and complexity (r=0.66, p<0.001). Absolute volumetric difference ranged from 0.0002 to 0.67 and was correlated closely with coarseness (r=0.46, p=0.02) and complexity (r=0.49, p=0.008). Symmetric mean absolute difference ranged from 0.02 to 1 and was correlated closely with mean curvature (r=0.57, p=0.02) and contrast (r=0.6, p=0.001). Conclusion: The long-term goal of this study is to assess whether contouring variability can be reduced by providing feedback to the practitioner based on image feature analysis. The results are encouraging and will be used to develop a statistical model which will enable a prediction of contour accuracy based purely on image feature analysis.
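    The feature-accuracy relationships above are reported as Pearson correlations; the brief sketch below shows how such an r and p value might be computed for one image feature against per-case Dice scores. The arrays are hypothetical stand-ins, not the study's data.

      import numpy as np
      from scipy.stats import pearsonr

      # Hypothetical per-case values: an image feature (e.g., intensity contrast)
      # paired with the Dice score of the corresponding manual contour.
      rng = np.random.default_rng(3)
      contrast = rng.uniform(0.1, 1.0, size=26)
      dice = 0.70 + 0.20 * contrast + rng.normal(0.0, 0.03, size=26)

      r, p = pearsonr(contrast, dice)
      print(f"r = {r:.2f}, p = {p:.3g}")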

  20. Hypernatremia in Dice snakes (Natrix tessellata) from a coastal population: implications for osmoregulation in marine snake prototypes.

    PubMed

    Brischoux, François; Kornilev, Yurii V

    2014-01-01

    The widespread relationship between salt-excreting structures (e.g., salt glands) and marine life strongly suggests that the ability to regulate salt balance has been crucial during the transition to marine life in tetrapods. Elevated natremia (plasma sodium) recorded in several marine snake species suggests that the development of a tolerance toward hypernatremia, in addition to salt gland development, has been a critical feature in the evolution of marine snakes. However, data from intermediate stages (species lacking salt glands but occasionally using salty environments) are lacking to draw a comprehensive picture of the evolution of a euryhaline physiology in these organisms. In this study, we assessed natremia of free-ranging Dice snakes (Natrix tessellata, a predominantly freshwater natricine lacking salt glands) from a coastal population in Bulgaria. Our results show that coastal N. tessellata can display hypernatremia (up to 195.5 mmol x l(-1)) without any apparent effect on several physiological and behavioural traits (e.g., hematocrit, body condition, foraging). More generally, a review of natremia in species situated along a continuum of habitat use between fresh- and seawater shows that snake species display a concomitant tolerance toward hypernatremia, even in species lacking salt glands. Collectively, these data suggest that a physiological tolerance toward hypernatremia has been critical during the evolution of a euryhaline physiology, and may well have preceded the evolution of salt glands.

  1. Hypernatremia in Dice Snakes (Natrix tessellata) from a Coastal Population: Implications for Osmoregulation in Marine Snake Prototypes

    PubMed Central

    Brischoux, François; Kornilev, Yurii V.

    2014-01-01

    The widespread relationship between salt-excreting structures (e.g., salt glands) and marine life strongly suggests that the ability to regulate salt balance has been crucial during the transition to marine life in tetrapods. Elevated natremia (plasma sodium) recorded in several marine snake species suggests that the development of a tolerance toward hypernatremia, in addition to salt gland development, has been a critical feature in the evolution of marine snakes. However, data from intermediate stages (species lacking salt glands but occasionally using salty environments) are lacking to draw a comprehensive picture of the evolution of a euryhaline physiology in these organisms. In this study, we assessed natremia of free-ranging Dice snakes (Natrix tessellata, a predominantly freshwater natricine lacking salt glands) from a coastal population in Bulgaria. Our results show that coastal N. tessellata can display hypernatremia (up to 195.5 mmol.l−1) without any apparent effect on several physiological and behavioural traits (e.g., hematocrit, body condition, foraging). More generally, a review of natremia in species situated along a continuum of habitat use between fresh- and seawater shows that snake species display a concomitant tolerance toward hypernatremia, even in species lacking salt glands. Collectively, these data suggest that a physiological tolerance toward hypernatremia has been critical during the evolution of a euryhaline physiology, and may well have preceded the evolution of salt glands. PMID:24658047

  2. Determinants of the Army Applicant Job Choice Decision and the Development of a Decision Support Tool for the Enlistment Incentive Review Board

    DTIC Science & Technology

    2012-02-01

    USMA) to assess which preferences of youth could be influenced by incentives (Joles, Charbonneau, & Barr, 1998; Henry, Dice, & Davis, 2001). More... Charbonneau, and Barr (1998) and Henry, Dice, and Davis (2001) conducted surveys to assess the extent to which preferences of youth could be influenced by...Army enlistment initiatives. West Point, NY: United States Military Academy. Joles, J., Charbonneau, S., & Barr, D. (1998, February). An enlistment

  3. Sojourner Rover Near The Dice

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Lander image of rover near The Dice (three small rocks behind the rover) and Yogi on sol 22. Color (red, green, and blue filters at 6:1 compression) image shows dark rocks, bright red dust, dark red soil exposed in rover tracks, and dark (black) soil. The APXS is in view at the rear of the vehicle, and the forward stereo cameras and laser light stripers are in shadow just below the front edge of the solar panel.

    NOTE: original caption as published in Science Magazine

  4. DPSSL for direct dicing and drilling of dielectrics

    NASA Astrophysics Data System (ADS)

    Ashkenasi, David; Schwagmeier, M.

    2007-02-01

    New strategies in laser micro processing of glasses and other optically transparent materials are being developed with increasing interest and intensity using diode pumped solid state laser (DPSSL) systems that generate short or ultra-short pulses in the optical spectrum with good beam quality. Utilizing non-linear absorption channels, it can be demonstrated that ns green (532 nm) laser light can scribe, dice, full-body cut and drill (flat) borofloat and borosilicate glasses with good quality. Beyond the correct choice of laser parameters, intelligent laser beam management plays an important role in successful micro processing of glass. This approach represents a very interesting alternative where standard methods such as diamond dicing, CO2 laser treatment or water jet cutting show severe limitations, especially for certain types of optical materials and/or geometric conditions. Application-oriented processing examples using different DPSSL systems generating ns pulsed light at 532 nm in TEM00 at average powers up to 10 W are presented and discussed with respect to potential applications in display technology, micro electronics and optics.

  5. Computer simulations of comet- and asteroidlike bodies passing through the Venusian atmosphere: Preliminary results on atmospheric and ground shock effects

    NASA Technical Reports Server (NTRS)

    Roddy, D.; Hatfield, D.; Hassig, P.; Rosenblatt, M.; Soderblom, L.; Dejong, E.

    1992-01-01

    We have completed computer simulations that model shock effects in the venusian atmosphere caused during the passage of two cometlike bodies 100 m and 1000 m in diameter and an asteroidlike body 10 km in diameter. Our objective is to examine hypervelocity-generated shock effects in the venusian atmosphere for bodies of different types and sizes in order to understand the following: (1) their deceleration and depth of penetration through the atmosphere; and (2) the onset of possible ground-surface shock effects such as splotches, craters, and ejecta formations. The three bodies were chosen to include both a range of general conditions applicable to Venus as well as three specific cases of current interest. These calculations use a new multiphase computer code (DICE-MAZ) designed by California Research & Technology for shock-dynamics simulations in complex environments. The code was tested and calibrated in large-scale explosion, cratering, and ejecta research. It treats a wide range of different multiphase conditions, including material types (vapor, melt, solid), particle-size distributions, and shock-induced dynamic changes in velocities, pressures, temperatures (internal energies), densities, and other related parameters, all of which were recorded in our calculations.

  6. Using a Pareto-optimal solution set to characterize trade-offs between a broad range of values and preferences in climate risk management

    NASA Astrophysics Data System (ADS)

    Garner, Gregory; Reed, Patrick; Keller, Klaus

    2015-04-01

    Integrated assessment models (IAMs) are often used to inform the design of climate risk management strategies. Previous IAM studies have broken important new ground on analyzing the effects of parametric uncertainties, but they are often silent on the implications of uncertainties regarding the problem formulation. Here we use the Dynamic Integrated model of Climate and the Economy (DICE) to analyze the effects of uncertainty surrounding the definition of the objective(s). The standard DICE model adopts a single objective to maximize a weighted sum of utilities of per-capita consumption. Decision makers, however, are often concerned with a broader range of values and preferences that may be poorly captured by this a priori definition of utility. We reformulate the problem by introducing three additional objectives that represent values such as (i) reliably limiting global average warming to two degrees Celsius and minimizing (ii) the costs of abatement and (iii) the climate change damages. We use advanced multi-objective optimization methods to derive a set of Pareto-optimal solutions over which decision makers can trade off and assess performance criteria a posteriori. We illustrate the potential for myopia in the traditional problem formulation and discuss the capability of this multi-objective formulation to provide decision support.
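    To illustrate the kind of output such a reformulation produces, the sketch below extracts a Pareto-optimal (non-dominated) set from randomly generated candidate policies scored on several objectives. It is a generic illustration under made-up objective values, not the DICE model or the evolutionary solver used in the study.

      import numpy as np

      def pareto_front(costs):
          # Boolean mask of non-dominated rows; every column is treated as a cost to minimize.
          n = costs.shape[0]
          nondominated = np.ones(n, dtype=bool)
          for i in range(n):
              if not nondominated[i]:
                  continue
              # j dominates i if it is no worse on every objective and strictly better on one.
              dominates_i = np.all(costs <= costs[i], axis=1) & np.any(costs < costs[i], axis=1)
              if dominates_i.any():
                  nondominated[i] = False
          return nondominated

      # Hypothetical candidates scored on (abatement cost, climate damages, warming-target shortfall)
      rng = np.random.default_rng(1)
      candidates = rng.random((200, 3))
      front = candidates[pareto_front(candidates)]
      print(front.shape)  # the trade-off set decision makers can explore a posteriori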

  7. WE-G-207-05: Relationship Between CT Image Quality, Segmentation Performance, and Quantitative Image Feature Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, J; Nishikawa, R; Reiser, I

    Purpose: Segmentation quality can affect quantitative image feature analysis. The objective of this study is to examine the relationship between computed tomography (CT) image quality, segmentation performance, and quantitative image feature analysis. Methods: A total of 90 pathology-proven breast lesions in 87 dedicated breast CT images were considered. An iterative image reconstruction (IIR) algorithm was used to obtain CT images with different quality. With different combinations of 4 variables in the algorithm, this study obtained a total of 28 different qualities of CT images. Two imaging tasks/objectives were considered: 1) segmentation and 2) classification of the lesion as benign or malignant. Twenty-three image features were extracted after segmentation using a semi-automated algorithm and 5 of them were selected via a feature selection technique. Logistic regression was trained and tested using leave-one-out cross-validation and its area under the ROC curve (AUC) was recorded. The standard deviation of a homogeneous portion and the gradient of a parenchymal portion of an example breast were used as estimates of image noise and sharpness. The DICE coefficient was computed using a radiologist's drawing on the lesion. Mean DICE and AUC were used as performance metrics for each of the 28 reconstructions. The relationships between segmentation and classification performance under different reconstructions were compared. Distributions (median, 95% confidence interval) of DICE and AUC for each reconstruction were also compared. Results: Moderate correlation (Pearson's rho = 0.43, p-value = 0.02) between DICE and AUC values was found. However, the variation between DICE and AUC values for each reconstruction increased as the image sharpness increased. There was a combination of IIR parameters that resulted in the best segmentation with the worst classification performance. Conclusion: There are certain images that yield better segmentation or classification performance. The best segmentation result does not necessarily lead to the best classification result. This work has been supported in part by grants from the NIH R21-EB015053. R Nishikawa receives royalties from Hologic, Inc.

  8. SU-F-J-58: Evaluation of RayStation Hybrid Deformable Image Registration for Accurate Contour Propagation in Adaptive Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rong, Y; Rao, S; Daly, M

    Purpose: Adaptive radiotherapy requires complete new sets of regions of interest (ROIs) to be delineated on the mid-treatment CT images. This work aims at evaluating the accuracy of the RayStation hybrid deformable image registration (DIR) algorithm for its overall integrity and accuracy in contour propagation for adaptive planning. Methods: The hybrid DIR is based on the combination of an intensity-based algorithm and anatomical information provided by contours. Patients who received mid-treatment CT scans were identified for the study, including six lung patients (two mid-treatment CTs) and six head-and-neck (HN) patients (one mid-treatment CT). DIR-propagated ROIs were compared with physician-drawn ROIs for 8 ITVs and 7 critical organs (lungs, heart, esophagus, etc.) for the lung patients, as well as 14 GTVs and 20 critical organs (mandible, eyes, parotids, etc.) for the HN patients. Volume difference, center of mass (COM) difference, and Dice index were used for evaluation. The clinical relevance of each propagated ROI was scored by two physicians and correlated with the Dice index. Results: For critical organs, good agreement (Dice>0.9) was seen on all 7 for lung patients and 13 out of 20 for HN patients, with the rest requiring minimal edits. For targets, COM differences were within 5 mm on average for all patients. For lung, 5 out of 8 ITVs required minimal edits (Dice 0.8–0.9), with the remaining 2 needing to be re-drawn due to their small volumes (<10 cc). However, the propagated HN GTVs resulted in relatively low Dice values (0.5–0.8) due to their small volumes (3–40 cc) and high variability, among which 2 required re-drawing due to new nodal targets identified on the mid-treatment CT scans. Conclusion: The hybrid DIR algorithm was found to be clinically useful and efficient for lung and HN patients, especially for propagated critical organ ROIs. It has the potential to significantly improve the workflow in adaptive planning.

  9. Interobserver variability in identification of breast tumors in MRI and its implications for prognostic biomarkers and radiogenomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saha, Ashirbani, E-mail: as698@duke.edu; Grimm, La

    Purpose: To assess the interobserver variability of readers when outlining breast tumors in MRI, study the reasons behind the variability, and quantify the effect of the variability on algorithmic imaging features extracted from breast MRI. Methods: Four readers annotated breast tumors from the MRI examinations of 50 patients from one institution using a bounding box to indicate a tumor. All of the annotated tumors were biopsy proven cancers. The similarity of bounding boxes was analyzed using Dice coefficients. An automatic tumor segmentation algorithm was used to segment tumors from the readers’ annotations. The segmented tumors were then compared between readersmore » using Dice coefficients as the similarity metric. Cases showing high interobserver variability (average Dice coefficient <0.8) after segmentation were analyzed by a panel of radiologists to identify the reasons causing the low level of agreement. Furthermore, an imaging feature, quantifying tumor and breast tissue enhancement dynamics, was extracted from each segmented tumor for a patient. Pearson’s correlation coefficients were computed between the features for each pair of readers to assess the effect of the annotation on the feature values. Finally, the authors quantified the extent of variation in feature values caused by each of the individual reasons for low agreement. Results: The average agreement between readers in terms of the overlap (Dice coefficient) of the bounding box was 0.60. Automatic segmentation of tumor improved the average Dice coefficient for 92% of the cases to the average value of 0.77. The mean agreement between readers expressed by the correlation coefficient for the imaging feature was 0.96. Conclusions: There is a moderate variability between readers when identifying the rectangular outline of breast tumors on MRI. This variability is alleviated by the automatic segmentation of the tumors. Furthermore, the moderate interobserver variability in terms of the bounding box does not translate into a considerable variability in terms of assessment of enhancement dynamics. The authors propose some additional ways to further reduce the interobserver variability.« less

  10. A Comparative Study of the Hypoxia PET Tracers [{sup 18}F]HX4, [{sup 18}F]FAZA, and [{sup 18}F]FMISO in a Preclinical Tumor Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peeters, Sarah G.J.A., E-mail: sarah.peeters@maastrichtuniversity.nl; Zegers, Catharina M.L.; Lieuwes, Natasja G.

    Purpose: Several individual clinical and preclinical studies have shown the possibility of evaluating tumor hypoxia by using noninvasive positron emission tomography (PET). The current study compared 3 hypoxia PET tracers frequently used in the clinic, [{sup 18}F]FMISO, [{sup 18}F]FAZA, and [{sup 18}F]HX4, in a preclinical tumor model. Tracer uptake was evaluated for the optimal time point for imaging, tumor-to-blood ratios (TBR), spatial reproducibility, and sensitivity to oxygen modification. Methods and Materials: PET/computed tomography (CT) images of rhabdomyosarcoma R1-bearing WAG/Rij rats were acquired at multiple time points post injection (p.i.) with one of the hypoxia tracers. TBR values were calculated, andmore » reproducibility was investigated by voxel-to-voxel analysis, represented as correlation coefficients (R) or Dice similarity coefficient of the high-uptake volume. Tumor oxygen modifications were induced by exposure to either carbogen/nicotinamide treatment or 7% oxygen breathing. Results: TBR was stabilized and maximal at 2 hours p.i. for [{sup 18}F]FAZA (4.0 ± 0.5) and at 3 hours p.i. for [{sup 18}F]HX4 (7.2 ± 0.7), whereas [{sup 18}F]FMISO showed a constant increasing TBR (9.0 ± 0.8 at 6 hours p.i.). High spatial reproducibility was observed by voxel-to-voxel comparisons and Dice similarity coefficient calculations on the 30% highest uptake volume for both [{sup 18}F]FMISO (R = 0.86; Dice coefficient = 0.76) and [{sup 18}F]HX4 (R = 0.76; Dice coefficient = 0.70), whereas [{sup 18}F]FAZA was less reproducible (R = 0.52; Dice coefficient = 0.49). Modifying the hypoxic fraction resulted in enhanced mean standardized uptake values for both [{sup 18}F]HX4 and [{sup 18}F]FAZA upon 7% oxygen breathing. Only [{sup 18}F]FMISO uptake was found to be reversible upon exposure to nicotinamide and carbogen. Conclusions: This study indicates that each tracer has its own strengths and, depending on the question to be answered, a different tracer can be put forward.« less

  11. Hospital-acquired listeriosis outbreak caused by contaminated diced celery--Texas, 2010.

    PubMed

    Gaul, Linda Knudson; Farag, Noha H; Shim, Trudi; Kingsley, Monica A; Silk, Benjamin J; Hyytia-Trees, Eija

    2013-01-01

    Listeria monocytogenes causes often-fatal infections affecting mainly immunocompromised persons. Sources of hospital-acquired listeriosis outbreaks can be difficult to identify. We investigated a listeriosis outbreak spanning 7 months and involving 5 hospitals. Outbreak-related cases were identified by pulsed-field gel electrophoresis (PFGE) and confirmed by multiple-locus variable-number tandem-repeat analysis (MLVA). We conducted patient interviews, medical records reviews, and hospital food source evaluations. Food and environmental specimens were collected at a hospital (hospital A) where 6 patients had been admitted before listeriosis onset; these specimens were tested by culture, polymerase chain reaction (PCR), and PFGE. We collected and tested food and environmental samples at the implicated processing facility. Ten outbreak-related patients were immunocompromised by ≥1 underlying conditions or treatments; 5 died. All patients had been admitted to or visited an acute-care hospital during their possible incubation periods. The outbreak strain of L. monocytogenes was isolated from chicken salad and its diced celery ingredient at hospital A, and in 19 of >200 swabs of multiple surfaces and in 8 of 11 diced celery products at the processing plant. PCR testing detected Listeria in only 3 of 10 environmental and food samples from which it was isolated by culturing. The facility was closed, products were recalled, and the outbreak ended. Contaminated diced celery caused a baffling, lengthy outbreak of hospital-acquired listeriosis. PCR testing often failed to detect the pathogen, suggesting its reliability should be further evaluated. Listeriosis risk should be considered in fresh produce selections for immunocompromised patients.

  12. Automated MRI parcellation of the frontal lobe

    PubMed Central

    Ranta, Marin E.; Chen, Min; Crocetti, Deana; Prince, Jerry L.; Subramaniam, Krish; Fischl, Bruce; Kaufmann, Walter E.; Mostofsky, Stewart H.

    2014-01-01

    Examination of associations between specific disorders and physical properties of functionally relevant frontal lobe sub-regions is a fundamental goal in neuropsychiatry. Here we present and evaluate automated methods of frontal lobe parcellation with the programs FreeSurfer(FS) and TOADS-CRUISE(T-C), based on the manual method described in Ranta et al. (2009) in which sulcal-gyral landmarks were used to manually delimit functionally relevant regions within the frontal lobe: i.e., primary motor cortex, anterior cingulate, deep white matter, premotor cortex regions (supplementary motor complex, frontal eye field and lateral premotor cortex) and prefrontal cortex (PFC) regions (medial PFC, dorsolateral PFC, inferior PFC, lateral orbitofrontal cortex (OFC) and medial OFC). Dice's coefficient, a measure of overlap, and percent volume difference were used to measure the reliability between manual and automated delineations for each frontal lobe region. For FS, mean Dice's coefficient for all regions was 0.75 and percent volume difference was 21.2%. For T-C the mean Dice's coefficient was 0.77 and the mean percent volume difference for all regions was 20.2%. These results, along with a high degree of agreement between the two automated methods (mean Dice's coefficient = 0.81, percent volume difference = 12.4%) and a proof-of-principle group difference analysis that highlights the consistency and sensitivity of the automated methods, indicate that the automated methods are valid techniques for parcellation of the frontal lobe into functionally relevant sub-regions. Thus, the methodology has the potential to increase efficiency, statistical power and reproducibility for population analyses of neuropsychiatric disorders with hypothesized frontal lobe contributions. PMID:23897577

  13. Multi-scale hippocampal parcellation improves atlas-based segmentation accuracy

    NASA Astrophysics Data System (ADS)

    Plassard, Andrew J.; McHugo, Maureen; Heckers, Stephan; Landman, Bennett A.

    2017-02-01

    Known for its distinct role in memory, the hippocampus is one of the most studied regions of the brain. Recent advances in magnetic resonance imaging have allowed for high-contrast, reproducible imaging of the hippocampus. Typically, a trained rater takes 45 minutes to manually trace the hippocampus and delineate the anterior from the posterior segment at millimeter resolution. As a result, there has been a significant desire for automated and robust segmentation of the hippocampus. In this work we use a population of 195 atlases based on T1-weighted MR images with the left and right hippocampus delineated into the head and body. We initialize the multi-atlas segmentation to a region directly around each lateralized hippocampus to both speed up and improve the accuracy of registration. This initialization allows for the incorporation of nearly 200 atlases, an accomplishment which would typically involve hundreds of hours of computation per target image. The proposed segmentation results in a Dice similarity coefficient over 0.9 for the full hippocampus. This result outperforms a multi-atlas segmentation using the BrainCOLOR atlases (Dice 0.85) and FreeSurfer (Dice 0.75). Furthermore, the head and body delineation resulted in a Dice coefficient over 0.87 for both structures. The head and body volume measurements also show high reproducibility on the Kirby 21 reproducibility population (R2 greater than 0.95, p < 0.05 for all structures). This work represents the first result in an ongoing effort to develop a robust tool for measurement of the hippocampus and other temporal lobe structures.
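    As a minimal illustration of how multi-atlas labels can be combined once the atlases are registered to the target, the sketch below applies simple majority voting over a stack of label maps. This is a generic stand-in, not the specific fusion used in the study, and the tiny label arrays are hypothetical.

      import numpy as np

      def majority_vote_fusion(atlas_labels):
          # Fuse registered atlas label maps of shape (n_atlases, X, Y, Z)
          # by taking the most frequent label at each voxel.
          labels = np.unique(atlas_labels)
          votes = np.stack([(atlas_labels == lab).sum(axis=0) for lab in labels])
          return labels[np.argmax(votes, axis=0)]

      # Three hypothetical 2x2x2 atlases voting on background (0), head (1) and body (2)
      a = np.array([[[[1, 0], [2, 0]], [[1, 0], [2, 2]]],
                    [[[1, 0], [2, 0]], [[1, 1], [2, 2]]],
                    [[[1, 1], [0, 0]], [[1, 0], [2, 2]]]])
      print(majority_vote_fusion(a))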

  14. PREFACE: DICE 2012 : Spacetime Matter Quantum Mechanics - from the Planck scale to emergent phenomena

    NASA Astrophysics Data System (ADS)

    Diósi, Lajos; Elze, Hans-Thomas; Fronzoni, Leone; Halliwell, Jonathan; Prati, Enrico; Vitiello, Giuseppe; Yearsley, James

    2013-06-01

    Presented in this volume are the Invited Lectures and the Contributed Papers of the Sixth International Workshop on Decoherence, Information, Complexity and Entropy - DICE 2012, held at Castello Pasquini, Castiglioncello (Tuscany), 17-21 September 2012. These proceedings may document to the interested public and to the wider scientific community the stimulating exchange of ideas at the meeting. The number of participants has been steadily growing over the years, reflecting an increasing attraction, if not need, of such a conference. Our intention has always been to bring together leading researchers, advanced students, and renowned scholars from various areas, in order to stimulate new ideas and their exchange across the borders of specialization. In this way, the series of meetings has successfully continued from the beginning with DICE 2002 [1], followed by DICE 2004 [2], DICE 2006 [3], DICE 2008 [4], and DICE 2010 [5]. Most recently, DICE 2012 brought together more than 120 participants representing more than 30 countries worldwide. It has been a great honour and inspiration to have Professor Yakir Aharonov (Tel Aviv) with us, who presented the opening Keynote Lecture 'The two-vector quantum formalism'. With the overarching theme 'Spacetime - Matter - Quantum Mechanics - from the Planck scale to emergent phenomena', the conference took place in the very pleasant and inspiring atmosphere of Castello Pasquini - in beautiful surroundings, overlooking a piece of Tuscany's coast. The 5-day program covered these major topics: Quantum Mechanics, Foundations and the Quantum-Classical Border; Quantum-Classical Hybrids and Many-Body Systems; Spectral Geometry, Path Integrals and Experiments; Quantum / Gravity / Spacetime; and Quantum Mechanics on all Scales? A Roundtable Discussion under the theme 'Nuovi orizzonti nella ricerca scientifica. Ci troviamo di fronte ad una rivoluzione scientifica?' ('New horizons in scientific research. Are we facing a scientific revolution?') formed an integral part of the program. With the participation of E Del Giudice (INFN & Università di Milano), F Guerra (Università 'La Sapienza', Roma) and G Vitiello (Università di Salerno), this event, traditionally dedicated to the public, drew a large audience involved in lively discussions until late. The workshop was organized by L Diósi (Budapest), H-T Elze (Pisa, chair), L Fronzoni (Pisa), J J Halliwell (London), E Prati (Milano) and G Vitiello (Salerno), with most essential help from our conference secretaries L Fratino, N Lampo, I Pozzana, and A Sonnellini, all students from Pisa, and from our former secretaries M Pesce-Rollins and L Baldini. Several institutions and sponsors supported the workshop; their representatives and, in particular, the citizens of Rosignano/Castiglioncello are deeply thanked for their generous help and kind hospitality: Comune di Rosignano - A Franchi (Sindaco di Rosignano), S Scarpellini (Segreteria sindaco), L Benini (Assessore ai lavori pubblici), M Pia (Assessore all'urbanistica); REA Rosignano Energia Ambiente s.p.a. - F Ghelardini (Presidente della REA), E Salvadori and C Peccianti (Segreteria); Associazione Armunia - A Nanni (Direttore), G Mannari (Programmazione), C Perna, F Bellini, M Nannerini, P Bruni and L Meucci (Tecnici).
    Special thanks go to G Mannari and her collaborators for advice and great help in all the practical matters that had to be dealt with in order to run the meeting at Castello Pasquini smoothly. Funds made available by Università di Pisa, Domus Galilaeana (Pisa), Centro Interdisciplinare per lo Studio dei Sistemi Complessi - CISSC (Pisa), Dipartimento di Ingegneria Industriale (Università di Salerno), Istituto Italiano per gli Studi Filosofici - IISF (Napoli), Solvay Italia SA (Rosignano), Institute of Physics Publishing - IOP (Bristol), Springer Verlag (Heidelberg), and the Hungarian Scientific Research Fund OTKA are gratefully acknowledged. Last, but not least, special thanks are due to Laura Pesce (Vitrium Galleria, San Vincenzo) for the exposition of her artwork 'arte e scienza' ('art and science') at Castello Pasquini during the conference. The papers submitted in the wake of the conference have been edited by L Diósi, H-T Elze, L Fronzoni, J J Halliwell, E Prati, G Vitiello and J Yearsley. The proceedings follow essentially the order of presentation during the conference, separating, however, invited lectures and contributed papers [6]. In the name of all participants, we would like to thank S Toms and her collaborators at IOP Publishing (Bristol) for friendly advice and most valuable immediate help during the editing process and, especially, for their continuing efforts to make the Journal of Physics: Conference Series available to all.
    Budapest, Pisa, London, Milano, Salerno, Cambridge, April 2013. Lajos Diósi, Hans-Thomas Elze, Leone Fronzoni, Jonathan Halliwell, Enrico Prati, Giuseppe Vitiello and James Yearsley
    [1] Decoherence and Entropy in Complex Systems, ed H-T Elze, Lecture Notes in Physics 633 (Berlin: Springer, 2004)
    [2] Proceedings of the Second International Workshop on Decoherence, Information, Complexity and Entropy - DICE 2004, ed H-T Elze, Braz. Journ. Phys. 35 A & 2B (2005) pp 205-529; free access at: www.sbfisica.org.br/bjp
    [3] Proceedings of the Third International Workshop on Decoherence, Information, Complexity and Entropy - DICE 2006, eds H-T Elze, L Diósi and G Vitiello, Journal of Physics: Conference Series 67 (2007); free access at: www.iop.org/EJ/toc/1742-6596/67/1
    [4] Proceedings of the Fourth International Workshop on Decoherence, Information, Complexity and Entropy - DICE 2008, eds H-T Elze, L Diósi, L Fronzoni, J J Halliwell and G Vitiello, Journal of Physics: Conference Series 174 (2009); free access at: http://www.iop.org/EJ/toc/1742-6596/174/1
    [5] Proceedings of the Fifth International Workshop on Decoherence, Information, Complexity and Entropy - DICE 2010, eds H-T Elze, L Diósi, L Fronzoni, J J Halliwell, E Prati, G Vitiello and J Yearsley, Journal of Physics: Conference Series 306 (2011); free access at: http://iopscience.iop.org/1742-6596/306/1
    [6] We regret that invited lectures by Y Aharonov, J Barbour, G Casati and X-G Wen could not be reproduced here, partly for copyright reasons

  15. New optoelectronic methodology for nondestructive evaluation of MEMS at the wafer level

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Ferguson, Curtis F.; Melson, Michael J.

    2004-02-01

    One of the approaches to fabrication of MEMS involves surface micromachining to define dies on single crystal silicon wafers, dicing of the wafers to separate the dies, and electronic packaging of the individual dies. Dicing and packaging of MEMS accounts for a large fraction of the fabrication costs, therefore, nondestructive evaluation at the wafer level, before dicing, can have significant implications on improving production yield and costs. In this paper, advances in development of optoelectronic holography (OEH) techniques for nondestructive, noninvasive, full-field of view evaluation of MEMS at the wafer level are described. With OEH techniques, quantitative measurements of shape and deformation of MEMS, as related to their performance and integrity, are obtained with sub-micrometer spatial resolution and nanometer measuring accuracy. To inspect an entire wafer with OEH methodologies, measurements of overlapping regions of interest (ROI) on a wafer are recorded and adjacent ROIs are stitched together through efficient 3D correlation analysis algorithms. Capabilities of the OEH techniques are illustrated with representative applications, including determination of optimal inspection conditions to minimize inspection time while achieving sufficient levels of accuracy and resolution.

  16. Automated MRI parcellation of the frontal lobe.

    PubMed

    Ranta, Marin E; Chen, Min; Crocetti, Deana; Prince, Jerry L; Subramaniam, Krish; Fischl, Bruce; Kaufmann, Walter E; Mostofsky, Stewart H

    2014-05-01

    Examination of associations between specific disorders and physical properties of functionally relevant frontal lobe sub-regions is a fundamental goal in neuropsychiatry. Here, we present and evaluate automated methods of frontal lobe parcellation with the programs FreeSurfer(FS) and TOADS-CRUISE(T-C), based on the manual method described in Ranta et al. [2009]: Psychiatry Res 172:147-154 in which sulcal-gyral landmarks were used to manually delimit functionally relevant regions within the frontal lobe: i.e., primary motor cortex, anterior cingulate, deep white matter, premotor cortex regions (supplementary motor complex, frontal eye field, and lateral premotor cortex) and prefrontal cortex (PFC) regions (medial PFC, dorsolateral PFC, inferior PFC, lateral orbitofrontal cortex [OFC] and medial OFC). Dice's coefficient, a measure of overlap, and percent volume difference were used to measure the reliability between manual and automated delineations for each frontal lobe region. For FS, mean Dice's coefficient for all regions was 0.75 and percent volume difference was 21.2%. For T-C the mean Dice's coefficient was 0.77 and the mean percent volume difference for all regions was 20.2%. These results, along with a high degree of agreement between the two automated methods (mean Dice's coefficient = 0.81, percent volume difference = 12.4%) and a proof-of-principle group difference analysis that highlights the consistency and sensitivity of the automated methods, indicate that the automated methods are valid techniques for parcellation of the frontal lobe into functionally relevant sub-regions. Thus, the methodology has the potential to increase efficiency, statistical power and reproducibility for population analyses of neuropsychiatric disorders with hypothesized frontal lobe contributions. Copyright © 2013 Wiley Periodicals, Inc.

  17. The Meal, Ready-to-Eat Consumed in a Cold Environment

    DTIC Science & Technology

    1990-02-23

    [Extraction fragment of a ration-item rating form: menu entries such as Beef, Diced with Gravy; Chicken a la King; Meatballs with BBQ Sauce; Beef, Ground, with Spiced Sauce; Ham Slices; Beans in Tomato Sauce; and Crackers, each followed by amount-consumed fractions (1/4, 1/2, 3/4, 1) and 1-9 rating scales.]

  18. Automated segmentation of white matter fiber bundles using diffusion tensor imaging data and a new density based clustering algorithm.

    PubMed

    Kamali, Tahereh; Stashuk, Daniel

    2016-10-01

    Robust and accurate segmentation of brain white matter (WM) fiber bundles assists in diagnosing and assessing progression or remission of neuropsychiatric diseases such as schizophrenia, autism and depression. Supervised segmentation methods are infeasible in most applications since generating gold standards is too costly. Hence, there is a growing interest in designing unsupervised methods. However, most conventional unsupervised methods require that the number of clusters be known in advance, which is not possible in most applications. The purpose of this study is to design an unsupervised segmentation algorithm for brain white matter fiber bundles which can automatically segment fiber bundles using intrinsic diffusion tensor imaging information, without any prior information or assumption about the data distributions. Here, a new density-based clustering algorithm called neighborhood distance entropy consistency (NDEC) is proposed, which discovers natural clusters within data by simultaneously utilizing both local and global density information. The performance of NDEC is compared with other state-of-the-art clustering algorithms, including chameleon, spectral clustering, DBSCAN and k-means, using publicly available Johns Hopkins University diffusion tensor imaging data. The performance of NDEC and the other employed clustering algorithms was evaluated using the Dice ratio as an external evaluation criterion and the density-based clustering validation (DBCV) index as an internal evaluation metric. Across all employed clustering algorithms, NDEC obtained the highest average Dice ratio (0.94) and DBCV value (0.71). NDEC can find clusters with arbitrary shapes and densities and consequently can be used for WM fiber bundle segmentation where there is no distinct boundary between various bundles. NDEC may also be used as an effective tool in other pattern recognition and medical diagnostic systems in which discovering natural clusters within data is a necessity. Copyright © 2016 Elsevier B.V. All rights reserved.
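    NDEC itself is not sketched here; instead, the short example below uses DBSCAN (one of the comparison methods named above) as a stand-in to show the general idea of density-based, unsupervised grouping of streamline feature vectors without fixing the number of clusters in advance. The features and parameters are hypothetical.

      import numpy as np
      from sklearn.cluster import DBSCAN

      # Hypothetical per-streamline descriptors (e.g., endpoint coordinates and mean FA)
      rng = np.random.default_rng(2)
      bundle_a = rng.normal(loc=[0.0, 0.0, 0.0, 0.45], scale=0.05, size=(100, 4))
      bundle_b = rng.normal(loc=[1.0, 1.0, 1.0, 0.60], scale=0.05, size=(120, 4))
      features = np.vstack([bundle_a, bundle_b])

      # DBSCAN groups points by local density; no cluster count is specified up front.
      labels = DBSCAN(eps=0.3, min_samples=10).fit_predict(features)
      print(np.unique(labels, return_counts=True))  # expect two dense groups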

  19. Pc as Physics Computer for Lhc ?

    NASA Astrophysics Data System (ADS)

    Jarp, Sverre; Simmins, Antony; Tang, Hong; Yaari, R.

    In the last five years, we have seen RISC workstations take over the computing scene that was once controlled by mainframes and supercomputers. In this paper we will argue that the same phenomenon might happen again. A project, active since March of this year in the Physics Data Processing group of CERN's CN division, is described in which ordinary desktop PCs running Windows (NT and 3.11) have been used to create an environment for running large LHC batch jobs (initially the DICE simulation job of Atlas). The problems encountered in porting both the CERN library and the specific Atlas codes are described, together with some encouraging benchmark results from comparisons with existing RISC workstations in use by the Atlas collaboration. The issues of establishing the batch environment (batch monitor, staging software, etc.) are also covered. Finally, a quick extrapolation of the commodity computing power available in the future is touched upon to indicate what kind of cost envelope could be sufficient for the simulation farms required by the LHC experiments.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kay, Randolph R; Campbell, David V; Shinde, Subhash L

    A modular, scalable focal plane array is provided as an array of integrated circuit dice, wherein each die includes a given amount of modular pixel array circuitry. The array of dice effectively multiplies the amount of modular pixel array circuitry to produce a larger pixel array without increasing die size. Desired pixel pitch across the enlarged pixel array is preserved by forming die stacks with each pixel array circuitry die stacked on a separate die that contains the corresponding signal processing circuitry. Techniques for die stack interconnections and die stack placement are implemented to ensure that the desired pixel pitch is preserved across the enlarged pixel array.

  1. Evaluating the Impact of a Canadian National Anatomy and Radiology Contouring Boot Camp for Radiation Oncology Residents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jaswal, Jasbir; D'Souza, Leah; Johnson, Marjorie

    Background: Radiation therapy treatment planning has advanced over the past 2 decades, with increased emphasis on 3-dimensional imaging for target and organ-at-risk (OAR) delineation. Recent studies suggest a need for improved resident instruction in this area. We developed and evaluated an intensive national educational course (“boot camp”) designed to provide dedicated instruction in site-specific anatomy, radiology, and contouring using a multidisciplinary (MDT) approach. Methods: The anatomy and radiology contouring (ARC) boot camp was modeled after prior single-institution pilot studies and a needs-assessment survey. The boot camp incorporated joint lectures from radiation oncologists, anatomists, radiologists, and surgeons, with hands-on contouring instructionmore » and small group interactive seminars using cadaveric prosections and correlative axial radiographs. Outcomes were evaluated using pretesting and posttesting, including anatomy/radiology multiple-choice questions (MCQ), timed contouring sessions (evaluated relative to a gold standard using Dice similarity metrics), and qualitative questions on satisfaction and perceived effectiveness. Analyses of pretest versus posttest scores were performed using nonparametric paired testing. Results: Twenty-nine radiation oncology residents from 10 Canadian universities participated. As part of their current training, 29%, 75%, and 21% receive anatomy, radiology, and contouring instruction, respectively. On posttest scores, the MCQ knowledge scores improved significantly (pretest mean 60% vs posttest mean 80%, P<.001). Across all contoured structures, there was a 0.20 median improvement in students' average Dice score (P<.001). For individual structures, significant Dice improvements occurred in 10 structures. Residents self-reported an improved ability to contour OARs and interpret radiographs in all anatomic sites, 92% of students found the MDT format effective for their learning, and 93% found the boot camp more effective than educational sessions at their own institutions. All of the residents (100%) would recommend this course to others. Conclusions: The ARC boot camp is an effective intervention for improving radiation oncology residents' knowledge and understanding of anatomy and radiology in addition to enhancing their confidence and accuracy in contouring.« less

  2. The De-Icing Comparison Experiment (D-ICE): A campaign for improving data retention rates of radiometric measurements under icing conditions in cold regions

    NASA Astrophysics Data System (ADS)

    Cox, C. J.; Morris, S. M.

    2017-12-01

    Longwave and shortwave radiative fluxes are fundamental quantities regularly observed globally using broadband radiometers. In cold climates, frost, rime, snow and ice (collectively, "icing") frequently build up on sensor windows, contaminating measurements. Since icing occurs under particular meteorological conditions, the associated data losses constitute a climatological bias. Furthermore, the signal caused by ice is difficult to distinguish from that of clouds, hampering efforts to separate contaminated from real data in post-processing. Because of the sensitivity of radiometers to internal temperature instabilities, there are limitations to using heat as a de-icing method. The magnitude of this problem is indicated by the large number of research institutions and commercial vendors that have developed various de-icing strategies. The D-ICE campaign has been designed to bring together a large number of currently available systems to quantitatively evaluate and compare ice-mitigation strategies and also to characterize the potentially adverse effects of the techniques themselves. For D-ICE, a variety of automated approaches making use of ventilation, heating, modified housings and alcohol spray are being evaluated alongside standard units operating with only the regularly scheduled manual cleaning by human operators at the NOAA Baseline Surface Radiation Network (BSRN) station in Utqiaġvik (formerly Barrow), Alaska. Previous experience within the BSRN community suggests that aspiration of ambient air alone may be sufficient to maintain ice-free radiometers without increasing measurement uncertainty during icing conditions, forming the main guiding hypothesis of the experiment. Icing on the sensors is monitored visually using cameras recording images every 15 minutes and quantitatively using an icing probe and a met station. The effects of applied heat on infrared loss in pyranometers will be analyzed, and the integrated effect of icing on monthly averages will be assessed by comparing ice-mitigated and unmitigated systems. The project is a community effort led by NOAA in collaboration with the BSRN Cold Climates Issues Working Group (CCIWG) in partnership with industry representatives and research institutes. The campaign will operate for a full annual cycle from August 2017 through 2018.

  3. Evaluating the impact of a Canadian national anatomy and radiology contouring boot camp for radiation oncology residents.

    PubMed

    Jaswal, Jasbir; D'Souza, Leah; Johnson, Marjorie; Tay, KengYeow; Fung, Kevin; Nichols, Anthony; Landis, Mark; Leung, Eric; Kassam, Zahra; Willmore, Katherine; D'Souza, David; Sexton, Tracy; Palma, David A

    2015-03-15

    Radiation therapy treatment planning has advanced over the past 2 decades, with increased emphasis on 3-dimensional imaging for target and organ-at-risk (OAR) delineation. Recent studies suggest a need for improved resident instruction in this area. We developed and evaluated an intensive national educational course ("boot camp") designed to provide dedicated instruction in site-specific anatomy, radiology, and contouring using a multidisciplinary (MDT) approach. The anatomy and radiology contouring (ARC) boot camp was modeled after prior single-institution pilot studies and a needs-assessment survey. The boot camp incorporated joint lectures from radiation oncologists, anatomists, radiologists, and surgeons, with hands-on contouring instruction and small group interactive seminars using cadaveric prosections and correlative axial radiographs. Outcomes were evaluated using pretesting and posttesting, including anatomy/radiology multiple-choice questions (MCQ), timed contouring sessions (evaluated relative to a gold standard using Dice similarity metrics), and qualitative questions on satisfaction and perceived effectiveness. Analyses of pretest versus posttest scores were performed using nonparametric paired testing. Twenty-nine radiation oncology residents from 10 Canadian universities participated. As part of their current training, 29%, 75%, and 21% receive anatomy, radiology, and contouring instruction, respectively. On posttest scores, the MCQ knowledge scores improved significantly (pretest mean 60% vs posttest mean 80%, P<.001). Across all contoured structures, there was a 0.20 median improvement in students' average Dice score (P<.001). For individual structures, significant Dice improvements occurred in 10 structures. Residents self-reported an improved ability to contour OARs and interpret radiographs in all anatomic sites, 92% of students found the MDT format effective for their learning, and 93% found the boot camp more effective than educational sessions at their own institutions. All of the residents (100%) would recommend this course to others. The ARC boot camp is an effective intervention for improving radiation oncology residents' knowledge and understanding of anatomy and radiology in addition to enhancing their confidence and accuracy in contouring. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Effects of gamma irradiation on microbial safety and quality of stir fry chicken dices with hot chili during storage

    NASA Astrophysics Data System (ADS)

    Chen, Qian; Cao, Mei; Chen, Hao; Gao, Peng; Fu, Yi; Liu, Mianxue; Wang, Yan; Huang, Min

    2016-10-01

    The purpose of this study was to investigate the effects of irradiation at different doses on the microbial safety, sensory quality and protein content of ready-to-eat stir fry chicken dices with hot chili (FCC) during one year of storage. Fresh chicken meat was cut into small dices and fried at approximately 180 °C for 10 min to prepare the FCC samples. The samples were vacuum-packaged and gamma irradiated at 10, 20, 30 and 40 kGy. The results suggest that irradiation at doses of 10 and 20 kGy could ensure the microbiological safety of the samples without deterioration of sensory quality. Microbial counts, sensory qualities and protein contents of the samples were investigated during one year of storage. No viable cells were observed and the samples were completely sterilized. Sensory qualities showed no significant difference after irradiation at doses of 10 and 20 kGy during the storage period. Protein contents were also not affected by irradiation at the same doses. Our results indicate that gamma irradiation at 10 and 20 kGy is effective for maintaining the shelf stability of ready-to-eat FCC products with respect to microbial safety, sensory quality and nutritional value.

  5. Lymph node segmentation by dynamic programming and active contours.

    PubMed

    Tan, Yongqiang; Lu, Lin; Bonde, Apurva; Wang, Deling; Qi, Jing; Schwartz, Lawrence H; Zhao, Binsheng

    2018-03-03

    Enlarged lymph nodes are indicators of cancer staging, and the change in their size is a reflection of treatment response. Automatic lymph node segmentation is challenging, as the boundary can be unclear and the surrounding structures complex. This work communicates a new three-dimensional algorithm for the segmentation of enlarged lymph nodes. The algorithm requires a user to draw a region of interest (ROI) enclosing the lymph node. Rays are cast from the center of the ROI, and the intersections of the rays with the boundary of the lymph node form a triangle mesh. The intersection points are determined by dynamic programming. The triangle mesh initializes an active contour, which evolves to a low-energy boundary. Three radiologists independently delineated the contours of 54 lesions from 48 patients. The Dice coefficient was used to evaluate the algorithm's performance. The mean Dice coefficient between the computer and the majority-vote results was 83.2%. The mean Dice coefficients between the three radiologists' manual segmentations were 84.6%, 86.2%, and 88.3%. The performance of this segmentation algorithm suggests its potential clinical value for quantifying enlarged lymph nodes. © 2018 American Association of Physicists in Medicine.
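
    The published algorithm finds the boundary point along each ray by dynamic programming and then refines the mesh with an active contour; the sketch below is only a simplified 2D stand-in that picks the strongest intensity change along each ray, to illustrate the ray-casting geometry. The synthetic image and all names are hypothetical, and the dynamic-programming and contour-evolution steps are deliberately omitted.

        import numpy as np

        def ray_cast_boundary(image, center, n_rays=36, max_radius=30):
            """For each ray from `center`, return the point with the largest
            intensity change along the ray -- a crude stand-in for the
            dynamic-programming boundary search of the full method."""
            cy, cx = center
            boundary = []
            for theta in np.linspace(0.0, 2.0 * np.pi, n_rays, endpoint=False):
                radii = np.arange(1, max_radius)
                ys = np.clip((cy + radii * np.sin(theta)).astype(int), 0, image.shape[0] - 1)
                xs = np.clip((cx + radii * np.cos(theta)).astype(int), 0, image.shape[1] - 1)
                profile = image[ys, xs]
                edge = int(np.argmax(np.abs(np.diff(profile)))) + 1  # strongest gradient
                boundary.append((ys[edge], xs[edge]))
            return boundary

        # Synthetic "lymph node": a bright disc of radius 12 on a dark background.
        yy, xx = np.mgrid[0:64, 0:64]
        node = ((yy - 32) ** 2 + (xx - 32) ** 2 <= 12 ** 2).astype(float)
        points = ray_cast_boundary(node, center=(32, 32))
        radii = [np.hypot(y - 32, x - 32) for y, x in points]
        print(round(float(np.mean(radii)), 1))  # close to the true radius of 12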

  6. Is decision making in hypoxia affected by pre-acclimatisation? A randomized controlled trial.

    PubMed

    Niedermeier, Martin; Weisleitner, Andreas; Lamm, Claus; Ledochowski, Larissa; Frühauf, Anika; Wille, Maria; Burtscher, Martin; Kopp, Martin

    2017-05-01

    Decision making is impaired in hypoxic environments, which may have serious or even lethal consequences for mountaineers. An acclimatisation period prior to high altitude exposures may help to overcome adverse effects of hypoxia. Thus, we investigated possible effects of short-term pre-acclimatisation on decision making in hypoxia. In a randomized controlled study design, 52 healthy participants were allocated to a hypoxia group (HG: short-term pre-acclimatisation by the use of intermittent hypoxia, 7×1 h at FiO2 = 12.6%, equivalent to 4500 m) or a control group (CG: sham pre-acclimatisation, 7×1 h at FiO2 = 20.9%, equivalent to 600 m). The number of risky decisions was assessed using the Game of Dice Task at four time points during a 12-hour stay in hypoxia (FiO2 = 12.6%). 42 (HG: 27, CG: 25) participants completed the study. The number of risky decisions was significantly reduced in the hypoxia group compared to the control group (p = 0.048, as determined by 4×2 ANCOVA; partial η² = 0.11) when the age effect on decision making was controlled for. Self-reported positive affective valence prior to decision making was negatively related to the number of risky decisions, r < -0.38. Short-term pre-acclimatisation might influence decision making in hypoxia in a positive way and might be considered a risk-reducing preparation method prior to exposures to hypoxic environments. Positive affective states seem to have a medium-sized protective effect against risky decision making. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Chip-on-Board Technology 1996 Year-end Report (Design, Manufacturing, and Reliability Study)

    NASA Technical Reports Server (NTRS)

    Le, Binh Q.; Nhan, Elbert; Maurer, Richard H.; Lew, Ark L.; Lander, Juan R.

    1996-01-01

    The major impetus for flight qualifying Chip-On-Board (COB) packaging technology is the shift in emphasis for space missions to smaller, better, and cheaper spacecraft and satellites resulting from the NASA New Millennium initiative and similar requirements in DoD-sponsored programs. The most important benefit that can potentially be derived from miniaturizing spacecraft and satellites is the significant cost saving realizable if a smaller launch vehicle may be employed. Besides the program cost saving, there are several other advantages to building COB-based space hardware. First, once a well-controlled process is established, COB can be low cost compared to standard Multi-Chip Module (MCM) technology. This cost competitiveness is regarded as a result of the generally greater availability and lower cost of Known Good Die (KGD). Coupled with the elimination of the first level of packaging (chip package), compact, high-density circuit boards can be realized with Printed Wiring Boards (PWB) that can now be made with ever-decreasing feature size in line width and via hole. Since the COB packaging technique in this study is based mainly on populating bare dice on a suitable multi-layer laminate substrate which is not hermetically sealed, die coating for protection from the environment is required. In recent years, significant improvements have been made in die coating materials which further enhance the appeal of COB. Hysol epoxies, silicone, parylene and silicon nitride are desirable because of their compatible Thermal Coefficient of Expansion (TCE) and good moisture resistance. These die coating materials have all been used in the space and other industries with varying degrees of success. COB technology, specifically silicon-nitride-coated hardware, has been flown by Lockheed on the Polar satellite. In addition, DARPA has invested a substantial amount of resources on MCM and COB-related activities recently. With COB on the verge of becoming a dominant player in DoD programs, DARPA is increasing its support of the availability of KGDs, which will help decrease their cost. Aside from the various major developments and trends in the space and defense industries that are favorable to the acceptance and widespread use of COB packaging technology, implementing COB can be appealing in other aspects. Since the interconnection interface is usually the weak link in a system, the overall circuit or system reliability may actually be improved because of the elimination of a level of interconnect/packaging at the chip. With COB, mixing packaging technologies is possible. Because some devices are only available in commercial plastic packages, populating a multi-layer laminate substrate with both bare dice and plastic-package parts is inevitable. Another attractive feature of COB is that re-workability is possible if die coating is applied only on the die top. This method allows local replacement of individual dice that were found to be defective instead of replacing an entire board. In terms of thermal management, unpackaged devices offer a shorter thermal resistance path than their packaged counterparts, thereby improving thermal sinking and heat removal from the parts.

  8. Correction tool for Active Shape Model based lumbar muscle segmentation.

    PubMed

    Valenzuela, Waldo; Ferguson, Stephen J; Ignasiak, Dominika; Diserens, Gaelle; Vermathen, Peter; Boesch, Chris; Reyes, Mauricio

    2015-08-01

    In the clinical environment, the accuracy and speed of the image segmentation process play a key role in the analysis of pathological regions. Despite advances in anatomical image segmentation, time-effective correction tools are commonly needed to improve segmentation results. Such tools must provide fast corrections with a low number of interactions and a user-independent solution. In this work we present a new interactive method for correcting image segmentations. Given an initial segmentation and the original image, our tool provides a 2D/3D environment that enables 3D shape correction through simple 2D interactions. Our scheme is based on direct manipulation of free-form deformation adapted to a 2D environment. This approach enables an intuitive and natural correction of 3D segmentation results. The developed method has been implemented in a software tool and evaluated for the task of lumbar muscle segmentation from magnetic resonance images. Experimental results show that full segmentation correction could be performed within an average correction time of 6±4 minutes and an average of 68±37 interactions, while maintaining the quality of the final segmentation result with an average Dice coefficient of 0.92±0.03.

  9. Reproducible segmentation of white matter hyperintensities using a new statistical definition.

    PubMed

    Damangir, Soheil; Westman, Eric; Simmons, Andrew; Vrenken, Hugo; Wahlund, Lars-Olof; Spulber, Gabriela

    2017-06-01

    We present a method based on a proposed statistical definition of white matter hyperintensities (WMH), which can work with any combination of conventional magnetic resonance (MR) sequences without depending on manually delineated samples. T1-weighted, T2-weighted, FLAIR, and PD sequences acquired at 1.5 Tesla from 119 subjects from the Kings Health Partners-Dementia Case Register (healthy controls, mild cognitive impairment, Alzheimer's disease) were used. The segmentation was performed using a proposed definition for WMH based on the one-tailed Kolmogorov-Smirnov test. The presented method was verified, given all possible combinations of input sequences, against manual segmentations and a high similarity (Dice 0.85-0.91) was observed. Comparing segmentations with different input sequences to one another also yielded a high similarity (Dice 0.83-0.94) that exceeded intra-rater similarity (Dice 0.75-0.91). We compared the results with those of other available methods and showed that the segmentation based on the proposed definition has better accuracy and reproducibility in the test dataset used. Overall, the presented definition is shown to produce accurate results with higher reproducibility than manual delineation. This approach can be an alternative to other manual or automatic methods not only because of its accuracy, but also due to its good reproducibility.
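
    The precise statistical definition is given in the paper; purely to illustrate how a one-tailed Kolmogorov-Smirnov test can flag a candidate region whose intensities are stochastically brighter than normal-appearing white matter, here is a small SciPy sketch. The reference and candidate samples are synthetic, the threshold is arbitrary, and this is not the authors' exact formulation.

        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(0)
        # Synthetic FLAIR-like intensities: normal-appearing white matter (reference)
        # and a candidate region that is systematically brighter.
        nawm = rng.normal(loc=100.0, scale=8.0, size=2000)
        candidate = rng.normal(loc=118.0, scale=8.0, size=150)

        # One-tailed two-sample KS test: alternative='less' tests whether the CDF of
        # the candidate lies below that of the reference, i.e. the candidate region
        # is stochastically brighter.
        stat, p_value = ks_2samp(candidate, nawm, alternative='less')
        print(round(stat, 2), p_value < 0.01)  # True -> flag the region as hyperintense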

  10. A region-based segmentation of tumour from brain CT images using nonlinear support vector machine classifier.

    PubMed

    Nanthagopal, A Padma; Rajamony, R Sukanesh

    2012-07-01

    The proposed system provides new textural information for segmenting tumours efficiently and accurately, with less computational time, from benign and malignant tumour images, especially for smaller tumour regions in computed tomography (CT) images. Region-based segmentation of tumour from brain CT image data is an important but time-consuming task performed manually by medical experts. The objective of this work is to segment brain tumour from CT images using combined grey-level and texture features with new edge features and a nonlinear support vector machine (SVM) classifier. The selected optimal features are used to model and train the nonlinear SVM classifier to segment the tumour from computed tomography images, and the segmentation accuracy is evaluated for each slice of the tumour image. The method is applied to real data of 80 benign and malignant tumour images. The results are compared with the radiologist-labelled ground truth. Quantitative analysis between the ground truth and the segmented tumour is presented in terms of segmentation accuracy and the overlap similarity measure, the Dice metric. From this analysis, it is inferred that better segmentation accuracy and a higher Dice metric are achieved with the normalized cut segmentation method than with the fuzzy c-means clustering method.
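
    As a rough sketch of the classification step described above (not the authors' feature set, feature selection, or pipeline), a nonlinear SVM with an RBF kernel can be trained on per-voxel feature vectors of grey-level, texture, and edge values and then used to label voxels as tumour or non-tumour; the features below are random stand-ins.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(42)
        # Hypothetical per-voxel features: [grey level, texture energy, edge strength].
        tumour = rng.normal([180.0, 0.8, 0.6], [20.0, 0.1, 0.1], size=(200, 3))
        normal = rng.normal([110.0, 0.3, 0.2], [20.0, 0.1, 0.1], size=(200, 3))
        X = np.vstack([tumour, normal])
        y = np.array([1] * 200 + [0] * 200)

        # Nonlinear SVM (RBF kernel) on standardized features.
        clf = make_pipeline(StandardScaler(), SVC(kernel='rbf', C=1.0, gamma='scale'))
        clf.fit(X, y)
        print(clf.predict([[175.0, 0.75, 0.55], [105.0, 0.25, 0.15]]))  # -> [1 0]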

  11. New Methods of Sample Preparation for Atom Probe Specimens

    NASA Technical Reports Server (NTRS)

    Kuhlman, Kimberly, R.; Kowalczyk, Robert S.; Ward, Jennifer R.; Wishard, James L.; Martens, Richard L.; Kelly, Thomas F.

    2003-01-01

    Magnetite is a common conductive mineral found on Earth and Mars. Disk-shaped precipitates approximately 40 nm in diameter have been shown to contain manganese and aluminum. Atom-probe field-ion microscopy (APFIM) is the only technique that can potentially quantify the composition of these precipitates. APFIM will be used to characterize geological and planetary materials, to analyze samples of interest for geomicrobiology, and for the metrology of nanoscale instrumentation. Previously, APFIM sample preparation was conducted by electropolishing, the method of sharp shards (MSS), or the Bosch process (deep reactive ion etching), with focused ion beam (FIB) milling as a final step. However, new methods are required for difficult samples: many materials are not easily fabricated using electropolishing, MSS, or the Bosch process; FIB milling is slow and expensive; and wet chemistry and reactive ion etching are typically limited to Si and other semiconductors. For APFIM sample preparation we use the dicing saw, which is commonly used to section semiconductor wafers into individual devices following manufacture. The dicing saw is a time-effective method for preparing high-aspect-ratio posts of poorly conducting materials. Femtosecond laser micromachining is also suitable for preparation of posts. The required FIB time is reduced by about a factor of 10, and multi-tip specimens can easily be fabricated using the dicing saw.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soffientini, Chiara Dolores, E-mail: chiaradolores.soffientini@polimi.it; Baselli, Giuseppe; De Bernardi, Elisabetta

    Purpose: Quantitative ¹⁸F-fluorodeoxyglucose positron emission tomography is limited by the uncertainty in lesion delineation due to poor SNR, low resolution, and partial volume effects, subsequently impacting oncological assessment, treatment planning, and follow-up. The present work develops and validates a segmentation algorithm based on statistical clustering. The introduction of constraints based on background features and contiguity priors is expected to improve robustness vs clinical image characteristics such as lesion dimension, noise, and contrast level. Methods: An eight-class Gaussian mixture model (GMM) clustering algorithm was modified by constraining the mean and variance parameters of four background classes according to the previous analysis of a lesion-free background volume of interest (background modeling). Hence, expectation maximization operated only on the four classes dedicated to lesion detection. To favor the segmentation of connected objects, a further variant was introduced by inserting priors relevant to the classification of neighbors. The algorithm was applied to simulated datasets and acquired phantom data. Feasibility and robustness toward initialization were assessed on a clinical dataset manually contoured by two expert clinicians. Comparisons were performed with respect to a standard eight-class GMM algorithm and to four different state-of-the-art methods in terms of volume error (VE), Dice index, classification error (CE), and Hausdorff distance (HD). Results: The proposed GMM segmentation with background modeling outperformed standard GMM and all the other tested methods. Medians of accuracy indexes were VE <3%, Dice >0.88, CE <0.25, and HD <1.2 in simulations; VE <23%, Dice >0.74, CE <0.43, and HD <1.77 in phantom data. Robustness toward image statistic changes (±15%) was shown by the low index changes: <26% for VE, <17% for Dice, and <15% for CE. Finally, robustness toward the user-dependent volume initialization was demonstrated. The inclusion of the spatial prior improved segmentation accuracy only for lesions surrounded by heterogeneous background: in the relevant simulation subset, the median VE significantly decreased from 13% to 7%. Results on clinical data were found in accordance with simulations, with absolute VE <7%, Dice >0.85, CE <0.30, and HD <0.81. Conclusions: The sole introduction of constraints based on background modeling outperformed standard GMM and the other tested algorithms. Insertion of a spatial prior improved the accuracy for realistic cases of objects in heterogeneous backgrounds. Moreover, robustness against initialization supports the applicability in a clinical setting. In conclusion, application-driven constraints can generally improve the capabilities of GMM and statistical clustering algorithms.
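
    The key constraint described above, fixing the parameters of the background classes from a lesion-free volume of interest while expectation maximization updates only the remaining classes, can be illustrated with a deliberately simplified one-dimensional mixture: two frozen background components and two free "lesion" components. All numbers are synthetic, and this is a sketch of the idea rather than the published implementation.

        import numpy as np

        rng = np.random.default_rng(1)
        # Synthetic intensities: two background modes plus a small, hot lesion.
        data = np.concatenate([rng.normal(1.0, 0.3, 3000),
                               rng.normal(2.0, 0.4, 2000),
                               rng.normal(6.0, 0.5, 300)])

        # Background classes: mean/variance fixed beforehand from a lesion-free VOI.
        mu = np.array([1.0, 2.0, 4.0, 7.0])      # last two are free "lesion" classes
        var = np.array([0.09, 0.16, 1.0, 1.0])
        w = np.full(4, 0.25)
        frozen = np.array([True, True, False, False])

        for _ in range(50):  # EM; the M-step skips the frozen background parameters
            dens = np.exp(-(data[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
            resp = w * dens
            resp /= resp.sum(axis=1, keepdims=True)
            nk = resp.sum(axis=0)
            w = nk / len(data)
            new_mu = (resp * data[:, None]).sum(axis=0) / nk
            new_var = (resp * (data[:, None] - new_mu) ** 2).sum(axis=0) / nk
            mu = np.where(frozen, mu, new_mu)
            var = np.where(frozen, var, np.maximum(new_var, 1e-3))

        lesion_mask = resp[:, 2:].sum(axis=1) > 0.5   # samples explained by lesion classes
        print(np.round(mu, 2), int(lesion_mask.sum()))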

  13. Central focused convolutional neural networks: Developing a data-driven model for lung nodule segmentation.

    PubMed

    Wang, Shuo; Zhou, Mu; Liu, Zaiyi; Liu, Zhenyu; Gu, Dongsheng; Zang, Yali; Dong, Di; Gevaert, Olivier; Tian, Jie

    2017-08-01

    Accurate lung nodule segmentation from computed tomography (CT) images is of great importance for image-driven lung cancer analysis. However, the heterogeneity of lung nodules and the presence of similar visual characteristics between nodules and their surroundings make robust nodule segmentation difficult. In this study, we propose a data-driven model, termed the Central Focused Convolutional Neural Network (CF-CNN), to segment lung nodules from heterogeneous CT images. Our approach combines two key insights: 1) the proposed model captures a diverse set of nodule-sensitive features from both 3-D and 2-D CT images simultaneously; 2) when classifying an image voxel, the effects of its neighbor voxels can vary according to their spatial locations. We describe this phenomenon by proposing a novel central pooling layer that retains more information about the center of the voxel patch, followed by a multi-scale patch learning strategy. Moreover, we design a weighted sampling scheme to facilitate model training, where training samples are selected according to their degree of segmentation difficulty. The proposed method has been extensively evaluated on the public LIDC dataset including 893 nodules and an independent dataset with 74 nodules from Guangdong General Hospital (GDGH). We show that CF-CNN achieved superior segmentation performance, with average Dice scores of 82.15% and 80.02% for the two datasets, respectively. Moreover, we compared our results with the inter-radiologist consistency on the LIDC dataset, showing a difference in average Dice score of only 1.98%. Copyright © 2017. Published by Elsevier B.V.
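
    The weighted sampling idea (drawing training patches with probability proportional to how difficult they currently are to segment) can be sketched in a few lines; the difficulty scores below are synthetic placeholders, for example one minus the current Dice score of each patch, and this is not the authors' exact scheme.

        import numpy as np

        rng = np.random.default_rng(7)
        n_samples = 1000
        # Hypothetical per-sample difficulty, e.g. 1 - current Dice score of that patch.
        difficulty = rng.beta(2, 5, size=n_samples)

        # Sampling probability proportional to difficulty: hard patches are drawn more often.
        probs = difficulty / difficulty.sum()
        batch = rng.choice(n_samples, size=32, replace=False, p=probs)

        # The sampled batch has a higher mean difficulty than the dataset overall.
        print(round(difficulty[batch].mean(), 3), round(difficulty.mean(), 3))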

  14. Quantum dice rolling: a multi-outcome generalization of quantum coin flipping

    NASA Astrophysics Data System (ADS)

    Aharon, N.; Silman, J.

    2010-03-01

    The problem of quantum dice rolling (DR)—a generalization of the problem of quantum coin flipping (CF) to more than two outcomes and parties—is studied in both its weak and strong variants. We prove by construction that quantum mechanics allows for (i) weak N-sided DR admitting arbitrarily small bias for any N and (ii) two-party strong N-sided DR saturating Kitaev's bound for any N. To derive (ii) we also prove by construction that quantum mechanics allows for (iii) strong imbalanced CF saturating Kitaev's bound for any degree of imbalance. Furthermore, as a corollary of (ii) we introduce a family of optimal 2m-party strong nm-sided DR protocols for any pair m and n.

  15. 7 CFR 953.322 - Handling regulation.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Agreements and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE IRISH POTATOES GROWN IN... substantial change. The act of peeling, cooling, slicing, dicing, or applying material to prevent oxidation...

  16. 7 CFR 953.322 - Handling regulation.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AGREEMENTS AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE IRISH POTATOES GROWN IN... substantial change. The act of peeling, cooling, slicing, dicing, or applying material to prevent oxidation...

  17. 7 CFR 953.322 - Handling regulation.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AGREEMENTS AND ORDERS; FRUITS, VEGETABLES, NUTS), DEPARTMENT OF AGRICULTURE IRISH POTATOES GROWN IN... substantial change. The act of peeling, cooling, slicing, dicing, or applying material to prevent oxidation...

  18. 7 CFR 953.322 - Handling regulation.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Agreements and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF AGRICULTURE IRISH POTATOES GROWN IN... substantial change. The act of peeling, cooling, slicing, dicing, or applying material to prevent oxidation...

  19. Effective user guidance in online interactive semantic segmentation

    NASA Astrophysics Data System (ADS)

    Petersen, Jens; Bendszus, Martin; Debus, Jürgen; Heiland, Sabine; Maier-Hein, Klaus H.

    2017-03-01

    With the recent success of machine learning based solutions for automatic image parsing, the availability of reference image annotations for algorithm training is one of the major bottlenecks in medical image segmentation. We are interested in interactive semantic segmentation methods that can be used in an online fashion to generate expert segmentations. These can be used to train automated segmentation techniques or, from an application perspective, for quick and accurate tumor progression monitoring. Using simulated user interactions in an MRI glioblastoma segmentation task, we show that if the user possesses knowledge of the correct segmentation it is significantly (p <= 0.009) better to present the data and current segmentation to the user in such a manner that they can easily identify falsely classified regions, compared to guiding the user to regions where the classifier exhibits high uncertainty, resulting in differences of mean Dice scores between +0.070 (Whole tumor) and +0.136 (Tumor Core) after 20 iterations. The annotation process should cover all classes equally, which results in a significant (p <= 0.002) improvement compared to completely random annotations anywhere in falsely classified regions for small tumor regions such as the necrotic tumor core (mean Dice +0.151 after 20 it.) and non-enhancing abnormalities (mean Dice +0.069 after 20 it.). These findings provide important insights for the development of efficient interactive segmentation systems and user interfaces.
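
    The two guidance strategies compared above can be expressed compactly: either propose regions where the current prediction disagrees with the known correct segmentation, or propose regions where the classifier is most uncertain. A small NumPy sketch with synthetic label and probability maps (hypothetical arrays, not the study's pipeline):

        import numpy as np

        rng = np.random.default_rng(3)
        shape = (64, 64)
        ground_truth = rng.integers(0, 4, size=shape)           # 4 tissue classes
        prediction = ground_truth.copy()
        flip = rng.random(shape) < 0.1                           # 10% misclassified voxels
        prediction[flip] = rng.integers(0, 4, size=int(flip.sum()))
        probs = rng.dirichlet(np.ones(4), size=shape)            # per-voxel class posteriors

        # Strategy 1: point the user at falsely classified regions (requires the truth).
        error_voxels = np.argwhere(prediction != ground_truth)

        # Strategy 2: guide the user to high-uncertainty regions (posterior entropy).
        entropy = -(probs * np.log(probs + 1e-12)).sum(axis=-1)
        uncertain_voxels = np.argwhere(entropy > np.percentile(entropy, 90))

        print(len(error_voxels), len(uncertain_voxels))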

  20. Calculating ellipse area by the Monte Carlo method and analysing dice poker with Excel at high school

    NASA Astrophysics Data System (ADS)

    Benacka, Jan

    2016-08-01

    This paper reports on lessons in which 18-19-year-old high school students modelled random processes with Excel. In the first lesson, 26 students formulated a hypothesis on the area of an ellipse by using the analogy between the areas of the circle, square and rectangle. They verified the hypothesis by the Monte Carlo method with a spreadsheet model developed in the lesson. In the second lesson, 27 students analysed the dice poker game. First, they calculated the probabilities of the hands by combinatorial formulae. Then, they verified the result with a spreadsheet model developed in the lesson. The students were given a questionnaire to find out whether they found the lesson interesting and whether it contributed to their mathematical and technological knowledge.
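
    Both classroom activities translate directly into a few lines of code (shown here in Python rather than a spreadsheet): a Monte Carlo estimate of the ellipse area πab, and a brute-force check of one dice-poker probability against its combinatorial value.

        import numpy as np

        rng = np.random.default_rng(0)

        # Monte Carlo area of an ellipse with semi-axes a and b (true area = pi*a*b).
        a, b, n = 3.0, 2.0, 200_000
        x = rng.uniform(-a, a, n)
        y = rng.uniform(-b, b, n)
        inside = (x / a) ** 2 + (y / b) ** 2 <= 1.0
        area_estimate = inside.mean() * (2 * a) * (2 * b)
        print(round(area_estimate, 3), round(np.pi * a * b, 3))

        # Dice poker: probability of "five of a kind" with five dice.
        # Combinatorial value: 6 / 6**5 = 1/1296.
        rolls = rng.integers(1, 7, size=(500_000, 5))
        five_of_a_kind = (rolls == rolls[:, :1]).all(axis=1).mean()
        print(round(five_of_a_kind, 5), round(6 / 6 ** 5, 5))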

  1. Optical ridge waveguides in Nd:LGS crystal produced by combination of swift C5+ ion irradiation and precise diamond blade dicing

    NASA Astrophysics Data System (ADS)

    Cheng, Yazhou; Lv, Jinman; Akhmadaliev, Shavkat; Zhou, Shengqiang; Chen, Feng

    2016-07-01

    We report on the fabrication of optical ridge waveguides in Nd:LGS crystal by using a combination of swift C5+ ion irradiation and precise diamond blade dicing. The ridge structures support guidance at both 632.8 nm and 1064 nm along the TE and TM polarizations. The lowest propagation losses of the ridge waveguide for the TM mode are ~1.6 dB/cm at 632.8 nm and ~1.2 dB/cm at 1064 nm, respectively. The investigation of micro-fluorescence and micro-Raman spectra indicates that the Nd3+ luminescence features have been well preserved and that the microstructure of the waveguide region shows no significant change after C5+ ion irradiation.

  2. Mid-infrared ridge waveguide in MgO:LiNbO3 crystal produced by combination of swift O5+ ion irradiation and precise diamond blade dicing

    NASA Astrophysics Data System (ADS)

    Cheng, Yazhou; Lv, Jinman; Akhmadaliev, Shavkat; Zhou, Shengqiang; Kong, Yongfa; Chen, Feng

    2015-10-01

    We report on the fabrication of a ridge waveguide operating at mid-infrared wavelengths in MgO:LiNbO3 crystal by using O5+ ion irradiation and precise diamond blade dicing. The waveguide shows good guiding properties at a wavelength of 4 μm along the TM polarization. Thermal annealing has been implemented to improve the waveguiding performance. The propagation loss of the ridge waveguide has been reduced to 1.0 dB/cm at 4 μm after annealing at 310 °C. The micro-Raman spectra indicate that the microstructure of the MgO:LiNbO3 crystal shows no significant change along the ion track after swift O5+ ion irradiation.

  3. Investigation of Various Surface Acoustic Wave Design Configurations for Improved Sensitivity

    NASA Astrophysics Data System (ADS)

    Manohar, Greeshma

    Surface acoustic wave (SAW) sensors have been a focus of active research for many years. Their ability to respond to surface perturbation is the basic principle underlying their sensing capability. Sensitivity to surface perturbation changes with the inter-digital transducer (IDT) design parameters, substrate selection, metallization choice and technique, delay-line length, and working environment. In this thesis, SAW sensors are designed and characterized to improve sensitivity and reduce loss. To quantify the improvements of each design configuration, the sensors are employed to measure temperature. Four SAW sensor design configurations, namely bi-directional, split electrode, single-phase unidirectional transducer (SPUDT), and metal grating on the delay line (shear transverse wave sensors), are designed and then fabricated in the Nanotechnology Research and Education Center (NREC) facility using traditional MEMS fabrication processes. Additionally, sensors are coated with a 40 µm SU8-2035 guiding layer by spin coating and a 6 µm SiO2 layer by plasma-enhanced chemical vapor deposition (PECVD). Sensors are later diced and tested with a network analyzer in 5°C increments over temperatures ranging from 30°C±0.5°C to 80°C±0.5°C. Data acquired from the network analyzer are analyzed using plots of logarithmic magnitude, phase, and frequency shift. Furthermore, to investigate the effect of the metallization technique on sensor performance, sensors are also fabricated on substrates that were metallized at a commercial MEMS foundry. All in-house and externally sputtered sensor configurations are compared to assess the quality of the sputtered metal on the wafer, and the set with the better-quality metal is chosen for further study. Sensors coated with SU8 and SiO2 guiding layers are then compared to investigate the effect of each waveguide and determine which offers better performance. The results showed that the commercially sputtered sensors have higher sensitivity than the in-house sputtered wafers. Furthermore, after comparing SU8- and SiO2-coated sensors under the same instrumental and environmental conditions, the SU8-coated bi-directional and SPUDT sensors showed the best response.

  4. Diverse convergent evidence in the genetic analysis of complex disease: coordinating omic, informatic, and experimental evidence to better identify and validate risk factors

    PubMed Central

    2014-01-01

    In omic research, such as genome wide association studies, researchers seek to repeat their results in other datasets to reduce false positive findings and thus provide evidence for the existence of true associations. Unfortunately this standard validation approach cannot completely eliminate false positive conclusions, and it can also mask many true associations that might otherwise advance our understanding of pathology. These issues beg the question: How can we increase the amount of knowledge gained from high throughput genetic data? To address this challenge, we present an approach that complements standard statistical validation methods by drawing attention to both potential false negative and false positive conclusions, as well as providing broad information for directing future research. The Diverse Convergent Evidence approach (DiCE) we propose integrates information from multiple sources (omics, informatics, and laboratory experiments) to estimate the strength of the available corroborating evidence supporting a given association. This process is designed to yield an evidence metric that has utility when etiologic heterogeneity, variable risk factor frequencies, and a variety of observational data imperfections might lead to false conclusions. We provide proof of principle examples in which DiCE identified strong evidence for associations that have established biological importance, when standard validation methods alone did not provide support. If used as an adjunct to standard validation methods this approach can leverage multiple distinct data types to improve genetic risk factor discovery/validation, promote effective science communication, and guide future research directions. PMID:25071867

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peressutti, D; Schipaanboord, B; Kadir, T

    Purpose: To investigate the effectiveness of atlas selection methods for improving atlas-based auto-contouring in radiotherapy planning. Methods: 275 H&N clinically delineated cases were employed as an atlas database from which atlases would be selected. A further 40 previously contoured cases were used as test patients against which atlas selection could be performed and evaluated. 26 variations of selection methods proposed in the literature and used in commercial systems were investigated. Atlas selection methods comprised either global or local image similarity measures, computed after rigid or deformable registration, combined with direct atlas search or with an intermediate template image. Workflow Box (Mirada-Medical, Oxford, UK) was used for all auto-contouring. Results on brain, brainstem, parotids and spinal cord were compared to random selection, a fixed set of 10 “good” atlases, and optimal selection by an “oracle” with knowledge of the ground truth. The Dice score and the average ranking with respect to the “oracle” were employed to assess the performance of the top 10 atlases selected by each method. Results: The fixed set of “good” atlases outperformed all of the atlas-patient image similarity-based selection methods (mean Dice 0.715 c.f. 0.603 to 0.677). In general, methods based on exhaustive comparison of local similarity measures showed better average Dice scores (0.658 to 0.677) compared to the use of either a template image (0.655 to 0.672) or global similarity measures (0.603 to 0.666). The performance of image-based selection methods was found to be only slightly better than random selection (0.645). Dice scores given relate to the left parotid, but similar result patterns were observed for all organs. Conclusion: Intuitively, atlas selection based on the patient CT is expected to improve auto-contouring performance. However, it was found that published approaches performed only marginally better than random selection, and use of a fixed set of representative atlases showed favourable performance. This research was funded via InnovateUK Grant 600277 as part of Eurostars Grant E!9297. DP, BS, MG, TK are employees of Mirada Medical Ltd.
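
    A global image-similarity selection strategy of the kind evaluated above amounts to scoring every atlas image against the patient image (after registration) and keeping the top 10. The sketch below uses normalized cross-correlation on synthetic volumes; it is only an illustration of the ranking step, not the Workflow Box implementation or any particular published method.

        import numpy as np

        def ncc(a, b):
            """Global normalized cross-correlation between two images."""
            a = (a - a.mean()) / (a.std() + 1e-12)
            b = (b - b.mean()) / (b.std() + 1e-12)
            return float((a * b).mean())

        rng = np.random.default_rng(5)
        patient = rng.normal(size=(32, 32, 32))
        # Hypothetical atlas database: noisy variations around the patient anatomy.
        atlases = [patient + rng.normal(scale=s, size=patient.shape)
                   for s in rng.uniform(0.5, 3.0, size=40)]

        scores = np.array([ncc(patient, atlas) for atlas in atlases])
        top10 = np.argsort(scores)[::-1][:10]   # indices of the 10 most similar atlases
        print(top10, np.round(scores[top10], 2))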

  6. SU-C-17A-03: Evaluation of Deformable Image Registration Methods Between MRI and CT for Prostate Cancer Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wen, N; Glide-Hurst, C; Zhong, H

    2014-06-15

    Purpose: We evaluated the performance of two commercially available and one open source B-Spline deformable image registration (DIR) algorithms between T2-weighted MRI and treatment planning CT using the DICE indices. Methods: CT simulation (CT-SIM) and MR simulation (MR-SIM) for four prostate cancer patients were conducted on the same day using the same setup and immobilization devices. CT images (120 kVp, 500 mAs, voxel size = 1.1×1.1×3.0 mm³) were acquired using an open-bore CT scanner. T2-weighted Turbo Spin Echo (T2W-TSE) images (TE/TR/α = 80/4560 ms/90°, voxel size = 0.7×0.7×2.5 mm³) were scanned on a 1.0T high field open MR-SIM. Prostates, seminal vesicles, rectums and bladders were delineated on both T2W-TSE and CT images by the attending physician. T2W-TSE images were registered to CT images using three DIR algorithms, SmartAdapt (Varian), Velocity AI (Velocity) and Elastix (Klein et al 2010), and contours were propagated. DIR results were evaluated quantitatively and qualitatively by image comparison and by calculating organ DICE indices. Results: Significant differences in the contours of the prostate and seminal vesicles were observed between MR and CT. On average, volume changes of the propagated contours were 5%, 2%, 160% and 8% for the prostate, seminal vesicles, bladder and rectum, respectively. Corresponding mean DICE indices were 0.7, 0.5, 0.8, and 0.7. The intraclass correlation coefficient (ICC) was 0.9 among the three algorithms for the Dice indices. Conclusion: The three DIR algorithms for CT/MR registration yielded similar results for organ propagation. Due to the different soft tissue contrasts between MRI and CT, organ delineation of the prostate and SVs varied significantly; thus efforts to develop other DIR evaluation metrics are warranted. Conflict of interest: Submitting institution has research agreements with Varian Medical System and Philips Healthcare.

  7. Ultrafast-laser dicing of thin silicon wafers: strategies to improve front- and backside breaking strength

    NASA Astrophysics Data System (ADS)

    Domke, Matthias; Egle, Bernadette; Stroj, Sandra; Bodea, Marius; Schwarz, Elisabeth; Fasching, Gernot

    2017-12-01

    Thin 50-µm silicon wafers are used to improve heat dissipation of chips with high power densities. However, mechanical dicing methods cause chipping at the edges of the separated dies that reduces their mechanical stability. Thermal load changes may then lead to sudden chip failure. Recent investigations showed that the mechanical stability of the cut chips could be increased using ultrashort-pulsed lasers, but only at the laser entrance (front) side and not at the exit (back) side. The goal of this study was to find strategies to improve both front- and backside breaking strength of chips that were cut out of an 8″ wafer with power metallization using an ultrafast laser. In a first experiment, chips were cut by scanning the laser beam in single lines across the wafer using varying fluences and scan speeds. Three-point bending tests of the cut chips were performed to measure front- and backside breaking strengths. The results showed that the breaking strength of both sides increased with decreasing accumulated fluence per scan. Maximum breaking strengths of about 1100 MPa were achieved at the front side, but only values below 600 MPa were measured for the backside. A second experiment was carried out to optimize the backside breaking strength. Here, parallel line scans to increase the distance between separated dies and step cuts to minimize the effect of decreasing fluence during scribing were performed. Bending tests revealed that breaking strengths of about 1100 MPa could also be achieved on the backside using the step cut. A reason for the superior performance was found by calculating the fluence absorbed by the sidewalls: the calculations suggested that an optimal fluence level to minimize thermal side effects and periodic surface structures was achieved with the step cut. Remarkably, the best breaking strength values achieved in this study were even higher than the values obtained on state-of-the-art ns-laser and mechanical dicing machines. To the authors' knowledge, this is the first study to demonstrate that ultrafast-laser dicing improves the mechanical stability of thin silicon chips.

  8. Extent of BOLD Vascular Dysregulation Is Greater in Diffuse Gliomas without Isocitrate Dehydrogenase 1 R132H Mutation.

    PubMed

    Englander, Zachary K; Horenstein, Craig I; Bowden, Stephen G; Chow, Daniel S; Otten, Marc L; Lignelli, Angela; Bruce, Jeffrey N; Canoll, Peter; Grinband, Jack

    2018-06-01

    Purpose To determine the effect that R132H mutation status of diffuse glioma has on extent of vascular dysregulation and extent of residual blood oxygen level-dependent (BOLD) abnormality after surgical resection. Materials and Methods This study was an institutional review board-approved retrospective analysis of an institutional database of patients, and informed consent was waived. From 2010 to 2017, 39 treatment-naïve patients with diffuse glioma underwent preoperative echo-planar imaging and BOLD functional magnetic resonance imaging. BOLD vascular dysregulation maps were made by identifying voxels with time series similar to tumor and dissimilar to healthy brain. The spatial overlap between tumor and vascular dysregulation was characterized by using the Dice coefficient, and areas of BOLD abnormality outside the tumor margins were quantified as BOLD-only fraction (BOF). Linear regression was used to assess effects of R132H status on the Dice coefficient, BOF, and residual BOLD abnormality after surgical resection. Results When compared with R132H wild-type (R132H-) gliomas, R132H-mutated (R132H+) gliomas showed greater spatial overlap between BOLD abnormality and tumor (mean Dice coefficient, 0.659 ± 0.02 [standard error] for R132H+ and 0.327 ± 0.04 for R132H-; P < .001), less BOLD abnormality beyond the tumor margin (mean BOF, 0.255 ± 0.03 for R132H+ and 0.728 ± 0.04 for R132H-; P < .001), and less postoperative BOLD abnormality (residual fraction, 0.046 ± 0.0047 for R132H+ and 0.397 ± 0.045 for R132H-; P < .001). Receiver operating characteristic curve analysis showed high sensitivity and specificity in the discrimination of R132H+ tumors from R132H- tumors with calculation of both Dice coefficient and BOF (area under the receiver operating characteristic curve, 0.967 and 0.977, respectively). Conclusion R132H mutation status is an important variable affecting the extent of tumor-associated vascular dysregulation and the residual vascular dysregulation after surgical resection. © RSNA, 2018 Online supplemental material is available for this article.

  9. PREFACE: 5th International Workshop DICE2010: Space-Time-Matter - Current Issues in Quantum Mechanics and Beyond

    NASA Astrophysics Data System (ADS)

    Diósi, Lajos; Elze, Hans-Thomas; Fronzoni, Leone; Halliwell, Jonathan; Prati, Enrico; Vitiello, Giuseppe; Yearsley, James

    2011-07-01

    These proceedings present the Invited Lectures and Contributed Papers of the Fifth International Workshop on Decoherence, Information, Complexity and Entropy - DICE 2010, held at Castello Pasquini, Castiglioncello (Tuscany), 13-17 September 2010. These proceedings are intended to document the stimulating exchange of ideas at this conference for both the interested public and the wider scientific community, as well as for the participants. The number of participants attending this series of meetings has been growing steadily, which reflects its increasing attraction. Our intention to bring together leading researchers, advanced students, and renowned scholars from various areas in order to stimulate new ideas and their exchange across the borders of specialization seems to bear fruit. In this way, the series of meetings has continued successfully from the beginning with DICE 2002 [1], followed by DICE 2004 [2], DICE 2006 [3], and DICE 2008 [4], uniting more than 100 participants representing almost 30 countries worldwide. It has been a great honour and inspiration to have Professor Luc Montagnier (Nobel Prize for Medicine 2008) from the World Foundation for AIDS Research and Prevention with us, who presented the lecture DNA waves and water (included in this volume). The discussions took place under the wider theme Space-Time-Matter - current issues in quantum mechanics and beyond in the very pleasant and inspiring atmosphere of Castello Pasquini, which - with its beautiful surroundings, overlooking the Tuscany coast - hosted the conference very successfully for the second time. The five-day program was grouped according to the following topics: Gravity and Quantum Mechanics; Quantum Coherent Processes in Biology / Many-Body Systems; From Quantum Foundations to Particle Physics; The Deep Structure of Spacetime; Quantum - Relativity - Cosmology. A Public Roundtable Discussion formed an integral part of the program under the theme "Sull'Onda Della Coerenza - le nuove frontiere della scienza moderna" with the participation of E Del Giudice (INFN & Università di Milano), L Fronzoni (Università di Pisa) and G Vitiello (Università di Salerno). By now forming a tradition, this evening event drew a large audience, who participated in lively discussions until late. The workshop was organized by L Diósi (Budapest), H-T Elze (Pisa, chair), L Fronzoni (Pisa), J Halliwell (London), E Prati (Milano) and G Vitiello (Salerno), with essential help from our conference secretaries M Pesce-Rollins and L Baldini and from our students G Gambarotta and F Vallone, all from Pisa. Several institutions and sponsors supported the workshop; their representatives and, in particular, the citizens of Rosignano / Castiglioncello are deeply thanked for their generous help and kind hospitality: Comune di Rosignano - A Franchi (Sindaco di Rosignano), S Scarpellini (Segreteria sindaco), L Benini (Assessore ai lavori pubblici), M Pia (Assessore all'urbanistica). REA Rosignano Energia Ambiente s.p.a. - F Ghelardini (Presidente della REA), E Salvadori (Segreteria). Associazione Armunia - M Paganelli (Direttore), G Mannari (Programmazione). Special thanks go to G Mannari and her collaborators for their advice and great help in all the practical matters that had to be dealt with in order to run the meeting at Castello Pasquini smoothly.
Funds made available by Università di Pisa, by Domus Galilaeana (Pisa), Centro Interdisciplinare per lo Studio dei Sistemi Complessi - CISSC (Pisa), Dipartimento di Matematica e Informatica (Università di Salerno), Istituto Italiano per gli Studi Filosofici - IISF (Napoli), and by the Hungarian Scientific Research Fund OTKA, are gratefully acknowledged. Last, but not least, special thanks are due to Laura Pesce (Vitrium Galleria, San Vincenzo) for the exhibition of her artwork Dal io al cosmo at Castello Pasquini during the conference. The papers presented at the workshop and collected here have been edited by L Diósi, H-T Elze, L Fronzoni, J J Halliwell, E Prati, G Vitiello and J Yearsley. The proceedings essentially follow the order of presentation during the conference program, however, divided into Invited Lectures and Contributed Papers. (We regret that lectures by D Bouwmeester, G G Guerreschi, G C Ghirardi and C Kiefer could not be reproduced here, partly for copyright reasons.) In the name of all the participants, we would like to thank S Toms and G Douglas, and their collaborators at IOP Publishing (Bristol), for their friendly advice and most valuable and immediate help during the editing process and, especially, for their continuing efforts to make the Journal of Physics: Conference Series available to all. Budapest, Pisa, London, Milano and Salerno, May 2011. Lajos Diósi, Hans-Thomas Elze, Leone Fronzoni, Jonathan Halliwell, Enrico Prati, Giuseppe Vitiello and James Yearsley. [1] Elze H-T (ed) 2004 Decoherence and Entropy in Complex Systems Lecture Notes in Physics 633 (Berlin: Springer) [2] Elze H-T (ed) 2005 Proceedings of the Second International Workshop on Decoherence, Information, Complexity and Entropy - DICE 2004 Braz. Journ. Phys. 35 2A and B pp 205-529, free access at: www.sbfisica.org.br/bjp [3] Elze H-T, Diósi L, Fronzoni L, Halliwell J J and Vitiello G (eds) 2007 Proceedings of the Third International Workshop on Decoherence, Information, Complexity and Entropy - DICE 2006 J. Phys.: Conf. Ser. 67, free access at: www.iop.org/EJ/toc/1742-6596/67/1 [4] Elze H-T, Diósi L, Fronzoni L, Halliwell J J and Vitiello G (eds) 2009 Proceedings of the Fourth International Workshop on Decoherence, Information, Complexity and Entropy - DICE 2008 J. Phys.: Conf. Ser. 174, free access at: www.iop.org/EJ/toc/1742-6596/67/1

  10. Performance metrics for the evaluation of hyperspectral chemical identification systems

    NASA Astrophysics Data System (ADS)

    Truslow, Eric; Golowich, Steven; Manolakis, Dimitris; Ingle, Vinay

    2016-02-01

    Remote sensing of chemical vapor plumes is a difficult but important task for many military and civilian applications. Hyperspectral sensors operating in the long-wave infrared regime have well-demonstrated detection capabilities. However, the identification of a plume's chemical constituents, based on a chemical library, is a multiple hypothesis testing problem which standard detection metrics do not fully describe. We propose using an additional performance metric for identification based on the so-called Dice index. Our approach partitions and weights a confusion matrix to develop both the standard detection metrics and identification metric. Using the proposed metrics, we demonstrate that the intuitive system design of a detector bank followed by an identifier is indeed justified when incorporating performance information beyond the standard detection metrics.

  11. Baked Tilapia with Tomatoes

    MedlinePlus

    ... onion, diced 1 tablespoon lime juice Parsley and lemon wedges for garnish Directions Preheat oven to 400 ° ... with a fork. Garnish with parsley and a lemon wedge. Find more delicious heart healthy recipes from ...

  12. A Simulation To Model Exponential Growth.

    ERIC Educational Resources Information Center

    Appelbaum, Elizabeth Berman

    2000-01-01

    Describes a simulation using dice-tossing students in a population cluster to model the growth of cancer cells. This growth is recorded in a scatterplot and compared to an exponential function graph. (KHR)
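
    The classroom simulation can be mirrored in code: each "cell" rolls a die every round and divides on a six, so the expected population grows by a factor of 7/6 per round, and the simulated counts can be compared against that exponential curve. The division-on-a-six rule is an assumption for illustration; the article's exact rule may differ.

        import numpy as np

        rng = np.random.default_rng(11)
        population = 20          # starting number of "cells" (students with dice)
        counts = [population]
        for _ in range(30):      # each round, every cell rolls; a six means it divides
            rolls = rng.integers(1, 7, size=population)
            population += int((rolls == 6).sum())
            counts.append(population)

        expected = 20 * (7 / 6) ** np.arange(len(counts))   # exponential model
        print(counts[-1], round(float(expected[-1]), 1))    # simulation vs. model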

  13. Digital Library Storage using iRODS Data Grids

    NASA Astrophysics Data System (ADS)

    Hedges, Mark; Blanke, Tobias; Hasan, Adil

    Digital repository software provides a powerful and flexible infrastructure for managing and delivering complex digital resources and metadata. However, issues can arise in managing the very large, distributed data files that may constitute these resources. This paper describes an implementation approach that combines the Fedora digital repository software with a storage layer implemented as a data grid, using the iRODS middleware developed by DICE (Data Intensive Cyber Environments) as the successor to SRB. This approach allows us to use Fedora's flexible architecture to manage the structure of resources and to provide application-layer services to users. The grid-based storage layer provides efficient support for managing and processing the underlying distributed data objects, which may be very large (e.g. audio-visual material). The Rule Engine built into iRODS is used to integrate complex workflows at the data level that need not be visible to users, e.g. digital preservation functionality.

  14. Thermodynamics of emergent magnetic charge screening in artificial spin ice

    DOE PAGES

    Farhan, Alan; Scholl, Andreas; Petersen, Charlotte F.; ...

    2016-09-01

    Electric charge screening is a fundamental principle governing the behaviour in a variety of systems in nature. Through reconfiguration of the local environment, the Coulomb attraction between electric charges is decreased, leading, for example, to the creation of polaron states in solids or hydration shells around proteins in water. Here, we directly visualize the real-time creation and decay of screened magnetic charge configurations in a two-dimensional artificial spin ice system, the dipolar dice lattice. By comparing the temperature dependent occurrence of screened and unscreened emergent magnetic charge defects, we determine that screened magnetic charges are indeed a result of local energy reduction and appear as a transient minimum energy state before the system relaxes towards the predicted ground state. These results highlight the important role of emergent magnetic charges in artificial spin ice, giving rise to screened charge excitations and the emergence of exotic low-temperature configurations.

  15. Thermodynamics of emergent magnetic charge screening in artificial spin ice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farhan, Alan; Scholl, Andreas; Petersen, Charlotte F.

    Electric charge screening is a fundamental principle governing the behaviour in a variety of systems in nature. Through reconfiguration of the local environment, the Coulomb attraction between electric charges is decreased, leading, for example, to the creation of polaron states in solids or hydration shells around proteins in water. Here, we directly visualize the real-time creation and decay of screened magnetic charge configurations in a two-dimensional artificial spin ice system, the dipolar dice lattice. By comparing the temperature dependent occurrence of screened and unscreened emergent magnetic charge defects, we determine that screened magnetic charges are indeed a result of local energy reduction and appear as a transient minimum energy state before the system relaxes towards the predicted ground state. These results highlight the important role of emergent magnetic charges in artificial spin ice, giving rise to screened charge excitations and the emergence of exotic low-temperature configurations.

  16. Joe Walker in pressure suit with X-1E

    NASA Technical Reports Server (NTRS)

    1958-01-01

    Joe Walker in a pressure suit beside the X-1E at the NASA High-Speed Flight Station, Edwards, California. The dice and 'Little Joe' are prominently displayed under the cockpit area. (Little Joe is a dice player's slang term for two deuces.) Walker is shown in the photo wearing an early Air Force partial pressure suit. This protected the pilot if cockpit pressure was lost above 50,000 feet. Similar suits were used in such aircraft as B-47s, B-52s, F-104s, U-2s, and the X-2 and D-558-II research aircraft. Five years later, Walker reached 354,200 feet in the X-15. Similar artwork - reading 'Little Joe the II' - was applied for the record flight. These cases are two of the few times that research aircraft carried such nose art.

  17. Fiber-integrated refractive index sensor based on a diced Fabry-Perot micro-resonator.

    PubMed

    Suntsov, Sergiy; Rüter, Christian E; Schipkowski, Tom; Kip, Detlef

    2017-11-20

    We report on a fiber-integrated refractive index sensor based on a Fabry-Perot micro-resonator fabricated using simple diamond blade dicing of a single-mode step-index fiber. The performance of the device has been tested for refractive index measurements of sucrose solutions as well as in air. The device shows a sensitivity of 1160 nm/RIU (refractive index unit) at a wavelength of 1.55 μm and a temperature cross-sensitivity of less than 10⁻⁷ RIU/°C. Based on evaluation of the broadband reflection spectra, refractive index steps of 10⁻⁵ in the solutions were accurately measured. Coating the resonator sidewalls with layers of a high-index material, with real-time monitoring of the reflection spectrum, could help to significantly improve the sensor performance.
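
    The reported figures are internally consistent under the assumption of a simple linear response: a sensitivity of about 1160 nm/RIU turns an index step of 10⁻⁵ into a resonance shift of roughly 12 pm, and the stated cross-sensitivity bound of 10⁻⁷ RIU/°C corresponds to a thermal drift of roughly 0.1 pm/°C. A small back-of-the-envelope check:

        # Back-of-the-envelope check of the reported Fabry-Perot sensor figures,
        # assuming wavelength shift = sensitivity * index change (linear response).
        sensitivity_nm_per_riu = 1160.0        # reported sensitivity at 1.55 um
        index_step = 1e-5                      # smallest index step resolved in the paper
        cross_sensitivity_riu_per_c = 1e-7     # reported upper bound

        shift_pm = sensitivity_nm_per_riu * index_step * 1e3            # nm -> pm
        drift_pm_per_c = sensitivity_nm_per_riu * cross_sensitivity_riu_per_c * 1e3
        print(round(shift_pm, 1), round(drift_pm_per_c, 2))             # ~11.6 pm, ~0.12 pm/°C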

  18. ATP-dependent human RISC assembly pathways.

    PubMed

    Yoda, Mayuko; Kawamata, Tomoko; Paroo, Zain; Ye, Xuecheng; Iwasaki, Shintaro; Liu, Qinghua; Tomari, Yukihide

    2010-01-01

    The assembly of RNA-induced silencing complex (RISC) is a key process in small RNA-mediated gene silencing. In humans, small interfering RNAs (siRNAs) and microRNAs (miRNAs) are incorporated into RISCs containing the Argonaute (AGO) subfamily proteins Ago1-4. Previous studies have proposed that, unlike Drosophila melanogaster RISC assembly pathways, human RISC assembly is coupled with dicing and is independent of ATP. Here we show by careful reexamination that, in humans, RISC assembly and dicing are uncoupled, and ATP greatly facilitates RISC loading of small-RNA duplexes. Moreover, all four human AGO proteins show remarkably similar structural preferences for small-RNA duplexes: central mismatches promote RISC loading, and seed or 3'-mid (guide position 12-15) mismatches facilitate unwinding. All these features of human AGO proteins are highly reminiscent of fly Ago1 but not fly Ago2.

  19. Ridge waveguides in Nd:ABC3O7 disordered crystals produced by swift C5+ ion irradiation and precise diamond dicing: Broad band guidance and spectroscopic properties

    NASA Astrophysics Data System (ADS)

    Chen, Chen; Luan, Qingfang; He, Ruiyun; Cheng, Chen; Akhmadaliev, Shavkat; Zhou, Shengqiang; Yu, Haohai; Zhang, Huaijin; Chen, Feng

    2015-05-01

    Optical ridge waveguides have been manufactured in the crystals of Nd:SrLaGa3O7 and Nd:SrGdGa3O7 by combining swift carbon ion irradiation with precise diamond blade dicing. The guiding properties of the waveguides are investigated over a broad band (at wavelengths of 633 nm, 1064 nm, and 4 μm). After annealing treatment at 200 °C for 1 h, the propagation losses of the ridge waveguides could be reduced to as low as 1 dB/cm. The confocal microfluorescence emission spectra confirm that the fluorescence properties of Nd3+ ions are almost unchanged after the ion irradiation processing, showing promising potential for application as miniature light sources in integrated optics.

  20. Freesurfer-initialized large deformation diffeomorphic metric mapping with application to Parkinson's disease

    NASA Astrophysics Data System (ADS)

    Chen, Jingyun; Palmer, Samantha J.; Khan, Ali R.; Mckeown, Martin J.; Beg, Mirza Faial

    2009-02-01

    We apply a recently developed automated brain segmentation method, FS+LDDMM, to brain MRI scans from Parkinson's Disease (PD) subjects, and normal age-matched controls and compare the results to manual segmentation done by trained neuroscientists. The data set consisted of 14 PD subjects and 12 age-matched control subjects without neurologic disease and comparison was done on six subcortical brain structures (left and right caudate, putamen and thalamus). Comparison between automatic and manual segmentation was based on Dice Similarity Coefficient (Overlap Percentage), L1 Error, Symmetrized Hausdorff Distance and Symmetrized Mean Surface Distance. Results suggest that FS+LDDMM is well-suited for subcortical structure segmentation and further shape analysis in Parkinson's Disease. The asymmetry of the Dice Similarity Coefficient over shape change is also discussed based on the observation and measurement of FS+LDDMM segmentation results.
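
    Two of the surface metrics listed above are straightforward to compute with SciPy: the symmetrized Hausdorff distance is the maximum of the two directed Hausdorff distances, and a symmetrized mean surface distance can be taken as the average of the two directed mean closest-point distances. A sketch on small synthetic point clouds (not the FS+LDDMM code):

        import numpy as np
        from scipy.spatial import cKDTree
        from scipy.spatial.distance import directed_hausdorff

        rng = np.random.default_rng(2)
        # Hypothetical surface point clouds from two segmentations of one structure.
        surface_a = rng.normal(size=(500, 3))
        surface_b = surface_a + rng.normal(scale=0.05, size=(500, 3))

        # Symmetrized Hausdorff distance: worst-case boundary disagreement.
        hd = max(directed_hausdorff(surface_a, surface_b)[0],
                 directed_hausdorff(surface_b, surface_a)[0])

        # Symmetrized mean surface distance: average closest-point distance, both ways.
        d_ab = cKDTree(surface_b).query(surface_a)[0].mean()
        d_ba = cKDTree(surface_a).query(surface_b)[0].mean()
        msd = 0.5 * (d_ab + d_ba)
        print(round(hd, 3), round(msd, 3))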

  1. Joe Walker in pressure suit with X-1E

    NASA Image and Video Library

    1958-01-27

    Joe Walker in a pressure suit beside the X-1E at the NASA High-Speed Flight Station, Edwards, California. The dice and "Little Joe" are prominently displayed under the cockpit area. (Little Joe is a dice player's slang term for two deuces.) Walker is shown in the photo wearing an early Air Force partial pressure suit. This protected the pilot if cockpit pressure was lost above 50,000 feet. Similar suits were used in such aircraft as B-47s, B-52s, F-104s, U-2s, and the X-2 and D-558-II research aircraft. Five years later, Walker reached 354,200 feet in the X-15. Similar artwork - reading "Little Joe the II" - was applied for the record flight. These cases are two of the few times that research aircraft carried such nose art.

  2. Precise Correction of Disease Mutations in Induced Pluripotent Stem Cells Derived From Patients With Limb Girdle Muscular Dystrophy.

    PubMed

    Turan, Soeren; Farruggio, Alfonso P; Srifa, Waracharee; Day, John W; Calos, Michele P

    2016-04-01

    Limb girdle muscular dystrophies types 2B (LGMD2B) and 2D (LGMD2D) are degenerative muscle diseases caused by mutations in the dysferlin and alpha-sarcoglycan genes, respectively. Using patient-derived induced pluripotent stem cells (iPSC), we corrected the dysferlin nonsense mutation c.5713C>T; p.R1905X and the most common alpha-sarcoglycan mutation, missense c.229C>T; p.R77C, by single-stranded oligonucleotide-mediated gene editing, using the CRISPR/Cas9 gene-editing system to enhance the frequency of homology-directed repair. We demonstrated seamless, allele-specific correction at efficiencies of 0.7-1.5%. As an alternative, we also carried out precise gene addition strategies for correction of the LGMD2B iPSC by integration of wild-type dysferlin cDNA into the H11 safe harbor locus on chromosome 22, using dual integrase cassette exchange (DICE) or TALEN-assisted homologous recombination for insertion precise (THRIP). These methods employed TALENs and homologous recombination, and DICE also utilized site-specific recombinases. With DICE and THRIP, we obtained targeting efficiencies after selection of ~20%. We purified iPSC corrected by all methods and verified rescue of appropriate levels of dysferlin and alpha-sarcoglycan protein expression and correct localization, as shown by immunoblot and immunocytochemistry. In summary, we demonstrate for the first time precise correction of LGMD iPSC and validation of expression, opening the possibility of cell therapy utilizing these corrected iPSC.

  3. Zesty Tomato Soup

    MedlinePlus

    ... added diced tomatoes 1 cup jarred roasted red peppers, drained (or substitute fresh roasted red peppers) 1 cup fat-free evaporated milk 1 tsp garlic powder 1/4 tsp ground black pepper 2 Tbsp fresh basil, rinsed and chopped (or ...

  4. Quick Beef Casserole

    MedlinePlus

    ... teaspoon of salt 1/2 teaspoon of ground black pepper 1/4 teaspoon of paprika 1 cup of frozen peas 2 small carrots, rinsed, peeled, and diced 1 cup of uncooked rice 1 and 1/2 cups of water Directions ...

  5. Keep the Beat Recipes | NIH MedlinePlus the Magazine

    MedlinePlus

    ... Cup celery, rinsed and diced 1 Cup pearl onions, raw or frozen 3 Cup low-sodium chicken ... 5 minutes. Add leeks, potatoes, celery, and pearl onions, and continue to cook until the vegetables become ...

  6. Diced Remnant

    NASA Image and Video Library

    2006-06-05

    This MOC image shows blocky remnants of a material that was once more laterally extensive on the floor of an impact crater located northwest of Herschel Crater on Mars. Large ripples of windblown sediment have accumulated around and between the blocks.

  7. Users' Manual and Installation Guide for the EverVIEW Slice and Dice Tool (Version 1.0 Beta)

    USGS Publications Warehouse

    Roszell, Dustin; Conzelmann, Craig; Chimmula, Sumani; Chandrasekaran, Anuradha; Hunnicut, Christina

    2009-01-01

    Network Common Data Form (NetCDF) is a self-describing, machine-independent file format for storing array-oriented scientific data. Over the past few years, there has been a growing movement within the community of natural resource managers in The Everglades, Fla., to use NetCDF as the standard data container for datasets based on multidimensional arrays. As a consequence, a need arose for additional tools to view and manipulate NetCDF datasets, specifically to create subsets of large NetCDF files. To address this need, we created the EverVIEW Slice and Dice Tool to allow users to create subsets of grid-based NetCDF files. The major functions of this tool are (1) to subset NetCDF files both spatially and temporally; (2) to view the NetCDF data in table form; and (3) to export filtered data to a comma-separated value file format.
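    To illustrate the kind of operation such a tool performs, the short sketch below subsets a grid-based NetCDF file spatially and temporally with the netCDF4 package and exports the filtered values to a comma-separated file. The file name, variable name, and index windows are hypothetical, not taken from the EverVIEW documentation:

```python
import csv
import numpy as np
from netCDF4 import Dataset

# Hypothetical grid-based NetCDF file with a variable dimensioned (time, lat, lon)
with Dataset("everglades_stage.nc") as nc:            # file name is illustrative
    stage = nc.variables["water_stage"]               # variable name is illustrative
    subset = stage[0:10, 100:150, 200:260]            # temporal and spatial subset

# Export the filtered values to a comma-separated value file
filled = np.ma.filled(subset, np.nan)                 # masked (missing) cells become NaN
with open("subset.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["time_index", "lat_index", "lon_index", "value"])
    for (t, i, j), value in np.ndenumerate(filled):
        if not np.isnan(value):
            writer.writerow([t, i, j, value])
```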

  8. Spontaneous magnetization and anomalous Hall effect in an emergent Dice lattice

    PubMed Central

    Dutta, Omjyoti; Przysiężna, Anna; Zakrzewski, Jakub

    2015-01-01

    Ultracold atoms in optical lattices serve as a tool to model different physical phenomena appearing originally in condensed matter. To study magnetic phenomena one needs to engineer synthetic fields, as atoms are neutral. Appropriately shaped optical potentials force atoms to mimic charged particles moving in a given field. We present the realization of artificial gauge fields for the observation of the anomalous Hall effect. Two species of attractively interacting ultracold fermions are considered to be trapped in a shaken two-dimensional triangular lattice. A combination of interaction-induced tunneling and shaking can result in an emergent Dice lattice. In such a lattice a staggered synthetic magnetic flux appears, and it can be controlled with external parameters. The obtained synthetic fields are non-Abelian. Depending on the tuning of the staggered flux we can obtain either the anomalous Hall effect or its quantized version. Our results are reminiscent of anomalous Hall conductivity in spin-orbit-coupled ferromagnets. PMID:26057635

  9. KSC-2011-7504

    NASA Image and Video Library

    2011-10-04

    The Dynamic Ionosphere Cubesat Experiment (DICE) is prepared for launch aboard the Delta II rocket that will carry NASA's National Polar-orbiting Operational Environmental Satellite System Preparatory Project (NPP) spacecraft. DICE is a National Science Foundation project conducted by Utah State University in conjunction with Atmospheric and Space Technology Research Associates (ASTRA). NPP represents a critical first step in building the next generation of Earth-observing satellites. NPP will carry the first of the new sensors developed for this satellite fleet, now known as the Joint Polar Satellite System (JPSS), to be launched in 2016. NPP is the bridge between NASA's Earth Observing System (EOS) satellites and the forthcoming series of JPSS satellites. The mission will test key technologies and instruments for the JPSS missions. NPP is targeted to launch Oct. 28 from Space Launch Complex-2 aboard a United Launch Alliance Delta II rocket. For more information, visit http://www.nasa.gov/NPP. Photo credit: NASA/VAFB

  10. Multiatlas whole heart segmentation of CT data using conditional entropy for atlas ranking and selection.

    PubMed

    Zhuang, Xiahai; Bai, Wenjia; Song, Jingjing; Zhan, Songhua; Qian, Xiaohua; Shi, Wenzhe; Lian, Yanyun; Rueckert, Daniel

    2015-07-01

    Cardiac computed tomography (CT) is widely used in clinical diagnosis of cardiovascular diseases. Whole heart segmentation (WHS) plays a vital role in developing new clinical applications of cardiac CT. However, the shape and appearance of the heart can vary greatly across different scans, making automatic segmentation particularly challenging. The objective of this work is to develop and evaluate a multiatlas segmentation (MAS) scheme using a new atlas ranking and selection algorithm for automatic WHS of CT data. Research on different MAS strategies and their influence on WHS performance is limited. This work provides a detailed comparison study evaluating the impacts of label fusion, atlas ranking, and the size of the atlas database on segmentation performance. Atlases in a database were registered to the target image using a hierarchical registration scheme specifically designed for cardiac images. A subset of the atlases was selected for label fusion according to the authors' proposed atlas ranking criterion, which evaluated the performance of each atlas by computing the conditional entropy of the target image given the propagated atlas labeling. Joint label fusion was used to combine multiple label estimates to obtain the final segmentation. The authors used 30 clinical cardiac CT angiography (CTA) images to evaluate the proposed MAS scheme and to investigate different segmentation strategies. The mean WHS Dice score of the proposed MAS method was 0.918 ± 0.021, and the mean runtime for one case was 13.2 min on a workstation. This MAS scheme using joint label fusion generated significantly better Dice scores than the other label fusion strategies, including majority voting (0.901 ± 0.276, p < 0.01), locally weighted voting (0.905 ± 0.0247, p < 0.01), and probabilistic patch-based fusion (0.909 ± 0.0249, p < 0.01). In the atlas ranking study, the proposed criterion based on conditional entropy yielded a performance curve with higher WHS Dice scores compared to the conventional schemes (p < 0.03). In the atlas database study, the authors showed that MAS using larger atlas databases generated better performance curves than MAS using smaller ones, indicating that larger atlas databases can produce more accurate segmentation. The authors have developed a new MAS framework for automatic WHS of CTA and investigated alternative implementations of MAS. With the proposed atlas ranking algorithm and joint label fusion, the MAS scheme is able to generate accurate segmentation within practically acceptable computation time. This method can be useful for the development of new clinical applications of cardiac CT.
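    The atlas-ranking idea can be sketched as follows: for each atlas, estimate the conditional entropy H(T | L) of the target intensities T given the propagated atlas labels L, and keep the atlases with the lowest values. The discretization, variable names, and number of retained atlases below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def conditional_entropy(target, labels, n_bins=64):
    """Estimate H(T | L): entropy of target intensities given propagated atlas labels."""
    t_bins = np.digitize(target.ravel(), np.histogram_bin_edges(target, bins=n_bins))
    l_vals = labels.ravel()
    joint, _, _ = np.histogram2d(t_bins, l_vals, bins=(n_bins + 2, np.unique(l_vals).size))
    p_joint = joint / joint.sum()
    p_label = p_joint.sum(axis=0, keepdims=True)             # marginal over intensity bins
    with np.errstate(divide="ignore", invalid="ignore"):
        h = -np.nansum(p_joint * np.log(p_joint / p_label))  # 0 * log(0) terms drop out
    return float(h)

def rank_atlases(target, propagated_labelings, keep=5):
    """Indices of the `keep` atlases whose propagated labels best explain the target."""
    scores = [conditional_entropy(target, lab) for lab in propagated_labelings]
    return np.argsort(scores)[:keep]
```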

  11. WaferOptics® mass volume production and reliability

    NASA Astrophysics Data System (ADS)

    Wolterink, E.; Demeyer, K.

    2010-05-01

    The Anteryon WaferOptics® technology platform combines imaging optics designs, materials, and metrologies with wafer-level Semicon & MEMS production methods. WaferOptics® first required completely new system engineering. This system closes the loop between application requirement specifications, Anteryon product specifications, Monte Carlo analysis, process windows, process controls, and supply reject criteria. For the Anteryon product Integrated Lens Stack (ILS), new design rules, test methods, and control systems were assessed, implemented, validated, and customer-released for mass production. This includes novel reflowable materials, the mastering process, replication, bonding, dicing, assembly, metrology, reliability programs, and quality assurance systems. Numerous Design of Experiments studies were performed to assess correlations between optical performance parameters and machine settings across all process steps. Lens metrologies such as FFL, BFL, and MTF were adapted for wafer-level production, and wafer mapping was introduced for yield management. Test methods for screening and validating suitable optical materials were designed. Critical failure modes such as delamination and popcorning were assessed and modeled with FEM. Anteryon successfully integrated the different technologies, moving from single prototypes to high-yield mass-volume production. These parallel efforts resulted in a steep yield increase from 30% to over 90% within an 8-month period.

  12. A Deep Learning Approach to Digitally Stain Optical Coherence Tomography Images of the Optic Nerve Head.

    PubMed

    Devalla, Sripad Krishna; Chin, Khai Sing; Mari, Jean-Martial; Tun, Tin A; Strouthidis, Nicholas G; Aung, Tin; Thiéry, Alexandre H; Girard, Michaël J A

    2018-01-01

    To develop a deep learning approach to digitally stain optical coherence tomography (OCT) images of the optic nerve head (ONH). A horizontal B-scan was acquired through the center of the ONH using OCT (Spectralis) for one eye of each of 100 subjects (40 healthy and 60 glaucoma). All images were enhanced using adaptive compensation. A custom deep learning network was then designed and trained with the compensated images to digitally stain (i.e., highlight) six tissue layers of the ONH. The accuracy of our algorithm was assessed (against manual segmentations) using the Dice coefficient, sensitivity, specificity, intersection over union (IU), and accuracy. We studied the effect of compensation and of the number of training images, and compared performance between glaucoma and healthy subjects. For images it had not previously assessed, our algorithm was able to digitally stain the retinal nerve fiber layer + prelamina, the RPE, all other retinal layers, the choroid, and the peripapillary sclera and lamina cribrosa. Across all tissues, the mean Dice coefficient, sensitivity, specificity, IU, and accuracy were 0.84 ± 0.03, 0.92 ± 0.03, 0.99 ± 0.00, 0.89 ± 0.03, and 0.94 ± 0.02, respectively. Our algorithm performed significantly better when compensated images were used for training (P < 0.001). Besides offering good reliability, digital staining also performed well on OCT images of both glaucoma and healthy individuals. Our deep learning algorithm can simultaneously stain the neural and connective tissues of the ONH, offering a framework to automatically measure multiple key structural parameters of the ONH that may be critical to improving glaucoma management.
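    All of the reported per-tissue metrics derive from the same confusion counts between a predicted (digitally stained) mask and the corresponding manual segmentation. A minimal sketch, with illustrative mask names:

```python
import numpy as np

def staining_metrics(pred, truth):
    """Dice, sensitivity, specificity, intersection-over-union, and accuracy for one tissue."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.sum(pred & truth)
    tn = np.sum(~pred & ~truth)
    fp = np.sum(pred & ~truth)
    fn = np.sum(~pred & truth)
    return {
        "dice":        2 * tp / (2 * tp + fp + fn),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "iou":         tp / (tp + fp + fn),
        "accuracy":    (tp + tn) / (tp + tn + fp + fn),
    }
```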

  13. Brain Tumor Segmentation Using Convolutional Neural Networks in MRI Images.

    PubMed

    Pereira, Sergio; Pinto, Adriano; Alves, Victor; Silva, Carlos A

    2016-05-01

    Among brain tumors, gliomas are the most common and aggressive, leading to a very short life expectancy in their highest grade. Thus, treatment planning is a key stage to improve the quality of life of oncological patients. Magnetic resonance imaging (MRI) is a widely used imaging technique to assess these tumors, but the large amount of data produced by MRI prevents manual segmentation in a reasonable time, limiting the use of precise quantitative measurements in clinical practice. So, automatic and reliable segmentation methods are required; however, the large spatial and structural variability among brain tumors makes automatic segmentation a challenging problem. In this paper, we propose an automatic segmentation method based on Convolutional Neural Networks (CNN), exploring small 3 × 3 kernels. The use of small kernels allows designing a deeper architecture, besides having a positive effect against overfitting, given the smaller number of weights in the network. We also investigated the use of intensity normalization as a pre-processing step, which, though not common in CNN-based segmentation methods, proved, together with data augmentation, to be very effective for brain tumor segmentation in MRI images. Our proposal was validated in the Brain Tumor Segmentation Challenge 2013 database (BRATS 2013), obtaining simultaneously the first position for the complete, core, and enhancing regions in the Dice Similarity Coefficient metric (0.88, 0.83, 0.77) for the Challenge data set. Also, it obtained the overall first position on the online evaluation platform. We also participated in the on-site BRATS 2015 Challenge using the same model, obtaining the second place, with Dice Similarity Coefficients of 0.78, 0.65, and 0.75 for the complete, core, and enhancing regions, respectively.
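    The weight-count argument for small kernels can be made concrete: two stacked 3 × 3 convolutions cover the same receptive field as one 5 × 5 convolution, and three stacked 3 × 3 layers cover that of a 7 × 7 layer, with fewer parameters in both cases. A back-of-the-envelope check (the channel count is an arbitrary illustration):

```python
# Parameters of a convolution layer (ignoring bias): k * k * c_in * c_out
def conv_params(kernel, c_in, c_out):
    return kernel * kernel * c_in * c_out

C = 64  # illustrative channel count, kept constant across layers

# Same receptive field, different depth and weight count
single_5x5 = conv_params(5, C, C)        # 102,400 weights
two_3x3    = 2 * conv_params(3, C, C)    #  73,728 weights
single_7x7 = conv_params(7, C, C)        # 200,704 weights
three_3x3  = 3 * conv_params(3, C, C)    # 110,592 weights

print(single_5x5, two_3x3)    # 102400 73728
print(single_7x7, three_3x3)  # 200704 110592
```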

  14. Multi-Modal Glioblastoma Segmentation: Man versus Machine

    PubMed Central

    Pica, Alessia; Schucht, Philippe; Beck, Jürgen; Verma, Rajeev Kumar; Slotboom, Johannes; Reyes, Mauricio; Wiest, Roland

    2014-01-01

    Background and Purpose Reproducible segmentation of brain tumors on magnetic resonance images is an important clinical need. This study was designed to evaluate the reliability of a novel fully automated segmentation tool for brain tumor image analysis in comparison to manually defined tumor segmentations. Methods We prospectively evaluated preoperative MR images from 25 glioblastoma patients. Two independent expert raters performed manual segmentations. Automatic segmentations were performed using the Brain Tumor Image Analysis software (BraTumIA). In order to study the different tumor compartments, the complete tumor volume TV (enhancing part plus non-enhancing part plus necrotic core of the tumor), the TV+ (TV plus edema), and the contrast-enhancing tumor volume CETV were identified. We quantified the overlap between manual and automated segmentation by calculating diameter measurements as well as the Dice coefficients, positive predictive values, sensitivity, relative volume error, and absolute volume error. Results Comparison of automated versus manual extraction of 2-dimensional diameter measurements showed no significant difference (p = 0.29). Comparison of automated versus manual volumetric segmentations showed significant differences for TV+ and TV (p<0.05) but no significant differences for CETV (p>0.05) with regard to the Dice overlap coefficients. Spearman's rank correlation coefficients (ρ) of TV+, TV, and CETV showed highly significant correlations between automatic and manual segmentations. Tumor localization did not influence the accuracy of segmentation. Conclusions In summary, we demonstrated that BraTumIA supports radiologists and clinicians by providing accurate measures of cross-sectional diameter-based tumor extensions. The automated volume measurements were comparable to manual tumor delineation for CETV tumor volumes, and outperformed inter-rater variability for overlap and sensitivity. PMID:24804720

  15. Neutrosophic segmentation of breast lesions for dedicated breast CT

    NASA Astrophysics Data System (ADS)

    Lee, Juhun; Nishikawa, Robert M.; Reiser, Ingrid; Boone, John M.

    2017-03-01

    We proposed a neutrosophic approach for segmenting breast lesions in dedicated breast computed tomography (bCT) images. The neutrosophic set (NS) considers the nature and properties of neutrality (or indeterminacy), which is neither true nor false. We considered the image noise as the indeterminate component, while treating the breast lesion and other breast areas as the true and false components. We first transformed the image into the NS domain, where each voxel is described by its membership in the True, Indeterminate, and False sets. The operations α-mean, β-enhancement, and γ-plateau iteratively smooth and contrast-enhance the image to reduce the noise level of the true set. Once the true image no longer changes, we applied an existing algorithm for bCT images, RGI segmentation, to the resulting image to segment the breast lesions. We compared the segmentation performance of the proposed method (named NS-RGI) to that of the regular RGI segmentation. We used a total of 122 breast lesions (44 benign, 78 malignant) from 123 non-contrasted bCT cases. We measured the segmentation performance of the NS-RGI and the RGI using the DICE coefficient. The average DICE value of the NS-RGI was 0.82 (STD: 0.09), while that of the RGI was 0.80 (STD: 0.12). The difference between the two DICE values was statistically significant (paired t-test, p-value = 0.0007). We conducted a subsequent feature analysis on the resulting segmentations. The classifier performance for the NS-RGI (AUC = 0.80) improved over that of the RGI (AUC = 0.69, p-value = 0.006).
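    The exact operators used by the authors may differ, but the neutrosophic transform is commonly realized by taking T as a normalized local mean, I as a normalized local standard deviation (the indeterminacy due to noise), and F = 1 - T; the iterative α-mean and β-enhancement steps are omitted here. A hedged sketch:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def to_neutrosophic(img, window=5):
    """Map an image to (T, I, F) membership maps; one common neutrosophic formulation."""
    img = img.astype(float)
    local_mean = uniform_filter(img, size=window)
    local_sq_mean = uniform_filter(img ** 2, size=window)
    local_std = np.sqrt(np.maximum(local_sq_mean - local_mean ** 2, 0.0))

    def normalize(x):
        return (x - x.min()) / (x.max() - x.min() + 1e-12)

    T = normalize(local_mean)   # "true" membership: smoothed intensity
    I = normalize(local_std)    # "indeterminate" membership: local variation / noise
    F = 1.0 - T                 # "false" membership
    return T, I, F
```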

  16. Comparison of different deep learning approaches for parotid gland segmentation from CT images

    NASA Astrophysics Data System (ADS)

    Hänsch, Annika; Schwier, Michael; Gass, Tobias; Morgas, Tomasz; Haas, Benjamin; Klein, Jan; Hahn, Horst K.

    2018-02-01

    The segmentation of target structures and organs at risk is a crucial and very time-consuming step in radiotherapy planning. Good automatic methods can significantly reduce the time clinicians have to spend on this task. Due to its variability in shape and often low contrast to surrounding structures, segmentation of the parotid gland is especially challenging. Motivated by the recent success of deep learning, we study different deep learning approaches for parotid gland segmentation. Particularly, we compare 2D, 2D ensemble and 3D U-Net approaches and find that the 2D U-Net ensemble yields the best results with a mean Dice score of 0.817 on our test data. The ensemble approach reduces false positives without the need for an automatic region of interest detection. We also apply our trained 2D U-Net ensemble to segment the test data of the 2015 MICCAI head and neck auto-segmentation challenge. With a mean Dice score of 0.861, our classifier exceeds the highest mean score in the challenge. This shows that the method generalizes well onto data from independent sites. Since appropriate reference annotations are essential for training but often difficult and expensive to obtain, it is important to know how many samples are needed to properly train a neural network. We evaluate the classifier performance after training with differently sized training sets (50-450) and find that 250 cases (without using extensive data augmentation) are sufficient to obtain good results with the 2D ensemble. Adding more samples does not significantly improve the Dice score of the segmentations.
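    The ensemble step itself is simple to express: average the per-slice probability maps of several independently trained 2D networks and threshold the result. The sketch below assumes hypothetical model objects exposing a predict_slice method; the interface is illustrative, not the paper's implementation:

```python
import numpy as np

def ensemble_segmentation(volume, models, threshold=0.5):
    """Average slice-wise probability maps from several 2D models and threshold the mean.

    `models` is a list of objects with a hypothetical predict_slice(slice_2d) -> prob_map
    method; the interface is illustrative, not from the paper.
    """
    prob_sum = np.zeros(volume.shape, dtype=float)
    for model in models:
        for z in range(volume.shape[0]):
            prob_sum[z] += model.predict_slice(volume[z])
    return (prob_sum / len(models)) >= threshold   # binary parotid mask
```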

  17. Multi-modal and targeted imaging improves automated mid-brain segmentation

    NASA Astrophysics Data System (ADS)

    Plassard, Andrew J.; D'Haese, Pierre F.; Pallavaram, Srivatsan; Newton, Allen T.; Claassen, Daniel O.; Dawant, Benoit M.; Landman, Bennett A.

    2017-02-01

    The basal ganglia and limbic system, particularly the thalamus, putamen, internal and external globus pallidus, substantia nigra, and sub-thalamic nucleus, comprise a clinically relevant signal network for Parkinson's disease. In order to manually trace these structures, a combination of high-resolution and specialized sequences at 7T is used, but it is not feasible to scan clinical patients in those scanners. Targeted imaging sequences at 3T, such as F-GATIR and other optimized inversion recovery sequences, have been presented that enhance contrast in a select group of these structures. In this work, we show that a series of atlases generated at 7T can be used to accurately segment these structures at 3T using a combination of standard and optimized imaging sequences, though no one approach provided the best result across all structures. In the thalamus and putamen, a median Dice coefficient of over 0.88 and a mean surface distance of less than 1.0 mm were achieved using a combination of T1 and optimized inversion recovery imaging sequences. In the internal and external globus pallidus, a Dice coefficient of over 0.75 and a mean surface distance of less than 1.2 mm were achieved using a combination of T1 and F-GATIR imaging sequences. In the substantia nigra and sub-thalamic nucleus, a Dice coefficient of over 0.6 and a mean surface distance of less than 1.0 mm were achieved using the optimized inversion recovery imaging sequence. On average, using T1 and optimized inversion recovery together produced significantly better segmentation results than any individual modality (p<0.05, Wilcoxon signed-rank test).
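    The mean surface distance reported alongside the Dice coefficient can be computed from Euclidean distance transforms of the two segmentation boundaries. A minimal sketch, assuming isotropic 1 mm voxels for simplicity:

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def surface(mask):
    """Boundary voxels of a binary mask."""
    mask = mask.astype(bool)
    return mask & ~binary_erosion(mask)

def mean_surface_distance(a, b):
    """Symmetric mean surface distance between two binary segmentations (voxel units)."""
    sa, sb = surface(a), surface(b)
    d_to_b = distance_transform_edt(~sb)   # distance of every voxel to B's surface
    d_to_a = distance_transform_edt(~sa)   # distance of every voxel to A's surface
    return 0.5 * (d_to_b[sa].mean() + d_to_a[sb].mean())
```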

  18. The Annona muricata leaf ethanol extract affects mobility and reproduction in mutant strain NB327 Caenorhabditis elegans.

    PubMed

    Bustos, A V Gualteros; Jiménez, M Gómez; Mora, R M Sánchez

    2017-07-01

    The C. elegans NB327 mutant strain is characterized by knockdown of the dic-1 gene. The dic-1 gene is homologous to the dice-1 gene in humans, which encodes the tumor suppressor protein DICE-1. Absence or under-regulation of the dice-1 gene can be reflected in lung and prostate cancer [17], [18]. This study evaluated the effect of the ethanol extract of Annona muricata leaves (EEAML) on the C. elegans NB327 mutant strain. Phenotypic aspects such as morphology, body length, locomotion, and reproductive behaviour were analyzed. It is important to emphasize that the strain presents a characteristic phenotype with respect to egg laying and hatching. Reported studies have shown that Annona muricata extract and its active components exhibit anti-cancer and anti-tumor effects in in vivo and in vitro models. However, neurotoxicity has been reported as a side effect. The results showed that when the mutant strain NB327 was exposed to EEAML at a concentration of 5 mg/ml, it showed a significant decrease in average locomotion, 13 undulations in 30 s compared with the control strain's 17.5 undulations in 30 s. Similarly, the number of progenies was reduced from 188 (control strain) to 114 and 92 at EEAML doses of 1 mg/ml and 5 mg/ml, respectively. The results of this study suggest that EEAML has a possible neurotoxic effect at concentrations equal to or greater than 5 mg/ml, and that it does not have positive effects on the phenotype of the Caenorhabditis elegans NB327 mutant strain.

  19. Deep residual networks for automatic segmentation of laparoscopic videos of the liver

    NASA Astrophysics Data System (ADS)

    Gibson, Eli; Robu, Maria R.; Thompson, Stephen; Edwards, P. Eddie; Schneider, Crispin; Gurusamy, Kurinchi; Davidson, Brian; Hawkes, David J.; Barratt, Dean C.; Clarkson, Matthew J.

    2017-03-01

    Motivation: For primary and metastatic liver cancer patients undergoing liver resection, a laparoscopic approach can reduce recovery times and morbidity while offering equivalent curative results; however, only about 10% of tumours reside in anatomical locations that are currently accessible for laparoscopic resection. Augmenting laparoscopic video with registered vascular anatomical models from pre-procedure imaging could support using laparoscopy in a wider population. Segmentation of liver tissue on laparoscopic video supports the robust registration of anatomical liver models by filtering out false anatomical correspondences between pre-procedure and intra-procedure images. In this paper, we present a convolutional neural network (CNN) approach to liver segmentation in laparoscopic liver procedure videos. Method: We defined a CNN architecture comprising fully-convolutional deep residual networks with multi-resolution loss functions. The CNN was trained in a leave-one-patient-out cross-validation on 2050 video frames from 6 liver resections and 7 laparoscopic staging procedures, and evaluated using the Dice score. Results: The CNN yielded segmentations with Dice scores >=0.95 for the majority of images; however, the inter-patient variability in median Dice score was substantial. Four failure modes were identified from low scoring segmentations: minimal visible liver tissue, inter-patient variability in liver appearance, automatic exposure correction, and pathological liver tissue that mimics non-liver tissue appearance. Conclusion: CNNs offer a feasible approach for accurately segmenting liver from other anatomy on laparoscopic video, but additional data or computational advances are necessary to address challenges due to the high inter-patient variability in liver appearance.

  20. The role of the occupational therapist in the management of neuropsychiatric symptoms of dementia in clinical settings.

    PubMed

    Fraker, Joyce; Kales, Helen C; Blazek, Mary; Kavanagh, Janet; Gitlin, Laura N

    2014-01-01

    Neuropsychiatric symptoms (NPS) of dementia include aggression, agitation, depression, anxiety, delusions, hallucinations, apathy, and disinhibition. NPS affect dementia patients nearly universally across dementia stages and etiologies. They are associated with poor patient and caregiver outcomes, including increased health care utilization, excess morbidity and mortality, and earlier nursing home placement, as well as caregiver stress, depression, and reduced employment. There are no FDA-approved medications for NPS, but it is common clinical practice to use psychotropic medications, such as antipsychotics, to control symptoms; however, antipsychotics show only modest efficacy in improving NPS and carry significant risks for patients, including side effects and mortality. Nonpharmacologic treatments are considered first-line by multiple medical bodies and expert consensus, as they show evidence of efficacy and have limited potential for adverse effects. Ideally, nonpharmacological management of NPS in clinical settings occurs in multidisciplinary teams, where occupational therapists play an important collaborative role in the care of the person with dementia. Our group has articulated an evidence-informed, structured approach to the management of NPS that can be integrated into diverse practice settings and used by providers of various disciplines. The "DICE" (Describe, Investigate, Create, and Evaluate) approach is inherently patient- and caregiver-centered, as patient and caregiver concerns are integral to each step of the process. DICE offers a clinical reasoning approach through which providers can more efficiently and effectively choose optimal treatment plans. The purpose of this paper is to describe the role of the occupational therapist in using the DICE approach for NPS management.

  1. Two and three-dimensional segmentation of hyperpolarized 3He magnetic resonance imaging of pulmonary gas distribution

    NASA Astrophysics Data System (ADS)

    Heydarian, Mohammadreza; Kirby, Miranda; Wheatley, Andrew; Fenster, Aaron; Parraga, Grace

    2012-03-01

    A semi-automated method for generating hyperpolarized helium-3 (3He) measurements of individual-slice (2D) or whole-lung (3D) gas distribution was developed. 3He MRI functional images were segmented using two-dimensional (2D) and three-dimensional (3D) hierarchical K-means clustering of the 3He MRI signal; in addition, a seeded region-growing algorithm was employed for segmentation of the 1H MRI thoracic cavity volume. 3He MRI pulmonary function measurements were generated following two-dimensional landmark-based non-rigid registration of the 3He and 1H pulmonary images. We applied this method to MRI of healthy subjects and subjects with chronic obstructive pulmonary disease (COPD). The results of hierarchical K-means 2D and 3D segmentation were compared to an expert observer's manual segmentation results using linear regression, Pearson correlations, and the Dice similarity coefficient. 2D hierarchical K-means segmentation of ventilation volume (VV) and ventilation defect volume (VDV) was strongly and significantly correlated with manual measurements (VV: r=0.98, p<.0001; VDV: r=0.97, p<.0001), and mean Dice coefficients were greater than 92% for all subjects. 3D hierarchical K-means segmentation of VV and VDV was also strongly and significantly correlated with manual measurements (VV: r=0.98, p<.0001; VDV: r=0.64, p<.0001), and the mean Dice coefficients were greater than 91% for all subjects. Both 2D and 3D semi-automated segmentation of 3He MRI gas distribution provide a way to generate novel pulmonary function measurements.
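    The clustering step can be sketched with scikit-learn: K-means applied to the 3He signal intensities inside the thoracic cavity partitions voxels into intensity classes, with the lowest-signal cluster serving as a proxy for ventilation defect. The number of clusters and variable names below are illustrative assumptions:

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_ventilation(he3_volume, thoracic_mask, n_clusters=5):
    """Cluster 3He signal inside the thoracic cavity into intensity classes.

    Returns a label volume where 0 marks the lowest-signal cluster (a proxy for
    ventilation defect) and higher labels mark progressively better-ventilated tissue.
    """
    voxels = he3_volume[thoracic_mask].reshape(-1, 1).astype(float)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(voxels)

    # Re-order cluster labels by ascending mean signal so label 0 is the darkest cluster
    order = np.argsort(km.cluster_centers_.ravel())
    relabel = np.empty(n_clusters, dtype=int)
    relabel[order] = np.arange(n_clusters)

    labels = np.full(he3_volume.shape, -1, dtype=int)   # -1 outside the thoracic cavity
    labels[thoracic_mask] = relabel[km.labels_]
    return labels
```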

  2. Automated and simultaneous fovea center localization and macula segmentation using the new dynamic identification and classification of edges model.

    PubMed

    Onal, Sinan; Chen, Xin; Satamraju, Veeresh; Balasooriya, Maduka; Dabil-Karacal, Humeyra

    2016-07-01

    Detecting the position of retinal structures, including the fovea center and macula, in retinal images plays a key role in diagnosing eye diseases such as optic nerve hypoplasia, amblyopia, diabetic retinopathy, and macular edema. However, current detection methods are unreliable for infants or certain ethnic populations. Thus, a methodology that automatically localizes the fovea center and segments the macula on digital fundus images is proposed here; it may be useful for infants and across ethnicities. First, dark structures and bright artifacts are removed from the input image using preprocessing operations, and the resulting image is transformed to polar space. Second, the fovea center is identified, and the macula region is segmented using the proposed dynamic identification and classification of edges (DICE) model. The performance of the method was evaluated using 1200 fundus images obtained from the relatively large, diverse, and publicly available Messidor database. In 96.1% of these 1200 cases, the distance between the fovea center identified manually by ophthalmologists and automatically by the proposed method remained within 0 to 8 pixels. The Dice similarity index comparing the manually obtained results with those of the model for macula segmentation was 96.12% for these 1200 cases. Thus, the proposed method displayed a high degree of accuracy. The methodology using the DICE model is unique and advantageous over previously reported methods because it simultaneously determines the fovea center and segments the macula region without using any structural information, such as optic disc or blood vessel location, and it may prove useful for all populations, including infants.

  3. Cosmopolitan Dice Recast

    ERIC Educational Resources Information Center

    Papastephanou, Marianna

    2017-01-01

    This article argues that hegemonic cosmopolitan narrativity fails to frame a complex cosmopolitan normativity. The hegemonic cosmopolitan narrative celebrates a mobile selfhood merely hospitable to the encountered, mobile diversity that comes ashore. A recent educational-theoretical "refugee-crisis" initiative serves as an illustration…

  4. Dice and Disease in the Classroom.

    ERIC Educational Resources Information Center

    Stor, Marilyn; Briggs, William L.

    1998-01-01

    Presents a mathematics activity to model the exponential growth of the common cold, AIDS, or any other communicable disease. Underscores the effect that a friend's or partner's previous behavior may have on a current relationship and on society at large. (ASK)
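    A classroom version of such an activity is easy to simulate: each infected person rolls a die per round and passes the infection on for certain outcomes, which produces roughly geometric growth early in the outbreak. The rules below are an illustrative variant, not the exact activity described in the article:

```python
import random

def simulate_outbreak(rounds=10, infect_on=(1, 2, 3)):
    """Dice model of contagion: every infected person rolls one die per round and
    infects one new person whenever the roll lands in `infect_on` (here p = 1/2)."""
    infected = 1
    history = [infected]
    for _ in range(rounds):
        new_cases = sum(1 for _ in range(infected) if random.randint(1, 6) in infect_on)
        infected += new_cases
        history.append(infected)
    return history

print(simulate_outbreak())   # counts grow roughly geometrically (factor ~1.5 per round)
```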

  5. Portion size

    MedlinePlus

    ... a deck of cards One 3-ounce (84 grams) serving of fish is a checkbook One-half cup (40 grams) of ice cream is a tennis ball One ... cheese is six dice One-half cup (80 grams) of cooked rice, pasta, or snacks such as ...

  6. SU-F-J-98: Improvement and Evaluation Of Deformation Image Registration On Parotid Glands During Radiation Therapy for Nasopharyngeal Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, S; PLA General Hospital, Beijing; Wu, Z

    2016-06-15

    Purpose: To quantitatively evaluate the strategic innovation and accuracy variation of a deformation registration algorithm for parotid glands using the Dice similarity coefficient during the course of radiation therapy (RT) for nasopharyngeal cancer (NPC). Methods: Daily MVCT data for 10 patients with pathologically proven nasopharyngeal cancers were analyzed. The data were acquired using tomotherapy (TomoTherapy, Accuray) at the PLA General Hospital. The prescription dose to the primary target was 70 Gy in 33 fractions. Two kinds of contours for the parotid glands on daily MVCTs were obtained by populating these contours from planning CTs to the daily CTs via rigid-body registration with or without the rotation shifts, using the in-house tools and the Adaptive Plan software (Adaptive Plan, TomoTherapy), and were edited manually if necessary. The diffeomorphic Demons algorithm implemented in the in-house tool was used to propagate the parotid structures from the daily CTs to the planning CTs. The differences between the mapped parotid contours from the two methods were evaluated using the Dice similarity index (DSI). Two-tailed t-test analysis was carried out to compare the DSI changes during the course of RT. Results: For the 10 patient plans, the accuracy of deformable image registration (DIR) with the rotation shift was clearly better than without it. The Dice scores of the ipsi- and contralateral parotids with and without the rotation shifts were correlated with each other [0.904±0.031 vs 0.919±0.030 (p<0.001); 0.900±0.031 vs 0.910±0.032 (p<0.001)]. The Dice scores for the parotids decreased as the parotid volumes changed during RT. The DSI values between the first and last fractions were 0.932±0.020 vs 0.899±0.030 across the 10 patient plans. Conclusion: DIR was successfully improved using the strategic innovation for ART, and a decrease in DIR accuracy was also found during the delivery of fractionated radiotherapy. This work was supported in part by a grant from the Chinese Natural Science Foundation (Grant No. 11105225).

  7. A two-stage rule-constrained seedless region growing approach for mandibular body segmentation in MRI.

    PubMed

    Ji, Dong Xu; Foong, Kelvin Weng Chiong; Ong, Sim Heng

    2013-09-01

    Extraction of the mandible from 3D volumetric images is frequently required for surgical planning and evaluation. Image segmentation from MRI is more complex than CT due to lower bony signal-to-noise. An automated method to extract the human mandible body shape from magnetic resonance (MR) images of the head was developed and tested. Anonymous MR images data sets of the head from 12 subjects were subjected to a two-stage rule-constrained region growing approach to derive the shape of the body of the human mandible. An initial thresholding technique was applied followed by a 3D seedless region growing algorithm to detect a large portion of the trabecular bone (TB) regions of the mandible. This stage is followed with a rule-constrained 2D segmentation of each MR axial slice to merge the remaining portions of the TB regions with lower intensity levels. The two-stage approach was replicated to detect the cortical bone (CB) regions of the mandibular body. The TB and CB regions detected from the preceding steps were merged and subjected to a series of morphological processes for completion of the mandibular body region definition. Comparisons of the accuracy of segmentation between the two-stage approach, conventional region growing method, 3D level set method, and manual segmentation were made with Jaccard index, Dice index, and mean surface distance (MSD). The mean accuracy of the proposed method is [Formula: see text] for Jaccard index, [Formula: see text] for Dice index, and [Formula: see text] mm for MSD. The mean accuracy of CRG is [Formula: see text] for Jaccard index, [Formula: see text] for Dice index, and [Formula: see text] mm for MSD. The mean accuracy of the 3D level set method is [Formula: see text] for Jaccard index, [Formula: see text] for Dice index, and [Formula: see text] mm for MSD. The proposed method shows improvement in accuracy over CRG and 3D level set. Accurate segmentation of the body of the human mandible from MR images is achieved with the proposed two-stage rule-constrained seedless region growing approach. The accuracy achieved with the two-stage approach is higher than CRG and 3D level set.
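    The spirit of a seedless region-growing stage, thresholding followed by retaining sufficiently large connected components as candidate bone regions, can be sketched with SciPy. The threshold values and minimum component size are illustrative, not the paper's rules:

```python
import numpy as np
from scipy.ndimage import label

def seedless_region_grow(volume, low, high, min_size=50):
    """Keep connected components of voxels whose intensity lies in [low, high]
    and whose size exceeds `min_size`; a simplified stand-in for the paper's
    rule-constrained seedless region growing."""
    candidate = (volume >= low) & (volume <= high)
    labeled, n_components = label(candidate)
    keep = np.zeros_like(candidate)
    for region_id in range(1, n_components + 1):
        region = labeled == region_id
        if region.sum() >= min_size:
            keep |= region
    return keep
```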

  8. Physics Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1982

    1982-01-01

    Discusses dice model of exponential radionuclide decay; glancing and collinear perfectly elastic collisions; digital capacitance meter; use of top pan balance in physics; microcomputer calculation of gradient of straight line (includes complete Commodore PET computer program); Fresnel lenses; low-voltage radiant heater; Wheatstone's bridge used as…

  9. Albert and Erwin: decline and fall

    NASA Astrophysics Data System (ADS)

    Weaire, Denis

    2015-04-01

    More than a century has passed since quantum theory began to pose teasing questions about how we interpret our world. Books abound that offer alternative views of the problems the theory raises, and Einstein's Dice and Schrödinger's Cat is another.

  10. Hybrid thin-film amplifier

    NASA Technical Reports Server (NTRS)

    Cleveland, G.

    1977-01-01

    Miniature amplifier for bioelectronic instrumentation consumes only about 100 mW and has frequency response flat to within 0.5 dB from 0.14 to 450 Hz. Device consists of five thin film substrates, which contain eight operational amplifiers and seven field-effect transistor dice.

  11. 7 CFR 210.10 - Nutrition standards and menu planning approaches for lunches and requirements for afterschool...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... should not serve any one meat alternate or form of meat (for example, ground, diced, pieces) more than..., children in kindergarten through grade six are offered vegetables/fruits in minimum daily servings plus an...

  12. The Roll of the Dice: Differentiation Outcomes and the Role of Late Protoplanetary Impacts

    NASA Astrophysics Data System (ADS)

    Heinze, W. D.

    2018-05-01

    Because late accretion occurs through the impact of 10–100 (large) embryos, which have a low probability of being high-velocity events, and such events are necessary for magnetic dynamos, small-number statistics control differentiation outcomes.

  13. Introduction

    NASA Astrophysics Data System (ADS)

    Strzałko, Jarosław; Grabski, Juliusz; Perlikowski, Przemysław; Stefanski, Andrzej; Kapitaniak, Tomasz

    The definitions of gambling and gaming are given. We discuss the main differences between these terms. A brief history of gambling is presented. Physical models of the considered mechanical randomizers, namely the coin, the dice, and the roulette wheel, are introduced. We discuss under which conditions they can be fair.

  14. Vomeronasal and Olfactory Structures in Bats Revealed by DiceCT Clarify Genetic Evidence of Function

    PubMed Central

    Yohe, Laurel R.; Hoffmann, Simone; Curtis, Abigail

    2018-01-01

    The degree to which molecular and morphological loss of function occurs synchronously during the vestigialization of traits is not well understood. The mammalian vomeronasal system, a sense critical for mediating many social and reproductive behaviors, is highly conserved across mammals. New World Leaf-nosed bats (Phyllostomidae) are under strong selection to maintain a functional vomeronasal system such that most phyllostomids possess a distinct vomeronasal organ and an intact TRPC2, a gene encoding a protein primarily involved in vomeronasal sensory neuron signal transduction. Recent genetic evidence, however, shows that TRPC2 is a pseudogene in some Caribbean nectarivorous phyllostomids. The loss-of-function mutations suggest the sensory neural tissue of the vomeronasal organ is absent in these species despite strong selection on this gene in its mainland relatives, but the anatomy was unknown in most Caribbean nectarivorous phyllostomids until this study. We used diffusible iodine-based contrast-enhanced computed tomography (diceCT) to test whether the vomeronasal and main olfactory anatomy of several phyllostomid species matched genetic evidence of function, providing insight into whether loss of a structure is linked to pseudogenization of a molecular component of the system. The vomeronasal organ is indeed rudimentary or absent in species with a disrupted TRPC2 gene. Caribbean nectar-feeders also exhibit derived olfactory turbinal morphology and a large olfactory recess that differs from closely related bats that have an intact vomeronasal organ, which may hint that the main olfactory system may compensate for loss. We emphasize non-invasive diceCT is capable of detecting the vomeronasal organ, providing a feasible approach for quantifying mammalian chemosensory anatomy across species. PMID:29867373

  15. Inferior vena cava segmentation with parameter propagation and graph cut.

    PubMed

    Yan, Zixu; Chen, Feng; Wu, Fa; Kong, Dexing

    2017-09-01

    The inferior vena cava (IVC) is one of the vital veins inside the human body. Accurate segmentation of the IVC from contrast-enhanced CT images is of great importance: it not only helps the physician understand quantitative features such as blood flow and volume, but is also helpful during hepatic preoperative planning. However, manual delineation of the IVC is time-consuming and poorly reproducible. In this paper, we propose a novel method to segment the IVC with minimal user interaction. The proposed method performs the segmentation block by block between user-specified beginning and end masks. At each stage, the proposed method builds the segmentation model based on information from image regional appearances, image boundaries, and a prior shape. The intensity range and the prior shape for this segmentation model are estimated based on the segmentation result from the previous block, or from the user-specified beginning mask at the first stage. Then, the proposed method minimizes the energy function and generates the segmentation result for the current block using graph cut. Finally, a backward tracking step from the end of the IVC is performed if necessary. We have tested our method on 20 clinical datasets and compared our method to three other vessel extraction approaches. The evaluation was performed using three quantitative metrics: the Dice coefficient (Dice), the mean symmetric distance (MSD), and the Hausdorff distance (MaxD). In our experiments, the proposed method achieved a Dice of [Formula: see text], an MSD of [Formula: see text] mm, and a MaxD of [Formula: see text] mm. The proposed approach achieves sound performance with a relatively low computational cost and minimal user interaction, and it has high potential to be applied in clinical applications in the future.

  16. Comparison of classification methods for voxel-based prediction of acute ischemic stroke outcome following intra-arterial intervention

    NASA Astrophysics Data System (ADS)

    Winder, Anthony J.; Siemonsen, Susanne; Flottmann, Fabian; Fiehler, Jens; Forkert, Nils D.

    2017-03-01

    Voxel-based tissue outcome prediction in acute ischemic stroke patients is highly relevant for both clinical routine and research. Previous research has shown that features extracted from baseline multi-parametric MRI datasets have a high predictive value and can be used for the training of classifiers, which can generate tissue outcome predictions for both intravenous and conservative treatments. However, with the recent advent and popularization of intra-arterial thrombectomy treatment, novel research specifically addressing the utility of predictive classifiers for thrombectomy intervention is necessary for a holistic understanding of current stroke treatment options. The aim of this work was to develop three clinically viable tissue outcome prediction models using approximate nearest-neighbor, generalized linear model, and random decision forest approaches and to evaluate the accuracy of predicting tissue outcome after intra-arterial treatment. Therefore, the three machine learning models were trained, evaluated, and compared using datasets of 42 acute ischemic stroke patients treated with intra-arterial thrombectomy. Classifier training utilized eight voxel-based features extracted from baseline MRI datasets and five global features. Evaluation of classifier-based predictions was performed via comparison to the known tissue outcome, which was determined in follow-up imaging, using the Dice coefficient and leave-one-patient-out cross-validation. The random decision forest prediction model led to the best tissue outcome predictions with a mean Dice coefficient of 0.37. The approximate nearest-neighbor and generalized linear model performed equally suboptimally, with average Dice coefficients of 0.28 and 0.27, respectively, suggesting that both non-linearity and machine learning are desirable properties of a classifier well-suited to the intra-arterial tissue outcome prediction problem.
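    The best-performing configuration, a random decision forest trained on voxel-wise features and evaluated with leave-one-patient-out cross-validation against the Dice coefficient, can be sketched with scikit-learn. The feature-array layout and variable names are assumptions for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut

def dice(pred, truth):
    return 2 * np.sum(pred & truth) / (np.sum(pred) + np.sum(truth))

def leave_one_patient_out_dice(X, y, groups):
    """X: (n_voxels, n_features) voxel-wise features pooled over all patients,
    y: (n_voxels,) binary follow-up tissue outcome, groups: patient id per voxel."""
    scores = []
    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
        clf = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0)
        clf.fit(X[train_idx], y[train_idx])
        pred = clf.predict(X[test_idx]).astype(bool)
        scores.append(dice(pred, y[test_idx].astype(bool)))
    return np.mean(scores)
```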

  17. A New MRI Masking Technique Based on Multi-Atlas Brain Segmentation in Controls and Schizophrenia: A Rapid and Viable Alternative to Manual Masking.

    PubMed

    Del Re, Elisabetta C; Gao, Yi; Eckbo, Ryan; Petryshen, Tracey L; Blokland, Gabriëlla A M; Seidman, Larry J; Konishi, Jun; Goldstein, Jill M; McCarley, Robert W; Shenton, Martha E; Bouix, Sylvain

    2016-01-01

    Brain masking of MRI images separates brain from surrounding tissue and its accuracy is important for further imaging analyses. We implemented a new brain masking technique based on multi-atlas brain segmentation (MABS) and compared MABS to masks generated using FreeSurfer (FS; version 5.3), Brain Extraction Tool (BET), and Brainwash, using manually defined masks (MM) as the gold standard. We further determined the effect of different masking techniques on cortical and subcortical volumes generated by FreeSurfer. Images were acquired on a 3-Tesla MR Echospeed system General Electric scanner on five control and five schizophrenia subjects matched on age, sex, and IQ. Automated masks were generated from MABS, FS, BET, and Brainwash, and compared to MM using these metrics: a) volume difference from MM; b) Dice coefficients; and c) intraclass correlation coefficients. Mean volume difference between MM and MABS masks was significantly less than the difference between MM and FS or BET masks. Dice coefficient between MM and MABS was significantly higher than Dice coefficients between MM and FS, BET, or Brainwash. For subcortical and left cortical regions, MABS volumes were closer to MM volumes than were BET or FS volumes. For right cortical regions, MABS volumes were closer to MM volumes than were BET volumes. Brain masks generated using FreeSurfer, BET, and Brainwash are rapidly obtained, but are less accurate than manually defined masks. Masks generated using MABS, in contrast, resemble more closely the gold standard of manual masking, thereby offering a rapid and viable alternative. Copyright © 2015 by the American Society of Neuroimaging.

  18. Segmentation of malignant lesions in 3D breast ultrasound using a depth-dependent model.

    PubMed

    Tan, Tao; Gubern-Mérida, Albert; Borelli, Cristina; Manniesing, Rashindra; van Zelst, Jan; Wang, Lei; Zhang, Wei; Platel, Bram; Mann, Ritse M; Karssemeijer, Nico

    2016-07-01

    Automated 3D breast ultrasound (ABUS) has been proposed as a complementary screening modality to mammography for early detection of breast cancers. To facilitate the interpretation of ABUS images, automated diagnosis and detection techniques are being developed, in which malignant lesion segmentation plays an important role. However, automated segmentation of cancer in ABUS is challenging since lesion edges might not be well defined. In this study, the authors aim at developing an automated segmentation method for malignant lesions in ABUS that is robust to ill-defined cancer edges and posterior shadowing. A segmentation method using depth-guided dynamic programming based on spiral scanning is proposed. The method automatically adjusts aggressiveness of the segmentation according to the position of the voxels relative to the lesion center. Segmentation is more aggressive in the upper part of the lesion (close to the transducer) than at the bottom (far away from the transducer), where posterior shadowing is usually visible. The authors used Dice similarity coefficient (Dice) for evaluation. The proposed method is compared to existing state of the art approaches such as graph cut, level set, and smart opening and an existing dynamic programming method without depth dependence. In a dataset of 78 cancers, our proposed segmentation method achieved a mean Dice of 0.73 ± 0.14. The method outperforms an existing dynamic programming method (0.70 ± 0.16) on this task (p = 0.03) and it is also significantly (p < 0.001) better than graph cut (0.66 ± 0.18), level set based approach (0.63 ± 0.20) and smart opening (0.65 ± 0.12). The proposed depth-guided dynamic programming method achieves accurate breast malignant lesion segmentation results in automated breast ultrasound.

  19. Development and Implementation of a Corriedale Ovine Brain Atlas for Use in Atlas-Based Segmentation.

    PubMed

    Liyanage, Kishan Andre; Steward, Christopher; Moffat, Bradford Armstrong; Opie, Nicholas Lachlan; Rind, Gil Simon; John, Sam Emmanuel; Ronayne, Stephen; May, Clive Newton; O'Brien, Terence John; Milne, Marjorie Eileen; Oxley, Thomas James

    2016-01-01

    Segmentation is the process of partitioning an image into subdivisions and can be applied to medical images to isolate anatomical or pathological areas for further analysis. This process can be done manually or automated by the use of image processing computer packages. Atlas-based segmentation automates this process by the use of a pre-labelled template and a registration algorithm. We developed an ovine brain atlas that can be used as a model for neurological conditions such as Parkinson's disease and focal epilepsy. 17 female Corriedale ovine brains were imaged in-vivo in a 1.5T (low-resolution) MRI scanner. 13 of the low-resolution images were combined using a template construction algorithm to form a low-resolution template. The template was labelled to form an atlas and tested by comparing manual with atlas-based segmentations against the remaining four low-resolution images. The comparisons were in the form of similarity metrics used in previous segmentation research. Dice Similarity Coefficients were utilised to determine the degree of overlap between eight independent, manual and atlas-based segmentations, with values ranging from 0 (no overlap) to 1 (complete overlap). For 7 of these 8 segmented areas, we achieved a Dice Similarity Coefficient of 0.5-0.8. The amygdala was difficult to segment due to its variable location and similar intensity to surrounding tissues resulting in Dice Coefficients of 0.0-0.2. We developed a low resolution ovine brain atlas with eight clinically relevant areas labelled. This brain atlas performed comparably to prior human atlases described in the literature and to intra-observer error providing an atlas that can be used to guide further research using ovine brains as a model and is hosted online for public access.

  20. Diffusible iodine-based contrast-enhanced computed tomography (diceCT): an emerging tool for rapid, high-resolution, 3-D imaging of metazoan soft tissues.

    PubMed

    Gignac, Paul M; Kley, Nathan J; Clarke, Julia A; Colbert, Matthew W; Morhardt, Ashley C; Cerio, Donald; Cost, Ian N; Cox, Philip G; Daza, Juan D; Early, Catherine M; Echols, M Scott; Henkelman, R Mark; Herdina, A Nele; Holliday, Casey M; Li, Zhiheng; Mahlow, Kristin; Merchant, Samer; Müller, Johannes; Orsbon, Courtney P; Paluh, Daniel J; Thies, Monte L; Tsai, Henry P; Witmer, Lawrence M

    2016-06-01

    Morphologists have historically had to rely on destructive procedures to visualize the three-dimensional (3-D) anatomy of animals. More recently, however, non-destructive techniques have come to the forefront. These include X-ray computed tomography (CT), which has been used most commonly to examine the mineralized, hard-tissue anatomy of living and fossil metazoans. One relatively new and potentially transformative aspect of current CT-based research is the use of chemical agents to render visible, and differentiate between, soft-tissue structures in X-ray images. Specifically, iodine has emerged as one of the most widely used of these contrast agents among animal morphologists due to its ease of handling, cost effectiveness, and differential affinities for major types of soft tissues. The rapid adoption of iodine-based contrast agents has resulted in a proliferation of distinct specimen preparations and scanning parameter choices, as well as an increasing variety of imaging hardware and software preferences. Here we provide a critical review of the recent contributions to iodine-based, contrast-enhanced CT research to enable researchers just beginning to employ contrast enhancement to make sense of this complex new landscape of methodologies. We provide a detailed summary of recent case studies, assess factors that govern success at each step of the specimen storage, preparation, and imaging processes, and make recommendations for standardizing both techniques and reporting practices. Finally, we discuss potential cutting-edge applications of diffusible iodine-based contrast-enhanced computed tomography (diceCT) and the issues that must still be overcome to facilitate the broader adoption of diceCT going forward. © 2016 The Authors. Journal of Anatomy published by John Wiley & Sons Ltd on behalf of Anatomical Society.

  1. Sparsity-weighted outlier FLOODing (OFLOOD) method: Efficient rare event sampling method using sparsity of distribution.

    PubMed

    Harada, Ryuhei; Nakamura, Tomotake; Shigeta, Yasuteru

    2016-03-30

    As an extension of the Outlier FLOODing (OFLOOD) method [Harada et al., J. Comput. Chem. 2015, 36, 763], the sparsity of the outliers defined by a hierarchical clustering algorithm, FlexDice, was considered to achieve an efficient conformational search as sparsity-weighted "OFLOOD." In OFLOOD, FlexDice detects areas of sparse distribution as outliers. The outliers are regarded as candidates that have high potential to promote conformational transitions and are employed as initial structures for conformational resampling by restarting molecular dynamics simulations. When detecting outliers, FlexDice assigns each outlier a rank in the hierarchy that relates to the sparsity of the distribution. In this study, we define low-rank (first-ranked), medium-rank (second-ranked), and highest-rank (third-ranked) outliers. For instance, the first-ranked outliers are located in a given conformational space away from the clusters (a highly sparse distribution), whereas the third-ranked outliers lie near the clusters (a moderately sparse distribution). To achieve an efficient conformational search, resampling from the outliers of a given rank is performed. As demonstrations, this method was applied to several model systems: alanine dipeptide, Met-enkephalin, Trp-cage, T4 lysozyme, and glutamine binding protein. In each demonstration, the present method successfully reproduced transitions among metastable states. In particular, the first-ranked OFLOOD greatly accelerated the exploration of conformational space by expanding its edges. In contrast, the third-ranked OFLOOD intensively reproduced local transitions among neighboring metastable states. For quantitative evaluation of the sampled snapshots, free energy calculations were performed with a combination of umbrella samplings, providing rigorous landscapes of the biomolecules. © 2015 Wiley Periodicals, Inc.

  2. Management of chest deformity caused by microtia reconstruction: Comparison of autogenous diced cartilage versus cadaver cartilage graft partial filling techniques.

    PubMed

    Go, Ju Young; Kang, Bo Young; Hwang, Jin Hee; Oh, Kap Sung

    2017-01-01

    Efforts to prevent chest wall deformity after costal cartilage graft are ongoing. In this study, we introduce a new method to prevent donor site deformation using irradiated cadaver cartilage (ICC) and compare this method to the autogenous diced cartilage (ADC) technique. Forty-two pediatric patients comprised the ADC group (n = 24) and the ICC group (n = 18). After harvesting costal cartilage, the empty perichondrial space was filled with autologous diced cartilage in the ADC group and cadaver cartilage in the ICC group. Digital photographs and rib cartilage three-dimensional computed tomography (CT) data were analyzed to compare the preventive effect on donor site deformity. We compared the pre- and postoperative costal cartilage volumes using 3D-CT and graded the volumes (grade I: 0%-25%, grade II: 25%-50%, grade III: 50%-75%, and grade IV: 75%-100%). The average follow-up period was 20 and 24 months in the ADC and ICC groups, respectively. Grade IV maintenance of the previous costal cartilage volume was evident postoperatively in 22% of patients in the ADC group and 82% of patients in the ICC group. Intercostal space narrowing and chest wall depression were less pronounced in the ICC group. There were no complications or severe resorption of cadaver cartilage. The ICC supported the transected costal ring and prevented stability loss by acting as a spacer. The ICC technique is more effective in preventing intercostal space narrowing and chest wall depression than the ADC technique. Samsung Medical Center Institution Review Board, Unique protocol ID: 2009-10-006-008. This study is also registered on PRS (ClinicalTrials.gov Record 2009-10-006). Copyright © 2016 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  3. Duplex-imprinted nano well arrays for promising nanoparticle assembly

    NASA Astrophysics Data System (ADS)

    Li, Xiangping; Manz, Andreas

    2018-02-01

    A large-area nano-duplex-imprint technique is presented in this contribution using natural cicada wings as stamps. The glassy wings of the cicada, which are abundant in nature, exhibit strikingly interesting nanopillar structures over their membrane. This technique, with excellent performance despite the nonplanar surface of the wings, combines both top-down and bottom-up nanofabrication techniques. It transitions micro-nanofabrication from a cleanroom environment to the bench. Two different materials, dicing tape with an acrylic layer and a UV optical adhesive, are used to make replicas at the same time, thus achieving duplex imprinting. By speeding up fabrication and increasing throughput, this contribution points toward large-volume commercial manufacturing of such nanostructured elements. The contact angle of the replicated nanowell arrays was measured before and after oxygen plasma treatment. Gold nanoparticles (50 nm) were used to test how the nanoparticles behaved on the untreated and plasma-treated replica surfaces. The experiments show that promising nanoparticle self-assembly can be obtained.

  4. Deep silicon etching: current capabilities and future directions

    NASA Astrophysics Data System (ADS)

    Westerman, Russ; Martinez, Linnell; Pays-Volard, David; Mackenzie, Ken; Lazerand, Thierry

    2014-03-01

    Deep Reactive Ion Etching (DRIE) has revolutionized a wide variety of MEMS applications since its inception nearly two decades ago. The DRIE technology has been largely responsible for allowing lab scale technology demonstrations to become manufacturable and profitable consumer products. As applications which utilize DRIE technologies continue to expand and evolve, they continue to spawn a range of new requirements and open up exciting opportunities for advancement of DRIE. This paper will examine a number of current and emerging DRIE applications including nanotechnology, and DRIE related packaging technologies such as Through Silicon Via (TSV) and plasma dicing. The paper will discuss a number of technical challenges and solutions associated with these applications including: feature profile control at high aspect ratios, causes and elimination of feature tilt/skew, process options for fragile device structures, and problems associated with through substrate etching. The paper will close with a short discussion around the challenges of implementing DRIE in production environments as well as looking at potentially disruptive enhancements / substitutions for DRIE.

  5. Getting a Bill through Congress.

    ERIC Educational Resources Information Center

    Walch, J. Weston

    1985-01-01

    A game to help secondary civics students understand how a Congressional bill becomes law is presented. The game board is pictured; teachers have to run off copies on the photocopier and then add dice and some colored pieces of paper for students to use as markers. (RM)

  6. Modeling Two Types of Adaptation to Climate Change

    EPA Science Inventory

    Mitigation and adaptation are the two key responses available to policymakers to reduce the risks of climate change. We model these two policies together in a new DICE-based integrated assessment model that characterizes adaptation as either short-lived flow spending or long-live...

  7. 19 CFR 10.14 - Fabricated components subject to the exemption.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... assembled, such as transistors, diodes, integrated circuits, machinery parts, or precut parts of wearing..., or integrated circuit wafers containing individual integrated circuit dice which have been scribed or... resulted in a substantial transformation of the foreign copper ingots. Example 2. An integrated circuit...

  8. Reasoning and mathematical skills contribute to normatively superior decision making under risk: evidence from the game of dice task.

    PubMed

    Pertl, Marie-Theres; Zamarian, Laura; Delazer, Margarete

    2017-08-01

    In this study, we assessed to what extent reasoning improves performance in decision making under risk in a laboratory gambling task (Game of Dice Task-Double, GDT-D). We also investigated to what degree individuals with above average mathematical competence decide better than those with average mathematical competence. Eighty-five participants performed the GDT-D and several numerical tasks. Forty-two individuals were asked to calculate the probabilities and the outcomes associated with the different options of the GDT-D before performing it. The other 43 individuals performed the GDT-D at the beginning of the test session. Both reasoning and mathematical competence had a positive effect on decision making. Different measures of mathematical competence correlated with advantageous performance in decision making. Results suggest that decision making under explicit risk conditions improves when individuals are encouraged to reflect about the contingencies of a decision situation. Interventions based on numerical reasoning may also be useful for patients with difficulties in decision making.

  9. Variations on a simple dice game

    NASA Astrophysics Data System (ADS)

    Heafner, Joe

    2018-04-01

    I begin my introductory astronomy course with a unit on critical thinking that focuses on, among other things, the differences between the "scientific method" as frequently presented in textbooks and actual scientific practice. One particular classroom activity uses a simple dice game to simulate observation of a natural phenomenon and the process of figuring out the framework governing the simulated phenomenon, which we have previously defined as the rules that allow us to make predictions. Using games to teach scientific methodology is not new (see Maloney and Masters and Smith and references therein). I have experimented with Maloney and Masters' games and found that my students considered them too difficult to figure out, so they did not learn what I had hoped from them. I also experimented with other card games and found that too many students already knew the rules of both well-known and obscure card games. I even tried inventing my own games with, at best, mediocre results.

  10. Experiment and simulation study of laser dicing silicon with water-jet

    NASA Astrophysics Data System (ADS)

    Bao, Jiading; Long, Yuhong; Tong, Youqun; Yang, Xiaoqing; Zhang, Bin; Zhou, Zupeng

    2016-11-01

    Water-jet laser processing is an advanced technique that combines the advantages of laser processing with those of water-jet cutting. In this study, water-jet laser dicing experiments were conducted with a 1064 nm nanosecond pulsed laser, and a Smoothed Particle Hydrodynamics (SPH) model was built in the AUTODYN software to study the fluid dynamics of the water and the melt as the water jet impacts the molten material. The irradiated spots on the silicon surface show a porous morphology with a large number of cavities, which indicate bubble nucleation sites. The observed surface morphology suggests that explosive melt expulsion could be a dominant process in nanosecond 1064 nm laser ablation of silicon in liquids. A self-focusing phenomenon was also found, and its causes are analyzed. The SPH model was used to understand the effect of the water and the water jet on debris removal during water-jet laser machining.

  11. Public Perception of Climate Change and the New Climate Dice

    NASA Technical Reports Server (NTRS)

    Hansen, James; Sato, Makiko; Ruedy, Reto

    2012-01-01

    "Climate dice", describing the chance of unusually warm or cool seasons, have become more and more "loaded" in the past 30 years, coincident with rapid global warming. The distribution of seasonal mean temperature anomalies has shifted toward higher temperatures and the range of anomalies has increased. An important change is the emergence of a category of summertime extremely hot outliers, more than three standard deviations (3 sigma) warmer than the climatology of the 1951-1980 base period. This hot extreme, which covered much less than 1% of Earth's surface during the base period, now typically covers about 10% of the land area. It follows that we can state, with a high degree of confidence, that extreme anomalies such as those in Texas and Oklahoma in 2011 and Moscow in 2010 were a consequence of global warming, because their likelihood in the absence of global warming was exceedingly small. We discuss practical implications of this substantial, growing, climate change.

  12. "Five on a dice" port placement for robot-assisted thoracoscopic right upper lobectomy using robotic stapler.

    PubMed

    Kim, Min P; Chan, Edward Y

    2017-12-01

    Early versions of the da Vinci robot system (S and Si) have been used to perform pulmonary lung resection with severe limitations. The lack of a vascular robot stapler required the presence of a trained bedside assistant whose role was to place, manipulate and fire the stapler around major vascular structures. Thus, the techniques developed for the Si robot required a skilled bedside assistant to perform stapling of the hilar structure and manipulation of the lung. With the advent of the da Vinci Xi system with a vascular robot stapler, we postulated that we could develop a new port placement and technique to provide total control for the surgeon during the pulmonary lung resection. We found that the "five on a dice" port placement and technique allows for minimal assistance during the lobectomy with full control by the surgeon. This technique uses the full capability of the Xi robot to make the robot-assisted lobectomy a safe and ergonomic operation.

  13. Spinning the wheels and rolling the dice: life-cycle risks and benefits of bicycle commuting in the U.S.

    PubMed

    Edwards, Ryan D; Mason, Carl N

    2014-07-01

    To assess the net impact on U.S. longevity of the decision to commute by bicycle rather than automobile. We construct fatality rates per distance traveled using official statistics and denominators from the 2009 National Household Travel Survey. We model the life-table impact of switching from auto to bicycle commuting. Key factors are increased risks from road accidents and reduced risks from enhanced cardiovascular health. Bicycling fatality rates in the U.S. are an order of magnitude higher than in Western Europe. Risks punish both young and old, while the health benefits guard against causes of mortality that rise rapidly with age. Although the protective effects of bicycling appear significant, it may be optimal to wait until later ages to initiate regular bicycle commuting in the current U.S. risk environment, especially if individuals discount future life years. The lifetime health benefits of bicycle commuting appear to outweigh the risks in the U.S., but individuals who sufficiently discount or disbelieve the health benefits may delay or avoid bicycling. Bicycling in middle age avoids much fatality risk while capturing health benefits. Significant cross-state variations in bicycling mortality suggest that improvements in the built environment might spur changes in transit mode. Copyright © 2014 Elsevier Inc. All rights reserved.
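
    A toy life-table comparison illustrating the general approach of such an analysis; the mortality curve and risk multipliers below are hypothetical and are not the rates used in the study:

      # Toy life-table comparison; the Gompertz parameters and risk multipliers are
      # hypothetical and NOT the rates estimated in the paper.
      import numpy as np

      ages = np.arange(0, 111)
      baseline_qx = np.clip(0.0001 * np.exp(0.085 * ages), 0, 1)   # toy Gompertz mortality

      def life_expectancy(qx):
          qx = np.clip(qx, 0, 1)
          survival = np.cumprod(1.0 - qx)       # probability of surviving to each age
          return survival.sum()                 # approximate life expectancy at birth

      # Hypothetical scenario: +2% all-cause risk from road accidents,
      # -10% risk at ages 45+ from improved cardiovascular health.
      cyclist_qx = baseline_qx * 1.02
      cyclist_qx[ages >= 45] *= 0.90

      print(f"baseline e0: {life_expectancy(baseline_qx):.1f} years")
      print(f"cyclist  e0: {life_expectancy(cyclist_qx):.1f} years")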

  14. A Simple Statistical Thermodynamics Experiment

    ERIC Educational Resources Information Center

    LoPresto, Michael C.

    2010-01-01

    Comparing the predicted and actual rolls of combinations of both two and three dice can help to introduce many of the basic concepts of statistical thermodynamics, including multiplicity, probability, microstates, and macrostates, and demonstrate that entropy is indeed a measure of randomness, that disordered states (those of higher entropy) are…
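
    The multiplicity counts behind the activity can be enumerated directly; a short Python sketch for standard six-sided dice, treating each ordered roll as a microstate and each sum as a macrostate:

      # Enumerate microstates (ordered rolls) and macrostates (sums) for 2 and 3 dice.
      from itertools import product
      from collections import Counter
      from math import log

      for n_dice in (2, 3):
          multiplicity = Counter(sum(roll) for roll in product(range(1, 7), repeat=n_dice))
          total = 6 ** n_dice
          for s, W in sorted(multiplicity.items()):
              print(f"{n_dice} dice  sum={s:2d}  W={W:3d}  P={W/total:.3f}  lnW={log(W):.2f}")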

  15. Modeling Adaptation as a Flow and Stock Decision with Mitigation

    EPA Science Inventory

    Mitigation and adaptation are the two key responses available to policymakers to reduce the risks of climate change. We model these two policies together in a new DICE-based integrated assessment model that characterizes adaptation as either short-lived flow spending or long-live...

  16. Physics Notes.

    ERIC Educational Resources Information Center

    School Science Review, 1983

    1983-01-01

    Discusses the Rugby clock as a source of project material, use of ZX81 for experimental science, computer dice analog, oil recovery from reservoirs, and computer simulation of Thompson's experiment for determining e/m for an electron. Activities/procedures are provided when applicable. Also presents questions (and answers) related to time-coded…

  17. Modeling Adaptation as a Flow and Stock Decision with Mitigation

    EPA Science Inventory

    Mitigation and adaptation are the two key responses available to policymakers to reduce the risks of climate change. We model these two policies together in a new DICE-based integrated assessment model that characterizes adaptation as either short-lived flow spending or long-liv...

  18. An analysis of the pull strength behaviors of fine-pitch, flip chip solder interconnections using a Au-Pt-Pd thick film conductor on Low-Temperature, Co-fired Ceramic (LTCC) substrates.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uribe, Fernando R.; Kilgo, Alice C.; Grazier, John Mark

    2008-09-01

    The assembly of the BDYE detector requires the attachment of sixteen silicon (Si) processor dice (eight on the top side; eight on the bottom side) onto a low-temperature, co-fired ceramic (LTCC) substrate using 63Sn-37Pb (wt.%, Sn-Pb) in a double-reflow soldering process (nitrogen). There are 132 solder joints per die. The bond pads were gold-platinum-palladium (71Au-26Pt-3Pd, wt.%) thick film layers fired onto the LTCC in a post-process sequence. The pull strength and failure modes provided the quality metrics for the Sn-Pb solder joints. Pull strengths were measured in both the as-fabricated condition and after exposure to thermal cycling (-55/125 °C; 15 min hold times; 20 cycles). Extremely low pull strengths--referred to as the low pull strength phenomenon--were observed intermittently throughout the product build, resulting in added program costs, schedule delays, and a long-term reliability concern for the detector. There was no statistically significant correlation between the low pull strength phenomenon and (1) the LTCC 'sub-floor' lot; (2) grit blasting the LTCC surfaces prior to the post-process steps; (3) the post-process parameters; (4) the conductor pad height (thickness); (5) the dice soldering assembly sequence; or (6) the dice pull test sequence. Formation of an intermetallic compound (IMC)/LTCC interface caused by thick film consumption during either the soldering process or by solid-state IMC formation was not directly responsible for the low-strength phenomenon. Metallographic cross sections of solder joints from dice that exhibited the low pull strength behavior revealed the presence of a reaction layer resulting from an interaction between Sn from the molten Sn-Pb and the glassy phase at the TKN/LTCC interface. The thick film porosity did not contribute explicitly to the occurrence of the reaction layer. Rather, the process of printing the very thin conductor pads was too sensitive to minor thixotropic changes in the ink, which resulted in inconsistent proportions of metal and glassy phase particles being present during the subsequent firing process. The consequences were subtle, intermittent changes to the thick film microstructure that gave rise to the reaction layer and, thus, the low pull strength phenomenon. A mitigation strategy would be the use of physical vapor deposition (PVD) techniques to create thin film bond pads; this is multi-chip module, deposited (MCM-D) technology.

  19. Multiatlas whole heart segmentation of CT data using conditional entropy for atlas ranking and selection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhuang, Xiahai, E-mail: zhuangxiahai@sjtu.edu.cn; Qian, Xiaohua; Bai, Wenjia

    Purpose: Cardiac computed tomography (CT) is widely used in clinical diagnosis of cardiovascular diseases. Whole heart segmentation (WHS) plays a vital role in developing new clinical applications of cardiac CT. However, the shape and appearance of the heart can vary greatly across different scans, making the automatic segmentation particularly challenging. The objective of this work is to develop and evaluate a multiatlas segmentation (MAS) scheme using a new atlas ranking and selection algorithm for automatic WHS of CT data. Research on different MAS strategies and their influence on WHS performance is limited. This work provides a detailed comparison study evaluating the impacts of label fusion, atlas ranking, and sizes of the atlas database on the segmentation performance. Methods: Atlases in a database were registered to the target image using a hierarchical registration scheme specifically designed for cardiac images. A subset of the atlases were selected for label fusion, according to the authors' proposed atlas ranking criterion, which evaluated the performance of each atlas by computing the conditional entropy of the target image given the propagated atlas labeling. Joint label fusion was used to combine multiple label estimates to obtain the final segmentation. The authors used 30 clinical cardiac CT angiography (CTA) images to evaluate the proposed MAS scheme and to investigate different segmentation strategies. Results: The mean WHS Dice score of the proposed MAS method was 0.918 ± 0.021, and the mean runtime for one case was 13.2 min on a workstation. This MAS scheme using joint label fusion generated significantly better Dice scores than the other label fusion strategies, including majority voting (0.901 ± 0.276, p < 0.01), locally weighted voting (0.905 ± 0.0247, p < 0.01), and probabilistic patch-based fusion (0.909 ± 0.0249, p < 0.01). In the atlas ranking study, the proposed criterion based on conditional entropy yielded a performance curve with higher WHS Dice scores compared to the conventional schemes (p < 0.03). In the atlas database study, the authors showed that the MAS using larger atlas databases generated better performance curves than the MAS using smaller ones, indicating that larger atlas databases could produce more accurate segmentation. Conclusions: The authors have developed a new MAS framework for automatic WHS of CTA and investigated alternative implementations of MAS. With the proposed atlas ranking algorithm and joint label fusion, the MAS scheme is able to generate accurate segmentation within practically acceptable computation time. This method can be useful for the development of new clinical applications of cardiac CT.
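
    For reference, the Dice similarity coefficient used as the WHS metric above is 2|A∩B|/(|A|+|B|); a minimal Python implementation on hypothetical binary masks:

      # Dice similarity coefficient between two binary segmentation masks.
      import numpy as np

      def dice(a: np.ndarray, b: np.ndarray) -> float:
          a, b = a.astype(bool), b.astype(bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      # Hypothetical 2D masks standing in for atlas-based and reference labels.
      ref = np.zeros((64, 64), dtype=bool); ref[16:48, 16:48] = True
      seg = np.zeros((64, 64), dtype=bool); seg[20:52, 18:50] = True
      print(f"Dice = {dice(ref, seg):.3f}")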

  20. Classroom Idea-Sparkers

    ERIC Educational Resources Information Center

    Dettore, Ernie

    2004-01-01

    Introducing nursery rhymes to young children can inspire them to explore language and motivate them to explore word play further in meaningful experiences (like cooking) that can be integrated into all aspects of the curriculum. Whether they slice, dice, or add allspice, these actions are appealing, because they contain many activities that help…

  1. 25 CFR 140.21 - Gambling.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 1 2011-04-01 2011-04-01 false Gambling. 140.21 Section 140.21 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR FINANCIAL ACTIVITIES LICENSED INDIAN TRADERS § 140.21 Gambling. Gambling, by dice, cards, or in any way whatever, is strictly prohibited in any licensed trader's store or...

  2. 25 CFR 141.28 - Gambling prohibited.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 1 2011-04-01 2011-04-01 false Gambling prohibited. 141.28 Section 141.28 Indians BUREAU..., HOPI AND ZUNI RESERVATIONS General Business Practices § 141.28 Gambling prohibited. No licensee may permit any person to gamble by dice, cards, or in any way whatever, including the use of any mechanical...

  3. Active Learning? Not with My Syllabus!

    ERIC Educational Resources Information Center

    Ernst, Michael D.

    2012-01-01

    We describe an approach to teaching probability that minimizes the amount of class time spent on the topic while also providing a meaningful (dice-rolling) activity to get students engaged. The activity, which has a surprising outcome, illustrates the basic ideas of informal probability and how probability is used in statistical inference.…

  4. 40 CFR 407.71 - Specialized definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., parsley, asparagus, tomatoes, green beans, corn, spinach, green onion tops, chives, leeks, whole, diced, and any other piece size ranging from sliced to powder. (i) The term dry beans shall mean the... formulated sauces, meats and gravies. (j) The term lima beans shall mean the processing of lima beans into...

  5. 40 CFR 407.71 - Specialized definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., parsley, asparagus, tomatoes, green beans, corn, spinach, green onion tops, chives, leeks, whole, diced, and any other piece size ranging from sliced to powder. (i) The term dry beans shall mean the... formulated sauces, meats and gravies. (j) The term lima beans shall mean the processing of lima beans into...

  6. 40 CFR 407.71 - Specialized definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., parsley, asparagus, tomatoes, green beans, corn, spinach, green onion tops, chives, leeks, whole, diced, and any other piece size ranging from sliced to powder. (i) The term dry beans shall mean the... formulated sauces, meats and gravies. (j) The term lima beans shall mean the processing of lima beans into...

  7. 25 CFR 141.28 - Gambling prohibited.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Gambling prohibited. 141.28 Section 141.28 Indians BUREAU..., HOPI AND ZUNI RESERVATIONS General Business Practices § 141.28 Gambling prohibited. No licensee may permit any person to gamble by dice, cards, or in any way whatever, including the use of any mechanical...

  8. 25 CFR 140.21 - Gambling.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Gambling. 140.21 Section 140.21 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR FINANCIAL ACTIVITIES LICENSED INDIAN TRADERS § 140.21 Gambling. Gambling, by dice, cards, or in any way whatever, is strictly prohibited in any licensed trader's store or...

  9. 25 CFR 140.21 - Gambling.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false Gambling. 140.21 Section 140.21 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR FINANCIAL ACTIVITIES LICENSED INDIAN TRADERS § 140.21 Gambling. Gambling, by dice, cards, or in any way whatever, is strictly prohibited in any licensed trader's store or...

  10. 25 CFR 141.28 - Gambling prohibited.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 25 Indians 1 2012-04-01 2011-04-01 true Gambling prohibited. 141.28 Section 141.28 Indians BUREAU..., HOPI AND ZUNI RESERVATIONS General Business Practices § 141.28 Gambling prohibited. No licensee may permit any person to gamble by dice, cards, or in any way whatever, including the use of any mechanical...

  11. 25 CFR 140.21 - Gambling.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false Gambling. 140.21 Section 140.21 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR FINANCIAL ACTIVITIES LICENSED INDIAN TRADERS § 140.21 Gambling. Gambling, by dice, cards, or in any way whatever, is strictly prohibited in any licensed trader's store or...

  12. 25 CFR 141.28 - Gambling prohibited.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false Gambling prohibited. 141.28 Section 141.28 Indians BUREAU..., HOPI AND ZUNI RESERVATIONS General Business Practices § 141.28 Gambling prohibited. No licensee may permit any person to gamble by dice, cards, or in any way whatever, including the use of any mechanical...

  13. 25 CFR 141.28 - Gambling prohibited.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false Gambling prohibited. 141.28 Section 141.28 Indians BUREAU..., HOPI AND ZUNI RESERVATIONS General Business Practices § 141.28 Gambling prohibited. No licensee may permit any person to gamble by dice, cards, or in any way whatever, including the use of any mechanical...

  14. 25 CFR 140.21 - Gambling.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 25 Indians 1 2012-04-01 2011-04-01 true Gambling. 140.21 Section 140.21 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR FINANCIAL ACTIVITIES LICENSED INDIAN TRADERS § 140.21 Gambling. Gambling, by dice, cards, or in any way whatever, is strictly prohibited in any licensed trader's store or...

  15. DARPA DICE Manufacturing Optimization

    DTIC Science & Technology

    1993-01-01

    A resource is any facility, labor, equipment, or consumable material used in the manufacturing process.

  16. Exploring Multiplication: Three-in-a-Row Lucky Numbers

    ERIC Educational Resources Information Center

    Russo, James A.

    2018-01-01

    Three-in-a-Row Lucky Numbers is an engaging, enjoyable, mathematically meaningful, game-based activity involving dice and a hundred chart, which can be used to introduce students to multiplication. The game provides a mechanism for students to explore the structure of multiplication, experiment with the distributive property, and begin to…

  17. Familiar Sports and Activities Adapted for Multiply Impaired Persons.

    ERIC Educational Resources Information Center

    Schilling, Mary Lou, Ed.

    1984-01-01

    Means of adapting some familiar and popular physical activities for multiply impaired persons are described. Games reviewed are dice baseball, one base baseball, in-house bowling, wheelchair bowling, ramp bowling, swing-ball bowling, table tennis, shuffleboard, beanbag bingo and tic-tac-toe, balloon basketball, circle football, and wheelchair…

  18. Using Lotus 1-2-3 for "Non-Stop" Graphic Simulation.

    ERIC Educational Resources Information Center

    Godin, Victor B.; Rao, Ashok

    1988-01-01

    Discusses the use of Lotus 1-2-3 to create non-stop graphic displays of simulation models. Describes a simple application of this technique using the distribution resulting from repeated throws of dice. Lists other software used with this technique. Stresses the advantages of this approach in education. (CW)

  19. Dice and DNA

    ERIC Educational Resources Information Center

    Wernersson, Rasmus

    2007-01-01

    An important part of teaching students how to use the BLAST tool for searching large sequence databases, is to train the students to think critically about the quality of the sequence hits found--both in terms of the statistical significance and how informative the individual hits are. This paper describes how generating truly random sequences by…

  20. 40 CFR 407.71 - Specialized definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., parsley, asparagus, tomatoes, green beans, corn, spinach, green onion tops, chives, leeks, whole, diced, and any other piece size ranging from sliced to powder. (i) The term dry beans shall mean the... formulated sauces, meats and gravies. (j) The term lima beans shall mean the processing of lima beans into...

  1. Multiple-Solution Problems in a Statistics Classroom: An Example

    ERIC Educational Resources Information Center

    Chu, Chi Wing; Chan, Kevin L. T.; Chan, Wai-Sum; Kwong, Koon-Shing

    2017-01-01

    The mathematics education literature shows that encouraging students to develop multiple solutions for given problems has a positive effect on students' understanding and creativity. In this paper, we present an example of multiple-solution problems in statistics involving a set of non-traditional dice. In particular, we consider the exact…

  2. Visualizing and Understanding Probability and Statistics: Graphical Simulations Using Excel

    ERIC Educational Resources Information Center

    Gordon, Sheldon P.; Gordon, Florence S.

    2009-01-01

    The authors describe a collection of dynamic interactive simulations for teaching and learning most of the important ideas and techniques of introductory statistics and probability. The modules cover such topics as randomness, simulations of probability experiments such as coin flipping, dice rolling and general binomial experiments, a simulation…
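
    The same kind of simulation is easy to script outside a spreadsheet; a short sketch comparing a simulated binomial (coin-flip) experiment with the exact distribution, using illustrative trial counts:

      # Simulated vs. exact binomial distribution for counting heads in 10 coin flips.
      import numpy as np
      from scipy.stats import binom

      rng = np.random.default_rng(1)
      n_flips, n_trials = 10, 10_000
      heads = rng.binomial(n_flips, 0.5, size=n_trials)
      simulated = np.bincount(heads, minlength=n_flips + 1) / n_trials

      for k in range(n_flips + 1):
          print(f"k={k:2d}  simulated={simulated[k]:.3f}  exact={binom.pmf(k, n_flips, 0.5):.3f}")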

  3. Farkle Fundamentals and Fun. Activities for Students

    ERIC Educational Resources Information Center

    Hooley, Donald E.

    2014-01-01

    The dice game Farkle provides an excellent basis for four activities that reinforce probability and expected value concepts for students in an introductory statistics class. These concepts appear in the increasingly popular AP statistics course (Peck 2011) and are used in analyzing ethical issues from insurance and gambling (COMAP 2009; Woodward…
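
    One calculation such an activity can support is the chance of "farkling" (rolling nothing that scores). The sketch below assumes the common rule that single 1s, single 5s, and any three-of-a-kind score; rule sets vary, so this scoring choice is an assumption:

      # Probability of a "farkle" (no scoring dice) when rolling k dice, assuming
      # single 1s, single 5s, and three-of-a-kind are the only scoring patterns.
      from itertools import product
      from collections import Counter

      def is_farkle(roll):
          counts = Counter(roll)
          if counts[1] or counts[5]:
              return False
          return not any(c >= 3 for c in counts.values())

      for k in range(1, 7):
          total = 6 ** k
          farkles = sum(is_farkle(r) for r in product(range(1, 7), repeat=k))
          print(f"{k} dice: P(farkle) = {farkles}/{total} = {farkles/total:.3f}")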

  4. PREFACE: DICE 2006—Quantum Mechanics between Decoherence and Determinism

    NASA Astrophysics Data System (ADS)

    Diósi, Lajos; Elze, Hans-Thomas; Vitiello, Giuseppe

    2007-06-01

    These proceedings are based on the Invited Lectures and Contributed Papers of the Third International Workshop on Decoherence, Information, Complexity and Entropy—DICE 2006, which was held at Castello di Piombino (Tuscany), 11-15 September 2006. They are meant to document the stimulating exchange of ideas at this interdisciplinary workshop and to share it with the wider scientific community. It successfully continued what was begun with DICE 2002 [1] and followed by DICE 2004 [2], uniting more than seventy participants from more than a dozen different countries worldwide. It has been a great honour and inspiration for all of us to have Professor G. 't Hooft (Nobel Prize for Physics 1999) from the Spinoza Institute and University of Utrecht with us, who presented the lecture `A mathematical theory for deterministic quantum mechanics' (included in this volume). Discussions under the wider theme `Quantum Mechanics between decoherence and determinism: new aspects from particle physics to cosmology' took place in the very pleasant and productive atmosphere at the Castello di Piombino, with a fluctuation of stormy weather only on the evening of the conference dinner. The program of the workshop was grouped according to the following topics: complex systems, classical and quantum aspects; Lorentz symmetry, neutrinos and the Universe; reduction, decoherence and entanglement; quantum gravity and spacetime -- emergent reality?; and quantum gravity/cosmology. The traditional Public Opening Lecture was presented this time by E. Del Giudice (Milano), who captivated the audience with `Old and new views on the structure of matter and the special case of living matter' on the evening of the arrival day. The workshop has been organized by S. Boccaletti (Firenze), L. Diósi (Budapest), H.-T. Elze (Pisa, chair), L. Fronzoni (Pisa), J. Halliwell (London), and G. Vitiello (Salerno), with great help from our conference secretaries M. Pesce-Rollins (Siena) and L. Baldini (Pisa). Several institutions and sponsors generously supported the workshop and their representatives and, in particular, the citizens of Piombino are deeply thanked for the hospitality: G. Anselmi (Sindaco del Comune di Piombino), O. Dell'Omodarme (Assessore alle Culture), A. Tempestini (Assessore alla Pubblica Istruzione), E. Murzi (Assessore al Turismo), A. Falchi (Dirigente dei Servizi Educativi e Culturali), M. Gianfranchi (Responsabile del Servizio Promozione Culturale), T. Ghini (Ufficio Beni Culturali), and L. Grilli, C. Boggero and P. Venturi (Ufficio Cultura), M. Pierulivo (Segreteria del Sindaco), L. Pasquinucci (URP e Comunicazione). Thanks go to Idearte (Cooperativa di Servizi Culturali) and especially to L. Pesce (Vitrium Galleria, Populonia). Funds made available by Università di Pisa (Centro Interdisciplinare per lo Studio dei Sistemi Complessi -- CISSC and Domus Galilaeana) and Università di Salerno (Dipartimento di Fisica and INFN) are gratefully acknowledged. The research papers presented at the workshop, often incorporating further developments since then, have been edited by L. Diósi, H.-T. Elze and G. Vitiello. They are collected here, essentially following the program of the workshop, however, divided into Invited Lectures and Contributed Papers, respectively. In the name of all participants, we would like to thank G. Douglas (IOP Publishing, Bristol) for his friendly advice and immediate help during the editing process.
Lajos Diósi, Hans-Thomas Elze and Giuseppe Vitiello
Budapest, Pisa, Salerno, March 2007
[1] Decoherence and Entropy in Complex Systems, ed H-T Elze, Lecture Notes in Physics 633 (Berlin: Springer, 2004)
[2] Proceedings of the Second International Workshop on Decoherence, Information, Complexity and Entropy -- DICE 2004, ed H-T Elze, Braz. J. Phys. 35, 2A and 2B (2005) pp 205-529, freely accessible at: www.sbfisica.org.br/bjp

  5. Development of a 35-MHz piezo-composite ultrasound array for medical imaging.

    PubMed

    Cannata, Jonathan M; Williams, Jay A; Zhou, Qifa; Ritter, Timothy A; Shung, K Kirk

    2006-01-01

    This paper discusses the development of a 64-element 35-MHz composite ultrasonic array. This array was designed primarily for ocular imaging applications, and features 2-2 composite elements mechanically diced out of a fine-grain high-density Navy Type VI ceramic. Array elements were spaced at a 50-micron pitch, interconnected via a custom flexible circuit and matched to the 50-ohm system electronics via a 75-ohm transmission line coaxial cable. Elevation focusing was achieved using a cylindrically shaped epoxy lens. One functional 64-element array was fabricated and tested. Bandwidths averaging 55%, 23-dB insertion loss, and crosstalk less than -24 dB were measured. An image of a tungsten wire target phantom was acquired using a synthetic aperture reconstruction algorithm. The results from this imaging test demonstrate resolution exceeding 50 μm axially and 100 μm laterally.

  6. TECHNICAL NOTE: High-speed grinding using thin abrasive disks for microcomponents

    NASA Astrophysics Data System (ADS)

    Yeo, S. H.; Balon, S. A. P.

    2002-01-01

    This paper introduces the development of a high-speed grinding device for cylindrical grinding of microcomponents made of hard and brittle materials. The study made use of an ultraprecision diamond turning machine tool as a basic platform. The novelty of the device is based on the high-speed air bearing spindle with a thin grinding wheel, similar to the dicing technology for silicon wafer fabrication. The spindle attachment is inclined at an angle to the main spindle, which holds the precision fixture mechanism via the vacuum chuck. Experiments have been conducted to verify the design and implementation of the grinding methodology. A feature size as small as 31 μm in diameter and an average surface roughness of 98 nm were obtained in the experimental work. The results show that this approach is capable of manufacturing miniature components, such as microcylindrical stepped shafts.

  7. Flip chip bumping technology—Status and update

    NASA Astrophysics Data System (ADS)

    Juergen Wolf, M.; Engelmann, Gunter; Dietrich, Lothar; Reichl, Herbert

    2006-09-01

    Flip chip technology is a key driver for new complex system architectures and high-density packaging, e.g. sensor or pixel devices. As key elements, bumped wafers/dice are becoming very important in terms of general availability at low cost, high yield, and high quality. Today, different materials, e.g. Au, Ni, AuSn, SnAg, SnAgCu, SnCu, etc., are used for flip chip interconnects, and different bumping approaches are available. Electroplating is the technology of choice for high-yield wafer bumping at small bump sizes and pitches. Lead-free solder bumps require a deeper understanding of under bump metallization (UBM), of the interaction between the bump and the substrate metallization, and of the formation and growth of intermetallic compounds (IMCs) during liquid- and solid-phase reactions. Results for a new Ni-Cu bi-layer UBM, designed especially for small lead-free solder bumps, will be discussed.

  8. The Accuracy and Reliability of Crowdsource Annotations of Digital Retinal Images

    PubMed Central

    Mitry, Danny; Zutis, Kris; Dhillon, Baljean; Peto, Tunde; Hayat, Shabina; Khaw, Kay-Tee; Morgan, James E.; Moncur, Wendy; Trucco, Emanuele; Foster, Paul J.

    2016-01-01

    Purpose Crowdsourcing is based on outsourcing computationally intensive tasks to numerous individuals in the online community who have no formal training. Our aim was to develop a novel online tool designed to facilitate large-scale annotation of digital retinal images, and to assess the accuracy of crowdsource grading using this tool, comparing it to expert classification. Methods We used 100 retinal fundus photograph images with predetermined disease criteria selected by two experts from a large cohort study. The Amazon Mechanical Turk Web platform was used to drive traffic to our site so anonymous workers could perform a classification and annotation task of the fundus photographs in our dataset after a short training exercise. Three groups were assessed: masters only, nonmasters only and nonmasters with compulsory training. We calculated the sensitivity, specificity, and area under the curve (AUC) of receiver operating characteristic (ROC) plots for all classifications compared to expert grading, and used the Dice coefficient and consensus threshold to assess annotation accuracy. Results In total, we received 5389 annotations for 84 images (excluding 16 training images) in 2 weeks. A specificity and sensitivity of 71% (95% confidence interval [CI], 69%–74%) and 87% (95% CI, 86%–88%) was achieved for all classifications. The AUC in this study for all classifications combined was 0.93 (95% CI, 0.91–0.96). For image annotation, a maximal Dice coefficient (∼0.6) was achieved with a consensus threshold of 0.25. Conclusions This study supports the hypothesis that annotation of abnormalities in retinal images by ophthalmologically naive individuals is comparable to expert annotation. The highest AUC and agreement with expert annotation was achieved in the nonmasters with compulsory training group. Translational Relevance The use of crowdsourcing as a technique for retinal image analysis may be comparable to expert graders and has the potential to deliver timely, accurate, and cost-effective image analysis. PMID:27668130

  9. The Accuracy and Reliability of Crowdsource Annotations of Digital Retinal Images.

    PubMed

    Mitry, Danny; Zutis, Kris; Dhillon, Baljean; Peto, Tunde; Hayat, Shabina; Khaw, Kay-Tee; Morgan, James E; Moncur, Wendy; Trucco, Emanuele; Foster, Paul J

    2016-09-01

    Crowdsourcing is based on outsourcing computationally intensive tasks to numerous individuals in the online community who have no formal training. Our aim was to develop a novel online tool designed to facilitate large-scale annotation of digital retinal images, and to assess the accuracy of crowdsource grading using this tool, comparing it to expert classification. We used 100 retinal fundus photograph images with predetermined disease criteria selected by two experts from a large cohort study. The Amazon Mechanical Turk Web platform was used to drive traffic to our site so anonymous workers could perform a classification and annotation task of the fundus photographs in our dataset after a short training exercise. Three groups were assessed: masters only, nonmasters only and nonmasters with compulsory training. We calculated the sensitivity, specificity, and area under the curve (AUC) of receiver operating characteristic (ROC) plots for all classifications compared to expert grading, and used the Dice coefficient and consensus threshold to assess annotation accuracy. In total, we received 5389 annotations for 84 images (excluding 16 training images) in 2 weeks. A specificity and sensitivity of 71% (95% confidence interval [CI], 69%-74%) and 87% (95% CI, 86%-88%) was achieved for all classifications. The AUC in this study for all classifications combined was 0.93 (95% CI, 0.91-0.96). For image annotation, a maximal Dice coefficient (∼0.6) was achieved with a consensus threshold of 0.25. This study supports the hypothesis that annotation of abnormalities in retinal images by ophthalmologically naive individuals is comparable to expert annotation. The highest AUC and agreement with expert annotation was achieved in the nonmasters with compulsory training group. The use of crowdsourcing as a technique for retinal image analysis may be comparable to expert graders and has the potential to deliver timely, accurate, and cost-effective image analysis.
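
    The reported metrics follow from standard definitions; a compact sketch computing sensitivity, specificity, and ROC AUC for hypothetical crowd scores against expert labels (scikit-learn assumed available; the data below are illustrative, not the study's):

      # Sensitivity, specificity and ROC AUC for hypothetical crowd gradings
      # against an expert reference (illustrative data only).
      import numpy as np
      from sklearn.metrics import roc_auc_score

      expert = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0])                     # expert labels
      crowd_score = np.array([.9, .7, .4, .2, .6, .1, .8, .3, .55, .15])    # mean crowd vote
      crowd_label = (crowd_score >= 0.5).astype(int)

      tp = np.sum((crowd_label == 1) & (expert == 1))
      tn = np.sum((crowd_label == 0) & (expert == 0))
      fp = np.sum((crowd_label == 1) & (expert == 0))
      fn = np.sum((crowd_label == 0) & (expert == 1))

      print(f"sensitivity = {tp / (tp + fn):.2f}")
      print(f"specificity = {tn / (tn + fp):.2f}")
      print(f"AUC         = {roc_auc_score(expert, crowd_score):.2f}")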

  10. FinFET memory cell improvements for higher immunity against single event upsets

    NASA Astrophysics Data System (ADS)

    Sajit, Ahmed Sattar

    The 21st century is witnessing a tremendous demand for transistors, which have been incorporated into every aspect of daily life, from toys to rocket science. Scaling down the transistor is becoming ever more necessary, but it is not a straightforward process and faces overwhelming challenges. Because of these scaling issues, new technologies such as FinFETs have emerged as alternatives to conventional bulk-CMOS technology. A FinFET has more control over the channel, so leakage current is reduced, and FinFETs could bridge the gap between silicon and non-silicon devices. The semiconductor industry is now incorporating FinFETs in systems and subsystems; for example, Intel has been using them in its newest processors, delivering power savings and increased speed in memory circuits. Memory subsystems are a vital component of the digital era. In a memory, only a few rows are read or written at a time while most rows are static; hence, reducing leakage current increases performance. However, as transistors shrink, they become more vulnerable to strikes by radioactive particles. If a particle hits a node in a memory cell, the stored content might flip, corrupting the data. Critical fields such as medicine and aerospace, where there are no second chances and even 99.99% accuracy is not sufficient, motivated this search for a robust circuit in a radiation environment. This research examines a spectrum of memories, including 6T SRAM, 8T SRAM, and DICE memory cells in FinFET technology, to find the best platform in terms of read and write delay, SNM and RSNM susceptibility, leakage current, energy consumption, and Single Event Upsets (SEUs). The research shows that the SEU tolerance provided by 6T and 8T FinFET SRAMs may not be acceptable in medical and aerospace applications, where SEUs are very likely. Consequently, FinFET DICE memory can be a good candidate because of its high ability to tolerate SEUs of different amplitudes and long durations during both read and hold operations.

  11. Influencing Children's Pregambling Game Playing via Conditional Discrimination Training

    ERIC Educational Resources Information Center

    Johnson, Taylor E.; Dixon, Mark R.

    2009-01-01

    Past research has demonstrated a transformation of stimulus functions under similar conditions using gambling tasks and adults (e.g., Zlomke & Dixon, 2006), and the present study attempted to extend this research. Experimenters exposed 7 children (ages 7 to 10 years) to a simulated board game with concurrently available dice differing only by…

  12. Generalizing Galileo's Passe-Dix Game

    ERIC Educational Resources Information Center

    Hombas, Vassilios

    2012-01-01

    This article shows a generalization of Galileo's "passe-dix" game. The game was born following one of Galileo's [G. Galileo, "Sopra le Scoperte dei Dadi" (Galileo, Opere, Firenze, Barbera, Vol. 8). Translated by E.H. Thorne, 1898, pp. 591-594] explanations on a paradox that occurred in the experiment of tossing three fair "six-sided" dice.…
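
    The paradox Galileo resolved is that sums of 9 and 10 with three dice each arise from six unordered combinations yet occur with different frequencies; a brief enumeration in Python, including the passe-dix event (sum above ten) as it is commonly described:

      # Galileo's three-dice paradox: sums 9 and 10 each have six unordered
      # combinations, but 10 occurs more often once order is counted.
      from itertools import product
      from collections import Counter

      counts = Counter(sum(r) for r in product(range(1, 7), repeat=3))
      total = 6 ** 3
      print(f"P(sum=9)  = {counts[9]}/{total}  = {counts[9]/total:.4f}")
      print(f"P(sum=10) = {counts[10]}/{total} = {counts[10]/total:.4f}")
      print(f"P(passe-dix, sum>10) = {sum(v for s, v in counts.items() if s > 10)/total:.4f}")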

  13. A Hands-On Activity to Introduce the Effects of Transmission by an Invasive Species

    ERIC Educational Resources Information Center

    May, Barbara Jean

    2013-01-01

    This activity engages students to better understand the impact of transmission by invasive species. Using dice, poker chips, and paper plates, an entire class mimics the spread of an invasive species within a geographic region. The activity can be modified and conducted at the K-16 levels.

  14. Slicing and Dicing the ELA Common Core Standards

    ERIC Educational Resources Information Center

    Goatley, Virginia

    2012-01-01

    The English Language Arts Common Core State Standards (ELA CCSS) come at a time when many reading teachers, literacy coaches, and classroom teachers seek more extensive literacy practices than the policy mandates of No Child Left Behind and Reading First. These initiatives placed requirements for instruction in core aspects of reading at the…

  15. 16 CFR 1500.18 - Banned toys and other banned articles intended for use by children.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... be offset by the availability of substitutes for a comparable price. (D) Least burdensome requirement... dice, or balls permanently enclosed inside pinball machines, mazes, or similar outer containers. A ball... from the outer container. (iii) In determining whether such a ball is intended for use by children...

  16. Throwing the Dice: Teaching the Hemocytometer

    ERIC Educational Resources Information Center

    Salm, Sarah; Goodwyn, Lauren; van Loon, Nanette; Lind, Georgia

    2010-01-01

    One of the concepts taught to science students is the use of the hemocytometer. Students in microbiology, genetics, and anatomy and physiology (A&P) classes use the hemocytometer in a variety of activities. In microbiology and genetics classes, it is used to quantify yeast cells, while in A&P classes, students learn how to count blood cells. This…

  17. Using Picture Story Books to Discover and Explore the Concept of Equivalence

    ERIC Educational Resources Information Center

    Russo, James

    2016-01-01

    This article describes activities in which students deepen their relational understanding of the equals sign through exploring inequalities in a competitive dice game, built around the familiar fairy-tale "The Three Little Pigs" and "The Big Bad Wolf." The activity can be adapted to different abilities by choosing more or less…

  18. What's Missing in Teaching Probability and Statistics: Building Cognitive Schema for Understanding Random Phenomena

    ERIC Educational Resources Information Center

    Kuzmak, Sylvia

    2016-01-01

    Teaching probability and statistics is more than teaching the mathematics itself. Historically, the mathematics of probability and statistics was first developed through analyzing games of chance such as the rolling of dice. This article makes the case that the understanding of probability and statistics is dependent upon building a…

  19. Generalizing Galileo's passé-dix game

    NASA Astrophysics Data System (ADS)

    Hombas, Vassilios

    2012-07-01

    This article shows a generalization of Galileo's 'passé-dix' game. The game was born following one of Galileo's [G. Galileo, Sopra le Scoperte dei Dadi (Galileo, Opere, Firenze, Barbera, Vol. 8). Translated by E.H. Thorne, 1898, pp. 591-594] explanations on a paradox that occurred in the experiment of tossing three fair 'six-sided' dice.

  20. Automated and real-time segmentation of suspicious breast masses using convolutional neural network

    PubMed Central

    Gregory, Adriana; Denis, Max; Meixner, Duane D.; Bayat, Mahdi; Whaley, Dana H.; Fatemi, Mostafa; Alizad, Azra

    2018-01-01

    In this work, a computer-aided tool for detection was developed to segment breast masses from clinical ultrasound (US) scans. The underlying Multi U-net algorithm is based on convolutional neural networks. Under the Mayo Clinic Institutional Review Board protocol, a prospective study of the automatic segmentation of suspicious breast masses was performed. The cohort consisted of 258 female patients who were clinically identified with suspicious breast masses and underwent clinical US scan and breast biopsy. The computer-aided detection tool effectively segmented the breast masses, achieving a mean Dice coefficient of 0.82, a true positive fraction (TPF) of 0.84, and a false positive fraction (FPF) of 0.01. By avoiding positioning of an initial seed, the algorithm is able to segment images in real time (13–55 ms per image), and can have potential clinical applications. The algorithm is on par with a conventional seeded algorithm, which had a mean Dice coefficient of 0.84, and performs significantly better (P < 0.0001) than the original U-net algorithm. PMID:29768415

  1. Automatic detection of cone photoreceptors in split detector adaptive optics scanning light ophthalmoscope images.

    PubMed

    Cunefare, David; Cooper, Robert F; Higgins, Brian; Katz, David F; Dubra, Alfredo; Carroll, Joseph; Farsiu, Sina

    2016-05-01

    Quantitative analysis of the cone photoreceptor mosaic in the living retina is potentially useful for early diagnosis and prognosis of many ocular diseases. Non-confocal split detector based adaptive optics scanning light ophthalmoscope (AOSLO) imaging reveals the cone photoreceptor inner segment mosaics often not visualized on confocal AOSLO imaging. Despite recent advances in automated cone segmentation algorithms for confocal AOSLO imagery, quantitative analysis of split detector AOSLO images is currently a time-consuming manual process. In this paper, we present the fully automatic adaptive filtering and local detection (AFLD) method for detecting cones in split detector AOSLO images. We validated our algorithm on 80 images from 10 subjects, showing an overall mean Dice's coefficient of 0.95 (standard deviation 0.03), when comparing our AFLD algorithm to an expert grader. This is comparable to the inter-observer Dice's coefficient of 0.94 (standard deviation 0.04). To the best of our knowledge, this is the first validated, fully-automated segmentation method which has been applied to split detector AOSLO images.

  2. Evaluation of tomotherapy MVCT image enhancement program for tumor volume delineation

    PubMed Central

    Martin, Spencer; Rodrigues, George; Chen, Quan; Pavamani, Simon; Read, Nancy; Ahmad, Belal; Hammond, J. Alex; Venkatesan, Varagur; Renaud, James

    2011-01-01

    The aims of this study were to investigate the variability between physicians in delineation of head and neck tumors on original tomotherapy megavoltage CT (MVCT) studies and corresponding software-enhanced MVCT images, and to establish an optimal approach for evaluation of image improvement. Five physicians contoured the gross tumor volume (GTV) for three head and neck cancer patients on 34 original and enhanced MVCT studies. Variation between original and enhanced MVCT studies was quantified by the Dice coefficient and the coefficient of variation. Based on the volume of agreement between physicians, average Dice coefficients for GTV delineation were 15%, 3%, and 7% higher on enhanced MVCT for patients 1, 2, and 3, respectively, while delineation variance among physicians was reduced using enhanced MVCT for 12 of 17 weekly image studies. Enhanced MVCT provides advantages in reducing variance among physicians in delineation of the GTV. Agreement on contouring by the same physician on both original and enhanced MVCT was equally high. PACS numbers: 87.57.N‐, 87.57.np, 87.57.nt

  3. Comparison of thyroid segmentation techniques for 3D ultrasound

    NASA Astrophysics Data System (ADS)

    Wunderling, T.; Golla, B.; Poudel, P.; Arens, C.; Friebe, M.; Hansen, C.

    2017-02-01

    The segmentation of the thyroid in ultrasound images is a field of active research. The thyroid is a gland of the endocrine system and regulates several body functions. Measuring the volume of the thyroid is regular practice of diagnosing pathological changes. In this work, we compare three approaches for semi-automatic thyroid segmentation in freehand-tracked three-dimensional ultrasound images. The approaches are based on level set, graph cut and feature classification. For validation, sixteen 3D ultrasound records were created with ground truth segmentations, which we make publicly available. The properties analyzed are the Dice coefficient when compared against the ground truth reference and the effort of required interaction. Our results show that in terms of Dice coefficient, all algorithms perform similarly. For interaction, however, each algorithm has advantages over the other. The graph cut-based approach gives the practitioner direct influence on the final segmentation. Level set and feature classifier require less interaction, but offer less control over the result. All three compared methods show promising results for future work and provide several possible extensions.

  4. SU-E-J-133: Autosegmentation of Linac CBCT: Improved Accuracy Via Penalized Likelihood Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Y

    2015-06-15

    Purpose: To improve the quality of kV X-ray cone beam CT (CBCT) for use in radiotherapy delivery assessment and re-planning by using penalized likelihood (PL) iterative reconstruction and auto-segmentation accuracy of the resulting CBCTs as an image quality metric. Methods: Present filtered backprojection (FBP) CBCT reconstructions can be improved upon by PL reconstruction with image formation models and appropriate regularization constraints. We use two constraints: 1) image smoothing via an edge preserving filter, and 2) a constraint minimizing the differences between the reconstruction and a registered prior image. Reconstructions of prostate therapy CBCTs were computed with constraint 1 alone and with both constraints. The prior images were planning CTs (pCT) deformable-registered to the FBP reconstructions. Anatomy segmentations were done using atlas-based auto-segmentation (Elekta ADMIRE). Results: We observed small but consistent improvements in the Dice similarity coefficients of PL reconstructions over the FBP results, and additional small improvements with the added prior image constraint. For a CBCT with anatomy very similar in appearance to the pCT, we observed these changes in the Dice metric: +2.9% (prostate), +8.6% (rectum), −1.9% (bladder). For a second CBCT with a very different rectum configuration, we observed +0.8% (prostate), +8.9% (rectum), −1.2% (bladder). For a third case with significant lateral truncation of the field of view, we observed: +0.8% (prostate), +8.9% (rectum), −1.2% (bladder). Adding the prior image constraint raised Dice measures by about 1%. Conclusion: Efficient and practical adaptive radiotherapy requires accurate deformable registration and accurate anatomy delineation. We show here small and consistent patterns of improved contour accuracy using PL iterative reconstruction compared with FBP reconstruction. However, the modest extent of these results and the pattern of differences across CBCT cases suggest that significant further development will be required to make CBCT useful to adaptive radiotherapy.

  5. Semi-automated brain tumor segmentation on multi-parametric MRI using regularized non-negative matrix factorization.

    PubMed

    Sauwen, Nicolas; Acou, Marjan; Sima, Diana M; Veraart, Jelle; Maes, Frederik; Himmelreich, Uwe; Achten, Eric; Huffel, Sabine Van

    2017-05-04

    Segmentation of gliomas in multi-parametric (MP-)MR images is challenging due to their heterogeneous nature in terms of size, appearance and location. Manual tumor segmentation is a time-consuming task and clinical practice would benefit from (semi-)automated segmentation of the different tumor compartments. We present a semi-automated framework for brain tumor segmentation based on non-negative matrix factorization (NMF) that does not require prior training of the method. L1-regularization is incorporated into the NMF objective function to promote spatial consistency and sparseness of the tissue abundance maps. The pathological sources are initialized through user-defined voxel selection. Knowledge about the spatial location of the selected voxels is combined with tissue adjacency constraints in a post-processing step to enhance segmentation quality. The method is applied to an MP-MRI dataset of 21 high-grade glioma patients, including conventional, perfusion-weighted and diffusion-weighted MRI. To assess the effect of using MP-MRI data and the L1-regularization term, analyses are also run using only conventional MRI and without L1-regularization. Robustness against user input variability is verified by considering the statistical distribution of the segmentation results when repeatedly analyzing each patient's dataset with a different set of random seeding points. Using L1-regularized semi-automated NMF segmentation, mean Dice scores of 65%, 74% and 80% are found for active tumor, the tumor core and the whole tumor region. Mean Hausdorff distances of 6.1 mm, 7.4 mm and 8.2 mm are found for active tumor, the tumor core and the whole tumor region. Lower Dice scores and higher Hausdorff distances are found without L1-regularization and when only considering conventional MRI data. Based on the mean Dice scores and Hausdorff distances, the segmentation results are competitive with the state of the art in the literature. Robust results were found for most patients, although careful voxel selection is mandatory to avoid sub-optimal segmentation.
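
    To make the role of the L1 term concrete, the sketch below implements a minimal NMF with multiplicative updates and an L1 (sparsity) penalty on the abundance matrix H. It is an illustrative simplification with assumed toy data, not the NMF pipeline or parameter settings of the cited paper.

      import numpy as np

      def sparse_nmf(V, rank, l1=0.1, n_iter=200, eps=1e-9, seed=0):
          # V: non-negative (features x voxels) data matrix
          # W: source signatures, H: spatial abundance maps (the L1 target)
          rng = np.random.default_rng(seed)
          n, m = V.shape
          W = rng.random((n, rank)) + eps
          H = rng.random((rank, m)) + eps
          for _ in range(n_iter):
              H *= (W.T @ V) / (W.T @ W @ H + l1 + eps)  # L1 term shrinks H toward sparsity
              W *= (V @ H.T) / (W @ H @ H.T + eps)
          return W, H

      # toy usage: 5 imaging features observed at 100 voxels, 3 tissue sources
      V = np.abs(np.random.default_rng(1).normal(size=(5, 100)))
      W, H = sparse_nmf(V, rank=3)
      labels = H.argmax(axis=0)  # label each voxel by its dominant source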

  6. Rapid Contour-based Segmentation for 18F-FDG PET Imaging of Lung Tumors by Using ITK-SNAP: Comparison to Expert-based Segmentation.

    PubMed

    Besson, Florent L; Henry, Théophraste; Meyer, Céline; Chevance, Virgile; Roblot, Victoire; Blanchet, Elise; Arnould, Victor; Grimon, Gilles; Chekroun, Malika; Mabille, Laurence; Parent, Florence; Seferian, Andrei; Bulifon, Sophie; Montani, David; Humbert, Marc; Chaumet-Riffaud, Philippe; Lebon, Vincent; Durand, Emmanuel

    2018-04-03

    Purpose To assess the performance of the ITK-SNAP software for fluorodeoxyglucose (FDG) positron emission tomography (PET) segmentation of complex-shaped lung tumors compared with an optimized, expert-based manual reference standard. Materials and Methods Seventy-six FDG PET images of thoracic lesions were retrospectively segmented by using ITK-SNAP software. Each tumor was manually segmented by six raters to generate an optimized reference standard by using the simultaneous truth and performance level estimate algorithm. Four raters segmented 76 FDG PET images of lung tumors twice by using the ITK-SNAP active contour algorithm. Accuracy of the ITK-SNAP procedure was assessed by using the Dice coefficient and the Hausdorff metric. Interrater and intrarater reliability were estimated by using intraclass correlation coefficients of output volumes. Finally, the ITK-SNAP procedure was compared with currently recommended PET tumor delineation methods based on thresholding the volume of interest (VOI) at 41% (VOI41) and 50% (VOI50) of the tumor's maximal metabolism intensity. Results Accuracy estimates for the ITK-SNAP procedure indicated a Dice coefficient of 0.83 (95% confidence interval: 0.77, 0.89) and a Hausdorff distance of 12.6 mm (95% confidence interval: 9.82, 15.32). Interrater reliability was an intraclass correlation coefficient of 0.94 (95% confidence interval: 0.91, 0.96). The intrarater reliabilities were intraclass correlation coefficients above 0.97. Finally, VOI41 and VOI50 accuracy metrics were as follows: Dice coefficient, 0.48 (95% confidence interval: 0.44, 0.51) and 0.34 (95% confidence interval: 0.30, 0.38), respectively, and Hausdorff distance, 25.6 mm (95% confidence interval: 21.7, 31.4) and 31.3 mm (95% confidence interval: 26.8, 38.4), respectively. Conclusion ITK-SNAP is accurate and reliable for active-contour-based segmentation of heterogeneous thoracic PET tumors. ITK-SNAP surpassed the recommended PET methods when compared with the ground-truth manual segmentation. © RSNA, 2018.
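
    The Hausdorff metric reported above can be computed from the boundary (or surface) points of two segmentations; the sketch below uses SciPy's directed_hausdorff in both directions. It is a generic illustration, not the evaluation code used in the study.

      import numpy as np
      from scipy.spatial.distance import directed_hausdorff

      def hausdorff_distance(points_a, points_b):
          # symmetric Hausdorff distance between two point clouds (N x 3, e.g. in mm)
          d_ab = directed_hausdorff(points_a, points_b)[0]
          d_ba = directed_hausdorff(points_b, points_a)[0]
          return max(d_ab, d_ba)

      # toy example with two small 3-D point sets
      a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
      b = np.array([[0.0, 0.0, 1.0], [1.0, 1.0, 0.0]])
      print(hausdorff_distance(a, b))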

  7. SU-F-J-113: Multi-Atlas Based Automatic Organ Segmentation for Lung Radiotherapy Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, J; Han, J; Ailawadi, S

    Purpose: Normal organ segmentation is a time-consuming and labor-intensive step in lung radiotherapy treatment planning. The aim of this study is to evaluate the performance of a multi-atlas based segmentation approach for automatic organs at risk (OAR) delineation. Methods: Fifteen lung stereotactic body radiation therapy patients were randomly selected. Planning CT images and OAR contours of the heart (HT), aorta (AO), vena cava (VC), pulmonary trunk (PT), and esophagus (ES) were exported and used as reference and atlas sets. For automatic organ delineation for a given target CT, 1) all atlas sets were deformably warped to the target CT, 2) the deformed sets were accumulated and normalized to produce organ probability density (OPD) maps, and 3) the OPD maps were converted to contours via image thresholding. The optimal threshold for each organ was empirically determined by comparing the auto-segmented contours against their respective reference contours. The delineated results were evaluated by measuring contour similarity metrics: DICE, mean distance (MD), and true detection rate (TD), where DICE = (2 × intersection volume/sum of the two volumes) and TD = {1.0 - (false positive + false negative)/2.0}. The Diffeomorphic Demons algorithm was employed for CT-CT deformable image registrations. Results: Optimal thresholds were determined to be 0.53 for HT, 0.38 for AO, 0.28 for PT, 0.43 for VC, and 0.31 for ES. The mean similarity metrics (DICE[%], MD[mm], TD[%]) were (88, 3.2, 89) for HT, (79, 3.2, 82) for AO, (75, 2.7, 77) for PT, (68, 3.4, 73) for VC, and (51, 2.7, 60) for ES. Conclusion: The investigated multi-atlas based approach produced reliable segmentations for the organs with large and relatively clear boundaries (HT and AO). However, the detection of small and narrow organs with diffuse boundaries (ES) was challenging. Sophisticated atlas selection and multi-atlas fusion algorithms may further improve the quality of segmentations.
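
    The accumulate/normalize/threshold fusion step described above can be sketched in a few lines; the organ threshold comes from the reported values, but the masks and the upstream deformable registration are placeholder assumptions rather than the authors' implementation.

      import numpy as np

      def fuse_atlas_masks(warped_masks, threshold):
          # warped_masks: (n_atlases, Z, Y, X) binary organ masks already
          # deformably warped to the target CT (registration not shown here)
          probability = np.asarray(warped_masks, dtype=float).mean(axis=0)
          return probability >= threshold  # binary auto-segmentation

      # toy usage: three fake 8x8x8 "warped" heart masks and the reported HT threshold
      rng = np.random.default_rng(0)
      masks = rng.random((3, 8, 8, 8)) > 0.5
      heart = fuse_atlas_masks(masks, threshold=0.53)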

  8. Regulation of Plant Microprocessor Function in Shaping microRNA Landscape.

    PubMed

    Dolata, Jakub; Taube, Michał; Bajczyk, Mateusz; Jarmolowski, Artur; Szweykowska-Kulinska, Zofia; Bielewicz, Dawid

    2018-01-01

    MicroRNAs are small molecules (∼21 nucleotides long) that are key regulators of gene expression. They originate from long stem-loop RNAs as a product of cleavage by a protein complex called Microprocessor. The core components of the plant Microprocessor are the RNase type III enzyme Dicer-Like 1 (DCL1), the zinc finger protein Serrate (SE), and the double-stranded RNA binding protein Hyponastic Leaves 1 (HYL1). Microprocessor assembly and its processing of microRNA precursors have been reported to occur in discrete nuclear bodies called Dicing bodies. The accessibility of and modifications to Microprocessor components affect microRNA levels and may have dramatic consequences in plant development. Currently, numerous lines of evidence indicate that plant Microprocessor activity is tightly regulated. The cellular localization of HYL1 is dependent on a specific KETCH1 importin, and the E3 ubiquitin ligase COP1 indirectly protects HYL1 from degradation in a light-dependent manner. Furthermore, proper localization of HYL1 in Dicing bodies is regulated by MOS2. On the other hand, the Dicing body localization of DCL1 is regulated by NOT2b, which also interacts with SE in the nucleus. Post-translational modifications are substantial factors that contribute to protein functional diversity and provide a fine-tuning system for the regulation of protein activity. The phosphorylation status of HYL1 is crucial for its activity/stability and is a result of the interplay between kinases (MPK3 and SnRK2) and phosphatases (CPL1 and PP4). Additionally, MPK3 and SnRK2 are known to phosphorylate SE. Several other proteins (e.g., TGH, CDF2, SIC, and RCF3) that interact with Microprocessor have been found to influence its RNA-binding and processing activities. In this minireview, recent findings on the various modes of Microprocessor activity regulation are discussed.

  9. Regulation of Plant Microprocessor Function in Shaping microRNA Landscape

    PubMed Central

    Dolata, Jakub; Taube, Michał; Bajczyk, Mateusz; Jarmolowski, Artur; Szweykowska-Kulinska, Zofia; Bielewicz, Dawid

    2018-01-01

    MicroRNAs are small molecules (∼21 nucleotides long) that are key regulators of gene expression. They originate from long stem–loop RNAs as a product of cleavage by a protein complex called Microprocessor. The core components of the plant Microprocessor are the RNase type III enzyme Dicer-Like 1 (DCL1), the zinc finger protein Serrate (SE), and the double-stranded RNA binding protein Hyponastic Leaves 1 (HYL1). Microprocessor assembly and its processing of microRNA precursors have been reported to occur in discrete nuclear bodies called Dicing bodies. The accessibility of and modifications to Microprocessor components affect microRNA levels and may have dramatic consequences in plant development. Currently, numerous lines of evidence indicate that plant Microprocessor activity is tightly regulated. The cellular localization of HYL1 is dependent on a specific KETCH1 importin, and the E3 ubiquitin ligase COP1 indirectly protects HYL1 from degradation in a light-dependent manner. Furthermore, proper localization of HYL1 in Dicing bodies is regulated by MOS2. On the other hand, the Dicing body localization of DCL1 is regulated by NOT2b, which also interacts with SE in the nucleus. Post-translational modifications are substantial factors that contribute to protein functional diversity and provide a fine-tuning system for the regulation of protein activity. The phosphorylation status of HYL1 is crucial for its activity/stability and is a result of the interplay between kinases (MPK3 and SnRK2) and phosphatases (CPL1 and PP4). Additionally, MPK3 and SnRK2 are known to phosphorylate SE. Several other proteins (e.g., TGH, CDF2, SIC, and RCF3) that interact with Microprocessor have been found to influence its RNA-binding and processing activities. In this minireview, recent findings on the various modes of Microprocessor activity regulation are discussed. PMID:29922322

  10. A Multiphase Validation of Atlas-Based Automatic and Semiautomatic Segmentation Strategies for Prostate MRI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, Spencer; Rodrigues, George, E-mail: george.rodrigues@lhsc.on.ca; Department of Epidemiology/Biostatistics, University of Western Ontario, London

    2013-01-01

    Purpose: To perform a rigorous technological assessment and statistical validation of a software technology for anatomic delineations of the prostate on MRI datasets. Methods and Materials: A 3-phase validation strategy was used. Phase I consisted of anatomic atlas building using 100 prostate cancer MRI data sets to provide training data sets for the segmentation algorithms. In phase II, 2 experts contoured 15 new MRI prostate cancer cases using 3 approaches (manual, N points, and region of interest). In phase III, 5 new physicians with variable MRI prostate contouring experience segmented the same 15 phase II datasets using 3 approaches: manual, N points with no editing, and full autosegmentation with user editing allowed. Statistical analyses for time and accuracy (using Dice similarity coefficient) endpoints used traditional descriptive statistics, analysis of variance, analysis of covariance, and pooled Student t test. Results: In phase I, average (SD) total and per slice contouring time for the 2 physicians was 228 (75), 17 (3.5), 209 (65), and 15 seconds (3.9), respectively. In phase II, statistically significant differences in physician contouring time were observed based on physician, type of contouring, and case sequence. The N points strategy resulted in superior segmentation accuracy when initial autosegmented contours were compared with final contours. In phase III, statistically significant differences in contouring time were observed based on physician, type of contouring, and case sequence again. The average relative time savings for N points and autosegmentation were 49% and 27%, respectively, compared with manual contouring. The N points and autosegmentation strategies resulted in average Dice values of 0.89 and 0.88, respectively. Pre- and post-edited autosegmented contours demonstrated a higher average Dice similarity coefficient of 0.94. Conclusion: The software provided robust contours with minimal editing required. Observed time savings were seen for all physicians irrespective of experience level and baseline manual contouring speed.

  11. Monitoring psychrotrophic lactic acid bacteria contamination in a ready-to-eat vegetable salad production environment.

    PubMed

    Pothakos, Vasileios; Snauwaert, Cindy; De Vos, Paul; Huys, Geert; Devlieghere, Frank

    2014-08-18

    A study monitoring lactic acid bacteria contamination was conducted in a company producing fresh, minimally processed, packaged and ready-to-eat (RTE) vegetable salads (stored at 4°C) in order to investigate the reason for high psychrotrophic LAB levels in the products at the end of shelf-life. Initially, high microbial counts exceeding the established psychrotrophic thresholds (>10^7-10^8 CFU/g) and spoilage manifestations before the end of the shelf-life (7 days) occurred in products containing an assortment of sliced and diced vegetables, but within a one year period these spoilage defects became prevalent in the entire processing plant. Environmental sampling and microbiological analyses of the raw materials and final products throughout the manufacturing process highlighted the presence of high numbers of Leuconostoc spp. in halved and unseeded, fresh sweet bell peppers provided by the supplier. A combination of two DNA fingerprinting techniques facilitated the assessment of the species diversity of LAB present in the processing environment along with the critical point of their introduction in the production facility. Probably through air mediation and surface adhesion, mainly members of the strictly psychrotrophic species Leuconostoc gelidum subsp. gasicomitatum and L. gelidum subsp. gelidum were responsible for the cross-contamination of every vegetable handled within the plant. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. A front-end wafer-level microsystem packaging technique with micro-cap array

    NASA Astrophysics Data System (ADS)

    Chiang, Yuh-Min

    2002-09-01

    The back-end packaging process is the remaining challenge for the micromachining industry to commercialize microsystem technology (MST) devices at low cost. This dissertation presents a novel wafer-level protection technique as a final step of the front-end fabrication process for MSTs. It facilitates improved manufacturing throughput and automation in package assembly, wafer-level testing of devices, and enhanced device performance. The method involves the use of a wafer-sized micro-cap array, which consists of an assortment of small caps micro-molded onto a material with adjustable shapes and sizes to serve as protective structures against the hostile environments during packaging. The micro-cap array is first constructed by a micromachining process with a micro-molding technique, then sealed to the device wafer at wafer level. An epoxy-based wafer-level micro-cap array was successfully fabricated and showed good compatibility with conventional back-end packaging processes. An adhesive transfer technique was demonstrated to seal the micro-cap array with a MEMS device wafer. No damage or gross leak was observed during wafer dicing or later during a gross leak test. Applications of the micro-cap array are demonstrated on MEMS microactuators fabricated using the CRONOS MUMPS process. Depending on the application needs, the micro-molded cap can be designed and modified to facilitate additional component functions, such as optical, electrical, mechanical, and chemical functions, which are not easily achieved in the device by traditional means. Successful fabrication of a micro-cap array comprising microlenses can provide active functions as well as passive protection. An optical tweezer array could be one possibility for applications of a micro-cap with microlenses. The micro-cap itself could also serve as a micro-well for DNA or bacteria amplification.

  13. Automated image analysis for quantification of reactive oxygen species in plant leaves.

    PubMed

    Sekulska-Nalewajko, Joanna; Gocławski, Jarosław; Chojak-Koźniewska, Joanna; Kuźniak, Elżbieta

    2016-10-15

    The paper presents an image processing method for the quantitative assessment of ROS accumulation areas in leaves stained with DAB or NBT for H₂O₂ and O₂⁻ detection, respectively. Three types of images determined by the combination of staining method and background color are considered. The method is based on the principle of supervised machine learning, with manually labeled image patterns used for training. The method's algorithm is developed as a JavaScript macro in the public domain Fiji (ImageJ) environment. It allows the stained regions of ROS-mediated histochemical reactions to be selected and subsequently fractionated according to weak, medium and intense staining intensity, and thus ROS accumulation. It also evaluates total leaf blade area. The precision of ROS accumulation area detection is validated against manual patterns using the Dice Similarity Coefficient. The proposed framework reduces the computational complexity, requires less image-processing expertise than competing methods once prepared, and represents a routine quantitative imaging assay for general histochemical image classification. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. An Introduction to Distributions Using Weighted Dice

    ERIC Educational Resources Information Center

    Holland, Bart K.

    2011-01-01

    Distributions are the basis for an enormous amount of theoretical and applied work in statistics. While there are formal definitions of distributions and many formulas to characterize them, it is important that students at first get a clear introduction to this basic concept. For many of them, neither words nor formulas can match the power of a…
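
    A minimal simulation of the weighted-die idea is sketched below: the face probabilities are made-up values (not taken from the article), and the empirical frequencies approach that distribution as the number of rolls grows.

      import numpy as np

      rng = np.random.default_rng(42)
      faces = np.arange(1, 7)
      weights = np.array([0.10, 0.10, 0.15, 0.15, 0.20, 0.30])  # hypothetical loading; sums to 1
      rolls = rng.choice(faces, size=10_000, p=weights)

      # empirical distribution of outcomes vs. the underlying weights
      empirical = np.bincount(rolls, minlength=7)[1:] / rolls.size
      print(dict(zip(faces.tolist(), empirical.round(3))))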

  15. Stimulating Mathematical Thinking through Domino Games

    ERIC Educational Resources Information Center

    Gough, John

    2015-01-01

    Most readers would be familiar with the standard domino set which is played with rectangular domino tiles. The domino set, sometimes called a deck or pack, consists of 28 dominoes, colloquially nicknamed bones, cards, tiles, stones, or spinners. A domino set is a generic gaming device, similar to playing cards or dice, in that a variety of games…

  16. Connecting the Dots and Nodes: A Survey of Skills Requested by Employers for Network Administrators

    ERIC Educational Resources Information Center

    Morris, Gerard; Fustos, Janos; Haga, Wayne

    2018-01-01

    One definition of a network administrator describes a person who works with computer infrastructures with an emphasis on networking. To determine the specific skills required of a network administrator by employers, data was collected from 698 nationwide job advertisements on Dice.com. The data collection focused on technical skills rather than…

  17. The Effects of Competitive vs. Cooperative Structures on Subsequent Productivity in Boys with Psychosocial Disorders.

    ERIC Educational Resources Information Center

    Nelson, David L.; Peterson, Cindee Q.

    1991-01-01

    A study compared three subject groups structured for competition to three subject groups structured for cooperation. Thirty-six 8- to 17-year-old males residing in a treatment center for nonpsychotic psychosocial disorders participated in competitive and cooperative dice games. Results did not support the hypothesis that a cooperative experience…

  18. Technology Tips: Simulation with the TI-Nspire

    ERIC Educational Resources Information Center

    Rudolph, Heidi J.

    2009-01-01

    Simulation is an important learning tool that allows students to grasp probability concepts, especially when the actual scenario does not need to be replicated entirely. In the cases of tossing coins and rolling dice, gathering the data before analyzing them can be laborious and might be a waste of precious class time--time that might be better…

  19. Methods for Restoring Shape and Structure of Compressed Dehydrated Animal and Combination Products

    DTIC Science & Technology

    1974-09-01

    controls. Meatballs showed slight deterioration while the other foods scored between the extremely affected products. [Report table-of-contents fragments: Basic Formulation of Seasoning Mix; Diced Chicken Evaluation; Meatball Formulation; Meatball Evaluation; Chicken and Rice Evaluation; Chicken and Rice ...]

  20. The Shape of Things to Come: The Computational Pictograph as a Bridge from Combinatorial Space to Outcome Distribution

    ERIC Educational Resources Information Center

    Abrahamson, Dor

    2006-01-01

    This snapshot introduces a computer-based representation and activity that enables students to simultaneously "see" the combinatorial space of a stochastic device (e.g., dice, spinner, coins) and its outcome distribution. The author argues that the "ambiguous" representation fosters student insight into probability. [Snapshots are subject to peer…
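
    The link between a combinatorial space and its outcome distribution can be illustrated by exhaustively enumerating two fair dice; the snippet below is a generic illustration and is unrelated to the computational pictograph software described in the article.

      from collections import Counter
      from fractions import Fraction
      from itertools import product

      # all 36 equally likely outcomes of two fair dice (the combinatorial space)
      space = list(product(range(1, 7), repeat=2))
      # induced distribution of the sum
      sums = Counter(a + b for a, b in space)
      distribution = {s: Fraction(n, len(space)) for s, n in sorted(sums.items())}
      print(distribution)  # e.g. a sum of 7 has probability 6/36, while 2 and 12 have 1/36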

  1. 21 CFR 155.190 - Canned tomatoes.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... in paragraph (a)(3) of this section and be prepared in one of the styles specified in paragraph (a)(4... that when the tomatoes are prepared in one of the styles specified in paragraphs (a)(4) (ii) to (iv) of... requirements of § 155.191. (4) Styles. (i) Whole. (ii) Diced. (iii) Sliced. (iv) Wedges. (5) Name of the food...

  2. Optimizing laser beam profiles using micro-lens arrays for efficient material processing: applications to solar cells

    NASA Astrophysics Data System (ADS)

    Hauschild, Dirk; Homburg, Oliver; Mitra, Thomas; Ivanenko, Mikhail; Jarczynski, Manfred; Meinschien, Jens; Bayer, Andreas; Lissotschenko, Vitalij

    2009-02-01

    High power laser sources are used in various production tools for microelectronic products and solar cells, including annealing, lithography, edge isolation, dicing and patterning. Besides the right choice of laser source, suitable high-performance optics for generating the appropriate beam profile and intensity distribution are of high importance for the right processing speed, quality and yield. Equally important for industrial applications is an adequate understanding of the physics of the light-matter interaction behind the process. Simulating the tool performance in advance can minimize technical and financial risk as well as lead times for prototyping and introduction into series production. LIMO has developed its own software founded on the Maxwell equations, taking into account all important physical aspects of the laser-based process: the light source, the beam shaping optical system and the light-matter interaction. Based on this knowledge, together with a unique free-form micro-lens array production technology and patented micro-optics beam shaping designs, a number of novel solar cell production tool sub-systems have been built. The basic functionalities, design principles and performance results are presented with a special emphasis on resilience, cost reduction and process reliability.

  3. Fabrication and mechanical characterization of long and different penetrating length neural microelectrode arrays

    NASA Astrophysics Data System (ADS)

    Goncalves, S. B.; Peixoto, A. C.; Silva, A. F.; Correia, J. H.

    2015-05-01

    This paper presents a detailed description of the design, fabrication and mechanical characterization of 3D microelectrode arrays (MEA) that comprise high aspect-ratio shafts and different penetrating lengths of electrodes (from 3 mm to 4 mm). The array’s design relies only on a bulk silicon substrate dicing saw technology. The encapsulation process is accomplished by a medical epoxy resin and platinum is used as the transduction layer between the probe and neural tissue. The probe’s mechanical behaviour can significantly affect the neural tissue during implantation time. Thus, we measured the MEA maximum insertion force in an agar gel phantom and a porcine cadaver brain. Successful 3D MEA were produced with shafts of 3 mm, 3.5 mm and 4 mm in length. At a speed of 180 mm min-1, the MEA show maximum penetrating forces per electrode of 2.65 mN and 12.5 mN for agar and brain tissue, respectively. A simple and reproducible fabrication method was demonstrated, capable of producing longer penetrating shafts than previously reported arrays using the same fabrication technology. Furthermore, shafts with sharp tips were achieved in the fabrication process simply by using a V-shaped blade.

  4. PREFACE: DICE 2008 - From Quantum Mechanics through Complexity to Spacetime: the role of emergent dynamical structures

    NASA Astrophysics Data System (ADS)

    Diósi, Lajos; Elze, Hans-Thomas; Fronzoni, Leone; Halliwell, Jonathan; Vitiello, Giuseppe

    2009-07-01

    These proceedings present the Invited Lectures and Contributed Papers of the Fourth International Workshop on Decoherence, Information, Complexity and Entropy - DICE 2008, held at Castello Pasquini, Castiglioncello (Tuscany), 22-26 September 2008. We deliver these proceedings as a means to document to the interested public, to the wider scientific community, and to the participants themselves the stimulating exchange of ideas at this conference. The steadily growing number of participants, among them acclaimed scientists in their respective fields, show its increasing attraction and a fruitful concept, based on bringing leading researchers together and in contact with a mix of advanced students and scholars. Thus, this series of meetings successfully continued from the beginning with DICE 2002, (Decoherence and Entropy in Complex Systems ed H-T Elze Lecture Notes in Physics 633 (Berlin: Springer, 2004)) followed by DICE 2004 (Proceedings of the Second International Workshop on Decoherence, Information, Complexity and Entropy - DICE 2004 ed H-T Elze Braz. Journ. Phys. 35, 2A & 2B (2005) pp 205-529 free access at: www.sbfisica.org.br/bjp) and by DICE 2006, (Proceedings of the Third International Workshop on Decoherence, Information, Complexity and Entropy - DICE 2006 eds H-T Elze, L Diósi and G Vitiello Journal of Physics: Conference Series 67 (2007); free access at: http://www.iop.org/EJ/toc/1742-6596/67/1) uniting about one hundred participants from more than twenty different countries worldwide this time. It has been a great honour and inspiration for all of us to have Professor Sir Roger Penrose from the Mathematical Institute at the University of Oxford with us, who presented the lecture ``Black holes, quantum theory and cosmology'' (included in this volume). Discussions under the wider theme ``From Quantum Mechanics through Complexity to Spacetime: the role of emergent dynamical structures'' took place in the very pleasant and inspiring atmosphere of Castello Pasquini, which - with its beautiful surroundings, overlooking a piece of Tuscany's coast, and with splendid weather throughout - was conducive to the success of the meeting. The 5-day program was grouped according to the following topics: Quantum Physics and Some Important Questions it Raises Emergent Dynamics, from Quantum to Brain and Beyond Exploring Quantum Mechanics Atomistic Theories of Spacetime Quantum-Entanglement/Gravity/Cosmology A Public Roundtable Discussion formed an integral part of the program under the theme ``Dialoghi sulla complessita' - dall' atomo all' Universo'' and with the participation of physicists and philosophers: F T Arecchi (Firenze), L Fronzoni (Pisa), A M Iacono (Pisa), F Luccio (Pisa) and G Vitiello (Salerno, coordinator). This event drew a large audience, who participated in the lively discussions until late in the evening. The workshop has been organized by L Diósi (Budapest), H-T Elze (Pisa, chair), L Fronzoni (Pisa), J Halliwell (London) and G Vitiello (Salerno), with great help from our conference secretaries M Pesce-Rollins (Siena) and L Baldini (Pisa) and from our students F Caravelli and E Di Nardo, both from Pisa. 
Several institutions and sponsors generously supported the workshop and their representatives and, in particular, the citizens of Rosignano/Castiglioncello are deeply thanked for the help and kind hospitality: Comune di Rosignano A Nenci (Sindaco di Rosignano), S Scarpellini (Segreteria sindaco), D Del Seppia (Assessore allo Sviluppo Economico del Comune di Rosignano), A Franchi (Assessore al turismo del Comune di Rosignano/Presidente dell' associazione Armunia), A Corsini (Ufficio economato del Comune di Rosignano). REA Rosignano Energia Ambiente s.p.a. F Ghelardini (Presidente della REA), A Cecchini (Ufficio - Responsabile stampa della REA). Solvay Chimica Italia s.a. Dott S Piccoli (Responsabile Relazioni Esterne, Solvay Rosignano), G Becherucci (Comunicazione e Relazioni Esterne). Associazione Armunia M Paganelli (Direttore), G Mannari (Programmazione). Special thanks go to G Mannari for her advice and great help in all the many practical matters that had to be dealt with, in order to run the meeting at Castello Pasquini smoothly. Funds made available by Universitá di Pisa, by Domus Galilaeana (Pisa), Centro Interdisciplinare per lo Studio dei Sistemi Complessi - CISSC (Pisa), Dipartimento di Matematica e Informatica (Universitá di Salerno), Istituto Italiano per gli Studi Filosofici - IISF (Napoli), and by IOP Publishing (Bristol) are gratefully acknowledged. Last but not least, special thanks go to L Pesce (Vitrium Galleria, Populonia) for her artwork (``Art and Science'') displayed during the conference at Castello Pasquini. The research papers presented at the workshop, often incorporating further developments since then, or presenting original new work, have been edited by L Diósi, H-T Elze, L Fronzoni, J J Halliwell and G Vitiello, with major assistance from J Yearsley (London), which we gratefully acknowledge. They are collected here, essentially following the program of the workshop, however, divided into Invited Lectures (we regret that lectures by E Arimondo, N Gisin, and W Schleich could not be reproduced here) and Contributed Papers, respectively. In the name of all participants, we would like to thank Dr J Schwarz and G Douglas (IOP Publishing, Bristol), and their collaborators, for friendly advice, always immediate help during the editing process, and for their efforts making the Journal of Physics: Conference Series available to all. Budapest, Pisa, London and Salerno, May 2009 Lajos Diósi, Hans-Thomas Elze, Leone Fronzoni, Jonathan Halliwell and Giuseppe Vitiello

  5. 77 FR 25736 - Notice of Intent To Repatriate Cultural Items: Northwest Museum of Arts & Culture, Spokane, WA...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-01

    ... unassociated funerary objects are 7 beaver tooth dice, 1 bone awl pendant, 27 dentalia beads, 4 copper pendants, 1 copper bracelet, 1 projectile point, 1 bone awl, 2 scrapers, and 1 hammerstone. In the Federal... pendant, 5 scrapers, 2 bone awls, 1 piece of matting, 1 flake, 2 dentalia necklace fragments, 1 small box...

  6. States Roll Dice on New Funding: Gambling Linked to School Aid in Fresh Wave of Ballot Measures

    ERIC Educational Resources Information Center

    McNeil, Michele

    2008-01-01

    This article reports that amid tight budgets and shrinking revenue, states are wagering that voters in next month's elections will agree to expand state-sanctioned gambling in exchange for increased school aid. Initiatives on six state ballots Nov. 4 involve gambling revenue intended to raise money for everything from community college funding and…

  7. Pre-Service Mathematics Teachers' Use of Probability Models in Making Informal Inferences about a Chance Game

    ERIC Educational Resources Information Center

    Kazak, Sibel; Pratt, Dave

    2017-01-01

    This study considers probability models as tools for both making informal statistical inferences and building stronger conceptual connections between data and chance topics in teaching statistics. In this paper, we aim to explore pre-service mathematics teachers' use of probability models for a chance game, where the sum of two dice matters in…

  8. "Como Se Dice HIV?" Adapting Human Immunodeficiency Virus Prevention Messages to Reach Homosexual and Bisexual Hispanic Men: The Importance of Hispanic Cultural and Health Beliefs.

    ERIC Educational Resources Information Center

    Bowdy, Matthew A.

    HIV/AIDS prevention messages catered to Anglo homosexual/bisexual men are not effective in teaching preventative behaviors to Hispanic homosexual/bisexual men. Hispanic sociocultural traits associated with homosexuality and bisexuality prevent the effectiveness of these messages. The Hispanic family is also extremely important in influencing…

  9. Assessing the Problem Formulation in an Integrated Assessment Model: Implications for Climate Policy Decision-Support

    NASA Astrophysics Data System (ADS)

    Garner, G. G.; Reed, P. M.; Keller, K.

    2014-12-01

    Integrated assessment models (IAMs) are often used with the intent to aid in climate change decisionmaking. Numerous studies have analyzed the effects of parametric and/or structural uncertainties in IAMs, but uncertainties regarding the problem formulation are often overlooked. Here we use the Dynamic Integrated model of Climate and the Economy (DICE) to analyze the effects of uncertainty surrounding the problem formulation. The standard DICE model adopts a single objective to maximize a weighted sum of utilities of per-capita consumption. Decisionmakers, however, may be concerned with a broader range of values and preferences that are not captured by this a priori definition of utility. We reformulate the problem by introducing three additional objectives that represent values such as (i) reliably limiting global average warming to two degrees Celsius and minimizing both (ii) the costs of abatement and (iii) the damages due to climate change. We derive a set of Pareto-optimal solutions over which decisionmakers can trade-off and assess performance criteria a posteriori. We illustrate the potential for myopia in the traditional problem formulation and discuss the capability of this multiobjective formulation to provide decision support.
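
    As a sketch of the a posteriori trade-off idea, the code below filters a set of candidate policy evaluations down to the non-dominated (Pareto-optimal) set; the four objective columns and the random scores are placeholders, and this is not the DICE model, its reformulation, or the optimizer used in the study.

      import numpy as np

      def pareto_front(objectives):
          # objectives: (n_solutions x n_objectives), every column oriented so
          # that smaller is better (e.g. negated utility, abatement cost,
          # climate damages, probability of exceeding 2 degrees C)
          objectives = np.asarray(objectives, dtype=float)
          keep = np.ones(len(objectives), dtype=bool)
          for i in range(len(objectives)):
              dominated_by = np.all(objectives <= objectives[i], axis=1) & \
                             np.any(objectives < objectives[i], axis=1)
              if dominated_by.any():
                  keep[i] = False
          return keep

      # toy usage: 200 random candidate abatement policies scored on 4 objectives
      scores = np.random.default_rng(0).random((200, 4))
      nondominated = scores[pareto_front(scores)]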

  10. Hidden Markov random field model and Broyden-Fletcher-Goldfarb-Shanno algorithm for brain image segmentation

    NASA Astrophysics Data System (ADS)

    Guerrout, EL-Hachemi; Ait-Aoudia, Samy; Michelucci, Dominique; Mahiou, Ramdane

    2018-05-01

    Many routine medical examinations produce images of patients suffering from various pathologies. With the huge number of medical images, manual analysis and interpretation has become a tedious task. Thus, automatic image segmentation has become essential for diagnosis assistance. Segmentation consists of dividing the image into homogeneous and significant regions. We focus on hidden Markov random fields, referred to as HMRF, to model the segmentation problem. This modelling leads to a classical function minimisation problem. The Broyden-Fletcher-Goldfarb-Shanno algorithm, referred to as BFGS, is one of the most powerful methods for solving unconstrained optimisation problems. In this paper, we investigate the combination of the HMRF model and the BFGS algorithm to perform the segmentation operation. The proposed method shows very good segmentation results compared with well-known approaches. The tests are conducted on brain magnetic resonance image databases (BrainWeb and IBSR) widely used to objectively compare the results obtained. The well-known Dice coefficient (DC) was used as the similarity metric. The experimental results show that, in many cases, our proposed method approaches the perfect segmentation with a Dice coefficient above 0.9. Moreover, it generally outperforms other methods in the tests conducted.
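
    To show the general shape of such an energy-minimisation formulation, the toy sketch below segments a small image by minimising a data-fidelity plus neighbour-smoothness energy over a continuous field with SciPy's BFGS, then thresholds the result. The energy, image and threshold are simplified assumptions for illustration and do not reproduce the HMRF energy or parameters of the cited paper.

      import numpy as np
      from scipy.optimize import minimize

      def segment_by_energy_minimization(image, beta=2.0, threshold=0.5):
          y = np.asarray(image, dtype=float)
          shape = y.shape

          def energy(x_flat):
              x = x_flat.reshape(shape)
              data = np.sum((x - y) ** 2)                 # stay close to the intensities
              smooth = np.sum(np.diff(x, axis=0) ** 2) + \
                       np.sum(np.diff(x, axis=1) ** 2)    # penalise rough label fields
              return data + beta * smooth

          # numerical gradients are fine at this toy size; real images would
          # need an analytic gradient and a limited-memory variant (L-BFGS)
          result = minimize(energy, y.ravel().copy(), method="BFGS")
          return result.x.reshape(shape) > threshold

      # toy usage: noisy 16x16 image containing a brighter square
      rng = np.random.default_rng(0)
      img = rng.normal(0.2, 0.1, size=(16, 16))
      img[4:12, 4:12] += 0.6
      mask = segment_by_energy_minimization(img)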

  11. Altered Reward Reactivity as a Behavioural Endophenotype in Eating Disorders: A Pilot Investigation in Twins.

    PubMed

    Kanakam, Natalie; Krug, Isabel; Collier, David; Treasure, Janet

    2017-05-01

    Altered reward reactivity is a potential risk endophenotype for eating disorders (EDs). The aim of this study was to examine reward reactivity in female twins with EDs and compare it with a twin control group. A sample of 112 twins [n = 51 met lifetime DSM-IV ED criteria (anorexia nervosa n = 26; bulimic disorders n = 24), n = 19 unaffected cotwins and n = 42 control twins] was administered measures assessing reward reactivity, including the Game of Dice Task, the Behavioural Inhibition/Activation (BIS/BAS) Scales and the Appetitive Motivation Scale (AMS). Within-pair correlations for monozygotic and dizygotic twins were calculated, and generalised estimating equations compared probands with non-ED cotwins and controls. The BAS and the AMS were reduced in EDs and negatively associated with restrictive symptoms. In addition, monozygotic twin pairs demonstrated significant within-pair similarity for the BAS and AMS. Conversely, there was less evidence to support the BIS or risky decision-making as measured by the Game of Dice Task as an endophenotype in EDs. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.

  12. Local durian (Durio zibethinus murr.) exploration for potentially superior tree as parents in Ngrambe District, Ngawi

    NASA Astrophysics Data System (ADS)

    Yuniastuti, E.; Anggita, A.; Nandariyah; Sukaya

    2018-03-01

    The characteristics of durian vary with the specific area, giving a wide diversity of phenotypes. The objective of this research was to build an inventory of the local durian of Ngrambe and to identify potentially superior local durians as prospective parent trees. The research was conducted in Ngrambe sub-district from October 2015 until April 2016 using an explorative descriptive method. Sample points were determined by the non-probability method of the snowball sampling type. Primary data included the morphological characters of the plants, trunks, leaves, flowers, fruits and seeds, and their superior traits. The data were analyzed using the SIMQUAL (Similarity for Qualitative) function based on the DICE coefficient in NTSYS v.2.02. Cluster and dendrogram analyses were performed with the Unweighted Pair Group Method with Arithmetic Mean (UPGMA). The DICE coefficients of the 58 local durian accessions based on the phenotypic characters of the vegetative organs ranged from 0.84 to 1.0. The coefficients based on the phenotypic characters of the vegetative and generative organs of the 3 potentially superior local durian accessions ranged from 0.7 to 0.8. In conclusion, the local durian accessions Miyem and Rusmiyati have advantages and potential as prospective parent trees.
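
    The DICE-coefficient similarity plus UPGMA clustering used here can be reproduced generically with SciPy on binary presence/absence character data; the random matrix below is a toy placeholder, not the 58 Ngrambe accessions.

      import numpy as np
      from scipy.cluster.hierarchy import linkage
      from scipy.spatial.distance import pdist, squareform

      # toy binary matrix: 6 accessions scored on 20 presence/absence characters
      rng = np.random.default_rng(0)
      characters = rng.integers(0, 2, size=(6, 20)).astype(bool)

      dice_dissimilarity = pdist(characters, metric="dice")   # 1 - Dice coefficient, condensed form
      dice_similarity = 1.0 - squareform(dice_dissimilarity)  # pairwise similarity matrix
      tree = linkage(dice_dissimilarity, method="average")    # UPGMA dendrogram structure
      print(dice_similarity.round(2))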

  13. Segmentation of thalamus from MR images via task-driven dictionary learning

    NASA Astrophysics Data System (ADS)

    Liu, Luoluo; Glaister, Jeffrey; Sun, Xiaoxia; Carass, Aaron; Tran, Trac D.; Prince, Jerry L.

    2016-03-01

    Automatic thalamus segmentation is useful to track changes in thalamic volume over time. In this work, we introduce a task-driven dictionary learning framework to find the optimal dictionary given a set of eleven features obtained from T1-weighted MRI and diffusion tensor imaging. In this dictionary learning framework, a linear classifier is designed concurrently to classify voxels as belonging to the thalamus or non-thalamus class. Morphological post-processing is applied to produce the final thalamus segmentation. Due to the uneven size of the training data samples for the non-thalamus and thalamus classes, a non-uniform sampling scheme is proposed to train the classifier to better discriminate between the two classes around the boundary of the thalamus. Experiments are conducted on data collected from 22 subjects with manually delineated ground truth. The experimental results are promising in terms of improvements in the Dice coefficient of the thalamus segmentation over state-of-the-art atlas-based thalamus segmentation algorithms.

  14. Segmentation of Thalamus from MR images via Task-Driven Dictionary Learning.

    PubMed

    Liu, Luoluo; Glaister, Jeffrey; Sun, Xiaoxia; Carass, Aaron; Tran, Trac D; Prince, Jerry L

    2016-02-27

    Automatic thalamus segmentation is useful to track changes in thalamic volume over time. In this work, we introduce a task-driven dictionary learning framework to find the optimal dictionary given a set of eleven features obtained from T1-weighted MRI and diffusion tensor imaging. In this dictionary learning framework, a linear classifier is designed concurrently to classify voxels as belonging to the thalamus or non-thalamus class. Morphological post-processing is applied to produce the final thalamus segmentation. Due to the uneven size of the training data samples for the non-thalamus and thalamus classes, a non-uniform sampling scheme is proposed to train the classifier to better discriminate between the two classes around the boundary of the thalamus. Experiments are conducted on data collected from 22 subjects with manually delineated ground truth. The experimental results are promising in terms of improvements in the Dice coefficient of the thalamus segmentation over state-of-the-art atlas-based thalamus segmentation algorithms.

  15. Tunable MOEMS Fabry-Perot interferometer for miniaturized spectral sensing in near-infrared

    NASA Astrophysics Data System (ADS)

    Rissanen, A.; Mannila, R.; Tuohiniemi, M.; Akujärvi, A.; Antila, J.

    2014-03-01

    This paper presents a novel MOEMS Fabry-Perot interferometer (FPI) process platform for the range of 800 - 1050 nm. Simulation results, including the design and optimization of device properties in terms of transmission peak width, tuning range and electrical properties, are discussed. The process flow for device fabrication is presented, with overall process integration and back-end dicing steps resulting in successful fabrication yield. The mirrors of the FPI consist of LPCVD (low-pressure chemical vapor deposition) polySi-SiN λ/4 thin-film Bragg reflectors, with the air gap formed by sacrificial SiO2 etching in HF vapor. The silicon substrate below the optical aperture is removed by inductively coupled plasma (ICP) etching to ensure transmission in the visible to near-infrared (NIR), which is below the silicon transmission range. The characterized optical properties of the chips are compared to the simulated values. The achieved optical aperture diameter enables utilization of the chips in both imaging and single-point spectral sensors.

  16. Design and fabrication of PZN-7%PT single crystal high frequency angled needle ultrasound transducers.

    PubMed

    Zhou, Qifa; Wu, Dawei; Jin, Jing; Hu, Chang-hong; Xu, Xiaochen; Williams, Jay; Cannata, Jonathan M; Lim, Leongchew; Shung, K Kirk

    2008-01-01

    A high-frequency angled needle ultrasound transducer with an aperture size of 0.4 × 0.56 mm² was fabricated using a lead zinc niobate-lead titanate (PZN-7%PT) single crystal as the active piezoelectric material. The single crystal was bonded to a conductive silver particle matching layer and a conductive epoxy backing material through direct contact curing. A parylene outer matching layer was formed by vapor deposition. The angled needle probe configuration was achieved by dicing at 45° to the single crystal poling direction to satisfy a clinical request for blood flow measurement in the posterior portion of the eye. The electrical impedance magnitude and phase of the transducer were 42 Ω and −63°, respectively. The measured center frequency and the fractional bandwidth at −6 dB were 43 MHz and 45%, respectively. The two-way insertion loss was approximately 17 dB. Wire phantom imaging using the fabricated PZN-7%PT single crystal transducers was obtained and spatial resolutions were assessed.

  17. A novel approach for establishing benchmark CBCT/CT deformable image registrations in prostate cancer radiotherapy

    NASA Astrophysics Data System (ADS)

    Kim, Jinkoo; Kumar, Sanath; Liu, Chang; Zhong, Hualiang; Pradhan, Deepak; Shah, Mira; Cattaneo, Richard; Yechieli, Raphael; Robbins, Jared R.; Elshaikh, Mohamed A.; Chetty, Indrin J.

    2013-11-01

    Deformable image registration (DIR) is an integral component for adaptive radiation therapy. However, accurate registration between daily cone-beam computed tomography (CBCT) and treatment planning CT is challenging, due to significant daily variations in rectal and bladder fillings as well as the increased noise levels in CBCT images. Another significant challenge is the lack of ‘ground-truth’ registrations in the clinical setting, which is necessary for quantitative evaluation of various registration algorithms. The aim of this study is to establish benchmark registrations of clinical patient data. Three pairs of CT/CBCT datasets were chosen for this institutional review board approved retrospective study. On each image, in order to reduce the contouring uncertainty, ten independent sets of organs were manually delineated by five physicians. The mean contour set for each image was derived from the ten contours. A set of distinctive points (round natural calcifications and three implanted prostate fiducial markers) were also manually identified. The mean contours and point features were then incorporated as constraints into a B-spline based DIR algorithm. Further, a rigidity penalty was imposed on the femurs and pelvic bones to preserve their rigidity. A piecewise-rigid registration approach was adapted to account for the differences in femur pose and the sliding motion between bones. For each registration, the magnitude of the spatial Jacobian (|JAC|) was calculated to quantify the tissue compression and expansion. Deformation grids and finite-element-model-based unbalanced energy maps were also reviewed visually to evaluate the physical soundness of the resultant deformations. Organ DICE indices (indicating the degree of overlap between registered organs) and residual misalignments of the fiducial landmarks were quantified. Manual organ delineation on CBCT images varied significantly among physicians with overall mean DICE index of only 0.7 among redundant contours. Seminal vesicle contours were found to have the lowest correlation amongst physicians (DICE = 0.5). After DIR, the organ surfaces between CBCT and planning CT were in good alignment with mean DICE indices of 0.9 for prostate, rectum, and bladder, and 0.8 for seminal vesicles. The Jacobian magnitudes |JAC| in the prostate, rectum, and seminal vesicles were in the range of 0.4-1.5, indicating mild compression/expansion. The bladder volume differences were larger between CBCT and CT images with mean |JAC| values of 2.2, 0.7, and 1.0 for three respective patients. Bone deformation was negligible (|JAC| = ˜ 1.0). The difference between corresponding landmark points between CBCT and CT was less than 1.0 mm after DIR. We have presented a novel method of establishing benchmark DIR accuracy between CT and CBCT images in the pelvic region. The method incorporates manually delineated organ surfaces and landmark points as well as pixel similarity in the optimization, while ensuring bone rigidity and avoiding excessive deformation in soft tissue organs. Redundant contouring is necessary to reduce the overall registration uncertainty.
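
    The |JAC| maps quoted above come from the determinant of the spatial Jacobian of the deformation x → x + u(x); a generic NumPy sketch of that computation on a displacement field is given below (values near 1 indicate volume preservation, <1 compression, >1 expansion). It assumes a dense displacement field is already available and is not the registration code used in the study.

      import numpy as np

      def jacobian_determinant(displacement, spacing=(1.0, 1.0, 1.0)):
          # displacement: (3, Z, Y, X) field u(x); returns det(I + grad u) per voxel
          grads = np.empty((3, 3) + displacement.shape[1:])
          for i in range(3):
              gz, gy, gx = np.gradient(displacement[i], *spacing)
              grads[i] = np.stack([gz, gy, gx])          # du_i/dz, du_i/dy, du_i/dx
          jac = np.moveaxis(grads, (0, 1), (-2, -1))     # (..., 3, 3) per-voxel matrices
          jac = jac + np.eye(3)                          # d(x + u)/dx
          return np.linalg.det(jac)

      # toy usage: a small, mild random displacement field (in voxel units)
      u = 0.05 * np.random.default_rng(0).standard_normal((3, 20, 20, 20))
      print(jacobian_determinant(u).mean().round(3))     # close to 1 for mild deformations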

  18. A novel approach for establishing benchmark CBCT/CT deformable image registrations in prostate cancer radiotherapy

    PubMed Central

    Kim, Jinkoo; Kumar, Sanath; Liu, Chang; Zhong, Hualiang; Pradhan, Deepak; Shah, Mira; Cattaneo, Richard; Yechieli, Raphael; Robbins, Jared R.; Elshaikh, Mohamed A.; Chetty, Indrin J.

    2014-01-01

    Purpose Deformable image registration (DIR) is an integral component for adaptive radiation therapy. However, accurate registration between daily cone-beam computed tomography (CBCT) and treatment planning CT is challenging, due to significant daily variations in rectal and bladder fillings as well as the increased noise levels in CBCT images. Another significant challenge is the lack of “ground-truth” registrations in the clinical setting, which is necessary for quantitative evaluation of various registration algorithms. The aim of this study is to establish benchmark registrations of clinical patient data. Materials/Methods Three pairs of CT/CBCT datasets were chosen for this IRB-approved retrospective study. On each image, in order to reduce the contouring uncertainty, ten independent sets of organs were manually delineated by five physicians. The mean contour set for each image was derived from the ten contours. A set of distinctive points (round natural calcifications and 3 implanted prostate fiducial markers) were also manually identified. The mean contours and point features were then incorporated as constraints into a B-spline based DIR algorithm. Further, a rigidity penalty was imposed on the femurs and pelvic bones to preserve their rigidity. A piecewise-rigid registration approach was adapted to account for the differences in femur pose and the sliding motion between bones. For each registration, the magnitude of the spatial Jacobian (|JAC|) was calculated to quantify the tissue compression and expansion. Deformation grids and finite-element-model-based unbalanced energy maps were also reviewed visually to evaluate the physical soundness of the resultant deformations. Organ DICE indices (indicating the degree of overlap between registered organs) and residual misalignments of the fiducial landmarks were quantified. Results Manual organ delineation on CBCT images varied significantly among physicians with overall mean DICE index of only 0.7 among redundant contours. Seminal vesicle contours were found to have the lowest correlation amongst physicians (DICE=0.5). After DIR, the organ surfaces between CBCT and planning CT were in good alignment with mean DICE indices of 0.9 for prostate, rectum, and bladder, and 0.8 for seminal vesicles. The Jacobian magnitudes |JAC| in the prostate, rectum, and seminal vesicles were in the range of 0.4–1.5, indicating mild compression/expansion. The bladder volume differences were larger between CBCT and CT images with mean |JAC| values of 2.2, 0.7, and 1.0 for three respective patients. Bone deformation was negligible (|JAC|=~1.0). The difference between corresponding landmark points between CBCT and CT was less than 1.0 mm after DIR. Conclusions We have presented a novel method of establishing benchmark deformable image registration accuracy between CT and CBCT images in the pelvic region. The method incorporates manually delineated organ surfaces and landmark points as well as pixel similarity in the optimization, while ensuring bone rigidity and avoiding excessive deformation in soft tissue organs. Redundant contouring is necessary to reduce the overall registration uncertainty. PMID:24171908

  19. AIRCRAFT SHELTER-DICE THROW Data Report

    DTIC Science & Technology

    1977-03-01

    damping fluid viscosity is temperature dependent, a number of thermistors were installed at velocity transducer locations. Accurate calibration of these... thermistors enabled the temperatures at the velocity gage locations to be determined through measurement of the thermistor resistances. These...stationary (reference) targets. As shown in Figures C-3 and C-5, targets were fabricated from steel pipe and welded to imbedded steel plates in the

  20. New data tool provides wealth of clinical, financial benchmarks by census region.

    PubMed

    1998-08-01

    Data Library: Compare your departmental expenses, administrative expense ratio, length of stay, and other clinical-financial data to benchmarks for your census region. A new CD-ROM product that provides access to four years of Medicare Cost Report data for every reporting hospital in the nation allows users to slice and dice the data by more than 200 different performance measures.

  1. Visualizing the Transition State: A Hands-On Approach to the Arrhenius Equation

    ERIC Educational Resources Information Center

    Kuntzleman, Thomas S.; Swanson, Matthew S.; Sayers, Deborah K.

    2007-01-01

    An exercise is presented in which the kinetics of the irreversible "reaction" of pennies in the heads-up state to pennies in the tails-up state is simulated by a hands-on, Monte Carlo approach. In addition, the exercise incorporates a second simulation in which the irreversible "reaction" of dice with a red face uppermost to a blue face uppermost…
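
    A minimal Monte Carlo version of the pennies/dice "reaction" is sketched below: in each round, every remaining heads-up coin is flipped and "reacts" with a fixed probability, producing first-order-like exponential decay. The counts and probability are illustrative assumptions, not the classroom protocol from the article.

      import random

      def simulate(n_coins=400, p_react=0.5, rounds=12, seed=1):
          random.seed(seed)
          remaining = n_coins
          history = [remaining]
          for _ in range(rounds):
              # each remaining coin independently reacts with probability p_react
              remaining = sum(1 for _ in range(remaining) if random.random() >= p_react)
              history.append(remaining)
          return history

      print(simulate())  # counts fall roughly as n_coins * (1 - p_react) ** round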

  2. An Online Game Approach for Improving Students' Learning Performance in Web-Based Problem-Solving Activities

    ERIC Educational Resources Information Center

    Hwang, Gwo-Jen; Wu, Po-Han; Chen, Chi-Chang

    2012-01-01

    In this paper, an online game was developed in the form of a competitive board game for conducting web-based problem-solving activities. The participants of the game determined their move by throwing a dice. Each location of the game board corresponds to a gaming task, which could be a web-based information-searching question or a mini-game; the…

  3. On the Performance of Carbon Nanotubes in Extreme Conditions and in the Presence of Microwaves

    DTIC Science & Technology

    2013-01-01

    been considered for use as transparent conductors include: transparent conducting oxides (TCOs), intrinsically conducting polymers (ICPs), graphene ...optical transmission properties, but are extremely sensitive to environmental conditions (such as temperature and humidity). Graphene has recently...during the dicing procedure, silver paint was applied to the sample to serve as improvised contact/probe-landing points. Figure 1 shows the CNT thin

  4. Automatic liver segmentation in computed tomography using general-purpose shape modeling methods.

    PubMed

    Spinczyk, Dominik; Krasoń, Agata

    2018-05-29

    Liver segmentation in computed tomography is required in many clinical applications. The segmentation methods used can be classified according to a number of criteria. One important criterion for method selection is the shape representation of the segmented organ. The aim of the work is automatic liver segmentation using general-purpose shape modeling methods. As part of the research, methods based on shape information at various levels of advancement were used. The single-atlas segmentation method was used as the simplest shape-based method; it derives the segmentation from a single atlas using deformable free-form deformation of control-point curves. Subsequently, the classic and modified Active Shape Model (ASM) was used, using medium body shape models. As the most advanced and main method, generalized statistical shape models (Gaussian Process Morphable Models) were used, which are based on multi-dimensional Gaussian distributions of the shape deformation field. Mutual information and the sum of squared distances were used as similarity measures. The poorest results were obtained for the single-atlas method. For the ASM method, in 10 analyzed cases the Dice coefficient was above 55% for seven test images, of which three had a coefficient over 70%, which placed the method in second place. The best results were obtained for the method based on the generalized statistical distribution of the deformation field, with a Dice coefficient of 88.5%. Conclusions: This Dice coefficient of 88.5% can be explained by the use of general-purpose shape modeling methods with a large variance of the shape of the modeled object (the liver) and by the size of our training data set, which was limited to 10 cases. The results obtained with the presented fully automatic method are comparable with dedicated methods for liver segmentation. In addition, the deformation features of the model can be modeled mathematically using various kernel functions, which allows the liver to be segmented at a comparable level using a smaller training set.

  5. Segmentation of multiple heart cavities in 3-D transesophageal ultrasound images.

    PubMed

    Haak, Alexander; Vegas-Sánchez-Ferrero, Gonzalo; Mulder, Harriët W; Ren, Ben; Kirişli, Hortense A; Metz, Coert; van Burken, Gerard; van Stralen, Marijn; Pluim, Josien P W; van der Steen, Antonius F W; van Walsum, Theo; Bosch, Johannes G

    2015-06-01

    Three-dimensional transesophageal echocardiography (TEE) is an excellent modality for real-time visualization of the heart and monitoring of interventions. To improve the usability of 3-D TEE for intervention monitoring and catheter guidance, automated segmentation is desired. However, 3-D TEE segmentation is still a challenging task due to the complex anatomy with multiple cavities, the limited TEE field of view, and typical ultrasound artifacts. We propose to segment all cavities within the TEE view with a multi-cavity active shape model (ASM) in conjunction with a tissue/blood classification based on a gamma mixture model (GMM). 3-D TEE image data of twenty patients were acquired with a Philips X7-2t matrix TEE probe. Tissue probability maps were estimated by a two-class (blood/tissue) GMM. A statistical shape model containing the left ventricle, right ventricle, left atrium, right atrium, and aorta was derived from computed tomography angiography (CTA) segmentations by principal component analysis. ASMs of the whole heart and individual cavities were generated and consecutively fitted to tissue probability maps. First, an average whole-heart model was aligned with the 3-D TEE based on three manually indicated anatomical landmarks. Second, pose and shape of the whole-heart ASM were fitted by a weighted update scheme excluding parts outside of the image sector. Third, pose and shape of ASM for individual heart cavities were initialized by the previous whole heart ASM and updated in a regularized manner to fit the tissue probability maps. The ASM segmentations were validated against manual outlines by two observers and CTA derived segmentations. Dice coefficients and point-to-surface distances were used to determine segmentation accuracy. ASM segmentations were successful in 19 of 20 cases. The median Dice coefficient for all successful segmentations versus the average observer ranged from 90% to 71% compared with an inter-observer range of 95% to 84%. The agreement against the CTA segmentations was slightly lower with a median Dice coefficient between 85% and 57%. In this work, we successfully showed the accuracy and robustness of the proposed multi-cavity segmentation scheme. This is a promising development for intraoperative procedure guidance, e.g., in cardiac electrophysiology.

  6. Automated Segmentation of Kidneys from MR Images in Patients with Autosomal Dominant Polycystic Kidney Disease

    PubMed Central

    Kim, Youngwoo; Ge, Yinghui; Tao, Cheng; Zhu, Jianbing; Chapman, Arlene B.; Torres, Vicente E.; Yu, Alan S.L.; Mrug, Michal; Bennett, William M.; Flessner, Michael F.; Landsittel, Doug P.

    2016-01-01

    Background and objectives Our study developed a fully automated method for segmentation and volumetric measurements of kidneys from magnetic resonance images in patients with autosomal dominant polycystic kidney disease and assessed the performance of the automated method with the reference manual segmentation method. Design, setting, participants, & measurements Study patients were selected from the Consortium for Radiologic Imaging Studies of Polycystic Kidney Disease. At the enrollment of the Consortium for Radiologic Imaging Studies of Polycystic Kidney Disease Study in 2000, patients with autosomal dominant polycystic kidney disease were between 15 and 46 years of age with relatively preserved GFRs. Our fully automated segmentation method was on the basis of a spatial prior probability map of the location of kidneys in abdominal magnetic resonance images and regional mapping with total variation regularization and propagated shape constraints that were formulated into a level set framework. T2-weighted magnetic resonance image sets of 120 kidneys were selected from 60 patients with autosomal dominant polycystic kidney disease and divided into the training and test datasets. The performance of the automated method in reference to the manual method was assessed by means of two metrics: Dice similarity coefficient and intraclass correlation coefficient of segmented kidney volume. The training and test sets were swapped for cross-validation and reanalyzed. Results Successful segmentation of kidneys was performed with the automated method in all test patients. The segmented kidney volumes ranged from 177.2 to 2634 ml (mean, 885.4±569.7 ml). The mean Dice similarity coefficient ±SD between the automated and manual methods was 0.88±0.08. The mean correlation coefficient between the two segmentation methods for the segmented volume measurements was 0.97 (P<0.001 for each cross-validation set). The results from the cross-validation sets were highly comparable. Conclusions We have developed a fully automated method for segmentation of kidneys from abdominal magnetic resonance images in patients with autosomal dominant polycystic kidney disease with varying kidney volumes. The performance of the automated method was in good agreement with that of the manual method. PMID:26797708

  7. DICER-ARGONAUTE2 Complex in Continuous Fluorogenic Assays of RNA Interference Enzymes

    PubMed Central

    Bernard, Mark A.; Wang, Leyu; Tachado, Souvenir D.

    2015-01-01

    Mechanistic studies of RNA processing in the RNA-Induced Silencing Complex (RISC) have been hindered by lack of methods for continuous monitoring of enzymatic activity. “Quencherless” fluorogenic substrates of RNAi enzymes enable continuous monitoring of enzymatic reactions for detailed kinetics studies. Recombinant RISC enzymes cleave the fluorogenic substrates targeting human thymidylate synthase (TYMS) and hypoxia-inducible factor 1-α subunit (HIF1A). Using fluorogenic dsRNA DICER substrates and fluorogenic siRNA, DICER+ARGONAUTE2 mixtures exhibit synergistic enzymatic activity relative to either enzyme alone, and addition of TRBP does not enhance the apparent activity. Titration of AGO2 and DICER in enzyme assays suggests that AGO2 and DICER form a functional high-affinity complex in equimolar ratio. DICER and DICER+AGO2 exhibit Michaelis-Menten kinetics with DICER substrates. However, AGO2 cannot process the fluorogenic siRNA without DICER enzyme, suggesting that AGO2 cannot self-load siRNA into its active site. The DICER+AGO2 combination processes the fluorogenic siRNA substrate (Km = 74 nM) with substrate inhibition kinetics (Ki = 105 nM), demonstrating experimentally that siRNA binds two different sites that affect Dicing and AGO2-loading reactions in RISC. This result suggests that siRNA (product of DICER) bound in the active site of DICER may undergo direct transfer (as AGO2 substrate) to the active site of AGO2 in the DICER+AGO2 complex. Competitive substrate assays indicate that DICER+AGO2 cleavage of fluorogenic siRNA is specific, since unlabeled siRNA and DICER substrates serve as competing substrates that cause a concentration-dependent decrease in fluorescent rates. Competitive substrate assays of a series of DICER substrates in vitro were correlated with cell-based assays of HIF1A mRNA knockdown (log-log slope=0.29), suggesting that improved DICER substrate designs with 10-fold greater processing by the DICER+AGO2 complex can provide a strong (~2800-fold) improvement in potency for mRNA knockdown. This study lays the foundation of a systematic biochemical approach to optimize nucleic acid-based therapeutics for Dicing and ARGONAUTE2-loading for improving efficacy. PMID:25793518
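
    The reported Km = 74 nM and Ki = 105 nM fit a standard substrate-inhibition form of the Michaelis-Menten rate law. The sketch below evaluates that textbook expression with the constants from the abstract; Vmax is an arbitrary placeholder, since no value is reported, and the functional form is the common uncompetitive-inhibition variant rather than anything specified by the authors.

        def substrate_inhibition_rate(s, vmax=1.0, km=74.0, ki=105.0):
            """Michaelis-Menten rate with substrate inhibition:
            v = Vmax*[S] / (Km + [S]*(1 + [S]/Ki)), concentrations in nM."""
            return vmax * s / (km + s * (1.0 + s / ki))

        # the rate peaks near sqrt(Km*Ki) ~ 88 nM, then falls as [S] self-inhibits
        for s in (10, 50, 88, 200, 500):
            print(f"[siRNA] = {s:3d} nM -> v/Vmax = {substrate_inhibition_rate(s):.3f}")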

  8. An automatic brain tumor segmentation tool.

    PubMed

    Diaz, Idanis; Boulanger, Pierre; Greiner, Russell; Hoehn, Bret; Rowe, Lindsay; Murtha, Albert

    2013-01-01

    This paper introduces an automatic brain tumor segmentation method (ABTS) for segmenting multiple components of brain tumor using four magnetic resonance image modalities. ABTS's four stages involve automatic histogram multi-thresholding and morphological operations including geodesic dilation. Our empirical results, on 16 real tumors, show that ABTS works very effectively, achieving a Dice accuracy compared to expert segmentation of 81% in segmenting edema and 85% in segmenting gross tumor volume (GTV).

  9. Automatic liver segmentation on Computed Tomography using random walkers for treatment planning

    PubMed Central

    Moghbel, Mehrdad; Mashohor, Syamsiah; Mahmud, Rozi; Saripan, M. Iqbal Bin

    2016-01-01

    Segmentation of the liver from Computed Tomography (CT) volumes plays an important role during the choice of treatment strategies for liver diseases. Despite lots of attention, liver segmentation remains a challenging task due to the lack of visible edges on most boundaries of the liver, coupled with high variability of both intensity patterns and anatomical appearances, with all these difficulties becoming more prominent in pathological livers. To achieve a more accurate segmentation, a random walker based framework is proposed that can segment contrast-enhanced liver CT images with great accuracy and speed. Based on the location of the right lung lobe, the liver dome is automatically detected, thus eliminating the need for manual initialization. The computational requirements are further minimized by utilizing rib cage area segmentation; the liver is then extracted by utilizing the random walker method. The proposed method was able to achieve one of the highest accuracies reported in the literature against a mixed healthy and pathological liver dataset compared to other segmentation methods, with an overlap error of 4.47 % and a Dice similarity coefficient of 0.94, while it showed exceptional accuracy in segmenting the pathological livers, with an overlap error of 5.95 % and a Dice similarity coefficient of 0.91. PMID:28096782

  10. Hippocampal unified multi-atlas network (HUMAN): protocol and scale validation of a novel segmentation tool.

    PubMed

    Amoroso, N; Errico, R; Bruno, S; Chincarini, A; Garuccio, E; Sensi, F; Tangaro, S; Tateo, A; Bellotti, R

    2015-11-21

    In this study we present a novel fully automated Hippocampal Unified Multi-Atlas-Networks (HUMAN) algorithm for the segmentation of the hippocampus in structural magnetic resonance imaging. In multi-atlas approaches atlas selection is of crucial importance for the accuracy of the segmentation. Here we present an optimized method based on the definition of a small peri-hippocampal region to target the atlas learning with linear and non-linear embedded manifolds. All atlases were co-registered to a data driven template resulting in a computationally efficient method that requires only one test registration. The optimal atlases identified were used to train dedicated artificial neural networks whose labels were then propagated and fused to obtain the final segmentation. To quantify data heterogeneity and protocol inherent effects, HUMAN was tested on two independent data sets provided by the Alzheimer's Disease Neuroimaging Initiative and the Open Access Series of Imaging Studies. HUMAN is accurate and achieves state-of-the-art performance (Dice_ADNI = 0.929 ± 0.003 and Dice_OASIS = 0.869 ± 0.002). It is also a robust method that remains stable when applied to the whole hippocampus or to sub-regions (patches). HUMAN also compares favorably with a basic multi-atlas approach and a benchmark segmentation tool such as FreeSurfer.

  11. Neuropsychological correlates of decision making in patients with bulimia nervosa.

    PubMed

    Brand, Matthias; Franke-Sievert, Christiane; Jacoby, Georg E; Markowitsch, Hans J; Tuschen-Caffier, Brunna

    2007-11-01

    In addition to the core psychopathology of bulimia nervosa (BN), patients with BN often show impulsive behavior that has been related to decision making deficits in other patient groups, such as individuals with anorexia nervosa and pathological gamblers. However, it remains unclear whether BN patients also show difficulties in decision making. In this study, 14 patients with BN and 14 healthy comparison subjects, matched for age, gender, education, body mass index, and intelligence, were examined with the Game of Dice Task (M. Brand, E. Fujiwara, et al., 2005), a gambling task that has fixed winning probabilities and explicit rules for gains and losses, as well as with a neuropsychological test battery and personality questionnaires. On the task, the patients with BN chose the disadvantageous alternatives more frequently than did the comparison subjects. Performance on the Game of Dice Task was related to executive functioning but not to other neuropsychological functions, personality, or disease-specific variables in the BN group. Thus, in patients with BN, decision making abnormalities and executive reductions can be demonstrated and might be neuropsychological correlates of the patients' dysfunctional everyday-life decision making behavior. Neurocognitive functions should be considered in the treatment of BN. PsycINFO Database Record (c) 2007 APA, all rights reserved.

  12. Derived Transformation of Children's Pregambling Game Playing

    PubMed Central

    Dymond, Simon; Bateman, Helena; Dixon, Mark R

    2010-01-01

    Contemporary behavior-analytic perspectives on gambling emphasize the impact of verbal relations, or derived relational responding and the transformation of stimulus functions, on the initiation and maintenance of gambling. Approached in this way, it is possible to undertake experimental analysis of the role of verbal/mediational variables in gambling behavior. The present study therefore sought to demonstrate the ways new stimuli could come to have functions relevant to gambling without those functions being trained directly. Following a successful derived-equivalence-relations test, a simulated board game established high- and low-roll functions for two concurrently presented dice labelled with members of the derived relations. During the test for derived transformation, children were reexposed to the board game with dice labelled with indirectly related stimuli. All participants except 1 who passed the equivalence relations test selected the die that was indirectly related to the trained high-roll die more often than the die that was indirectly related to low-roll die, despite the absence of differential outcomes. All participants except 3 also gave the derived high-roll die higher liking ratings than the derived low-roll die. The implications of the findings for behavior-analytic research on gambling and the development of verbally-based interventions for disordered gambling are discussed. PMID:21541176

  13. Derived transformation of children's pregambling game playing.

    PubMed

    Dymond, Simon; Bateman, Helena; Dixon, Mark R

    2010-11-01

    Contemporary behavior-analytic perspectives on gambling emphasize the impact of verbal relations, or derived relational responding and the transformation of stimulus functions, on the initiation and maintenance of gambling. Approached in this way, it is possible to undertake experimental analysis of the role of verbal/mediational variables in gambling behavior. The present study therefore sought to demonstrate the ways new stimuli could come to have functions relevant to gambling without those functions being trained directly. Following a successful derived-equivalence-relations test, a simulated board game established high- and low-roll functions for two concurrently presented dice labelled with members of the derived relations. During the test for derived transformation, children were reexposed to the board game with dice labelled with indirectly related stimuli. All participants except 1 who passed the equivalence relations test selected the die that was indirectly related to the trained high-roll die more often than the die that was indirectly related to low-roll die, despite the absence of differential outcomes. All participants except 3 also gave the derived high-roll die higher liking ratings than the derived low-roll die. The implications of the findings for behavior-analytic research on gambling and the development of verbally-based interventions for disordered gambling are discussed.

  14. Crosstalk Reduction for High-Frequency Linear-Array Ultrasound Transducers Using 1–3 Piezocomposites With Pseudo-Random Pillars

    PubMed Central

    Yang, Hao-Chung; Cannata, Jonathan; Williams, Jay; Shung, K. Kirk

    2013-01-01

    The goal of this research was to develop a novel diced 1–3 piezocomposite geometry to reduce pulse–echo ring down and acoustic crosstalk between high-frequency ultrasonic array elements. Two PZT-5H-based 1–3 composites (10 and 15 MHz) of different pillar geometries [square (SQ), 45° triangle (TR), and pseudo-random (PR)] were fabricated and then made into single-element ultrasound transducers. The measured pulse–echo waveforms and their envelopes indicate that the PR composites had the shortest −20-dB pulse length and highest sensitivity among the composites evaluated. Using these composites, 15-MHz array subapertures with a 0.95λ pitch were fabricated to assess the acoustic crosstalk between array elements. The combined electrical and acoustical crosstalk between the nearest array elements of the PR array sub-apertures (−31.8 dB at 15 MHz) was 6.5 and 2.2 dB lower than those of the SQ and the TR array subapertures, respectively. These results demonstrate that the 1–3 piezocomposite with the pseudo-random pillars may be a better choice for fabricating enhanced high-frequency linear-array ultrasound transducers; especially when mechanical dicing is used. PMID:23143580

  15. Hippocampal unified multi-atlas network (HUMAN): protocol and scale validation of a novel segmentation tool

    NASA Astrophysics Data System (ADS)

    Amoroso, N.; Errico, R.; Bruno, S.; Chincarini, A.; Garuccio, E.; Sensi, F.; Tangaro, S.; Tateo, A.; Bellotti, R.; the Alzheimer's Disease Neuroimaging Initiative

    2015-11-01

    In this study we present a novel fully automated Hippocampal Unified Multi-Atlas-Networks (HUMAN) algorithm for the segmentation of the hippocampus in structural magnetic resonance imaging. In multi-atlas approaches atlas selection is of crucial importance for the accuracy of the segmentation. Here we present an optimized method based on the definition of a small peri-hippocampal region to target the atlas learning with linear and non-linear embedded manifolds. All atlases were co-registered to a data driven template resulting in a computationally efficient method that requires only one test registration. The optimal atlases identified were used to train dedicated artificial neural networks whose labels were then propagated and fused to obtain the final segmentation. To quantify data heterogeneity and protocol inherent effects, HUMAN was tested on two independent data sets provided by the Alzheimer’s Disease Neuroimaging Initiative and the Open Access Series of Imaging Studies. HUMAN is accurate and achieves state-of-the-art performance (Dice_ADNI = 0.929 ± 0.003 and Dice_OASIS = 0.869 ± 0.002). It is also a robust method that remains stable when applied to the whole hippocampus or to sub-regions (patches). HUMAN also compares favorably with a basic multi-atlas approach and a benchmark segmentation tool such as FreeSurfer.

  16. Segmentation of Hyperacute Cerebral Infarcts Based on Sparse Representation of Diffusion Weighted Imaging.

    PubMed

    Zhang, Xiaodong; Jing, Shasha; Gao, Peiyi; Xue, Jing; Su, Lu; Li, Weiping; Ren, Lijie; Hu, Qingmao

    2016-01-01

    Segmentation of infarcts at hyperacute stage is challenging as they exhibit substantial variability which may even be hard for experts to delineate manually. In this paper, a sparse representation based classification method is explored. For each patient, four volumetric data items including three volumes of diffusion weighted imaging and a computed asymmetry map are employed to extract patch features which are then fed to dictionary learning and classification based on sparse representation. Elastic net is adopted to replace the traditional L0-norm/L1-norm constraints on sparse representation to stabilize the sparse code. To decrease computation cost and to reduce false positives, regions-of-interest are determined to confine candidate infarct voxels. The proposed method has been validated on 98 consecutive patients recruited within 6 hours from onset. It is shown that the proposed method could handle infarcts with intensity variability and ill-defined edges well, yielding a significantly higher Dice coefficient (0.755 ± 0.118) than the other two methods and their enhanced versions obtained by confining their segmentations within the regions-of-interest (average Dice coefficient less than 0.610). The proposed method could provide a potential tool to quantify infarcts from diffusion weighted imaging at the hyperacute stage with accuracy and speed to assist decision making, especially for thrombolytic therapy.
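
    The elastic-net constraint mentioned above combines L1 and L2 penalties to stabilize the sparse code. As a hedged illustration, and not the authors' pipeline, scikit-learn's ElasticNet can compute such a code for a test patch against a dictionary; the random dictionary and patch below are stand-ins for learned atoms and real imaging features.

        import numpy as np
        from sklearn.linear_model import ElasticNet

        rng = np.random.default_rng(0)
        D = rng.standard_normal((64, 200))            # dictionary: 200 atoms, 64-dim features
        D /= np.linalg.norm(D, axis=0)                # unit-norm atoms
        x = D[:, 3] + 0.05 * rng.standard_normal(64)  # a test patch close to atom 3

        # elastic net in place of the L0-norm/L1-norm constraint
        coder = ElasticNet(alpha=0.01, l1_ratio=0.7, fit_intercept=False, max_iter=5000)
        coder.fit(D, x)
        code = coder.coef_

        print("non-zero coefficients:", np.count_nonzero(code))
        print("largest-weight atom:", int(np.argmax(np.abs(code))))  # most weight should land on atom 3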

  17. Moisture variations in brine-salted pasta filata cheese.

    PubMed

    Kindstedt, P S

    2001-01-01

    A study was made of the moisture distribution in brine-salted pasta filata cheese. Brine-salted cheeses usually develop reasonably smooth and predictable gradients of decreasing moisture from center to surface, resulting from outward diffusion of moisture in response to inward diffusion of salt. However, patterns of moisture variation within brine-salted pasta filata cheeses, notably pizza cheese, are more variable and less predictable because of the peculiar conditions that occur when warm cheese is immersed in cold brine. In this study, cold brining resulted in less moisture loss from the cheese surface to the brine. Also it created substantial temperature gradients within the cheese, which persisted after brining and influenced the movement of moisture within the cheese independently of that caused by the inward diffusion of salt. Depending on brining conditions and age, pizza cheese may contain decreasing, increasing, or irregular gradients of moisture from center to surface, which may vary considerably at different locations within a single block. This complicates efforts to obtain representative samples for moisture and composition testing. Dicing the entire block into small (e.g., 1.5 cm) cubes and collecting a composite sample after thorough mixing may serve as a practical sampling approach for manufacturers and users of pizza cheese that have ready access to dicing equipment.

  18. Probabilistic modeling of the fate of Listeria monocytogenes in diced bacon during the manufacturing process.

    PubMed

    Billoir, Elise; Denis, Jean-Baptiste; Cammeau, Natalie; Cornu, Marie; Zuliani, Veronique

    2011-02-01

    To assess the impact of the manufacturing process on the fate of Listeria monocytogenes, we built a generic probabilistic model intended to simulate the successive steps in the process. Contamination evolution was modeled in the appropriate units (breasts, dice, and then packaging units through the successive steps in the process). To calibrate the model, parameter values were estimated from industrial data, from the literature, and based on expert opinion. By means of simulations, the model was explored using a baseline calibration and alternative scenarios, in order to assess the impact of changes in the process and of accidental events. The results are reported as contamination distributions and as the probability that the product will be acceptable with regards to the European regulatory safety criterion. Our results are consistent with data provided by industrial partners and highlight that tumbling is a key step for the distribution of the contamination at the end of the process. Process chain models could provide an important added value for risk assessment models that basically consider only the outputs of the process in their risk mitigation strategies. Moreover, a model calibrated to correspond to a specific plant could be used to optimize surveillance. © 2010 Society for Risk Analysis.
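
    A process-chain model of this kind can be prototyped as a Monte Carlo simulation that carries a log10 concentration through the successive manufacturing steps, each adding growth or reduction with some batch-to-batch spread. The sketch below is only a generic illustration: the step means and standard deviations are made up, not the calibrated parameters from the paper.

        import numpy as np

        rng = np.random.default_rng(1)

        # hypothetical mean log10 change (CFU/g) and spread for each process step
        steps = {
            "brining":   (+0.2, 0.10),
            "tumbling":  (+0.6, 0.30),   # identified in the paper as a key step
            "dicing":    (+0.1, 0.10),
            "packaging": (-0.1, 0.05),
        }

        def simulate_batches(n_batches=10_000, initial_log10=-1.0):
            """Propagate an initial L. monocytogenes level (log10 CFU/g) through
            the process steps; returns final levels for each simulated batch."""
            conc = np.full(n_batches, initial_log10)
            for mean, sd in steps.values():
                conc += rng.normal(mean, sd, size=n_batches)
            return conc

        final = simulate_batches()
        # the EU criterion for ready-to-eat foods is 100 CFU/g, i.e. 2 log10 CFU/g
        print("P(batch exceeds 2 log10 CFU/g) =", np.mean(final > 2.0))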

  19. Agent-based Model for the Coupled Human-Climate System

    NASA Astrophysics Data System (ADS)

    Zvoleff, A.; Werner, B.

    2006-12-01

    Integrated assessment models have been used to predict the outcome of coupled economic growth, resource use, greenhouse gas emissions and climate change, both for scientific and policy purposes. These models generally have employed significant simplifications that suppress nonlinearities and the possibility of multiple equilibria in both their economic (DeCanio, 2005) and climate (Schneider and Kuntz-Duriseti, 2002) components. As one step toward exploring general features of the nonlinear dynamics of the coupled system, we have developed a series of variations on the well-studied RICE and DICE models, which employ different forms of agent-based market dynamics and "climate surprises." Markets are introduced by replacing the production function of the DICE/RICE models with an agent-based market modeling the interactions of producer, policymaker, and consumer agents. Technological change and population growth are treated endogenously. Climate surprises are representations of positive (for example, ice sheet collapse) or negative (for example, increased aerosols from desertification) feedbacks that are turned on with a probability depending on warming. Initial results point toward the possibility of large-amplitude instabilities in the coupled human-climate system owing to the mismatch between short-outlook market dynamics and long-term climate responses. Implications for predictability of future climate will be discussed. Supported by the Andrew W. Mellon Foundation and the UC Academic Senate.
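
    For readers unfamiliar with the DICE/RICE family, the climate side of these models feeds back on the economy through a simple damage function, typically a quadratic fraction of gross output. The sketch below shows only that textbook form; it is not the agent-based market replacement described in the abstract, and the coefficient is a commonly quoted illustrative value rather than one taken from this work.

        def dice_damage_fraction(delta_t, a2=0.00236):
            """Quadratic DICE-style damage function: fraction of gross world
            output lost at a warming of delta_t degrees C above preindustrial."""
            return a2 * delta_t ** 2

        def net_output(gross_output, delta_t):
            """Output net of climate damages, Y = (1 - D(T)) * Y_gross."""
            return (1.0 - dice_damage_fraction(delta_t)) * gross_output

        for t in (1.0, 2.0, 3.0, 4.0):
            print(f"warming {t:.1f} C -> damages {100 * dice_damage_fraction(t):.2f}% of output")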

  20. Large-Aperture Wide-Bandwidth Anti-Reflection-Coated Silicon Lenses for Millimeter Wavelengths

    NASA Technical Reports Server (NTRS)

    Datta, R.; Munson, C. D.; Niemack, M. D.; McMahon, J. J.; Britton, J.; Wollack, E. J.; Beall, J.; Devlin, M. J.; Fowler, J.; Gallardo, P.; hide

    2013-01-01

    The increasing scale of cryogenic detector arrays for sub-millimeter and millimeter wavelength astrophysics has led to the need for large aperture, high index of refraction, low loss, cryogenic refracting optics. Silicon with n = 3.4, low loss, and relatively high thermal conductivity is a nearly optimal material for these purposes, but requires an antireflection (AR) coating with broad bandwidth, low loss, low reflectance, and a matched coefficient of thermal expansion. We present an AR coating for curved silicon optics comprised of subwavelength features cut into the lens surface with a custom three-axis silicon dicing saw. These features constitute a metamaterial that behaves as a simple dielectric coating. We have fabricated and coated silicon lenses as large as 33.4 cm in diameter with coatings optimized for use between 125-165 GHz. Our design reduces average reflections to a few tenths of a percent for angles of incidence up to 30 deg. with low cross-polarization. We describe the design, tolerance, manufacture, and measurements of these coatings and present measurements of the optical properties of silicon at millimeter wavelengths at cryogenic and room temperatures. This coating and lens fabrication approach is applicable from centimeter to sub-millimeter wavelengths and can be used to fabricate coatings with greater than octave bandwidth.

  1. Large-aperture Wide-bandwidth Antireflection-coated Silicon Lenses for Millimeter Wavelengths

    NASA Technical Reports Server (NTRS)

    Datta, R.; Munson, C. D.; Niemack, M. D.; McMahon, J. J.; Britton, J.; Wollack, Edward J.; Beall, J.; Devlin, M. J.; Fowler, J.; Gallardo, P.; hide

    2013-01-01

    The increasing scale of cryogenic detector arrays for submillimeter and millimeter wavelength astrophysics has led to the need for large aperture, high index of refraction, low loss, cryogenic refracting optics. Silicon with n = 3.4, low loss, and high thermal conductivity is a nearly optimal material for these purposes but requires an antireflection (AR) coating with broad bandwidth, low loss, low reflectance, and a matched coefficient of thermal expansion. We present an AR coating for curved silicon optics comprised of subwavelength features cut into the lens surface with a custom three-axis silicon dicing saw. These features constitute a metamaterial that behaves as a simple dielectric coating. We have fabricated silicon lenses as large as 33.4 cm in diameter with micromachined layers optimized for use between 125 and 165 GHz. Our design reduces average reflections to a few tenths of a percent for angles of incidence up to 30 deg with low cross polarization. We describe the design, tolerance, manufacture, and measurements of these coatings and present measurements of the optical properties of silicon at millimeter wavelengths at cryogenic and room temperatures. This coating and lens fabrication approach is applicable from centimeter to submillimeter wavelengths and can be used to fabricate coatings with greater than octave bandwidth.
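
    As a point of reference for the numbers above, the classical single-layer antireflection condition for silicon (n = 3.4) calls for a coating index near sqrt(3.4), about 1.84, and a quarter-wave thickness at the band center; the micromachined features described in these two records realize such an effective dielectric coating by cutting the silicon itself rather than by depositing a matched material. The sketch below evaluates only the textbook single-layer case at normal incidence and is not the authors' design code.

        import numpy as np

        def single_layer_reflectance(freq_ghz, n_sub=3.4, n_coat=np.sqrt(3.4), f0_ghz=145.0):
            """Normal-incidence power reflectance of a lossless quarter-wave layer
            (index n_coat, tuned to f0_ghz) on a thick substrate of index n_sub."""
            c = 299.792458                            # speed of light in mm*GHz
            thickness = c / f0_ghz / (4.0 * n_coat)   # quarter wave at band center
            delta = 2.0 * np.pi * n_coat * thickness * freq_ghz / c
            # characteristic-matrix result for one layer between vacuum and substrate
            m11, m12 = np.cos(delta), 1j * np.sin(delta) / n_coat
            m21, m22 = 1j * n_coat * np.sin(delta), np.cos(delta)
            bb = m11 + m12 * n_sub
            cc = m21 + m22 * n_sub
            return np.abs((bb - cc) / (bb + cc)) ** 2

        for f in (125.0, 145.0, 165.0):
            print(f"{f:.0f} GHz: R = {100 * single_layer_reflectance(f):.2f}%")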

  2. Proceedings of the DICE THROW Symposium 21-23 June 1977. Volume 1

    DTIC Science & Technology

    1977-07-01

    different scaled ANFO events to insure yield scalability. Phase 1 of the program consisted of a series of one-pound events to examine cratering and...characterization of a 500-ton-equivalent event. A large number of agencies were involved in different facets of the development program. Probably most...charge geometry observed in the 1000-pound series, supported the observations from the Phase 1 program. Differences were observed in the fireball

  3. Air & Space Power Journal. Volume 26, Number 1, January-February 2012

    DTIC Science & Technology

    2012-02-01

    Support the Combatant Commander, Develop the Force, or Roll the Dice? What the Air Force’s Deployment Tasking Process Doesn’t Do ... presents a once-in-a-generation opportunity for the Air Force to capitalize on new technology and processes that can fundamentally alter the way we do ... process of turning challenges into opportunities. ASPJ is charged with providing a forum in which professional Airmen can make significant contributions

  4. Design, Fabrication, and Packaging of Mach-Zehnder Interferometers for Biological Sensing Applications

    NASA Astrophysics Data System (ADS)

    Novak, Joseph

    Optical biological sensors are widely used in the fields of medical testing, water treatment and safety, gene identification, and many others due to advances in nanofabrication technology. This work focuses on the design of fiber-coupled Mach-Zehnder Interferometer (MZI) based biosensors fabricated on a silicon-on-insulator (SOI) wafer. Silicon waveguide sensors are designed with multimode and single-mode dimensions. Input coupling efficiency is investigated through the design of various taper structures. Integration processing and packaging are performed for fiber attachment and enhancement of input coupling efficiency. Optical guided-wave sensors rely on single-mode operation to extract an induced phase shift from the output signal. A silicon waveguide MZI sensor was designed and fabricated for both multimode and single-mode dimensions. Sensitivity of the sensors is analyzed for waveguide dimensions and materials. An s-bend structure is designed for the multimode waveguide to eliminate higher-order mode power as an alternative to single-mode confinement. Single-mode confinement is experimentally demonstrated through near-field imaging of the waveguide output. Y-junctions are designed for 3 dB power splitting to the MZI arms and for power recombination after sensing to utilize the interferometric function of the MZI. Ultra-short 10 µm taper structures with curved geometries are designed to improve insertion loss from fiber to chip without significantly increasing device area and show potential for applications requiring misalignment tolerance. A novel v-groove process is developed for self-aligned integration of fiber grooves for attachment to sensor chips. Thermal oxidation at temperatures from 1050-1150°C during groove processing creates an SiO2 layer on the waveguide end facet to protect the waveguide facet during integration etch processing without additional e-beam lithography processing. Experimental results show improvement of insertion loss with the thermal oxidation process compared to dicing preparation and Focused Ion Beam methods.

  5. Genetic Diversity of Clinical and Environmental Strains of Salmonella enterica Serotype Weltevreden Isolated in Malaysia

    PubMed Central

    Thong, K. L.; Goh, Y. L.; Radu, S.; Noorzaleha, S.; Yasin, R.; Koh, Y. T.; Lim, V. K. E.; Rusul, G.; Puthucheary, S. D.

    2002-01-01

    The incidence of food-borne salmonellosis due to Salmonella enterica serotype Weltevreden is reported to be on the increase in Malaysia. The pulsed-field gel electrophoresis (PFGE) subtyping method was used to assess the extent of genetic diversity and clonality of Salmonella serotype Weltevreden strains from humans and the environment. PFGE of XbaI-digested chromosomal DNA from 95 strains of Salmonella serotype Weltevreden gave 39 distinct profiles with a wide range of Dice coefficients (0.27 to 1.00), indicating that PFGE is very discriminative and that multiple clones of Salmonella serotype Weltevreden exist among clinical and environmental isolates. Strains of one dominant pulsotype (pulsotype X1/X2) appeared to be endemic in this region, as they were consistently recovered from humans with salmonellosis between 1996 and 2001 and from raw vegetables. In addition, the sharing of similar PFGE profiles among isolates from humans, vegetables, and beef provides indirect evidence of the possible transmission of salmonellosis from contaminated raw vegetables and meat to humans. Furthermore, the recurrence of PFGE profile X21 among isolates found in samples of vegetables from one wet market indicated the persistence of this clone. The environment in the wet markets may represent a major source of cross-contamination of vegetables with Salmonella serotype Weltevreden. Antibiotic sensitivity tests showed that the clinical isolates of Salmonella serotype Weltevreden remained drug sensitive but that the vegetable isolates were resistant to at least two antibiotics. To the best of our knowledge, this is the first study to compare clinical and environmental isolates of Salmonella serotype Weltevreden in Malaysia. PMID:12089269

  6. Genetic diversity of clinical and environmental strains of Salmonella enterica serotype Weltevreden isolated in Malaysia.

    PubMed

    Thong, K L; Goh, Y L; Radu, S; Noorzaleha, S; Yasin, R; Koh, Y T; Lim, V K E; Rusul, G; Puthucheary, S D

    2002-07-01

    The incidence of food-borne salmonellosis due to Salmonella enterica serotype Weltevreden is reported to be on the increase in Malaysia. The pulsed-field gel electrophoresis (PFGE) subtyping method was used to assess the extent of genetic diversity and clonality of Salmonella serotype Weltevreden strains from humans and the environment. PFGE of XbaI-digested chromosomal DNA from 95 strains of Salmonella serotype Weltevreden gave 39 distinct profiles with a wide range of Dice coefficients (0.27 to 1.00), indicating that PFGE is very discriminative and that multiple clones of Salmonella serotype Weltevreden exist among clinical and environmental isolates. Strains of one dominant pulsotype (pulsotype X1/X2) appeared to be endemic in this region, as they were consistently recovered from humans with salmonellosis between 1996 and 2001 and from raw vegetables. In addition, the sharing of similar PFGE profiles among isolates from humans, vegetables, and beef provides indirect evidence of the possible transmission of salmonellosis from contaminated raw vegetables and meat to humans. Furthermore, the recurrence of PFGE profile X21 among isolates found in samples of vegetables from one wet market indicated the persistence of this clone. The environment in the wet markets may represent a major source of cross-contamination of vegetables with Salmonella serotype Weltevreden. Antibiotic sensitivity tests showed that the clinical isolates of Salmonella serotype Weltevreden remained drug sensitive but that the vegetable isolates were resistant to at least two antibiotics. To the best of our knowledge, this is the first study to compare clinical and environmental isolates of Salmonella serotype Weltevreden in Malaysia.
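
    The Dice coefficients quoted in these two records compare PFGE banding patterns rather than image masks: for a pair of isolates the coefficient is twice the number of shared bands divided by the total number of bands in both profiles. A short set-based sketch follows; the fragment sizes are made up for illustration.

        def dice_band_similarity(bands_a, bands_b):
            """Dice coefficient between two PFGE profiles given as sets of band
            positions (e.g. fragment sizes in kb): 2*shared / (len(a) + len(b))."""
            a, b = set(bands_a), set(bands_b)
            return 2.0 * len(a & b) / (len(a) + len(b))

        # hypothetical XbaI fragment sizes (kb) for two Salmonella Weltevreden isolates
        isolate_1 = {20, 33, 54, 78, 97, 140, 210, 310, 450}
        isolate_2 = {20, 33, 60, 78, 97, 140, 225, 310, 450}
        print(round(dice_band_similarity(isolate_1, isolate_2), 2))  # identical clones would give 1.00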

  7. Multi-Site Simultaneous Time-Resolved Photometry with a Low Cost Electro-Optics System †

    PubMed Central

    Gasdia, Forrest; Barjatya, Aroh; Bilardi, Sergei

    2017-01-01

    Sunlight reflected off of resident space objects can be used as an optical signal for astrometric orbit determination and for deducing geometric information about the object. With the increasing population of small satellites and debris in low Earth orbit, photometry is a powerful tool in operational support of space missions, whether for anomaly resolution or object identification. To accurately determine size, shape, spin rate, status of deployables, or attitude information of an unresolved resident space object, multi-hertz sample rate photometry is required to capture the relatively rapid changes in brightness that these objects can exhibit. OSCOM, which stands for Optical tracking and Spectral characterization of CubeSats for Operational Missions, is a low cost and portable telescope system capable of time-resolved small satellite photometry, and is field deployable on short notice for simultaneous observation from multiple sites. We present the electro-optical design principles behind OSCOM and light curves of the 1.5 U DICE-2 CubeSat and simultaneous observations of the main body of the ASTRO-H satellite after its fragmentation event. PMID:28556802

  8. Hierarchical Bayesian models to assess between- and within-batch variability of pathogen contamination in food.

    PubMed

    Commeau, Natalie; Cornu, Marie; Albert, Isabelle; Denis, Jean-Baptiste; Parent, Eric

    2012-03-01

    Assessing within-batch and between-batch variability is of major interest for risk assessors and risk managers in the context of microbiological contamination of food. For example, the ratio between the within-batch variability and the between-batch variability has a large impact on the results of a sampling plan. Here, we designed hierarchical Bayesian models to represent such variability. Compatible priors were built mathematically to obtain sound model comparisons. A numeric criterion is proposed to assess the contamination structure, comparing the ability of the models to replicate grouped data at the batch level using a posterior predictive loss approach. Models were applied to two case studies: contamination by Listeria monocytogenes of pork breast used to produce diced bacon and contamination by the same microorganism on cold smoked salmon at the end of the process. In the first case study, a contamination structure clearly exists and is located at the batch level, that is, between-batch variability is relatively strong, whereas in the second a structure also exists but is less marked. © 2012 Society for Risk Analysis.
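
    The between-batch versus within-batch decomposition that these models capture can be illustrated with a two-level Gaussian simulation on the log10 scale: each batch draws its mean from a between-batch distribution, and samples within the batch scatter around that mean. The sketch below uses placeholder variances, not the fitted posteriors from the paper.

        import numpy as np

        rng = np.random.default_rng(2)

        def simulate_hierarchy(n_batches=50, samples_per_batch=20,
                               overall_mean=-1.0, sd_between=0.8, sd_within=0.3):
            """Two-level model of log10 contamination: batch means ~ N(mu, sd_between),
            samples within a batch ~ N(batch mean, sd_within)."""
            batch_means = rng.normal(overall_mean, sd_between, size=n_batches)
            return rng.normal(batch_means[:, None], sd_within,
                              size=(n_batches, samples_per_batch))

        data = simulate_hierarchy()
        # a large between/within ratio means a sampling plan gains more from testing
        # extra batches than from taking more samples per batch
        between_var = data.mean(axis=1).var(ddof=1)
        within_var = data.var(axis=1, ddof=1).mean()
        print("between/within variance ratio ~", round(between_var / within_var, 2))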

  9. On our rapidly shrinking capacity to comply with the planetary boundaries on climate change.

    PubMed

    Mathias, Jean-Denis; Anderies, John M; Janssen, Marco A

    2017-02-07

    The planetary boundary framework constitutes an opportunity for decision makers to define climate policy through the lens of adaptive governance. Here, we use the DICE model to analyze the set of adaptive climate policies that comply with the two planetary boundaries related to climate change: (1) staying below a CO2 concentration of 550 ppm until 2100 and (2) returning to 350 ppm in 2100. Our results enable decision makers to assess the following milestones: (1) a minimum of 33% reduction of CO2 emissions by 2055 in order to stay below 550 ppm by 2100 (this milestone goes up to 46% in the case of delayed policies); and (2) carbon neutrality and the effective implementation of innovative geoengineering technologies (10% negative emissions) before 2060 in order to return to 350 ppm in 2100, under the assumption of getting out of the baseline scenario without delay. Finally, we emphasize the need to use an adaptive path-based approach instead of a single-point target for climate policy design.

  10. On our rapidly shrinking capacity to comply with the planetary boundaries on climate change

    PubMed Central

    Mathias, Jean-Denis; Anderies, John M.; Janssen, Marco A.

    2017-01-01

    The planetary boundary framework constitutes an opportunity for decision makers to define climate policy through the lens of adaptive governance. Here, we use the DICE model to analyze the set of adaptive climate policies that comply with the two planetary boundaries related to climate change: (1) staying below a CO2 concentration of 550 ppm until 2100 and (2) returning to 350 ppm in 2100. Our results enable decision makers to assess the following milestones: (1) a minimum of 33% reduction of CO2 emissions by 2055 in order to stay below 550 ppm by 2100 (this milestone goes up to 46% in the case of delayed policies); and (2) carbon neutrality and the effective implementation of innovative geoengineering technologies (10% negative emissions) before 2060 in order to return to 350 ppm in 2100, under the assumption of getting out of the baseline scenario without delay. Finally, we emphasize the need to use an adaptive path-based approach instead of a single-point target for climate policy design. PMID:28169336

  11. On our rapidly shrinking capacity to comply with the planetary boundaries on climate change

    NASA Astrophysics Data System (ADS)

    Mathias, Jean-Denis; Anderies, John M.; Janssen, Marco A.

    2017-02-01

    The planetary boundary framework constitutes an opportunity for decision makers to define climate policy through the lens of adaptive governance. Here, we use the DICE model to analyze the set of adaptive climate policies that comply with the two planetary boundaries related to climate change: (1) staying below a CO2 concentration of 550 ppm until 2100 and (2) returning to 350 ppm in 2100. Our results enable decision makers to assess the following milestones: (1) a minimum of 33% reduction of CO2 emissions by 2055 in order to stay below 550 ppm by 2100 (this milestone goes up to 46% in the case of delayed policies); and (2) carbon neutrality and the effective implementation of innovative geoengineering technologies (10% negative emissions) before 2060 in order to return to 350 ppm in 2100, under the assumption of getting out of the baseline scenario without delay. Finally, we emphasize the need to use an adaptive path-based approach instead of a single-point target for climate policy design.
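
    As a back-of-the-envelope companion to the first milestone, the sketch below converts a total emissions-reduction target into the constant year-on-year cut it implies. The 2015 start year is an assumption for illustration; the milestones in the paper come from the full DICE optimization, not from this arithmetic.

        def constant_annual_reduction(total_reduction, start_year, target_year):
            """Constant yearly fractional cut r such that emissions in target_year
            equal (1 - total_reduction) times the start_year level."""
            n_years = target_year - start_year
            return 1.0 - (1.0 - total_reduction) ** (1.0 / n_years)

        # milestones quoted in the abstract, with an assumed 2015 policy start
        for label, cut in (("stay below 550 ppm", 0.33), ("delayed policies", 0.46)):
            r = constant_annual_reduction(cut, 2015, 2055)
            print(f"{label}: {cut:.0%} by 2055 -> about {r:.2%} cut per year")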

  12. Structural features of microRNA (miRNA) precursors and their relevance to miRNA biogenesis and small interfering RNA/short hairpin RNA design.

    PubMed

    Krol, Jacek; Sobczak, Krzysztof; Wilczynska, Urszula; Drath, Maria; Jasinska, Anna; Kaczynska, Danuta; Krzyzosiak, Wlodzimierz J

    2004-10-01

    We have established the structures of 10 human microRNA (miRNA) precursors using biochemical methods. Eight of these structures turned out to be different from those that were computer-predicted. The differences localized in the terminal loop region and at the opposite side of the precursor hairpin stem. We have analyzed the features of these structures from the perspectives of miRNA biogenesis and active strand selection. We demonstrated the different thermodynamic stability profiles for pre-miRNA hairpins harboring miRNAs at their 5'- and 3'-sides and discussed their functional implications. Our results showed that miRNA prediction based on predicted precursor structures may give ambiguous results, and the success rate is significantly higher for the experimentally determined structures. On the other hand, the differences between the predicted and experimentally determined structures did not affect the stability of termini produced through "conceptual dicing." This result confirms the value of thermodynamic analysis based on mfold as a predictor of strand selection by the RNA-induced silencing complex (RISC).

  13. A translation micromirror with large quasi-static displacement and high surface quality

    NASA Astrophysics Data System (ADS)

    Xue, Yuan; He, Siyuan

    2017-01-01

    A translation micromirror with large displacement and high surface quality is presented. The micromirror consists of a magnetic actuator and a mirror plate. The actuator and the mirror plate are fabricated separately using two processes and then bonded together. The actuator consists of a moving film, which is a 20 µm thick nickel film fabricated by MetalMUMPs, and a solenoid located underneath the moving film. The moving film is designed to curve up through the residual stress gradient in the nickel film and a curve-up mechanism which includes four trapezoidal plates and anchoring springs. The mirror plate is simply diced from a polished silicon wafer and coated with a metal thin film. The mirror plate is bonded onto the central ring of the moving film. The solenoid attracts the moving film along with the mirror plate downwards to realize translation. A quasi-static displacement of 123 µm is achieved at a driving current of 400 mA. High mirror surface quality is realized, e.g. a 15.6 m radius of curvature and 2 nm surface roughness.

  14. Compact cantilever couplers for low-loss fiber coupling to silicon photonic integrated circuits.

    PubMed

    Wood, Michael; Sun, Peng; Reano, Ronald M

    2012-01-02

    We demonstrate coupling from tapered optical fibers to 450 nm by 250 nm silicon strip waveguides using compact cantilever couplers. The couplers consist of silicon inverse width tapers embedded within silicon dioxide cantilevers. Finite difference time domain simulations are used to design the length of the silicon inverse width taper to as short as 6.5 μm for a cantilever width of 2 μm. Modeling of various strip waveguide taper profiles shows reduced coupling losses for a quadratic taper profile. Infrared measurements of fabricated devices demonstrate average coupling losses of 0.62 dB per connection for the quasi-TE mode and 0.50 dB per connection for the quasi-TM mode across the optical telecommunications C band. In the wavelength range from 1477 nm to 1580 nm, coupling losses for both polarizations are less than 1 dB per connection. The compact, broadband, and low-loss coupling scheme enables direct access to photonic integrated circuits on an entire chip surface without the need for dicing or cleaving the chip.

  15. Results from an investigation of the physical origins of nonproportionality in CsI(Tl)

    NASA Astrophysics Data System (ADS)

    Asztalos, S.; Hennig, W.; Warburton, W. K.

    2011-10-01

    The relative scintillation response per energy deposited by Compton electrons, or nonproportionality, has traditionally been considered an intrinsic scintillator property. However, such an interpretation is inconsistent with recent results that show nonproportionality to depend on external factors such as shaping time, temperature and supplier. Apparently, at least some of the overall nonproportionality has an extrinsic origin. In this work we describe the results from a suite of measurements designed to test the hypothesis that nonproportionality in CsI(Tl) material has an extrinsic component that correlates with impurity levels. Our choice of material was motivated by the excellent energy resolution observed in one bulk crystal (6.4%)—a marked departure from that measured with conventional CsI(Tl) stock (8-8.5%). Six bulk CsI(Tl) crystals were procured and diced into 44 wafers. Using X-ray fluorescence techniques no conclusive evidence for impurities was found in any of the wafers at the 1-50 ppm level. One crystal exhibited a distinct correlation among energy resolution, decay lifetimes, nonproportionality and a very low level of Tl doping.

  16. Multi-Site Simultaneous Time-Resolved Photometry with a Low Cost Electro-Optics System.

    PubMed

    Gasdia, Forrest; Barjatya, Aroh; Bilardi, Sergei

    2017-05-30

    Sunlight reflected off of resident space objects can be used as an optical signal for astrometric orbit determination and for deducing geometric information about the object. With the increasing population of small satellites and debris in low Earth orbit, photometry is a powerful tool in operational support of space missions, whether for anomaly resolution or object identification. To accurately determine size, shape, spin rate, status of deployables, or attitude information of an unresolved resident space object, multi-hertz sample rate photometry is required to capture the relatively rapid changes in brightness that these objects can exhibit. OSCOM, which stands for Optical tracking and Spectral characterization of CubeSats for Operational Missions, is a low cost and portable telescope system capable of time-resolved small satellite photometry, and is field deployable on short notice for simultaneous observation from multiple sites. We present the electro-optical design principles behind OSCOM and light curves of the 1.5 U DICE-2 CubeSat and simultaneous observations of the main body of the ASTRO-H satellite after its fragmentation event.

  17. A study on the impact of nuclear power plant construction relative to decommissioning Fossil Fuel Power Plant in order to reduce carbon dioxide emissions using a modified Nordhaus Vensim DICE model

    NASA Astrophysics Data System (ADS)

    Colpetzer, Jason Lee

    The current levels of CO2 emissions and the high levels accumulating in the atmosphere have climate scientists concerned. The Dynamic Integrated Climate Economy model, or "DICE" for short, is a highly developed model that has been used to simulate climate change and evaluate factors addressing global warming. The model was developed by Yale's Nordhaus along with collaborators, building on numerous scientific publications. The purpose of this study is to recreate DICE using Vensim and modify it to evaluate the use of nuclear power plants (NPPs) as a means to counter global temperature increases in the atmosphere and oceans and the associated cost of damages. Greenhouse gas emissions per megawatt from an NPP are about 6% of those from a fossil fuel power plant (FFPP). Based on this, a model was developed to simulate construction of NPPs with subsequent decommissioning of FFPPs with an equivalent power output. The results produced through multiple simulation runs utilizing variable NPP construction rates show that some minor benefit is achievable if all of the more than 10,000 FFPPs currently in operation in the U.S. are replaced with NPPs. The results show that a reduction in CO2 emissions of 2.48% will occur if all of the FFPPs are decommissioned. At a minimum rate of 50 NPPs constructed per year, the largest reduction in atmospheric CO2, 1.94% or 44.5 billion tons of carbon, is possible. This results in a reduction in global warming of 0.068°C or 1.31%. The results also show that this reduction in global warming will be equivalent to a reduction of 8.2%, or $148 B, in anticipated annual spending as a result of climate change damages. Further results indicate that using NPPs to address climate change will provide a small benefit; ultimately, it will not be enough to reduce CO2 emissions or atmospheric CO2 to control global warming. The amount of CO2 in the atmosphere is predicted to be 1055 parts per million (ppm) even in the best case scenario, which is well above the current limit of 350 ppm proposed by Hansen et al.

  18. TU-CD-BRA-01: A Novel 3D Registration Method for Multiparametric Radiological Images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akhbardeh, A; Parekth, VS; Jacobs, MA

    2015-06-15

    Purpose: Multiparametric and multimodality radiological imaging methods, such as magnetic resonance imaging (MRI), computed tomography (CT), and positron emission tomography (PET), provide multiple types of tissue contrast and anatomical information for clinical diagnosis. However, these radiological modalities are acquired using very different technical parameters, e.g., field of view (FOV), matrix size, and scan planes, which can lead to challenges in registering the different data sets. Therefore, we developed a hybrid registration method based on 3D wavelet transformation and 3D interpolations that performs 3D resampling and rotation of the target radiological images without loss of information. Methods: T1-weighted, T2-weighted, diffusion-weighted imaging (DWI), dynamic contrast-enhanced (DCE) MRI and PET/CT were used in the registration algorithm from breast and prostate data at 3T MRI and multimodality (PET/CT) cases. The hybrid registration scheme consists of several steps to reslice and match each modality using a combination of 3D wavelets, interpolations, and affine registration steps. First, orthogonal reslicing is performed to equalize FOV, matrix sizes and the number of slices using wavelet transformation. Second, angular resampling of the target data is performed to match the reference data. Finally, using optimized angles from resampling, 3D registration is performed using a similarity transformation (scaling and translation) between the reference and resliced target volumes. After registration, the mean square error (MSE) and Dice similarity (DS) between the reference and registered target volumes were calculated. Results: The 3D registration method registered synthetic and clinical data with significant improvement (p<0.05) of overlap between anatomical structures. After transforming and deforming the synthetic data, the MSE and Dice similarity were 0.12 and 0.99. The average improvement of the MSE was 62% in breast (0.27 to 0.10) and 63% in prostate (0.13 to 0.04; p<0.05). The Dice similarity improved by 8% in breast (0.91 to 0.99) and by 89% in prostate (0.01 to 0.90; p<0.05). Conclusion: Our 3D wavelet hybrid registration approach registered diverse breast and prostate data from different radiological images (MR/PET/CT) with high accuracy.

  19. Proceedings of the DICE THROW Symposium 21-23 June 1977. Volume 3

    DTIC Science & Technology

    1977-07-01

    30-gal. steel oil drum. Rubber-tire hinges and rubber-tire seals made a snug closure between the door and the upper part of the vertical entry. The ... crack or rupture locations, which implies that considerable strength degradation associated with the presence of a fastener hole may lead to premature ... occur can be partially attributed to the strength degradation of the aircraft structure, resulting from extensive in-service use, as well as to the

  20. Industrial Equipment Survival/Recovery Feasibility Program during Event DICE THROW

    DTIC Science & Technology

    1976-12-31

    DEFINITIONS OF TERMS AND ACRONYMS: AFB, Air Force Base; ANFO, ammonium nitrate fuel oil explosive; Chimneying, falling of material from ... tank half full and transmission at normal oil level. (For best protection all tanks should be completely full.) 2. Mechanical/electrical calculator. 3...

  1. clustep: Initial conditions for galaxy cluster halo simulations

    NASA Astrophysics Data System (ADS)

    Ruggiero, Rafael

    2017-11-01

    clustep generates a snapshot in GADGET-2 (ascl:0003.001) format containing a galaxy cluster halo in equilibrium; this snapshot can also be read in RAMSES (ascl:1011.007) using the DICE patch. The halo is made of a dark matter component and a gas component, with the latter representing the ICM. Each of these components follows a Dehnen density profile, with gamma=0 or gamma=1. If gamma=1, then the profile corresponds to a Hernquist profile.
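
    The gamma=1 Dehnen case mentioned here is the Hernquist profile, rho(r) = M a / (2 pi r (r + a)^3), whose enclosed mass has the closed form M(<r) = M r^2 / (r + a)^2. A short sketch of those two expressions follows, with unit total mass and scale length chosen arbitrarily.

        import numpy as np

        def hernquist_density(r, m_total=1.0, a=1.0):
            """Hernquist (Dehnen gamma=1) density: rho = M*a / (2*pi*r*(r + a)**3)."""
            return m_total * a / (2.0 * np.pi * r * (r + a) ** 3)

        def hernquist_enclosed_mass(r, m_total=1.0, a=1.0):
            """Mass enclosed within radius r: M(<r) = M * r**2 / (r + a)**2."""
            return m_total * r ** 2 / (r + a) ** 2

        r = np.array([0.1, 1.0, 10.0, 100.0])
        print(hernquist_density(r))
        print(hernquist_enclosed_mass(r))  # approaches m_total at large r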

  2. Revenge of the nerds. How Dungeons & Dragons prepared me for the current age of narrative driven media.

    NASA Astrophysics Data System (ADS)

    Hut, Rolf

    2017-04-01

    "I want to cast magic missile" As a somewhat shy 14 year old, I could never have predicted that the years of playing the role-playing game with the weird dice would perfectly train me in "getting my research into the media". In this talk I will draw parallels between the skills a young role-playing nerd learns and a media-savvy researcher now a days need.

  3. Protection by Purines in Toxin Models of Parkinson’s Disease

    DTIC Science & Technology

    2015-08-01

    2014. Accepted for publication Sep 23, 2014. Address correspondence to Dr Ascherio, 667 Huntington Ave, Department of Nutrition, Building 2, 3rd floor...Boston, MA 02115. E-mail: aascherio@hsph.harvard.edu From the 1Department of Nutrition, Harvard School of Public Health, Boston, MA; 2Channing...Yucatan miniature pig. Evidence that inosine functions as an in vivo energy substrate. Biochim Biophys Acta 842:214–224. Zai L, Ferrari C, Dice C

  4. Fighting the Hobbesian Trinity in Colombia: A New Strategy for Peace

    DTIC Science & Technology

    2001-04-01

    that corruption can be avoided in an electoral system, nor do they address the issue that the electoral system can be used to maintain corrupt elites...Yet when corruption assists elites to manipulate the electoral system, then accountability, the very purpose of the electoral system, is nullified. In...Ibid., p. 375. 98. Paz y Derechos Humanos, “Mininterior dice que armar a civiles aumentaria violencia; No a milicias: Gobierno,” El Colombiano, November

  5. Graph cuts and neural networks for segmentation and porosity quantification in Synchrotron Radiation X-ray μCT of an igneous rock sample.

    PubMed

    Meneses, Anderson Alvarenga de Moura; Palheta, Dayara Bastos; Pinheiro, Christiano Jorge Gomes; Barroso, Regina Cely Rodrigues

    2018-03-01

    X-ray Synchrotron Radiation Micro-Computed Tomography (SR-µCT) allows better visualization in three dimensions with higher spatial resolution, contributing to the discovery of aspects that could not be observed through conventional radiography. The automatic segmentation of SR-µCT scans is highly valuable due to its innumerable applications in the geological sciences, especially for the morphology, typology, and characterization of rocks. For a great number of µCT scan slices, a manual segmentation process would be impractical, both for the time expended and for the accuracy of the results. Aiming at the automatic segmentation of SR-µCT geological sample images, we applied and compared Energy Minimization via Graph Cuts (GC) algorithms and Artificial Neural Networks (ANNs), as well as the well-known K-means and Fuzzy C-Means algorithms. The Dice Similarity Coefficient (DSC), Sensitivity, and Precision were the metrics used for comparison. Kruskal-Wallis and Dunn's tests were applied, and the best methods were the GC algorithms and ANNs (with Levenberg-Marquardt and Bayesian Regularization). For those algorithms, an approximate Dice Similarity Coefficient of 95% was achieved. Our results confirm the possibility of using those algorithms for segmentation and posterior quantification of porosity of an igneous rock sample SR-µCT scan. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Methods for 2-D and 3-D Endobronchial Ultrasound Image Segmentation.

    PubMed

    Zang, Xiaonan; Bascom, Rebecca; Gilbert, Christopher; Toth, Jennifer; Higgins, William

    2016-07-01

    Endobronchial ultrasound (EBUS) is now commonly used for cancer-staging bronchoscopy. Unfortunately, EBUS is challenging to use and interpreting EBUS video sequences is difficult. Other ultrasound imaging domains, hampered by related difficulties, have benefited from computer-based image-segmentation methods. Yet, so far, no such methods have been proposed for EBUS. We propose image-segmentation methods for 2-D EBUS frames and 3-D EBUS sequences. Our 2-D method adapts the fast-marching level-set process, anisotropic diffusion, and region growing to the problem of segmenting 2-D EBUS frames. Our 3-D method builds upon the 2-D method while also incorporating the geodesic level-set process for segmenting EBUS sequences. Tests with lung-cancer patient data showed that the methods ran fully automatically for nearly 80% of test cases. For the remaining cases, the only user-interaction required was the selection of a seed point. When compared to ground-truth segmentations, the 2-D method achieved an overall Dice index = 90.0% ± 4.9%, while the 3-D method achieved an overall Dice index = 83.9% ± 6.0%. In addition, the computation time (2-D, 0.070 s/frame; 3-D, 0.088 s/frame) was two orders of magnitude faster than interactive contour definition. Finally, we demonstrate the potential of the methods for EBUS localization in a multimodal image-guided bronchoscopy system.

  7. Comparative Approach of MRI-Based Brain Tumor Segmentation and Classification Using Genetic Algorithm.

    PubMed

    Bahadure, Nilesh Bhaskarrao; Ray, Arun Kumar; Thethi, Har Pal

    2018-01-17

    The detection of a brain tumor and its classification from modern imaging modalities is a primary concern, but it is time-consuming and tedious work for radiologists or clinical supervisors. The accuracy of tumor detection and stage classification performed by radiologists depends only on their experience, so computer-aided technology is very important to improve diagnostic accuracy. In this study, to improve the performance of tumor detection, we investigated a comparative approach using different segmentation techniques and selected the best one by comparing their segmentation scores. Further, to improve the classification accuracy, a genetic algorithm is employed for the automatic classification of tumor stage. The classification decision is supported by extracting relevant features and by area calculation. The experimental results of the proposed technique are evaluated and validated for performance and quality analysis on magnetic resonance brain images, based on segmentation score, accuracy, sensitivity, specificity, and Dice similarity index coefficient. The experiments achieved 92.03% accuracy, 91.42% specificity, 92.36% sensitivity, and an average segmentation score between 0.82 and 0.93, demonstrating the effectiveness of the proposed technique for identifying normal and abnormal tissues in brain MR images. The experiments also yielded an average Dice similarity index coefficient of 93.79%, which indicates better overlap between the automatically extracted tumor regions and the tumor regions manually extracted by radiologists.

  8. Concerning Dice and Divinity

    NASA Astrophysics Data System (ADS)

    Appleby, D. M.

    2007-02-01

    Einstein initially objected to the probabilistic aspect of quantum mechanics—the idea that God is playing at dice. Later he changed his ground, and focussed instead on the point that the Copenhagen Interpretation leads to what Einstein saw as the abandonment of physical realism. We argue here that Einstein's initial intuition was perfectly sound, and that it is precisely the fact that quantum mechanics is a fundamentally probabilistic theory which is at the root of all the controversies regarding its interpretation. Probability is an intrinsically logical concept. This means that the quantum state has an essentially logical significance. It is extremely difficult to reconcile that fact with Einstein's belief that it is the task of physics to give us a vision of the world apprehended sub specie aeternitatis. Quantum mechanics thus presents us with a simple choice: either to follow Einstein in looking for a theory which is not probabilistic at the fundamental level, or else to accept that physics does not in fact put us in the position of God looking down on things from above. There is a widespread fear that the latter alternative must inevitably lead to a greatly impoverished, positivistic view of physical theory. It appears to us, however, that the truth is just the opposite. The Einsteinian vision is much less attractive than it seems at first sight. In particular, it is closely connected with philosophical reductionism.

  9. Machine learning in a graph framework for subcortical segmentation

    NASA Astrophysics Data System (ADS)

    Guo, Zhihui; Kashyap, Satyananda; Sonka, Milan; Oguz, Ipek

    2017-02-01

    Automated and reliable segmentation of subcortical structures from human brain magnetic resonance images is of great importance for volumetric and shape analyses in quantitative neuroimaging studies. However, poor boundary contrast and variable shape of these structures make automated segmentation a tough task. We propose a 3D graph-based machine learning method, called LOGISMOS-RF, to segment the caudate and the putamen from brain MRI scans in a robust and accurate way. An atlas-based tissue classification and bias-field correction method is applied to the images to generate an initial segmentation for each structure. Then a 3D graph framework is utilized to construct a geometric graph for each initial segmentation. A locally trained random forest classifier is used to assign a cost to each graph node. The max-flow algorithm is applied to solve the segmentation problem. Evaluation was performed on a dataset of T1-weighted MRIs of 62 subjects, with 42 images used for training and 20 images for testing. For comparison, FreeSurfer, FSL and BRAINSCut approaches were also evaluated using the same dataset. Dice overlap coefficients and surface-to-surface distances between the automated segmentation and expert manual segmentations indicate that the results of our method are statistically significantly more accurate than the three other methods, for both the caudate (Dice: 0.89 +/- 0.03) and the putamen (0.89 +/- 0.03).

  10. Fully automatic acute ischemic lesion segmentation in DWI using convolutional neural networks.

    PubMed

    Chen, Liang; Bentley, Paul; Rueckert, Daniel

    2017-01-01

    Stroke is an acute cerebral vascular disease, which is likely to cause long-term disabilities and death. Acute ischemic lesions occur in most stroke patients. These lesions are treatable with accurate diagnosis and treatment. Although diffusion-weighted MR imaging (DWI) is sensitive to these lesions, localizing and quantifying them manually is costly and challenging for clinicians. In this paper, we propose a novel framework to automatically segment stroke lesions in DWI. Our framework consists of two convolutional neural networks (CNNs): one is an ensemble of two DeconvNets (Noh et al., 2015), which is the EDD Net; the second CNN is the multi-scale convolutional label evaluation net (MUSCLE Net), which aims to evaluate the lesions detected by the EDD Net in order to remove potential false positives. To the best of our knowledge, this is the first attempt to solve this problem, and using both CNNs achieves very good results. Furthermore, we study the network architectures and key configurations in detail to ensure the best performance. The framework is validated on a large dataset comprising clinically acquired DW images from 741 subjects. The mean Dice coefficient obtained is 0.67 overall. The mean Dice scores for subjects with only small lesions and with large lesions are 0.61 and 0.83, respectively. The lesion detection rate achieved is 0.94.

  11. BEaST: brain extraction based on nonlocal segmentation technique.

    PubMed

    Eskildsen, Simon F; Coupé, Pierrick; Fonov, Vladimir; Manjón, José V; Leung, Kelvin K; Guizard, Nicolas; Wassef, Shafik N; Østergaard, Lasse Riis; Collins, D Louis

    2012-02-01

    Brain extraction is an important step in the analysis of brain images. The variability in brain morphology and the difference in intensity characteristics due to imaging sequences make the development of a general purpose brain extraction algorithm challenging. To address this issue, we propose a new robust method (BEaST) dedicated to producing consistent and accurate brain extraction. This method is based on nonlocal segmentation embedded in a multi-resolution framework. A library of 80 priors is semi-automatically constructed from the NIH-sponsored MRI study of normal brain development, the International Consortium for Brain Mapping, and the Alzheimer's Disease Neuroimaging Initiative databases. In testing, a mean Dice similarity coefficient of 0.9834±0.0053 was obtained when performing leave-one-out cross validation selecting only 20 priors from the library. Validation using the online Segmentation Validation Engine resulted in a top ranking position with a mean Dice coefficient of 0.9781±0.0047. Robustness of BEaST is demonstrated on all baseline ADNI data, resulting in a very low failure rate. The segmentation accuracy of the method is better than that of two widely used publicly available methods and recent state-of-the-art hybrid approaches. BEaST provides results comparable to a recent label fusion approach, while being 40 times faster and requiring a much smaller library of priors. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. A concept for holistic whole body MRI data analysis, Imiomics

    PubMed Central

    Malmberg, Filip; Johansson, Lars; Lind, Lars; Sundbom, Magnus; Ahlström, Håkan; Kullberg, Joel

    2017-01-01

    Purpose: To present and evaluate a whole-body image analysis concept, Imiomics (imaging–omics), and an image registration method that enables Imiomics analyses by deforming all image data to a common coordinate system, so that the information in each voxel can be compared between persons or within a person over time and integrated with non-imaging data. Methods: The presented image registration method utilizes relative elasticity constraints of different tissues obtained from whole-body water-fat MRI. The registration method is evaluated by inverse consistency and Dice coefficients, and the Imiomics concept is evaluated by example analyses of importance for metabolic research using non-imaging parameters where we know what to expect. The example analyses include whole-body imaging atlas creation, anomaly detection, and cross-sectional and longitudinal analysis. Results: The image registration method evaluation on 128 subjects shows low inverse consistency errors and high Dice coefficients. Also, the statistical atlas with fat content intensity values shows low standard deviation values, indicating successful deformations to the common coordinate system. The example analyses show expected associations and correlations which agree with explicit measurements, and thereby illustrate the usefulness of the proposed Imiomics concept. Conclusions: The registration method is well-suited for Imiomics analyses, which enable analyses of relationships to non-imaging data, e.g. clinical data, in new types of holistic targeted and untargeted big-data analysis. PMID:28241015

  13. Association between bacterial survival and free chlorine concentration during commercial fresh-cut produce wash operation.

    PubMed

    Luo, Yaguang; Zhou, Bin; Van Haute, Sam; Nou, Xiangwu; Zhang, Boce; Teng, Zi; Turner, Ellen R; Wang, Qin; Millner, Patricia D

    2018-04-01

    Determining the minimal effective free chlorine (FC) concentration for preventing pathogen survival and cross-contamination during produce washing is critical for developing science- and risk-based food safety practices. The correlation between dynamic FC concentrations and bacterial survival was investigated during commercial washing of chopped Romaine lettuce, shredded Iceberg lettuce, and diced cabbage, as a pathogen inoculation study during commercial operation is not feasible. Wash water was sampled every 30 min and assayed for organic loading, FC, and total aerobic mesophilic bacteria after chlorine neutralization. Water turbidity, chemical oxygen demand, and total dissolved solids increased significantly over time, with more rapid increases in diced cabbage water. Combined chlorine increased consistently while FC fluctuated in response to rates of chlorine dosing, product loading, and water replenishment. Total bacterial survival showed a strong correlation with real-time FC concentration. Below approximately 10 mg/L, increasing FC significantly reduced the frequency and population of surviving bacteria detected. Increasing FC further reduced the aerobic plate count to below the detection limit (50 CFU/100 mL), except for a few sporadic positive samples with low cell counts. This study confirms that maintaining at least 10 mg/L FC in wash water strongly reduced the likelihood of bacterial survival and thus potential cross-contamination of washed produce. Published by Elsevier Ltd.

  14. Statistical Validation of Automatic Methods for Hippocampus Segmentation in MR Images of Epileptic Patients

    PubMed Central

    Hosseini, Mohammad-Parsa; Nazem-Zadeh, Mohammad R.; Pompili, Dario; Soltanian-Zadeh, Hamid

    2015-01-01

    Hippocampus segmentation is a key step in the evaluation of mesial Temporal Lobe Epilepsy (mTLE) by MR images. Several automated segmentation methods have been introduced for medical image segmentation. Because of multiple edges, missing boundaries, and shape changes along its longitudinal axis, manual outlining still remains the benchmark for hippocampus segmentation, which, however, is impractical for large datasets due to time constraints. In this study, four automatic methods, namely FreeSurfer, Hammer, Automatic Brain Structure Segmentation (ABSS), and LocalInfo segmentation, are evaluated to find the most accurate and applicable method that best resembles the manual benchmark for the hippocampus. Results from these four methods are compared against those obtained using manual segmentation for T1-weighted images of 157 symptomatic mTLE patients. For performance evaluation of automatic segmentation, the Dice coefficient, Hausdorff distance, Precision, and Root Mean Square (RMS) distance are extracted and compared. Among these four automated methods, ABSS generates the most accurate results, and statistical validation shows that its reproducibility is the most similar to expert manual outlining. Considering p-value < 0.05, the performance measurements for ABSS reveal that Dice is 4%, 13%, and 17% higher, Hausdorff is 23%, 87%, and 70% lower, precision is 5%, -5%, and 12% higher, and RMS is 19%, 62%, and 65% lower compared to LocalInfo, FreeSurfer, and Hammer, respectively. PMID:25571043
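
    Several of the comparison metrics above (the Hausdorff distance in particular) are easy to reproduce. Below is a minimal sketch of a symmetric Hausdorff distance between two boundary point clouds using SciPy; both point sets are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

rng = np.random.default_rng(0)

# Invented 3D boundary points standing in for automated and manual hippocampus surfaces.
auto_pts = rng.random((500, 3))
manual_pts = auto_pts + 0.01 * rng.standard_normal((500, 3))

# Symmetric Hausdorff distance: the worst-case disagreement between the two boundaries.
h = max(directed_hausdorff(auto_pts, manual_pts)[0],
        directed_hausdorff(manual_pts, auto_pts)[0])
print(h)
```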

  15. Intra-patient semi-automated segmentation of the cervix-uterus in CT-images for adaptive radiotherapy of cervical cancer

    NASA Astrophysics Data System (ADS)

    Luiza Bondar, M.; Hoogeman, Mischa; Schillemans, Wilco; Heijmen, Ben

    2013-08-01

    For online adaptive radiotherapy of cervical cancer, fast and accurate image segmentation is required to facilitate daily treatment adaptation. Our aim was twofold: (1) to test and compare three intra-patient automated segmentation methods for the cervix-uterus structure in CT-images and (2) to improve the segmentation accuracy by including prior knowledge on the daily bladder volume or on the daily coordinates of implanted fiducial markers. The tested methods were: shape deformation (SD) and atlas-based segmentation (ABAS) using two non-rigid registration methods: demons and a hierarchical algorithm. Tests on 102 CT-scans of 13 patients demonstrated that the segmentation accuracy significantly increased by including the bladder volume predicted with a simple 1D model based on a manually defined bladder top. Moreover, manually identified implanted fiducial markers significantly improved the accuracy of the SD method. For patients with large cervix-uterus volume regression, the use of CT-data acquired toward the end of the treatment was required to improve segmentation accuracy. Including prior knowledge, the segmentation results of SD (Dice similarity coefficient 85 ± 6%, error margin 2.2 ± 2.3 mm, average time around 1 min) and of ABAS using hierarchical non-rigid registration (Dice 82 ± 10%, error margin 3.1 ± 2.3 mm, average time around 30 s) support their use for image guided online adaptive radiotherapy of cervical cancer.

  16. Intra-patient semi-automated segmentation of the cervix-uterus in CT-images for adaptive radiotherapy of cervical cancer.

    PubMed

    Bondar, M Luiza; Hoogeman, Mischa; Schillemans, Wilco; Heijmen, Ben

    2013-08-07

    For online adaptive radiotherapy of cervical cancer, fast and accurate image segmentation is required to facilitate daily treatment adaptation. Our aim was twofold: (1) to test and compare three intra-patient automated segmentation methods for the cervix-uterus structure in CT-images and (2) to improve the segmentation accuracy by including prior knowledge on the daily bladder volume or on the daily coordinates of implanted fiducial markers. The tested methods were: shape deformation (SD) and atlas-based segmentation (ABAS) using two non-rigid registration methods: demons and a hierarchical algorithm. Tests on 102 CT-scans of 13 patients demonstrated that the segmentation accuracy significantly increased by including the bladder volume predicted with a simple 1D model based on a manually defined bladder top. Moreover, manually identified implanted fiducial markers significantly improved the accuracy of the SD method. For patients with large cervix-uterus volume regression, the use of CT-data acquired toward the end of the treatment was required to improve segmentation accuracy. Including prior knowledge, the segmentation results of SD (Dice similarity coefficient 85 ± 6%, error margin 2.2 ± 2.3 mm, average time around 1 min) and of ABAS using hierarchical non-rigid registration (Dice 82 ± 10%, error margin 3.1 ± 2.3 mm, average time around 30 s) support their use for image guided online adaptive radiotherapy of cervical cancer.

  17. Filtering NetCDF Files by Using the EverVIEW Slice and Dice Tool

    USGS Publications Warehouse

    Conzelmann, Craig; Romañach, Stephanie S.

    2010-01-01

    Network Common Data Form (NetCDF) is a self-describing, machine-independent file format for storing array-oriented scientific data. It was created to provide a common interface between applications and real-time meteorological and other scientific data. Over the past few years, there has been a growing movement within the community of natural resource managers in The Everglades, Fla., to use NetCDF as the standard data container for datasets based on multidimensional arrays. As a consequence, a need surfaced for additional tools to view and manipulate NetCDF datasets, specifically to filter the files by creating subsets of large NetCDF files. The U.S. Geological Survey (USGS) and the Joint Ecosystem Modeling (JEM) group are working to address these needs with applications like the EverVIEW Slice and Dice Tool, which allows users to filter grid-based NetCDF files, thus targeting those data most important to them. The major functions of this tool are as follows: (1) to create subsets of NetCDF files temporally, spatially, and by data value; (2) to view the NetCDF data in table form; and (3) to export the filtered data to a comma-separated value (CSV) file format. The USGS and JEM will continue to work with scientists and natural resource managers across The Everglades to solve complex restoration problems through technological advances.
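
    The subsetting workflow described above (temporal, spatial, and value-based filtering followed by CSV export) can be approximated in a few lines outside the EverVIEW tool itself. The sketch below uses the xarray library; the file name, variable name, and coordinate bounds are all hypothetical placeholders.

```python
import xarray as xr

# Hypothetical NetCDF file and variable names; replace with a real dataset.
ds = xr.open_dataset("everglades_stage.nc")

# Subset temporally and spatially, then filter by data value.
subset = ds["water_depth"].sel(
    time=slice("2009-01-01", "2009-12-31"),
    lat=slice(25.0, 26.5),
    lon=slice(-81.5, -80.0),
)
subset = subset.where(subset > 0.1)  # keep only cells above a depth threshold

# Export the filtered grid to a comma-separated value file.
subset.to_dataframe().dropna().to_csv("water_depth_subset.csv")
```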

  18. The effects of gamma irradiation on the microbiological, physical and sensory qualities of diced tomatoes

    NASA Astrophysics Data System (ADS)

    Prakash, Anuradha; Manley, Jacqueline; DeCosta, Suresh; Caporaso, Fred; Foley, Denise

    2002-03-01

    Diced Roma tomatoes were treated with gamma irradiation and evaluated for changes in microbial, physical, chemical, and sensory properties. Doses for Trial 1 were 0.0, 0.39, 0.56, and 1.82 kGy and for Trial 2 were 0.0, 0.50, 1.24, and 3.70 kGy. Irradiation at 3.70 kGy resulted in no aerobic populations through day 12 and significantly fewer colonies through day 15, whereas yeast and mold populations experienced a 2 log reduction through day 12. Color, titratable acidity, and °Brix were not significantly affected by irradiation. Tissue firmness decreased with increasing dose but not with storage time. Treatment with 3.7 kGy decreased firmness by 50% and treatment with 0.5 kGy by 20%; however, the reduced firmness induced by 0.50 kGy was undetected by a 9-member trained sensory panel. A significant (p⩽0.05) inverse correlation between changes in texture and water-soluble pectin (WSP) was determined, while total pectin remained relatively constant and oxalate-soluble pectin content decreased slightly with irradiation dose. The significant inverse correlation between the loss of firmness and WSP indicates that changes in WSP play an important role in the tissue softening of tomatoes. This study indicates that irradiation at 0.5 kGy can reduce microbial counts substantially and improve microbial shelf life without adverse effects on sensory qualities.

  19. Efficient patient modeling for visuo-haptic VR simulation using a generic patient atlas.

    PubMed

    Mastmeyer, Andre; Fortmeier, Dirk; Handels, Heinz

    2016-08-01

    This work presents a new time-saving virtual patient modeling system, demonstrated by way of example on an existing visuo-haptic training and planning virtual reality (VR) system for percutaneous transhepatic cholangio-drainage (PTCD). Our modeling process starts from a generic patient atlas, which is defined by organ-specific optimized models, method modules, and parameters, i.e., mainly individual segmentation masks, transfer functions to fill the gaps between the masks, and intensity image data. In this contribution, we show how generic patient atlases can be generalized to new patient data. The methodology consists of patient-specific, locally adaptive transfer functions and dedicated modeling methods such as multi-atlas segmentation, vessel filtering, and spline modeling. Our full image volume segmentation algorithm yields median DICE coefficients of 0.98, 0.93, 0.82, 0.74, 0.51, and 0.48 for soft tissue, liver, bone, skin, and blood and bile vessels for ten test patients and three selected reference patients. Compared to standard slice-wise manual contouring, the time saving is remarkable. Our segmentation process demonstrates efficiency and robustness for upper abdominal puncture simulation systems. This marks a significant step toward establishing patient-specific training and hands-on planning systems in a clinical environment. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. Auto-Context Convolutional Neural Network (Auto-Net) for Brain Extraction in Magnetic Resonance Imaging.

    PubMed

    Mohseni Salehi, Seyed Sadegh; Erdogmus, Deniz; Gholipour, Ali

    2017-11-01

    Brain extraction or whole brain segmentation is an important first step in many of the neuroimage analysis pipelines. The accuracy and the robustness of brain extraction, therefore, are crucial for the accuracy of the entire brain analysis process. The state-of-the-art brain extraction techniques rely heavily on the accuracy of alignment or registration between brain atlases and query brain anatomy, and/or make assumptions about the image geometry, and therefore have limited success when these assumptions do not hold or image registration fails. With the aim of designing an accurate, learning-based, geometry-independent, and registration-free brain extraction tool, in this paper, we present a technique based on an auto-context convolutional neural network (CNN), in which intrinsic local and global image features are learned through 2-D patches of different window sizes. We consider two different architectures: 1) a voxelwise approach based on three parallel 2-D convolutional pathways for three different directions (axial, coronal, and sagittal) that implicitly learn 3-D image information without the need for computationally expensive 3-D convolutions and 2) a fully convolutional network based on the U-net architecture. Posterior probability maps generated by the networks are used iteratively as context information along with the original image patches to learn the local shape and connectedness of the brain to extract it from non-brain tissue. The brain extraction results we have obtained from our CNNs are superior to the recently reported results in the literature on two publicly available benchmark data sets, namely, LPBA40 and OASIS, in which we obtained the Dice overlap coefficients of 97.73% and 97.62%, respectively. Significant improvement was achieved via our auto-context algorithm. Furthermore, we evaluated the performance of our algorithm in the challenging problem of extracting arbitrarily oriented fetal brains in reconstructed fetal brain magnetic resonance imaging (MRI) data sets. In this application, our voxelwise auto-context CNN performed much better than the other methods (Dice coefficient: 95.97%), where the other methods performed poorly due to the non-standard orientation and geometry of the fetal brain in MRI. Through training, our method can provide accurate brain extraction in challenging applications. This, in turn, may reduce the problems associated with image registration in segmentation tasks.

  1. Improving accuracy of simultaneously reconstructed activity and attenuation maps using deep learning.

    PubMed

    Hwang, Donghwi; Kim, Kyeong Yun; Kang, Seung Kwan; Seo, Seongho; Paeng, Jin Chul; Lee, Dong Soo; Lee, Jae Sung

    2018-02-15

    Simultaneous reconstruction of activity and attenuation using the maximum likelihood reconstruction of activity and attenuation (MLAA) augmented by time-of-flight (TOF) information is a promising method for positron emission tomography (PET) attenuation correction. However, it still suffers from several problems, including crosstalk artifacts, slow convergence speed, and noisy attenuation maps (μ-maps). In this work, we developed deep convolutional neural networks (CNNs) to overcome these MLAA limitations, and we verified their feasibility using a clinical brain PET data set. Methods: We applied the proposed method to one of the most challenging PET cases for simultaneous image reconstruction (18F-FP-CIT PET scans with highly specific binding to the striatum of the brain). Three different CNN architectures (convolutional autoencoder (CAE), U-net, and a hybrid of CAE and U-net) were designed and trained to learn the x-ray computed tomography (CT)-derived μ-map (μ-CT) from the MLAA-generated activity distribution and μ-map (μ-MLAA). PET/CT data of 40 patients with suspected Parkinson's disease were employed for five-fold cross-validation. For the training of the CNNs, 800,000 transverse PET slices and CTs augmented from 32 patient data sets were used. The similarity to μ-CT of the CNN-generated μ-maps (μ-CAE, μ-Unet, and μ-Hybrid) and μ-MLAA was compared using Dice similarity coefficients. In addition, we compared the activity concentration of specific (striatum) and non-specific binding regions (cerebellum and occipital cortex) and the binding ratios in the striatum in the PET activity images reconstructed using those μ-maps. Results: The CNNs generated less noisy and more uniform μ-maps than the original μ-MLAA. Moreover, the air cavities and bones were better resolved in the proposed CNN outputs. In addition, the proposed deep learning approach was useful for mitigating the crosstalk problem in the MLAA reconstruction. The hybrid network of CAE and U-net yielded the most similar μ-maps to μ-CT (Dice similarity coefficient in the whole head = 0.79 in the bone and 0.72 in air cavities), resulting in only approximately 5% errors in activity and binding ratio quantification. Conclusion: The proposed deep learning approach is promising for accurate attenuation correction of activity distribution in TOF PET systems. Copyright © 2018 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  2. Proceedings of the DICE THROW Symposium 21-23 June 1977. Volume 2

    DTIC Science & Technology

    1977-07-01

    structure between 2.7 km and 4.3 km MSL that could cause distant blast focusing. Detailed acoustic ray calculations showed a caustic ring about 10 km...depending on just where the focus or caustic wave might strike. Propagation toward Truth or Consequences, NM, shown by Figure 14, was slightly ducted...layer. Thus there probably was no focus or caustic that struck any part of that small town. The recorded signal with 370-Pa amplitude was noisy

  3. Binomial test statistics using Psi functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowman, Kimiko o

    2007-01-01

    For the negative binomial model (probability generating function (p + 1 - pt)^(-k)), a logarithmic derivative is the Psi function difference ψ(k + x) - ψ(k); this and its derivatives lead to a test statistic to decide on the validity of a specified model. The test statistic uses a data base so there exists a comparison available between theory and application. Note that the test function is not dominated by outliers. Applications to (i) Fisher's tick data, (ii) accidents data, (iii) Weldon's dice data are included.
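
    As a concrete illustration of the quantities above, the difference ψ(k + x) - ψ(k) and its first derivative can be evaluated with SciPy's digamma and polygamma functions; the parameter values below are arbitrary and are not taken from the cited data sets.

```python
from scipy.special import digamma, polygamma

k = 2.5   # arbitrary negative binomial shape parameter
x = 3     # arbitrary observed count

# Logarithmic-derivative term psi(k + x) - psi(k) and its first derivative,
# the building blocks of the test statistic described above.
psi_diff = digamma(k + x) - digamma(k)
psi_diff_derivative = polygamma(1, k + x) - polygamma(1, k)
print(psi_diff, psi_diff_derivative)
```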

  4. Structural Modeling and Response of Command, Control and Communication Shelter Systems for Event DICE THROW.

    DTIC Science & Technology

    1980-03-01

    LIST OF ILLUSTRATIONS (CONT'D): Figure 2.26, Grid Point System for AN/TRC-145...systems consists of a basic shelter structure whose side walls are of sandwich construction with internal stiffeners. Channel extrusions along each...free edge of the shelter provide additional strength and stiffening. The shelters contain electronic equipment racks of open framework construction using

  5. Solder Reflow Failures in Electronic Components During Manual Soldering

    NASA Technical Reports Server (NTRS)

    Teverovsky, Alexander; Greenwell, Chris; Felt, Frederick

    2008-01-01

    This viewgraph presentation reviews the solder reflow failures in electronic components that occur during manual soldering. It discusses the specifics of manual-soldering-induced failures in plastic devices with internal solder joints. The failure analysis revealed that molten solder had squeezed up to the die surface along the die/molding compound interface, and that the dice were not protected with glassivation, allowing solder to short the gate and source to the drain contact. The failure analysis concluded that the parts failed due to overheating during manual soldering.

  6. Genetic diversity of popcorn genotypes using molecular analysis.

    PubMed

    Resh, F S; Scapim, C A; Mangolin, C A; Machado, M F P S; do Amaral, A T; Ramos, H C C; Vivas, M

    2015-08-19

    In this study, we analyzed dominant molecular markers to estimate the genetic divergence of 26 popcorn genotypes and evaluate whether using various dissimilarity coefficients with these dominant markers influences the results of cluster analysis. Fifteen random amplification of polymorphic DNA primers produced 157 amplified fragments, of which 65 were monomorphic and 92 were polymorphic. To calculate the genetic distances among the 26 genotypes, the complements of the Jaccard, Dice, and Rogers and Tanimoto similarity coefficients were used. A matrix of Dij values (dissimilarity matrix) was constructed, from which the genetic distances among genotypes were represented in a more simplified manner as a dendrogram generated using the unweighted pair-group method with arithmetic average. Clusters determined by molecular analysis generally did not group material from the same parental origin together. The largest genetic distance was between varieties 17 (UNB-2) and 18 (PA-091). In the identification of genotypes with the smallest genetic distance, the 3 coefficients showed no agreement. The 3 dissimilarity coefficients showed no major differences among their grouping patterns because agreement in determining the genotypes with large, medium, and small genetic distances was high. The largest genetic distances were observed for the Rogers and Tanimoto dissimilarity coefficient (0.74), followed by the Jaccard coefficient (0.65) and the Dice coefficient (0.48). The 3 coefficients showed similar estimations for the cophenetic correlation coefficient. Correlations among the matrices generated using the 3 coefficients were positive and had high magnitudes, reflecting strong agreement among the results obtained using the 3 evaluated dissimilarity coefficients.
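
    For reference, the complements of the Jaccard, Dice, and Rogers-Tanimoto similarities compared above can be computed from a pair of 0/1 band-presence profiles as sketched below; the two example profiles are invented and are not the RAPD data of the study.

```python
import numpy as np

def dissimilarities(x: np.ndarray, y: np.ndarray) -> dict:
    """Complements of the Jaccard, Dice and Rogers-Tanimoto similarities for 0/1 vectors."""
    a = np.sum((x == 1) & (y == 1))   # bands present in both genotypes
    b = np.sum((x == 1) & (y == 0))
    c = np.sum((x == 0) & (y == 1))
    d = np.sum((x == 0) & (y == 0))   # bands absent in both genotypes
    jaccard = a / (a + b + c)
    dice = 2 * a / (2 * a + b + c)
    rogers_tanimoto = (a + d) / (a + d + 2 * (b + c))
    return {"Jaccard": 1 - jaccard, "Dice": 1 - dice, "RogersTanimoto": 1 - rogers_tanimoto}

# Invented band profiles standing in for two genotypes.
g1 = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1, 1])
g2 = np.array([1, 1, 1, 0, 0, 1, 0, 1, 1, 0])
print(dissimilarities(g1, g2))
```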

  7. An Approach for Reducing the Error Rate in Automated Lung Segmentation

    PubMed Central

    Gill, Gurman; Beichel, Reinhard R.

    2016-01-01

    Robust lung segmentation is challenging, especially when tens of thousands of lung CT scans need to be processed, as required by large multi-center studies. The goal of this work was to develop and assess a method for the fusion of segmentation results from two different methods to generate lung segmentations that have a lower failure rate than individual input segmentations. As basis for the fusion approach, lung segmentations generated with a region growing and model-based approach were utilized. The fusion result was generated by comparing input segmentations and selectively combining them using a trained classification system. The method was evaluated on a diverse set of 204 CT scans of normal and diseased lungs. The fusion approach resulted in a Dice coefficient of 0.9855 ± 0.0106 and showed a statistically significant improvement compared to both input segmentation methods. In addition, the failure rate at different segmentation accuracy levels was assessed. For example, when requiring that lung segmentations must have a Dice coefficient of better than 0.97, the fusion approach had a failure rate of 6.13%. In contrast, the failure rate for region growing and model-based methods was 18.14% and 15.69%, respectively. Therefore, the proposed method improves the quality of the lung segmentations, which is important for subsequent quantitative analysis of lungs. Also, to enable a comparison with other methods, results on the LOLA11 challenge test set are reported. PMID:27447897

  8. Microscopic validation of whole mouse micro-metastatic tumor imaging agents using cryo-imaging and sliding organ image registration.

    PubMed

    Liu, Yiqiao; Zhou, Bo; Qutaish, Mohammed; Wilson, David L

    2016-01-01

    We created a metastasis imaging and analysis platform consisting of software and a multi-spectral cryo-imaging system suitable for evaluating emerging imaging agents targeting micro-metastatic tumors. We analyzed CREKA-Gd in MRI, followed by cryo-imaging which repeatedly sectioned and tiled microscope images of the tissue block face, providing anatomical bright field and molecular fluorescence, enabling 3D microscopic imaging of the entire mouse with single metastatic cell sensitivity. To register MRI volumes to the cryo bright field reference, we used our standard mutual information, non-rigid registration which proceeded: preprocess → affine → B-spline non-rigid 3D registration. In this report, we created two modified approaches: mask, where we registered locally over a smaller rectangular solid, and sliding organ. Briefly, in sliding organ, we segmented the organ, registered the organ and body volumes separately and combined the results. Though sliding organ required manual annotation, it provided the best result as a standard to measure other registration methods. Regularization parameters for the standard and mask methods were optimized in a grid search. Evaluations consisted of DICE and visual scoring of a checkerboard display. Standard had accuracy of 2 voxels in all regions except near the kidney, where there were 5 voxels of sliding. After mask and sliding organ correction, kidney sliding was within 2 voxels, and Dice overlap increased 4%-10% in mask compared to standard. Mask generated results comparable with sliding organ and allowed a semi-automatic process.

  9. Automatic Nuclei Segmentation in H&E Stained Breast Cancer Histopathology Images

    PubMed Central

    Veta, Mitko; van Diest, Paul J.; Kornegoor, Robert; Huisman, André; Viergever, Max A.; Pluim, Josien P. W.

    2013-01-01

    The introduction of fast digital slide scanners that provide whole slide images has led to a revival of interest in image analysis applications in pathology. Segmentation of cells and nuclei is an important first step towards automatic analysis of digitized microscopy images. We therefore developed an automated nuclei segmentation method that works with hematoxylin and eosin (H&E) stained breast cancer histopathology images, which represent regions of whole digital slides. The procedure can be divided into four main steps: 1) pre-processing with color unmixing and morphological operators, 2) marker-controlled watershed segmentation at multiple scales and with different markers, 3) post-processing for rejection of false regions and 4) merging of the results from multiple scales. The procedure was developed on a set of 21 breast cancer cases (subset A) and tested on a separate validation set of 18 cases (subset B). The evaluation was done in terms of both detection accuracy (sensitivity and positive predictive value) and segmentation accuracy (Dice coefficient). The mean estimated sensitivity for subset A was 0.875 (±0.092) and for subset B 0.853 (±0.077). The mean estimated positive predictive value was 0.904 (±0.075) and 0.886 (±0.069) for subsets A and B, respectively. For both subsets, the distribution of the Dice coefficients had a high peak around 0.9, with the vast majority of segmentations having values larger than 0.8. PMID:23922958

  10. Automatic nuclei segmentation in H&E stained breast cancer histopathology images.

    PubMed

    Veta, Mitko; van Diest, Paul J; Kornegoor, Robert; Huisman, André; Viergever, Max A; Pluim, Josien P W

    2013-01-01

    The introduction of fast digital slide scanners that provide whole slide images has led to a revival of interest in image analysis applications in pathology. Segmentation of cells and nuclei is an important first step towards automatic analysis of digitized microscopy images. We therefore developed an automated nuclei segmentation method that works with hematoxylin and eosin (H&E) stained breast cancer histopathology images, which represent regions of whole digital slides. The procedure can be divided into four main steps: 1) pre-processing with color unmixing and morphological operators, 2) marker-controlled watershed segmentation at multiple scales and with different markers, 3) post-processing for rejection of false regions and 4) merging of the results from multiple scales. The procedure was developed on a set of 21 breast cancer cases (subset A) and tested on a separate validation set of 18 cases (subset B). The evaluation was done in terms of both detection accuracy (sensitivity and positive predictive value) and segmentation accuracy (Dice coefficient). The mean estimated sensitivity for subset A was 0.875 (±0.092) and for subset B 0.853 (±0.077). The mean estimated positive predictive value was 0.904 (±0.075) and 0.886 (±0.069) for subsets A and B, respectively. For both subsets, the distribution of the Dice coefficients had a high peak around 0.9, with the vast majority of segmentations having values larger than 0.8.

  11. Predicting non-square 2D dice probabilities

    NASA Astrophysics Data System (ADS)

    Pender, G. A. T.; Uhrin, M.

    2014-07-01

    The prediction of the final state probabilities of a general cuboid randomly thrown onto a surface is a problem that naturally arises in the minds of men and women familiar with regular cubic dice and the basic concepts of probability. Indeed, it was considered by Newton in 1664 (Newton 1967 The Mathematical Papers of Isaac Newton vol I (Cambridge: Cambridge University Press) pp 60-1). In this paper we make progress on the 2D problem (which can be realized in 3D by considering a long cuboid, or alternatively a rectangular cross-sectioned dreidel). For the two-dimensional case we suggest that the ratio of the probabilities of landing on each of the two sides is given by $\frac{\sqrt{k^2+l^2}-k}{\sqrt{k^2+l^2}-l}\,\frac{\arctan(l/k)}{\arctan(k/l)}$, where k and l are the lengths of the two sides. We test this theory both experimentally and computationally, and find good agreement between our theory, experimental and computational results. Our theory is known, from its derivation, to be an approximation for particularly bouncy or ‘grippy’ surfaces where the die rolls through many revolutions before settling. On real surfaces we would expect (and we observe) that the true probability ratio for a 2D die is somewhat closer to unity than predicted by our theory. This problem may also have wider relevance in the testing of physics engines.
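
    The suggested probability ratio is straightforward to evaluate numerically; the sketch below computes it for an illustrative 2:1 cross-section (side lengths chosen arbitrarily).

```python
import math

def side_probability_ratio(k: float, l: float) -> float:
    """Predicted ratio of landing probabilities for a 2D die with side lengths k and l."""
    h = math.hypot(k, l)  # length of the diagonal, sqrt(k^2 + l^2)
    return (h - k) / (h - l) * math.atan(l / k) / math.atan(k / l)

# Illustrative 2:1 rectangular cross-section (arbitrary values).
print(side_probability_ratio(2.0, 1.0))
```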

  12. Random forest classification of large volume structures for visuo-haptic rendering in CT images

    NASA Astrophysics Data System (ADS)

    Mastmeyer, Andre; Fortmeier, Dirk; Handels, Heinz

    2016-03-01

    For patient-specific voxel-based visuo-haptic rendering of CT scans of the liver area, the fully automatic segmentation of large volume structures such as skin, soft tissue, lungs and intestine (risk structures) is important. Using a machine learning based approach, several existing segmentations from 10 segmented gold-standard patients are learned by random decision forests individually and collectively. The core of this paper is feature selection and the application of the learned classifiers to a new patient data set. In a leave-some-out cross-validation, the obtained full volume segmentations are compared to the gold-standard segmentations of the untrained patients. The proposed classifiers use a multi-dimensional feature space to estimate the hidden truth, instead of relying on clinical standard threshold- and connectivity-based methods. The result of our efficient whole-body section classification is a set of multi-label maps of the considered tissues. For visuo-haptic simulation, other small volume structures would have to be segmented additionally. We also take a look at these structures (liver vessels). In an experimental leave-some-out study consisting of 10 patients, the proposed method performs much more efficiently compared to state-of-the-art methods. In two variants of leave-some-out experiments we obtain best mean DICE ratios of 0.79, 0.97, 0.63 and 0.83 for skin, soft tissue, hard bone and risk structures. Liver structures are segmented with DICE 0.93 for the liver, 0.43 for blood vessels and 0.39 for bile vessels.
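
    As a loose illustration of the voxel-classification idea described above (and not the authors' actual pipeline), the sketch below trains a random decision forest on synthetic per-voxel feature vectors with scikit-learn; every feature and label here is a placeholder.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Placeholder per-voxel feature vectors (e.g. intensity, smoothed intensity,
# gradient magnitude, position) and tissue labels for 10,000 training voxels.
X_train = rng.normal(size=(10_000, 5))
y_train = rng.integers(0, 4, size=10_000)   # 4 tissue classes, randomly assigned here

# Train a random decision forest and label unseen voxels of a new patient.
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)

X_new = rng.normal(size=(2_000, 5))
label_map = forest.predict(X_new)   # one tissue label per voxel
print(label_map[:10])
```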

  13. Automatic Organ Segmentation for CT Scans Based on Super-Pixel and Convolutional Neural Networks.

    PubMed

    Liu, Xiaoming; Guo, Shuxu; Yang, Bingtao; Ma, Shuzhi; Zhang, Huimao; Li, Jing; Sun, Changjian; Jin, Lanyi; Li, Xueyan; Yang, Qi; Fu, Yu

    2018-04-20

    Accurate segmentation of specific organs from computed tomography (CT) scans is a basic and crucial task for accurate diagnosis and treatment. To avoid time-consuming manual optimization and to help physicians distinguish diseases, an automatic organ segmentation framework is presented. The framework utilizes convolutional neural networks (CNN) to classify pixels. To reduce the redundant inputs, the simple linear iterative clustering (SLIC) of super-pixels and the support vector machine (SVM) classifier are introduced. To establish the exact boundary of organs at the one-pixel level, the pixels need to be classified step-by-step. First, the SLIC is used to cut an image into grids and extract respective digital signatures. Next, the signatures are classified by the SVM, and the rough edges are acquired. Finally, a precise boundary is obtained by the CNN, which is based on patches around each pixel-point. The framework is applied to abdominal CT scans of livers and high-resolution computed tomography (HRCT) scans of lungs. The experimental CT scans are derived from two public datasets (Sliver 07 and a Chinese local dataset). Experimental results show that the proposed method can precisely and efficiently detect the organs. This method consumes 38 s/slice for liver segmentation. The Dice coefficient of the liver segmentation results reaches 97.43%. For lung segmentation, the Dice coefficient is 97.93%. These findings demonstrate that the proposed framework is a favorable method for lung segmentation of HRCT scans.

  14. A high resolution and large solid angle x-ray Raman spectroscopy end-station at the Stanford Synchrotron Radiation Lightsource

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sokaras, D.; Nordlund, D.; Weng, T.-C.

    2012-04-15

    We present a new x-ray Raman spectroscopy end-station recently developed, installed, and operated at the Stanford Synchrotron Radiation Lightsource. The end-station is located at wiggler beamline 6-2, which is equipped with two monochromators, Si(111) and Si(311), as well as collimating and focusing optics. It consists of two multi-crystal Johann type spectrometers arranged on intersecting Rowland circles of 1 m diameter. The first one, positioned at the forward scattering angles (low-q), consists of 40 spherically bent and diced Si(110) crystals with 100 mm diameters, providing about 1.9% of 4π sr solid angle of detection. When operated in the (440) order in combination with the Si(311) monochromator, an overall energy resolution of 270 meV is obtained at 6462.20 eV. The second spectrometer, consisting of 14 spherically bent Si(110) crystal analyzers (not diced), is positioned at the backward scattering angles (high-q), enabling the study of non-dipole transitions. The solid angle of this spectrometer is about 0.9% of 4π sr, with a combined energy resolution of 600 meV using the Si(311) monochromator. These features exceed the specifications of currently existing relevant instrumentation, opening new opportunities for the routine application of this photon-in/photon-out hard x-ray technique to emerging research in multidisciplinary scientific fields, such as energy-related sciences, material sciences, physical chemistry, etc.

  15. Interfacial characterization of Al-Al thermocompression bonds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malik, N., E-mail: nishantmalik1987@gmail.com; SINTEF ICT, Department of Microsystems and Nanotechnology, P.O. Box 124 Blindern, N-0314 Oslo; Carvalho, P. A.

    2016-05-28

    Interfaces formed by Al-Al thermocompression bonding were studied by transmission electron microscopy. Si wafer pairs having patterned bonding frames were bonded using Al films deposited on Si or SiO2 as the intermediate bonding media. A bond force of 36 or 60 kN at bonding temperatures ranging from 400 to 550 °C was applied for a duration of 60 min. Differences in the bonded interfaces of 200 μm wide sealing frames were investigated. It was observed that the interface had voids for bonding with 36 kN at 400 °C for Al deposited both on Si and on SiO2. However, the dicing yield was 33% for Al on Si and 98% for Al on SiO2, attesting to the higher quality of the latter bonds. Both a bond force of 60 kN applied at 400 °C and a bond force of 36 kN applied at 550 °C resulted in completely bonded frames with dicing yields of, respectively, 100% and 96%. A high density of long dislocations in the Al grains was observed for the 60 kN case, while the higher temperature resulted in grain boundary rotation away from the original Al-Al interface towards more stable configurations. Possible bonding mechanisms and reasons for the large difference in bonding quality of the Al films deposited on Si or SiO2 are discussed.

  16. Microscopic validation of whole mouse micro-metastatic tumor imaging agents using cryo-imaging and sliding organ image registration

    NASA Astrophysics Data System (ADS)

    Liu, Yiqiao; Zhou, Bo; Qutaish, Mohammed; Wilson, David L.

    2016-03-01

    We created a metastasis imaging and analysis platform consisting of software and a multi-spectral cryo-imaging system suitable for evaluating emerging imaging agents targeting micro-metastatic tumors. We analyzed CREKA-Gd in MRI, followed by cryo-imaging which repeatedly sectioned and tiled microscope images of the tissue block face, providing anatomical bright field and molecular fluorescence, enabling 3D microscopic imaging of the entire mouse with single metastatic cell sensitivity. To register MRI volumes to the cryo bright field reference, we used our standard mutual information, non-rigid registration which proceeded: preprocess --> affine --> B-spline non-rigid 3D registration. In this report, we created two modified approaches: mask, where we registered locally over a smaller rectangular solid, and sliding organ. Briefly, in sliding organ, we segmented the organ, registered the organ and body volumes separately and combined the results. Though sliding organ required manual annotation, it provided the best result as a standard to measure other registration methods. Regularization parameters for the standard and mask methods were optimized in a grid search. Evaluations consisted of DICE and visual scoring of a checkerboard display. Standard had accuracy of 2 voxels in all regions except near the kidney, where there were 5 voxels of sliding. After mask and sliding organ correction, kidney sliding was within 2 voxels, and Dice overlap increased 4%-10% in mask compared to standard. Mask generated results comparable with sliding organ and allowed a semi-automatic process.

  17. Automated 3D renal segmentation based on image partitioning

    NASA Astrophysics Data System (ADS)

    Yeghiazaryan, Varduhi; Voiculescu, Irina D.

    2016-03-01

    Despite several decades of research into segmentation techniques, automated medical image segmentation is barely usable in a clinical context, and still comes at vast user time expense. This paper illustrates unsupervised organ segmentation through the use of a novel automated labelling approximation algorithm followed by a hypersurface front propagation method. The approximation stage relies on a pre-computed image partition forest obtained directly from CT scan data. We have implemented all procedures to operate directly on 3D volumes, rather than slice-by-slice, because our algorithms are dimensionality-independent. The resulting segmentations identify kidneys, but the approach can easily be extrapolated to other body parts. Quantitative analysis of our automated segmentation compared against hand-segmented gold standards indicates an average Dice similarity coefficient of 90%. Results were obtained over volumes of CT data with 9 kidneys, computing both volume-based similarity measures (such as the Dice and Jaccard coefficients, true positive volume fraction) and size-based measures (such as the relative volume difference). The analysis considered both healthy and diseased kidneys, although extreme pathological cases were excluded from the overall count. Such cases are difficult to segment both manually and automatically due to the large amplitude of Hounsfield unit distribution in the scan, and the wide spread of the tumorous tissue inside the abdomen. In the case of kidneys that have maintained their shape, the similarity range lies around the values obtained for inter-operator variability. Whilst the procedure is fully automated, our tools also provide a light level of manual editing.

  18. Counterfactual Volcano Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2013-04-01

    The historical database of past disasters is a cornerstone of catastrophe risk assessment. Whereas disasters are fortunately comparatively rare, near-misses are quite common for both natural and man-made hazards. The word disaster originally means 'an unfavourable aspect of a star'. Except for astrologists, disasters are no longer perceived fatalistically as pre-determined. Nevertheless, to this day, historical disasters are treated statistically as fixed events, although in reality there is a large luck element involved in converting a near-miss crisis situation into a disaster statistic. It is possible to conceive a stochastic simulation of the past to explore the implications of this chance factor. Counterfactual history is the exercise of hypothesizing alternative paths of history from what actually happened. Exploring history from a counterfactual perspective is instructive for a variety of reasons. First, it is easy to be fooled by randomness and see regularity in event patterns which are illusory. The past is just one realization of a variety of possible evolutions of history, which may be analyzed through a stochastic simulation of an array of counterfactual scenarios. In any hazard context, there is a random component equivalent to dice being rolled to decide whether a near-miss becomes an actual disaster. The fact that there may be no observed disaster over a period of time may belie the occurrence of numerous near-misses. This may be illustrated using the simple dice paradigm. Suppose a dice is rolled every month for a year, and an event is recorded if a six is thrown. There is still an 11% chance of no events occurring during the year. A variety of perils may be used to illustrate the use of near-miss information within a counterfactual disaster analysis. In the domain of natural hazards, near-misses are a notable feature of the threat landscape. Storm surges are an obvious example. Sea defences may protect against most meteorological scenarios. However, if a major storm surge happens to arrive at a high astronomical tide, sea walls may be overtopped and flooding may ensue. In the domain of geological hazards, periods of volcanic unrest may generate precursory signals suggestive of imminent volcanic danger, but without leading to an actual eruption. Near-miss unrest periods provide vital evidence for assessing the dynamics of volcanoes close to eruption. Where the volcano catalogue has been diligently revised to include the maximum amount of information on the phenomenology of unrest periods, dynamic modelling and hazard assessment may be significantly refined. This is illustrated with some topical volcano hazard examples, including Montserrat and Santorini.
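
    The 11% figure quoted in the dice paradigm above follows directly from raising 5/6 to the 12th power; the snippet below simply checks that arithmetic.

```python
# Probability of observing no "disaster" (no six) in 12 monthly rolls of a fair die.
p_no_event = (5 / 6) ** 12
print(round(p_no_event, 3))  # ~0.112, i.e. about an 11% chance of a quiet year
```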

  19. An experiment on radioactive equilibrium and its modelling using the ‘radioactive dice’ approach

    NASA Astrophysics Data System (ADS)

    Santostasi, Davide; Malgieri, Massimiliano; Montagna, Paolo; Vitulo, Paolo

    2017-07-01

    In this article we describe an educational activity on radioactive equilibrium we performed with secondary school students (17-18 years old) in the context of a vocational guidance internship for talented students at the Department of Physics of the University of Pavia. Radioactive equilibrium is investigated experimentally by having students measure the activity of 214Bi from two different samples, obtained using different preparation procedures from a uraniferous rock. Students are guided in understanding the mathematical structure of radioactive equilibrium through a modelling activity in two parts. Before the lab measurements, a dice game, which extends the traditional ‘radioactive dice’ activity to the case of a chain of two decaying nuclides, is performed by students divided into small groups. At the end of the laboratory work, students design and run a simple spreadsheet simulation modelling the same basic radioactive chain with user-defined decay constants. By setting the constants to realistic values corresponding to nuclides of the uranium decay chain, students can deepen their understanding of the meaning of the experimental data, and also explore the difference between the cases of non-equilibrium, transient equilibrium and secular equilibrium.
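
    A minimal sketch of the kind of two-nuclide 'radioactive dice' chain the activity models is given below; the per-round decay probabilities stand in for decay constants, and the particular numbers, seed, and function name are illustrative choices of ours rather than the article's actual game or spreadsheet.

      import random

      def dice_chain(n_parent=600, p1=1/50, p2=1/6, rounds=60, seed=0):
          """Stochastic 'radioactive dice' simulation of a two-member chain,
          parent -> daughter -> stable: each round every parent nucleus decays
          with probability p1 and every daughter nucleus with probability p2.
          Returns the (parent, daughter) populations after each round."""
          random.seed(seed)
          parent, daughter = n_parent, 0
          history = [(parent, daughter)]
          for _ in range(rounds):
              parent_decays = sum(random.random() < p1 for _ in range(parent))
              daughter_decays = sum(random.random() < p2 for _ in range(daughter))
              parent -= parent_decays
              daughter += parent_decays - daughter_decays
              history.append((parent, daughter))
          return history

      # With p2 much larger than p1 the daughter activity approaches the parent
      # activity (secular equilibrium); comparable p1 and p2 give transient
      # equilibrium, and p1 > p2 gives no equilibrium at all.
      for step, (p, d) in enumerate(dice_chain()):
          if step % 10 == 0:
              print(step, p, d)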

  20. An Application of the Direct Coulomb Electron Pair Production Process to the Energy Measurement of the "VH-Group" in the "Knee" Region of the "All-Particle" Energy Spectrum

    NASA Technical Reports Server (NTRS)

    Derrickson, J. H.; Wu, J.; Christl, M. J.; Fountain, W. F.; Parnell, T. A.

    1999-01-01

    The "all-particle" cosmic ray energy spectrum appears to be exhibiting a significant change in the spectral index just above approximately 3000 TeV. This could indicate (1) a change in the propagation of the cosmic rays in the galactic medium, and/or (2) the upper limit of the supernova shock wave acceleration mechanism, and/or (3) a new source of high-energy cosmic rays. Air shower and JACEE data indicate the spectral change is associated with a composition change to a heavier element mixture whereas DICE does not indicate this. A detector concept will be presented that utilizes the energy dependence of the production of direct Coulomb electron-positron pairs by energetic heavy ions. Monte Carlo simulations of a direct electron pair detector consisting of Pb target foils interleaved with planes of 1-mm square scintillating optical fibers will be discussed. The goal is to design a large area, non-saturating instrument to measure the energy spectrum of the individual cosmic ray elements in the "VH-group" for energies greater than 10 TeV/nucleon.

  1. Design, fabrication and skin-electrode contact analysis of polymer microneedle-based ECG electrodes

    NASA Astrophysics Data System (ADS)

    O'Mahony, Conor; Grygoryev, Konstantin; Ciarlone, Antonio; Giannoni, Giuseppe; Kenthao, Anan; Galvin, Paul

    2016-08-01

    Microneedle-based ‘dry’ electrodes have immense potential for use in diagnostic procedures such as electrocardiography (ECG) analysis, as they eliminate several of the drawbacks associated with the conventional ‘wet’ electrodes currently used for physiological signal recording. To be commercially successful in such a competitive market, it is essential that dry electrodes are manufacturable in high volumes and at low cost. In addition, the topographical nature of these emerging devices means that electrode performance is likely to be highly dependent on the quality of the skin-electrode contact. This paper presents a low-cost, wafer-level micromoulding technology for the fabrication of polymeric ECG electrodes that use microneedle structures to make a direct electrical contact to the body. The double-sided moulding process can be used to eliminate post-process via creation and wafer dicing steps. In addition, measurement techniques have been developed to characterize the skin-electrode contact force. We perform the first analysis of signal-to-noise ratio dependency on contact force, and show that although microneedle-based electrodes can outperform conventional gel electrodes, the quality of ECG recordings is significantly dependent on temporal and mechanical aspects of the skin-electrode interface.

  2. The last straw!: a tool for participatory education about the social determinants of health.

    PubMed

    Rossiter, Kate; Reeve, Kate

    2008-01-01

    In response to a scarcity of teaching tools regarding the social determinants of health (SDOH), Kate Reeve and Kate Rossiter created The Last Straw! board game, an innovative participatory education tool to facilitate and engage critical thinking about the SDOH. The Last Straw! is designed to encourage discussion about the SDOH, promote critical thinking, and build empathy with marginalized people. The game begins as each player rolls the dice to create a character profile, including socioeconomic status (SES), race, and gender. Based on this profile, players then receive a certain number of "vitality chips." Moving across the board, players encounter scenarios that cause them to gain and lose chips based on their profile. The player who finishes the game with the most chips wins the game. The game can be facilitated for a variety of audiences, including both players with no prior knowledge of the SDOH and those experienced in the field. The game has been played with students, policymakers, and community workers, among others, and has been met with immense enthusiasm. Here, we detail the game's reception within the community, including benefits, limitations, and next steps.

  3. Design Fixation in the Wild: Design Environments and Their Influence on Fixation

    ERIC Educational Resources Information Center

    Youmans, Robert J.

    2011-01-01

    Many studies of design fixation ask designers to work in controlled laboratory or classroom environments, but innovative design work frequently occurs in dynamic, social environments. The two studies reviewed in this paper investigated how three independent variables likely to be present in many design environments affect design fixation. The…

  4. The Automated DC Parameter Testing of GaAs MESFETs Using the Singer Automatic Integrated Circuit Test System.

    DTIC Science & Technology

    1980-09-01

    Timing Diagram Showing Relationship of Control Signals to Phase Clocks ... Sample MESFET Used to Obtain Error Factors ... LIST OF TABLES ... each chip tested ... within the fixture. This means that each chip to be tested must be diced from the wafer. ... testing of GaAs MESFETs. It would therefore be necessary to add a storage buffer between the tri-state buffer and the measuring instrument

  5. Leading While Blindfolded: Examining the Defense Business Board’s Recommendations to Reform the Military Retirement System

    DTIC Science & Technology

    2012-06-04

    data from the DOD Office of the Actuary , it is possible to calculate the approximate savings from this option. For purposes of the example in table 4...savings 21% Source: U.S. Department of Defense, Office of the Actuary . This option rated the highest ratio of negative to positive reactions amongst all...Chaos: Making a New Science, (New York: Penguin Books, 1987), 20. 2 Ian Stewart, Does God Play Dice? The New Mathematics of Chaos, 2nd ed. (Malden

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jungho; Shi, Xianbo; Casa, Diego

    Advances in resonant inelastic X-ray scattering (RIXS) have come in lockstep with improvements in energy resolution. Currently, the best energy resolution at the Ir L3-edge stands at ~25 meV, which is achieved using a diced Si(844) spherical crystal analyzer. However, spherical analyzers are limited by their intrinsic reflection width. A novel analyzer system using multiple flat crystals provides a promising way to overcome this limitation. For the present design, an energy resolution at or below 10 meV was selected. Recognizing that the angular acceptance of flat crystals is severely limited, a collimating element is essential to achieve the necessary solid-angle acceptance. For this purpose, a laterally graded, parabolic, multilayer Montel mirror was designed for use at the Ir L3-absorption edge. It provides an acceptance larger than 10 mrad, collimating the reflected X-ray beam to smaller than 100 µrad, in both vertical and horizontal directions. The performance of this mirror was studied at beamline 27-ID at the Advanced Photon Source. X-rays from a diamond (111) monochromator illuminated a scattering source of diameter 5 µm, generating an incident beam on the mirror with a well-determined divergence of 40 mrad. A flat Si(111) crystal after the mirror served as the divergence analyzer. From X-ray measurements, ray-tracing simulations and optical metrology results, it was established that the Montel mirror satisfied the specifications of angular acceptance and collimation quality necessary for a high-resolution RIXS multi-crystal analyzer system.

  7. Design and Fabrication of the Second-Generation KID-Based Light Detectors of CALDER

    NASA Astrophysics Data System (ADS)

    Colantoni, I.; Cardani, L.; Casali, N.; Cruciani, A.; Bellini, F.; Castellano, M. G.; Cosmelli, C.; D'Addabbo, A.; Di Domizio, S.; Martinez, M.; Tomei, C.; Vignati, M.

    2018-04-01

    The goal of the Cryogenic Wide-Area Light Detectors with Excellent Resolution (CALDER) project is the development of light detectors with large active area and noise energy resolution smaller than 20 eV RMS using phonon-mediated kinetic inductance detectors (KIDs). The detectors are developed to improve the background suppression in large-mass bolometric experiments such as CUORE, via the double readout of the light and the heat released by particles interacting in the bolometers. In this work we present the fabrication process, starting from the silicon wafer and arriving at the single chip. In the first part of the project, we designed and fabricated KID detectors using aluminum. Detectors are designed by means of state-of-the-art software for electromagnetic analysis (SONNET). The Al thin films (40 nm) are evaporated on high-quality, high-resistivity (> 10 kΩ cm) Si(100) substrates using an electron beam evaporator in an HV chamber. Detectors are patterned in direct-write mode, using electron beam lithography (EBL), a positive-tone poly(methyl methacrylate) resist, and a lift-off process. Finally, the wafer is diced into 20 × 20 mm2 chips, and each chip is assembled in an OFHC (oxygen-free high-conductivity) copper holder using PTFE supports. To increase the energy resolution of our detectors, we are changing the superconductor to sub-stoichiometric TiN (TiNx), whose deposition by DC reactive magnetron sputtering we are currently optimizing. For this kind of material, the fabrication process is subtractive and consists of EBL patterning through the negative-tone resist AR-N 7700 and deep reactive ion etching. The critical temperature of TiNx samples was measured in a dedicated cryostat.

  8. Teaching basic life support: a prospective randomized study on low-cost training strategies in secondary schools.

    PubMed

    Van Raemdonck, Veerle; Monsieurs, Koenraad G; Aerenhouts, Dirk; De Martelaer, Kristine

    2014-08-01

    Cardiopulmonary resuscitation (CPR) training at school is recommended, but limited school resources prevent implementation, and the learning efficacy of low-cost training strategies is unknown. The objective was to evaluate the efficacy of different CPR learning strategies using low-cost didactic tools. Children (n=593, 15-16 years) were randomized to four training conditions: (1) manikin+teacher instruction (control group), (2) manikin+video instruction, (3) foam dice+plastic bag+peer training+teacher instruction, and (4) foam dice+plastic bag+peer training+video instruction. After a 50 min training session, a 3 min CPR test on a manikin was performed using SkillReporting Software (Laerdal, Norway), and repeated after 6 months. The data of children without previous CPR training were analysed. Analysis of variance and the χ²-test assessed differences between groups. Complete data sets were available for 165 pupils. Initially, group 3 scored lower on the mean ventilation volume (P<0.05). The control group scored better than the alternative groups (P<0.05) on the mean compression rate. After 6 months, the differences disappeared. All groups scored equally on ventilation volume (P=0.12), compression depth (P=0.11), compression rate (P=0.10), correct hand position (P=0.46) and number of correct compressions (P=0.76). Ventilation volume was sufficient in 32% of the pupils, 18% had a correct compression depth and 59% had a correct compression rate. Training efficacy with low-cost equipment was not different from training with a manikin. The outcome for all training strategies was suboptimal. The basics of CPR can be taught with alternative equipment if manikins are not available.

  9. Fully convolutional networks (FCNs)-based segmentation method for colorectal tumors on T2-weighted magnetic resonance images.

    PubMed

    Jian, Junming; Xiong, Fei; Xia, Wei; Zhang, Rui; Gu, Jinhui; Wu, Xiaodong; Meng, Xiaochun; Gao, Xin

    2018-06-01

    Segmentation of colorectal tumors is the basis of preoperative prediction, staging, and therapeutic response evaluation. Due to the blurred boundary between lesions and normal colorectal tissue, it is hard to achieve accurate segmentation. Routine manual or semi-manual segmentation methods are extremely tedious, time-consuming, and highly operator-dependent. A segmentation method for colorectal tumors was therefore developed in the framework of FCNs. Normalization was applied to reduce the differences among images. Borrowing from transfer learning, VGG-16 was employed to extract features from the normalized images. We constructed five side-output blocks from the last convolutional layer of each block of VGG-16 along the network; these side-output blocks capture multiscale features and produce corresponding predictions. Finally, all of the predictions from the side-output blocks were fused to determine the final boundaries of the tumors. A quantitative comparison against 2772 manual segmentations of colorectal tumors on T2-weighted magnetic resonance images shows that the average Dice similarity coefficient, positive predictive value, specificity, sensitivity, Hammoude distance, and Hausdorff distance were 83.56%, 82.67%, 96.75%, 87.85%, 0.2694, and 8.20, respectively. The proposed method is superior to U-net in colorectal tumor segmentation (P < 0.05). There is no difference between cross-entropy loss and Dice-based loss in colorectal tumor segmentation (P > 0.05). The results indicate that the introduction of FCNs contributed to accurate segmentation of colorectal tumors. This method has the potential to replace the present time-consuming and nonreproducible manual segmentation method.

  10. A Multi-center Milestone Study of Clinical Vertebral CT Segmentation

    PubMed Central

    Yao, Jianhua; Burns, Joseph E.; Forsberg, Daniel; Seitel, Alexander; Rasoulian, Abtin; Abolmaesumi, Purang; Hammernik, Kerstin; Urschler, Martin; Ibragimov, Bulat; Korez, Robert; Vrtovec, Tomaž; Castro-Mateos, Isaac; Pozo, Jose M.; Frangi, Alejandro F.; Summers, Ronald M.; Li, Shuo

    2017-01-01

    A multi-center milestone study of clinical vertebra segmentation is presented in this paper. Vertebra segmentation is a fundamental step for spinal image analysis and intervention. The first half of the study was conducted as the spine segmentation challenge at the 2014 International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI) Workshop on Computational Spine Imaging (CSI 2014). The objective was to evaluate the performance of several state-of-the-art vertebra segmentation algorithms on computed tomography (CT) scans using ten training and five testing datasets, all healthy cases; the second half of the study was conducted after the challenge, where an additional five abnormal cases were used for testing to evaluate performance on abnormal cases. Dice coefficients and absolute surface distances were used as evaluation metrics. Segmentation of each vertebra as a single geometric unit, as well as separate segmentation of vertebra substructures, was evaluated. Five teams participated in the comparative study. The top performers in the study achieved Dice coefficients of 0.93 in the upper thoracic, 0.95 in the lower thoracic and 0.96 in the lumbar spine for healthy cases, and 0.88 in the upper thoracic, 0.89 in the lower thoracic and 0.92 in the lumbar spine for osteoporotic and fractured cases. The strengths and weaknesses of each method, as well as suggestions for future improvement, are discussed. This is the first multi-center comparative study of vertebra segmentation methods, and it provides an up-to-date performance milestone for the fast-growing field of spinal image analysis and intervention. PMID:26878138

  11. Automated cerebral infarct volume measurement in follow-up noncontrast CT scans of patients with acute ischemic stroke.

    PubMed

    Boers, A M; Marquering, H A; Jochem, J J; Besselink, N J; Berkhemer, O A; van der Lugt, A; Beenen, L F; Majoie, C B

    2013-08-01

    Cerebral infarct volume (CIV) as observed in follow-up CT is an important radiologic outcome measure of the effectiveness of treatment of patients with acute ischemic stroke. However, manual measurement of CIV is time-consuming and operator-dependent. The purpose of this study was to develop and evaluate a robust automated measurement of the CIV. The CIV in early follow-up CT images of 34 consecutive patients with acute ischemic stroke was segmented with an automated intensity-based region-growing algorithm, which includes partial volume effect correction near the skull, midline determination, and ventricle and hemorrhage exclusion. Two observers manually delineated the CIV. Interobserver variability of the manual assessments and the accuracy of the automated method were evaluated by using the Pearson correlation, Bland-Altman analysis, and Dice coefficients. The accuracy was defined as the correlation with the manual assessment as a reference standard. The Pearson correlation for the automated method compared with the reference standard was similar to the manual correlation (R = 0.98). The accuracy of the automated method was excellent, with a mean difference of 0.5 mL and limits of agreement of −38.0 to 39.1 mL, which were more consistent than the interobserver variability of the 2 observers (−40.9 to 44.1 mL). However, the Dice coefficients were higher for the manual delineation. The automated method showed a strong correlation and good accuracy with respect to the manual reference measurement. This approach has the potential to become the standard in assessing the infarct volume as a secondary outcome measure for evaluating the effectiveness of treatment.
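
    For illustration only (a bare-bones sketch, not the authors' algorithm, which adds partial volume effect correction, midline determination, and ventricle and hemorrhage exclusion), seeded intensity-based region growing of the general kind described can be written as follows; the function and argument names are ours.

      import numpy as np
      from collections import deque

      def region_grow(volume, seed, low, high):
          """Minimal seeded region growing on a 3D array: starting from the
          seed voxel, include 6-connected voxels whose intensity lies in the
          window [low, high]."""
          vol = np.asarray(volume)
          grown = np.zeros(vol.shape, dtype=bool)
          grown[seed] = True
          queue = deque([seed])
          offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                     (0, -1, 0), (0, 0, 1), (0, 0, -1)]
          while queue:
              z, y, x = queue.popleft()
              for dz, dy, dx in offsets:
                  nz, ny, nx = z + dz, y + dy, x + dx
                  if (0 <= nz < vol.shape[0] and 0 <= ny < vol.shape[1]
                          and 0 <= nx < vol.shape[2]
                          and not grown[nz, ny, nx]
                          and low <= vol[nz, ny, nx] <= high):
                      grown[nz, ny, nx] = True
                      queue.append((nz, ny, nx))
          return grown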

  12. Molecular analysis of bacterial microbiota associated with oysters (Crassostrea gigas and Crassostrea corteziensis) in different growth phases at two cultivation sites.

    PubMed

    Trabal, Natalia; Mazón-Suástegui, José M; Vázquez-Juárez, Ricardo; Asencio-Valle, Felipe; Morales-Bojórquez, Enrique; Romero, Jaime

    2012-08-01

    Microbiota presumably plays an essential role in inhibiting pathogen colonization and in the maintenance of health in oysters, but limited data exist concerning their different growth phases and conditions. We analyzed the bacterial microbiota composition of two commercial oysters: Crassostrea gigas and Crassostrea corteziensis. Differences in microbiota were assayed in three growth phases: post-larvae at the hatchery, juvenile, and adult at two grow-out cultivation sites. Variations in the microbiota were assessed by PCR analysis of the 16S rRNA gene in DNA extracted from depurated oysters. Restriction fragment length polymorphism (RFLP) profiles were studied using Dice's similarity coefficient (Cs) and statistical principal component analysis (PCA). The microbiota composition was determined by sequencing temperature gradient gel electrophoresis (TGGE) bands. The RFLP analysis of post-larvae revealed homology in the microbiota of both oyster species (Cs > 88 %). Dice and PCA analyses of C. corteziensis but not C. gigas showed differences in the microbiota according to the cultivation sites. The sequencing analysis revealed low bacterial diversity (primarily β-Proteobacteria, Firmicutes, and Spirochaetes), with Burkholderia cepacia being the most abundant bacteria in both oyster species. This study provides the first description of the microbiota in C. corteziensis, which was shown to be influenced by cultivation site conditions. During early growth, we observed that B. cepacia colonized and remained strongly associated with the two oysters, probably in a symbiotic host-bacteria relationship. This association was maintained in the three growth phases and was not altered by environmental conditions or the management of the oysters at the grow-out site.

  13. The influence of the image registration method on the adaptive radiotherapy. A proof of the principle in a selected case of prostate IMRT.

    PubMed

    Berenguer, Roberto; de la Vara, Victoria; Lopez-Honrubia, Veronica; Nuñez, Ana Teresa; Rivera, Miguel; Villas, Maria Victoria; Sabater, Sebastia

    2018-01-01

    To analyse the influence of the image registration method on the adaptive radiotherapy of an IMRT prostate treatment, and to compare the dose accumulation according to 3 different image registration methods with the planned dose. The IMRT prostate patient was CT imaged 3 times throughout his treatment. The prostate, PTV, rectum and bladder were segmented on each CT. A rigid, a deformable (DIR) B-spline, and a landmark-based DIR registration algorithm were employed. The differences between the accumulated doses and planned doses were evaluated by the gamma index. The Dice coefficient and Hausdorff distance were used to evaluate the overlap between volumes and to quantify the quality of the registration. When comparing adaptive vs no adaptive RT, the gamma index calculation showed large differences depending on the image registration method (as much as 87.6% in the case of DIR B-spline). The quality of the registration was evaluated using an index such as the Dice coefficient. This showed that the best result was obtained with the landmark-based DIR, and the coefficient was always above 0.77, reported as a recommended minimum value for prostate studies in a multi-centre review. Apart from showing the importance of applying an adaptive RT protocol in a particular treatment, this work shows that the choice of registration method is decisive for the result of adaptive radiotherapy and dose accumulation. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
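
    As a generic illustration of the contour metrics used here (not the study's own code), the symmetric Hausdorff distance between two binary masks can be computed from their voxel coordinates with SciPy, as sketched below; the voxel spacing argument is an assumption for converting to physical units, and a Dice coefficient can be computed analogously from the mask overlap.

      import numpy as np
      from scipy.spatial.distance import directed_hausdorff

      def hausdorff_distance(mask_a, mask_b, spacing=(1.0, 1.0, 1.0)):
          """Symmetric Hausdorff distance between the voxel sets of two binary
          masks, expressed in physical units via the voxel spacing."""
          pts_a = np.argwhere(mask_a) * np.asarray(spacing)
          pts_b = np.argwhere(mask_b) * np.asarray(spacing)
          return max(directed_hausdorff(pts_a, pts_b)[0],
                     directed_hausdorff(pts_b, pts_a)[0])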

  14. Automatic 3D segmentation of spinal cord MRI using propagated deformable models

    NASA Astrophysics Data System (ADS)

    De Leener, B.; Cohen-Adad, J.; Kadoury, S.

    2014-03-01

    Spinal cord diseases or injuries can cause dysfunction of the sensory and locomotor systems. Segmentation of the spinal cord provides measures of atrophy and allows group analysis of multi-parametric MRI via inter-subject registration to a template. All these measures were shown to improve diagnostic and surgical intervention. We developed a framework to automatically segment the spinal cord on T2-weighted MR images, based on the propagation of a deformable model. The algorithm is divided into three parts: first, an initialization step detects the spinal cord position and orientation by using the elliptical Hough transform on multiple adjacent axial slices to produce an initial tubular mesh. Second, a low-resolution deformable model is iteratively propagated along the spinal cord. To deal with highly variable contrast levels between the spinal cord and the cerebrospinal fluid, the deformation is coupled with a contrast adaptation at each iteration. Third, a refinement process and a global deformation are applied on the low-resolution mesh to provide an accurate segmentation of the spinal cord. Our method was evaluated against a semi-automatic edge-based snake method implemented in ITK-SNAP (with heavy manual adjustment) by computing the 3D Dice coefficient and the mean and maximum distance errors. Accuracy and robustness were assessed from 8 healthy subjects. Each subject had two volumes: one at the cervical and one at the thoracolumbar region. Results show a precision of 0.30 ± 0.05 mm (mean absolute distance error) in the cervical region and 0.27 ± 0.06 mm in the thoracolumbar region. The 3D Dice coefficient was 0.93 for both regions.

  15. Fully automated contour detection of the ascending aorta in cardiac 2D phase-contrast MRI.

    PubMed

    Codari, Marina; Scarabello, Marco; Secchi, Francesco; Sforza, Chiarella; Baselli, Giuseppe; Sardanelli, Francesco

    2018-04-01

    In this study we proposed a fully automated method for localizing and segmenting the ascending aortic lumen with phase-contrast magnetic resonance imaging (PC-MRI). Twenty-five phase-contrast series were randomly selected out of a large population dataset of patients whose cardiac MRI examination, performed from September 2008 to October 2013, was unremarkable. The local Ethical Committee approved this retrospective study. The ascending aorta was automatically identified on each phase of the cardiac cycle using a priori knowledge of aortic geometry. The frame that maximized the area, eccentricity, and solidity parameters was chosen for unsupervised initialization. Aortic segmentation was performed on each frame using the active contours without edges technique. The entire algorithm was developed using Matlab R2016b. To validate the proposed method, the manual segmentation performed by a highly experienced operator was used. The Dice similarity coefficient, Bland-Altman analysis, and Pearson's correlation coefficient were used as performance metrics. Comparing automated and manual segmentation of the aortic lumen on 714 images, Bland-Altman analysis showed a bias of −6.68 mm², a coefficient of repeatability of 91.22 mm², a mean area measurement of 581.40 mm², and a reproducibility of 85%. Automated and manual segmentation were highly correlated (R=0.98). The Dice similarity coefficient versus the manual reference standard was 94.6 ± 2.1% (mean ± standard deviation). A fully automated and robust method for identification and segmentation of the ascending aorta on PC-MRI was developed. Its application to patients with a variety of pathologic conditions is advisable. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. The Role of the Occupational Therapist in the Management of Neuropsychiatric Symptoms of Dementia in Clinical Settings

    PubMed Central

    Fraker, Joyce; Kales, Helen C.; Blazek, Mary; Kavanagh, Janet; Gitlin, Laura N.

    2014-01-01

    Neuropsychiatric symptoms (NPS) of dementia include aggression, agitation, depression, anxiety, delusions, hallucinations, apathy, and disinhibition. NPS affect dementia patients nearly universally across dementia stages and etiologies. They are associated with poor patient and caregiver outcomes including increased health care utilization, excess morbidity and mortality, and earlier nursing home placement, as well as caregiver stress, depression and reduced employment. There are no FDA-approved medications for NPS, but it is common clinical practice to use psychotropic medications such as antipsychotics to control symptoms; however, antipsychotics show only modest efficacy in improving NPS and have significant risks for patients, including side effects and mortality. Non-pharmacologic treatments are considered first-line by multiple medical bodies and expert consensus, show evidence for efficacy and have limited potential for adverse effects. Ideally, non-pharmacological management of NPS in clinical settings occurs in multidisciplinary teams where occupational therapists (OTs) play an important collaborative role in the care of the person with dementia. Our group has articulated an evidence-informed structured approach to the management of NPS that can be integrated into diverse practice settings and used by providers of various disciplines. The “DICE” (Describe, Investigate, Create, and Evaluate) approach is inherently patient- and caregiver-centered, as patient and caregiver concerns are integral to each step of the process. DICE offers a clinical reasoning approach through which providers can more efficiently and effectively choose optimal treatment plans. The purpose of this paper is to describe the role of the OT in using the DICE approach for NPS management. PMID:24354328

  17. Quantification of esophageal wall thickness in CT using atlas-based segmentation technique

    NASA Astrophysics Data System (ADS)

    Wang, Jiahui; Kang, Min Kyu; Kligerman, Seth; Lu, Wei

    2015-03-01

    Esophageal wall thickness is an important predictor of esophageal cancer response to therapy. In this study, we developed a computerized pipeline for quantification of esophageal wall thickness using computed tomography (CT). We first segmented the esophagus using a multi-atlas-based segmentation scheme. The esophagus in each atlas CT was manually segmented to create a label map. Using image registration, all of the atlases were aligned to the imaging space of the target CT. The deformation field from the registration was applied to the label maps to warp them to the target space. A weighted majority-voting label fusion was employed to create the segmentation of the esophagus. Finally, we excluded the lumen from the esophagus using a threshold of −600 HU and measured the esophageal wall thickness. The developed method was tested on a dataset of 30 CT scans, including 15 esophageal cancer patients and 15 normal controls. The mean Dice similarity coefficient (DSC) and mean absolute distance (MAD) between the segmented esophagus and the reference standard were employed to evaluate the segmentation results. Our method achieved a mean Dice coefficient of 65.55 ± 10.48% and a mean MAD of 1.40 ± 1.31 mm for all the cases. The mean esophageal wall thickness of cancer patients and normal controls was 6.35 ± 1.19 mm and 6.03 ± 0.51 mm, respectively. We conclude that the proposed method can perform quantitative analysis of esophageal wall thickness and would be useful for tumor detection and tumor response evaluation of esophageal cancer.
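
    A minimal sketch of weighted majority-voting label fusion of the kind described is shown below, assuming the atlas label maps have already been warped into the target space; the weighting scheme, names, and threshold are illustrative rather than the authors' exact formulation.

      import numpy as np

      def weighted_majority_vote(warped_labels, weights):
          """Fuse binary (0/1) 3D label maps already warped to the target CT.
          weights: one weight per atlas (e.g. a registration similarity score).
          A voxel is labelled esophagus when the weighted vote exceeds half of
          the total weight."""
          labels = np.stack(warped_labels).astype(float)             # (n, Z, Y, X)
          w = np.asarray(weights, dtype=float)[:, None, None, None]
          score = (labels * w).sum(axis=0) / w.sum()
          return score > 0.5

      # Hypothetical usage with three warped atlas label maps:
      # fused = weighted_majority_vote([lab1, lab2, lab3], [0.9, 0.7, 0.8])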

  18. 3D-Printed Visceral Aneurysm Models Based on CT Data for Simulations of Endovascular Embolization: Evaluation of Size and Shape Accuracy.

    PubMed

    Shibata, Eisuke; Takao, Hidemasa; Amemiya, Shiori; Ohtomo, Kuni

    2017-08-01

    The objective of this study is to verify the accuracy of 3D-printed hollow models of visceral aneurysms created from CT angiography (CTA) data, by evaluating the sizes and shapes of aneurysms and related arteries. From March 2006 to August 2015, 19 true visceral aneurysms were embolized via interventional radiologic treatment provided by the radiology department at our institution; aneurysms with bleeding (n = 3) or without thin-slice (< 1 mm) preembolization CT data (n = 1) were excluded. A total of 15 consecutive true visceral aneurysms from 11 patients (eight women and three men; mean age, 61 years; range, 53-72 years) whose aneurysms were embolized via endovascular procedures were included in this study. Three-dimensional-printed hollow models of aneurysms and related arteries were fabricated from CTA data. The accuracies of the sizes and shapes of the 3D-printed hollow models were evaluated using the nonparametric Wilcoxon signed rank test and the Dice coefficient index. Aneurysm sizes ranged from 138 to 18,691 mm³ (diameter, 6.1-35.7 mm), and no statistically significant difference was noted between patient data and 3D-printed models (p = 0.56). Shape analysis of whole aneurysms and related arteries indicated a high level of accuracy (Dice coefficient index value, 84.2-95.8%; mean [± SD], 91.1 ± 4.1%). The sizes and shapes of 3D-printed hollow visceral aneurysm models created from CTA data were accurate. These models can be used for simulations of endovascular treatment and to provide precise anatomic information.

  19. SRB ascent aerodynamic heating design criteria reduction study, volume 1

    NASA Technical Reports Server (NTRS)

    Crain, W. K.; Frost, C. L.; Engel, C. D.

    1989-01-01

    An independent set of solid rocket booster (SRB) convective ascent design environments was produced to serve as a check on the Rockwell IVBC-3 environments used to design the ascent phase of flight. In addition, support was provided for lowering the design environments such that Thermal Protection System (TPS) material applied on the basis of conservative estimates could be removed, leading to a reduction in SRB refurbishment time and cost. Ascent convective heating rates and loads were generated at locations on the SRB where lowering the thermal environment would impact the TPS design. The ascent thermal environments are documented along with the wind tunnel/flight test database used, as well as the trajectory and environment generation methodology. The methodology, together with environment summaries compared to the 1980 Design and the Rockwell IVBC-3 Design Environments, is presented in this volume (volume 1).

  20. Effects of environmental design on patient outcome: a systematic review.

    PubMed

    Laursen, Jannie; Danielsen, Anne; Rosenberg, Jacob

    2014-01-01

    The aim of this systematic review was to assess how inpatients were affected by the built environment design during their hospitalization. Over the last decade, the healthcare system has become increasingly aware of how a focus on the healthcare environment might affect patient satisfaction. The focus on environmental design has become a field with great potential because of its possible impact on cost control while improving quality of care. A systematic literature search was conducted to identify current and past studies about evidence-based healthcare design. The following databases were searched: Medline/PubMed, Cinahl, and Embase. Inclusion criteria were randomized clinical trials (RCTs) investigating the effect of built environment design interventions such as music, natural murals, and plants in relation to patients' health outcomes. Built environment design aspects such as the audio environment and the visual environment had a positive influence on patients' health outcomes. Specifically, the studies indicated a decrease in patients' anxiety, pain, and stress levels when exposed to certain built environment design interventions. The built environment, especially specific audio and visual aspects, seems to play an important role in patients' outcomes, making hospitals a better healing environment for patients. Keywords: built environment, evidence-based design, healing environments, hospitals, literature review.

  1. Interactive Environment Design in Smart City

    NASA Astrophysics Data System (ADS)

    Deng, DeXiang; Chen, LanSha; Zhou, Xi

    2017-08-01

    Interactive environment design for the smart city is not merely the design of an interactive process or an interactive mode; rather, through interactive design it aims to generate an environment that behaves like an “organic” living entity, much as human beings do, forming a smart environment with perception, memory, thinking, and reaction.

  2. Using Business Intelligence to Analyze and Share Health System Infrastructure Data in a Rural Health Authority

    PubMed Central

    Urquhart, Bonnie; Berg, Emery; Dhanoa, Ramandeep

    2014-01-01

    Background: Health care organizations gather large volumes of data, which has been traditionally stored in legacy formats making it difficult to analyze or use effectively. Though recent government-funded initiatives have improved the situation, the quality of most existing data is poor, suffers from inconsistencies, and lacks integrity. Generating reports from such data is generally not considered feasible due to extensive labor, lack of reliability, and time constraints. Advanced data analytics is one way of extracting useful information from such data. Objective: The intent of this study was to propose how Business Intelligence (BI) techniques can be applied to health system infrastructure data in order to make this information more accessible and comprehensible for a broader group of people. Methods: An integration process was developed to cleanse and integrate data from disparate sources into a data warehouse. An Online Analytical Processing (OLAP) cube was then built to allow slicing along multiple dimensions determined by various key performance indicators (KPIs), representing population and patient profiles, case mix groups, and healthy community indicators. The use of mapping tools, customized shape files, and embedded objects further augment the navigation. Finally, Web forms provide a mechanism for remote uploading of data and transparent processing of the cube. For privileged information, access controls were implemented. Results: Data visualization has eliminated tedious analysis through legacy reports and provided a mechanism for optimally aligning resources with needs. Stakeholders are able to visualize KPIs on a main dashboard, slice-and-dice data, generate ad hoc reports, and quickly find the desired information. In addition, comparison, availability, and service level reports can also be generated on demand. All reports can be drilled down for navigation at a finer granularity. Conclusions: We have demonstrated how BI techniques and tools can be used in the health care environment to make informed decisions with reference to resource allocation and enhancement of the quality of patient care. The data can be uploaded immediately upon collection, thus keeping reports current. The modular design can be expanded to add new datasets such as for smoking rates, teen pregnancies, human immunodeficiency virus (HIV) rates, immunization coverage, and vital statistical summaries. PMID:25599727
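
    As an illustration of the slice-and-dice operations described (not the system's actual OLAP implementation), the Python/pandas sketch below performs the same kind of operations on a small hypothetical KPI fact table; all column names and figures are invented.

      import pandas as pd

      # Hypothetical KPI fact table of the sort an OLAP cube might expose:
      # one row per (region, year, case-mix group) with a measured indicator.
      facts = pd.DataFrame({
          "region":   ["North", "North", "South", "South", "North", "South"],
          "year":     [2012, 2013, 2012, 2013, 2013, 2013],
          "case_mix": ["A", "A", "A", "B", "B", "B"],
          "patients": [120, 135, 98, 110, 45, 52],
      })

      # "Slice": fix one dimension (a single year)
      slice_2013 = facts[facts["year"] == 2013]

      # "Dice": restrict several dimensions at once, then aggregate the measure
      diced = (facts[(facts["year"] == 2013) & (facts["case_mix"] == "B")]
               .groupby("region")["patients"].sum())

      # Dashboard-style cross-tab of the KPI across two dimensions
      report = facts.pivot_table(index="region", columns="year",
                                 values="patients", aggfunc="sum")
      print(slice_2013, diced, report, sep="\n\n")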

  3. 3D marker-controlled watershed for kidney segmentation in clinical CT exams.

    PubMed

    Wieclawek, Wojciech

    2018-02-27

    Image segmentation is an essential and nontrivial task in computer vision and medical image analysis. Computed tomography (CT) is one of the most accessible medical examination techniques to visualize the interior of a patient's body. Among different computer-aided diagnostic systems, the applications dedicated to kidney segmentation represent a relatively small group. In addition, literature solutions are verified on relatively small databases. The goal of this research is to develop a novel algorithm for fully automated kidney segmentation. This approach is designed for large database analysis including both physiological and pathological cases. This study presents a 3D marker-controlled watershed transform developed and employed for fully automated CT kidney segmentation. The original and most complex step in the current proposition is the automatic generation of 3D marker images. The final kidney segmentation step is an analysis of the labelled image obtained from the marker-controlled watershed transform. It consists of morphological operations and shape analysis. The implementation was conducted in a MATLAB environment, Version 2017a, using, among others, the Image Processing Toolbox. 170 clinical CT abdominal studies have been subjected to the analysis. The dataset includes normal as well as various pathological cases (agenesis, renal cysts, tumors, renal cell carcinoma, kidney cirrhosis, partial or radical nephrectomy, hematoma and nephrolithiasis). Manual and semi-automated delineations have been used as a gold standard. Among 67 delineated medical cases, 62 cases are 'Very good', whereas only 5 are 'Good' according to Cohen's Kappa interpretation. The segmentation results show that the mean values of Sensitivity, Specificity, Dice, Jaccard, Cohen's Kappa and Accuracy are 90.29, 99.96, 91.68, 85.04, 91.62 and 99.89% respectively. All 170 medical cases (with and without outlines) have been classified by three independent medical experts as 'Very good' in 143-148 cases, as 'Good' in 15-21 cases and as 'Moderate' in 6-8 cases. An automatic kidney segmentation approach for CT studies that competes with commonly known solutions was developed. The algorithm gives promising results, which were confirmed during a validation procedure on a relatively large database including 170 CTs with both physiological and pathological cases.
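
    The study's pipeline was implemented in MATLAB; purely as an illustration of the general marker-controlled watershed technique, the Python sketch below applies scikit-image's watershed to a gradient-magnitude volume given foreground and background marker masks, however those markers are produced upstream. The function name and the marker inputs are assumptions, not the paper's marker-generation step.

      import numpy as np
      from skimage.segmentation import watershed  # recent scikit-image versions

      def marker_controlled_watershed(ct_volume, kidney_markers, background_markers):
          """Marker-controlled watershed on a CT volume (HU values).
          kidney_markers / background_markers: rough binary marker masks.
          Returns a binary kidney segmentation."""
          # Gradient magnitude acts as the relief that the watershed floods
          gz, gy, gx = np.gradient(ct_volume.astype(float))
          gradient = np.sqrt(gz**2 + gy**2 + gx**2)
          # Single labelled marker image: 1 = background, 2 = kidney
          markers = np.zeros(ct_volume.shape, dtype=np.int32)
          markers[background_markers] = 1
          markers[kidney_markers] = 2
          labels = watershed(gradient, markers)
          return labels == 2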

  4. Using business intelligence to analyze and share health system infrastructure data in a rural health authority.

    PubMed

    Haque, Waqar; Urquhart, Bonnie; Berg, Emery; Dhanoa, Ramandeep

    2014-08-06

    Health care organizations gather large volumes of data, which has been traditionally stored in legacy formats making it difficult to analyze or use effectively. Though recent government-funded initiatives have improved the situation, the quality of most existing data is poor, suffers from inconsistencies, and lacks integrity. Generating reports from such data is generally not considered feasible due to extensive labor, lack of reliability, and time constraints. Advanced data analytics is one way of extracting useful information from such data. The intent of this study was to propose how Business Intelligence (BI) techniques can be applied to health system infrastructure data in order to make this information more accessible and comprehensible for a broader group of people. An integration process was developed to cleanse and integrate data from disparate sources into a data warehouse. An Online Analytical Processing (OLAP) cube was then built to allow slicing along multiple dimensions determined by various key performance indicators (KPIs), representing population and patient profiles, case mix groups, and healthy community indicators. The use of mapping tools, customized shape files, and embedded objects further augment the navigation. Finally, Web forms provide a mechanism for remote uploading of data and transparent processing of the cube. For privileged information, access controls were implemented. Data visualization has eliminated tedious analysis through legacy reports and provided a mechanism for optimally aligning resources with needs. Stakeholders are able to visualize KPIs on a main dashboard, slice-and-dice data, generate ad hoc reports, and quickly find the desired information. In addition, comparison, availability, and service level reports can also be generated on demand. All reports can be drilled down for navigation at a finer granularity. We have demonstrated how BI techniques and tools can be used in the health care environment to make informed decisions with reference to resource allocation and enhancement of the quality of patient care. The data can be uploaded immediately upon collection, thus keeping reports current. The modular design can be expanded to add new datasets such as for smoking rates, teen pregnancies, human immunodeficiency virus (HIV) rates, immunization coverage, and vital statistical summaries.

  5. Cognitive biases and decision making in gambling.

    PubMed

    Chóliz, Mariano

    2010-08-01

    Heuristics and cognitive biases can occur in reasoning and decision making. Some of them are very common in gamblers (illusion of control, representativeness, availability, etc.). Structural characteristics and functioning of games of chance favor the appearance of these biases. Two experiments were conducted with nonpathological gamblers. The first experiment was a game of dice with wagers. In the second experiment, the participants played two bingo games. Specific rules of the games favored the appearance of cognitive bias (illusion of control) and heuristics (representativeness and availability) and influence on the bets. Results and implications for gambling are discussed.

  6. Assessment of the Navy’s North West Region Advance Food Menu Gallery Workload and Food Cost Impact Trade-Offs

    DTIC Science & Technology

    2009-06-01

    Self serve Same All All Assorted Indiv Fresh Fruits Bananas , apples, oranges, etc. Self serve Same All B/Br Omelets/Eggs to Order Scratch- from fresh...Prepared by night crew (M-F), and day crew for Saturday and Sunday All B/Br E02401 Assorted Cereal Adv Individual bowls with peel lids M-F L-SO N01207...Fresh Fruit Vegetable Prep [FSAs]  Wash, slice, dice, cut, peel or other process FFV, etc.  Clean/sanitize FFV Prep room equipment.  Clean FFV

  7. Modified Diet Recipes for Army Medical Facilities

    DTIC Science & Technology

    1983-10-20

    Mashed 1/2 cup 100 Whole .1-2" diam. 100 Dried Beans 1/2 cup 100 Kidney Corn 1/3 cup 80 Macaroni 1/2 cup 70 Noodles 1/2 cup 80 Peas, green 1/2...5 mg sodium Na/R Bread or Toast Cereal, Cooked Na/R Potato, Sweet Na/R Mashed Na/R Whole Dried Beans Kidney Corn Macaroni Noodles Peas...cup Mashed - no milk 1/3 cup Baked-1/3 of 2-1/4 diam. Diced 1/3 cup Mashed - no milk 1/3 cup Beans, Kidney 1/3 cup Corn Macaroni Noodles

  8. Gambling as a teaching aid in the introductory physics laboratory

    NASA Astrophysics Data System (ADS)

    Horodynski-Matsushigue, L. B.; Pascholati, P. R.; Vanin, V. R.; Dias, J. F.; Yoneama, M.-L.; Siqueira, P. T. D.; Amaku, M.; Duarte, J. L. M.

    1998-07-01

    Dice throwing is used to illustrate relevant concepts of the statistical theory of uncertainties, in particular the meaning of a limiting distribution, the standard deviation, and the standard deviation of the mean. It is an important part of a sequence of specially designed laboratory activities developed for freshmen at the Institute of Physics of the University of São Paulo. It is shown how this activity is employed within a constructive teaching approach, which aims at a growing understanding of the measuring processes and of the fundamentals of correct statistical handling of experimental data.
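
    A short simulation of the sort of dice-throwing exercise described can make the standard deviation and the standard deviation of the mean concrete; the numbers of dice and throws below are arbitrary illustrative choices, not the laboratory activity's actual protocol.

      import random
      import statistics

      def dice_sums(n_dice=5, n_throws=200, seed=42):
          """Throw n_dice dice n_throws times and record the sum of each throw."""
          random.seed(seed)
          return [sum(random.randint(1, 6) for _ in range(n_dice))
                  for _ in range(n_throws)]

      sums = dice_sums()
      mean = statistics.mean(sums)
      sd = statistics.stdev(sums)          # spread of individual throws
      sem = sd / len(sums) ** 0.5          # standard deviation of the mean
      print(f"mean = {mean:.2f}, sd = {sd:.2f}, sem = {sem:.3f}")
      # Limiting values for the sum of 5 dice: mean 17.5, sd = sqrt(5*35/12) ≈ 3.82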

  9. Addressable test matrix for measuring analog transfer characteristics of test elements used for integrated process control and device evaluation

    NASA Technical Reports Server (NTRS)

    Buehler, Martin G. (Inventor)

    1988-01-01

    A set of addressable test structures, each of which uses addressing schemes to access individual elements of the structure in a matrix, is used to test the quality of a wafer before integrated circuits produced thereon are diced, packaged and subjected to final testing. The electrical characteristic of each element is checked and compared to the electrical characteristic of all other like elements in the matrix. The effectiveness of the addressable test matrix is in readily analyzing the electrical characteristics of the test elements and in providing diagnostic information.

  10. Revisiting the social cost of carbon.

    PubMed

    Nordhaus, William D

    2017-02-14

    The social cost of carbon (SCC) is a central concept for understanding and implementing climate change policies. This term represents the economic cost caused by an additional ton of carbon dioxide emissions or its equivalent. The present study presents updated estimates based on a revised DICE model (Dynamic Integrated model of Climate and the Economy). The study estimates that the SCC is $31 per ton of CO2 (in 2010 US$) for the current period (2015). For the central case, the real SCC grows at 3% per year over the period to 2050. The paper also compares the estimates with those from other sources.
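
    Taking the two figures in the abstract at face value ($31 per ton in 2015 and 3% real growth per year), the compounding can be illustrated as below; the projected values are simple arithmetic from those two inputs, not results reported by the DICE model itself.

      def scc_projection(base_scc=31.0, base_year=2015, growth=0.03, year=2050):
          """Project the social cost of carbon forward at a constant real
          growth rate: SCC(t) = SCC(2015) * (1 + g)**(t - 2015)."""
          return base_scc * (1.0 + growth) ** (year - base_year)

      print(round(scc_projection(year=2030), 1))   # about 48 (2010 US$/tCO2)
      print(round(scc_projection(year=2050), 1))   # about 87 (2010 US$/tCO2)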

  11. Revisiting the social cost of carbon

    NASA Astrophysics Data System (ADS)

    Nordhaus, William D.

    2017-02-01

    The social cost of carbon (SCC) is a central concept for understanding and implementing climate change policies. This term represents the economic cost caused by an additional ton of carbon dioxide emissions or its equivalent. The present study presents updated estimates based on a revised DICE model (Dynamic Integrated model of Climate and the Economy). The study estimates that the SCC is $31 per ton of CO2 (in 2010 US$) for the current period (2015). For the central case, the real SCC grows at 3% per year over the period to 2050. The paper also compares the estimates with those from other sources.

  12. Comparison of probabilistic and deterministic fiber tracking of cranial nerves.

    PubMed

    Zolal, Amir; Sobottka, Stephan B; Podlesek, Dino; Linn, Jennifer; Rieger, Bernhard; Juratli, Tareq A; Schackert, Gabriele; Kitzler, Hagen H

    2017-09-01

    OBJECTIVE The depiction of cranial nerves (CNs) using diffusion tensor imaging (DTI) is of great interest in skull base tumor surgery and DTI used with deterministic tracking methods has been reported previously. However, there are still no good methods usable for the elimination of noise from the resulting depictions. The authors have hypothesized that probabilistic tracking could lead to more accurate results, because it more efficiently extracts information from the underlying data. Moreover, the authors have adapted a previously described technique for noise elimination using gradual threshold increases to probabilistic tracking. To evaluate the utility of this new approach, a comparison is provided with this work between the gradual threshold increase method in probabilistic and deterministic tracking of CNs. METHODS Both tracking methods were used to depict CNs II, III, V, and the VII+VIII bundle. Depiction of 240 CNs was attempted with each of the above methods in 30 healthy subjects, which were obtained from 2 public databases: the Kirby repository (KR) and Human Connectome Project (HCP). Elimination of erroneous fibers was attempted by gradually increasing the respective thresholds (fractional anisotropy [FA] and probabilistic index of connectivity [PICo]). The results were compared with predefined ground truth images based on corresponding anatomical scans. Two label overlap measures (false-positive error and Dice similarity coefficient) were used to evaluate the success of both methods in depicting the CN. Moreover, the differences between these parameters obtained from the KR and HCP (with higher angular resolution) databases were evaluated. Additionally, visualization of 10 CNs in 5 clinical cases was attempted with both methods and evaluated by comparing the depictions with intraoperative findings. RESULTS Maximum Dice similarity coefficients were significantly higher with probabilistic tracking (p < 0.001; Wilcoxon signed-rank test). The false-positive error of the last obtained depiction was also significantly lower in probabilistic than in deterministic tracking (p < 0.001). The HCP data yielded significantly better results in terms of the Dice coefficient in probabilistic tracking (p < 0.001, Mann-Whitney U-test) and in deterministic tracking (p = 0.02). The false-positive errors were smaller in HCP data in deterministic tracking (p < 0.001) and showed a strong trend toward significance in probabilistic tracking (p = 0.06). In the clinical cases, the probabilistic method visualized 7 of 10 attempted CNs accurately, compared with 3 correct depictions with deterministic tracking. CONCLUSIONS High angular resolution DTI scans are preferable for the DTI-based depiction of the cranial nerves. Probabilistic tracking with a gradual PICo threshold increase is more effective for this task than the previously described deterministic tracking with a gradual FA threshold increase and might represent a method that is useful for depicting cranial nerves with DTI since it eliminates the erroneous fibers without manual intervention.

  13. Improved UTE-based attenuation correction for cranial PET-MR using dynamic magnetic field monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aitken, A. P.; Giese, D.; Tsoumpas, C.

    2014-01-15

    Purpose: Ultrashort echo time (UTE) MRI has been proposed as a way to produce segmented attenuation maps for PET, as it provides contrast between bone, air, and soft tissue. However, UTE sequences require samples to be acquired during rapidly changing gradient fields, which makes the resulting images prone to eddy current artifacts. In this work it is demonstrated that this can lead to misclassification of tissues in segmented attenuation maps (AC maps) and that these effects can be corrected for by measuring the true k-space trajectories using a magnetic field camera. Methods: The k-space trajectories during a dual-echo UTE sequence were measured using a dynamic magnetic field camera. UTE images were reconstructed using nominal trajectories and again using the measured trajectories. A numerical phantom was used to demonstrate the effect of reconstructing with incorrect trajectories. Images of an ovine leg phantom were reconstructed and segmented and the resulting attenuation maps were compared to a segmented map derived from a CT scan of the same phantom, using the Dice similarity measure. The feasibility of the proposed method was demonstrated in in vivo cranial imaging in five healthy volunteers. Simulated PET data were generated for one volunteer to show the impact of misclassifications on the PET reconstruction. Results: Images of the numerical phantom exhibited blurring and edge artifacts on the bone–tissue and air–tissue interfaces when nominal k-space trajectories were used, leading to misclassification of soft tissue as bone and misclassification of bone as air. Images of the tissue phantom and the in vivo cranial images exhibited the same artifacts. The artifacts were greatly reduced when the measured trajectories were used. For the tissue phantom, the Dice coefficient for bone in MR relative to CT was 0.616 using the nominal trajectories and 0.814 using the measured trajectories. The Dice coefficients for soft tissue were 0.933 and 0.934 for the nominal and measured cases, respectively. For air the corresponding figures were 0.991 and 0.993. Compared to an unattenuated reference image, the mean error in simulated PET uptake in the brain was 9.16% when AC maps derived from nominal trajectories were used, with errors in the SUVmax for simulated lesions in the range of 7.17%–12.19%. Corresponding figures when AC maps derived from measured trajectories were used were 0.34% (mean error) and −0.21% to +1.81% (lesions). Conclusions: Eddy current artifacts in UTE imaging can be corrected for by measuring the true k-space trajectories during a calibration scan and using them in subsequent image reconstructions. This improves the accuracy of segmented PET attenuation maps derived from UTE sequences and subsequent PET reconstruction.

  14. Gray matter segmentation of the spinal cord with active contours in MR images.

    PubMed

    Datta, Esha; Papinutto, Nico; Schlaeger, Regina; Zhu, Alyssa; Carballido-Gamio, Julio; Henry, Roland G

    2017-02-15

    Fully or partially automated spinal cord gray matter segmentation techniques will allow for pivotal spinal cord gray matter measurements in the study of various neurological disorders. The objective of this work was multi-fold: (1) to develop a gray matter segmentation technique that uses registration methods with an existing delineation of the cord edge along with Morphological Geodesic Active Contour (MGAC) models; (2) to assess the accuracy and reproducibility of the newly developed technique on 2D PSIR T1-weighted images; (3) to test how the algorithm performs on different resolutions and other contrasts; (4) to demonstrate how the algorithm can be extended to 3D scans; and (5) to show the clinical potential for multiple sclerosis patients. The MGAC algorithm was developed using a publicly available implementation of a morphological geodesic active contour model and the spinal cord segmentation tool of the software Jim (Xinapse Systems) for an initial estimate of the cord boundary. The MGAC algorithm was demonstrated on 2D PSIR images of the C2/C3 level with two different resolutions, 2D T2*-weighted images of the C2/C3 level, and a 3D PSIR image. These images were acquired from 45 healthy controls and 58 multiple sclerosis patients selected for the absence of evident lesions at the C2/C3 level. Accuracy was assessed through visual assessment, Hausdorff distances, and Dice similarity coefficients. Reproducibility was assessed through interclass correlation coefficients. Validity was assessed through comparison of segmented gray matter areas in images with different resolutions for both manual and MGAC segmentations. Between MGAC and manual segmentations in healthy controls, the mean Dice similarity coefficient was 0.88 (0.82-0.93) and the mean Hausdorff distance was 0.61 (0.46-0.76) mm. The interclass correlation coefficient from test and retest scans of healthy controls was 0.88. The percent change between the manual segmentations from high- and low-resolution images was 25%, while the percent change between the MGAC segmentations from high- and low-resolution images was 13%. Between MGAC and manual segmentations in MS patients, the average Dice similarity coefficient was 0.86 (0.8-0.92) and the average Hausdorff distance was 0.83 (0.29-1.37) mm. We demonstrate that an automatic segmentation technique, based on a morphological geodesic active contour algorithm, can provide accurate and precise spinal cord gray matter segmentations on 2D PSIR images. We have also shown how this automated technique can potentially be extended to other imaging protocols. Copyright © 2016 Elsevier Inc. All rights reserved.
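
    As a sketch of the MGAC refinement step only (not the authors' full pipeline, which initialises from a cord boundary produced with the Jim software), scikit-image's morphological geodesic active contour can refine a rough binary initialisation on a 2D slice as shown below; the input names and parameter values are assumptions.

      import numpy as np
      from skimage.segmentation import (inverse_gaussian_gradient,
                                        morphological_geodesic_active_contour)

      def refine_mask_mgac(image_2d, init_mask, iterations=200):
          """Refine a rough binary mask with a morphological geodesic active
          contour on a single 2D slice."""
          # Edge map: values drop near strong gradients, which attract the contour
          gimage = inverse_gaussian_gradient(image_2d.astype(float))
          refined = morphological_geodesic_active_contour(
              gimage, iterations,
              init_level_set=init_mask.astype(np.int8),
              smoothing=2,     # smoothing steps per iteration
              balloon=-1)      # negative balloon force shrinks toward edges
          return refined.astype(bool)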

  15. Integrating Survey and Molecular Approaches to Better Understand Wildlife Disease Ecology

    PubMed Central

    Cowled, Brendan D.; Ward, Michael P.; Laffan, Shawn W.; Galea, Francesca; Garner, M. Graeme; MacDonald, Anna J.; Marsh, Ian; Muellner, Petra; Negus, Katherine; Quasim, Sumaiya; Woolnough, Andrew P.; Sarre, Stephen D.

    2012-01-01

    Infectious wildlife diseases have enormous global impacts, leading to human pandemics, global biodiversity declines and socio-economic hardship. Understanding how infection persists and is transmitted in wildlife is critical for managing diseases, but our understanding is limited. Our study aim was to better understand how infectious disease persists in wildlife populations by integrating genetics, ecology and epidemiology approaches. Specifically, we aimed to determine whether environmental or host factors were stronger drivers of Salmonella persistence or transmission within a remote and isolated wild pig (Sus scrofa) population. We determined the Salmonella infection status of wild pigs. Salmonella isolates were genotyped and a range of data was collected on putative risk factors for Salmonella transmission. We a priori identified several plausible biological hypotheses for Salmonella prevalence (cross sectional study design) versus transmission (molecular case series study design) and fit the data to these models. There were 543 wild pig Salmonella observations, sampled at 93 unique locations. Salmonella prevalence was 41% (95% confidence interval [CI]: 37–45%). The median Salmonella DICE coefficient (or Salmonella genetic similarity) was 52% (interquartile range [IQR]: 42–62%). Using the traditional cross sectional prevalence study design, the only supported model was based on the hypothesis that abundance of available ecological resources determines Salmonella prevalence in wild pigs. In the molecular study design, spatial proximity and herd membership as well as some individual risk factors (sex, condition score and relative density) determined transmission between pigs. Traditional cross sectional surveys and molecular epidemiological approaches are complementary and together can enhance understanding of disease ecology: abundance of ecological resources critical for wildlife influences Salmonella prevalence, whereas Salmonella transmission is driven by local spatial, social, density and individual factors, rather than resources. This enhanced understanding has implications for the control of diseases in wildlife populations. Attempts to manage wildlife disease using simplistic density approaches do not acknowledge the complexity of disease ecology. PMID:23071552

  16. A Dual-Layer Transducer Array for 3-D Rectilinear Imaging

    PubMed Central

    Yen, Jesse T.; Seo, Chi Hyung; Awad, Samer I.; Jeong, Jong S.

    2010-01-01

    2-D arrays for 3-D rectilinear imaging require very large element counts (16,000–65,000). The difficulties in fabricating and interconnecting 2-D arrays with a large number of elements (>5,000) have limited the development of suitable transducers for 3-D rectilinear imaging. In this paper, we propose an alternative solution to this problem by using a dual-layer transducer array design. This design consists of two perpendicular 1-D arrays for clinical 3-D imaging of targets near the transducer. These targets include the breast, carotid artery, and musculoskeletal system. This transducer design reduces the fabrication complexity and the channel count, making 3-D rectilinear imaging more realizable. With this design, an effective N × N 2-D array can be developed using only N transmitters and N receivers. This benefit becomes very significant when N becomes greater than 128, for example. To demonstrate feasibility, we constructed a 4 × 4 cm prototype dual-layer array. The transmit array uses diced PZT-5H elements, and the receive array is a single sheet of undiced P(VDF-TrFE) copolymer. The receive elements are defined by the copper traces on the flexible interconnect circuit. The measured −6 dB fractional bandwidth was 80% with a center frequency of 4.8 MHz. At 5 MHz, the nearest neighbor crosstalk of the PZT array and PVDF array was −30.4 ± 3.1 dB and −28.8 ± 3.7 dB, respectively. This dual-layer transducer was interfaced with an Ultrasonix Sonix RP system, and a synthetic aperture 3-D data set was acquired. We then performed off-line 3-D beamforming to obtain volumes of nylon wire targets. The theoretical lateral beamwidth was 0.52 mm compared to measured beamwidths of 0.65 mm and 0.67 mm in azimuth and elevation, respectively. 3-D images of an 8 mm diameter anechoic cyst phantom were also acquired. PMID:19213647
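
    The off-line synthetic aperture reconstruction described above reduces, at its core, to delay-and-sum beamforming over all transmit-receive element pairs of the two perpendicular 1-D arrays. A minimal single-voxel sketch under assumed names and geometry (rf holds one echo trace per element pair, tx_pos and rx_pos are arrays of element coordinates); this is not the authors' beamformer:

    ```python
    import numpy as np

    def das_voxel(rf, fs, c, tx_pos, rx_pos, voxel):
        """Delay-and-sum value at one voxel for a crossed (dual-layer) array.
        rf[i, j, :] is the echo recorded with transmit element i and receive element j;
        tx_pos, rx_pos and voxel are 3-D coordinates in metres, fs in Hz, c in m/s."""
        val = 0.0
        for i, tp in enumerate(tx_pos):
            d_tx = np.linalg.norm(voxel - tp)          # transmit path length
            for j, rp in enumerate(rx_pos):
                d_rx = np.linalg.norm(voxel - rp)      # receive path length
                s = (d_tx + d_rx) / c * fs             # round-trip delay as a fractional sample index
                k = int(s)
                if k + 1 < rf.shape[2]:
                    frac = s - k                       # linear interpolation between samples
                    val += (1.0 - frac) * rf[i, j, k] + frac * rf[i, j, k + 1]
        return val
    ```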

  17. Natural environment design criteria for the Space Station definition and preliminary design

    NASA Astrophysics Data System (ADS)

    Vaughan, W. W.; Green, C. E.

    1985-03-01

    The natural environment design criteria for the Space Station Program (SSP) definition and preliminary design are presented. Information on the atmospheric, dynamic and thermodynamic environments, meteoroids, radiation, magnetic fields, physical constants, etc. is provided with the intention of enabling all groups involved in the definition and preliminary design studies to proceed with a common and consistent set of natural environment criteria requirements. The space station program elements (SSPE) shall be designed with no operational sensitivity to natural environment conditions during assembly, checkout, stowage, launch, and orbital operations to the maximum degree practical.

  18. Natural environment design criteria for the Space Station definition and preliminary design

    NASA Technical Reports Server (NTRS)

    Vaughan, W. W.; Green, C. E.

    1985-01-01

    The natural environment design criteria for the Space Station Program (SSP) definition and preliminary design are presented. Information on the atmospheric, dynamic and thermodynamic environments, meteoroids, radiation, magnetic fields, physical constants, etc. is provided with the intention of enabling all groups involved in the definition and preliminary design studies to proceed with a common and consistent set of natural environment criteria requirements. The space station program elements (SSPE) shall be designed with no operational sensitivity to natural environment conditions during assembly, checkout, stowage, launch, and orbital operations to the maximum degree practical.

  19. Review of Opinions of Math Teachers Concerning the Learning Environment That They Design

    ERIC Educational Resources Information Center

    Aydin, Bünyamin; Yavuz, Ayse

    2016-01-01

    The design of an appropriate learning environment is of significant importance in achieving the aims of mathematics teaching. In the design of learning environments, teachers play a significant role. The aim of this study is to determine the opinions of math teachers concerning the learning environment that they design. In accordance with this aim, an…

  20. Dissociation between decision-making under risk and decision-making under ambiguity in premanifest and manifest Huntington's disease.

    PubMed

    Adjeroud, Najia; Besnard, Jeremy; Verny, Christophe; Prundean, Adriana; Scherer, Clarisse; Gohier, Bénédicte; Bonneau, Dominique; Massioui, Nicole El; Allain, Philippe

    2017-08-01

    We investigated decision-making under ambiguity (DM-UA) and decision-making under risk (DM-UR) in individuals with premanifest and manifest Huntington's disease (HD). Twenty individuals with premanifest HD and 23 individuals with manifest HD, on the one hand, and 39 healthy individuals divided into two control groups, on the other, undertook a modified version of the Iowa Gambling Task (IGT), an adaptation of a DM-UA task, and a modified version of the Game of Dice Task (GDT), an adaptation of a DM-UR task. Participants also completed an impulsivity questionnaire and responded to cognitive tests specifically designed to assess executive functions. Compared to controls, individuals with premanifest HD were unimpaired in performing executive tests as well as in decision-making tasks, except for the Stroop task. In contrast, individuals with manifest HD were impaired in both the IGT and executive tasks, but not in the GDT. No sign of impulsivity was observed in individuals with premanifest or manifest HD. Our results suggest that the progression of HD impairs DM-UA without affecting DM-UR, and indicate that decision-making abilities are preserved during the premanifest stage of HD. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Automated Visual Inspection Of Integrated Circuits

    NASA Astrophysics Data System (ADS)

    Noppen, G.; Oosterlinck, Andre J.

    1989-07-01

    One of the major application fields of image processing techniques is visual inspection. For a number of reasons, the automated visual inspection of Integrated Circuits (ICs) has drawn a lot of attention: their very strict design makes them very suitable for automated inspection; there is already a lot of experience in the comparable Printed Circuit Board (PCB) and mask inspection; the mechanical handling of wafers and dice is already an established technology; military and medical ICs should be 100% fail-proof; and IC inspection gives a high and almost immediate payback. In this paper we will try to give an outline of the problems involved in IC inspection, and the algorithms and methods used to overcome these problems. We will not go into detail, but we will try to give a general understanding. Our attention will go to the following topics: an overview of the inspection process, with an emphasis on the second visual inspection; the problems encountered in IC inspection, as opposed to the comparable PCB and mask inspection; the image acquisition devices that can be used to obtain 'inspectable' images; a general overview of the algorithms that can be used; and a short description of the algorithms developed at the ESAT-MI2 division of the Katholieke Universiteit Leuven.

  2. On public space design for Chinese urban residential area based on integrated architectural physics environment evaluation

    NASA Astrophysics Data System (ADS)

    Dong, J. Y.; Cheng, W.; Ma, C. P.; Tan, Y. T.; Xin, L. S.

    2017-04-01

    Residential public space is an important part of ecological residence design, and a proper physical environment in public space is of great significance for urban residences in China. Applying computer-aided design software to residential design can effectively avoid a mismatch between design intent and actual conditions of use, as well as negative impacts on users caused by a poor architectural physics environment of buildings. The paper adopts a design method based on analyzing the architectural physics environment of residential public space. By analyzing and evaluating the various physical environments, a suitability assessment of residential public space is obtained, thereby guiding the space design.

  3. Fully automatic registration and segmentation of first-pass myocardial perfusion MR image sequences.

    PubMed

    Gupta, Vikas; Hendriks, Emile A; Milles, Julien; van der Geest, Rob J; Jerosch-Herold, Michael; Reiber, Johan H C; Lelieveldt, Boudewijn P F

    2010-11-01

    Derivation of diagnostically relevant parameters from first-pass myocardial perfusion magnetic resonance images involves the tedious and time-consuming manual segmentation of the myocardium in a large number of images. To reduce the manual interaction and expedite the perfusion analysis, we propose an automatic registration and segmentation method for the derivation of perfusion linked parameters. A complete automation was accomplished by first registering misaligned images using a method based on independent component analysis, and then using the registered data to automatically segment the myocardium with active appearance models. We used 18 perfusion studies (100 images per study) for validation in which the automatically obtained (AO) contours were compared with expert drawn contours on the basis of point-to-curve error, Dice index, and relative perfusion upslope in the myocardium. Visual inspection revealed successful segmentation in 15 out of 18 studies. Comparison of the AO contours with expert drawn contours yielded 2.23 ± 0.53 mm and 0.91 ± 0.02 as point-to-curve error and Dice index, respectively. The average difference between manually and automatically obtained relative upslope parameters was found to be statistically insignificant (P = .37). Moreover, the analysis time per slice was reduced from 20 minutes (manual) to 1.5 minutes (automatic). We proposed an automatic method that significantly reduced the time required for analysis of first-pass cardiac magnetic resonance perfusion images. The robustness and accuracy of the proposed method were demonstrated by the high spatial correspondence and statistically insignificant difference in perfusion parameters, when AO contours were compared with expert drawn contours. Copyright © 2010 AUR. Published by Elsevier Inc. All rights reserved.

  4. WRIST: A WRist Image Segmentation Toolkit for carpal bone delineation from MRI.

    PubMed

    Foster, Brent; Joshi, Anand A; Borgese, Marissa; Abdelhafez, Yasser; Boutin, Robert D; Chaudhari, Abhijit J

    2018-01-01

    Segmentation of the carpal bones from 3D imaging modalities, such as magnetic resonance imaging (MRI), is commonly performed for in vivo analysis of wrist morphology, kinematics, and biomechanics. This crucial task is typically carried out manually and is labor intensive, time consuming, subject to high inter- and intra-observer variability, and may result in topologically incorrect surfaces. We present a method, WRist Image Segmentation Toolkit (WRIST), for 3D semi-automated, rapid segmentation of the carpal bones of the wrist from MRI. In our method, the boundaries of the bones were iteratively found using prior anatomical constraints and a shape-detection level set. The parameters of the method were optimized using a training dataset of 48 manually segmented carpal bones and evaluated on 112 carpal bones which included both healthy participants without known wrist conditions and participants with thumb basilar osteoarthritis (OA). Manual segmentation by two expert human observers was considered as a reference. On the healthy subject dataset we obtained a Dice overlap of 93.0 ± 3.8, Jaccard Index of 87.3 ± 6.2, and a Hausdorff distance of 2.7 ± 3.4 mm, while on the OA dataset we obtained a Dice overlap of 90.7 ± 8.6, Jaccard Index of 83.0 ± 10.6, and a Hausdorff distance of 4.0 ± 4.4 mm. The short computational time of 20.8 s per bone (or 5.1 s per bone in the parallelized version) and the high agreement with the expert observers gives WRIST the potential to be utilized in musculoskeletal research. Copyright © 2017 Elsevier Ltd. All rights reserved.
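
    The Dice overlap, Jaccard index and Hausdorff distance used above for evaluation can be computed directly from pairs of binary masks. A minimal sketch, assuming hypothetical boolean volumes seg and ref and a known voxel spacing:

    ```python
    import numpy as np
    from scipy.spatial.distance import directed_hausdorff

    def overlap_metrics(seg, ref, spacing=(1.0, 1.0, 1.0)):
        """Dice, Jaccard and symmetric Hausdorff distance for two boolean 3D masks."""
        seg, ref = np.asarray(seg, bool), np.asarray(ref, bool)
        inter = np.logical_and(seg, ref).sum()
        union = np.logical_or(seg, ref).sum()
        dice = 2.0 * inter / (seg.sum() + ref.sum())
        jaccard = inter / union
        # Physical coordinates of the labelled voxels (surface voxels alone would be faster).
        p = np.argwhere(seg) * np.asarray(spacing)
        q = np.argwhere(ref) * np.asarray(spacing)
        hausdorff = max(directed_hausdorff(p, q)[0], directed_hausdorff(q, p)[0])
        return dice, jaccard, hausdorff
    ```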

  5. Fully-automated segmentation of fluid regions in exudative age-related macular degeneration subjects: Kernel graph cut in neutrosophic domain

    PubMed Central

    Rashno, Abdolreza; Nazari, Behzad; Koozekanani, Dara D.; Drayna, Paul M.; Sadri, Saeed; Rabbani, Hossein

    2017-01-01

    A fully-automated method based on graph shortest path, graph cut and neutrosophic (NS) sets is presented for fluid segmentation in OCT volumes for exudative age-related macular degeneration (EAMD) subjects. The proposed method includes three main steps: 1) The inner limiting membrane (ILM) and the retinal pigment epithelium (RPE) layers are segmented using proposed methods based on graph shortest path in NS domain. A flattened RPE boundary is calculated such that all three types of fluid regions, intra-retinal, sub-retinal and sub-RPE, are located above it. 2) Seed points for fluid (object) and tissue (background) are initialized for graph cut by the proposed automated method. 3) A new cost function is proposed in kernel space, and is minimized with max-flow/min-cut algorithms, leading to a binary segmentation. Important properties of the proposed steps are proven and quantitative performance of each step is analyzed separately. The proposed method is evaluated using a publicly available dataset referred to as Optima and a local dataset from the UMN clinic. For fluid segmentation in 2D individual slices, the proposed method outperforms the previously proposed methods by 18% and 21% with respect to the dice coefficient and sensitivity, respectively, on the Optima dataset, and by 16%, 11% and 12% with respect to the dice coefficient, sensitivity and precision, respectively, on the local UMN dataset. Finally, for 3D fluid volume segmentation, the proposed method achieves true positive rate (TPR) and false positive rate (FPR) of 90% and 0.74%, respectively, with a correlation of 95% between automated and expert manual segmentations using linear regression analysis. PMID:29059257

  6. A 3D high resolution ex vivo white matter atlas of the common squirrel monkey (saimiri sciureus) based on diffusion tensor imaging

    NASA Astrophysics Data System (ADS)

    Gao, Yurui; Parvathaneni, Prasanna; Schilling, Kurt G.; Wang, Feng; Stepniewska, Iwona; Xu, Zhoubing; Choe, Ann S.; Ding, Zhaohua; Gore, John C.; Chen, Li min; Landman, Bennett A.; Anderson, Adam W.

    2016-03-01

    Modern magnetic resonance imaging (MRI) brain atlases are high quality 3-D volumes with specific structures labeled in the volume. Atlases are essential in providing a common space for interpretation of results across studies, for anatomical education, and providing quantitative image-based navigation. Extensive work has been devoted to atlas construction for humans, macaque, and several non-primate species (e.g., rat). One notable gap in the literature is the common squirrel monkey - for which the primary published atlases date from the 1960s. The common squirrel monkey has been used extensively as a surrogate for humans in biomedical studies, given its anatomical neuro-system similarities and practical considerations. This work describes the continued development of a multi-modal MRI atlas for the common squirrel monkey, for which a structural imaging space and gray matter parcels have been previously constructed. This study adds white matter tracts to the atlas. The new atlas includes 49 white matter (WM) tracts, defined using diffusion tensor imaging (DTI) in three animals and combines these data to define the anatomical locations of these tracts in a standardized coordinate system compatible with previous development. An anatomist reviewed the resulting tracts and the inter-animal reproducibility (i.e., the Dice index of each WM parcel across animals in common space) was assessed. The Dice indices range from 0.05 to 0.80 due to differences of local registration quality and the variation of WM tract position across individuals. However, the combined WM labels from the 3 animals represent the general locations of WM parcels, adding basic connectivity information to the atlas.

  7. Efficient brain lesion segmentation using multi-modality tissue-based feature selection and support vector machines.

    PubMed

    Fiot, Jean-Baptiste; Cohen, Laurent D; Raniga, Parnesh; Fripp, Jurgen

    2013-09-01

    Support vector machines (SVM) are machine learning techniques that have been used for segmentation and classification of medical images, including segmentation of white matter hyper-intensities (WMH). Current approaches using SVM for WMH segmentation extract features from the brain and classify these followed by complex post-processing steps to remove false positives. The method presented in this paper combines advanced pre-processing, tissue-based feature selection and SVM classification to obtain efficient and accurate WMH segmentation. Features from 125 patients, generated from up to four MR modalities [T1-w, T2-w, proton-density and fluid attenuated inversion recovery (FLAIR)], differing neighbourhood sizes and the use of multi-scale features were compared. We found that although using all four modalities gave the best overall classification (average Dice scores of 0.54 ± 0.12, 0.72 ± 0.06 and 0.82 ± 0.06, respectively, for small, moderate and severe lesion loads), this was not significantly different (p = 0.50) from using just T1-w and FLAIR sequences (Dice scores of 0.52 ± 0.13, 0.71 ± 0.08 and 0.81 ± 0.07). Furthermore, there was a negligible difference between using 5 × 5 × 5 and 3 × 3 × 3 features (p = 0.93). Finally, we show that careful consideration of features and pre-processing techniques not only saves storage space and computation time but also leads to more efficient classification, which outperforms the one based on all features with post-processing. Copyright © 2013 John Wiley & Sons, Ltd.
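
    The classification stage amounts to training a kernel SVM on per-voxel feature vectors. A minimal scikit-learn sketch, where the feature matrix X and lesion labels y are assumed to come from the pre-processing and tissue-based feature selection described above (the names are placeholders, not the authors' pipeline):

    ```python
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # X: (n_voxels, n_features) multimodal intensities sampled in a neighbourhood around
    # each voxel (e.g. T1-w and FLAIR values in a 3 x 3 x 3 patch); y: 1 for WMH voxels, 0 otherwise.
    # Both arrays are hypothetical stand-ins for the features produced by the pre-processing.
    def train_wmh_classifier(X, y):
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
        return clf.fit(X, y)

    # Hard per-voxel labels are enough to compute a Dice score against the manual reference:
    # pred = train_wmh_classifier(X_train, y_train).predict(X_test)
    ```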

  8. Automatic liver tumor segmentation on computed tomography for patient treatment planning and monitoring

    PubMed Central

    Moghbel, Mehrdad; Mashohor, Syamsiah; Mahmud, Rozi; Saripan, M. Iqbal Bin

    2016-01-01

    Segmentation of liver tumors from Computed Tomography (CT) and tumor burden analysis play an important role in the choice of therapeutic strategies for liver diseases and treatment monitoring. In this paper, a new segmentation method for liver tumors from contrast-enhanced CT imaging is proposed. As manual segmentation of tumors for liver treatment planning is both labor intensive and time-consuming, a highly accurate automatic tumor segmentation is desired. The proposed framework is fully automatic requiring no user interaction. The proposed segmentation evaluated on real-world clinical data from patients is based on a hybrid method integrating cuckoo optimization and fuzzy c-means algorithm with random walkers algorithm. The accuracy of the proposed method was validated using a clinical liver dataset containing one of the highest numbers of tumors utilized for liver tumor segmentation (127 tumors in total), with further validation of the results by a consultant radiologist. The proposed method was able to achieve one of the highest accuracies reported in the literature for liver tumor segmentation compared to other segmentation methods, with a mean overlap error of 22.78% and dice similarity coefficient of 0.75 in the 3Dircadb dataset and a mean overlap error of 15.61% and dice similarity coefficient of 0.81 in the MIDAS dataset. The proposed method was able to outperform most other tumor segmentation methods reported in the literature while representing an overlap error improvement of 6% compared to one of the best performing automatic methods in the literature. The proposed framework was able to provide consistently accurate results considering the number of tumors and the variations in tumor contrast enhancements and tumor appearances, while the tumor burden was estimated with a mean error of 0.84% in the 3Dircadb dataset. PMID:27540353
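
    One building block of such a hybrid pipeline, the random walkers step, is available in scikit-image. A minimal sketch, assuming hypothetical seed masks (for instance derived from a fuzzy c-means clustering) rather than the authors' cuckoo-optimized initialization:

    ```python
    import numpy as np
    from skimage.segmentation import random_walker

    def segment_tumor(ct_volume, tumor_seeds, background_seeds, beta=130):
        """Random-walker step of a hybrid pipeline: seed masks in, binary tumour mask out."""
        labels = np.zeros(ct_volume.shape, dtype=np.uint8)  # 0 = unlabelled voxels
        labels[background_seeds] = 1                         # liver / background seeds
        labels[tumor_seeds] = 2                              # tumour seeds (e.g. from FCM clustering)
        result = random_walker(ct_volume.astype(float), labels, beta=beta)
        return result == 2
    ```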

  9. Automatic segmentation of right ventricular ultrasound images using sparse matrix transform and a level set

    NASA Astrophysics Data System (ADS)

    Qin, Xulei; Cong, Zhibin; Fei, Baowei

    2013-11-01

    An automatic segmentation framework is proposed to segment the right ventricle (RV) in echocardiographic images. The method can automatically segment both epicardial and endocardial boundaries from a continuous echocardiography series by combining sparse matrix transform, a training model, and a localized region-based level set. First, the sparse matrix transform extracts main motion regions of the myocardium as eigen-images by analyzing the statistical information of the images. Second, an RV training model is registered to the eigen-images in order to locate the position of the RV. Third, the training model is adjusted and then serves as an optimized initialization for the segmentation of each image. Finally, based on the initializations, a localized, region-based level set algorithm is applied to segment both epicardial and endocardial boundaries in each echocardiogram. Three evaluation methods were used to validate the performance of the segmentation framework. The Dice coefficient measures the overall agreement between the manual and automatic segmentation. The absolute distance and the Hausdorff distance between the boundaries from manual and automatic segmentation were used to measure the accuracy of the segmentation. Ultrasound images of human subjects were used for validation. For the epicardial and endocardial boundaries, the Dice coefficients were 90.8 ± 1.7% and 87.3 ± 1.9%, the absolute distances were 2.0 ± 0.42 mm and 1.79 ± 0.45 mm, and the Hausdorff distances were 6.86 ± 1.71 mm and 7.02 ± 1.17 mm, respectively. The automatic segmentation method based on a sparse matrix transform and level set can provide a useful tool for quantitative cardiac imaging.

  10. The Social Cost of Stochastic and Irreversible Climate Change

    NASA Astrophysics Data System (ADS)

    Cai, Y.; Judd, K. L.; Lontzek, T.

    2013-12-01

    Many scientists are worried about climate change triggering abrupt and irreversible events leading to significant and long-lasting damages. For example, a rapid release of methane from permafrost may lead to amplified global warming, and global warming may increase the frequency and severity of heavy rainfall or typhoons, destroying large cities and killing numerous people. Some elements of the climate system which might exhibit such a triggering effect are called tipping elements. There is great uncertainty about the impact of anthropogenic carbon and tipping elements on future economic wellbeing. Any rational policy choice must consider the great uncertainty about the magnitude and timing of global warming's impact on economic productivity. While the likelihood of tipping points may be a function of contemporaneous temperature, their effects are long lasting and might be independent of future temperatures. It is assumed that some of these tipping points might occur even in this century, but also that their duration and post-tipping impact are uncertain. A faithful representation of the possibility of tipping points for the calculation of social cost of carbon would require a fully stochastic formulation of irreversibility, and accounting for the deep layer of uncertainties regarding the duration of the tipping process and also its economic impact. We use DSICE, a DSGE extension of the DICE2007 model of William Nordhaus, which incorporates beliefs about the uncertain economic impact of possible climate tipping events and uses empirically plausible parameterizations of Epstein-Zin preferences to represent attitudes towards risk. We find that the uncertainty associated with anthropogenic climate change implies carbon taxes much higher than those implied by deterministic models. This analysis indicates that the absence of uncertainty in DICE2007 and similar IAM models may result in substantial understatement of the potential benefits of policies to reduce GHG emissions.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rueegsegger, Michael B.; Bach Cuadra, Meritxell; Pica, Alessia

    Purpose: Ocular anatomy and radiation-associated toxicities provide unique challenges for external beam radiation therapy. For treatment planning, precise modeling of organs at risk and tumor volume are crucial. Development of a precise eye model and automatic adaptation of this model to patients' anatomy remain problematic because of organ shape variability. This work introduces the application of a 3-dimensional (3D) statistical shape model as a novel method for precise eye modeling for external beam radiation therapy of intraocular tumors. Methods and Materials: Manual and automatic segmentations were compared for 17 patients, based on head computed tomography (CT) volume scans. A 3D statistical shape model of the cornea, lens, and sclera as well as of the optic disc position was developed. Furthermore, an active shape model was built to enable automatic fitting of the eye model to CT slice stacks. Cross-validation was performed based on leave-one-out tests for all training shapes by measuring dice coefficients and mean segmentation errors between automatic segmentation and manual segmentation by an expert. Results: Cross-validation revealed a dice similarity of 95% ± 2% for the sclera and cornea and 91% ± 2% for the lens. Overall, mean segmentation error was found to be 0.3 ± 0.1 mm. Average segmentation time was 14 ± 2 s on a standard personal computer. Conclusions: Our results show that the solution presented outperforms state-of-the-art methods in terms of accuracy, reliability, and robustness. Moreover, the eye model shape as well as its variability is learned from a training set rather than by making shape assumptions (e.g., as with the spherical or elliptical model). Therefore, the model appears to be capable of modeling nonspherically and nonelliptically shaped eyes.
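
    A 3D statistical shape model of this kind is typically obtained by applying principal component analysis to aligned, point-corresponding training shapes. A minimal sketch with assumed array names and layout, not the authors' implementation:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # shapes: (n_training, n_points * 3) array of corresponding, aligned surface landmarks
    # for the cornea/lens/sclera meshes (the names and layout are assumptions).
    def build_shape_model(shapes, var_kept=0.98):
        """PCA shape model keeping enough modes to explain var_kept of the variance."""
        return PCA(n_components=var_kept).fit(shapes)

    def synthesize(pca, b):
        """Generate a plausible eye shape from low-dimensional mode coefficients b."""
        return pca.inverse_transform(np.atleast_2d(b))[0]

    # Fitting the model to a new CT then means searching for the coefficients (and pose)
    # whose synthesized shape best matches the image evidence, as in an active shape model.
    ```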

  12. Effect of sample size on multi-parametric prediction of tissue outcome in acute ischemic stroke using a random forest classifier

    NASA Astrophysics Data System (ADS)

    Forkert, Nils Daniel; Fiehler, Jens

    2015-03-01

    The tissue outcome prediction in acute ischemic stroke patients is highly relevant for clinical and research purposes. It has been shown that the combined analysis of diffusion and perfusion MRI datasets using high-level machine learning techniques leads to an improved prediction of final infarction compared to single perfusion parameter thresholding. However, most high-level classifiers require a previous training and, until now, it is ambiguous how many subjects are required for this, which is the focus of this work. 23 MRI datasets of acute stroke patients with known tissue outcome were used in this work. Relative values of diffusion and perfusion parameters as well as the binary tissue outcome were extracted on a voxel-by-voxel level for all patients and used for training of a random forest classifier. The number of patients used for training set definition was iteratively and randomly reduced from using all 22 other patients to only one other patient. Thus, 22 tissue outcome predictions were generated for each patient using the trained random forest classifiers and compared to the known tissue outcome using the Dice coefficient. Overall, a logarithmic relation between the number of patients used for training set definition and tissue outcome prediction accuracy was found. Quantitatively, a mean Dice coefficient of 0.45 was found for the prediction using the training set consisting of the voxel information from only one other patient, which increases to 0.53 if using all other patients (n=22). Based on extrapolation, 50-100 patients appear to be a reasonable tradeoff between tissue outcome prediction accuracy and effort required for data acquisition and preparation.
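
    The experiment above can be sketched as follows: train a random forest on the voxel-wise parameters of k randomly chosen other patients and score the prediction for the held-out patient with the Dice coefficient. The array names and data layout are assumptions:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def dice(a, b):
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    # features[p]: (n_voxels_p, n_params) relative diffusion/perfusion values for patient p;
    # outcome[p]: (n_voxels_p,) binary final-infarct labels. Both are assumed inputs.
    def dice_with_k_training_patients(features, outcome, test_idx, k, seed=0):
        rng = np.random.default_rng(seed)
        pool = [p for p in range(len(features)) if p != test_idx]
        train = rng.choice(pool, size=k, replace=False)
        X = np.vstack([features[p] for p in train])
        y = np.concatenate([outcome[p] for p in train])
        clf = RandomForestClassifier(n_estimators=100, n_jobs=-1).fit(X, y)
        pred = clf.predict(features[test_idx]).astype(bool)
        return dice(pred, outcome[test_idx].astype(bool))
    ```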

  13. Automatic lung segmentation using control feedback system: morphology and texture paradigm.

    PubMed

    Noor, Norliza M; Than, Joel C M; Rijal, Omar M; Kassim, Rosminah M; Yunus, Ashari; Zeki, Amir A; Anzidei, Michele; Saba, Luca; Suri, Jasjit S

    2015-03-01

    Interstitial Lung Disease (ILD) encompasses a wide array of diseases that share some common radiologic characteristics. When diagnosing such diseases, radiologists can be affected by heavy workload and fatigue, thus decreasing diagnostic accuracy. Automatic segmentation is the first step in implementing a Computer Aided Diagnosis (CAD) that will help radiologists to improve diagnostic accuracy, thereby reducing manual interpretation. The proposed automatic segmentation uses an initial thresholding- and morphology-based segmentation coupled with feedback that detects large deviations with a corrective segmentation. This feedback is analogous to a control system which allows detection of abnormal or severe lung disease and provides a feedback to an online segmentation, improving the overall performance of the system. This feedback system encompasses a texture paradigm. In this study we studied 48 male and 48 female patients, consisting of 15 normal and 81 abnormal patients. A senior radiologist chose the five levels needed for ILD diagnosis. The results of segmentation were displayed by showing the comparison of the automated and ground truth boundaries (courtesy of ImgTracer™ 1.0, AtheroPoint™ LLC, Roseville, CA, USA). The left lung's performance of segmentation was 96.52% for Jaccard Index and 98.21% for Dice Similarity, 0.61 mm for Polyline Distance Metric (PDM), -1.15% for Relative Area Error and 4.09% Area Overlap Error. The right lung's performance of segmentation was 97.24% for Jaccard Index, 98.58% for Dice Similarity, 0.61 mm for PDM, -0.03% for Relative Area Error and 3.53% for Area Overlap Error. The segmentation overall has a similarity of 98.4%. The proposed segmentation is an accurate and fully automated system.
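
    The initial thresholding-and-morphology stage (before the texture-based feedback correction) can be sketched with standard image-processing primitives; the steps and parameter values below are illustrative assumptions rather than the authors' exact pipeline:

    ```python
    from scipy.ndimage import binary_fill_holes
    from skimage.filters import threshold_otsu
    from skimage.morphology import ball, binary_closing, remove_small_objects
    from skimage.segmentation import clear_border

    def initial_lung_mask(ct_volume, min_size=10000):
        """First-pass lung mask from air-like voxels, cleaned up by morphology.
        A corrective, texture-based feedback stage (as described above) would follow."""
        air = ct_volume < threshold_otsu(ct_volume)  # lungs and air are dark on CT
        air = clear_border(air)                      # drop air connected to the image border (outside the body)
        mask = remove_small_objects(air, min_size=min_size)
        mask = binary_closing(mask, ball(3))         # smooth ragged boundaries
        return binary_fill_holes(mask)               # include vessels inside the lung field
    ```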

  14. Longitudinal Neuroimaging Hippocampal Markers for Diagnosing Alzheimer's Disease.

    PubMed

    Platero, Carlos; Lin, Lin; Tobar, M Carmen

    2018-05-21

    Hippocampal atrophy measures from magnetic resonance imaging (MRI) are powerful tools for monitoring Alzheimer's disease (AD) progression. In this paper, we introduce a longitudinal image analysis framework based on robust registration and simultaneous hippocampal segmentation and longitudinal marker classification of brain MRI of an arbitrary number of time points. The framework comprises two innovative parts: a longitudinal segmentation and a longitudinal classification step. The results show that both steps of the longitudinal pipeline improved the reliability and the accuracy of the discrimination between clinical groups. We introduce a novel approach to the joint segmentation of the hippocampus across multiple time points; this approach is based on graph cuts of longitudinal MRI scans with constraints on hippocampal atrophy and supported by atlases. Furthermore, we use linear mixed effect (LME) modeling for differential diagnosis between clinical groups. The classifiers are trained from the average residue between the longitudinal marker of the subjects and the LME model. In our experiments, we analyzed MRI-derived longitudinal hippocampal markers from two publicly available datasets (Alzheimer's Disease Neuroimaging Initiative, ADNI and Minimal Interval Resonance Imaging in Alzheimer's Disease, MIRIAD). In test/retest reliability experiments, the proposed method yielded lower volume errors and significantly higher dice overlaps than the cross-sectional approach (volume errors: 1.55% vs 0.8%; dice overlaps: 0.945 vs 0.975). To diagnose AD, the discrimination ability of our proposal gave an area under the receiver operating characteristic (ROC) curve (AUC) of 0.947 for the control vs AD, AUC of 0.720 for mild cognitive impairment (MCI) vs AD, and AUC of 0.805 for the control vs MCI.
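
    Linear mixed effect modeling of a longitudinal hippocampal marker can be sketched with statsmodels; the table layout and column names below are assumptions, not the study's actual variables:

    ```python
    import statsmodels.formula.api as smf

    # df is a hypothetical long-format pandas DataFrame with one row per scan and columns:
    # subject, group ('CN', 'MCI' or 'AD'), years (time since baseline) and hippo_vol.
    def fit_lme(df):
        """Random intercept and slope per subject; group modifies the atrophy rate."""
        model = smf.mixedlm("hippo_vol ~ years * group", df,
                            groups=df["subject"], re_formula="~years")
        return model.fit()

    # The per-subject residuals of such a fit can then feed a classifier, in the spirit
    # of the longitudinal-marker approach described above.
    ```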

  15. Parity among interpretation methods of MLEE patterns and disparity among clustering methods in epidemiological typing of Candida albicans.

    PubMed

    Boriollo, Marcelo Fabiano Gomes; Rosa, Edvaldo Antonio Ribeiro; Gonçalves, Reginaldo Bruno; Höfling, José Francisco

    2006-03-01

    The typing of C. albicans by MLEE (multilocus enzyme electrophoresis) is dependent on the interpretation of enzyme electrophoretic patterns, and the study of the epidemiological relationships of these yeasts can be conducted by cluster analysis. Therefore, the aims of the present study were to first determine the discriminatory power of genetic interpretation (deduction of the allelic composition of diploid organisms) and numerical interpretation (mere determination of the presence and absence of bands) of MLEE patterns, and then to determine the concordance (Pearson product-moment correlation coefficient) and similarity (Jaccard similarity coefficient) of the groups of strains generated by three cluster analysis models, and the discriminatory power of such models as well [model A: genetic interpretation, genetic distance matrix of Nei (d(ij)) and UPGMA dendrogram; model B: genetic interpretation, Dice similarity matrix (S(D1)) and UPGMA dendrogram; model C: numerical interpretation, Dice similarity matrix (S(D2)) and UPGMA dendrogram]. MLEE was found to be a powerful and reliable tool for the typing of C. albicans due to its high discriminatory power (>0.9). Discriminatory power indicated that numerical interpretation is a method capable of discriminating a greater number of strains (47 versus 43 subtypes), but also pointed to model B as a method capable of providing a greater number of groups, suggesting its use for the typing of C. albicans by MLEE and cluster analysis. Very good agreement was only observed between the elements of the matrices S(D1) and S(D2), but a large majority of the groups generated in the three UPGMA dendrograms showed similarity S(J) between 4.8% and 75%, suggesting disparities in the conclusions obtained by the cluster assays.
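
    Model C above (numerical interpretation, Dice similarity matrix, UPGMA dendrogram) maps directly onto standard clustering tools: SciPy's dice metric returns the Dice dissimilarity (1 - S(D)) of binary band-presence profiles, and average linkage is UPGMA. A minimal sketch with an assumed band matrix:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import dendrogram, linkage
    from scipy.spatial.distance import pdist

    # bands: (n_strains, n_bands) 0/1 matrix marking the presence of each MLEE band
    # (the "numerical interpretation" of the gel patterns; the data are assumed).
    def upgma_from_dice(bands):
        d = pdist(np.asarray(bands, dtype=bool), metric="dice")  # Dice dissimilarity = 1 - S_D
        return linkage(d, method="average")                      # average linkage = UPGMA

    # dendrogram(upgma_from_dice(bands)) draws the tree used to define strain clusters.
    ```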

  16. A 3D high resolution ex vivo white matter atlas of the common squirrel monkey (Saimiri sciureus) based on diffusion tensor imaging

    PubMed Central

    Gao, Yurui; Parvathaneni, Prasanna; Schilling, Kurt G.; Wang, Feng; Stepniewska, Iwona; Xu, Zhoubing; Choe, Ann S.; Ding, Zhaohua; Gore, John C.; Chen, Li Min; Landman, Bennett A.; Anderson, Adam W.

    2016-01-01

    Modern magnetic resonance imaging (MRI) brain atlases are high quality 3-D volumes with specific structures labeled in the volume. Atlases are essential in providing a common space for interpretation of results across studies, for anatomical education, and providing quantitative image-based navigation. Extensive work has been devoted to atlas construction for humans, macaque, and several non-primate species (e.g., rat). One notable gap in the literature is the common squirrel monkey – for which the primary published atlases date from the 1960s. The common squirrel monkey has been used extensively as a surrogate for humans in biomedical studies, given its anatomical neuro-system similarities and practical considerations. This work describes the continued development of a multi-modal MRI atlas for the common squirrel monkey, for which a structural imaging space and gray matter parcels have been previously constructed. This study adds white matter tracts to the atlas. The new atlas includes 49 white matter (WM) tracts, defined using diffusion tensor imaging (DTI) in three animals and combines these data to define the anatomical locations of these tracts in a standardized coordinate system compatible with previous development. An anatomist reviewed the resulting tracts and the inter-animal reproducibility (i.e., the Dice index of each WM parcel across animals in common space) was assessed. The Dice indices range from 0.05 to 0.80 due to differences of local registration quality and the variation of WM tract position across individuals. However, the combined WM labels from the 3 animals represent the general locations of WM parcels, adding basic connectivity information to the atlas. PMID:27064328

  17. Reliability of Semi-Automated Segmentations in Glioblastoma.

    PubMed

    Huber, T; Alber, G; Bette, S; Boeckh-Behrens, T; Gempt, J; Ringel, F; Alberts, E; Zimmer, C; Bauer, J S

    2017-06-01

    In glioblastoma, quantitative volumetric measurements of contrast-enhancing or fluid-attenuated inversion recovery (FLAIR) hyperintense tumor compartments are needed for an objective assessment of therapy response. The aim of this study was to evaluate the reliability of a semi-automated, region-growing segmentation tool for determining tumor volume in patients with glioblastoma among different users of the software. A total of 320 segmentations of tumor-associated FLAIR changes and contrast-enhancing tumor tissue were performed by different raters (neuroradiologists, medical students, and volunteers). All patients underwent high-resolution magnetic resonance imaging including a 3D-FLAIR and a 3D-MPRAGE sequence. Segmentations were done using a semi-automated, region-growing segmentation tool. Intra- and inter-rater reliability were addressed by intraclass correlation (ICC). Root-mean-square error (RMSE) was used to determine the precision error. Dice score was calculated to measure the overlap between segmentations. Semi-automated segmentation showed a high ICC (> 0.985) for all groups, indicating excellent intra- and inter-rater reliability. Significantly smaller precision errors and higher Dice scores were observed for FLAIR segmentations compared with segmentations of contrast-enhancement. Single rater segmentations showed the lowest RMSE for FLAIR of 3.3% (MPRAGE: 8.2%). Both single raters and neuroradiologists had the lowest precision error for longitudinal evaluation of FLAIR changes. Semi-automated volumetry of glioblastoma was reliably performed by all groups of raters, even without neuroradiologic expertise. Interestingly, segmentations of tumor-associated FLAIR changes were more reliable than segmentations of contrast enhancement. In longitudinal evaluations, an experienced rater can detect progressive FLAIR changes of less than 15% reliably in a quantitative way, which could help to detect progressive disease earlier.

  18. Automated segmentation of lesions including subretinal hyperreflective material in neovascular age-related macular degeneration.

    PubMed

    Lee, Hyungwoo; Kang, Kyung Eun; Chung, Hyewon; Kim, Hyung Chan

    2018-04-12

    To evaluate an automated segmentation algorithm with a convolutional neural network (CNN) to quantify and detect intraretinal fluid (IRF), subretinal fluid (SRF), pigment epithelial detachment (PED), and subretinal hyperreflective material (SHRM) through analyses of spectral domain optical coherence tomography (SD-OCT) images from patients with neovascular age-related macular degeneration (nAMD). Reliability and validity analysis of a diagnostic tool. We constructed a dataset including 930 B-scans from 93 eyes of 93 patients with nAMD. A CNN-based deep neural network was trained using 11550 augmented images derived from 550 B-scans. The performance of the trained network was evaluated using a validation set including 140 B-scans and a test set of 240 B-scans. The Dice coefficient, positive predictive value (PPV), sensitivity, relative area difference (RAD), and intraclass correlation coefficient (ICC) were used to evaluate segmentation and detection performance. Good agreement was observed for both segmentation and detection of lesions between the trained network and clinicians. The Dice coefficients for segmentation of IRF, SRF, SHRM, and PED were 0.78, 0.82, 0.75, and 0.80, respectively; the PPVs were 0.79, 0.80, 0.75, and 0.80, respectively; and the sensitivities were 0.77, 0.84, 0.73, and 0.81, respectively. The RADs were -4.32%, -10.29%, 4.13%, and 0.34%, respectively, and the ICCs were 0.98, 0.98, 0.97, and 0.98, respectively. All lesions were detected with high PPVs (range 0.94-0.99) and sensitivities (range 0.97-0.99). A CNN-based network provides clinicians with quantitative data regarding nAMD through automatic segmentation and detection of pathological lesions, including IRF, SRF, PED, and SHRM. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. A 3D high resolution ex vivo white matter atlas of the common squirrel monkey (Saimiri sciureus) based on diffusion tensor imaging.

    PubMed

    Gao, Yurui; Parvathaneni, Prasanna; Schilling, Kurt G; Wang, Feng; Stepniewska, Iwona; Xu, Zhoubing; Choe, Ann S; Ding, Zhaohua; Gore, John C; Chen, Li Min; Landman, Bennett A; Anderson, Adam W

    2016-02-27

    Modern magnetic resonance imaging (MRI) brain atlases are high quality 3-D volumes with specific structures labeled in the volume. Atlases are essential in providing a common space for interpretation of results across studies, for anatomical education, and providing quantitative image-based navigation. Extensive work has been devoted to atlas construction for humans, macaque, and several non-primate species (e.g., rat). One notable gap in the literature is the common squirrel monkey - for which the primary published atlases date from the 1960s. The common squirrel monkey has been used extensively as a surrogate for humans in biomedical studies, given its anatomical neuro-system similarities and practical considerations. This work describes the continued development of a multi-modal MRI atlas for the common squirrel monkey, for which a structural imaging space and gray matter parcels have been previously constructed. This study adds white matter tracts to the atlas. The new atlas includes 49 white matter (WM) tracts, defined using diffusion tensor imaging (DTI) in three animals and combines these data to define the anatomical locations of these tracts in a standardized coordinate system compatible with previous development. An anatomist reviewed the resulting tracts and the inter-animal reproducibility (i.e., the Dice index of each WM parcel across animals in common space) was assessed. The Dice indices range from 0.05 to 0.80 due to differences of local registration quality and the variation of WM tract position across individuals. However, the combined WM labels from the 3 animals represent the general locations of WM parcels, adding basic connectivity information to the atlas.

  20. Orbital shape in intentional skull deformations and adult sagittal craniosynostoses.

    PubMed

    Sandy, Ronak; Hennocq, Quentin; Nysjö, Johan; Giran, Guillaume; Friess, Martin; Khonsari, Roman Hossein

    2018-06-21

    Intentional cranial deformations are the result of external mechanical forces exerted on the skull vault that modify the morphology of various craniofacial structures such as the skull base, the orbits and the zygoma. In this controlled study, we investigated the 3D shape of the orbital inner mould and the orbital volume in various types of intentional deformations and in adult non-operated scaphocephaly - the most common type of craniosynostosis - using dedicated morphometric methods. CT scans were performed on 32 adult skulls with intentional deformations, 21 adult skulls with scaphocephaly and 17 non-deformed adult skulls from the collections of the Muséum national d'Histoire naturelle in Paris, France. The intentional deformations group included six skulls with Toulouse deformations, eight skulls with circumferential deformations and 18 skulls with antero-posterior deformations. Mean shape models were generated based on a semi-automatic segmentation technique. Orbits were then aligned and compared qualitatively and quantitatively using colour-coded distance maps and by computing the mean absolute distance, the Hausdorff distance, and the Dice similarity coefficient. Orbital symmetry was assessed after mirroring, superimposition and Dice similarity coefficient computation. We showed that orbital shapes were significantly and symmetrically modified in intentional deformations and scaphocephaly compared with non-deformed control skulls. Antero-posterior and circumferential deformations demonstrated a similar and severe orbital deformation pattern resulting in significantly smaller orbital volumes. Scaphocephaly and Toulouse deformations had similar deformation patterns but had no effect on orbital volumes. This study showed that intentional deformations and scaphocephaly significantly interact with orbital growth. Our approach was nevertheless not sufficient to identify specific modifications caused by the different types of skull deformations or by scaphocephaly. © 2018 Anatomical Society.

  1. SU-F-BRF-12: Investigating Dosimetric Effects of Inter-Fraction Deformation in Lung Cancer Stereotactic Body Radiotherapy (SBRT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jia, J; Tian, Z; Gu, X

    2014-06-15

    Purpose: We studied dosimetric effects of inter-fraction deformation in lung stereotactic body radiotherapy (SBRT), in order to investigate the necessity of adaptive re-planning for lung SBRT treatments. Methods: Six lung cancer patients with different treatment fractions were retrospectively investigated. All the patients were immobilized and localized with a stereotactic body frame and were treated under cone-beam CT (CBCT) image guidance at each fraction. We calculated the actual delivered dose of the treatment plan using the up-to-date patient geometry of each fraction, and compared the dose with the intended plan dose to investigate the dosimetric effects of the inter-fraction deformation. Deformable registration was carried out between the treatment planning CT and the CBCT of each fraction to obtain deformed planning CT for more accurate dose calculations of the delivered dose. The extent of the inter-fraction deformation was also evaluated by calculating the dice similarity coefficient between the delineated structures on the planning CT and those on the deformed planning CT. Results: The average dice coefficients for PTV, spinal cord, and esophagus were 0.87, 0.83 and 0.69, respectively. The volume of PTV covered by the prescription dose was decreased by 23.78% on average for all fractions and all patients. For spinal cord and esophagus, the volumes covered by the constraint dose were increased by 4.57% and 3.83%. The maximum dose was also increased by 4.11% for spinal cord and 4.29% for esophagus. Conclusion: Due to inter-fraction deformation, large deterioration was found in both PTV coverage and OAR sparing, which demonstrated the need for adaptive re-planning of lung SBRT cases to improve target coverage while reducing radiation dose to nearby normal tissues.

  2. A minimum spanning forest based classification method for dedicated breast CT images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pike, Robert; Sechopoulos, Ioannis; Fei, Baowei, E-mail: bfei@emory.edu

    Purpose: To develop and test an automated algorithm to classify different types of tissue in dedicated breast CT images. Methods: Images of a single breast of five different patients were acquired with a dedicated breast CT clinical prototype. The breast CT images were processed by a multiscale bilateral filter to reduce noise while keeping edge information and were corrected to overcome cupping artifacts. As skin and glandular tissue have similar CT values on breast CT images, morphologic processing is used to identify the skin based on its position information. A support vector machine (SVM) is trained and the resulting model is used to create a pixelwise classification map of fat and glandular tissue. By combining the results of the skin mask with the SVM results, the breast tissue is classified as skin, fat, and glandular tissue. This map is then used to identify markers for a minimum spanning forest that is grown to segment the image using spatial and intensity information. To evaluate the authors’ classification method, they use DICE overlap ratios to compare the results of the automated classification to those obtained by manual segmentation on five patient images. Results: Comparison between the automatic and the manual segmentation shows that the minimum spanning forest based classification method was able to successfully classify dedicated breast CT images with average DICE ratios of 96.9%, 89.8%, and 89.5% for fat, glandular, and skin tissue, respectively. Conclusions: A 2D minimum spanning forest based classification method was proposed and evaluated for classifying the fat, skin, and glandular tissue in dedicated breast CT images. The classification method can be used for dense breast tissue quantification, radiation dose assessment, and other applications in breast imaging.

  3. Will the use of a carbon tax for revenue generation produce an incentive to continue carbon emissions?

    NASA Astrophysics Data System (ADS)

    Wang, Rong; Moreno-Cruz, Juan; Caldeira, Ken

    2017-05-01

    Integrated assessment models are commonly used to generate optimal carbon prices based on an objective function that maximizes social welfare. Such models typically project an initially low carbon price that increases with time. This framework does not reflect the incentives of decision makers who are responsible for generating tax revenue. If a rising carbon price is to result in near-zero emissions, it must ultimately result in near-zero carbon tax revenue. That means that at some point, policy makers will be asked to increase the tax rate on carbon emissions to such an extent that carbon tax revenue will fall. Therefore, there is a risk that the use of a carbon tax to generate revenue could eventually create a perverse incentive to continue carbon emissions in order to provide a continued stream of carbon tax revenue. Using the Dynamic Integrated Climate Economy (DICE) model, we provide evidence that this risk is not a concern for the immediate future but that a revenue-generating carbon tax could create this perverse incentive as time goes on. This incentive becomes perverse at about year 2085 under the default configuration of DICE, but the timing depends on a range of factors including the cost of climate damages and the cost of decarbonizing the global energy system. While our study is based on a schematic model, it highlights the importance of considering a broader spectrum of incentives in studies using more comprehensive integrated assessment models. Our study demonstrates that the use of a carbon tax for revenue generation could potentially motivate implementation of such a tax today, but this source of revenue generation risks motivating continued carbon emissions far into the future.
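
    The mechanism behind this perverse incentive can be illustrated with a toy revenue curve (this is not the DICE model): if emissions decline as the tax rises, revenue first grows and then falls once the tax pushes emissions toward zero. A sketch with arbitrary assumed numbers:

    ```python
    import numpy as np

    def revenue(tax, e0=40.0, choke_price=600.0):
        """Hypothetical response: emissions (GtCO2/yr) fall linearly to zero at the choke price."""
        emissions = np.clip(e0 * (1.0 - tax / choke_price), 0.0, None)
        return tax * emissions  # $/tCO2 x GtCO2/yr = billions of dollars per year

    for tax in np.linspace(0.0, 600.0, 7):
        # Revenue rises, peaks near the middle of the range, then falls back to zero.
        print(f"tax = {tax:5.0f} $/tCO2, revenue = {revenue(tax):7.1f} bn$/yr")
    ```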

  4. Automated synovium segmentation in doppler ultrasound images for rheumatoid arthritis assessment

    NASA Astrophysics Data System (ADS)

    Yeung, Pak-Hei; Tan, York-Kiat; Xu, Shuoyu

    2018-02-01

    We need better clinical tools to improve monitoring of synovitis, synovial inflammation in the joints, in rheumatoid arthritis (RA) assessment. Given its economical, safe and fast characteristics, ultrasound (US) especially Doppler ultrasound is frequently used. However, manual scoring of synovitis in US images is subjective and prone to observer variations. In this study, we propose a new and robust method for automated synovium segmentation in the commonly affected joints, i.e. metacarpophalangeal (MCP) and metatarsophalangeal (MTP) joints, which would facilitate automation in quantitative RA assessment. The bone contour in the US image is firstly detected based on a modified dynamic programming method, incorporating angular information for detecting curved bone surface and using image fuzzification to identify missing bone structure. K-means clustering is then performed to initialize potential synovium areas by utilizing the identified bone contour as boundary reference. After excluding invalid candidate regions, the final segmented synovium is identified by reconnecting remaining candidate regions using level set evolution. 15 MCP and 15 MTP US images were analyzed in this study. For each image, segmentations by our proposed method as well as two sets of annotations performed by an experienced clinician at different time-points were acquired. Dice's coefficient is 0.77 ± 0.12 between the two sets of annotations. Similar Dice's coefficients are achieved between automated segmentation and either the first set of annotations (0.76 ± 0.12) or the second set of annotations (0.75 ± 0.11), with no significant difference (P = 0.77). These results verify that the accuracy of segmentation by our proposed method and by clinician is comparable. Therefore, reliable synovium identification can be made by our proposed method.
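
    The k-means initialization step can be sketched as clustering the intensities of the region bounded below by the detected bone contour and keeping the darkest cluster as candidate synovium; the input layout and the hypoechoic-cluster heuristic are illustrative assumptions:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # pixels: grayscale US intensities of the region above the detected bone contour,
    # flattened to a 1-D array (assumed input). k clusters separate candidate synovium
    # from brighter surrounding tissue.
    def candidate_synovium_mask(pixels, k=3, seed=0):
        km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(pixels.reshape(-1, 1))
        darkest = np.argmin(km.cluster_centers_.ravel())  # heuristic: synovium is hypoechoic
        return km.labels_ == darkest
    ```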

  5. Glioblastoma Segmentation: Comparison of Three Different Software Packages.

    PubMed

    Fyllingen, Even Hovig; Stensjøen, Anne Line; Berntsen, Erik Magnus; Solheim, Ole; Reinertsen, Ingerid

    2016-01-01

    To facilitate a more widespread use of volumetric tumor segmentation in clinical studies, there is an urgent need for reliable, user-friendly segmentation software. The aim of this study was therefore to compare three different software packages for semi-automatic brain tumor segmentation of glioblastoma; namely BrainVoyager™ QX, ITK-SNAP and 3D Slicer, and to make data available for future reference. Pre-operative, contrast-enhanced T1-weighted 1.5 or 3 Tesla Magnetic Resonance Imaging (MRI) scans were obtained in 20 consecutive patients who underwent surgery for glioblastoma. MRI scans were segmented twice in each software package by two investigators. Intra-rater, inter-rater and between-software agreement was compared by using differences of means with 95% limits of agreement (LoA), Dice's similarity coefficients (DSC) and Hausdorff distance (HD). Time expenditure of segmentations was measured using a stopwatch. Eighteen tumors were included in the analyses. Inter-rater agreement was highest for BrainVoyager with difference of means of 0.19 mL and 95% LoA from -2.42 mL to 2.81 mL. Between-software agreement and 95% LoA were very similar for the different software packages. Intra-rater, inter-rater and between-software DSC were ≥ 0.93 in all analyses. Time expenditure was approximately 41 min per segmentation in BrainVoyager, and 18 min per segmentation in both 3D Slicer and ITK-SNAP. Our main findings were that there is a high agreement within and between the software packages in terms of small intra-rater, inter-rater and between-software differences of means and high Dice's similarity coefficients. Time expenditure was highest for BrainVoyager, but all software packages were relatively time-consuming, which may limit usability in an everyday clinical setting.

  6. Probabilistic multiple sclerosis lesion classification based on modeling regional intensity variability and local neighborhood information.

    PubMed

    Harmouche, Rola; Subbanna, Nagesh K; Collins, D Louis; Arnold, Douglas L; Arbel, Tal

    2015-05-01

    In this paper, a fully automatic probabilistic method for multiple sclerosis (MS) lesion classification is presented, whereby the posterior probability density function over healthy tissues and two types of lesions (T1-hypointense and T2-hyperintense) is generated at every voxel. During training, the system explicitly models the spatial variability of the intensity distributions throughout the brain by first segmenting it into distinct anatomical regions and then building regional likelihood distributions for each tissue class based on multimodal magnetic resonance image (MRI) intensities. Local class smoothness is ensured by incorporating neighboring voxel information in the prior probability through Markov random fields. The system is tested on two datasets from real multisite clinical trials consisting of multimodal MRIs from a total of 100 patients with MS. Lesion classification results based on the framework are compared with and without the regional information, as well as with other state-of-the-art methods against the labels from expert manual raters. The metrics for comparison include Dice overlap, sensitivity, and positive predictive rates for both voxel and lesion classifications. Statistically significant improvements in Dice values, in voxel-based and lesion-based sensitivity values, and in positive predictive rates are shown when the proposed method is compared to the method without regional information, and to a widely used method [1]. This holds particularly true in the posterior fossa, an area where classification is very challenging. The proposed method allows us to provide clinicians with accurate tissue labels for T1-hypointense and T2-hyperintense lesions, two types of lesions that differ in appearance and clinical ramifications, and with a confidence level in the classification, which helps clinicians assess the classification results.
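
    For readers unfamiliar with the evaluation metrics named above, the following sketch shows how voxel-wise sensitivity and positive predictive value are typically computed from predicted and expert lesion masks; the synthetic arrays stand in for real lesion labels and are not the study's data.

    ```python
    # Minimal sketch of the voxel-wise evaluation metrics mentioned above:
    # sensitivity and positive predictive value against expert labels.
    # Array contents are synthetic; in practice these would be lesion masks.
    import numpy as np

    def sensitivity_ppv(predicted: np.ndarray, reference: np.ndarray):
        """Return (sensitivity, PPV) for boolean lesion masks."""
        pred = predicted.astype(bool)
        ref = reference.astype(bool)
        tp = np.logical_and(pred, ref).sum()
        fn = np.logical_and(~pred, ref).sum()
        fp = np.logical_and(pred, ~ref).sum()
        sens = tp / (tp + fn) if (tp + fn) else float("nan")
        ppv = tp / (tp + fp) if (tp + fp) else float("nan")
        return sens, ppv

    rng = np.random.default_rng(0)
    reference = rng.random((64, 64, 64)) > 0.98          # sparse "lesion" voxels
    predicted = np.logical_and(reference, rng.random((64, 64, 64)) > 0.2)
    print(sensitivity_ppv(predicted, reference))
    ```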

  7. Research on the Design of Public Space Environment for Aging Society

    NASA Astrophysics Data System (ADS)

    Fang, Gu; Soo, Kim Chul

    2018-03-01

    This paper studies living space environments suited to the elderly, because the needs of the elderly and the disabled have become increasingly prominent social issues. By discussing humanistic environment design methods for the elderly and the disabled, the paper proposes a new environment design that retains traditional characteristics while adapting to a changing society's care for the elderly and the disabled. By analyzing the background of social aging, the theory of public space environment design, and the needs of the elderly, it points out that public space design for an aging society must be carried through to the level of detailed design. As the number of elderly people using public space increases, paying full attention to outdoor environmental quality and providing a variety of facilities for the elderly will have long-term significance.

  8. Physical and Emotional Benefits of Different Exercise Environments Designed for Treadmill Running

    PubMed Central

    Churchill, Sarah M.; Brymer, Eric; Davids, Keith

    2017-01-01

    (1) Background: Green physical activity promotes physical health and mental wellbeing, and interesting questions concern the effects of this information on the design of indoor exercise environments. This study examined the physical and emotional effects of different nature-based environments designed for indoor treadmill running; (2) Methods: In a counterbalanced experimental design, 30 participants performed three twenty-minute treadmill runs at a self-selected pace while viewing either a static nature image, a dynamic nature image or self-selected entertainment. Distance run, heart rate (HR) and five pre- and post-exercise emotional states were measured; (3) Results: Participants ran farther, and with higher HRs, with self-selected entertainment compared to the two nature-based environment designs. Participants reported lower anger, dejection, and anxiety and increased excitement post-exercise in all of the designed environments. Happiness increased during the two nature-based environment designs compared with self-selected entertainment; (4) Conclusions: Self-selected entertainment encouraged greater physical performance whereas running in nature-based exercise environments elicited greater happiness immediately after running. PMID:28696384

  9. Physical and Emotional Benefits of Different Exercise Environments Designed for Treadmill Running.

    PubMed

    Yeh, Hsiao-Pu; Stone, Joseph A; Churchill, Sarah M; Brymer, Eric; Davids, Keith

    2017-07-11

    (1) Background: Green physical activity promotes physical health and mental wellbeing, and interesting questions concern the effects of this information on the design of indoor exercise environments. This study examined the physical and emotional effects of different nature-based environments designed for indoor treadmill running; (2) Methods: In a counterbalanced experimental design, 30 participants performed three twenty-minute treadmill runs at a self-selected pace while viewing either a static nature image, a dynamic nature image or self-selected entertainment. Distance run, heart rate (HR) and five pre- and post-exercise emotional states were measured; (3) Results: Participants ran farther, and with higher HRs, with self-selected entertainment compared to the two nature-based environment designs. Participants reported lower anger, dejection, and anxiety and increased excitement post-exercise in all of the designed environments. Happiness increased during the two nature-based environment designs compared with self-selected entertainment; (4) Conclusions: Self-selected entertainment encouraged greater physical performance whereas running in nature-based exercise environments elicited greater happiness immediately after running.

  10. Feasibility analysis on integration of luminous environment measuring and design based on exposure curve calibration

    NASA Astrophysics Data System (ADS)

    Zou, Yuan; Shen, Tianxing

    2013-03-01

    Beyond illumination calculation in architectural and luminous environment design, and to provide a wider variety of photometric data, the paper presents a way of combining luminous environment design with the SM light environment measuring system, a set of experimental devices including light information collection and processing modules that can supply various types of photometric data. We introduce a simulation method for calibration that mainly involves rebuilding experimental scenes in 3ds Max Design, calibrating this computer-aided design software in a simulated environment under various typical light sources, and fitting the exposure curves of rendered images. The operating sequence and points of attention for the simulated calibration are summarized, and connections between the Mental Ray renderer and the SM light environment measuring system are established. The paper thus provides a useful reference for coordinating luminous environment design with the SM light environment measuring system.

  11. Sex and HIV serostatus differences in decision making under risk among substance-dependent individuals.

    PubMed

    Martin, Eileen; Gonzalez, Raul; Vassileva, Jasmin; Maki, Pauline M; Bechara, Antoine; Brand, Matthias

    2016-01-01

    HIV+ individuals with and without substance use disorders make significantly poorer decisions when information about the probability and magnitude of wins and losses is not available. We administered the Game of Dice Task, a measure of decision making under risk that provides this information explicitly, to 92 HIV+ and 134 HIV- substance-dependent men and women. HIV+ participants made significantly poorer decisions than HIV- participants, but this deficit appeared more prominent among HIV+ women. These data indicate that decision making under risk is impaired among HIV+ substance-dependent individuals (SDIs). Potential factors for the HIV+ women's relatively greater impairment are discussed.

  12. Production Guides for Meat and Vegetable Entrees and Desserts Developed for Use in the Frozen Foil Pack Feeding System, F.E. Warren Air Force Base

    DTIC Science & Technology

    1976-02-01

    [OCR-degraded excerpt from a production guide recipe card] BAKED BEEF WITH NOODLES — Yield: 100 portions, each portion 8 ounces. Ingredient quantities are listed in pounds and grams (boneless diced beef; dry noodles, 4.00 lb / 1,816 g; tap water, 27.00 lb / 12,260 g). The legible procedure steps include simmering the beef for approximately 1 hour, starting the noodles in a separate pot partway through the beef cook, cooking until the mixture thickens (about 15 minutes), adding the noodles, mixing well, adjusting the volume to 7.8 gallons, heating to 180 °F, and weighing 8-ounce portions.

  13. Revisiting the social cost of carbon

    PubMed Central

    Nordhaus, William D.

    2017-01-01

    The social cost of carbon (SCC) is a central concept for understanding and implementing climate change policies. This term represents the economic cost caused by an additional ton of carbon dioxide emissions or its equivalent. The present study presents updated estimates based on a revised DICE model (Dynamic Integrated model of Climate and the Economy). The study estimates that the SCC is $31 per ton of CO2 in 2010 US$ for the current period (2015). For the central case, the real SCC grows at 3% per year over the period to 2050. The paper also compares the estimates with those from other sources. PMID:28143934
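
    The growth assumption quoted above is simple compounding; the short sketch below merely reproduces the stated figures ($31 per ton of CO2 in 2015, 3% real growth per year) and is in no way an implementation of the DICE model itself.

    ```python
    # Minimal sketch of the growth arithmetic stated above: an SCC of $31 per ton
    # of CO2 in 2015 growing at 3% per year in real terms out to 2050.
    # This reproduces only the quoted figures, not the DICE model.
    scc_2015 = 31.0      # 2010 US$ per ton CO2 (central estimate quoted above)
    growth = 0.03        # real growth rate per year

    for year in (2015, 2030, 2050):
        scc = scc_2015 * (1 + growth) ** (year - 2015)
        print(f"{year}: ${scc:.0f} per ton CO2")
    ```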

  14. Silicon saw-tooth refractive lens for high-energy x-rays made using a diamond saw.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Said, A. H.; Shastri, S. D.; X-Ray Science Division

    2010-01-01

    Silicon is a material well suited for refractive lenses operating at high X-ray energies (>50 keV), particularly if implemented in a single-crystal form to minimize small-angle scattering. A single-crystal silicon saw-tooth refractive lens, fabricated by a dicing process using a thin diamond wheel, was tested with 115 keV X-rays, giving an ideal 17 μm line focus width in a long focal length, 2:1 ratio demagnification geometry, with a source-to-focus distance of 58.5 m. The fabrication is simple, using resources typically available at any synchrotron facility's optics shop.
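
    As a rough cross-check of the quoted geometry, and assuming ideal thin-lens imaging (an approximation for a saw-tooth refractive lens), a 2:1 demagnification over a 58.5 m source-to-focus distance implies a focal length of about 13 m.

    ```python
    # Back-of-the-envelope check of the focusing geometry quoted above, assuming
    # ideal thin-lens imaging: 2:1 demagnification over a 58.5 m source-to-focus
    # distance implies a ~13 m focal length.
    L = 58.5                      # source-to-focus distance (m)
    demag = 2.0                   # source distance : image distance
    p = L * demag / (demag + 1)   # source-to-lens distance (m) -> 39 m
    q = L - p                     # lens-to-focus distance (m)  -> 19.5 m
    f = p * q / (p + q)           # thin-lens equation: 1/f = 1/p + 1/q
    print(f"p = {p:.1f} m, q = {q:.1f} m, f = {f:.1f} m")
    ```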

  15. AOTV Low L/D Preliminary Aeroheating Design Environment

    NASA Technical Reports Server (NTRS)

    Engel, C. D.

    1983-01-01

    The aerothermal environment for a configuration with a brake face exhibiting a low lift-to-drag ratio (L/D) below 0.75 is emphasized. The five times geosynchronous (5 x Geo) orbit entry was selected as the design trajectory. The available database and math model are discussed. The resulting preliminary design environment is documented. Recommendations as to how the design environment may be improved through technological advances are given.

  16. Optimal Living Environments for the Elderly: A Design Simulation Approach.

    ERIC Educational Resources Information Center

    Hoffman, Stephanie B.; And Others

    PLANNED AGE (Planned Alternatives for Gerontological Environments) is a consumer/advocate-oriented design simulation package that provides: (a) a medium for user-planner interaction in the design of living and service environments for the aged; (b) an educational, planning, design, and evaluation tool that can be used by the elderly, their…

  17. A Computer Environment for Beginners' Learning of Sorting Algorithms: Design and Pilot Evaluation

    ERIC Educational Resources Information Center

    Kordaki, M.; Miatidis, M.; Kapsampelis, G.

    2008-01-01

    This paper presents the design, features and pilot evaluation study of a web-based environment--the SORTING environment--for the learning of sorting algorithms by secondary level education students. The design of this environment is based on modeling methodology, taking into account modern constructivist and social theories of learning while at…

  18. Automatic segmentation of MR brain images of preterm infants using supervised classification.

    PubMed

    Moeskops, Pim; Benders, Manon J N L; Chiţ, Sabina M; Kersbergen, Karina J; Groenendaal, Floris; de Vries, Linda S; Viergever, Max A; Išgum, Ivana

    2015-09-01

    Preterm birth is often associated with impaired brain development. The state and expected progression of preterm brain development can be evaluated using quantitative assessment of MR images. Such measurements require accurate segmentation of different tissue types in those images. This paper presents an algorithm for the automatic segmentation of unmyelinated white matter (WM), cortical grey matter (GM), and cerebrospinal fluid in the extracerebral space (CSF). The algorithm uses supervised voxel classification in three subsequent stages. In the first stage, voxels that can easily be assigned to one of the three tissue types are labelled. In the second stage, dedicated analysis of the remaining voxels is performed. The first and the second stages both use two-class classification for each tissue type separately. Possible inconsistencies that could result from these tissue-specific segmentation stages are resolved in the third stage, which performs multi-class classification. A set of T1- and T2-weighted images was analysed, but the optimised system performs automatic segmentation using a T2-weighted image only. We have investigated the performance of the algorithm when using training data randomly selected from completely annotated images as well as when using training data from only partially annotated images. The method was evaluated on images of preterm infants acquired at 30 and 40 weeks postmenstrual age (PMA). When the method was trained using random selection from the completely annotated images, the average Dice coefficients were 0.95 for WM, 0.81 for GM, and 0.89 for CSF on an independent set of images acquired at 30 weeks PMA. When the method was trained using only the partially annotated images, the average Dice coefficients were 0.95 for WM, 0.78 for GM and 0.87 for CSF for the images acquired at 30 weeks PMA, and 0.92 for WM, 0.80 for GM and 0.85 for CSF for the images acquired at 40 weeks PMA. Even though the segmentations obtained using training data from the partially annotated images resulted in slightly lower Dice coefficients, the performance in all experiments was close to that of a second human expert (0.93 for WM, 0.79 for GM and 0.86 for CSF for the images acquired at 30 weeks, and 0.94 for WM, 0.76 for GM and 0.87 for CSF for the images acquired at 40 weeks). These results show that the presented method is robust to age and acquisition protocol and that it performs accurate segmentation of WM, GM, and CSF when the training data is extracted from complete annotations as well as when the training data is extracted from partial annotations only. This extends the applicability of the method by reducing the time and effort necessary to create training data in a population with different characteristics. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Design Quality in the Context of Healthcare Environments: A Scoping Review.

    PubMed

    Anåker, Anna; Heylighen, Ann; Nordin, Susanna; Elf, Marie

    2017-07-01

    We explored the concept of design quality in relation to healthcare environments. In addition, we present a taxonomy that illustrates the wide range of terms used in connection with design quality in healthcare. High-quality physical environments can promote health and well-being. Developments in healthcare technology and methodology put high demands on the design quality of care environments, coupled with increasing expectations and demands from patients and staff that care environments be person centered, welcoming, and accessible while also supporting privacy and security. In addition, there are demands that decisions about the design of healthcare architecture be based on the best available information from credible research and the evaluation of existing building projects. The basic principles of Arksey and O'Malley's model of scoping review design were used. Data were derived from literature searches in scientific databases. A total of 18 articles and books were found that referred to design quality in a healthcare context. Design quality of physical healthcare environments involves three different themes: (i) environmental sustainability and ecological values, (ii) social and cultural interactions and values, and (iii) resilience of the engineering and building construction. Design quality was clarified herein with a definition. Awareness of what is considered design quality in relation to healthcare architecture could help to design healthcare environments based on evidence. To operationalize the concept, its definition must be clear and explicit and able to meet the complex needs of the stakeholders in a healthcare context, including patients, staff, and significant others.

  20. Fabrication of Pop-up Detector Arrays on Si Wafers

    NASA Technical Reports Server (NTRS)

    Li, Mary J.; Allen, Christine A.; Gordon, Scott A.; Kuhn, Jonathan L.; Mott, David B.; Stahle, Caroline K.; Wang, Liqin L.

    1999-01-01

    High sensitivity is a basic requirement for a new generation of thermal detectors. To meet the requirement, close-packed, two-dimensional silicon detector arrays have been developed at NASA Goddard Space Flight Center. The goal of the task is to fabricate detector arrays configured with thermal detectors such as infrared bolometers and x-ray calorimeters for use in space flight missions. This paper focuses on the fabrication and the mechanical testing of detector arrays with a 0.2 mm pixel size, the smallest pop-up detectors developed so far. These array structures, nicknamed "PUDs" for "Pop-Up Detectors", are fabricated on 1 μm thick, single-crystal silicon membranes. Their designs have been refined so we can utilize the flexibility of thin silicon films by actually folding the silicon membranes to 90 degrees in order to obtain close-packed two-dimensional arrays. The PUD elements consist of a detector platform and two legs for mechanical support while also serving as electrical and thermal paths. Torsion bars and cantilevers connecting the detector platform to the legs provide additional flexures for strain relief. Using micro-electromechanical structure (MEMS) fabrication techniques, including photolithography, anisotropic chemical etching, reactive-ion etching, and laser dicing, we have fabricated PUD detector arrays of fourteen designs with a variation of four parameters including cantilever length, torsion bar length and width, and leg length. Folding tests were conducted to test mechanical stress distribution for the array structures. We obtained folding yields and selected optimum design parameters to reach minimal stress levels. Computer simulation was also employed to verify mechanical behaviors of PUDs in the folding process. In addition, scanning electron microscopy was utilized to examine the flatness of detectors and the alignment of detector pixels in arrays. The fabrication of thermistors and heaters on the pop-up detectors is under way, preparing us for the next step of the experiment, the thermal test.

  1. Can the Physical Environment Have an Impact on the Learning Environment?

    ERIC Educational Resources Information Center

    Lippman, Peter C.

    2010-01-01

    This article argues in favour of challenging "best practice" generally accepted by the architectural profession by embracing a responsive design approach for creating learning environments. Such an approach accepts that the environment shapes the learner, and that learners influence their environment. A responsive design approach would embrace the…

  2. The Influence of Free Space Environment in the Mission Life Cycle: Material Selection

    NASA Technical Reports Server (NTRS)

    Edwards, David L.; Burns, Howard D.; de Groh, Kim K.

    2014-01-01

    The natural space environment has a great influence on the ability of space systems to perform according to mission design specification. Understanding the natural space environment and its influence on space system performance is critical to the concept formulation, design, development, and operation of space systems. Compatibility with the natural space environment is a primary factor in determining the functional lifetime of the space system. Space systems being designed and developed today are growing in complexity. In many instances, the increased complexity also increases its sensitivity to space environmental effects. Sensitivities to the natural space environment can be tempered through appropriate design measures, material selection, ground processing, mitigation strategies, and/or the acceptance of known risks. The design engineer must understand the effects of the natural space environment on the space system and its components. This paper will discuss the influence of the natural space environment in the mission life cycle with a specific focus on the role of material selection.

  3. Perceptions of Pre-Service Teachers on the Design of a Learning Environment Based on the Seven Principles of Good Practice

    ERIC Educational Resources Information Center

    Al-Furaih, Suad Abdul Aziz

    2017-01-01

    This study explored the perceptions of 88 pre-service teachers on the design of a learning environment using the Seven Principles of Good Practice and its effect on participants' abilities to create their Cloud Learning Environment (CLE). In designing the learning environment, a conceptual model under the name 7 Principles and Integrated Learning…

  4. Experimental measurement of microwave ablation heating pattern and comparison to computer simulations.

    PubMed

    Deshazer, Garron; Prakash, Punit; Merck, Derek; Haemmerich, Dieter

    2017-02-01

    For computational models of microwave ablation (MWA), knowledge of the antenna design is necessary, but the proprietary design of clinical applicators is often unknown. We characterised the specific absorption rate (SAR) during MWA experimentally and compared it to a multi-physics simulation. An infrared (IR) camera was used to measure SAR during MWA within a split ex vivo liver model. Perseon Medical's short-tip (ST) or long-tip (LT) MWA antennas were placed on top of a tissue sample (n = 6), and microwave power (15 W) was applied for 6 min, while intermittently interrupting power. Tissue surface temperature was recorded via IR camera (3.3 fps, 320 × 240 resolution). SAR was calculated intermittently based on temperature slope before and after power interruption. Temperature and SAR data were compared to simulation results. Experimentally measured SAR changed considerably once tissue temperatures exceeded 100 °C, contrary to simulation results. The ablation zone diameters were 1.28 cm and 1.30 ± 0.03 cm (transverse), and 2.10 cm and 2.66 ± 0.22 cm (axial), for simulation and experiment, respectively. The average difference in temperature between the simulation and experiment was 5.6 °C (ST) and 6.2 °C (LT). Dice coefficients for the 1000 W/kg SAR iso-contour were 0.74 ± 0.01 (ST) and 0.77 ± 0.03 (LT), suggesting good agreement of SAR contours. We experimentally demonstrated changes in SAR during MWA ablation, which were not present in simulation, suggesting inaccuracies in dielectric properties. The measured SAR may be used in simplified computer simulations to predict tissue temperature when the antenna geometry is unknown.
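
    The slope-based SAR estimate described above is commonly approximated as the specific heat times the change in heating rate at the moment power is interrupted; the sketch below illustrates that idea with an assumed specific heat and synthetic temperature samples, and may differ in detail from the authors' exact procedure.

    ```python
    # Minimal sketch of a slope-based SAR estimate: SAR is approximated as
    # specific heat times the change in temperature slope at a power interruption,
    # which cancels the conduction/perfusion contribution to first order.
    # The specific heat value and the temperature samples below are illustrative.
    import numpy as np

    def sar_from_slopes(t_on, T_on, t_off, T_off, c_p=3600.0):
        """Estimate SAR (W/kg) from temperature-vs-time samples just before (on)
        and just after (off) a power interruption; c_p in J/(kg*K)."""
        slope_on = np.polyfit(t_on, T_on, 1)[0]     # K/s while power is on
        slope_off = np.polyfit(t_off, T_off, 1)[0]  # K/s right after power off
        return c_p * (slope_on - slope_off)

    # Synthetic example: heating at 0.25 K/s, cooling at -0.05 K/s after power off.
    t = np.linspace(0, 2, 20)
    print(sar_from_slopes(t, 0.25 * t, t, -0.05 * t))   # ~1080 W/kg
    ```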

  5. Fast and accurate semi-automated segmentation method of spinal cord MR images at 3T applied to the construction of a cervical spinal cord template.

    PubMed

    El Mendili, Mohamed-Mounir; Chen, Raphaël; Tiret, Brice; Villard, Noémie; Trunet, Stéphanie; Pélégrini-Issac, Mélanie; Lehéricy, Stéphane; Pradat, Pierre-François; Benali, Habib

    2015-01-01

    To design a fast and accurate semi-automated segmentation method for spinal cord 3T MR images and to construct a template of the cervical spinal cord. A semi-automated double threshold-based method (DTbM) was proposed enabling both cross-sectional and volumetric measures from 3D T2-weighted turbo spin echo MR scans of the spinal cord at 3T. Eighty-two healthy subjects, 10 patients with amyotrophic lateral sclerosis, 10 with spinal muscular atrophy and 10 with spinal cord injuries were studied. DTbM was compared with active surface method (ASM), threshold-based method (TbM) and manual outlining (ground truth). Accuracy of segmentations was scored visually by a radiologist in cervical and thoracic cord regions. Accuracy was also quantified at the cervical and thoracic levels as well as at C2 vertebral level. To construct a cervical template from healthy subjects' images (n=59), a standardization pipeline was designed leading to well-centered straight spinal cord images and accurate probability tissue map. Visual scoring showed better performance for DTbM than for ASM. Mean Dice similarity coefficient (DSC) was 95.71% for DTbM and 90.78% for ASM at the cervical level and 94.27% for DTbM and 89.93% for ASM at the thoracic level. Finally, at C2 vertebral level, mean DSC was 97.98% for DTbM compared with 98.02% for TbM and 96.76% for ASM. DTbM showed similar accuracy compared with TbM, but with the advantage of limited manual interaction. A semi-automated segmentation method with limited manual intervention was introduced and validated on 3T images, enabling the construction of a cervical spinal cord template.
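
    The DTbM itself is not specified here, so purely as a loose analogy, the sketch below shows a generic double-threshold (hysteresis-style) segmentation in which a strict threshold seeds regions that are then kept wherever they belong to a connected component of the looser threshold; the thresholds and the test image are made up.

    ```python
    # Illustrative sketch of a generic double-threshold ("hysteresis") segmentation,
    # offered only as an analogy to the double threshold-based method named above;
    # the authors' actual DTbM and thresholds are not reproduced here.
    import numpy as np
    from scipy import ndimage

    def double_threshold(image, t_low, t_high):
        """Keep low-threshold components that contain at least one high-threshold voxel."""
        low_mask = image >= t_low
        high_mask = image >= t_high
        labels, n = ndimage.label(low_mask)
        keep = np.unique(labels[high_mask])      # component ids touching strong voxels
        keep = keep[keep != 0]
        return np.isin(labels, keep)

    rng = np.random.default_rng(1)
    img = ndimage.gaussian_filter(rng.random((64, 64)), 3)   # smooth synthetic image
    mask = double_threshold(img, t_low=0.45, t_high=0.55)
    print(mask.sum(), "voxels segmented")
    ```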

  6. Design Quality in the Context of Healthcare Environments: A Scoping Review

    PubMed Central

    Anåker, Anna; Heylighen, Ann; Nordin, Susanna; Elf, Marie

    2016-01-01

    Objective: We explored the concept of design quality in relation to healthcare environments. In addition, we present a taxonomy that illustrates the wide range of terms used in connection with design quality in healthcare. Background: High-quality physical environments can promote health and well-being. Developments in healthcare technology and methodology put high demands on the design quality of care environments, coupled with increasing expectations and demands from patients and staff that care environments be person centered, welcoming, and accessible while also supporting privacy and security. In addition, there are demands that decisions about the design of healthcare architecture be based on the best available information from credible research and the evaluation of existing building projects. Method: The basic principles of Arksey and O’Malley’s model of scoping review design were used. Data were derived from literature searches in scientific databases. A total of 18 articles and books were found that referred to design quality in a healthcare context. Results: Design quality of physical healthcare environments involves three different themes: (i) environmental sustainability and ecological values, (ii) social and cultural interactions and values, and (iii) resilience of the engineering and building construction. Design quality was clarified herein with a definition. Conclusions: Awareness of what is considered design quality in relation to healthcare architecture could help to design healthcare environments based on evidence. To operationalize the concept, its definition must be clear and explicit and able to meet the complex needs of the stakeholders in a healthcare context, including patients, staff, and significant others. PMID:28643560

  7. Toward a research and action agenda on urban planning/design and health equity in cities in low and middle-income countries.

    PubMed

    Smit, Warren; Hancock, Trevor; Kumaresen, Jacob; Santos-Burgoa, Carlos; Sánchez-Kobashi Meneses, Raúl; Friel, Sharon

    2011-10-01

    The importance of reestablishing the link between urban planning and public health has been recognized in recent decades; this paper focuses on the relationship between urban planning/design and health equity, especially in cities in low and middle-income countries (LMICs). The physical urban environment can be shaped through various planning and design processes including urban planning, urban design, landscape architecture, infrastructure design, architecture, and transport planning. The resultant urban environment has important impacts on the health of the people who live and work there. Urban planning and design processes can also affect health equity through shaping the extent to which the physical urban environments of different parts of cities facilitate the availability of adequate housing and basic infrastructure, equitable access to the other benefits of urban life, a safe living environment, a healthy natural environment, food security and healthy nutrition, and an urban environment conducive to outdoor physical activity. A new research and action agenda for the urban environment and health equity in LMICs should consist of four main components. We need to better understand intra-urban health inequities in LMICs; we need to better understand how changes in the built environment in LMICs affect health equity; we need to explore ways of successfully planning, designing, and implementing improved health/health equity; and we need to develop evidence-based recommendations for healthy urban planning/design in LMICs.

  8. A Well Designed School Environment Facilitates Brain Learning.

    ERIC Educational Resources Information Center

    Chan, Tak Cheung; Petrie, Garth

    2000-01-01

    Examines how school design facilitates learning by complementing how the brain learns. How the brain learns is discussed and how an artistic environment, spaciousness in the learning areas, color and lighting, and optimal thermal and acoustical environments aid student learning. School design suggestions conclude the article. (GR)

  9. THE DEFINITION AND INTERPRETATION OF TERRESTRIAL ENVIRONMENT DESIGN INPUTS FOR VEHICLE DESIGN CONSIDERATIONS

    NASA Technical Reports Server (NTRS)

    Johnson, Dale L.; Keller, Vernon W.; Vaughan, William W.

    2005-01-01

    The description and interpretation of the terrestrial environment (0-90 km altitude) is an important driver of aerospace vehicle structural, control, and thermal system design. NASA is currently in the process of reviewing the meteorological information acquired over the past decade and producing an update to the 1993 Terrestrial Environment Guidelines for Aerospace Vehicle Design and Development handbook. This paper addresses the contents of this updated handbook, with special emphasis on new material being included in the areas of atmospheric thermodynamic models, wind dynamics, atmospheric composition, atmospheric electricity, cloud phenomena, atmospheric extremes, sea state, etc. In addition, the respective engineering design elements will be discussed relative to the importance and influence of terrestrial environment inputs that require consideration and interpretation for design applications. Specific lessons learned that have contributed to the advancements made in the acquisition, interpretation, application and awareness of terrestrial environment inputs for aerospace engineering applications are discussed.

  10. Spatial context learning approach to automatic segmentation of pleural effusion in chest computed tomography images

    NASA Astrophysics Data System (ADS)

    Mansoor, Awais; Casas, Rafael; Linguraru, Marius G.

    2016-03-01

    Pleural effusion is an abnormal collection of fluid within the pleural cavity. Excessive accumulation of pleural fluid is an important bio-marker for various illnesses, including congestive heart failure, pneumonia, metastatic cancer, and pulmonary embolism. Quantification of pleural effusion can be indicative of the progression of disease as well as the effectiveness of any treatment being administered. Quantification, however, is challenging due to unpredictable amounts and density of fluid, complex topology of the pleural cavity, and the similarity in texture and intensity of pleural fluid to the surrounding tissues in computed tomography (CT) scans. Herein, we present an automated method for the segmentation of pleural effusion in CT scans based on spatial context information. The method consists of two stages: first, a probabilistic pleural effusion map is created using multi-atlas segmentation. The probabilistic map assigns a priori probabilities to the presence of pleural fluid at every location in the CT scan. Second, a statistical pattern classification approach is designed to annotate pleural regions using local descriptors based on a priori probabilities, geometrical, and spatial features. Thirty-seven CT scans from a diverse patient population containing confirmed cases of minimal to severe amounts of pleural effusion were used to validate the proposed segmentation method. An average Dice coefficient of 0.82685 and Hausdorff distance of 16.2155 mm were obtained.
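
    Alongside Dice overlap, the study above reports a Hausdorff distance; below is a minimal sketch of the symmetric Hausdorff distance between the foreground voxel sets of two masks using SciPy, with illustrative masks and voxel spacing rather than the study's data.

    ```python
    # Minimal sketch of the symmetric Hausdorff distance reported alongside Dice
    # above, computed between the foreground voxel sets of two binary masks.
    # Masks and spacing are synthetic; in practice they come from CT segmentations.
    import numpy as np
    from scipy.spatial.distance import directed_hausdorff

    def hausdorff_mm(mask_a, mask_b, spacing_mm=(1.0, 1.0)):
        """Symmetric Hausdorff distance (mm) between foreground voxels of two 2D masks."""
        pts_a = np.argwhere(mask_a) * np.asarray(spacing_mm)
        pts_b = np.argwhere(mask_b) * np.asarray(spacing_mm)
        d_ab = directed_hausdorff(pts_a, pts_b)[0]
        d_ba = directed_hausdorff(pts_b, pts_a)[0]
        return max(d_ab, d_ba)

    a = np.zeros((100, 100), dtype=bool); a[20:60, 20:60] = True
    b = np.zeros((100, 100), dtype=bool); b[25:65, 25:65] = True
    print(f"Hausdorff distance = {hausdorff_mm(a, b, (0.7, 0.7)):.2f} mm")
    ```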

  11. Segmentation of histological images and fibrosis identification with a convolutional neural network.

    PubMed

    Fu, Xiaohang; Liu, Tong; Xiong, Zhaohan; Smaill, Bruce H; Stiles, Martin K; Zhao, Jichao

    2018-07-01

    Segmentation of histological images is one of the most crucial tasks for many biomedical analyses involving quantification of certain tissue types, such as fibrosis via Masson's trichrome staining. However, challenges are posed by the high variability and complexity of structural features in such images, in addition to imaging artifacts. Further, the conventional approach of manual thresholding is labor-intensive, and highly sensitive to inter- and intra-image intensity variations. An accurate and robust automated segmentation method is of high interest. We propose and evaluate an elegant convolutional neural network (CNN) designed for segmentation of histological images, particularly those with Masson's trichrome stain. The network comprises 11 successive convolutional - rectified linear unit - batch normalization layers. It outperformed state-of-the-art CNNs on a dataset of cardiac histological images (labeling fibrosis, myocytes, and background) with a Dice similarity coefficient of 0.947. With 100 times fewer (only 300,000) trainable parameters than the state-of-the-art, our CNN is less susceptible to overfitting, and is efficient. Additionally, it retains image resolution from input to output, captures fine-grained details, and can be trained end-to-end smoothly. To the best of our knowledge, this is the first deep CNN tailored to the problem of concern, and may potentially be extended to solve similar segmentation tasks to facilitate investigations into pathology and clinical treatment. Copyright © 2018 Elsevier Ltd. All rights reserved.
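
    As an illustration of the architecture outlined above (11 successive convolution-ReLU-batch normalization blocks that preserve image resolution), here is a hedged PyTorch sketch; the channel width, input channels, and the final 1 x 1 classification layer over three classes are assumptions, not the published configuration, so the parameter count will not match the paper's roughly 300,000.

    ```python
    # Hedged sketch of a fully convolutional network in the spirit of the one
    # described above: 11 Conv -> ReLU -> BatchNorm blocks with 3x3 kernels and
    # padding so the output keeps the input resolution. Channel widths and the
    # final 1x1 classifier over 3 classes are assumptions for illustration.
    import torch
    import torch.nn as nn

    class HistologySegNet(nn.Module):
        def __init__(self, in_channels=3, n_classes=3, width=32, depth=11):
            super().__init__()
            layers, c_in = [], in_channels
            for _ in range(depth):
                layers += [
                    nn.Conv2d(c_in, width, kernel_size=3, padding=1),
                    nn.ReLU(inplace=True),
                    nn.BatchNorm2d(width),
                ]
                c_in = width
            layers.append(nn.Conv2d(width, n_classes, kernel_size=1))  # per-pixel logits
            self.net = nn.Sequential(*layers)

        def forward(self, x):
            return self.net(x)

    model = HistologySegNet()
    print(sum(p.numel() for p in model.parameters() if p.requires_grad), "parameters")
    x = torch.randn(1, 3, 256, 256)
    print(model(x).shape)   # same spatial size as the input: (1, 3, 256, 256)
    ```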

  12. An Efficient Implementation of Deep Convolutional Neural Networks for MRI Segmentation.

    PubMed

    Hoseini, Farnaz; Shahbahrami, Asadollah; Bayat, Peyman

    2018-02-27

    Image segmentation is one of the most common steps in digital image processing, classifying a digital image into different segments. The main goal of this paper is to segment brain tumors in magnetic resonance images (MRI) using deep learning. Tumors with different shapes, sizes, brightness, and textures can appear anywhere in the brain. These complexities are the reasons to choose a high-capacity Deep Convolutional Neural Network (DCNN) containing more than one layer. The proposed DCNN contains two parts: architecture and learning algorithms. The architecture and the learning algorithms are used to design a network model and to optimize parameters for the network training phase, respectively. The architecture contains five convolutional layers, all using 3 × 3 kernels, and one fully connected layer. Stacking small kernels across successive layers achieves the effect of larger kernels with a smaller number of parameters and fewer computations. Using the Dice Similarity Coefficient metric, we report accuracy results on the BRATS 2016 brain tumor segmentation challenge dataset for the complete, core, and enhancing regions as 0.90, 0.85, and 0.84, respectively. The learning algorithm includes task-level parallelism. All the pixels of an MR image are classified using a patch-based approach for segmentation. We attain good performance, and the experimental results show that the proposed DCNN increases the segmentation accuracy compared to previous techniques.
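
    The remark above about stacking small kernels refers to the standard observation that successive 3 × 3 convolutions reproduce the receptive field of a larger kernel with fewer weights; a small arithmetic check with an arbitrary channel count follows.

    ```python
    # Small arithmetic check of the point above about stacking small kernels:
    # two 3x3 convolutions cover the 5x5 receptive field of one 5x5 convolution
    # but need fewer weights. The channel count C is arbitrary for the comparison.
    def conv_weights(kernel, c_in, c_out):
        return kernel * kernel * c_in * c_out

    C = 64
    stacked_3x3 = 2 * conv_weights(3, C, C)   # two 3x3 layers: 2 * 9 * C^2
    single_5x5 = conv_weights(5, C, C)        # one 5x5 layer: 25 * C^2
    print(stacked_3x3, single_5x5)            # 73728 vs 102400 weights
    ```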

  13. Performing a secondary executive task with affective stimuli interferes with decision making under risk conditions.

    PubMed

    Gathmann, Bettina; Pawlikowski, Mirko; Schöler, Tobias; Brand, Matthias

    2014-05-01

    Previous studies demonstrated that executive functions are crucial for advantageous decision making under risk and that therefore decision making is disrupted when working memory capacity is demanded while working on a decision task. While some studies also showed that emotions can affect decision making under risk, it is unclear how affective processing and executive functions interact in predicting decision-making performance. The current experimental study used a between-subjects design to examine whether affective pictures (positive and negative pictures compared to neutral pictures), included in a parallel executive task (working memory 2-back task), have an impact on decision making under risk as assessed by the Game of Dice Task (GDT). Moreover, performance on the GDT plus 2-back task was compared to performance on the GDT without any additional task (GDT solely). The results show that performance in the GDT differed between groups (positive, negative, neutral, and GDT solely). The groups with affective pictures, especially those with positive pictures in the 2-back task, showed more disadvantageous decisions in the GDT than the groups with neutral pictures and the group performing the GDT without any additional task. However, executive functions moderated the effect of the affective pictures. Regardless of affective influence, subjects with good executive functions performed advantageously in the GDT. These findings support the assumption that executive functions and emotional processing interact in predicting decision making under risk.

  14. Laser cutting sandwich structure glass-silicon-glass wafer with laser induced thermal-crack propagation

    NASA Astrophysics Data System (ADS)

    Cai, Yecheng; Wang, Maolu; Zhang, Hongzhi; Yang, Lijun; Fu, Xihong; Wang, Yang

    2017-08-01

    Silicon-glass devices are widely used in the IC industry, MEMS, and solar energy systems because of their reliability and the simplicity of the manufacturing process. With the trend toward wafer level chip scale package (WLCSP) technology, a suitable dicing method for silicon-glass bonded structure wafers has become necessary. In this paper, a combined experimental and computational approach is undertaken to investigate the feasibility of cutting the sandwich structure glass-silicon-glass (SGS) wafer with the laser induced thermal-crack propagation (LITP) method. A 1064 nm semiconductor laser cutting system with double laser beams that can simultaneously irradiate the top and bottom of the sandwich structure wafer has been designed. A mathematical model describing the physical process of the interaction between the laser and the SGS wafer, consisting of two surface heating sources and two volumetric heating sources, has been established. The temperature and stress distributions are simulated using the finite element method (FEM) analysis software ABAQUS. The crack propagation process is analyzed using the J-integral method. In the FEM model, a stationary planar crack is embedded in the wafer and the J-integral values around the crack front edge are determined using the FEM. A verification experiment under typical parameters is conducted, and the crack propagation profile on the fracture surface is examined with an optical microscope and explained in terms of the stress distribution and J-integral value.

  15. Interindividual registration and dose mapping for voxelwise population analysis of rectal toxicity in prostate cancer radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dréan, Gaël; Acosta, Oscar, E-mail: Oscar.Acosta@univ-rennes1.fr; Simon, Antoine

    2016-06-15

    Purpose: Recent studies revealed a trend toward voxelwise population analysis in order to understand the local dose/toxicity relationships in prostate cancer radiotherapy. Such approaches require, however, an accurate interindividual mapping of the anatomies and 3D dose distributions toward a common coordinate system. This step is challenging due to the high interindividual variability. In this paper, the authors propose a method designed for interindividual nonrigid registration of the rectum and dose mapping for population analysis. Methods: The method is based on the computation of a normalized structural description of the rectum using a Laplacian-based model. This description takes advantage of the tubular structure of the rectum and its centerline to be embedded in a nonrigid registration-based scheme. The performances of the method were evaluated on 30 individuals treated for prostate cancer in a leave-one-out cross validation. Results: Performance was measured using classical metrics (Dice score and Hausdorff distance), along with new metrics devised to better assess dose mapping in relation with structural deformation (dose-organ overlap). Considering these scores, the proposed method outperforms intensity-based and distance maps-based registration methods. Conclusions: The proposed method allows for accurately mapping interindividual 3D dose distributions toward a single anatomical template, opening the way for further voxelwise statistical analysis.

  16. Dual-mode intracranial catheter integrating 3D ultrasound imaging and hyperthermia for neuro-oncology: feasibility study.

    PubMed

    Herickhoff, Carl D; Light, Edward D; Bing, Kristin F; Mukundan, Srinivasan; Grant, Gerald A; Wolf, Patrick D; Smith, Stephen W

    2009-04-01

    In this study, we investigated the feasibility of an intracranial catheter transducer with dual-mode capability of real-time 3D (RT3D) imaging and ultrasound hyperthermia, for application in the visualization and treatment of tumors in the brain. Feasibility is demonstrated in two ways: first by using a 50-element linear array transducer (17 mm x 3.1 mm aperture) operating at 4.4 MHz with our Volumetrics diagnostic scanner and custom, electrical impedance-matching circuits to achieve a temperature rise over 4 degrees C in excised pork muscle, and second, by designing and constructing a 12 Fr, integrated matrix and linear-array catheter transducer prototype for combined RT3D imaging and heating capability. This dual-mode catheter incorporated 153 matrix array elements and 11 linear array elements diced on a 0.2 mm pitch, with a total aperture size of 8.4 mm x 2.3 mm. This 3.64 MHz array achieved a 3.5 degrees C in vitro temperature rise at a 2 cm focal distance in tissue-mimicking material. The dual-mode catheter prototype was compared with a Siemens 10 Fr AcuNav catheter as a gold standard in experiments assessing image quality and therapeutic potential and both probes were used in an in vivo canine brain model to image anatomical structures and color Doppler blood flow and to attempt in vivo heating.

  17. Dual-mode Intracranial Catheter Integrating 3D Ultrasound Imaging & Hyperthermia for Neuro-oncology: Feasibility Study

    PubMed Central

    Herickhoff, Carl D.; Light, Edward D.; Bing, Kristin F.; Mukundan, Srinivasan; Grant, Gerald A.; Wolf, Patrick D.; Smith, Stephen W.

    2010-01-01

    In this study, we investigated the feasibility of an intracranial catheter transducer with dual-mode capability of real-time 3D (RT3D) imaging and ultrasound hyperthermia, for application in the visualization and treatment of tumors in the brain. Feasibility is demonstrated in two ways: first by using a 50-element linear array transducer (17 mm × 3.1 mm aperture) operating at 4.4 MHz with our Volumetrics diagnostic scanner and custom electrical impedance matching circuits to achieve a temperature rise over 4°C in excised pork muscle, and second by designing and constructing a 12 Fr, integrated matrix and linear array catheter transducer prototype for combined RT3D imaging and heating capability. This dual-mode catheter incorporated 153 matrix array elements and 11 linear array elements diced on a 0.2 mm pitch, with a total aperture size of 8.4 mm × 2.3 mm. This array achieved a 3.5°C in vitro temperature rise at a 2 cm focal distance in tissue-mimicking material. The dual-mode catheter prototype was compared with a Siemens 10 Fr AcuNav™ catheter as a gold standard in experiments assessing image quality and therapeutic potential, and both probes were used in a canine brain model to image anatomical structures and color Doppler blood flow and to attempt in vivo heating. PMID:19630251

  18. Development of patient-specific phantoms for verification of stereotactic body radiation therapy planning in patients with metallic screw fixation

    NASA Astrophysics Data System (ADS)

    Oh, Dongryul; Hong, Chae-Seon; Ju, Sang Gyu; Kim, Minkyu; Koo, Bum Yong; Choi, Sungback; Park, Hee Chul; Choi, Doo Ho; Pyo, Hongryull

    2017-01-01

    A new technique for manufacturing a patient-specific dosimetric phantom using three-dimensional printing (PSDP_3DP) was developed, and its geometrical and dosimetric accuracy was analyzed. External body contours and structures of the spine and metallic fixation screws (MFS) were delineated from CT images of a patient with MFS who underwent stereotactic body radiation therapy for spine metastasis. Contours were converted into a STereoLithography file format using an in-house program. A hollow, four-section PSDP was designed and manufactured using three types of 3DP to allow filling with a muscle-equivalent liquid and insertion of dosimeters. To evaluate the geometrical accuracy of PSDP_3DP, CT images were obtained and compared with patient CT data for volume, mean density, and Dice similarity coefficient for contours. The dose distribution in the PSDP_3DP was calculated by applying the same beam parameters as for the patient, and the dosimetric characteristics of the PSDP_3DP were compared with the patient plan. The registered CT of the PSDP_3DP was well matched with the real patient CT in the axial, coronal, and sagittal planes. The physical accuracy and dosimetric characteristics of PSDP_3DP were comparable to those of a real patient. The ability to manufacture a PSDP representing an extreme patient condition was demonstrated.
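
    The contour-to-STL conversion above used an in-house program whose details are not given; purely as an illustration of one generic route, the sketch below rasterizes a synthetic binary structure volume and extracts a printable surface mesh with marching cubes (scikit-image), leaving the actual STL file writing to a mesh library.

    ```python
    # Illustrative sketch only: the paper used an in-house program to convert
    # delineated contours to STereoLithography (STL) geometry. One generic route
    # is to rasterize the contours into a binary volume and extract a surface
    # mesh with marching cubes; the voxel data and spacing below are synthetic.
    import numpy as np
    from skimage import measure

    # Synthetic binary volume standing in for a delineated structure (e.g., spine).
    vol = np.zeros((60, 60, 60), dtype=np.uint8)
    vol[20:40, 20:40, 10:50] = 1

    # Extract a triangle mesh at the 0.5 iso-surface, honoring CT voxel spacing (mm).
    verts, faces, normals, values = measure.marching_cubes(vol, level=0.5,
                                                           spacing=(1.0, 1.0, 2.5))
    print(len(verts), "vertices,", len(faces), "triangles")
    # The (verts, faces) mesh can then be written to an STL file with a mesh
    # library such as numpy-stl or trimesh before 3D printing.
    ```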

  19. Developing Learning Theory by Refining Conjectures Embodied in Educational Designs

    ERIC Educational Resources Information Center

    Sandoval, William A.

    2004-01-01

    Designed learning environments embody conjectures about learning and instruction, and the empirical study of learning environments allows such conjectures to be refined over time. The construct of embodied conjecture is introduced as a way to demonstrate the theoretical nature of learning environment design and to frame methodological issues in…

  20. Kitchen Science Investigators: Promoting Identity Development as Scientific Reasoners and Thinkers

    ERIC Educational Resources Information Center

    Clegg, Tamara Lynnette

    2010-01-01

    My research centers upon designing transformative learning environments and supporting technologies. Kitchen Science Investigators (KSI) is an out-of-school transformative learning environment we designed to help young people learn science through cooking. My dissertation considers the question, "How can we design a learning environment in which…

  1. The numerical analysis of outdoor wind and thermal environment in a residential area in Liaocheng, China

    NASA Astrophysics Data System (ADS)

    Zhang, Linfang; Yu, Zhenyang; Liu, Jiying; Zhang, Linhua

    2018-02-01

    With improving living standards, people pay attention not only to the indoor environment but also to the outdoor environment. This paper simulates the outdoor wind and thermal environments of buildings at the design stage and provides suggestions for later design stages, using a case study of a residential area in Liaocheng, China. SketchUp is used to establish the 3D model and PHOENICS is adopted to simulate the wind and thermal environments. The evaluation mainly follows the Green Building Evaluation Criteria, the Urban Residential Area Thermal Environment Design Criteria, and ISO 7243. Through analysis of the wind and thermal environment problems, the paper puts forward measures and suggestions as a reference for later planning.

  2. Interactions of age and cognitive functions in predicting decision making under risky conditions over the life span.

    PubMed

    Brand, Matthias; Schiebener, Johannes

    2013-01-01

    Little is known about how normal healthy aging affects decision-making competence. In this study 538 participants (age 18-80 years) performed the Game of Dice Task (GDT). Subsamples also performed the Iowa Gambling Task as well as tasks measuring logical thinking and executive functions. In a moderated regression analysis, the significant interaction between age and executive components indicates that older participants with good executive functioning perform well on the GDT, while older participants with reduced executive functions make more risky choices. The same pattern emerges for the interaction of age and logical thinking. Results demonstrate that age and cognitive functions act in concert in predicting the decision-making performance.

  3. Note: Comparative experimental studies on the performance of 2-2 piezocomposite for medical ultrasound transducers

    NASA Astrophysics Data System (ADS)

    Marinozzi, F.; Bini, F.; Biagioni, A.; Grandoni, A.; Spicci, L.

    2013-09-01

    The paper reports an experimental investigation of the behavior of 2-2 Lead Zirconate Titanate (PZT)-polymer composite transducer arrays for clinical ultrasound equipment. Several 2-2 plate composites having the same dicing pitch of 0.11 mm and different volume fractions were manufactured and investigated. Measurements were performed with different techniques, such as electrical impedance, pulse-echo, and Laser Doppler Vibrometry. With the latter, maps of the surface displacement were obtained at the thickness-mode and first lateral-mode resonance frequencies. Transducers with a 40% volume fraction proved markedly inefficient, whereas the largest bandwidth and best band shape were achieved at a 50% volume fraction.

  4. Deep convolutional neural network for prostate MR segmentation

    NASA Astrophysics Data System (ADS)

    Tian, Zhiqiang; Liu, Lizhi; Fei, Baowei

    2017-03-01

    Automatic segmentation of the prostate in magnetic resonance imaging (MRI) has many applications in prostate cancer diagnosis and therapy. We propose a deep fully convolutional neural network (CNN) to segment the prostate automatically. Our deep CNN model is trained end-to-end in a single learning stage based on prostate MR images and the corresponding ground truths, and learns to make inference for pixel-wise segmentation. Experiments were performed on our in-house data set, which contains prostate MR images of 20 patients. The proposed CNN model obtained a mean Dice similarity coefficient of 85.3% ± 3.2% as compared to the manual segmentation. Experimental results show that our deep CNN model could yield satisfactory segmentation of the prostate.

  5. Spot on for liars! How public scrutiny influences ethical behavior

    PubMed Central

    2017-01-01

    We examine whether people are more honest in public than in private. In a laboratory experiment, we have subjects roll dice and report outcomes either in public or in private. Higher reports yield more money and lies cannot be detected. We also elicit subjects’ ethical mindsets and their expectations about others’ reports. We find that outcome-minded subjects lie less in public to conform with their expectations about others’ reports. Ironically, these expectations are false. Rule-minded subjects, in turn, do not respond to public scrutiny. These findings challenge the common faith in public scrutiny to promote ethical behavior. While public scrutiny eventually increases honesty, this effect is contingent on people’s mindsets and expectations. PMID:28715476

  6. MBE growth of VCSELs for high volume applications

    NASA Astrophysics Data System (ADS)

    Jäger, Roland; Riedl, Michael C.

    2011-05-01

    Mass market applications such as laser computer mice or optical data transmission based on vertical-cavity surface-emitting laser (VCSEL) chips need a high overall yield across epitaxy, processing, dicing, mounting, and testing. One yield limitation for VCSEL structures is the emission wavelength variation across the substrate surface area, which determines the fraction of laser chips that fall below or above the specification limits. For most 850 nm VCSEL products a resonator wavelength variation of ±2 nm is common. This corresponds to an average resonator thickness variation of much less than 1%, which is quite challenging to achieve over the entire processed wafer surface area. A high overall yield is demonstrated on MBE-grown VCSEL structures.

  7. Spot on for liars! How public scrutiny influences ethical behavior.

    PubMed

    Ostermaier, Andreas; Uhl, Matthias

    2017-01-01

    We examine whether people are more honest in public than in private. In a laboratory experiment, we have subjects roll dice and report outcomes either in public or in private. Higher reports yield more money and lies cannot be detected. We also elicit subjects' ethical mindsets and their expectations about others' reports. We find that outcome-minded subjects lie less in public to conform with their expectations about others' reports. Ironically, these expectations are false. Rule-minded subjects, in turn, do not respond to public scrutiny. These findings challenge the common faith in public scrutiny to promote ethical behavior. While public scrutiny eventually increases honesty, this effect is contingent on people's mindsets and expectations.

  8. [The genetic diversity and homology of Anabaena azollae and its host plant (Azolla) based on RAPD analysis].

    PubMed

    Chen, Jian; Zheng, Wei-wen; Xu, Guo-zhong; Song, Tie-ying; Tang, Long-fei

    2002-01-01

    Symbiotic Anabaena azollae and its Anabaena-free host plant Azolla were isolated from 16 Azolla accessions representing different Azolla species or geographic origins. DNA polymorphic fragments were obtained by simultaneous RAPD amplification of both symbiont and host. UPGMA clusters of Anabaena azollae and its host Azolla were established separately based on Dice coefficient calculation, and a coordinated relationship was shown between Anabaena azollae and its Azolla host in their individual genetic divergence. This genetic homology was reduced, however, among different strains within Azolla species, and obvious mutants of Anabaena azollae were detected in some tested Azolla strains collected from different geographic areas within the same host species.

  9. RED or READ: the built environment is colored

    NASA Astrophysics Data System (ADS)

    Smith, Dianne

    2002-06-01

    How important is color in the design of our built environment? Prototypes and massing models for designs are often presented in white or monochromatic combinations, irrespective of the materials incorporated and the colors that may be applied in the final constructed building, interior or object. Therefore, it is of interest to identify the way color is positioned by designers in how they go about the business of making environments. The built environment is understood by designers and design researchers generally in one of four fields - as object, as product, as communicator, or as social domain. In addition, Franz identified four conceptions of designing held by designers - the experiential conception, the structural conception, the production conception and the retail conception. Fashion and style are often associated with color in a local context and may simply be applied to the physical environment because it is in fashion, rather than because of what it communicates more broadly. It is assumed that the integration of color in the built environment is influenced by these understandings. In order to address color's position in the design process and the importance of color in relation to space, form, and the experience of place, a selection of Queensland architects and interior designers was surveyed. The study is not conclusive; however, it does identify differences and commonalities between the participants that are of interest in light of the above issues. Explorations into environmental meaning, in addition to color theory and decorative applications, are hypothesized to be important sources of information for designers involved in the coloration of the built environment.

  10. Space Vehicle Terrestrial Environment Design Requirements Guidelines

    NASA Technical Reports Server (NTRS)

    Johnson, Dale L.; Keller, Vernon W.; Vaughan, William W.

    2006-01-01

    The terrestrial environment is an important driver of space vehicle structural, control, and thermal system design. NASA is currently in the process of producing an update to an earlier Terrestrial Environment Guidelines for Aerospace Vehicle Design and Development Handbook. This paper addresses the contents of this updated handbook, with special emphasis on new material being included in the areas of atmospheric thermodynamic models, wind dynamics, atmospheric composition, atmospheric electricity, cloud phenomena, atmospheric extremes, and sea state. In addition, the respective engineering design elements are discussed relative to terrestrial environment inputs that require consideration. Specific lessons learned that have contributed to the advancements made in the application and awareness of terrestrial environment inputs for aerospace engineering applications are presented.

  11. Computer aided design environment for the analysis and design of multi-body flexible structures

    NASA Technical Reports Server (NTRS)

    Ramakrishnan, Jayant V.; Singh, Ramen P.

    1989-01-01

    A computer aided design environment consisting of the programs NASTRAN, TREETOPS, and MATLAB is presented in this paper. With links for data transfer between these programs, the integrated design of multi-body flexible structures is significantly enhanced. The CAD environment is used to model the Space Shuttle/Pinhole Occulter Facility. A controller is then designed and evaluated through nonlinear time-history simulation. Recent enhancements and ongoing research to add more capabilities are also described.
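
    The workflow described, structural modes from a finite-element tool feeding a controls environment for time-domain evaluation, can be illustrated with a small sketch. The snippet below is a hypothetical stand-in for that data hand-off, not the NASTRAN/TREETOPS/MATLAB interface itself: it builds a modal state-space model from assumed frequencies and damping ratios and simulates its response to a pulse input.

```python
# Hypothetical sketch of the modal-data hand-off: frequencies and damping ratios
# from a structural model drive a state-space time-history simulation.
# The numbers below are assumptions for illustration, not Shuttle/POF data.
import numpy as np
from scipy.signal import StateSpace, lsim

freqs_hz = [0.4, 1.1, 2.7]        # assumed flexible-mode frequencies
zetas    = [0.005, 0.01, 0.02]    # assumed modal damping ratios

# Assemble block-diagonal modal dynamics: state x = [q1, q1dot, q2, q2dot, ...]
n = len(freqs_hz)
A = np.zeros((2 * n, 2 * n))
B = np.zeros((2 * n, 1))
C = np.zeros((1, 2 * n))
for i, (f, z) in enumerate(zip(freqs_hz, zetas)):
    w = 2 * np.pi * f
    A[2 * i, 2 * i + 1] = 1.0
    A[2 * i + 1, 2 * i] = -w ** 2
    A[2 * i + 1, 2 * i + 1] = -2 * z * w
    B[2 * i + 1, 0] = 1.0         # unit modal forcing
    C[0, 2 * i] = 1.0             # sum of modal displacements as the output

sys = StateSpace(A, B, C, np.zeros((1, 1)))
t = np.linspace(0, 60, 3000)
u = np.where(t < 1.0, 1.0, 0.0)   # 1-second pulse input
t_out, y, _ = lsim(sys, u, t)
print(f"peak response = {y.max():.3f}")
```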

  12. Learning Environments in Children's Museums: Aesthetics, Environmental Preference and Creativity.

    ERIC Educational Resources Information Center

    Lackney, Jeffery A.

    This paper discusses environmental preference, particularly related to the design of children's museums. It explains that preference for an environment leads to motivation to interact with the environment, which leads to learning. It lays out several design principles: (1) involve children in the process of children's museum design in a way that…

  13. Architecture and Children: Learning Environments and Design Education.

    ERIC Educational Resources Information Center

    Taylor, Anne, Ed.; Muhlberger, Joe, Ed.

    1998-01-01

    This issue addresses (1) growing international interest in learning environments and their effects on behavior, and (2) design education, an integrated model for visual-spatial lifelong learning. It focuses on this new and emerging field, which integrates elements of education, new learning environment design, and the use of more two-…

  14. Learning System Design Consideration in Creating an Online Learning Environment.

    ERIC Educational Resources Information Center

    Schaffer, Scott

    This paper describes the design of a Web-based learning environment for leadership facilitators in a United States military organization. The overall aim of this project was to design a prototype of an online learning environment that supports leadership facilitators' knowledge development in the content area of motivation. The learning…

  15. Designing E-Learning Environments for Flexible Activity and Instruction

    ERIC Educational Resources Information Center

    Wilson, Brent G.

    2004-01-01

    The contributions to this issue share a focus on the design of e-learning environments. Instructional designers are at a very early stage of knowledge in this area, but one with great potential for growth and progress. This commentary offers an activity-based perspective on e-learning environments, resulting in a flexible stance toward instructional…

  16. Creating Next Generation Blended Learning Environments Using Mixed Reality, Video Games and Simulations

    ERIC Educational Resources Information Center

    Kirkley, Sonny E.; Kirkley, Jamie R.

    2005-01-01

    In this article, the challenges and issues of designing next generation learning environments using current and emerging technologies are addressed. An overview of the issues is provided as well as design principles that support the design of instruction and the overall learning environment. Specific methods for creating cognitively complex,…

  17. A Stochastic Model of Plausibility in Live Virtual Constructive Environments

    DTIC Science & Technology

    2017-09-14

    objective in virtual environment research and design is the maintenance of adequate consistency levels in the face of limited system resources such as...provides some commentary with regard to system design considerations and future research directions. II. SYSTEM MODEL DVEs are often designed as a...exceed the system’s requirements. Research into predictive models of virtual environment consistency is needed to provide designers the tools to

  18. The natural space environment: Effects on spacecraft

    NASA Technical Reports Server (NTRS)

    James, Bonnie F.; Norton, O. W. (Compiler); Alexander, Margaret B. (Editor)

    1994-01-01

    The effects of the natural space environments on spacecraft design, development, and operation are the topic of a series of NASA Reference Publications currently being developed by the Electromagnetics and Environments Branch, Systems Analysis and Integration Laboratory, Marshall Space Flight Center. This primer provides an overview of the natural space environments and their effect on spacecraft design, development, and operations, and also highlights some of the new developments in science and technology for each space environment. It is hoped that a better understanding of the space environment and its effect on spacecraft will enable program management to more effectively minimize program risks and costs, optimize design quality, and successfully achieve mission objectives.

  19. Design and "As Flown" Radiation Environments for Materials in Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Minow, Joseph; McWilliams, Brett; Altstatt, Richard; Koontz, Steven

    2006-01-01

    A conservative design approach was adopted by the International Space Station Program for specifying total ionizing radiation dose requirements for use in selecting and qualifying materials for construction of the International Space Station. The total ionizing dose design environment included in SSP 30512 Space Station Ionizing Radiation Design Environment is based on trapped proton and electron fluence derived from the solar maximum versions of the AP-8 and AE-8 models, respectively, specified for a circular orbit at 500 km altitude and 51.7 degree inclination. Since launch, the altitudes utilized for Space Station operations have varied from a minimum of approximately 330 km to a maximum of approximately 405 km, with a mean operational altitude of less than 400 km. The design environment therefore overestimates the radiation environment, because the particle flux in the South Atlantic Anomaly is the primary contributor to radiation dose in low Earth orbit and flux within the Anomaly is altitude dependent. In addition, a 2X multiplier is often applied to the design environment to cover effects from the contributions of galactic cosmic rays, solar energetic particle events, geomagnetic storms, and uncertainties in the trapped radiation models, which are not explicitly included in the design environment. Application of this environment may give radiation dose overestimates on the order of 10X to 30X for materials exposed to the space environment, suggesting that materials originally qualified for ten-year exposures on orbit may be used for longer periods without replacement. In this paper we evaluate the "as flown" radiation environments derived from historical records of the ISS flight trajectory since launch and compare the results with the SSP 30512 design environment to document the magnitude of the radiation dose overestimate provided by the design environment. "As flown" environments are obtained by applying the AE-8/AP-8 trapped particle models along the ISS flight trajectory, including variations in altitude due to decay of the vehicle orbit and periodic reboosts to higher altitudes. In addition, because the radiation dose for thin materials is dominated by the electron component of the radiation environment, an assessment of the AE-8 model's ability to predict low Earth orbit electron flux is presented, based on comparisons of the AE-8 model to measurements of electron integral flux at approximately 850 km from the Medium Energy Proton and Electron Detector on board the NOAA Polar Operational Environmental Satellite.
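
    The comparison described above amounts to accumulating model dose along the actual altitude history and dividing the fixed design-basis value by that total. The sketch below illustrates only that bookkeeping; the altitude profile and the dose-rate-versus-altitude function are hypothetical placeholders, not the AE-8/AP-8 models or the recorded ISS trajectory.

```python
# Illustrative sketch of the design-basis vs. "as flown" dose comparison.
# The altitude history and the dose-rate function are hypothetical placeholders;
# a real analysis would evaluate AE-8/AP-8 along the recorded ISS trajectory.
import numpy as np

def dose_rate_krad_per_year(alt_km):
    """Placeholder: assumed monotonic increase of dose rate with altitude."""
    return 0.05 * np.exp((alt_km - 330.0) / 60.0)

# Hypothetical altitude history: slow orbital decay punctuated by periodic reboosts.
days = np.arange(0, 10 * 365)
alt = 400.0 - 0.05 * (days % 120)            # ~6 km decay, reboost every 120 days

as_flown_dose = np.sum(dose_rate_krad_per_year(alt) / 365.0)  # accumulated over 10 years
design_dose = 10 * dose_rate_krad_per_year(500.0) * 2.0       # 500 km basis with a 2X margin

print(f"as-flown dose  : {as_flown_dose:7.2f} krad (hypothetical units)")
print(f"design estimate: {design_dose:7.2f} krad (hypothetical units)")
print(f"overestimate factor: {design_dose / as_flown_dose:.1f}X")
```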

  20. Citizen Science as a REAL Environment for Authentic Scientific Inquiry

    ERIC Educational Resources Information Center

    Meyer, Nathan J.; Scott, Siri; Strauss, Andrea Lorek; Nippolt, Pamela L.; Oberhauser, Karen S.; Blair, Robert B.

    2014-01-01

    Citizen science projects can serve as constructivist learning environments for programming focused on science, technology, engineering, and math (STEM) for youth. Attributes of "rich environments for active learning" (REALs) provide a framework for design of Extension STEM learning environments. Guiding principles and design strategies…
