Science.gov

Sample records for 2d computer screen

  1. Computational Screening of 2D Materials for Photocatalysis.

    PubMed

    Singh, Arunima K; Mathew, Kiran; Zhuang, Houlong L; Hennig, Richard G

    2015-03-19

    Two-dimensional (2D) materials exhibit a range of extraordinary electronic, optical, and mechanical properties different from their bulk counterparts, with potential applications emerging in energy storage and conversion technologies. In this Perspective, we summarize the recent developments in the field of solar water splitting using 2D materials and review a computational screening approach to rapidly and efficiently discover more 2D materials that possess properties suitable for solar water splitting. Computational tools based on density-functional theory can predict the intrinsic properties of potential photocatalysts, such as their electronic properties, optical absorbance, and solubility in aqueous solutions. Computational tools also enable the exploration of possible routes to enhance the photocatalytic activity of 2D materials by use of mechanical strain, bias potential, doping, and pH. We discuss future research directions and the method developments needed for the computational design and optimization of 2D materials for photocatalysis.

  2. Model dielectric function for 2D semiconductors including substrate screening

    PubMed Central

    Trolle, Mads L.; Pedersen, Thomas G.; Véniard, Valerie

    2017-01-01

    Dielectric screening of excitons in 2D semiconductors is known to be a highly non-local effect, which in reciprocal space translates to a strong dependence on the momentum transfer q. We present an analytical model dielectric function, including the full non-linear q dependence, which may be used as an alternative to more numerically taxing ab initio screening functions. By verifying the good agreement between excitonic optical properties calculated using our model dielectric function and those derived from ab initio methods, we demonstrate the versatility of this approach. Our test systems include monolayer hBN, monolayer MoS2, and the surface exciton of a 2 × 1 reconstructed Si(111) surface. Additionally, using our model, we easily take substrate screening effects into account. Hence, we also include a systematic study of the effects of substrate media on the excitonic optical properties of MoS2 and hBN. PMID:28117326
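
The abstract does not quote the closed form, but a common starting point for analytical 2D screening models is a Keldysh-type dielectric function, linear in q for an isolated sheet, with substrate screening folded in through the average of the surrounding dielectric constants. The expressions below are illustrative of this family of models, not necessarily the paper's exact form:

```latex
% Keldysh-type screening of a free-standing 2D sheet with polarizability \alpha_{2D}:
\varepsilon_{2\mathrm{D}}(q) = 1 + 2\pi \alpha_{2\mathrm{D}}\, q
% With a substrate of dielectric constant \varepsilon_s on one side and vacuum on the other:
\varepsilon_{\mathrm{eff}}(q) = \frac{1 + \varepsilon_s}{2} + 2\pi \alpha_{2\mathrm{D}}\, q
```

The linear growth of screening with q is the non-local effect referred to above: short-wavelength (large-q) components of the electron-hole interaction are screened much more strongly than long-wavelength ones.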

  3. 2D NMR-spectroscopic screening reveals polyketides in ladybugs.

    PubMed

    Deyrup, Stephen T; Eckman, Laura E; McCarthy, Patrick H; Smedley, Scott R; Meinwald, Jerrold; Schroeder, Frank C

    2011-06-14

    Small molecules of biological origin continue to yield the most promising leads for drug design, but systematic approaches for exploring nature's cache of structural diversity are lacking. Here, we demonstrate the use of 2D NMR spectroscopy to screen a library of biorationally selected insect metabolite samples for partial structures indicating the presence of new chemical entities. This NMR-spectroscopic survey enabled detection of novel compounds in complex metabolite mixtures without prior fractionation or isolation. Our screen led to discovery and subsequent isolation of two families of tricyclic pyrones in Delphastus catalinae, a tiny ladybird beetle that is employed commercially as a biological pest control agent. The D. catalinae pyrones are based on 23-carbon polyketide chains forming 1,11-dioxo-2,6,10-trioxaanthracene and 4,8-dioxo-1,9,13-trioxaanthracene derivatives, representing ring systems not previously found in nature. This study highlights the utility of 2D NMR-spectroscopic screening for exploring nature's structure space and suggests that insect metabolomes remain vastly underexplored.

  4. 2D NMR-spectroscopic screening reveals polyketides in ladybugs

    PubMed Central

    Deyrup, Stephen T.; Eckman, Laura E.; McCarthy, Patrick H.; Smedley, Scott R.; Meinwald, Jerrold; Schroeder, Frank C.

    2011-01-01

    Small molecules of biological origin continue to yield the most promising leads for drug design, but systematic approaches for exploring nature’s cache of structural diversity are lacking. Here, we demonstrate the use of 2D NMR spectroscopy to screen a library of biorationally selected insect metabolite samples for partial structures indicating the presence of new chemical entities. This NMR-spectroscopic survey enabled detection of novel compounds in complex metabolite mixtures without prior fractionation or isolation. Our screen led to discovery and subsequent isolation of two families of tricyclic pyrones in Delphastus catalinae, a tiny ladybird beetle that is employed commercially as a biological pest control agent. The D. catalinae pyrones are based on 23-carbon polyketide chains forming 1,11-dioxo-2,6,10-trioxaanthracene and 4,8-dioxo-1,9,13-trioxaanthracene derivatives, representing ring systems not previously found in nature. This study highlights the utility of 2D NMR-spectroscopic screening for exploring nature’s structure space and suggests that insect metabolomes remain vastly underexplored. PMID:21646540

  5. Validation and testing of the VAM2D computer code

    SciTech Connect

    Kool, J.B.; Wu, Y.S.

    1991-10-01

    This document describes two modeling studies conducted by HydroGeoLogic, Inc. for the US NRC under contract no. NRC-04089-090, entitled "Validation and Testing of the VAM2D Computer Code." VAM2D is a two-dimensional, variably saturated flow and transport code, with applications for performance assessment of nuclear waste disposal. The computer code itself is documented in a separate NUREG document (NUREG/CR-5352, 1989). The studies presented in this report involve application of the VAM2D code to two diverse subsurface modeling problems. The first involves modeling of infiltration and redistribution of water and solutes in an initially dry, heterogeneous field soil. This application involves detailed modeling over a relatively short, 9-month time period. The second problem pertains to the application of VAM2D to the modeling of a waste disposal facility in a fractured clay, over much larger space and time scales and with particular emphasis on the applicability and reliability of using an equivalent porous medium approach for simulating flow and transport in fractured geologic media. Reflecting the separate and distinct nature of the two problems studied, this report is organized in two separate parts. 61 refs., 31 figs., 9 tabs.

  6. Screening and transport in 2D semiconductor systems at low temperatures.

    PubMed

    Das Sarma, S; Hwang, E H

    2015-11-17

    Low temperature carrier transport properties in 2D semiconductor systems can be well understood theoretically within RPA-Boltzmann theory as being limited by scattering from screened Coulomb disorder arising from random quenched charged impurities in the environment. In this work, we derive a number of analytical formulas, supported by realistic numerical calculations, for the relevant density, mobility, and temperature range where 2D transport should manifest strong intrinsic (i.e., arising purely from electronic effects) metallic temperature dependence in different semiconductor materials arising entirely from the 2D screening properties, thus providing an explanation for why the strong temperature dependence of the 2D resistivity can only be observed in high-quality and low-disorder 2D samples and also why some high-quality 2D materials manifest much weaker metallicity than other materials. We also discuss effects of interaction and disorder on the 2D screening properties in this context, and compare 2D and 3D screening functions to explain why such a strong intrinsic temperature dependence arising from screening cannot occur in 3D metallic carrier transport. Experimentally verifiable predictions are made about the quantitative magnitude of the maximum possible low-temperature metallicity in 2D systems and the scaling behavior of the temperature scale controlling the quantum to classical crossover.
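
For orientation, the screening physics underlying this argument can be stated compactly. At zero temperature the RPA dielectric function of a 2D electron gas reduces to the Thomas-Fermi form, with a screening wavevector set entirely by the density of states and hence independent of carrier density; its strong thermal weakening is what drives the metallic temperature dependence discussed above. These are standard textbook expressions (Gaussian units), not formulas taken from the paper:

```latex
\varepsilon(q) = 1 + \frac{q_{\mathrm{TF}}}{q}, \qquad
q_{\mathrm{TF}} = \frac{2\pi e^{2}}{\kappa}\, D(E_F) = \frac{g_s g_v\, m\, e^{2}}{\kappa \hbar^{2}},
```

where $g_s$ and $g_v$ are the spin and valley degeneracies, $m$ the effective mass, $\kappa$ the background dielectric constant, and $D(E_F)$ the (constant) 2D density of states. In 3D, by contrast, the screening wavevector grows with carrier density and its leading thermal corrections are much weaker, which is why the analogous intrinsic metallicity does not arise there.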

  7. Screening and transport in 2D semiconductor systems at low temperatures

    PubMed Central

    Das Sarma, S.; Hwang, E. H.

    2015-01-01

    Low temperature carrier transport properties in 2D semiconductor systems can be theoretically well-understood within RPA-Boltzmann theory as being limited by scattering from screened Coulomb disorder arising from random quenched charged impurities in the environment. In this work, we derive a number of analytical formula, supported by realistic numerical calculations, for the relevant density, mobility, and temperature range where 2D transport should manifest strong intrinsic (i.e., arising purely from electronic effects) metallic temperature dependence in different semiconductor materials arising entirely from the 2D screening properties, thus providing an explanation for why the strong temperature dependence of the 2D resistivity can only be observed in high-quality and low-disorder 2D samples and also why some high-quality 2D materials manifest much weaker metallicity than other materials. We also discuss effects of interaction and disorder on the 2D screening properties in this context as well as compare 2D and 3D screening functions to comment why such a strong intrinsic temperature dependence arising from screening cannot occur in 3D metallic carrier transport. Experimentally verifiable predictions are made about the quantitative magnitude of the maximum possible low-temperature metallicity in 2D systems and the scaling behavior of the temperature scale controlling the quantum to classical crossover. PMID:26572738

  8. Computing 2D constrained delaunay triangulation using the GPU.

    PubMed

    Qi, Meng; Cao, Thanh-Tung; Tan, Tiow-Seng

    2013-05-01

    We propose the first graphics processing unit (GPU) solution to compute the 2D constrained Delaunay triangulation (CDT) of a planar straight line graph (PSLG) consisting of points and edges. There are many existing CPU algorithms to solve the CDT problem in computational geometry, yet there has been no prior approach to solve this problem efficiently using the parallel computing power of the GPU. For the special case of the CDT problem where the PSLG consists of just points, which is simply the normal Delaunay triangulation (DT) problem, a hybrid approach using the GPU together with the CPU to partially speed up the computation has already been presented in the literature. Our work, on the other hand, accelerates the entire computation on the GPU. Our implementation using the CUDA programming model on NVIDIA GPUs is numerically robust, and runs up to an order of magnitude faster than the best sequential implementations on the CPU. This result is reflected in our experiment with both randomly generated PSLGs and real-world GIS data having millions of points and edges.
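
The GPU implementation itself is not reproduced here, but the unconstrained special case the abstract mentions (plain Delaunay triangulation of a point set) is easy to experiment with on the CPU. The sketch below uses SciPy's Qhull wrapper, which, note, does not handle constraint edges:

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
pts = rng.random((1000, 2))      # a random planar point set (no constraint edges)
tri = Delaunay(pts)              # CPU Delaunay triangulation via Qhull

# Each row of tri.simplices lists the three point indices of one triangle.
print(tri.simplices.shape[1])    # -> 3
# Point location: find_simplex returns the containing triangle (or -1 outside the hull).
print(tri.find_simplex([0.5, 0.5]) >= 0)
```

For the constrained case (points plus prescribed edges), CPU libraries such as Triangle implement CDT; the paper's contribution is moving the entire computation, constraints included, onto the GPU.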

  9. Preconditioning 2D Integer Data for Fast Convex Hull Computations.

    PubMed

    Cadenas, José Oswaldo; Megson, Graham M; Luengo Hendriks, Cris L

    2016-01-01

    In order to accelerate computing the convex hull on a set of n points, a heuristic procedure is often applied to reduce the number of points to a set of s points, s ≤ n, which also contains the same hull. We present an algorithm to precondition 2D data with integer coordinates bounded by a box of size p × q before building a 2D convex hull, with three distinct advantages. First, we prove that under the condition min(p, q) ≤ n the algorithm executes in time within O(n); second, no explicit sorting of data is required; and third, the reduced set of s points forms a simple polygonal chain and thus can be directly pipelined into an O(n) time convex hull algorithm. This paper empirically evaluates and quantifies the speed up gained by preconditioning a set of points by a method based on the proposed algorithm before using common convex hull algorithms to build the final hull. A speedup factor of at least four is consistently found from experiments on various datasets when the condition min(p, q) ≤ n holds; the smaller the ratio min(p, q)/n is in the dataset, the greater the speedup factor achieved.
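
A minimal sketch of the preconditioning idea, under the assumption that the reduction keeps only the y-extremes of each integer x-column: every discarded point is a convex combination of the two kept ones, so the hull is preserved. The published algorithm additionally emits the survivors as a simple polygonal chain; this sketch only demonstrates the reduction and checks hull equality:

```python
import numpy as np
from scipy.spatial import ConvexHull

def precondition(points):
    """O(n) reduction: keep only the lowest and highest point of each x-column."""
    ymin, ymax = {}, {}
    for x, y in points:
        if x not in ymin or y < ymin[x]:
            ymin[x] = y
        if x not in ymax or y > ymax[x]:
            ymax[x] = y
    # With coordinates bounded by p, bucketing over 0..p-1 would avoid this sort.
    kept = {(x, ymin[x]) for x in ymin} | {(x, ymax[x]) for x in ymax}
    return np.array(sorted(kept))

rng = np.random.default_rng(1)
pts = rng.integers(0, 64, size=(5000, 2))   # p = q = 64, far below n = 5000
red = precondition(pts)
print(len(red) <= 2 * 64)                   # at most 2 survivors per column -> True
# The reduced set has the same convex hull (compare areas):
print(np.isclose(ConvexHull(pts).volume, ConvexHull(red).volume))
```

Since at most 2p points survive, any downstream O(s log s) hull algorithm runs on a drastically smaller input, which is where the observed speedup comes from.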

  10. Prestack depth migration for complex 2D structure using phase-screen propagators

    SciTech Connect

    Roberts, P.; Huang, Lian-Jie; Burch, C.; Fehler, M.; Hildebrand, S.

    1997-11-01

    We present results for the phase-screen propagator method applied to prestack depth migration of the Marmousi synthetic data set. The data were migrated as individual common-shot records and the resulting partial images were superposed to obtain the final complete image. Tests were performed to determine the minimum number of frequency components required to achieve the best quality image, and this in turn provided estimates of the minimum computing time. Running on a single processor SUN SPARC Ultra I, high quality images were obtained in as little as 8.7 CPU hours and adequate images were obtained in as little as 4.4 CPU hours. Different methods were tested for choosing the reference velocity used for the background phase-shift operation and for defining the slowness perturbation screens. Although the depths of some of the steeply dipping, high-contrast features were shifted slightly, the overall image quality was fairly insensitive to the choice of the reference velocity. Our tests show the phase-screen method to be a reliable and fast algorithm for imaging complex geologic structures, at least for complex 2D synthetic data where the velocity model is known.

  11. Efficient screening of 2D molecular polymorphs at the solution-solid interface

    NASA Astrophysics Data System (ADS)

    Lee, Shern-Long; Adisoejoso, Jinne; Fang, Yuan; Tahara, Kazukuni; Tobe, Yoshito; Mali, Kunal S.; de Feyter, Steven

    2015-03-01

    Formation of multiple polymorphs during two-dimensional (2D) crystallization of organic molecules is more of a routine occurrence than a rarity. Although such diverse crystalline structures provide exciting possibilities for studying crystal engineering in 2D, predicting the occurrence of polymorphs for a given building block is often non-trivial. Moreover, there is a scarcity of methods that can experimentally verify the presence of such crystalline polymorphs in a straightforward fashion. Here we demonstrate a relatively simple experimental approach for screening of 2D polymorphs formed at the solution-solid interface. The strategy involves the use of solution flow, produced by contacting a piece of tissue paper to the sample, to generate a lateral density gradient along the substrate surface. In situ generation of such a gradient allows rapid discovery and nanoscale separation of multiple 2D polymorphs in a single experiment. The concept is demonstrated using three structurally different building blocks that differ in terms of the intermolecular interactions responsible for 2D crystal formation. The method described here represents a powerful tool for efficient screening of 2D polymorphs formed at the solution-solid interface.

  12. Comparison of 2D versus 3D mammography with screening cases: an observer study

    NASA Astrophysics Data System (ADS)

    Fernandez, James Reza; Deshpande, Ruchi; Hovanessian-Larsen, Linda; Liu, Brent

    2012-02-01

    Breast cancer is the most common type of non-skin cancer in women. 2D mammography is a screening tool to aid in the early detection of breast cancer, but has the diagnostic limitation of overlapping tissues, especially in dense breasts. 3D mammography has the potential to improve detection outcomes by increasing specificity, and a new 3D screening tool with a 3D display for mammography aims to improve performance and efficiency as compared to 2D mammography. An observer study using collected human cases was performed to compare traditional 2D mammography with this new 3D mammography technique. A prior study using a mammography phantom revealed no difference in calcification detection, but improved mass detection in 2D as compared to 3D. There was, however, a significant decrease in reading time for masses, calcifications, and normals in 3D compared to 2D, as well as more favorable confidence levels in reading normal cases. Data for the current study are still being collected, and a full report should be available in the next few weeks.

  13. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates.

    PubMed

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two-dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions, and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include:
    • The implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms.
    • The modular approach, along with the lookup tables implemented, helps avoid the indeterminate results which may occur when attempting to evaluate the transform directly.
    • The concept also helps prevent unnecessary computation of already-known transforms, thereby saving memory and processing time.
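
The two "significantly simpler transforms" referred to are, in the standard decomposition, an angular Fourier series followed by Hankel transforms of each angular order: F_n(ρ) = 2π(−i)^n ∫ f_n(r) J_n(ρr) r dr. A hedged numerical sketch (this document's toolbox is symbolic; the quadrature below is only for illustration) for the radially symmetric n = 0 case, checked against the known Gaussian transform pair:

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.special import jv

def hankel0(f, rho, r):
    """Order-0 Hankel transform of samples f(r) by direct trapezoid quadrature."""
    return trapezoid(f * jv(0, rho * r) * r, r)

r = np.linspace(0.0, 20.0, 4001)
f = np.exp(-r**2 / 2)                    # radially symmetric input f(r)
rho = 1.5
F = 2 * np.pi * hankel0(f, rho, r)       # 2D Fourier transform of a radial function
exact = 2 * np.pi * np.exp(-rho**2 / 2)  # known pair: Gaussian -> Gaussian
print(abs(F - exact) < 1e-4)             # -> True
```

For a general f(r, θ) one would first extract the angular coefficients f_n(r) and sum the transformed orders; the symbolic toolbox performs the analogous steps in closed form.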

  14. Long ranged interactions in computer simulations and for quasi-2D systems

    NASA Astrophysics Data System (ADS)

    Mazars, Martial

    2011-03-01

    Correctly accounting for long-ranged interactions in molecular simulations of classical atomistic models is essential to obtain reliable results on model systems and in simulations of biological systems. Many numerical methods have been developed to this end; the most important of them are reviewed in this paper. Particular attention is paid to the analytical relations between the methods, which allows comparisons of their efficiency and accuracy and also supports precise implementations of these techniques. While most of the methods have been developed for Coulomb interactions, we also give some analytical details for applying these methods to screened Coulomb (Yukawa), inverse power law, and dipolar interactions. A particular class of systems, quasi-2D systems, is also considered in this paper. Quasi-2D systems represent a large class of physical systems in which the spatial extension in one direction is very small by comparison to the other two. These representations are very useful to describe the properties of interfaces, surfaces, fluids confined in slab geometry, etc. In computer simulations, these systems are studied with partial periodic boundary conditions: periodic boundary conditions are applied in the directions with large spatial extension, and other boundary conditions in the direction with smaller extension. In this review, we also describe the numerical methods developed to handle long-ranged interactions in numerical simulations of quasi-2D systems. The properties of quasi-2D systems depend strongly on the interactions between components; electrostatic and magnetic interactions, and interactions with external fields, are of particular interest in these systems.

  15. 2D focal-field aberration dependence on time/phase screen position and correlation lengths

    NASA Astrophysics Data System (ADS)

    Näsholm, Sven Peter

    2004-05-01

    For high-frequency annular array transducers used in medical ultrasound imaging, aberrations due to tissue and body wall have a significant effect on energy transfer from the main lobe to the sidelobes of the acoustic field: that is, the aberrations make the total sidelobe level increase. This effect makes the ultrasound image poor when imaging heterogeneous organs. This study performs an analysis of the focal-field quality as a function of time/phase screen z position and time/phase screen correlation length. It establishes some rules of thumb which indicate when the focal-field sidelobe energy is at its highest. It also introduces a simple screen-scaling model which is useful as long as the screen position is not closer to the focus than a certain limit distance. The scaling model allows the real screen at a depth z = z_screen to be treated as a scaled screen at the position z = z_transd. 2D sound fields after 3D propagation from the annular arrays to the focal plane have been simulated using an angular spectrum method. The aberrators are represented by amplitude and phase/time screens.

  16. Pharmacophore, QSAR, and binding mode studies of substrates of human cytochrome P450 2D6 (CYP2D6) using molecular docking and virtual mutations and an application to chinese herbal medicine screening.

    PubMed

    Mo, Sui-Lin; Liu, Wei-Feng; Li, Chun-Guang; Zhou, Zhi-Wei; Luo, Hai-Bin; Chew, Helen; Liang, Jun; Zhou, Shu-Feng

    2012-07-01

    matched our pharmacophore model for CYP2D6 substrates. Fifty-four of these 60 compounds could be docked into the active site of CYP2D6, and 24 of the 54 compounds formed hydrogen bonds with Glu216, Asp301, Ser304, and Ala305 in CYP2D6. These results provide further insights into the factors determining the binding modes of substrates to CYP2D6. Screening herbal formulas for high-affinity CYP2D6 ligands using computational models is a useful approach to identify potential herb-drug interactions.

  17. Cytochrome P450-2D6 Screening Among Elderly Using Antidepressants (CYSCE)

    ClinicalTrials.gov

    2016-10-24

    Depression; Depressive Disorder; Poor Metabolizer Due to Cytochrome P450 CYP2D6 Variant; Intermediate Metabolizer Due to Cytochrome P450 CYP2D6 Variant; Ultrarapid Metabolizer Due to Cytochrome P450 CYP2D6 Variant

  18. Completeness of the classical 2D Ising model and universal quantum computation.

    PubMed

    Van den Nest, M; Dür, W; Briegel, H J

    2008-03-21

    We prove that the 2D Ising model is complete in the sense that the partition function of any classical q-state spin model (on an arbitrary graph) can be expressed as a special instance of the partition function of a 2D Ising model with complex inhomogeneous couplings and external fields. In the case where the original model is an Ising or Potts-type model, we find that the corresponding 2D square lattice requires only polynomially more spins with respect to the original one, and we give a constructive method to map such models to the 2D Ising model. For more general models the overhead in system size may be exponential. The results are established by connecting classical spin models with measurement-based quantum computation and invoking the universality of the 2D cluster states.
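
The completeness statement concerns partition functions, which for tiny lattices can be enumerated directly. The brute-force sketch below uses standard Ising definitions (not the paper's mapping) and cross-checks the 2 × 2 open-boundary lattice against a hand computation:

```python
import itertools
import math

import numpy as np

def ising_Z(L, beta, J=1.0):
    """Partition function of an L x L Ising model with free boundaries,
    E(s) = -J * sum over nearest-neighbour pairs of s_i s_j.
    Brute force over 2**(L*L) configurations -- tiny L only."""
    Z = 0.0
    for spins in itertools.product((-1, 1), repeat=L * L):
        s = np.array(spins).reshape(L, L)
        E = -J * (np.sum(s[:-1, :] * s[1:, :]) + np.sum(s[:, :-1] * s[:, 1:]))
        Z += math.exp(-beta * E)
    return Z

beta = 0.4
# The 2x2 open lattice is a single plaquette; summing its 16 configurations
# by hand gives Z = 2*exp(4*beta*J) + 2*exp(-4*beta*J) + 12.
exact = 2 * math.exp(4 * beta) + 2 * math.exp(-4 * beta) + 12
print(abs(ising_Z(2, beta) - exact) < 1e-9)  # -> True
```

The paper's result is that sums of this kind for *any* classical q-state spin model can be recast as a 2D Ising partition function of this form, at the cost of complex inhomogeneous couplings and, in general, a larger lattice.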

  19. GEO2D - Two-Dimensional Computer Model of a Ground Source Heat Pump System

    SciTech Connect

    James Menart

    2013-06-07

    This file contains a zipped archive of the many files required to run GEO2D. GEO2D is a computer code for simulating ground source heat pump (GSHP) systems in two dimensions. GEO2D performs a detailed finite difference simulation of the heat transfer occurring within the working fluid, the tube wall, the grout, and the ground. Both horizontal and vertical wells can be simulated with this program, but it should be noted that the vertical well is modeled as a single tube. The program also models the heat pump in conjunction with this heat transfer, simulating the heat pump and ground loop as a system. Many results are produced by GEO2D as a function of time and position, such as heat transfer rates, temperatures, and heat pump performance. On top of this, GEO2D provides an economic comparison between the simulated geothermal system and a comparable air heat pump system, or comparable gas, oil, or propane heating systems with a vapor-compression air conditioner. The version of GEO2D in the attached file has been coupled to the DOE heating and cooling load software ENERGYPLUS, a great convenience for the user because heating and cooling loads are an input to GEO2D. GEO2D is a user-friendly program that uses a graphical user interface for inputs and outputs, which makes entering data simple and produces many plotted results that are easy to understand. In order to run GEO2D, access to MATLAB is required. If MATLAB is not available on your computer, you can download the 64-bit version of MCRInstaller.exe from the MATLAB website or from this geothermal depository. This free download will enable you to run GEO2D.

  20. Generation and validation of rapid computational filters for CYP2D6 and CYP3A4.

    PubMed

    Ekins, Sean; Berbaum, Jennifer; Harrison, Richard K

    2003-09-01

    CYP2D6 and CYP3A4 represent two particularly important members of the cytochrome P450 enzyme family due to their involvement in the metabolism of many commercially available drugs. Avoiding potent inhibitory interactions with both of these enzymes is highly desirable in early drug discovery, long before entering clinical trials, so computational prediction of this liability as early as possible is desired. Using a commercially available data set of over 1750 molecules to train computer models generated with commercially available software enabled predictions of inhibition for CYP2D6 and CYP3A4, which were compared with empirical data. The results suggest that using a recursive partitioning (tree) technique with augmented atom descriptors enables a statistically significant rank ordering of test-set molecules (Spearman's rho of 0.61 and 0.48 for CYP2D6 and CYP3A4, respectively), which represents an increased rate of identifying the best compounds when compared with the random rate. This approach represents a valuable computational filter in early drug discovery to identify compounds that may have P450 inhibition liabilities prior to molecule synthesis. Such computational filters offer a new approach in which lead optimization in silico can occur with virtual molecules simultaneously tested against multiple enzymes implicated in drug-drug interactions, with a resultant cost savings from a decreased level of molecule synthesis and in vitro screening.
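
As an illustration of the kind of pipeline described (a regression tree scored by Spearman rank correlation on a held-out set), the sketch below uses synthetic stand-in data; the descriptors, response values, and split sizes are invented for the example and are not the paper's:

```python
import numpy as np
from scipy.stats import spearmanr
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.random((1750, 16))          # stand-in for augmented-atom descriptors
y = 2 * X[:, 0] + X[:, 1] + 0.3 * rng.standard_normal(1750)  # stand-in inhibition data
X_train, X_test = X[:1400], X[1400:]
y_train, y_test = y[:1400], y[1400:]

# Recursive partitioning: a depth-limited regression tree.
tree = DecisionTreeRegressor(max_depth=6, random_state=0).fit(X_train, y_train)
rho, _ = spearmanr(tree.predict(X_test), y_test)   # rank-ordering quality of the filter
print(f"Spearman rho on held-out set: {rho:.2f}")
```

Rank correlation, rather than absolute error, is the natural metric here because the filter's job is to prioritize which compounds to synthesize, not to predict exact IC50 values.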

  1. 8. DETAIL OF COMPUTER SCREEN AND CONTROL BOARDS: LEFT SCREEN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. DETAIL OF COMPUTER SCREEN AND CONTROL BOARDS: LEFT SCREEN TRACKS RESIDUAL CHLORINE; INDICATES AMOUNT OF SUNLIGHT WHICH ENABLES OPERATOR TO ESTIMATE NEEDED CHLORINE; CENTER SCREEN SHOWS TURNOUT STRUCTURES; RIGHT SCREEN SHOWS INDICATORS OF ALUMINUM SULFATE TANK FARM. - F. E. Weymouth Filtration Plant, 700 North Moreno Avenue, La Verne, Los Angeles County, CA

  2. CAST2D: A finite element computer code for casting process modeling

    SciTech Connect

    Shapiro, A.B.; Hallquist, J.O.

    1991-10-01

    CAST2D is a coupled thermal-stress finite element computer code for casting process modeling. This code can be used to predict the final shape and stress state of cast parts. CAST2D couples the heat transfer code TOPAZ2D and solid mechanics code NIKE2D. CAST2D has the following features in addition to all the features contained in the TOPAZ2D and NIKE2D codes: (1) a general purpose thermal-mechanical interface algorithm (i.e., slide line) that calculates the thermal contact resistance across the part-mold interface as a function of interface pressure and gap opening; (2) a new phase change algorithm, the delta function method, that is a robust method for materials undergoing isothermal phase change; (3) a constitutive model that transitions between fluid behavior and solid behavior, and accounts for material volume change on phase change; and (4) a modified plot file data base that allows plotting of thermal variables (e.g., temperature, heat flux) on the deformed geometry. Although the code is specialized for casting modeling, it can be used for other thermal stress problems (e.g., metal forming).

  3. Synchronous two-dimensional MIR correlation spectroscopy (2D-COS) as a novel method for screening smoke tainted wine.

    PubMed

    Fudge, Anthea L; Wilkinson, Kerry L; Ristic, Renata; Cozzolino, Daniel

    2013-08-15

    In this study, two-dimensional correlation spectroscopy (2D-COS) combined with mid-infrared (MIR) spectroscopy was evaluated as a novel technique for the identification of spectral regions associated with smoke-affected wine, for the purpose of screening for taint arising from grapevine exposure to smoke. Smoke-affected wines obtained from experimental and industry sources were analysed using MIR spectroscopy and chemometrics, and calibration models were developed. 2D-COS analysis was used to generate synchronous data maps for red and white cask wines spiked with guaiacol, a marker of smoke taint. Correlations were observed at wavelengths attributable to aromatic C-C stretching, i.e., between 1400 and 1500 cm⁻¹, indicative of volatile phenols. These results demonstrate the potential of 2D-COS as a rapid, high-throughput technique for the preliminary screening of smoke tainted wine.
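
The synchronous map at the heart of 2D-COS is simply the covariance of the mean-centred (dynamic) spectra across the perturbation series (Noda's formalism); bands that rise and fall together, such as those tracking the guaiacol spike level, produce positive cross peaks. A minimal sketch on toy spectra (band positions and widths are invented for the example):

```python
import numpy as np

def synchronous_2dcos(X):
    """Noda's synchronous 2D correlation spectrum.
    X: (m, n) array of m perturbation-dependent spectra over n wavenumber channels."""
    Xc = X - X.mean(axis=0)             # dynamic (mean-centred) spectra
    return Xc.T @ Xc / (X.shape[0] - 1)

# Toy series: two Gaussian bands that both grow with the perturbation level.
m, n = 8, 100
level = np.linspace(0.0, 1.0, m)[:, None]
wn = np.arange(n)
band = lambda c: np.exp(-0.5 * ((wn - c) / 3.0) ** 2)
X = level * band(30) + level * band(70)

phi = synchronous_2dcos(X)
print(phi[30, 70] > 0)      # positive cross peak: the two bands co-vary -> True
print(np.allclose(phi, phi.T))  # the synchronous map is symmetric -> True
```

In the wine application, the rows of X would be MIR spectra of wines with increasing guaiacol spike levels, and the diagonal of the map highlights the 1400-1500 cm⁻¹ region noted above.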

  4. Bacterial contamination of computer touch screens.

    PubMed

    Gerba, Charles P; Wuollet, Adam L; Raisanen, Peter; Lopez, Gerardo U

    2016-03-01

    The goal of this study was to determine the occurrence of opportunistic bacterial pathogens on the surfaces of computer touch screens used in hospitals and grocery stores. Opportunistic pathogenic bacteria were isolated on touch screens in hospitals (Clostridium difficile and vancomycin-resistant Enterococcus) and in grocery stores (methicillin-resistant Staphylococcus aureus). Enteric bacteria were more common on grocery store touch screens than on hospital computer touch screens.

  5. Hierarchy of universal entanglement in 2D measurement-based quantum computation

    NASA Astrophysics Data System (ADS)

    Miller, Jacob; Miyake, Akimasa

    2016-11-01

    Measurement-based quantum computation (MQC) is a paradigm for studying quantum computation using many-body entanglement and single-qubit measurements. Although MQC has inspired wide-ranging discoveries throughout quantum information, our understanding of the general principles underlying MQC seems to be biased by its historical reliance upon the archetypal 2D cluster state. Here we utilise recent advances in the subject of symmetry-protected topological order (SPTO) to introduce a novel MQC resource state, whose physical and computational behaviour differs fundamentally from that of the cluster state. We show that, in sharp contrast to the cluster state, our state enables universal quantum computation using only measurements of single-qubit Pauli X, Y, and Z operators. This novel computational feature is related to the 'genuine' 2D SPTO possessed by our state, and which is absent in the cluster state. Our concrete connection between the latent computational complexity of many-body systems and macroscopic quantum orders may find applications in quantum many-body simulation for benchmarking classically intractable complexity.

  6. Numerical Simulation of Slinger Combustor Using 2-D Axisymmetric Computational Model

    NASA Astrophysics Data System (ADS)

    Lee, Semin; Park, Soo Hyung; Lee, Donghun

    2010-06-01

    Small-size turbojet engines have difficulties in maintaining the chemical reaction due to the limitation of chamber size. The combustion chamber is generally designed to improve the reaction efficiency by generating vortices in the chamber and to enhance air-fuel mixing characteristics. In the initial stage of designing the combustor, analysis of the full 3-D configuration is not practical because of the time-consuming computation and the grid regeneration required after each geometry modification. In the present paper, an axisymmetric model that maintains the geometric similarity and flow characteristics of the 3-D configuration is developed. Based on numerical results from the full 3-D configuration, the model is reduced to a 2-D axisymmetric configuration. In the modeling process, the area and location of each hole in the 3-D full configuration are accounted for and mapped to the 2-D axisymmetric model. Using the 2-D axisymmetric model, factors that can affect performance are investigated under the assumption that the flow is non-reacting and turbulent. Numerical results from the present model agree well with those from the full 3-D configuration, reproducing features such as the vortex pair in the forward region and the total pressure loss. By simplifying the complex 3-D model, computing time is remarkably reduced, making it easy to assess the effects of geometry modifications.
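A common way to map discrete combustor holes onto an axisymmetric model is to replace each ring of holes by an annular slot that conserves total flow area at the same radius. A sketch of that bookkeeping (an assumed reduction rule for illustration, not necessarily the paper's exact one):

```python
import math

def equivalent_slot_width(n_holes, hole_diameter, ring_radius):
    """Width of an annular slot that conserves the total area of
    n_holes discrete holes located at ring_radius -- a typical
    3-D-to-axisymmetric reduction for combustor liner holes."""
    total_area = n_holes * math.pi * (hole_diameter / 2.0) ** 2
    return total_area / (2.0 * math.pi * ring_radius)

# 24 holes of 3 mm diameter on a 60 mm radius ring (assumed values):
w = equivalent_slot_width(24, 0.003, 0.060)
```

The slot conserves mass-flow area, so the 2-D model sees the same injection at each axial station as the 3-D liner.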

  7. Automatic computation of 2D cardiac measurements from B-mode echocardiography

    NASA Astrophysics Data System (ADS)

    Park, JinHyeong; Feng, Shaolei; Zhou, S. Kevin

    2012-03-01

    We propose a robust and fully automatic algorithm that computes the 2D echocardiography measurements recommended by the American Society of Echocardiography. The algorithm employs knowledge-based imaging technologies which learn from training images and expert annotations. Based on the models constructed in the learning stage, the algorithm searches for the initial locations of the measurement landmark points by utilizing the structure of the left ventricle, including the mitral and aortic valves. It then refines the landmark points using a pseudo-anatomic M-mode image generated by accumulating line images from the 2D parasternal long-axis view over time. Experimental results on a large volume of data show that the algorithm runs fast and is robust, with accuracy comparable to that of an expert.

  8. Fast Acceleration of 2D Wave Propagation Simulations Using Modern Computational Accelerators

    PubMed Central

    Wang, Wei; Xu, Lifan; Cavazos, John; Huang, Howie H.; Kay, Matthew

    2014-01-01

    Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case-study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than speedup above the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs of at least faster than the sequential implementation and faster than a parallelized OpenMP implementation. An implementation of OpenMP on Intel MIC coprocessor provided speedups of with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other computational models of wave propagation in

  9. Computation of nozzle flow fields using the PARC2D Navier-Stokes code

    NASA Technical Reports Server (NTRS)

    Collins, Frank G.

    1986-01-01

    Supersonic nozzles which operate at low Reynolds numbers and have large expansion ratios have very thick boundary layers at their exit. This leads to a very strong viscous/inviscid interaction upon the flow within the nozzle and the traditional nozzle design techniques which correct the inviscid core with a boundary layer displacement do not accurately predict the nozzle exit conditions. A full Navier-Stokes code (PARC2D) was used to compute the nozzle flow field. Grids were generated using the interactive grid generator code TBGG. All computations were made on the NASA MSFC CRAY X-MP computer. Comparison was made between the computations and in-house wall pressure measurements for CO2 flow through a conical nozzle having an area ratio of 40. Satisfactory agreement existed between the computations and measurements for a stagnation pressure of 29.4 psia and stagnation temperature of 1060 R. However, agreement did not exist at a stagnation pressure of 7.4 psia. Several reasons for the lack of agreement are possible. The computational code assumed a constant gas gamma whereas gamma for CO2 varied from 1.22 in the plenum chamber to 1.38 at the nozzle exit. Finally, it is possible that condensation occurred during the expansion at the lower stagnation pressure.
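The sensitivity to gamma noted above is easy to quantify with the isentropic area-Mach relation: the same exit Mach number implies very different expansion ratios at the plenum value (1.22) versus the exit value (1.38). A sketch:

```python
def area_ratio(mach, gamma):
    """Isentropic area ratio A/A* as a function of Mach number and
    the ratio of specific heats gamma."""
    t = 1.0 + 0.5 * (gamma - 1.0) * mach ** 2
    return (1.0 / mach) * (2.0 * t / (gamma + 1.0)) ** (
        (gamma + 1.0) / (2.0 * (gamma - 1.0)))

# Expansion ratio needed to reach Mach 5 at the two limiting gammas:
ratio_low_gamma, ratio_high_gamma = area_ratio(5.0, 1.22), area_ratio(5.0, 1.38)
```

The factor-of-several difference between the two ratios illustrates why a constant-gamma code can disagree with measurements when the real gas gamma varies along the nozzle.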

  10. Computationally Efficient 2D DOA Estimation with Uniform Rectangular Array in Low-Grazing Angle

    PubMed Central

    Shi, Junpeng; Hu, Guoping; Zhang, Xiaofei; Sun, Fenggang; Xiao, Yu

    2017-01-01

    In this paper, we propose a computationally efficient spatial differencing matrix set (SDMS) method for two-dimensional direction of arrival (2D DOA) estimation with uniform rectangular arrays (URAs) in a low-grazing angle (LGA) condition. By rearranging the auto-correlation and cross-correlation matrices in turn among different subarrays, the SDMS method can estimate the two parameters independently with one-dimensional (1D) subspace-based estimation techniques, where differencing is applied only to the auto-correlation matrices while the cross-correlation matrices are retained in full. Pair-matching of the two parameters is then achieved by extracting the diagonal elements of the URA. Thus, the proposed method decreases the computational complexity, suppresses the effect of additive noise, and incurs little information loss. Simulation results show that, in LGA, the proposed method achieves performance improvements over other methods under white or colored noise conditions. PMID:28245634
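The 1D subspace-based estimation step that SDMS feeds into can be illustrated with an ordinary MUSIC pseudo-spectrum for a uniform linear array (a generic sketch; the spatial differencing matrices themselves are not reproduced here):

```python
import numpy as np

def music_spectrum(R, n_sources, angles_deg, d=0.5):
    """1-D MUSIC pseudo-spectrum for a uniform linear array.
    R: (M, M) covariance matrix; d: element spacing in wavelengths."""
    M = R.shape[0]
    _, v = np.linalg.eigh(R)              # eigenvalues in ascending order
    En = v[:, : M - n_sources]            # noise subspace
    p = []
    for th in np.deg2rad(angles_deg):
        a = np.exp(-2j * np.pi * d * np.arange(M) * np.sin(th))
        p.append(1.0 / (np.abs(a.conj() @ En @ En.conj().T @ a) + 1e-12))
    return np.array(p)

# One source at 20 degrees, 8-element array, nearly noiseless covariance:
M, theta = 8, np.deg2rad(20.0)
a = np.exp(-2j * np.pi * 0.5 * np.arange(M) * np.sin(theta))
R = np.outer(a, a.conj()) + 0.01 * np.eye(M)
grid = np.arange(-60.0, 60.0, 0.5)
est = grid[int(np.argmax(music_spectrum(R, 1, grid)))]
```

The spectrum peaks where the steering vector is orthogonal to the noise subspace; SDMS builds special difference matrices before this stage so each angle can be searched in 1D.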

  11. A computationally efficient 2D hydraulic approach for global flood hazard modeling

    NASA Astrophysics Data System (ADS)

    Begnudelli, L.; Kaheil, Y.; Sanders, B. F.

    2014-12-01

    We present a physically-based flood hazard model that incorporates two main components: a hydrologic model and a hydraulic model. For hydrology we use TOPNET, a more comprehensive version of the original TOPMODEL. To simulate flood propagation, we use a 2D Godunov-type finite volume shallow water model. Physically-based global flood hazard simulation poses enormous computational challenges stemming from the increasingly fine resolution of available topographic data, which represents the key input. Parallel computing helps to distribute the computational cost, but the computationally-intensive hydraulic model must be made far faster and more agile for global-scale feasibility. Here we present a novel technique for hydraulic modeling whereby the computational grid is much coarser (e.g., 5-50 times) than the available topographic data, but the coarse grid retains the storage and conveyance (cross-sectional area) of the fine-resolution data. This allows the 2D hydraulic model to be run on extremely large domains (e.g., thousands of km2) with a single computational processor, and opens the door to global coverage with parallel computing. The model also downscales the coarse-grid results onto the high-resolution topographic data to produce fine-scale predictions of flood depths and velocities. The model achieves computational speeds typical of very coarse grids while achieving an accuracy expected of a much finer resolution. In addition, the model has potential for assimilation of remotely sensed water elevations, to define boundary conditions based on water levels or river discharges, and to improve model results. The model is applied to two river basins: the Susquehanna River in Pennsylvania and the Ogeechee River in Georgia. The two rivers represent different scales and span a wide range of topographic characteristics. Comparing spatial resolutions ranging between 30 m and 500 m in both river basins, the new technique was able to reduce simulation runtime by at least 25-fold.
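The key idea of a coarse grid that retains fine-grid storage can be sketched as a stage-storage curve built from sub-grid topography (a simplified illustration with assumed toy elevations, not the paper's full conveyance treatment):

```python
import numpy as np

def stage_storage(fine_elev, cell_area, stages):
    """Stage-storage curve for one coarse cell built from fine-scale
    topography: storage at stage h is the water volume held above all
    fine pixels lying below h."""
    z = np.asarray(fine_elev, dtype=float).ravel()
    return np.array([cell_area * np.clip(h - z, 0.0, None).mean()
                     for h in stages])

# One 100 m^2 coarse cell resolved by 4 fine pixels (elevations in m):
vols = stage_storage([[1.0, 2.0], [3.0, 4.0]], cell_area=100.0,
                     stages=[2.0, 4.0])
```

A coarse cell carrying this curve stores the same volume at a given water level as the underlying fine topography, which is how accuracy survives the coarsening.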

  12. Breast density measurement: 3D cone beam computed tomography (CBCT) images versus 2D digital mammograms

    NASA Astrophysics Data System (ADS)

    Han, Tao; Lai, Chao-Jen; Chen, Lingyun; Liu, Xinming; Shen, Youtao; Zhong, Yuncheng; Ge, Shuaiping; Yi, Ying; Wang, Tianpeng; Yang, Wei T.; Shaw, Chris C.

    2009-02-01

    Breast density has been recognized as one of the major risk factors for breast cancer. However, breast density is currently estimated using mammograms which are intrinsically 2D in nature and cannot accurately represent the real breast anatomy. In this study, a novel technique for measuring breast density based on the segmentation of 3D cone beam CT (CBCT) images was developed and the results were compared to those obtained from 2D digital mammograms. 16 mastectomy breast specimens were imaged with a bench top flat-panel based CBCT system. The reconstructed 3D CT images were corrected for the cupping artifacts and then filtered to reduce the noise level, followed by using threshold-based segmentation to separate the dense tissue from the adipose tissue. For each breast specimen, volumes of the dense tissue structures and the entire breast were computed and used to calculate the volumetric breast density. BI-RADS categories were derived from the measured breast densities and compared with those estimated from conventional digital mammograms. The results show that in 10 of 16 cases the BI-RADS categories derived from the CBCT images were lower than those derived from the mammograms by one category. Thus, breasts considered as dense in mammographic examinations may not be considered as dense with the CBCT images. This result indicates that the relation between breast cancer risk and true (volumetric) breast density needs to be further investigated.
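The threshold-based density computation described above reduces to a voxel count. A minimal sketch (toy values; the real pipeline first applies cupping-artifact correction and noise filtering):

```python
import numpy as np

def volumetric_density(volume, threshold):
    """Fraction of voxels classified as dense tissue by a simple
    global threshold on the reconstructed CT values."""
    v = np.asarray(volume)
    return (v > threshold).sum() / v.size

# Toy 2x2x2 volume of CT values; 4 of 8 voxels exceed the threshold:
vol = np.array([[[10, 80], [90, 20]],
                [[75, 15], [85, 5]]])
d = volumetric_density(vol, threshold=50)
```

Multiplying the dense-voxel count by the voxel volume gives the dense-tissue volume used in the volumetric BI-RADS categories.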

  13. Computer program BL2D for solving two-dimensional and axisymmetric boundary layers

    NASA Technical Reports Server (NTRS)

    Iyer, Venkit

    1995-01-01

    This report presents the formulation, validation, and user's manual for the computer program BL2D. The program is a fourth-order-accurate solution scheme for solving two-dimensional or axisymmetric boundary layers in speed regimes that range from low subsonic to hypersonic Mach numbers. A basic implementation of transition-zone and turbulence modeling is also included. The code is the result of many improvements made to the program VGBLP, which is described in NASA TM-83207 (February 1982), and can effectively supersede it. The code BL2D is designed to be modular, user-friendly, and portable to any machine with a standard Fortran 77 compiler. The report contains the new formulation adopted and the details of its implementation. Five validation cases are presented. A detailed user's manual with the input format description and instructions for running the code is included. Adequate information is presented in the report to enable the user to modify or customize the code for specific applications.

  14. Experimental and Computational Study of Multiphase Flow Hydrodynamics in 2D Trickle Bed Reactors

    NASA Astrophysics Data System (ADS)

    Nadeem, H.; Ben Salem, I.; Kurnia, J. C.; Rabbani, S.; Shamim, T.; Sassi, M.

    2014-12-01

    Trickle bed reactors are largely used in the refining processes. Co-current heavy oil and hydrogen gas flow downward on catalytic particle bed. Fine particles in the heavy oil and/or soot formed by the exothermic catalytic reactions deposit on the bed and clog the flow channels. This work is funded by the refining company of Abu Dhabi and aims at mitigating pressure buildup due to fine deposition in the TBR. In this work, we focus on meso-scale experimental and computational investigations of the interplay between flow regimes and the various parameters that affect them. A 2D experimental apparatus has been built to investigate the flow regimes with an average pore diameter close to the values encountered in trickle beds. A parametric study is done for the development of flow regimes and the transition between them when the geometry and arrangement of the particles within the porous medium are varied. Liquid and gas flow velocities have also been varied to capture the different flow regimes. Real time images of the multiphase flow are captured using a high speed camera, which were then used to characterize the transition between the different flow regimes. A diffused light source was used behind the 2D Trickle Bed Reactor to enhance visualizations. Experimental data shows very good agreement with the published literature. The computational study focuses on the hydrodynamics of multiphase flow and to identify the flow regime developed inside TBRs using the ANSYS Fluent Software package. Multiphase flow inside TBRs is investigated using the "discrete particle" approach together with Volume of Fluid (VoF) multiphase flow modeling. The effect of the bed particle diameter, spacing, and arrangement are presented that may be used to provide guidelines for designing trickle bed reactors.

  15. Adiabatic and Hamiltonian computing on a 2D lattice with simple two-qubit interactions

    NASA Astrophysics Data System (ADS)

    Lloyd, Seth; Terhal, Barbara M.

    2016-02-01

    We show how to perform universal Hamiltonian and adiabatic computing using a time-independent Hamiltonian on a 2D grid describing a system of hopping particles which string together and interact to perform the computation. In this construction, the movement of one particle is controlled by the presence or absence of other particles, an effective quantum field effect transistor that allows the construction of controlled-NOT and controlled-rotation gates. The construction translates into a model for universal quantum computation with time-independent two-qubit ZZ and XX+YY interactions on an (almost) planar grid. The effective Hamiltonian is arrived at by a single use of first-order perturbation theory avoiding the use of perturbation gadgets. The dynamics and spectral properties of the effective Hamiltonian can be fully determined as it corresponds to a particular realization of a mapping between a quantum circuit and a Hamiltonian called the space-time circuit-to-Hamiltonian construction. Because of the simple interactions required, and because no higher-order perturbation gadgets are employed, our construction is potentially realizable using superconducting or other solid-state qubits.

  16. Diverse Geological Applications For Basil: A 2d Finite-deformation Computational Algorithm

    NASA Astrophysics Data System (ADS)

    Houseman, Gregory A.; Barr, Terence D.; Evans, Lynn

    Geological processes are often characterised by large finite-deformation continuum strains, on the order of 100% or greater. Microstructural processes cause deformation that may be represented by a viscous constitutive mechanism, with viscosity that may depend on temperature, pressure, or strain-rate. We have developed an effective computational algorithm for the evaluation of 2D deformation fields produced by Newtonian or non-Newtonian viscous flow. With the implementation of this algorithm as a computer program, Basil, we have applied it to a range of diverse applications in Earth Sciences. Viscous flow fields in 2D may be defined for the thin-sheet case or, using a velocity-pressure formulation, for the plane-strain case. Flow fields are represented using 2D triangular elements with quadratic interpolation for velocity components and linear for pressure. The main matrix equation is solved by an efficient and compact conjugate gradient algorithm with iteration for non-Newtonian viscosity. Regular grids may be used, or grids based on a random distribution of points. Definition of the problem requires that velocities, tractions, or some combination of the two, are specified on all external boundary nodes. Compliant boundaries may also be defined, based on the idea that traction is opposed to and proportional to boundary displacement rate. Internal boundary segments, allowing fault-like displacements within a viscous medium, have also been developed, and we find that the computed displacement field around the fault tip is accurately represented for Newtonian and non-Newtonian viscosities, in spite of the stress singularity at the fault tip. Basil has been applied by us and colleagues to problems that include: thin sheet calculations of continental collision, Rayleigh-Taylor instability of the continental mantle lithosphere, deformation fields around fault terminations at the outcrop scale, stress and deformation fields in and around porphyroblasts, and

  17. Comparing 2-D screen projections to 1-D goniometric measurements in scattering studies of surface roughness

    NASA Astrophysics Data System (ADS)

    Jones, Laurel R.; Jacques, Steven L.

    2009-02-01

    Video goniometry was used to study the angular dependence of scattering from tissues and test materials. Tissues and standard roughness samples (sandpaper) were placed vertically in front of a 543 nm He-Ne laser with the tissue surface normal at 45° from the incident beam. The scattered light pattern was projected onto a screen that was photographed by a digital camera. The scatter pattern showed a specular peak centered at -45°, which was described by a Henyey-Greenstein function. The pattern also presented a diffuse Lambertian component at 0° (normal to the tissue). The line between the peak specular and peak Lambertian directions identified the scattering plane, despite any slight misalignment of the tissue. The analysis utilized a coordinate transform based on the mathematics for mapping between a flat Mercator map and a spherical planetary surface. The system was used to study the surface roughness of muscle tissue samples (bovine striated muscle and chicken cardiac muscle).
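The Henyey-Greenstein function used to describe the specular lobe has a simple closed form. A sketch with an assumed anisotropy value (the fitted g for these tissues is not given in the abstract):

```python
import math

def henyey_greenstein(cos_theta, g):
    """Henyey-Greenstein phase function p(cos theta), normalized over
    the full sphere; g in (-1, 1) is the scattering anisotropy."""
    return (1.0 - g * g) / (
        4.0 * math.pi * (1.0 + g * g - 2.0 * g * cos_theta) ** 1.5)

# Forward-peaked lobe for an assumed anisotropy g = 0.7:
forward = henyey_greenstein(1.0, 0.7)    # scattering straight ahead
backward = henyey_greenstein(-1.0, 0.7)  # back-scattering
```

With g = 0 the function reduces to the isotropic value 1/(4 pi); increasing g sharpens the forward peak, which is what fitting g to the measured specular lobe quantifies.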

  18. Computational Study and Analysis of Structural Imperfections in 1D and 2D Photonic Crystals

    SciTech Connect

    Maskaly, Karlene Rosera

    2005-06-01

    increasing RMS roughness. Again, the homogenization approximation is able to predict these results. The problem of surface scratches on 1D photonic crystals is also addressed. Although the reflectivity decreases are lower in this study, up to a 15% change in reflectivity is observed in certain scratched photonic crystal structures. However, this reflectivity change can be significantly decreased by adding a low-index protective coating to the surface of the photonic crystal. Again, application of homogenization theory to these structures confirms its predictive power for this type of imperfection as well. Additionally, the problem of circular pores in 2D photonic crystals is investigated, showing that almost a 50% change in reflectivity can occur for some structures. Furthermore, this study reveals trends that are consistent with the 1D simulations: parameter changes that increase the absolute reflectivity of the photonic crystal will also increase its tolerance to structural imperfections. Finally, experimental reflectance spectra from roughened 1D photonic crystals are compared to the results predicted computationally in this thesis. Both the computed and experimental spectra correlate favorably, validating the findings presented herein.

  19. Craniosynostosis: prenatal diagnosis by 2D/3D ultrasound, magnetic resonance imaging and computed tomography.

    PubMed

    Helfer, Talita Micheletti; Peixoto, Alberto Borges; Tonni, Gabriele; Araujo Júnior, Edward

    2016-09-01

    Craniosynostosis is defined as the process of premature fusion of one or more of the cranial sutures. It is a common condition that occurs in about 1 in 2,000 live births. Craniosynostosis may be classified as primary or secondary. It is also classified as nonsyndromic or syndromic. According to suture commitment, craniosynostosis may affect a single suture or multiple sutures. There is a wide range of syndromes involving craniosynostosis, the most common being Apert, Pfeiffer, Crouzon, Saethre-Chotzen and Muenke syndromes. The underlying etiology of nonsyndromic craniosynostosis is unknown. Mutations in the fibroblast growth factor (FGF) signalling pathway play a crucial role in the etiology of craniosynostosis syndromes. Prenatal ultrasound's detection rate of craniosynostosis is low. Nowadays, different methods can be applied for prenatal diagnosis of craniosynostosis, such as two-dimensional (2D) and three-dimensional (3D) ultrasound, magnetic resonance imaging (MRI), computed tomography (CT) scan and, finally, molecular diagnosis. The presence of craniosynostosis may affect the birthing process. Fetuses with craniosynostosis also have higher rates of perinatal complications. In order to avoid the risks of untreated craniosynostosis, children are usually treated surgically soon after postnatal diagnosis.

  20. An algorithm for computing the 2D structure of fast rotating stars

    SciTech Connect

    Rieutord, Michel; Espinosa Lara, Francisco; Putigny, Bertrand

    2016-08-01

    Stars may be understood as self-gravitating masses of a compressible fluid whose radiative cooling is compensated by nuclear reactions or gravitational contraction. The understanding of their time evolution requires the use of detailed models that account for a complex microphysics including that of opacities, equation of state and nuclear reactions. The present stellar models are essentially one-dimensional, namely spherically symmetric. However, the interpretation of recent data like the surface abundances of elements or the distribution of internal rotation has reached the limits of validity of one-dimensional models because of their very simplified representation of large-scale fluid flows. In this article, we describe the ESTER code, which is the first code able to compute in a consistent way a two-dimensional model of a fast rotating star including its large-scale flows. Compared to classical 1D stellar evolution codes, many numerical innovations have been introduced to deal with this complex problem. First, a spectral discretization based on spherical harmonics and Chebyshev polynomials is used to represent the 2D axisymmetric fields. A nonlinear mapping of the spheroidal star allows a smooth spectral representation of the fields. The properties of Picard and Newton iterations for solving the nonlinear partial differential equations of the problem are discussed. It turns out that the Picard scheme is efficient for computing simple polytropic stars, but the Newton algorithm is unsurpassed when stellar models include complex microphysics. Finally, we discuss the numerical efficiency of our solver of Newton iterations. This linear solver combines the iterative Conjugate Gradient Squared algorithm with an LU factorization serving as a preconditioner of the Jacobian matrix.
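The Picard-versus-Newton trade-off discussed above can be seen on a scalar toy problem: Picard (fixed-point) iteration is simple but only linearly convergent, while Newton converges quadratically near the root. A sketch (not the ESTER equations):

```python
import math

def picard(g, x0, tol=1e-12, max_iter=200):
    """Fixed-point (Picard) iteration x_{k+1} = g(x_k)."""
    x = x0
    for _ in range(max_iter):
        xn = g(x)
        if abs(xn - x) < tol:
            return xn
        x = xn
    return x

def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton iteration x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        x = x - f(x) / df(x)
        if abs(f(x)) < tol:
            return x
    return x

# Both solve x = cos(x); Newton needs far fewer iterations:
r_picard = picard(math.cos, 1.0)
r_newton = newton(lambda x: x - math.cos(x),
                  lambda x: 1.0 + math.sin(x), 1.0)
```

The same structure carries over to PDE solvers: Picard lags the nonlinear coefficients, Newton linearizes the full residual via the Jacobian, which is why it wins once the microphysics makes the residual strongly nonlinear.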

  1. Atom pair 2D-fingerprints perceive 3D-molecular shape and pharmacophores for very fast virtual screening of ZINC and GDB-17.

    PubMed

    Awale, Mahendra; Reymond, Jean-Louis

    2014-07-28

    Three-dimensional (3D) molecular shape and pharmacophores are important determinants of the biological activity of organic molecules; however, a precise computation of 3D shape is generally too slow for virtual screening of very large databases. A reinvestigation of the concept of atom pairs initially reported by Carhart et al. and extended by Schneider et al. showed that a simple atom pair fingerprint (APfp) counting atom pairs at increasing topological distances in 2D structures, without atom property assignment, correlates with various representations of molecular shape extracted from the 3D structures. A related 55-dimensional atom pair fingerprint extended with atom properties (Xfp) provided an efficient pharmacophore fingerprint with good performance for ligand-based virtual screening, such as the recovery of active compounds from decoys in DUD, and overlap with the ROCS 3D-pharmacophore scoring function. The APfp and Xfp data were organized for web-based, extremely fast nearest-neighbor searching in ZINC (13.5 M compounds) and GDB-17 (50 M random subset), freely accessible at www.gdb.unibe.ch.
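The property-free atom-pair counting behind APfp can be sketched with a breadth-first search over the molecular graph (simplified: no atom typing, adjacency supplied by hand rather than parsed from a structure file):

```python
from collections import deque

def atom_pair_fingerprint(adjacency, max_dist=10):
    """Property-free atom-pair fingerprint: fp[d-1] counts atom pairs
    separated by topological (bond) distance d, up to max_dist."""
    n = len(adjacency)
    fp = [0] * max_dist
    for src in range(n):
        dist = {src: 0}
        queue = deque([src])
        while queue:                      # BFS from src
            u = queue.popleft()
            for v in adjacency[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        for v, d in dist.items():
            if v > src and 1 <= d <= max_dist:  # count each pair once
                fp[d - 1] += 1
    return fp

# n-butane as a path graph C0-C1-C2-C3:
butane = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
fp = atom_pair_fingerprint(butane)
```

Because the histogram depends only on through-bond distances, elongated and compact molecules produce visibly different fingerprints, which is the basis of the shape correlation reported above.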

  2. 2D-RNA-coupling numbers: a new computational chemistry approach to link secondary structure topology with biological function.

    PubMed

    González-Díaz, Humberto; Agüero-Chapin, Guillermín; Varona, Javier; Molina, Reinaldo; Delogu, Giovanna; Santana, Lourdes; Uriarte, Eugenio; Podda, Gianni

    2007-04-30

    Methods for predicting protein, DNA, or RNA function and mapping it onto sequence often rely on a bioinformatics alignment approach instead of chemical structure. Consequently, it is interesting to develop computational chemistry approaches based on molecular descriptors. In this sense, many researchers have used sequence-coupling numbers, and our group extended them to 2D protein representations. However, no coupling numbers have been reported for 2D-RNA topology graphs, which are highly branched and contain useful information. Here, we use a computational chemistry scheme: (a) transforming sequences into RNA secondary structures, (b) defining and calculating new 2D-RNA-coupling numbers, (c) seeking a structure-function model, and (d) mapping biological function onto the folded RNA. As an example, we studied the 1-aminocyclopropane-1-carboxylic acid (ACC) oxidases, known as ACO, which control fruit ripening and are important for the biotechnology industry. First, we calculated tau(k)(2D-RNA) values for a set of 90 folded RNAs, including 28 transcripts of ACO and control sequences. Afterwards, we compared the classification performance of 10 different classifiers implemented in the software WEKA. In particular, the logistic equation ACO = 23.8 . tau(1)(2D-RNA) + 41.4 predicts ACOs with 98.9%, 98.0%, and 97.8% accuracy in training, leave-one-out, and 10-fold cross-validation, respectively. With this equation we then predicted ACO function for a sequence isolated in this work from Coffea arabica (GenBank accession DQ218452). The tau(1)(2D-RNA) descriptor also compares favorably with other descriptors. This equation allows us to map the codification of ACO activity onto different mRNA topology features. The present computational chemistry approach is general and could be extended to connect RNA secondary structure topology to other functions.

  3. Computer vision for high content screening.

    PubMed

    Kraus, Oren Z; Frey, Brendan J

    2016-01-01

    High Content Screening (HCS) technologies that combine automated fluorescence microscopy with high throughput biotechnology have become powerful systems for studying cell biology and drug screening. These systems can produce more than 100 000 images per day, making their success dependent on automated image analysis. In this review, we describe the steps involved in quantifying microscopy images and different approaches for each step. Typically, individual cells are segmented from the background using a segmentation algorithm. Each cell is then quantified by extracting numerical features, such as area and intensity measurements. As these feature representations are typically high dimensional (>500), modern machine learning algorithms are used to classify, cluster and visualize cells in HCS experiments. Machine learning algorithms that learn feature representations, in addition to the classification or clustering task, have recently advanced the state of the art on several benchmarking tasks in the computer vision community. These techniques have also recently been applied to HCS image analysis.
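The segment-then-quantify pipeline described above can be sketched in a few lines (toy image, global threshold, 4-connected flood fill; production HCS segmentation is considerably more sophisticated):

```python
from collections import deque

import numpy as np

def cell_features(image, threshold):
    """Threshold-segment cells from background, label 4-connected
    components, and extract per-cell area and mean intensity -- the
    basic HCS quantification steps, in miniature."""
    mask = image > threshold
    seen = np.zeros_like(mask)
    feats = []
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            if mask[i, j] and not seen[i, j]:
                pixels, queue = [], deque([(i, j)])
                seen[i, j] = True
                while queue:              # flood fill one cell
                    r, c = queue.popleft()
                    pixels.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < mask.shape[0] and 0 <= cc < mask.shape[1]
                                and mask[rr, cc] and not seen[rr, cc]):
                            seen[rr, cc] = True
                            queue.append((rr, cc))
                vals = [float(image[p]) for p in pixels]
                feats.append({"area": len(pixels),
                              "mean_intensity": sum(vals) / len(vals)})
    return feats

img = np.zeros((8, 8))
img[1:3, 1:3] = 10.0   # cell 1: 4 pixels
img[5:8, 5:7] = 20.0   # cell 2: 6 pixels
cells = cell_features(img, threshold=5.0)
```

The resulting feature vectors (here just area and mean intensity; real pipelines extract hundreds) are what the downstream machine learning classifiers and clustering methods consume.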

  4. A BENCHMARKING ANALYSIS FOR FIVE RADIONUCLIDE VADOSE ZONE MODELS (CHAIN, MULTIMED_DP, FECTUZ, HYDRUS, AND CHAIN 2D) IN SOIL SCREENING LEVEL CALCULATIONS

    EPA Science Inventory

    Five radionuclide vadose zone models with different degrees of complexity (CHAIN, MULTIMED_DP, FECTUZ, HYDRUS, and CHAIN 2D) were selected for use in soil screening level (SSL) calculations. A benchmarking analysis between the models was conducted for a radionuclide (99Tc) rele...

  5. Orbit computation of the TELECOM-2D satellite with a Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Deleflie, Florent; Coulot, David; Vienne, Alain; Decosta, Romain; Richard, Pascal; Lasri, Mohammed Amjad

    2014-07-01

    In order to test a preliminary orbit determination method, we fit an orbit of the geostationary satellite TELECOM-2D as if we had no a priori information on its trajectory. The method is based on a genetic algorithm coupled to an analytical propagator of the trajectory, applied over a couple of days to a whole set of altazimuthal data acquired by the tracking network made up of the two TAROT telescopes. The adjusted orbit is then compared to a numerical reference. The method is described, and the results are analyzed, as a step towards an operational method of preliminary orbit determination for uncatalogued objects.
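The fitting loop of such a method can be sketched as a small real-coded genetic algorithm minimizing a residual; here it recovers the parameters of a toy linear model rather than orbital elements (the analytical propagator and TAROT data are not reproduced):

```python
import random

def genetic_fit(residual, bounds, pop=60, gens=150, seed=1):
    """Minimal real-coded GA: elitism, tournament selection, blend
    crossover, and Gaussian mutation of one gene per child.
    residual(params) -> cost to minimize."""
    rng = random.Random(seed)
    dim = len(bounds)
    P = [[rng.uniform(*bounds[i]) for i in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=residual)
        nxt = P[:2]                                   # keep the two elites
        while len(nxt) < pop:
            a, b = (min(rng.sample(P, 3), key=residual) for _ in range(2))
            child = [x + rng.random() * (y - x) for x, y in zip(a, b)]
            i = rng.randrange(dim)                    # mutate one gene
            lo, hi = bounds[i]
            child[i] = min(hi, max(lo, child[i] + rng.gauss(0.0, 0.1 * (hi - lo))))
            nxt.append(child)
        P = nxt
    return min(P, key=residual)

# Recover (slope, intercept) of y = 2.5*x - 1 from noiseless "observations":
obs = [(x, 2.5 * x - 1.0) for x in range(10)]
cost = lambda p: sum((p[0] * x + p[1] - y) ** 2 for x, y in obs)
best = genetic_fit(cost, [(-5.0, 5.0), (-5.0, 5.0)])
```

In the orbit-determination setting, `residual` would compare propagated look angles against the telescope measurements, and `bounds` would bracket the orbital elements.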

  6. Distributed computing architecture for image-based wavefront sensing and 2D FFTs

    NASA Astrophysics Data System (ADS)

    Smith, Jeffrey S.; Dean, Bruce H.; Haghani, Shadan

    2006-06-01

    Image-based wavefront sensing provides significant advantages over interferometric-based wavefront sensors such as optical design simplicity and stability. However, the image-based approach is computationally intensive, and therefore, applications utilizing the image-based approach gain substantial benefits using specialized high-performance computing architectures. The development and testing of these computing architectures are essential to missions such as James Webb Space Telescope (JWST), Terrestrial Planet Finder-Coronagraph (TPF-C and CorSpec), and the Spherical Primary Optical Telescope (SPOT). The algorithms implemented on these specialized computing architectures make use of numerous two-dimensional Fast Fourier Transforms (FFTs) which necessitate an all-to-all communication when applied on a distributed computational architecture. Several solutions for distributed computing are presented with an emphasis on a 64 Node cluster of digital signal processors (DSPs) and multiple DSP field programmable gate arrays (FPGAs), offering a novel application of low-diameter graph theory. Timing results and performance analysis are presented. The solutions offered could be applied to other computationally complex all-to-all communication problems.
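The row-FFT / transpose / row-FFT decomposition that makes the all-to-all exchange explicit can be verified in a few lines (single-process sketch; on a distributed machine the transpose is the communication step):

```python
import numpy as np

def fft2_by_transpose(x):
    """2-D FFT computed as 1-D FFTs over rows, a transpose (the
    all-to-all exchange on a distributed architecture), then 1-D FFTs
    over the new rows."""
    step1 = np.fft.fft(x, axis=1)        # each node transforms its rows
    step2 = step1.T                      # all-to-all communication
    return np.fft.fft(step2, axis=1).T   # transform again, restore layout

x = np.random.default_rng(0).standard_normal((8, 8))
ok = np.allclose(fft2_by_transpose(x), np.fft.fft2(x))
```

Because each 1-D FFT stage touches only locally held rows, the transpose concentrates all inter-node traffic into one step, which is what the low-diameter interconnect topologies mentioned above are designed to serve.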

  7. Distributed Computing Architecture for Image-Based Wavefront Sensing and 2D FFTs

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey S.; Dean, Bruce H.; Haghani, Shadan

    2006-01-01

    Image-based wavefront sensing (WFS) provides significant advantages over interferometric wavefront sensors, such as optical design simplicity and stability. However, the image-based approach is computationally intensive, and therefore specialized high-performance computing architectures are required in applications utilizing it. The development and testing of these high-performance computing architectures are essential to such missions as the James Webb Space Telescope (JWST), Terrestrial Planet Finder-Coronagraph (TPF-C and CorSpec), and the Spherical Primary Optical Telescope (SPOT). These specialized computing architectures require numerous two-dimensional Fourier Transforms, which necessitate an all-to-all communication when applied on a distributed computational architecture. Several solutions for distributed computing are presented, with an emphasis on a 64-node cluster of DSPs, multiple DSP FPGAs, and an application of low-diameter graph theory. Timing results and performance analysis are presented. The solutions offered could be applied to other all-to-all communication and computationally complex scientific problems.

  8. Characterization of Unsteady Flow Structures Near Leading-Edge Slat. Part 2: 2D Computations

    NASA Technical Reports Server (NTRS)

    Khorrami, Mehdi; Choudhari, Meelan M.; Jenkins, Luther N.

    2004-01-01

    In our previous computational studies of a generic high-lift configuration, quasi-laminar (as opposed to fully turbulent) treatment of the slat cove region proved to be an effective approach for capturing the unsteady dynamics of the cove flow field. Combined with acoustic propagation via the Ffowcs Williams and Hawkings formulation, the quasi-laminar simulations captured some important features of the slat cove noise measured with microphone array techniques. However, a direct assessment of the computed cove flow field was not feasible due to the unavailability of off-surface flow measurements. To remedy this shortcoming, we have undertaken a combined experimental and computational study aimed at characterizing the flow structures and fluid mechanical processes within the slat cove region. Part I of this paper outlines the experimental aspects of this investigation, focused on the 30P30N high-lift configuration; the present paper describes the accompanying computational results, including a comparison between computation and experiment at various angles of attack. Even though predictions of the time-averaged flow field agree well with the measured data, the study indicates the need for further refinement of the zonal turbulence approach in order to capture the full dynamics of the cove's fluctuating flow field.

  9. Singular value decomposition-based 2D image reconstruction for computed tomography.

    PubMed

    Liu, Rui; He, Lu; Luo, Yan; Yu, Hengyong

    2017-01-01

    Singular value decomposition (SVD)-based 2D image reconstruction methods are developed and evaluated for a broad class of inverse problems for which there are no analytical solutions. The proposed methods are fast and accurate for reconstructing images in a non-iterative fashion. The multi-resolution strategy is adopted to reduce the size of the system matrix to reconstruct large images using limited memory capacity. A modified high-contrast Shepp-Logan phantom, a low-contrast FORBILD head phantom, and a physical phantom are employed to evaluate the proposed methods with different system configurations. The results show that the SVD methods can accurately reconstruct images from standard scan and interior scan projections and that they outperform other benchmark methods. The general SVD method outperforms the other SVD methods. The truncated SVD and Tikhonov regularized SVD methods accurately reconstruct a region-of-interest (ROI) from an internal scan with a known sub-region inside the ROI. Furthermore, the SVD methods are much faster and more flexible than the benchmark algorithms, especially in the ROI reconstructions in our experiments.
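The family of non-iterative SVD reconstructions this abstract compares (general, truncated, and Tikhonov-regularized) can be sketched for a small dense system matrix; real CT systems are far larger, which is why the paper adopts a multi-resolution strategy. All names and data below are illustrative:

```python
import numpy as np

def svd_reconstruct(A, b, rank=None, tik=0.0):
    """Non-iterative solution x = pinv(A) @ b via the SVD of the system
    matrix A. `rank` truncates small singular values (TSVD); `tik` > 0
    applies the Tikhonov filter s / (s^2 + tik^2). Illustrative only."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    if rank is not None:                     # truncated SVD variant
        U, s, Vt = U[:, :rank], s[:rank], Vt[:rank]
    if tik > 0:                              # Tikhonov-regularized variant
        filt = s / (s**2 + tik**2)
    else:                                    # plain pseudo-inverse
        filt = 1.0 / s
    return Vt.T @ (filt * (U.T @ b))

# Toy 'system matrix' and projections of a known image x.
rng = np.random.default_rng(0)
A = rng.standard_normal((120, 64))           # 120 measurements, 8x8 image
x = rng.standard_normal(64)
b = A @ x
assert np.allclose(svd_reconstruct(A, b), x)
```

The decomposition is computed once per system geometry; after that, every new projection vector `b` is reconstructed with two matrix-vector products, which is what makes the approach fast.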

  10. Lattice Boltzmann methods for some 2-D nonlinear diffusion equations: Computational results

    SciTech Connect

    Elton, B.H.; Rodrigue, G.H. (Dept. of Applied Science, Lawrence Livermore National Lab., CA); Levermore, C.D. (Dept. of Mathematics)

    1990-01-01

    In this paper we examine two lattice Boltzmann methods (derivatives of lattice gas methods) for computing solutions to two two-dimensional nonlinear diffusion equations of the form ∂u/∂t = ν (∂/∂x [D(u) ∂u/∂x] + ∂/∂y [D(u) ∂u/∂y]), where u = u(x, t), x ∈ R², ν is a constant, and D(u) is a nonlinear term that arises from a Chapman-Enskog asymptotic expansion. In particular, we provide computational evidence supporting recent results showing that the methods are second-order convergent (in the L1-norm), conservative, conditionally monotone finite difference methods. Solutions computed via the lattice Boltzmann methods are compared with those computed by other explicit, second-order, conservative, monotone finite difference methods. Results are reported for both the L1- and L∞-norms.
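As a rough illustration of the lattice Boltzmann approach for diffusion problems, the sketch below solves the linear special case (constant D) on a periodic grid with a D2Q5 lattice and BGK collisions. This is not the authors' scheme; lattice choice, weights, and parameters are illustrative assumptions:

```python
import numpy as np

# D2Q5 lattice Boltzmann solver for the linear diffusion equation
# du/dt = D * laplacian(u) on a periodic grid -- a simplified,
# constant-D stand-in for the nonlinear D(u) equations discussed above.
VEL = np.array([[0, 0], [1, 0], [-1, 0], [0, 1], [0, -1]])
W = np.array([1/3, 1/6, 1/6, 1/6, 1/6])    # weights with cs^2 = 1/3

def lbm_diffuse(u0, tau, steps):
    f = W[:, None, None] * u0[None]        # equilibrium initialisation
    for _ in range(steps):
        u = f.sum(axis=0)                  # zeroth moment = density
        feq = W[:, None, None] * u[None]
        f += (feq - f) / tau               # BGK collision step
        for i, (cx, cy) in enumerate(VEL): # streaming step (periodic)
            f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
    return f.sum(axis=0)

u0 = np.zeros((32, 32)); u0[16, 16] = 1.0  # point source
u = lbm_diffuse(u0, tau=1.0, steps=50)
assert np.isclose(u.sum(), u0.sum())       # collisions conserve mass
assert u.max() < u0.max()                  # the peak spreads outward
```

The conservation assertion reflects the "conservative" property the abstract highlights: BGK collisions leave the zeroth moment unchanged, and streaming merely permutes it.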

  11. The PR2D (Place, Route in 2-Dimensions) automatic layout computer program handbook

    NASA Technical Reports Server (NTRS)

    Edge, T. M.

    1978-01-01

    Place, Route in 2-Dimensions is a standard cell automatic layout computer program for generating large scale integrated/metal oxide semiconductor arrays. The program was utilized successfully for a number of years in both government and private sectors but until now was undocumented. The compilation, loading, and execution of the program on a Sigma V CP-V operating system is described.

  12. Computational results for flows over 2-D ramp and 3-D obstacle with an upwind Navier-Stokes solver

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj

    1990-01-01

    An implicit, finite-difference, upwind, full Navier-Stokes solver was applied to supersonic/hypersonic flows over two-dimensional ramps and a three-dimensional obstacle. Some of the computed results are presented. The numerical scheme used in the study is an implicit, spatially second-order accurate, upwind, LU-ADI scheme based on Roe's approximate Riemann solver with the MUSCL differencing of Van Leer. An algebraic grid generation scheme based on generalized interpolation was used in generating the grids for the various 2-D and 3-D problems.

  13. Computational results for 2-D and 3-D ramp flows with an upwind Navier-Stokes solver

    NASA Technical Reports Server (NTRS)

    Venkatapathy, Ethiraj

    1991-01-01

    An implicit, finite-difference, upwind, full Navier-Stokes solver was applied to supersonic/hypersonic flows over two-dimensional ramps and a three-dimensional obstacle. Some of the computed results are presented. The numerical scheme used in the study is an implicit, spatially second-order accurate, upwind, LU-ADI scheme based on Roe's approximate Riemann solver with the MUSCL differencing of Van Leer. An algebraic grid generation scheme based on generalized interpolation was used in generating the grids for the various 2-D and 3-D problems.

  14. Identification of the wave speed and the second viscosity of cavitation flows with 2D RANS computations - Part I

    NASA Astrophysics Data System (ADS)

    Decaix, J.; Alligné, S.; Nicolet, C.; Avellan, F.; Münch, C.

    2015-12-01

    1D hydro-electric models are useful to predict dynamic behaviour of hydro-power plants. Regarding vortex rope and cavitation surge in Francis turbines, the 1D models require some inputs that can be provided by numerical simulations. In this paper, a 2D cavitating Venturi is considered. URANS computations are performed to investigate the dynamic behaviour of the cavitation sheet depending on the frequency variation of the outlet pressure. The results are used to calibrate and to assess the reliability of the 1D models.

  15. Computational Screening of All Stoichiometric Inorganic Materials.

    PubMed

    Davies, Daniel W; Butler, Keith T; Jackson, Adam J; Morris, Andrew; Frost, Jarvist M; Skelton, Jonathan M; Walsh, Aron

    2016-10-13

    Forming a four-component compound from the first 103 elements of the periodic table results in more than 10^12 combinations. Such a materials space is intractable to high-throughput experiment or first-principles computation. We introduce a framework to address this problem and quantify how many materials can exist. We apply principles of valency and electronegativity to filter chemically implausible compositions, which reduces the inorganic quaternary space to 10^10 combinations. We demonstrate that estimates of band gaps and absolute electron energies can be made simply on the basis of the chemical composition and apply this to the search for new semiconducting materials to support the photoelectrochemical splitting of water. We show the applicability to predicting crystal structure by analogy with known compounds, including exploration of the phase space for ternary combinations that form a perovskite lattice. Computer screening reproduces known perovskite materials and predicts the feasibility of thousands more. Given the simplicity of the approach, large-scale searches can be performed on a single workstation.

  16. Quantitative comparison of dose distribution in radiotherapy plans using 2D gamma maps and X-ray computed tomography

    PubMed Central

    Balosso, Jacques

    2016-01-01

    Background: The advanced dose calculation algorithms implemented in treatment planning systems (TPS) have remarkably improved the accuracy of dose calculation, especially the modeling of electron transport in low-density media. The purpose of this study is to evaluate the use of the 2D gamma (γ) index to quantify and evaluate the impact of electron transport calculation on dose distribution for lung radiotherapy. Methods: X-ray computed tomography images were used to calculate the dose for twelve radiotherapy treatment plans. The doses were originally calculated with the Modified Batho (MB) 1D density correction method, and recalculated with the anisotropic analytical algorithm (AAA), using the same prescribed dose. Dose parameters derived from dose-volume histograms (DVH) and target coverage indices were compared. To compare dose distributions, the 2D γ-index was applied with criteria ranging from 1%/1 mm to 6%/6 mm. The results were displayed using 2D γ-maps. Correlation between DVH metrics and γ passing rates was tested using Spearman's rank test, and the Wilcoxon paired test was used to calculate P values. Results: The plans generated with AAA predicted a more heterogeneous dose distribution inside the target, with P<0.05. However, MB overestimated the dose, predicting more coverage of the target by the prescribed dose. The γ analysis showed that the difference between MB and AAA could reach up to ±10%. The 2D γ-maps illustrated that AAA predicted more dose to organs at risk, as well as a lower dose to the target, compared to MB. Conclusions: Taking electron transport into account in radiotherapy plans showed a significant impact on the delivered dose and dose distribution. Considering that AAA represents the true cumulative dose, a readjustment of the prescribed dose and an optimization to protect the organs at risk should be considered in order to obtain a better clinical outcome. PMID:27429908
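The 2D γ-index comparison used in this study combines a dose-difference criterion with a distance-to-agreement criterion. A brute-force sketch, assuming a global dose criterion and identical grids for both dose distributions (clinical tools use optimised searches and interpolation; all parameters here are illustrative):

```python
import numpy as np

def gamma_map(ref, eval_, spacing, dd_pct=3.0, dta_mm=3.0):
    """Brute-force global 2D gamma index (e.g. 3%/3 mm). `ref` and
    `eval_` are dose grids on the same lattice; `spacing` is the pixel
    size in mm. A point passes when its gamma value is <= 1."""
    ny, nx = ref.shape
    dd = dd_pct / 100.0 * ref.max()          # global dose criterion
    ys, xs = np.mgrid[0:ny, 0:nx]
    gamma = np.empty(ref.shape, dtype=float)
    for j in range(ny):
        for i in range(nx):
            # Search the whole reference grid for the best combined
            # distance/dose agreement with this evaluated point.
            dist2 = ((ys - j) ** 2 + (xs - i) ** 2) * spacing ** 2
            dose2 = (ref - eval_[j, i]) ** 2
            gamma[j, i] = np.sqrt((dist2 / dta_mm**2 + dose2 / dd**2).min())
    return gamma

ref = np.random.rand(16, 16)
assert np.allclose(gamma_map(ref, ref, spacing=1.0), 0.0)  # identical plans pass
```

The γ passing rate reported in such studies is then simply the fraction of points with `gamma <= 1`.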

  17. The Roles of Endstopped and Curvature Tuned Computations in a Hierarchical Representation of 2D Shape

    PubMed Central

    Rodríguez-Sánchez, Antonio J.; Tsotsos, John K.

    2012-01-01

    That shape is important for perception has been known for almost a thousand years (thanks to Alhazen in 1083) and has been a subject of study ever since by scientists and philosophers (such as Descartes, Helmholtz or the Gestalt psychologists). Shapes are important object descriptors. If there was any remote doubt regarding the importance of shape, recent experiments have shown that intermediate areas of primate visual cortex such as V2, V4 and TEO are involved in analyzing shape features such as corners and curvatures. The primate brain appears to perform a wide variety of complex tasks by means of simple operations. These operations are applied across several layers of neurons, representing increasingly complex, abstract intermediate processing stages. Recently, new models have attempted to emulate the human visual system. However, the role of intermediate representations in the visual cortex and their importance have not been adequately studied in computational modeling. This paper proposes a model of shape-selective neurons whose shape-selectivity is achieved through intermediate layers of visual representation not previously fully explored. We hypothesize that hypercomplex - also known as endstopped - neurons play a critical role in achieving shape selectivity and show how shape-selective neurons may be modeled by integrating endstopping and curvature computations. This model - a representational and computational system for the detection of 2-dimensional object silhouettes that we term 2DSIL - provides a highly accurate fit with neural data and replicates responses from neurons in area V4 with an average of 83% accuracy. We successfully test a biologically plausible hypothesis on how to connect early representations based on Gabor or Difference of Gaussian filters and later representations closer to object categories without the need of a learning phase as in most recent models. PMID:22912683

  18. Coloured computational imaging with single-pixel detectors based on a 2D discrete cosine transform

    NASA Astrophysics Data System (ADS)

    Liu, Bao-Lei; Yang, Zhao-Hua; Liu, Xia; Wu, Ling-An

    2017-02-01

    We propose and demonstrate a computational imaging technique that uses structured illumination based on a two-dimensional discrete cosine transform to perform imaging with a single-pixel detector. A scene is illuminated by a projector with two sets of orthogonal patterns; a full-color image is then retrieved by applying an inverse cosine transform to the spectra obtained from the single-pixel detector. This technique can retrieve an image from sub-Nyquist measurements, and the background noise is easily cancelled, giving excellent image quality. Moreover, the experimental setup is very simple.
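The measurement scheme can be simulated in a few lines: each projected pattern is one 2D DCT basis function, the single-pixel "bucket" detector records its inner product with the scene (one transform coefficient), and the image is recovered by an inverse DCT. This sketch assumes ideal signed patterns; real projectors split each pattern into positive and negative parts, since negative light intensity is impossible:

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix: C @ x gives the 1D DCT of x."""
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C *= np.sqrt(2.0 / n)
    C[0] *= np.sqrt(0.5)
    return C

n = 8
C = dct_matrix(n)
scene = np.random.rand(n, n)               # the object being imaged

# One measurement per projected pattern: the bucket detector integrates
# the light reflected from the scene under that basis pattern.
spectrum = np.empty((n, n))
for u in range(n):
    for v in range(n):
        pattern = np.outer(C[u], C[v])     # structured illumination pattern
        spectrum[u, v] = np.sum(scene * pattern)   # bucket-detector reading

recovered = C.T @ spectrum @ C             # inverse 2D DCT
assert np.allclose(recovered, scene)
```

Sub-Nyquist operation follows from energy compaction: measuring only the low-frequency (small u, v) coefficients and zero-filling the rest already yields a recognizable image.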

  19. Manifest: A computer program for 2-D flow modeling in Stirling machines

    NASA Technical Reports Server (NTRS)

    Gedeon, David

    1989-01-01

    A computer program named Manifest is discussed. Manifest is a program one might want to use to model the fluid dynamics in the manifolds commonly found between the heat exchangers and regenerators of Stirling machines; but not just in the manifolds - in the regenerators as well. And in all sorts of other places too, such as: in heaters or coolers, or perhaps even in cylinder spaces. There are probably non-Stirling uses for Manifest also. In broad strokes, Manifest will: (1) model oscillating internal compressible laminar fluid flow in a wide range of two-dimensional regions, either filled with porous materials or empty; (2) present a graphics-based user-friendly interface, allowing easy selection and modification of region shape and boundary condition specification; (3) run on a personal computer, or optionally (in the case of its number-crunching module) on a supercomputer; and (4) allow interactive examination of the solution output so the user can view vector plots of flow velocity, contour plots of pressure and temperature at various locations and tabulate energy-related integrals of interest.

  20. A numerical method for computing unsteady 2-D boundary layer flows

    NASA Technical Reports Server (NTRS)

    Krainer, Andreas

    1988-01-01

    A numerical method for computing unsteady two-dimensional boundary layers in incompressible laminar and turbulent flows is described and applied to a single airfoil changing its incidence angle in time. The solution procedure adopts a first-order panel method with a simple wake model to solve for the inviscid part of the flow, and an implicit finite difference method for the viscous part. Both procedures integrate in time in a step-by-step fashion, in the course of which each step involves the solution of the elliptic Laplace equation and of the parabolic boundary layer equations. The Reynolds shear stress term of the boundary layer equations is modeled by an algebraic eddy viscosity closure. The location of transition is predicted by an empirical data correlation originating from Michel. Since transition and turbulence modeling are key factors in the prediction of viscous flows, their accuracy has a dominant influence on the overall results.

  1. Computing Aerodynamic Performance of a 2D Iced Airfoil: Blocking Topology and Grid Generation

    NASA Technical Reports Server (NTRS)

    Chi, X.; Zhu, B.; Shih, T. I.-P.; Slater, J. W.; Addy, H. E.; Choo, Yung K.; Lee, Chi-Ming (Technical Monitor)

    2002-01-01

    The ice accrued on airfoils can have enormously complicated shapes with multiple protruded horns and feathers. In this paper, several blocking topologies are proposed and evaluated on their ability to produce high-quality structured multi-block grid systems. A transition layer grid is introduced to ensure that jaggedness in the ice-surface geometry does not propagate into the domain. This is important for grid-generation methods based on hyperbolic PDEs (Partial Differential Equations) and algebraic transfinite interpolation. A 'thick' wrap-around grid is introduced to ensure that grid lines clustered next to solid walls do not propagate as streaks of tightly packed grid lines into the interior of the domain along block boundaries. For ice shapes that are not too complicated, a method is presented for generating high-quality single-block grids. To demonstrate the usefulness of the methods developed, grids and CFD solutions were generated for two iced airfoils: the NLF0414 airfoil with and without the 623-ice shape and the B575/767 airfoil with and without the 145m-ice shape. To validate the computations, the computed lift coefficients as a function of angle of attack were compared with available experimental data. The ice shapes and the blocking topologies were prepared by NASA Glenn's SmaggIce software. The grid systems were generated by using a four-boundary method based on Hermite interpolation with controls on clustering, orthogonality next to walls, and C continuity across block boundaries. The flow was modeled by the ensemble-averaged compressible Navier-Stokes equations, closed by the shear-stress transport turbulence model in which the integration is to the wall. All solutions were generated by using the NPARC WIND code.

  2. Study of liquid water by computer simulations. I. Static properties of a 2D model

    NASA Astrophysics Data System (ADS)

    Okazaki, Keiji; Nosé, Shuichi; Kataoka, Yosuke; Yamamoto, Tsunenobu

    1981-12-01

    A computer-simulation study of a water-like system is carried out by making use of a two-dimensional version of the Ben-Naim and Stillinger potential. The pair potential is set up such that at 0 K it yields a square net structure at low pressures and an interpenetration of two square nets at high pressures. The liquid state is surveyed over a wide range of temperature and pressure. Various kinds of molecular distribution functions are derived to see how the hydrogen-bond network structure depends on temperature and density. The pressure and thermal equations of state are "experimentally" determined by a least-squares fitting to the pressures and energies calculated for about 200 different state points. The well-known anomalous behavior of liquid water is reproduced at least in a semiquantitative way. The singular properties of supercooled water also are reproduced and their origin is ascribed to the thermodynamical instability. New anomalies are predicted at high temperatures and pressures.

  3. Touch-screen technology for the dynamic display of 2D spatial information without vision: promise and progress.

    PubMed

    Klatzky, Roberta L; Giudice, Nicholas A; Bennett, Christopher R; Loomis, Jack M

    2014-01-01

    Many developers wish to capitalize on touch-screen technology for developing aids for the blind, particularly by incorporating vibrotactile stimulation to convey patterns on their surfaces, which otherwise are featureless. Our belief is that they will need to take into account basic research on haptic perception in designing these graphics interfaces. We point out constraints and limitations in haptic processing that affect the use of these devices. We also suggest ways to use sound to augment basic information from touch, and we include evaluation data from users of a touch-screen device with vibrotactile and auditory feedback that we have been developing, called a vibro-audio interface.

  4. Computational study of the rovibrational spectra of CO2-C2H2 and CO2-C2D2

    NASA Astrophysics Data System (ADS)

    Donoghue, Geoff; Wang, Xiao-Gang; Dawes, Richard; Carrington, Tucker

    2016-12-01

    An intermolecular potential energy surface and rovibrational transition frequencies are computed for CO2-C2H2. An interpolating moving least squares method is used to fit ab initio points at the explicitly correlated coupled-cluster level. The rovibrational Schrödinger equation is solved with a symmetry-adapted Lanczos algorithm. The computed disrotatory and torsion vibrational levels of both CO2-C2H2 and CO2-C2D2 differ from those obtained by experimentalists by less than 0.5 cm-1. CO2-C2H2 has two equivalent minima with the monomers perpendicular to the inter-monomer axis. In contrast to many other Van der Waals dimers there is no disrotatory path that connects the minima. The tunnelling path follows the torsional coordinate over a high barrier and the splitting is therefore tiny. Using vibrational parent analysis we are able to fit and thus obtain rotational constants and centrifugal distortion constants. Calculated rotational constants differ from their experimental counterparts by less than 0.001 cm-1.

  5. Synthesis, screening and docking of fused pyrano[3,2-d]pyrimidine derivatives as xanthine oxidase inhibitor.

    PubMed

    Kaur, Manroopraj; Kaur, Amandeep; Mankotia, Suhani; Singh, Harbinder; Singh, Arshdeep; Singh, Jatinder Vir; Gupta, Manish Kumar; Sharma, Sahil; Nepali, Kunal; Bedi, Preet Mohinder Singh

    2017-05-05

    In view of developing effective xanthine oxidase (XO) enzyme inhibitors, a series of 100 pyrano[3,2-d]pyrimidine derivatives was synthesized and evaluated for in vitro XO enzyme inhibition. A structure-activity relationship has also been established. Among all the synthesized compounds, 4d, 8d and 9d were found to be the most potent enzyme inhibitors, with IC50 values of 8 μM, 8.5 μM and 7 μM, respectively. Compound 9d was further investigated in enzyme kinetic studies, and the Lineweaver-Burk plot revealed that compound 9d was a mixed-type inhibitor. Molecular properties of the most potent compounds, 4d, 8d and 9d, have also been calculated. A docking study was performed to investigate the recognition pattern between xanthine oxidase and the most potent XO inhibitor, 9d. The study suggests that 9d may block the activity of XO sufficiently to prevent the substrate from binding to its active site.

  6. Targeted sequencing identifies a novel SH2D1A pathogenic variant in a Chinese family: Carrier screening and prenatal genetic testing.

    PubMed

    Zhang, Jun-Yu; Chen, Song-Chang; Chen, Yi-Yao; Li, Shu-Yuan; Zhang, Lan-Lan; Shen, Ying-Hua; Chang, Chun-Xin; Xiang, Yu-Qian; Huang, He-Feng; Xu, Chen-Ming

    2017-01-01

    X-linked lymphoproliferative disease type 1 (XLP1) is a rare primary immunodeficiency characterized by a clinical triad consisting of severe EBV-induced hemophagocytic lymphohistiocytosis, B-cell lymphoma, and dysgammaglobulinemia. Mutations in the SH2D1A gene have been revealed as the cause of XLP1. In this study, a pregnant woman with a history of recurrent births of children with immunodeficiency was screened for a pathogenic variant because the proband sample was unavailable. We aimed to clarify the genetic diagnosis and provide prenatal testing for the family. A next-generation sequencing (NGS)-based multigene panel was used in carrier screening of the pregnant woman. Variants of immunodeficiency-related genes were analyzed and prioritized. The candidate variant was verified using Sanger sequencing. The possible influence of the identified variant was evaluated through an RNA assay. Amniocentesis, karyotyping, and Sanger sequencing were performed for prenatal testing. We identified a novel de novo frameshift SH2D1A pathogenic variant (c.251_255delTTTCA) in the pregnant carrier. A peripheral blood RNA assay indicated that the mutant transcript could escape nonsense-mediated mRNA decay (NMD) and might encode a C-terminally truncated protein. Information on the variant led to a successful prenatal diagnosis of the fetus. In conclusion, our study clarified the genetic diagnosis and altered disease prevention for a pregnant carrier of XLP1.

  7. Targeted sequencing identifies a novel SH2D1A pathogenic variant in a Chinese family: Carrier screening and prenatal genetic testing

    PubMed Central

    Chen, Yi-Yao; Li, Shu-Yuan; Zhang, Lan-Lan; Shen, Ying-Hua; Chang, Chun-Xin; Xiang, Yu-Qian; Huang, He-Feng; Xu, Chen-Ming

    2017-01-01

    X-linked lymphoproliferative disease type 1 (XLP1) is a rare primary immunodeficiency characterized by a clinical triad consisting of severe EBV-induced hemophagocytic lymphohistiocytosis, B-cell lymphoma, and dysgammaglobulinemia. Mutations in the SH2D1A gene have been revealed as the cause of XLP1. In this study, a pregnant woman with a history of recurrent births of children with immunodeficiency was screened for a pathogenic variant because the proband sample was unavailable. We aimed to clarify the genetic diagnosis and provide prenatal testing for the family. A next-generation sequencing (NGS)-based multigene panel was used in carrier screening of the pregnant woman. Variants of immunodeficiency-related genes were analyzed and prioritized. The candidate variant was verified using Sanger sequencing. The possible influence of the identified variant was evaluated through an RNA assay. Amniocentesis, karyotyping, and Sanger sequencing were performed for prenatal testing. We identified a novel de novo frameshift SH2D1A pathogenic variant (c.251_255delTTTCA) in the pregnant carrier. A peripheral blood RNA assay indicated that the mutant transcript could escape nonsense-mediated mRNA decay (NMD) and might encode a C-terminally truncated protein. Information on the variant led to a successful prenatal diagnosis of the fetus. In conclusion, our study clarified the genetic diagnosis and altered disease prevention for a pregnant carrier of XLP1. PMID:28231257

  8. Motivational Screen Design Guidelines for Effective Computer-Mediated Instruction.

    ERIC Educational Resources Information Center

    Lee, Sung Heum; Boling, Elizabeth

    Screen designers for computer-mediated instruction (CMI) products must consider the motivational appeal of their designs. Although learners may be motivated to use CMI programs initially because of their novelty, this effect wears off and the instruction must stand on its own. Instructional screens must provide effective and efficient instruction,…

  9. Systematic E2 screening reveals a UBE2D-RNF138-CtIP axis promoting DNA repair

    PubMed Central

    Sczaniecka-Clift, Matylda; Coates, Julia; Jhujh, Satpal; Demir, Mukerrem; Cornwell, Matthew; Beli, Petra; Jackson, Stephen P

    2016-01-01

    Ubiquitylation is crucial for proper cellular responses to DNA double-strand breaks (DSBs). If unrepaired, these highly cytotoxic lesions cause genome instability, tumourigenesis, neurodegeneration or premature ageing. Here, we conduct a comprehensive, multilayered screen to systematically profile all human ubiquitin E2-enzymes for impacts on cellular DSB responses. Applying a widely applicable approach, we use an exemplary E2 family, UBE2Ds, to identify ubiquitylation-cascade components downstream of E2s. Thus, we uncover the nuclear E3-ligase RNF138 as a key homologous recombination (HR)-promoting factor that functions with UBE2Ds in cells. Mechanistically, UBE2Ds and RNF138 accumulate at DNA-damage sites and act at early resection stages by promoting CtIP ubiquitylation and accrual. This work supplies insights into regulation of DSB repair by HR. Moreover, it provides a rich information resource on E2s that can be exploited by follow-on studies. PMID:26502057

  10. ASIC-based architecture for the real-time computation of 2D convolution with large kernel size

    NASA Astrophysics Data System (ADS)

    Shao, Rui; Zhong, Sheng; Yan, Luxin

    2015-12-01

    Bidimensional convolution is a low-level processing algorithm of interest in many areas, but its high computational cost constrains the size of the kernels, especially in real-time embedded systems. This paper presents a hardware architecture for the ASIC-based implementation of 2-D convolution with medium-large kernels. To improve the efficiency of on-chip storage resources and to reduce the off-chip bandwidth, a data-reuse cache is constructed: multi-block SPRAM cross-caches the image data, and on-chip ping-pong buffering takes full advantage of data reuse in the convolution calculation, leading to a new ASIC data scheduling scheme and overall architecture. Experimental results show that the structure can perform real-time convolution with templates up to 40 × 32 in size, improves the utilization of on-chip memory bandwidth and resources, maximizes data throughput, and reduces the required off-chip memory bandwidth.
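The data-reuse idea behind such architectures can be modeled in software: a line buffer keeps only the kernel-height rows currently needed on-chip, so each image pixel is fetched from (simulated) off-chip memory exactly once as rows stream in. This is a hypothetical software model of the hardware behavior, not the paper's design; it computes correlation-style convolution (no kernel flip), as image-processing hardware typically does:

```python
import numpy as np

def conv2d_linebuffer(image, kernel):
    """Direct 2D convolution organised the way a streaming pipeline
    sees it: rows enter one at a time and a line buffer holds just the
    kh rows the kernel window needs. Correlation-style (no flip)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    buf = image[:kh].copy()                 # line buffer: kh rows on-chip
    for row in range(out.shape[0]):
        for col in range(out.shape[1]):
            # The kw-wide window slides across the buffered rows.
            out[row, col] = np.sum(buf[:, col:col + kw] * kernel)
        if row + kh < h:                    # stream in the next row,
            buf = np.roll(buf, -1, axis=0)  # evicting the oldest one
            buf[-1] = image[row + kh]
    return out

img = np.random.rand(12, 12)
ker = np.random.rand(3, 3)
ref_out = np.array([[np.sum(img[r:r + 3, c:c + 3] * ker)
                     for c in range(10)] for r in range(10)])
assert np.allclose(conv2d_linebuffer(img, ker), ref_out)
```

In silicon the buffer would be the SPRAM blocks, and ping-pong buffering lets one set of rows be filled while the other is being convolved.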

  11. Identification of HIV-1 reverse transcriptase dual inhibitors by a combined shape-, 2D-fingerprint- and pharmacophore-based virtual screening approach.

    PubMed

    Distinto, Simona; Esposito, Francesca; Kirchmair, Johannes; Cardia, M Cristina; Gaspari, Marco; Maccioni, Elias; Alcaro, Stefano; Markt, Patrick; Wolber, Gerhard; Zinzula, Luca; Tramontano, Enzo

    2012-04-01

    We report the first application of ligand-based virtual screening (VS) methods for discovering new compounds able to inhibit both human immunodeficiency virus type 1 (HIV-1) reverse transcriptase (RT)-associated functions, DNA polymerase and ribonuclease H (RNase H) activities. The overall VS campaign consisted of two consecutive screening processes. In the first, the VS platform Rapid Overlay of Chemical Structures (ROCS) was used to perform in silico shape-based similarity screening of the NCI compounds database, with a hydrazone derivative previously shown to inhibit HIV-1 RT chosen as the query. As a result, 34 hit molecules were selected and assayed on both RT-associated functions. In the second, the 4 most potent RT inhibitors identified were selected as queries for parallel VS performed by combining shape-based, 2D-fingerprint and 3D-pharmacophore VS methods. Overall, a set of molecules characterized by different new scaffolds were identified as novel inhibitors of both HIV-1 RT-associated activities in the low micromolar range.

  12. Towards the chemoinformatic-based identification of DNA methyltransferase inhibitors: 2D- and 3D-similarity profile of screening libraries.

    PubMed

    Yoo, Jakyung; Medina-Franco, José Luis

    2012-12-01

    DNA methyltransferases (DNMTs) are emerging targets for the treatment of cancer and other diseases. The quinolone-based compound SGI-1027 is a promising inhibitor of DNMT1 with a distinct mode of action, and it is an attractive starting point for further research. Several experimental and computational approaches can be used to further develop novel DNMT1 inhibitors based on SGI-1027. In this work, we used a chemoinformatic-based approach to explore the potential to identify novel inhibitors in large screening collections of natural products and synthetic commercial libraries. Using the principles of similarity searching, the similarity profile to the active reference compound SGI-1027 was computed for four different screening libraries using a total of 22 two- and three-dimensional representations and two similarity metrics. The compound library with the overall highest similarity profile to the probe molecule was identified as the most promising collection for experimental testing. Individual compounds with high similarity to the reference were also selected as suitable candidates for experimental validation. During the course of this work, the 22 two- and three-dimensional representations were compared to each other and classified based on the similarity values computed with the reference compound. This classification is valuable for selecting structure representations for similarity searching of any other screening library. This work represents a step forward to further advance epigenetic therapies using computational approaches.

  13. SCREENING CHEMICALS FOR ESTROGEN RECEPTOR BIOACTIVITY USING A COMPUTATIONAL MODEL

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) is considering the use of high-throughput and computational methods for regulatory applications in the Endocrine Disruptor Screening Program (EDSP). To use these new tools for regulatory decision making, computational methods must be a...

  14. Chemical profiling and adulteration screening of Aquilariae Lignum Resinatum by Fourier transform infrared (FT-IR) spectroscopy and two-dimensional correlation infrared (2D-IR) spectroscopy.

    PubMed

    Qu, Lei; Chen, Jian-Bo; Zhang, Gui-Jun; Sun, Su-Qin; Zheng, Jing

    2017-03-05

    As a kind of expensive perfume and valuable herb, Aquilariae Lignum Resinatum (ALR) is often adulterated for economic motives. In this research, Fourier transform infrared (FT-IR) spectroscopy is employed to establish a simple and quick method for the adulteration screening of ALR. First, the principal chemical constituents of ALR are characterized by FT-IR spectroscopy at room temperature and two-dimensional correlation infrared (2D-IR) spectroscopy with thermal perturbation. Besides the common cellulose and lignin compounds, a certain amount of resin is the characteristic constituent of ALR. Synchronous and asynchronous 2D-IR spectra indicate that the resin (an unstable secondary metabolite) is more sensitive than cellulose and lignin (stable structural constituents) to the thermal perturbation. Using a certified ALR sample as the reference, the infrared spectral correlation threshold is determined from 30 authentic samples and 6 adulterated samples. The spectral correlation coefficient of an authentic ALR sample to the standard reference should be not less than 0.9886 (p=0.01). Three commercial adulterated ALR samples are identified by the correlation threshold. Further interpretation of the infrared spectra of the adulterated samples indicates the common adulterating methods: counterfeiting with other kinds of wood, adding ingredients such as sand to increase the weight, and adding cheap resins such as rosin to increase the content of resin compounds. Results of this research prove that FT-IR spectroscopy can be used as a simple and accurate quality control method for ALR.
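The screening decision reduces to a Pearson correlation between a sample spectrum and the certified reference, compared against the 0.9886 cutoff. A minimal sketch with synthetic toy spectra (real spectra would be absorbance values over hundreds of wavenumbers):

```python
# Spectral correlation screening: Pearson correlation of a sample IR
# spectrum against the reference, flagged if below the paper's 0.9886
# threshold. The five-point "spectra" below are illustrative only.
from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

reference = [0.1, 0.5, 0.9, 0.4, 0.2]
authentic = [0.12, 0.48, 0.88, 0.41, 0.22]   # tracks the reference bands
adulterated = [0.5, 0.1, 0.2, 0.9, 0.4]      # different band pattern

THRESHOLD = 0.9886
for name, spec in [("authentic", authentic), ("adulterated", adulterated)]:
    r = pearson(reference, spec)
    print(name, round(r, 4), "pass" if r >= THRESHOLD else "flagged")
```

The high cutoff works because authentic samples share the same cellulose/lignin/resin band structure, so only gross compositional changes pull the correlation down.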

  16. Computer Assisted Instruction Techniques for Screening Freshmen.

    ERIC Educational Resources Information Center

    Flower, K. W.; Craft, W. J.

    1981-01-01

    Describes the use of computer assisted instruction at North Carolina Agricultural and Technical State University in freshman and remedial mathematics to cut down high attrition rates and quickly weed out the students who cannot adapt to the rigors of engineering course work. (Author/DS)

  17. The Effects of Computer Usage on Computer Screen Reading Rate.

    ERIC Educational Resources Information Center

    Clausing, Carolyn S.; Schmitt, Dorren Rafael

    This study investigated the differences in the reading rate of eighth grade students on a cloze reading exercise involving the reading of text from a computer monitor. Several different modes of presentation were used in order to determine the effect of prior experience with computers on the students' reading rate. Subjects were 240 eighth grade…

  18. GRID2D/3D: A computer program for generating grid systems in complex-shaped two- and three-dimensional spatial domains. Part 1: Theory and method

    NASA Technical Reports Server (NTRS)

    Shih, T. I.-P.; Bailey, R. T.; Nguyen, H. L.; Roelke, R. J.

    1990-01-01

    An efficient computer program, called GRID2D/3D, was developed to generate single and composite grid systems within geometrically complex two- and three-dimensional (2- and 3-D) spatial domains that can deform with time. GRID2D/3D generates single grid systems by using algebraic grid generation methods based on transfinite interpolation, in which the distribution of grid points within the spatial domain is controlled by stretching functions. All single grid systems generated by GRID2D/3D can have grid lines that are continuous and differentiable everywhere up to second order. Also, grid lines can intersect boundaries of the spatial domain orthogonally. GRID2D/3D generates composite grid systems by patching together two or more single grid systems. The patching can be discontinuous or continuous. For continuous composite grid systems, the grid lines are continuous and differentiable everywhere up to second order except at interfaces where different single grid systems meet, where they are differentiable only up to first order. For 2-D spatial domains, the boundary curves are described by using either cubic or tension spline interpolation. For 3-D spatial domains, the boundary surfaces are described by using either linear Coons interpolation, bi-hyperbolic spline interpolation, or a new technique referred to as 3-D bi-directional Hermite interpolation. Since grid systems generated by algebraic methods can have grid lines that overlap one another, GRID2D/3D contains a graphics package for evaluating the grid systems generated. With the graphics package, the user can generate grid systems in an interactive manner with the grid generation part of GRID2D/3D. GRID2D/3D is written in FORTRAN 77 and can be run on any IBM PC, XT, or AT compatible computer. In order to use GRID2D/3D on workstations or mainframe computers, some minor modifications must be made in the graphics part of the program; no
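The transfinite-interpolation core of such algebraic grid generators can be sketched compactly: four boundary curves are blended into interior points, with the corner terms subtracted to avoid double counting. The boundary curves and grid size below are illustrative (GRID2D/3D would supply spline-interpolated boundaries and stretching functions):

```python
# Algebraic grid generation by transfinite (Coons) interpolation:
# P(xi, eta) = (1-eta) B(xi) + eta T(xi) + (1-xi) L(eta) + xi R(eta)
#              - bilinear blend of the four shared corner points.

def tfi(bottom, top, left, right, ni, nj):
    """Blend four boundary curves (each maps [0,1] -> (x, y)) into an ni x nj grid."""
    c00, c10 = bottom(0.0), bottom(1.0)   # corners shared by adjacent curves
    c01, c11 = top(0.0), top(1.0)
    grid = []
    for j in range(nj):
        eta = j / (nj - 1)
        row = []
        for i in range(ni):
            xi = i / (ni - 1)
            pt = []
            for k in range(2):  # x and y components
                edge = ((1 - eta) * bottom(xi)[k] + eta * top(xi)[k]
                        + (1 - xi) * left(eta)[k] + xi * right(eta)[k])
                corner = ((1 - xi) * (1 - eta) * c00[k] + xi * (1 - eta) * c10[k]
                          + (1 - xi) * eta * c01[k] + xi * eta * c11[k])
                pt.append(edge - corner)
            row.append(tuple(pt))
        grid.append(row)
    return grid

# Unit-square boundaries reproduce a uniform Cartesian grid.
g = tfi(lambda s: (s, 0.0), lambda s: (s, 1.0),
        lambda t: (0.0, t), lambda t: (1.0, t), ni=3, nj=3)
print(g[1][1])  # interior point -> (0.5, 0.5)
```

Stretching functions enter by replacing the uniform `xi`/`eta` spacing with a clustering map, which is how such codes control grid-point distribution near boundaries.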

  19. Automated screening of 2D crystallization trials using transmission electron microscopy: a high-throughput tool-chain for sample preparation and microscopic analysis.

    PubMed

    Coudray, Nicolas; Hermann, Gilles; Caujolle-Bert, Daniel; Karathanou, Argyro; Erne-Brand, Françoise; Buessler, Jean-Luc; Daum, Pamela; Plitzko, Juergen M; Chami, Mohamed; Mueller, Urs; Kihl, Hubert; Urban, Jean-Philippe; Engel, Andreas; Rémigy, Hervé-W

    2011-02-01

    We have built and extensively tested a tool-chain to prepare and screen two-dimensional crystals of membrane proteins by transmission electron microscopy (TEM) at room temperature. This automated process is an extension of a new procedure described recently that allows membrane protein 2D crystallization in parallel (Iacovache et al., 2010). The system includes a gantry robot that transfers and prepares the crystalline solutions on grids suitable for TEM analysis and an entirely automated microscope that can analyze 96 grids at once without human interference. The operation of the system at the user level is solely controlled within the MATLAB environment: the commands to perform sample handling (loading/unloading in the microscope), microscope steering (magnification, focus, image acquisition, etc.) as well as automatic crystal detection have been implemented. Different types of thin samples can efficiently be screened provided that the particular detection algorithm is adapted to the specific task. Hence, operating time can be shared between multiple users. This is a major step towards the integration of transmission electron microscopy into a high throughput work-flow.

  20. Perspective: On the active site model in computational catalyst screening

    NASA Astrophysics Data System (ADS)

    Reuter, Karsten; Plaisance, Craig P.; Oberhofer, Harald; Andersen, Mie

    2017-01-01

    First-principles screening approaches exploiting energy trends in surface adsorption represent an unparalleled success story in recent computational catalysis research. Here we argue that our still limited understanding of the structure of active sites is one of the major bottlenecks towards an ever extended and reliable use of such computational screening for catalyst discovery. For low-index transition metal surfaces, the prevalently chosen high-symmetry (terrace and step) sites offered by the nominal bulk-truncated crystal lattice might be justified. For more complex surfaces and composite catalyst materials, computational screening studies will need to actively embrace a considerable uncertainty with respect to what truly are the active sites. By systematically exploring the space of possible active site motifs, such studies might eventually contribute towards a targeted design of optimized sites in future catalysts.

  1. Verification and benchmarking of MAGNUM-2D: a finite element computer code for flow and heat transfer in fractured porous media

    SciTech Connect

    Eyler, L.L.; Budden, M.J.

    1985-03-01

    The objective of this work is to assess prediction capabilities and features of the MAGNUM-2D computer code in relation to its intended use in the Basalt Waste Isolation Project (BWIP). This objective is accomplished through a code verification and benchmarking task. Results are documented which support correctness of prediction capabilities in areas of intended model application. 10 references, 43 figures, 11 tables.

  2. Computed tomography screening for lung cancer: back to basics.

    PubMed

    Ellis, S M; Husband, J E; Armstrong, P; Hansell, D M

    2001-09-01

    After some years in the doldrums, interest in screening for lung cancer is resurging. Conflicting evidence from previous lung cancer screening trials, based on plain chest radiography, has been the subject of much debate: the failure to demonstrate a reduction in mortality has led to the widely held conclusion that screening for lung cancer is ineffective. The validity of this assumption has been questioned sporadically and a large study currently under way in the U.S.A. should help settle the issue. Recently, there has been interest in the use of computed tomography to screen for lung cancer; radiation doses have been reduced to 'acceptable' levels and the superiority of computed tomography (CT) over chest radiography for the identification of pulmonary nodules is unquestioned. However, whether improved nodule detection will result in a reduction in mortality has not yet been demonstrated. The present review provides a historical background to the current interest in low-dose CT screening, explains the arguments that previous studies have provoked, and discusses the recent and evolving status of lung cancer screening with CT.

  3. Device-dependent screen optimization using evolutionary computing

    NASA Astrophysics Data System (ADS)

    Bartels, Rudi

    2000-12-01

    Most halftoning algorithms assume ideal imaging devices that can render perfect square pixels. In real printing environments this is not the case: most imaging devices are a trade-off between the best quality and the highest speed. In this paper a screen is designed for Agfa's newspaper-dedicated computer-to-plate imaging device Polaris.
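A device-dependent screen optimization of this kind can be driven by a simple evolutionary loop: mutate a candidate threshold matrix and keep the mutant if a device-quality model scores it higher. The sketch below is a toy (1+1) evolution strategy; the "dispersion" fitness is an invented stand-in for a real print-quality model of the imaging device:

```python
# (1+1) evolution strategy over a 4x4 dither threshold matrix: swap two
# thresholds per generation, keep the candidate if neighboring
# thresholds become more dispersed (a Bayer-like, toy quality measure).
import random

N = 4  # 4x4 threshold matrix

def dispersion(perm):
    """Sum of |threshold difference| over horizontally/vertically adjacent cells."""
    m = [perm[r * N:(r + 1) * N] for r in range(N)]
    score = 0
    for r in range(N):
        for c in range(N):
            if c + 1 < N:
                score += abs(m[r][c] - m[r][c + 1])
            if r + 1 < N:
                score += abs(m[r][c] - m[r + 1][c])
    return score

random.seed(1)
best = list(range(N * N))          # start from a row-major screen
best_fit = dispersion(best)
for _ in range(2000):
    cand = best[:]                 # mutate: swap two cells
    i, j = random.sample(range(N * N), 2)
    cand[i], cand[j] = cand[j], cand[i]
    if dispersion(cand) >= best_fit:   # accept ties to drift across plateaus
        best, best_fit = cand, dispersion(cand)
print(best_fit)  # dispersion of the evolved screen
```

A production screen optimizer would replace `dispersion` with a model of the actual dot-rendering behavior of the target device, which is what makes the result device-dependent.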

  4. Logistical Consideration in Computer-Based Screening of Astronaut Applicants

    NASA Technical Reports Server (NTRS)

    Galarza, Laura

    2000-01-01

    This presentation reviews the logistical, ergonomic, and psychometric issues and data related to the development and operational use of a computer-based system for the psychological screening of astronaut applicants. The Behavioral Health and Performance Group (BHPG) at the Johnson Space Center upgraded its astronaut psychological screening and selection procedures for the 1999 astronaut applicants and subsequent astronaut selection cycles. The questionnaires, tests, and inventories were upgraded from a paper-and-pencil system to a computer-based system. Members of the BHPG and a computer programmer designed and developed the needed interfaces (screens, buttons, etc.) and programs for the astronaut psychological assessment system. This intranet-based system included the user-friendly computer-based administration of tests, test scoring, generation of reports, the integration of test administration and test output into a single system, and a complete database for past, present, and future selection data. Upon completion of the system development phase, four beta and usability tests were conducted with the newly developed system. The first three tests included 1 to 3 participants each; the final system test was conducted with 23 participants tested simultaneously. Usability and ergonomic data were collected from the system (beta) test participants and from 1999 astronaut applicants who volunteered the information in exchange for anonymity. Beta and usability test data were analyzed to examine operational, ergonomic, programming, test administration, and scoring issues related to computer-based testing. Results showed a preference for computer-based testing over paper-and-pencil procedures. The data also reflected specific ergonomic, usability, psychometric, and logistical concerns that should be taken into account in future selection cycles. Conclusion: Psychological, psychometric, human, and logistical factors must be examined and considered carefully when developing and

  5. Upgrade of PARC2D to include real gas effects. [computer program for flowfield surrounding aeroassist flight experiment

    NASA Technical Reports Server (NTRS)

    Saladino, Anthony; Praharaj, Sarat C.; Collins, Frank G.; Seaford, C. Mark

    1990-01-01

    This paper presents a description of the changes and additions to the perfect gas PARC2D code to include chemical equilibrium effects, resulting in a code called PARCEQ2D. The work developed out of a need to have the capability of more accurately representing the flowfield surrounding the aeroassist flight experiment (AFE) vehicle. Use is made of the partition function of statistical mechanics in the evaluation of the thermochemical properties. This approach will allow the PARC code to be extended to thermal nonequilibrium when this task is undertaken in the future. The transport properties follow from formulae from the kinetic theory of gases. Results are presented for a two-dimensional AFE that compare perfect gas and real gas solutions at flight conditions, showing vast differences between the two cases.

  6. A Benchmarking Analysis for Five Radionuclide Vadose Zone Models (CHAIN, MULTIMED_DP, FECTUZ, HYDRUS, and CHAIN 2D) in Soil Screening Level Calculations

    SciTech Connect

    Chen, J-S.; Drake, R.; Lin, Z.; Jewett, D. G.

    2002-02-26

    Five vadose zone models with different degrees of complexity (CHAIN, MULTIMED_DP, FECTUZ, HYDRUS, and CHAIN 2D) were selected for use in radionuclide soil screening level (SSL) calculations. A benchmarking analysis between the models was conducted for a radionuclide (⁹⁹Tc) release scenario at the Las Cruces Trench Site in New Mexico. The sensitivity of three model outputs to the input parameters was evaluated and compared among the models. The three outputs were peak contaminant concentration, time to peak concentration at the water table, and time to exceed the contaminant's maximum critical level at a representative receptor well. Model parameters investigated include soil properties such as bulk density, water content, soil water retention parameters, and hydraulic conductivity. Chemical properties examined include distribution coefficient, radionuclide half-life, dispersion coefficient, and molecular diffusion. Other soil characteristics, such as recharge rate, also were examined. Model sensitivity was quantified in the form of sensitivity and relative sensitivity coefficients. Relative sensitivities were used to compare the sensitivities of different parameters. The analysis indicates that soil water content, recharge rate, saturated soil water content, and the soil retention parameter β have a great influence on model outputs. In general, the results of sensitivities and relative sensitivities using the five models are similar for a specific scenario. Slight differences were observed in predicted peak contaminant concentrations due to different mathematical treatments among the models. The results of the benchmarking and sensitivity analysis will facilitate model selection and application of the models in SSL calculations.
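The relative sensitivity coefficient used to compare parameters across such models is the fractional change in output per fractional change in parameter, S_rel = (ΔO/O)/(ΔP/P). A minimal finite-difference sketch; the `peak_concentration` model below is an invented stand-in, not any of the five codes:

```python
# Finite-difference estimate of a relative sensitivity coefficient:
# bump one parameter by 1%, measure the fractional output change, and
# divide. Values near +/-1 indicate strong parameter influence.

def peak_concentration(recharge, kd):
    """Toy output: peak concentration rises with recharge, falls with sorption (kd)."""
    return 10.0 * recharge / (1.0 + 5.0 * kd)

def relative_sensitivity(model, params, name, delta=0.01):
    base = model(**params)
    bumped = dict(params, **{name: params[name] * (1.0 + delta)})
    return ((model(**bumped) - base) / base) / delta

p = {"recharge": 0.2, "kd": 0.5}
for name in p:
    print(name, round(relative_sensitivity(peak_concentration, p, name), 3))
```

Because the toy output is linear in recharge, its relative sensitivity is 1.0, while the sorption coefficient gets a negative value; ranking parameters by |S_rel| is how the analysis above identifies water content and recharge rate as dominant.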

  7. A review of automated image understanding within 3D baggage computed tomography security screening.

    PubMed

    Mouton, Andre; Breckon, Toby P

    2015-01-01

    Baggage inspection is the principal safeguard against the transportation of prohibited and potentially dangerous materials at airport security checkpoints. Although traditionally performed by 2D X-ray based scanning, increasingly stringent security regulations have led to a growing demand for more advanced imaging technologies. The role of X-ray Computed Tomography is thus rapidly expanding beyond the traditional materials-based detection of explosives. The development of computer vision and image processing techniques for the automated understanding of 3D baggage-CT imagery is, however, complicated by poor image resolution, image clutter, and high levels of noise and artefacts. We discuss the recent and most pertinent advancements and identify topics for future research within the challenging domain of automated image understanding for baggage security screening CT.

  8. Validity of computational hemodynamics in human arteries based on 3D time-of-flight MR angiography and 2D electrocardiogram gated phase contrast images

    NASA Astrophysics Data System (ADS)

    Yu, Huidan (Whitney); Chen, Xi; Chen, Rou; Wang, Zhiqiang; Lin, Chen; Kralik, Stephen; Zhao, Ye

    2015-11-01

    In this work, we demonstrate the validity of 4-D patient-specific computational hemodynamics (PSCH) based on 3-D time-of-flight (TOF) MR angiography (MRA) and 2-D electrocardiogram (ECG) gated phase contrast (PC) images. The mesoscale lattice Boltzmann method (LBM) is employed to segment morphological arterial geometry from TOF MRA, to extract velocity profiles from ECG PC images, and to simulate fluid dynamics on a unified GPU accelerated computational platform. Two healthy volunteers are recruited to participate in the study. For each volunteer, a 3-D high resolution TOF MRA image and 10 2-D ECG gated PC images are acquired to provide the morphological geometry and the time-varying flow velocity profiles for necessary inputs of the PSCH. Validation results will be presented through comparisons of LBM vs. 4D Flow Software for flow rates and LBM simulation vs. MRA measurement for blood flow velocity maps. Indiana University Health (IUH) Values Fund.
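The mesoscale LBM solver at the heart of such a pipeline advances particle distribution functions through a stream-and-collide cycle. A minimal D2Q9 BGK sketch on a periodic toy grid (the patient-specific geometry and PC-image velocity inputs of the abstract are omitted; grid size and relaxation time are illustrative):

```python
# Minimal D2Q9 lattice Boltzmann step: stream populations along lattice
# velocities, then relax toward the local equilibrium (BGK collision).
# Periodic boundaries; initialized at rest with a small density bump.

W = [4/9] + [1/9]*4 + [1/36]*4                      # lattice weights
E = [(0,0),(1,0),(0,1),(-1,0),(0,-1),(1,1),(-1,1),(-1,-1),(1,-1)]
NX, NY, TAU = 8, 8, 0.8

def equilibrium(rho, ux, uy):
    feq = []
    for w, (ex, ey) in zip(W, E):
        eu = ex*ux + ey*uy
        feq.append(w*rho*(1 + 3*eu + 4.5*eu*eu - 1.5*(ux*ux + uy*uy)))
    return feq

f = [[equilibrium(1.1 if (x, y) == (4, 4) else 1.0, 0.0, 0.0)
      for y in range(NY)] for x in range(NX)]

def step(f):
    # streaming: shift each population one cell along its velocity
    g = [[[0.0]*9 for _ in range(NY)] for _ in range(NX)]
    for x in range(NX):
        for y in range(NY):
            for k, (ex, ey) in enumerate(E):
                g[(x+ex) % NX][(y+ey) % NY][k] = f[x][y][k]
    # collision: BGK relaxation toward local equilibrium
    for x in range(NX):
        for y in range(NY):
            rho = sum(g[x][y])
            ux = sum(fk*ex for fk, (ex, ey) in zip(g[x][y], E)) / rho
            uy = sum(fk*ey for fk, (ex, ey) in zip(g[x][y], E)) / rho
            feq = equilibrium(rho, ux, uy)
            g[x][y] = [fk - (fk - fe)/TAU for fk, fe in zip(g[x][y], feq)]
    return g

mass0 = sum(sum(sum(c) for c in col) for col in f)
for _ in range(10):
    f = step(f)
mass = sum(sum(sum(c) for c in col) for col in f)
print(round(mass0, 6), round(mass, 6))  # total mass is conserved
```

The same cell-local structure is what makes LBM map so well onto the GPU platform the abstract describes: streaming is a memory shuffle and collision touches only one node at a time.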

  9. Decision trees and integrated features for computer aided mammographic screening

    SciTech Connect

    Kegelmeyer, W.P. Jr.; Groshong, B.; Allmen, M.; Woods, K.

    1997-02-01

    Breast cancer is a serious problem, which in the United States causes 43,000 deaths a year, eventually striking 1 in 9 women. Early detection is the only effective countermeasure, and mass mammography screening is the only reliable means for early detection. Mass screening has many shortcomings which could be addressed by a computer-aided mammographic screening system. Accordingly, we have applied the pattern recognition methods developed in earlier investigations of spiculated lesions in mammograms to the detection of microcalcifications and circumscribed masses, generating new, more rigorous and uniform methods for the detection of both those signs. We have also improved the pattern recognition methods themselves, through the development of a new approach to combinations of multiple classifiers.
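The simplest form of multi-classifier combination of the kind alluded to above is a majority vote over independent detectors. A minimal sketch; the three threshold "detectors" and their feature names are invented for illustration, not the authors' actual features:

```python
# Combining multiple classifiers by majority vote: a candidate region is
# flagged only when most of the individual detectors agree. The detectors
# here are toy threshold rules on a hypothetical feature dictionary.

def detector_brightness(x): return x["brightness"] > 0.7
def detector_contrast(x):   return x["contrast"] > 0.5
def detector_texture(x):    return x["texture_energy"] > 0.3

DETECTORS = [detector_brightness, detector_contrast, detector_texture]

def majority_vote(x):
    votes = sum(1 for d in DETECTORS if d(x))
    return votes * 2 > len(DETECTORS)   # flag if a strict majority fires

region = {"brightness": 0.8, "contrast": 0.6, "texture_energy": 0.1}
print(majority_vote(region))  # two of three detectors fire -> True
```

Voting suppresses the false positives of any single detector, which is the practical motivation for classifier combination in screening systems where specificity matters.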

  10. Affinity-Based Screening of Tetravalent Peptides Identifies Subtype-Selective Neutralizers of Shiga Toxin 2d, a Highly Virulent Subtype, by Targeting a Unique Amino Acid Involved in Its Receptor Recognition

    PubMed Central

    Mitsui, Takaaki; Watanabe-Takahashi, Miho; Shimizu, Eiko; Zhang, Baihao; Funamoto, Satoru; Yamasaki, Shinji

    2016-01-01

    Shiga toxin (Stx), a major virulence factor of enterohemorrhagic Escherichia coli (EHEC), can be classified into two subgroups, Stx1 and Stx2, each consisting of various closely related subtypes. Stx2 subtypes Stx2a and Stx2d are highly virulent and linked with serious human disorders, such as acute encephalopathy and hemolytic-uremic syndrome. Through affinity-based screening of a tetravalent peptide library, we previously developed peptide neutralizers of Stx2a in which the structure was optimized to bind to the B-subunit pentamer. In this study, we identified Stx2d-selective neutralizers by targeting Asn16 of the B subunit, an amino acid unique to Stx2d that plays an essential role in receptor binding. We synthesized a series of tetravalent peptides on a cellulose membrane in which the core structure was exactly the same as that of peptides in the tetravalent library. A total of nine candidate motifs were selected to synthesize tetravalent forms of the peptides by screening two series of the tetravalent peptides. Five of the tetravalent peptides effectively inhibited the cytotoxicity of Stx2a and Stx2d, and notably, two of the peptides selectively inhibited Stx2d. These two tetravalent peptides bound to the Stx2d B subunit with high affinity dependent on Asn16. The mechanism of binding to the Stx2d B subunit differed from that of binding to Stx2a in that the peptides covered a relatively wide region of the receptor-binding surface. Thus, this highly optimized screening technique enables the development of subtype-selective neutralizers, which may lead to more sophisticated treatments of infections by Stx-producing EHEC. PMID:27382021

  11. SU-E-T-497: Initial Characterization of a Novel 2D Computed Radiography (CR) Dosimeter for SBRT

    SciTech Connect

    Crijns, W; Ramaekers, S; Defraene, G; Haustermans, K; Depuydt, T; Leblans, P; Maes, F

    2015-06-15

    Purpose: For 2D, sub-mm-resolution dose measurements, Gafchromic™ film is currently a standard in radiotherapy, mainly because of its energy independence and water equivalence. However, EBT film is disposable; therefore, its dosimetric and uniformity characteristics need to be estimated from a second, possibly different, (calibration) film. Moreover, EBT shows post-irradiation coloration and a non-linear dose dependence with saturation, which limits the applicable time interval and dose range. CR technology forms an interesting alternative to EBT: dose-dependent CR plates have sub-mm resolution and, additionally, a linear dose dependence over decades of dose, but CR has an inherent signal fading and energy dependence. Here, for the first time, a radiotherapy 2D CR prototype was characterized for an extended dose range (up to 35 Gy), signal fading, and basic energy dependence. Methods: The prototype was irradiated with a standard 10×10 cm 6 MV photon beam and scanned with a commercial CR 15-X(R) scanner. The time between the start of irradiation and scanning (T-Scan) was monitored. The linearity of the dose response was evaluated between 0 and 35 Gy using a fixed T-Scan of 4 min where possible (i.e., ≤10 Gy). Next, the signal fading was characterized for a T-Scan range of 4 to 20 min. The energy dependence was assessed by comparing out-of-field measurements of CR, EBT, and TLD. Results: The radiotherapy CR prototype has a linear response over the complete SBRT dose range (0–35 Gy). The prototype showed a small (5%) but linear signal fading over the time interval of interest (4–20 min). Out-of-field, the prototype has an 8% over-response due to an increased amount of low-energy photons; the impact of this over-response on intensity-modulated radiotherapy remains to be evaluated. Conclusion: CR technology is promising for SBRT dose measurements up to 35 Gy. It is a reusable, linear alternative to film dosimetry.
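A linear detector with linear fading can be inverted in one line once the fading slope is calibrated. The sketch below uses an invented readout model, signal = k · dose · (1 − m · t_scan); the constants are hypothetical, chosen only to match the abstract's qualitative findings (linearity up to 35 Gy, ~5% fade across the 4–20 min window):

```python
# Hypothetical dose recovery for a linear CR dosimeter with linear
# signal fading: invert signal = K * dose * (1 - M * t_scan_min).
# K and M are invented calibration constants for illustration.

K = 100.0        # counts per Gy at t_scan = 0 (assumed calibration)
M = 0.05 / 16.0  # ~5% linear fade spread over the 16-min window

def dose_from_signal(signal, t_scan_min):
    return signal / (K * (1.0 - M * t_scan_min))

# Simulate a 10 Gy exposure read out 12 minutes after irradiation.
sig = K * 10.0 * (1.0 - M * 12.0)
print(round(dose_from_signal(sig, 12.0), 3))  # recovers 10.0 Gy
```

This is why monitoring T-Scan matters: with the scan time recorded, the linear fade is a deterministic correction rather than an uncertainty.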

  12. Analysis of 2D Torus and Hub Topologies of 100Mb/s Ethernet for the Whitney Commodity Computing Testbed

    NASA Technical Reports Server (NTRS)

    Pedretti, Kevin T.; Fineberg, Samuel A.; Kutler, Paul (Technical Monitor)

    1997-01-01

    A variety of different network technologies and topologies are currently being evaluated as part of the Whitney Project. This paper reports on the implementation and performance of a Fast Ethernet network configured in a 4x4 2D torus topology in a testbed cluster of 'commodity' Pentium Pro PCs. Several benchmarks were used for performance evaluation: an MPI point-to-point message passing benchmark, an MPI collective communication benchmark, and the NAS Parallel Benchmarks version 2.2 (NPB2). Our results show that for point-to-point communication on an unloaded network, the hub and 1-hop routes on the torus have about the same bandwidth and latency. However, the bandwidth decreases and the latency increases on the torus for each additional route hop. Collective communication benchmarks show that the torus provides roughly four times more aggregate bandwidth and eight times faster MPI barrier synchronizations than a hub-based network for 16-processor systems. Finally, the SOAPBOX benchmarks, which simulate real-world CFD applications, generally demonstrated substantially better performance on the torus than on the hub. In the few cases the hub was faster, the difference was negligible. In total, our experimental results lead to the conclusion that for Fast Ethernet networks, the torus topology has better performance and scales better than a hub-based network.
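The per-hop latency penalty reported above follows directly from the torus distance metric: each axis wraps, so the hop count per dimension is min(|d|, size − |d|). A short sketch computing hop statistics for the 4x4 case:

```python
# Hop counts on a 4x4 2D torus. Wraparound links cap the per-axis
# distance at size/2, giving the topology its diameter of 4 and a mean
# hop count just over 2 -- the routes that pay the per-hop latency cost.

SIZE = 4

def torus_hops(a, b):
    """Manhattan distance between nodes (x, y) with wraparound links."""
    return sum(min(abs(p - q), SIZE - abs(p - q)) for p, q in zip(a, b))

nodes = [(x, y) for x in range(SIZE) for y in range(SIZE)]
dists = [torus_hops((0, 0), n) for n in nodes if n != (0, 0)]
print(max(dists), sum(dists) / len(dists))  # diameter and mean hop count
```

The hub, by contrast, is one "hop" for every pair but shares a single aggregate bandwidth, which is why the torus wins on collective operations despite multi-hop routes.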

  13. FACET: a radiation view factor computer code for axisymmetric, 2D planar, and 3D geometries with shadowing

    SciTech Connect

    Shapiro, A.B.

    1983-08-01

    The computer code FACET calculates the radiation geometric view factor (alternatively called shape factor, angle factor, or configuration factor) between surfaces for axisymmetric, two-dimensional planar and three-dimensional geometries with interposed third surface obstructions. FACET was developed to calculate view factors for input to finite-element heat-transfer analysis codes. The first section of this report is a brief review of previous radiation-view-factor computer codes. The second section presents the defining integral equation for the geometric view factor between two surfaces and the assumptions made in its derivation. Also in this section are the numerical algorithms used to integrate this equation for the various geometries. The third section presents the algorithms used to detect self-shadowing and third-surface shadowing between the two surfaces for which a view factor is being calculated. The fourth section provides a user's input guide followed by several example problems.
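For 2D planar geometries of the kind FACET handles, unobstructed view factors have a classical closed form, Hottel's crossed-strings rule, which makes a convenient analytic check on numerical integration. A minimal sketch for two directly opposed parallel strips (a textbook configuration, not taken from the FACET report):

```python
# Hottel's crossed-strings rule for 2D planar view factors:
#   F12 = (sum of crossed strings - sum of uncrossed strings) / (2 * L1)
# Case: two parallel strips of width w facing each other at separation h.
from math import hypot

def parallel_strip_viewfactor(w, h):
    crossed = 2 * hypot(w, h)    # the two diagonal strings
    uncrossed = 2 * h            # the two straight side strings
    return (crossed - uncrossed) / (2 * w)

print(round(parallel_strip_viewfactor(1.0, 1.0), 4))  # sqrt(2) - 1 = 0.4142
```

As the strips move apart the view factor falls toward zero, matching the intuition that less of each surface's hemisphere is subtended by the other; shadowing by a third surface, as handled in FACET, has no such closed form and requires the obstruction tests described above.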

  14. 3D computations of flow field in a guide vane blading designed by means of 2D model for a low head hydraulic turbine

    NASA Astrophysics Data System (ADS)

    Krzemianowski, Z.; Puzyrewski, R.

    2014-08-01

    The paper presents the main parameters of the flow field behind the guide vane cascade designed by means of 2D inverse problem and following check by means of 3D commercial program ANSYS/Fluent applied for a direct problem. This approach of using different models reflects the contemporary design procedure for non-standardized turbomachinery stage. Depending on the model, the set of conservation equation to be solved differs, although the physical background remains the same. The example of computations for guide vane cascade for a low head hydraulic turbine is presented.

  15. Coupled 2-dimensional cascade theory for noise and unsteady aerodynamics of blade row interaction in turbofans. Volume 2: Documentation for computer code CUP2D

    NASA Technical Reports Server (NTRS)

    Hanson, Donald B.

    1994-01-01

    A two dimensional linear aeroacoustic theory for rotor/stator interaction with unsteady coupling was derived and explored in Volume 1 of this report. Computer program CUP2D has been written in FORTRAN embodying the theoretical equations. This volume (Volume 2) describes the structure of the code, installation and running, preparation of the input file, and interpretation of the output. A sample case is provided with printouts of the input and output. The source code is included with comments linking it closely to the theoretical equations in Volume 1.

  16. Computer-assisted lesion detection system for stomach screening using stomach shape and appearance models

    NASA Astrophysics Data System (ADS)

    Midoh, Y.; Nakamura, M.; Takashima, M.; Nakamae, K.; Fujioka, H.

    2007-03-01

    In Japan, stomach cancer is one of the three most common causes of death from cancer. As periodic health checks with stomach X-rays have become more widely carried out, the burden on physicians in mass screening to detect initial symptoms of disease has been increasing. For the purpose of automatic diagnosis, we are developing a computer-assisted lesion detection system for stomach screening. The proposed system has two databases. One is the stomach shape database, which consists of computer-graphics 3D stomach models based on biomechanics simulation and their projected 2D images. The other is the normal appearance database, which is constructed by learning patterns in a training set of normal patients. The stomach contour is extracted from an X-ray image including a barium-filled region by the following steps. First, the approximated stomach region is obtained by nonrigid registration based on mutual information. We define a nonrigid transformation as one that includes translations, rotations, scaling, the air-barium interface, and weights of eigenvectors determined by principal components analysis of the stomach shape database. Second, the accurate stomach contour is extracted from the gradient of the image by using dynamic programming. Then, stomach lesions are detected by inspecting whether, on the extracted stomach contour, the Mahalanobis distance from the mean of the normal appearance database exceeds a suitable threshold. We applied our system to 75 X-ray images of barium-filled stomachs to show its validity.
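The lesion test above is a Mahalanobis distance, d² = (x − μ)ᵀ S⁻¹ (x − μ), which scales each feature deviation by the variability seen in normal patients. A self-contained 2-D sketch with an explicit matrix inverse; the feature values, mean, and covariance are toy numbers, not the paper's learned appearance model:

```python
# Mahalanobis distance of a contour feature vector from the
# normal-appearance mean: deviations along high-variance directions
# count less than equal deviations along low-variance ones.

def mahalanobis(x, mu, cov):
    dx = [a - b for a, b in zip(x, mu)]
    (a, b), (c, d) = cov                       # 2x2 covariance matrix
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    d2 = sum(dx[i] * inv[i][j] * dx[j] for i in range(2) for j in range(2))
    return d2 ** 0.5

mu = [0.0, 0.0]
cov = [[4.0, 0.0], [0.0, 1.0]]            # feature 1 varies more in normals
print(mahalanobis([2.0, 0.0], mu, cov))   # 1.0: within normal spread
print(mahalanobis([0.0, 2.0], mu, cov))   # 2.0: a lesion candidate
```

The same Euclidean deviation of 2.0 yields different distances in the two directions, which is exactly why a threshold on Mahalanobis distance, rather than raw deviation, is used against the learned normal-appearance database.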

  17. 2D Computational Fluid Dynamic Modeling of Human Ventricle System Based on Fluid-Solid Interaction and Pulsatile Flow.

    PubMed

    Masoumi, Nafiseh; Framanzad, F; Zamanian, Behnam; Seddighi, A S; Moosavi, M H; Najarian, S; Bastani, Dariush

    2013-01-01

    Many diseases are related to cerebrospinal fluid (CSF) hydrodynamics. Understanding the hydrodynamics of CSF flow and intracranial pressure therefore provides deeper knowledge of pathological processes and supports better treatments. Furthermore, a reliable computational method is a promising route to in vitro models, which are essential for developing generic medicines. A fluid-solid interaction (FSI) model was constructed to simulate CSF flow. An important problem in modeling CSF flow is the diastolic back flow. In this article, using both rigid and flexible conditions for the ventricular system allowed us to evaluate the effect of the surrounding brain tissue. Our model assumed an elastic wall for the ventricles and a pulsatile CSF input as its boundary conditions. The results were compared with experimental data. The flexible model gave better results because it reproduced the diastolic back flow reported in clinical studies; previous rigid models ignored the interaction of the brain parenchyma with CSF and therefore did not capture back flow during diastole. In this computational fluid dynamics (CFD) analysis, the CSF pressure and flow velocity in different areas were concordant with the experimental data.

  19. VFLOW2D - A Vortex-Based Code for Computing Flow Over Elastically Supported Tubes and Tube Arrays

    SciTech Connect

    WOLFE,WALTER P.; STRICKLAND,JAMES H.; HOMICZ,GREGORY F.; GOSSLER,ALBERT A.

    2000-10-11

    A numerical flow model is developed to simulate two-dimensional fluid flow past immersed, elastically supported tube arrays. This work is motivated by the objective of predicting forces and motion associated with both deep-water drilling and production risers in the oil industry. This work has other engineering applications including simulation of flow past tubular heat exchangers or submarine-towed sensor arrays and the flow about parachute ribbons. In the present work, a vortex method is used for solving the unsteady flow field. This method demonstrates inherent advantages over more conventional grid-based computational fluid dynamics. The vortex method is non-iterative, does not require artificial viscosity for stability, displays minimal numerical diffusion, can easily treat moving boundaries, and allows a greatly reduced computational domain since vorticity occupies only a small fraction of the fluid volume. A gridless approach is used in the flow sufficiently distant from surfaces. A Lagrangian remap scheme is used near surfaces to calculate diffusion and convection of vorticity. A fast multipole technique is utilized for efficient calculation of velocity from the vorticity field. The ability of the method to correctly predict lift and drag forces on simple stationary geometries over a broad range of Reynolds numbers is presented.
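    The velocity evaluation at the heart of such a vortex method follows from the 2D Biot-Savart law; a direct-summation sketch is below (illustrative only: the report's code uses a fast multipole method rather than this O(N·M) loop, and the core-smoothing parameter `eps` is an assumption of this example):

```python
import numpy as np

def induced_velocity(targets, vortex_pos, gamma, eps=1e-6):
    """Velocity at target points induced by 2D point vortices (Biot-Savart):
    u = Gamma/(2*pi) * (-(y - y_j), (x - x_j)) / r^2, summed over vortices j."""
    vel = np.zeros_like(targets)
    for (xj, yj), g in zip(vortex_pos, gamma):
        dx = targets[:, 0] - xj
        dy = targets[:, 1] - yj
        r2 = dx**2 + dy**2 + eps**2   # desingularized vortex core
        vel[:, 0] += -g * dy / (2.0 * np.pi * r2)
        vel[:, 1] += g * dx / (2.0 * np.pi * r2)
    return vel
```

A single vortex of unit strength at the origin induces a purely tangential velocity of magnitude 1/(2πr) at distance r, which the sketch reproduces.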

  20. Investigation of mechanical strength of 2D nanoscale structures using a molecular dynamics based computational intelligence approach

    NASA Astrophysics Data System (ADS)

    Garg, A.; Vijayaraghavan, V.; Wong, C. H.; Tai, K.; Singru, Pravin M.; Mahapatra, S. S.; Sangwan, K. S.

    2015-09-01

    A molecular dynamics (MD) based computational intelligence (CI) approach is proposed to investigate the Young's modulus of two types of graphene sheets: armchair and zigzag. In this approach, the effects of aspect ratio, temperature, number of atomic planes, and vacancy defects on the Young's modulus of the two graphene sheets are first analyzed using MD simulation. The data obtained from the MD simulation are then fed into a CI cluster comprising genetic programming, specifically designed to formulate an explicit relationship for the Young's modulus of the two graphene structures. We find that the MD-based CI model captures the Young's modulus of both graphene structures very well, in good agreement with experimental results from the literature. Additionally, sensitivity and parametric analyses show that the number of defects has the most dominant influence on the Young's modulus of the two graphene structures.

  1. Local finite element enrichment strategies for 2D contact computations and a corresponding post-processing scheme

    NASA Astrophysics Data System (ADS)

    Sauer, Roger A.

    2013-08-01

    Recently an enriched contact finite element formulation has been developed that substantially increases the accuracy of contact computations while keeping the additional numerical effort at a minimum (Sauer, Int J Numer Meth Eng 87:593-616, 2011). Two enrichment strategies were proposed, one based on local p-refinement using Lagrange interpolation and one based on Hermite interpolation that produces C1-smoothness on the contact surface. Both classes, initially considered for the frictionless Signorini problem, are extended here to friction and to contact between deformable bodies. For this, a symmetric contact formulation is used that allows the unbiased treatment of both contact partners. This paper also proposes a post-processing scheme for contact quantities such as the contact pressure. The scheme, which provides a more accurate representation than the raw data, is based on an averaging procedure inspired by mortar formulations. The properties of the enrichment strategies and the corresponding post-processing scheme are illustrated by several numerical examples involving sliding and peeling contact in the presence of large deformations.

  2. Documentation of computer program VS2D to solve the equations of fluid flow in variably saturated porous media

    USGS Publications Warehouse

    Lappala, E.G.; Healy, R.W.; Weeks, E.P.

    1987-01-01

    This report documents FORTRAN computer code for solving problems involving variably saturated single-phase flow in porous media. The flow equation is written with total hydraulic potential as the dependent variable, which allows straightforward treatment of both saturated and unsaturated conditions. The spatial derivatives in the flow equation are approximated by central differences, and time derivatives are approximated either by a fully implicit backward or by a centered-difference scheme. Nonlinear conductance and storage terms may be linearized using either an explicit method or an implicit Newton-Raphson method. Relative hydraulic conductivity is evaluated at cell boundaries by using either full upstream weighting, the arithmetic mean, or the geometric mean of values from adjacent cells. Nonlinear boundary conditions treated by the code include infiltration, evaporation, and seepage faces. Extraction by plant roots that is caused by atmospheric demand is included as a nonlinear sink term. These nonlinear boundary and sink terms are linearized implicitly. The code has been verified for several one-dimensional linear problems for which analytical solutions exist and against two nonlinear problems that have been simulated with other numerical models. A complete listing of data-entry requirements and data entry and results for three example problems are provided. (USGS)
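    The three intercell weighting options for relative hydraulic conductivity described above can be sketched as follows (illustrative only; the function name and interface are not from the VS2D code, which is FORTRAN):

```python
def interface_conductivity(k_up, k_down, scheme="geometric"):
    """Relative hydraulic conductivity at a cell face, mirroring the three
    weighting options documented for VS2D: full upstream weighting, the
    arithmetic mean, or the geometric mean of the adjacent cell values."""
    if scheme == "upstream":
        return k_up                      # value from the upstream cell only
    if scheme == "arithmetic":
        return 0.5 * (k_up + k_down)
    if scheme == "geometric":
        return (k_up * k_down) ** 0.5
    raise ValueError(f"unknown scheme: {scheme}")
```

The choice matters for sharp wetting fronts: the geometric mean suppresses the overshoot that arithmetic averaging can produce when conductivity varies by orders of magnitude between cells.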

  3. Computer simulation of topological evolution in 2-d grain growth using a continuum diffuse-interface field model

    SciTech Connect

    Fan, D.; Geng, C.; Chen, L.Q.

    1997-03-01

    The local kinetics and topological phenomena during normal grain growth were studied in two dimensions by computer simulations employing a continuum diffuse-interface field model. The relationships between topological class and individual grain growth kinetics were examined and compared with results obtained previously from analytical theories, experiments, and Monte Carlo simulations. It was shown that both the grain-size and grain-shape (side) distributions are time-invariant, and the linear relationship between the mean radii of individual grains and topological class n was reproduced. The moments of the shape distribution were determined, and the differences among the data from soap froth, the Potts model, and the present simulation were discussed. In the limit where the grain size goes to zero, the average number of grain edges per grain is shown to be between 4 and 5, implying the direct vanishing of 4- and 5-sided grains, which appears consistent with recent experimental observations on thin films. Based on the simulation results, the conditions for the applicability of the familiar Mullins-von Neumann law and Hillert's equation were discussed.
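    The Mullins-von Neumann law referenced above states that in 2D a grain's area changes at a rate proportional to its number of sides minus six; a one-line sketch (M and gamma are a generic boundary mobility and energy, not values from the paper):

```python
import math

def von_neumann_growth_rate(n_sides, mobility=1.0, boundary_energy=1.0):
    """Mullins-von Neumann law in 2D: dA/dt = (pi/3) * M * gamma * (n - 6).
    Grains with fewer than 6 sides shrink; grains with more than 6 grow."""
    return (math.pi / 3.0) * mobility * boundary_energy * (n_sides - 6)
```

Six-sided grains are stationary, which is why deviations from the law near vanishing grains (the 4- and 5-sided classes discussed above) are diagnostically interesting.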

  4. 2-D computer modeling of oil generation and migration in a Transect of the Eastern Venezuela Basin

    SciTech Connect

    Gallango, O.; Parnaud, F.

    1993-02-01

    The aim of the study was a two-dimensional computer simulation of basin evolution based on available geological, geophysical, geochemical, geothermal, and hydrodynamic data, with the main purpose of determining the hydrocarbon generation and migration history. The modeling was done on two geological sections (platform and pre-thrusting) located along the Chacopata-Uverito transect in the Eastern Venezuela Basin. In the platform section, a hypothetical source rock equivalent to the Guayuta Group was considered in order to simulate hydrocarbon migration. The thermal history reconstruction of this hypothetical source rock confirms that it did not reach the oil window before the middle Miocene and that maturity in this sector is due to the sedimentation of the Freites, La Pica, and Mesa-Las Piedras formations. Oil expulsion and migration from this hypothetical source rock began after middle Miocene time. Expulsion of hydrocarbons took place mainly along the Oligocene-Miocene reservoir and does not at present reach zones beyond the Oritupano field, implying that the oil accumulated in the southern part of the basin was generated by a source rock located to the north, in the present deformation zone. Since 17 m.y. ago, a north-to-south water migration pattern has been observed in this section. In the pre-thrusting section, hydrocarbon expulsion started during the early Tertiary and took place mainly toward the Lower Cretaceous (El Cantil and Barranquin formations). At the end of the passive-margin stage, the main migration occurred across the Merecure reservoir, through which hydrocarbons migrated toward the Onado sector before thrusting.

  5. Computer-aided detection of masses in digital tomosynthesis mammography: combination of 3D and 2D detection information

    NASA Astrophysics Data System (ADS)

    Chan, Heang-Ping; Wei, Jun; Zhang, Yiheng; Moore, Richard H.; Kopans, Daniel B.; Hadjiiski, Lubomir; Sahiner, Berkman; Roubidoux, Marilyn A.; Helvie, Mark A.

    2007-03-01

    We are developing a computer-aided detection (CAD) system for masses on digital breast tomosynthesis mammograms (DBTs). The CAD system includes two parallel processes. In the first process, mass detection and feature analysis are performed in the reconstructed 3D DBT volume. A mass likelihood score is estimated for each mass candidate using a linear discriminant (LDA) classifier. In the second process, mass detection and feature analysis are applied to the individual projection view (PV) images. A mass likelihood score is estimated for each mass candidate using another LDA classifier. The mass likelihood images derived from the PVs are back-projected to the breast volume to estimate the 3D spatial distribution of the mass likelihood scores. The mass likelihood scores estimated by the two processes at the corresponding 3D location are then merged and evaluated using FROC analysis. In this preliminary study, a data set of 52 DBT cases acquired with a GE prototype system at the Massachusetts General Hospital was used. The LDA classifiers with stepwise feature selection were designed with leave-one-case-out resampling. In an FROC analysis, the CAD system for detection in the DBT volume alone achieved test sensitivities of 80% and 90% at an average FP rate of 1.6 and 3.0 per breast, respectively. In comparison, the average FP rates of the combined system were 1.2 and 2.3 per breast, respectively, at the same sensitivities. The combined system is a promising approach to improving mass detection on DBTs.
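    Both detection channels above score candidates with a linear discriminant. A minimal sketch of such a score (hypothetical inputs; the paper's stepwise feature selection and 3D score-merging procedure are not reproduced here):

```python
import numpy as np

def lda_score(x, mean_pos, mean_neg, cov):
    """Linear discriminant score w.x with w = Sigma^{-1} (mu_pos - mu_neg);
    higher scores indicate a more mass-like candidate."""
    w = np.linalg.solve(cov, mean_pos - mean_neg)
    return float(x @ w)

# Toy example: identity covariance, classes separated along the first feature.
score = lda_score(np.array([2.0, 3.0]),
                  mean_pos=np.array([1.0, 0.0]),
                  mean_neg=np.array([0.0, 0.0]),
                  cov=np.eye(2))
```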

  6. Defining the RNA internal loops preferred by benzimidazole derivatives via 2D combinatorial screening and computational analysis.

    PubMed

    Velagapudi, Sai Pradeep; Seedhouse, Steven J; French, Jonathan; Disney, Matthew D

    2011-07-06

    RNA is an important therapeutic target; however, RNA targets are generally underexploited due to a lack of understanding of the small molecules that bind RNA and the RNA motifs that bind small molecules. Herein, we describe the identification of the RNA internal loops derived from a 4096 member 3 × 3 nucleotide loop library that are the most specific and highest affinity binders to a series of four designer, druglike benzimidazoles. These studies establish a potentially general protocol to define the highest affinity and most specific RNA motif targets for heterocyclic small molecules. Such information could be used to target functionally important RNAs in genomic sequence.

  7. Electroencephalography (EEG)-based brain-computer interface (BCI): a 2-D virtual wheelchair control based on event-related desynchronization/synchronization and state control.

    PubMed

    Huang, Dandan; Qian, Kai; Fei, Ding-Yu; Jia, Wenchuan; Chen, Xuedong; Bai, Ou

    2012-05-01

    This study proposes an effective and practical paradigm for brain-computer interface (BCI)-based 2-D virtual wheelchair control. The paradigm is based on multi-class discrimination of the spatiotemporally distinguishable phenomena of event-related desynchronization/synchronization (ERD/ERS) in electroencephalogram signals associated with motor execution/imagery of right/left hand movement. Compared with the traditional method using ERD only, in which bilateral ERDs appear during left/right hand mental tasks, the 2-D control exhibited high accuracy within a short time, as incorporating ERS into the paradigm hypothetically enhanced the spatiotemporal feature contrast of ERS versus ERD. We also expected users to experience ease of control by including a noncontrol state. In this study, control commands were sent discretely while the virtual wheelchair moved continuously. We tested five healthy subjects in a single visit with two sessions, i.e., motor execution and motor imagery. Each session included a 20 min calibration and two sets of games lasting less than 30 min. The average target hit rate was as high as 98.4% with motor imagery, and every subject achieved a 100% hit rate in the second set of wheelchair control games. The average time to hit a target 10 m away was about 59 s, with 39 s for the best set. The superior control performance of subjects without intensive BCI training suggests a practical wheelchair control paradigm for BCI users.
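    ERD/ERS quantification conventionally compares band power during a mental task with a rest baseline; a minimal sketch follows (the raw periodogram estimator and the band limits here are illustrative assumptions, not the paper's exact pipeline):

```python
import numpy as np

def bandpower(signal, fs, f_lo, f_hi):
    """Power of `signal` in the [f_lo, f_hi] Hz band via a raw periodogram."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return float(psd[band].sum())

def erd_percent(task_power, rest_power):
    """ERD/ERS as percent band-power change relative to rest;
    negative values indicate desynchronization (ERD)."""
    return 100.0 * (task_power - rest_power) / rest_power
```

For example, a mu-band (8-12 Hz) power drop from 10 to 5 units during imagery corresponds to an ERD of -50%.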

  8. Computational Drug Target Screening through Protein Interaction Profiles

    PubMed Central

    Vilar, Santiago; Quezada, Elías; Uriarte, Eugenio; Costanzi, Stefano; Borges, Fernanda; Viña, Dolores; Hripcsak, George

    2016-01-01

    The development of computational methods to discover novel drug-target interactions on a large scale is of great interest. We propose a new method for virtual screening based on protein interaction profile similarity to discover new targets for molecules, including existing drugs. We calculated Target Interaction Profile Fingerprints (TIPFs) based on the ChEMBL database to evaluate drug similarity and generated new putative compound-target candidates from the non-intersecting targets in each pair of compounds. A set of drugs was further studied against monoamine oxidase B (MAO-B) and cyclooxygenase-1 (COX-1) through molecular docking and experimental assays. The drug ethoxzolamide and the natural compound piperlongumine, present in Piper longum L., showed hMAO-B activity with IC50 values of 25 and 65 μM, respectively. Five candidates, including lapatinib, SB-202190, RO-316233, GW786460X and indirubin-3′-monoxime, were tested against human COX-1. Compounds SB-202190 and RO-316233 showed an IC50 in hCOX-1 of 24 and 25 μM, respectively (a range similar to potent inhibitors such as diclofenac and indomethacin under the same experimental conditions). Lapatinib and indirubin-3′-monoxime showed moderate hCOX-1 activity (19.5% and 28% enzyme inhibition at 25 μM, respectively). Our modeling constitutes a multi-target predictor for large-scale virtual screening with potential in lead discovery, repositioning, and drug safety. PMID:27845365
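    The profile-similarity idea can be sketched with set-based target profiles: similarity between two drugs' target sets, and candidate targets drawn from the non-intersecting part of each pair (a toy illustration; the paper's TIPFs are ChEMBL-derived fingerprints, not bare sets):

```python
def tanimoto(a, b):
    """Tanimoto similarity between two sets of target identifiers."""
    a, b = set(a), set(b)
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def candidate_targets(a, b):
    """Putative new targets for a similar pair: targets hit by one
    compound but not (yet) known for the other (symmetric difference)."""
    return set(a) ^ set(b)
```

For two drugs sharing targets T2 and T3 but differing on T1 and T4, the similarity is 0.5 and the candidate set is {T1, T4}.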

  9. Microplate based biosensing with a computer screen aided technique.

    PubMed

    Filippini, Daniel; Andersson, Tony P M; Svensson, Samuel P S; Lundström, Ingemar

    2003-10-30

    Melanophores, dark pigment cells from the frog Xenopus laevis, have the ability to change light absorbance upon stimulation by different biological agents. Hormone exposure (e.g., melatonin or alpha-melanocyte-stimulating hormone) has been used here as a reversible stimulus to test a new compact microplate reading platform. As an application, the detection of the asthma drug formoterol in blood plasma samples is demonstrated. The present system uses a computer screen as a (programmable) large-area light source and a standard web camera as the recording medium, enabling even kinetic microplate reading on a versatile and broadly available platform that suffices to evaluate numerous bioassays. Especially in the context of point-of-care testing or self-testing applications, these capabilities are advantageous compared with highly dedicated and comparatively expensive commercial systems.

  10. Adaptive Computer-Assisted Mammography Training for Improved Breast Cancer Screening

    DTIC Science & Technology

    2012-10-01

    Award 11-1-0755. TITLE: Adaptive Computer-Assisted Mammography Training for Improved Breast Cancer Screening. PRINCIPAL INVESTIGATOR: Maciej… We propose to research the methodology for constructing adaptive computer-aided education systems for mammography. Improved mammography education could…

  11. Detailed landfill leachate plume mapping using 2D and 3D electrical resistivity tomography - with correlation to ionic strength measured in screens

    NASA Astrophysics Data System (ADS)

    Maurya, P. K.; Rønde, V. K.; Fiandaca, G.; Balbarini, N.; Auken, E.; Bjerg, P. L.; Christiansen, A. V.

    2017-03-01

    Leaching of organic and inorganic contamination from landfills is a serious environmental problem as surface water and aquifers are affected. In order to assess these risks and investigate the migration of leachate from the landfill, 2D and large scale 3D electrical resistivity tomography were used at a heavily contaminated landfill in Grindsted, Denmark. The inverted 2D profiles describe both the variations along the groundwater flow as well as the plume extension across the flow directions. The 3D inversion model shows the variability in the low resistivity anomaly pattern corresponding to differences in the ionic strength of the landfill leachate. Chemical data from boreholes agree well with the observations indicating a leachate plume which gradually sinks and increases in size while migrating from the landfill in the groundwater flow direction. Overall results show that the resistivity method has been very successful in delineating the landfill leachate plume and that good correlation exists between the resistivity model and leachate ionic strength.

  12. CUDA programs for the GPU computing of the Swendsen-Wang multi-cluster spin flip algorithm: 2D and 3D Ising, Potts, and XY models

    NASA Astrophysics Data System (ADS)

    Komura, Yukihiro; Okabe, Yutaka

    2014-03-01

    We present sample CUDA programs for the GPU computing of the Swendsen-Wang multi-cluster spin flip algorithm. We deal with the classical spin models: the Ising model, the q-state Potts model, and the classical XY model. As for the lattice, both the 2D (square) lattice and the 3D (simple cubic) lattice are treated. We already reported the idea of the GPU implementation for 2D models (Komura and Okabe, 2012). We here explain the details of the sample programs and discuss the performance of the present GPU implementation for the 3D Ising and XY models. We also show the calculated results of the moment ratio for these models and discuss phase transitions.
    Catalogue identifier: AERM_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERM_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 5632
    No. of bytes in distributed program, including test data, etc.: 14688
    Distribution format: tar.gz
    Programming language: C, CUDA
    Computer: System with an NVIDIA CUDA enabled GPU
    Operating system: System with an NVIDIA CUDA enabled GPU
    Classification: 23
    External routines: NVIDIA CUDA Toolkit 3.0 or newer
    Nature of problem: Monte Carlo simulation of classical spin systems. The Ising, q-state Potts, and classical XY models are treated on both two-dimensional and three-dimensional lattices.
    Solution method: GPU-based Swendsen-Wang multi-cluster spin flip Monte Carlo method. The CUDA implementation of the cluster labeling is based on the work by Hawick et al. [1] and by Kalentev et al. [2].
    Restrictions: The system size is limited by the memory of the GPU.
    Running time: For the parameters used in the sample programs, about a minute per program, depending on the system size, the number of Monte Carlo steps, etc.
    References: [1] K
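    For readers without a CUDA-capable GPU, the algorithm itself can be illustrated with a serial Swendsen-Wang sweep for the 2D Ising model (a sketch of the method only, not the distributed CUDA code; the GPU version parallelizes the cluster labeling that this sketch does with union-find):

```python
import numpy as np

def swendsen_wang_step(spins, beta, rng):
    """One Swendsen-Wang multi-cluster update for the 2D Ising model.

    Bonds between aligned nearest neighbors are activated with probability
    1 - exp(-2*beta); clusters are labeled with union-find and each cluster
    is flipped with probability 1/2."""
    L = spins.shape[0]
    parent = np.arange(L * L)

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    p_bond = 1.0 - np.exp(-2.0 * beta)
    for x in range(L):
        for y in range(L):
            i = x * L + y
            # right and down neighbors with periodic boundaries
            for j in (((x + 1) % L) * L + y, x * L + (y + 1) % L):
                if spins.flat[i] == spins.flat[j] and rng.random() < p_bond:
                    parent[find(i)] = find(j)

    flip = {root: rng.random() < 0.5
            for root in {find(i) for i in range(L * L)}}
    for i in range(L * L):
        if flip[find(i)]:
            spins.flat[i] *= -1
    return spins
```

Because whole clusters flip at once, the update avoids the critical slowing down of single-spin-flip dynamics near the transition, which is the point of parallelizing it on GPUs.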

  13. Icarus: A 2D direct simulation Monte Carlo (DSMC) code for parallel computers. User's manual - V.3.0

    SciTech Connect

    Bartel, T.; Plimpton, S.; Johannes, J.; Payne, J.

    1996-10-01

    Icarus is a 2D Direct Simulation Monte Carlo (DSMC) code which has been optimized for the parallel computing environment. The code is based on the DSMC method of Bird and models from free-molecular to continuum flowfields in either cartesian (x, y) or axisymmetric (z, r) coordinates. Computational particles, representing a given number of molecules or atoms, are tracked as they have collisions with other particles or surfaces. Multiple species, internal energy modes (rotation and vibration), chemistry, and ion transport are modelled. A new trace species methodology for collisions and chemistry is used to obtain statistics for small species concentrations. Gas phase chemistry is modelled using steric factors derived from Arrhenius reaction rates. Surface chemistry is modelled with surface reaction probabilities. The electron number density is either a fixed external generated field or determined using a local charge neutrality assumption. Ion chemistry is modelled with electron impact chemistry rates and charge exchange reactions. Coulomb collision cross-sections are used instead of Variable Hard Sphere values for ion-ion interactions. The electrostatic fields can either be externally input or internally generated using a Langmuir-Tonks model. The Icarus software package includes the grid generation, parallel processor decomposition, postprocessing, and restart software. The commercial graphics package, Tecplot, is used for graphics display. The majority of the software packages are written in standard Fortran.

  14. Combination of transient 2D-IR experiments and ab initio computations sheds light on the formation of the charge-transfer state in photoexcited carbonyl carotenoids.

    PubMed

    Di Donato, Mariangela; Segado Centellas, Mireia; Lapini, Andrea; Lima, Manuela; Avila, Francisco; Santoro, Fabrizio; Cappelli, Chiara; Righini, Roberto

    2014-08-14

    The excited state dynamics of carbonyl carotenoids is very complex because of the coupling of single- and doubly excited states and the possible involvement of intramolecular charge-transfer (ICT) states. In this contribution we employ ultrafast infrared spectroscopy and theoretical computations to investigate the relaxation dynamics of trans-8'-apo-β-carotenal occurring on the picosecond time scale, after excitation in the S2 state. In a (slightly) polar solvent like chloroform, one-dimensional (T1D-IR) and two-dimensional (T2D-IR) transient infrared spectroscopy reveal spectral components with characteristic frequencies and lifetimes that are not observed in nonpolar solvents (cyclohexane). Combining experimental evidence with an analysis of CASPT2//CASSCF ground and excited state minima and energy profiles, complemented with TDDFT calculations in gas phase and in solvent, we propose a photochemical decay mechanism for this system where only the bright single-excited 1Bu(+) and the dark double-excited 2Ag(-) states are involved. Specifically, the initially populated 1Bu(+) relaxes toward 2Ag(-) in 200 fs. In a nonpolar solvent 2Ag(-) decays to the ground state (GS) in 25 ps. In polar solvents, distortions along twisting modes of the chain promote a repopulation of the 1Bu(+) state which then quickly relaxes to the GS (18 ps in chloroform). The 1Bu(+) state has a high electric dipole and is the main contributor to the charge-transfer state involved in the dynamics in polar solvents. The 2Ag(-) → 1Bu(+) population transfer is evidenced by a cross peak on the T2D-IR map revealing that the motions along the same stretching of the conjugated chain on the 2Ag(-) and 1Bu(+) states are coupled.

  15. Non-native side chain IR probe in peptides: ab initio computation and 1D and 2D IR spectral simulation.

    PubMed

    Zheng, Michael L; Zheng, David C; Wang, Jianping

    2010-02-18

    The infrared frequency region of 2000-2600 cm(-1) (i.e., ca. 4-5 μm in wavelength) is a well-known open spectral window for peptides and proteins. In this work, six unnatural amino acids (unAAs) were designed to have characteristic absorption bands located in this region. The key chemical groups serving as side chains in these unAAs are C≡C, Phe-C≡C, N=C=O, N=C=S, P-H, and Si-H, respectively. Cysteine (a natural AA having S-H in its side chain) was also studied for comparison. The anharmonic vibrational properties, including frequencies, anharmonicities, and intermode couplings, were examined using density functional theory. Broadband linear infrared (IR) and two-dimensional (2D) IR spectra were simulated for each molecule. It is found that all of the side chain modes have significant overtone diagonal anharmonicities. All have moderate transition dipole strengths in comparison with the C=O stretching mode, except the C≡C and S-H stretching modes. In each case, a collection of 2D IR cross peaks was predicted to appear due to the presence of the side chain groups, whose strengths are closely related to the intramolecular anharmonic interactions and to the transition dipole strengths of the coupled vibrators. Further, potential energy distribution analysis and high-order anharmonic constant computation showed that these IR probes possess a varying degree of mode localization. The results suggest that these IR probes are potentially useful in complementing the well-studied amide-I mode to investigate structures and dynamics of peptides and proteins.

  16. 2D and 3D Traveling Salesman Problem

    ERIC Educational Resources Information Center

    Haxhimusa, Yll; Carpenter, Edward; Catrambone, Joseph; Foldes, David; Stefanov, Emil; Arns, Laura; Pizlo, Zygmunt

    2011-01-01

    When a two-dimensional (2D) traveling salesman problem (TSP) is presented on a computer screen, human subjects can produce near-optimal tours in linear time. In this study we tested human performance on a real and virtual floor, as well as in a three-dimensional (3D) virtual space. Human performance on the real floor is as good as that on a…
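    A common baseline against which human tours are judged is a simple construction heuristic. A nearest-neighbor sketch for Euclidean instances in 2D or 3D (illustrative only; this is not the method used in the study):

```python
import math

def nearest_neighbor_tour(points):
    """Greedy nearest-neighbor heuristic for the Euclidean TSP.
    Works for 2D or 3D coordinate tuples; starts from point 0."""
    unvisited = list(range(1, len(points)))
    tour = [0]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(points, tour):
    """Total length of the closed tour (returns to the start)."""
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))
```

On the four corners of a unit square the heuristic recovers the optimal tour of length 4; on harder instances it typically lands within tens of percent of optimal, which makes near-optimal human performance notable.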

  17. A semi-analytical model for computation of capillary entry pressures and fluid configurations in uniformly-wet pore spaces from 2D rock images

    NASA Astrophysics Data System (ADS)

    Frette, O. I.; Helland, J. O.

    2010-08-01

    A novel semi-analytical model for computation of capillary entry pressures and associated fluid configurations in arbitrary, potentially non-convex, 2D pore space geometries at uniform wettability is developed. The model computes all possible centre positions of circular arcs, and physically sound criteria are implemented to determine the set of these arcs that correspond to geometrically allowed interfaces. Interfaces and pore boundary segments are connected to form closed boundaries of identified geometrical regions. These regions are classified as either oil regions, located in the wider parts of the pore space, or as water regions, located in pore space constrictions. All possible region combinations are identified and evaluated for each radius value in an iterative procedure to determine the favourable entry radius and corresponding configuration based on minimisation of free energy. The model has been validated by comparison with known analytical solutions in idealised pore geometries. In cases where different analytical solutions are geometrically possible, the model generates several oil and water regions, and the valid solution is determined by the region combination that corresponds to the most favourable entry pressure, consistent with the analytical solution. Entry pressure radii and configurations are computed in strongly non-convex pore spaces extracted from an image of Bentheimer sandstone, which demonstrates that the model successfully captures well-known characteristics of capillary behaviour at different wetting conditions. The computations also demonstrate the importance of selecting the fluid configuration of minimum change in free energy. In some cases, a merged region formed by a combination of oil and water regions corresponds to the favourable entry configuration of oil, whereas in other cases an individual oil region may correspond to the favourable oil entry configuration. It is also demonstrated that oil entry configurations may

  18. VIBA-Lab 3.0: Computer program for simulation and semi-quantitative analysis of PIXE and RBS spectra and 2D elemental maps

    NASA Astrophysics Data System (ADS)

    Orlić, Ivica; Mekterović, Darko; Mekterović, Igor; Ivošević, Tatjana

    2015-11-01

    VIBA-Lab is a computer program originally developed by the author and co-workers at the National University of Singapore (NUS) as an interactive software package for simulation of Particle Induced X-ray Emission and Rutherford Backscattering Spectra. The original program has been redeveloped into VIBA-Lab 3.0, in which the user can perform semi-quantitative analysis by comparing simulated and measured spectra as well as simulate 2D elemental maps for a given 3D sample composition. The latest version has a new and more versatile user interface. It also has the latest data set of fundamental parameters such as Coster-Kronig transition rates, fluorescence yields, mass absorption coefficients and ionization cross sections for K and L lines in a wider energy range than the original program. Our short-term plan is to introduce a routine for quantitative analysis for multiple PIXE and XRF excitations. VIBA-Lab is an excellent teaching tool for students and researchers learning to use PIXE and RBS techniques. At the same time, the program helps when planning an experiment and when optimizing experimental parameters such as incident ions, their energy, detector specifications, filters, geometry, etc. By "running" a virtual experiment the user can test various scenarios until the optimal PIXE and RBS spectra are obtained, thereby saving a great deal of expensive machine time.

  19. A computer-controlled near-field electrospinning setup and its graphic user interface for precision patterning of functional nanofibers on 2D and 3D substrates.

    PubMed

    Bisht, Gobind; Nesterenko, Sergiy; Kulinsky, Lawrence; Madou, Marc

    2012-08-01

    Electrospinning is a versatile technique for production of nanofibers. However, it lacks the precision and control necessary for fabrication of nanofiber-based devices. The positional control of nanofiber placement can be dramatically improved using low-voltage near-field electrospinning (LV-NFES). LV-NFES allows nanofibers to be patterned on 2D and 3D substrates. However, use of NFES requires a low working distance between the electrospinning nozzle and substrate, manual jet initiation, and precise substrate movement to control fiber deposition. Environmental factors such as humidity also need to be controlled. We developed a computer-controlled automation strategy for LV-NFES to improve performance and reliability. With this setup, the user is able to control the relevant sensor and actuator parameters through a custom graphic user interface application programmed on the C#.NET platform. The stage movement can be programmed to achieve any desired nanofiber pattern and thickness. The nanofiber generation step is initiated through a software-controlled linear actuator. Parameter-setting files can be saved to an Excel sheet and reused across multiple experiments. Each experiment is automatically video recorded and stamped with the pertinent real-time parameters. Humidity is controlled with ±3% accuracy through a feedback loop. Further improvements, such as real-time droplet size control for feed-rate regulation, are in progress.
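    A humidity feedback loop of this kind can be illustrated with a minimal bang-bang controller. Everything here is hypothetical: the paper's C#.NET implementation is not described in detail, so `ToyChamber` stands in for the real sensor and actuator drivers and its dynamics are invented:

```python
class ToyChamber:
    """Minimal plant model: ambient drift pulls relative humidity (RH)
    down, the humidifier pushes it up. Purely illustrative dynamics."""
    def __init__(self, rh=30.0):
        self.rh = rh
        self.on = False

    def read_rh(self):
        self.rh += 0.8 if self.on else -0.4   # invented step response
        return self.rh

    def set_humidifier(self, on):
        self.on = on

def regulate_humidity(read_rh, set_humidifier, target, tol=3.0, steps=100):
    """Bang-bang feedback: switch the humidifier on below target - tol and
    off above target + tol, holding RH within roughly +/- tol percent."""
    for _ in range(steps):
        rh = read_rh()
        if rh < target - tol:
            set_humidifier(True)
        elif rh > target + tol:
            set_humidifier(False)

chamber = ToyChamber(rh=30.0)
regulate_humidity(chamber.read_rh, chamber.set_humidifier, target=45.0)
```

    A real controller would add hysteresis tuning and sensor filtering, but the on/off structure is enough to hold the toy chamber inside the quoted ±3% band once it converges.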

  20. Development of a non-denaturing 2D gel electrophoresis protocol for screening in vivo uranium-protein targets in Procambarus clarkii with laser ablation ICP MS followed by protein identification by HPLC-Orbitrap MS.

    PubMed

    Xu, Ming; Frelon, Sandrine; Simon, Olivier; Lobinski, Ryszard; Mounicou, Sandra

    2014-10-01

    Limited knowledge about in vivo non-covalent uranium (U)-protein complexes is largely due to the lack of appropriate analytical methodology. Here, a method for screening and identifying the molecular targets of U was developed. The approach was based on non-denaturing 1D and 2D gel electrophoresis (ND-PAGE and ND-2D-PAGE, the latter using previously described ND-IEF as the first dimension) in conjunction with laser ablation inductively coupled plasma mass spectrometry (LA-ICP MS) for the detection of U-containing proteins. The proteins were then identified by µbore HPLC-Orbitrap MS/MS. The method was applied to the analysis of the hepatopancreas (HP) cytosol of a model U-bioaccumulating organism (Procambarus clarkii). The imaging of uranium in 2D gels revealed the presence of 11 U-containing protein spots. Six protein candidates (i.e. ferritin, glyceraldehyde-3-phosphate dehydrogenase, triosephosphate isomerase, cytosolic manganese superoxide dismutase (Mn-SOD), glutathione S-transferase D1 and H3 histone family protein) were then identified by matching against the database of crustacean Decapoda species (e.g. crayfish). Among them, ferritin was the most important one. This strategy is expected to provide insight into U toxicology and metabolism.

  1. Visibility of microcalcifications in computed and screen-film mammography

    NASA Astrophysics Data System (ADS)

    Cowen, Arnold R.; Launders, Jason H.; Jadav, Mark; Brettle, David S.

    1997-08-01

    Due to the clinically and technically demanding nature of breast x-ray imaging, mammography still remains one of the few essentially film-based radiological imaging techniques in modern medical imaging. There are a range of possible benefits available if a practical and economical direct digital imaging technique can be introduced to routine clinical practice. There has been much debate regarding the minimum specification required for direct digital acquisition. One such direct digital system available is computed radiography (CR), which has a modest specification when compared with modern screen-film mammography (SFM) systems. This paper details two psychophysical studies in which the detection of simulated microcalcifications with CR has been directly compared to that with SFM. The first study found that under scatter-free conditions the minimum detectable size of microcalcification was approximately for both SFM and CR. The second study found that SFM had a 4.6% higher probability of observers being able to correctly identify the shape of diameter test details; there was no significant difference for either larger or smaller test details. From the results of these studies it has been demonstrated that the modest specification of CR, in terms of limiting resolution, does not translate into a dramatic difference in the perception of details at the limit of detectability. When judging the imaging performance of a system it is more important to compare the signal-to-noise ratio transfer spectrum characteristics, rather than simply the modulation transfer function.

  2. Implementation of subject-specific collagen architecture of cartilage into a 2D computational model of a knee joint--data from the Osteoarthritis Initiative (OAI).

    PubMed

    Räsänen, Lasse P; Mononen, Mika E; Nieminen, Miika T; Lammentausta, Eveliina; Jurvelin, Jukka S; Korhonen, Rami K

    2013-01-01

    A subject-specific collagen architecture of cartilage, obtained from T(2) mapping of 3.0 T magnetic resonance imaging (MRI; data from the Osteoarthritis Initiative), was implemented into a 2D finite element model of a knee joint with fibril-reinforced poroviscoelastic cartilage properties. For comparison, we created two models with alternative collagen architectures, addressing the potential inaccuracies caused by the nonoptimal estimation of the collagen architecture from MRI. Two further models with constant depth-dependent zone thicknesses obtained from the literature were also created. The mechanical behavior of the models was analyzed and compared under axial impact loading of 846 N. Compared to the model with patient-specific collagen architecture, the cartilage model without tangentially oriented collagen fibrils in the superficial zone showed up to a 69% decrease in maximum principal stress and fibril strain and 35% and 13% increases in maximum principal strain and pore pressure, respectively, in the superficial layers of the cartilage. The model with increased thickness for the superficial and middle zones, as obtained from the literature, demonstrated at most a 73% increase in stress, a 143% increase in fibril strain, and 26% and 23% decreases in strain and pore pressure, respectively, in the intermediate cartilage. The present results demonstrate that computational models of a knee joint with the collagen architecture of cartilage estimated from patient-specific MRI or from the literature lead to different stress and strain distributions. The findings also suggest that minor errors in the analysis of collagen architecture from MRI, for example due to the analysis method or MRI resolution, can lead to alterations in knee joint stresses and strains.

  3. Altered spin state equilibrium in the T309V mutant of cytochrome P450 2D6: a spectroscopic and computational study

    PubMed Central

    Bonifacio, Alois; Groenhof, André R.; Keizers, Peter H. J.; de Graaf, Chris; Commandeur, Jan N. M.; Vermeulen, Nico P. E.; Ehlers, Andreas W.; Lammertsma, Koop; Gooijer, Cees

    2007-01-01

    Cytochrome P450 2D6 (CYP2D6) is one of the most important cytochromes P450 in humans. Resonance Raman data from the T309V mutant of CYP2D6 show that the substitution of the conserved I-helix threonine situated in the enzyme’s active site perturbs the heme spin equilibrium in favor of the six-coordinated low-spin species. A mechanistic hypothesis is introduced to explain the experimental observations, and its compatibility with the available structural and spectroscopic data is tested using quantum-mechanical density functional theory calculations on active-site models for both the CYP2D6 wild type and the T309V mutant. PMID:17318599

  4. GRID2D/3D: A computer program for generating grid systems in complex-shaped two- and three-dimensional spatial domains. Part 2: User's manual and program listing

    NASA Technical Reports Server (NTRS)

    Bailey, R. T.; Shih, T. I.-P.; Nguyen, H. L.; Roelke, R. J.

    1990-01-01

    An efficient computer program, called GRID2D/3D, was developed to generate single and composite grid systems within geometrically complex two- and three-dimensional (2- and 3-D) spatial domains that can deform with time. GRID2D/3D generates single grid systems by using algebraic grid generation methods based on transfinite interpolation in which the distribution of grid points within the spatial domain is controlled by stretching functions. All single grid systems generated by GRID2D/3D can have grid lines that are continuous and differentiable everywhere up to the second-order. Also, grid lines can intersect boundaries of the spatial domain orthogonally. GRID2D/3D generates composite grid systems by patching together two or more single grid systems. The patching can be discontinuous or continuous. For continuous composite grid systems, the grid lines are continuous and differentiable everywhere up to the second-order except at interfaces where different single grid systems meet. At interfaces where different single grid systems meet, the grid lines are only differentiable up to the first-order. For 2-D spatial domains, the boundary curves are described by using either cubic or tension spline interpolation. For 3-D spatial domains, the boundary surfaces are described by using either linear Coons interpolation, bi-hyperbolic spline interpolation, or a new technique referred to as 3-D bi-directional Hermite interpolation. Since grid systems generated by algebraic methods can have grid lines that overlap one another, GRID2D/3D contains a graphics package for evaluating the grid systems generated. With the graphics package, the user can generate grid systems in an interactive manner with the grid generation part of GRID2D/3D. GRID2D/3D is written in FORTRAN 77 and can be run on any IBM PC, XT, or AT compatible computer. In order to use GRID2D/3D on workstations or mainframe computers, some minor modifications must be made in the graphics part of the program; no
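    The transfinite-interpolation idea that GRID2D/3D builds on can be sketched with NumPy. The formula is the standard 2D Coons blend of four boundary curves minus the bilinear corner term; the function name and the example boundary curves are illustrative, not taken from the FORTRAN source:

```python
import numpy as np

def transfinite_grid(bottom, top, left, right, ni, nj):
    """Single algebraic grid by 2D transfinite (Coons) interpolation.

    bottom/top are boundary curves parameterised by xi in [0, 1] and
    left/right by eta in [0, 1]; each maps an array of parameters to
    points of shape (..., 2), and the four curves must meet at the
    corners. A sketch of the interpolation method, not a transcription
    of GRID2D/3D itself.
    """
    xi, eta = np.meshgrid(np.linspace(0, 1, ni),
                          np.linspace(0, 1, nj), indexing="ij")
    u, v = xi[..., None], eta[..., None]           # broadcast over (x, y)
    return ((1 - v) * bottom(xi) + v * top(xi)     # blend bottom/top
            + (1 - u) * left(eta) + u * right(eta) # blend left/right
            - ((1 - u) * (1 - v) * bottom(0.0)     # subtract the corner
               + u * (1 - v) * bottom(1.0)         # terms counted twice
               + (1 - u) * v * top(0.0)
               + u * v * top(1.0)))

# Example domain: a unit square whose lower boundary bulges upward.
t = lambda s: np.asarray(s, dtype=float)
bottom = lambda s: np.stack([t(s), 0.1 * np.sin(np.pi * t(s))], axis=-1)
top    = lambda s: np.stack([t(s), np.ones_like(t(s))], axis=-1)
left   = lambda s: np.stack([np.zeros_like(t(s)), t(s)], axis=-1)
right  = lambda s: np.stack([np.ones_like(t(s)), t(s)], axis=-1)

grid = transfinite_grid(bottom, top, left, right, ni=9, nj=7)  # (9, 7, 2)
```

    Stretching functions, as used in GRID2D/3D, would simply replace the uniform `np.linspace` parameter distributions to cluster grid points near features of interest.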

  5. The New Screen Time: Computers, Tablets, and Smartphones Enter the Equation

    ERIC Educational Resources Information Center

    Wiles, Bradford B.; Schachtner, Laura; Pentz, Julie L.

    2016-01-01

    Emerging technologies attract children and push parents' and caregivers' abilities to attend to their families. This article presents recommendations related to the new version of screen time, which includes time with computers, tablets, and smartphones. Recommendations are provided for screen time for very young children and those in middle and…

  6. COMPUTATIONAL TOXICOLOGY - OBJECTIVE 2: DEVELOPING APPROACHES FOR PRIORITIZING CHEMICALS FOR SUBSEQUENT SCREENING AND TESTING

    EPA Science Inventory

    One of the strategic objectives of the Computational Toxicology Program is to develop approaches for prioritizing chemicals for subsequent screening and testing. Approaches currently available for this process require extensive resources. Therefore, less costly and time-extensi...

  7. DockScreen: A database of in silico biomolecular interactions to support computational toxicology

    EPA Science Inventory

    We have developed DockScreen, a database of in silico biomolecular interactions designed to enable rational molecular toxicological insight within a computational toxicology framework. This database is composed of chemical/target (receptor and enzyme) binding scores calculated by...

  8. The Use of Geometric Properties of 2D Arrays across Development

    ERIC Educational Resources Information Center

    Gibson, Brett M.; Leichtman, Michelle D.; Costa, Rachel; Bemis, Rhyannon

    2009-01-01

    Four- to 10-year-old children (n = 50) participated in a 2D search task that included geometry (with- and without lines) and feature conditions. During each of 27 trials, participants watched as a cartoon character hid behind one of three landmarks arranged in a triangle on a computer screen. During feature condition trials, participants could use…

  9. High-throughput screening, predictive modeling and computational embryology - Abstract

    EPA Science Inventory

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to chemical profiling to address sensitivity and specificity of molecular targets, biological pathways, cellular and developmental processes. EPA’s ToxCast project is testing 960 uniq...

  10. High-throughput screening, predictive modeling and computational embryology

    EPA Science Inventory

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to profile thousands of chemical compounds for biological activity and potential toxicity. EPA’s ToxCast™ project, and the broader Tox21 consortium, in addition to projects worldwide,...

  11. VERA2D-84: a computer program for two-dimensional analysis of flow, heat, and mass transfer in evaporative cooling towers. Volume 2. User's manual. Final report

    SciTech Connect

    Majumdar, A.K.; Agrawal, N.K.; Keeton, L.W.; Singhal, A.K.

    1985-07-01

    Cooling towers that do not meet design performance standards can add millions of dollars to the long-term operating costs of generating plants. The VERA2D-84 code offers a reliable method for predicting the performance of natural-draft and mechanical-draft towers on the basis of physical design information.

  12. Staring 2-D hadamard transform spectral imager

    DOEpatents

    Gentry, Stephen M.; Wehlburg, Christine M.; Wehlburg, Joseph C.; Smith, Mark W.; Smith, Jody L.

    2006-02-07

    A staring imaging system inputs a 2D spatial image containing multi-frequency spectral information. This image is encoded in one dimension with a cyclic Hadamard S-matrix. The resulting image is detected with a spatial 2D detector, and a computer applies a Hadamard transform to recover the encoded image.
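    The encode/decode principle can be demonstrated in a few lines of NumPy: build an S-matrix (a 0/1 open/closed mask) from a Sylvester Hadamard matrix, encode a signal with it, and invert. This sketches only the mathematics, not the patented optics; the 7-element signal is an arbitrary stand-in for one spectral column of the scene:

```python
import numpy as np

# Sylvester construction of an 8x8 Hadamard matrix (+1/-1 entries).
H = np.array([[1]])
for _ in range(3):
    H = np.block([[H, H], [H, -H]])

# S-matrix: drop the all-ones first row/column and map -1 -> 1, +1 -> 0,
# giving a 7x7 binary encoding mask.
S = (1 - H[1:, 1:]) // 2

x = np.arange(1.0, 8.0)      # stand-in for one column of the scene
y = S @ x                    # one encoded measurement per mask pattern
x_hat = np.linalg.solve(S.astype(float), y)   # inverse S-matrix transform
```

    In hardware the rows of `S` are realised as cyclic shifts of a single mask, so the whole encoding can be implemented by stepping one physical mask across the image.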

  13. Automatic classification of pulmonary peri-fissural nodules in computed tomography using an ensemble of 2D views and a convolutional neural network out-of-the-box.

    PubMed

    Ciompi, Francesco; de Hoop, Bartjan; van Riel, Sarah J; Chung, Kaman; Scholten, Ernst Th; Oudkerk, Matthijs; de Jong, Pim A; Prokop, Mathias; van Ginneken, Bram

    2015-12-01

    In this paper, we tackle the problem of automatic classification of pulmonary peri-fissural nodules (PFNs). The classification problem is formulated as a machine learning approach, where detected nodule candidates are classified as PFNs or non-PFNs. Supervised learning is used, where a classifier is trained to label the detected nodule. The classification of the nodule in 3D is formulated as an ensemble of classifiers trained to recognize PFNs based on 2D views of the nodule. In order to describe nodule morphology in 2D views, we use the output of a pre-trained convolutional neural network known as OverFeat. We compare our approach with a recently presented descriptor of pulmonary nodule morphology, namely Bag of Frequencies, and illustrate the advantages offered by the two strategies, achieving performance of AUC = 0.868, which is close to that of human experts.
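    The ensemble-of-views structure can be sketched with synthetic data. Note the substitutions: random vectors replace the OverFeat descriptors and a nearest-centroid rule replaces the paper's actual classifier, so only the skeleton (one classifier per 2D view, per-view scores averaged into a 3D decision) matches the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)
n_train, n_test, n_feat, n_views = 200, 60, 16, 3

# Synthetic stand-in for per-view CNN descriptors: for each of three 2D
# views of a nodule, features are shifted by the (hypothetical) PFN label.
y_train = rng.integers(0, 2, n_train)
y_test = rng.integers(0, 2, n_test)
train_views = [rng.normal(y_train[:, None], 1.0, (n_train, n_feat))
               for _ in range(n_views)]
test_views = [rng.normal(y_test[:, None], 1.0, (n_test, n_feat))
              for _ in range(n_views)]

# One nearest-centroid classifier per view...
centroids = [(X[y_train == 0].mean(axis=0), X[y_train == 1].mean(axis=0))
             for X in train_views]

def view_score(X, c0, c1):
    # positive when a sample sits closer to the PFN-class centroid
    return np.linalg.norm(X - c0, axis=1) - np.linalg.norm(X - c1, axis=1)

# ...and the ensemble averages the per-view scores before thresholding.
scores = np.mean([view_score(X, c0, c1)
                  for X, (c0, c1) in zip(test_views, centroids)], axis=0)
pred = (scores > 0).astype(int)
accuracy = (pred == y_test).mean()
```

    Averaging scores across views is the simplest fusion rule; the continuous `scores` vector is what an ROC analysis such as the reported AUC would be computed on.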

  14. Computer-Controlled Photometry And Large-Screen Display Resolution

    NASA Astrophysics Data System (ADS)

    Robinson, Waldo R.

    1981-10-01

    A new method for measuring the resolution and contrast of calligraphic projection displays has been developed at NOSC (Naval Ocean Systems Center). The technique involves the use of a computer-controlled photometer and an X-Y plotter. The plotter is normally used to give hard copy records of the photometric data obtained from spectral or spatial scans and subsequently stored in the memory of the controlling computer. In this application the photometric sensor, a miniature integrating sphere with a slit aperture, is mounted on the X-Y plotter pen holder, which is in turn moved according to programmed instructions from the computer. With the X-Y plotter positioned vertically in front of a projection display, the projected brightness is measured and stored in the computer memory as a function of sensor position. The spatial displacement between measurements is under operator control and can be in increments as small as one-thousandth of an inch. After the data are stored in computer memory, they are made into hard copy with the X-Y plotter in one of several plotting modes. From this hard copy the display performance parameters of resolution and contrast can be extracted. The different plotting modes are used to enhance various differences when more scans than one are recorded on the same chart.

  15. Smart time-pulse coding photoconverters as basic components 2D-array logic devices for advanced neural networks and optical computers

    NASA Astrophysics Data System (ADS)

    Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Lazarev, Alexander A.; Michalnichenko, Nikolay N.

    2004-04-01

    The article presents a concept for building arithmetic-logic devices (ALD) with a 2D structure and optical 2D-array inputs-outputs as advanced high-productivity parallel basic operational training modules for realization of basic operations of continuous, neuro-fuzzy, multilevel, threshold and other logics and vector-matrix, vector-tensor procedures in neural networks. The concept consists in the use of a time-pulse coding (TPC) architecture and 2D-array smart optoelectronic pulse-width (or pulse-phase) modulators (PWM or PPM) for transformation of input pictures. The input grayscale image is transformed into a group of corresponding short optical pulses or time positions of an optical two-level signal swing. We consider optoelectronic implementations of universal (quasi-universal) picture elements of two-valued ALD, multi-valued ALD, analog-to-digital converters, and multilevel threshold discriminators, and we show that 2D-array time-pulse photoconverters are the base elements for these devices. We show simulation results for the time-pulse photoconverters as base components. The devices have the following technical parameters: input optical signal power of 200 nW to 200 μW (for a photodiode responsivity of 0.5 A/W), conversion time from tens of microseconds to a millisecond, supply voltage of 1.5 to 15 V, power consumption from tens of microwatts to a milliwatt, and conversion nonlinearity of less than 1%. One cell consists of 2-3 photodiodes and about ten CMOS transistors. This simplicity allows the cells to be integrated in arrays of 32x32, 64x64 elements and more.
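    The core transformation, mapping a grayscale level to a pulse width and back, can be sketched as follows; the 1 ms conversion window `T_FRAME` is an assumed value chosen from the quoted conversion-time range, and the function names are illustrative:

```python
import numpy as np

T_FRAME = 1e-3   # assumed conversion window of 1 ms, i.e. the upper end of
                 # the quoted tens-of-microseconds-to-a-millisecond range

def to_pulse_width(image, t_max=T_FRAME):
    """Time-pulse coding: map each 8-bit grayscale level to a pulse
    duration, black -> 0 s, full scale (255) -> t_max."""
    return image.astype(float) / 255.0 * t_max

def from_pulse_width(widths, t_max=T_FRAME):
    """Inverse conversion: measured pulse width back to a grayscale level."""
    return np.rint(widths / t_max * 255.0).astype(np.uint8)

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)
pulses = to_pulse_width(img)           # one pulse width per pixel
recovered = from_pulse_width(pulses)   # exact round trip for 8-bit data
```

    In the optoelectronic hardware each pixel performs this conversion in parallel, which is what makes the 2D-array photoconverters usable as inputs to array logic.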

  16. Computer-aided 2D and 3D quantification of human stem cell fate from in vitro samples using Volocity high performance image analysis software.

    PubMed

    Piltti, Katja M; Haus, Daniel L; Do, Eileen; Perez, Harvey; Anderson, A J; Cummings, B J

    2011-11-01

    Accurate automated cell fate analysis of immunostained human stem cells from 2- and 3-dimensional (2D-3D) images would improve efficiency in the field of stem cell research. Development of an accurate and precise tool that reduces variability and the time needed for human stem cell fate analysis will improve productivity and interpretability of the data across research groups. In this study, we have created protocols for the high-performance image analysis software Volocity® to classify and quantify cytoplasmic and nuclear cell fate markers from 2D-3D images of human neural stem cells after in vitro differentiation. To enhance 3D image capture efficiency, we optimized the image acquisition settings of an Olympus FV10i® confocal laser scanning microscope to match our quantification protocols and improve cell fate classification. The methods developed in this study will allow for more time-efficient and accurate software-based, operator-validated stem cell fate classification and quantification from 2D and 3D images, yielding ≥94.4% correspondence with human-recognized objects.

  17. Protein engineering by highly parallel screening of computationally designed variants

    PubMed Central

    Sun, Mark G. F.; Seo, Moon-Hyeong; Nim, Satra; Corbi-Verge, Carles; Kim, Philip M.

    2016-01-01

    Current combinatorial selection strategies for protein engineering have been successful at generating binders against a range of targets; however, the combinatorial nature of the libraries and their vast undersampling of sequence space inherently limit these methods due to the difficulty in finely controlling protein properties of the engineered region. Meanwhile, great advances in computational protein design that can address these issues have largely been underutilized. We describe an integrated approach that computationally designs thousands of individual protein binders for high-throughput synthesis and selection to engineer high-affinity binders. We show that a computationally designed library enriches for tight-binding variants by many orders of magnitude as compared to conventional randomization strategies. We thus demonstrate the feasibility of our approach in a proof-of-concept study and successfully obtain low-nanomolar binders using in vitro and in vivo selection systems. PMID:27453948

  18. A novel incubation direct injection LC/MS/MS technique for in vitro drug metabolism screening studies involving the CYP 2D6 and the CYP 3A4 isozymes.

    PubMed

    Bhoopathy, S; Xin, B; Unger, S E; Karnes, H T

    2005-04-01

    A direct injection LC/MS/MS method involving a novel incubation technique was developed for the inhibition screening of CYP 2D6 and CYP 3A4 isoenzymes using dextromethorphan and midazolam as probe substrates. Both assays were performed using an electrospray ionization source in the positive ion mode. Direct injection was made possible by using a short C18 LC column (2 mm x 20 mm) with large-particle-diameter packing (10 µm). Analytical characteristics of the direct injection technique were studied by examining matrix effects, which showed suppression of the ESI signal between 0.20 and 0.65 min. The retention times for analytes were adjusted to approximately 0.8 min (k'>3), resulting in no matrix effect. Column lifetime was evaluated and determined to be approximately 160 direct injections of the matrix. The precision and accuracy of the control samples for the quantitation of dextromethorphan were between -0.53 and -12.80 and between 3.73 and 6.69%, respectively. Unlike conventional incubation techniques, incubations were carried out in an autosampler equipped with a heating accessory. This novel incubation method, which involved no stirring of the incubation mixture, estimated the CL(int, in vitro) for dextromethorphan and midazolam in human liver microsomes to be 1.65±0.22 ml/(h mg) and 0.861 ml/(min mg), respectively. The autosampler tray maintained uniform temperature and was sensitive to changes in temperature between 33 and 41 °C. High-throughput screening was performed using known inhibitors of the CYP 2D6 isozyme, and the system was evaluated for its ability to differentiate between these inhibitors. The strong inhibitor quinidine resulted in a 25.6% increase in t(1/2), the medium-potency inhibitor chlorpromazine resulted in an increase of 6.14%, and the weak inhibitor primaquine had no significant effect on half-life. This technique involves no sample preparation, demonstrated run times of 2 min per injection and can be fully automated. The method should

  19. School Students and Computer Games with Screen Violence

    ERIC Educational Resources Information Center

    Fedorov, A. V.

    2005-01-01

    In this article, the author states how these days, school students from low-income strata of the population in Russia spend hours sitting in computer rooms and Internet clubs, where, for a relatively small fee, they can play interactive video games. And to determine what games they prefer the author conducted a content analysis of eighty-seven…

  20. Low-dose computed tomography screening for lung cancer: how strong is the evidence?

    PubMed

    Woolf, Steven H; Harris, Russell P; Campos-Outcalt, Doug

    2014-12-01

    In 2013, the US Preventive Services Task Force (USPSTF) recommended low-dose computed tomographic (CT) screening for high-risk current and former smokers with a B recommendation (indicating a level of certainty that it offered moderate to substantial net benefit). Under the Affordable Care Act, the USPSTF recommendation requires commercial insurers to fully cover low-dose CT. The Centers for Medicare & Medicaid Services (CMS) is now considering whether to also offer coverage for Medicare beneficiaries. Although the National Lung Screening Trial (NLST) demonstrated the efficacy of low-dose CT, implementation of national screening may be premature. The magnitude of benefit from routine screening is uncertain; estimates are based on data from a single study and simulation models commissioned by the USPSTF. The potential harms, which could affect a large population, include false-positive results, anxiety, radiation exposure, diagnostic workups, and the resulting complications. It is unclear if routine screening would result in net benefit or net harm. The NLST may not be generalizable to a national screening program for the Medicare age group because 73% of NLST participants were younger than 65 years. Moreover, screening outside of trial conditions is less likely to be restricted to high-risk smokers and qualified imaging centers with responsible referral protocols. Until better data are available for older adults who are screened in ordinary (nontrial) community settings, CMS should postpone coverage of low-dose CT screening for Medicare beneficiaries.

  1. DSD2D-FLS 2010: Bdzil's 2010 DSD Code Base; Computing tb and Dn with Edits to Reduce the Noise in the Dn Field Near HE Boundaries

    SciTech Connect

    Bdzil, John Bohdan

    2016-09-21

    The full level-set function code, DSD3D, is fully described in LA-14336 (2007) [1]. This ASCI-supported DSD code project was the last such LANL DSD code project that I was involved with before my retirement in 2007. My part in the project was to design and build the core DSD3D solver, which was to include a robust DSD boundary condition treatment. A robust boundary condition treatment was required, since for an important local "customer," the only description of the explosives' boundary was through volume fraction data. Given this requirement, the accuracy issues I had encountered with our "fast-tube," narrowband, DSD2D solver, and the difficulty we had building an efficient MPI-parallel version of the narrowband DSD2D, I decided DSD3D should be built as a full level-set function code, using a totally local DSD boundary condition algorithm for the level-set function, phi, which did not rely on the gradient of the level-set function being one, |grad(phi)| = 1. The narrowband DSD2D solver was built on the assumption that |grad(phi)| could be driven to one, and near the boundaries of the explosive this condition was not being satisfied. Since the narrowband is typically no more than 10*dx wide, narrowband methods are discrete methods with a fixed, non-resolvable error, where the error is related to the thickness of the band: the narrower the band, the larger the errors. Such a solution represents a discrete approximation to the true solution and does not limit to the solution of the underlying PDEs under grid resolution.

  2. Tangential and sagittal curvature from the normals computed by the null screen method in corneal topography

    NASA Astrophysics Data System (ADS)

    Estrada-Molina, Amilcar; Díaz-Uribe, Rufino

    2011-08-01

    A new method for computing the tangential and sagittal curvatures from the normals to a cornea is proposed. The normals are obtained through a null screen method from the coordinates of the drop-shaped spots on the null screen, the coordinates on a reference approximating surface, and the centroids on the image plane. The method assumes that the cornea has rotational symmetry, and the derivations are carried out in the meridional plane that contains the symmetry axis. Experimental results are shown for a spherical calibration surface, using cylindrical null screens with radial point arrays.

  3. Lung Cancer Screening with Low-Dose Computed Tomography for Primary Care Providers

    PubMed Central

    Richards, Thomas B.; White, Mary C.; Caraballo, Ralph S.

    2015-01-01

    This review provides an update on lung cancer screening with low-dose computed tomography (LDCT) and its implications for primary care providers. One of the unique features of lung cancer screening is the potential complexity in patient management if an LDCT scan reveals a small pulmonary nodule. Additional tests, consultation with multiple specialists, and follow-up evaluations may be needed to evaluate whether lung cancer is present. Primary care providers should know the resources available in their communities for lung cancer screening with LDCT and smoking cessation, and the key points to be addressed in informed and shared decision-making discussions with patients. PMID:24830610

  4. A Computational model for compressed sensing RNAi cellular screening

    PubMed Central

    2012-01-01

    Background RNA interference (RNAi) has become an increasingly important and effective genetic tool for studying the function of target genes by suppressing specific genes of interest. This systems approach helps identify signaling pathways and cellular phase types by tracking intensity and/or morphological changes of cells. The traditional RNAi screening scheme, in which one siRNA is designed to knock down one specific mRNA target, needs a large library of siRNAs and turns out to be time-consuming and expensive. Results In this paper, we propose a conceptual model, called compressed sensing RNAi (csRNAi), which employs unique combinations of small interfering RNAs (siRNAs) to knock down a much larger number of genes. This strategy is based on the fact that one gene can be partially bound by several siRNAs and, conversely, one siRNA can bind to a few genes with distinct binding affinity. This model constructs a multi-to-multi correspondence between siRNAs and their targets, with far fewer siRNAs than mRNA targets, compared with the conventional scheme. Mathematically this problem involves an underdetermined system of equations (linear or nonlinear), which is ill-posed in general. However, the recently developed compressed sensing (CS) theory can solve this problem. We present a mathematical model to describe the csRNAi system based on both CS theory and biological concerns. To build this model, we first search nucleotide motifs in a target gene set. Then we propose a machine learning based method to find the effective siRNAs with novel features, such as image features and speech features, to describe an siRNA sequence. Numerical simulations show that we can reduce the siRNA library to one third of that in the conventional scheme. In addition, the features used to describe siRNAs outperform the existing ones substantially.
Conclusions This csRNAi system is very promising in saving both time and cost for large-scale RNAi screening experiments which
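
The underdetermined recovery problem at the heart of the csRNAi model can be illustrated with a standard compressed-sensing solver. The sketch below is not the authors' code: the measurement matrix, dimensions, and sparsity level are invented for illustration. It recovers a sparse effect vector from fewer measurements than unknowns via basis pursuit (l1 minimization), the canonical CS formulation.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve min ||x||_1 subject to A @ x = y via linear programming.

    Split x = u - v with u, v >= 0 and minimize sum(u) + sum(v).
    """
    m, n = A.shape
    c = np.ones(2 * n)                      # objective: sum of u and v entries
    A_eq = np.hstack([A, -A])               # encodes A @ (u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v

rng = np.random.default_rng(0)
n, m, k = 40, 20, 3                         # unknowns, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(m, n))                 # stand-in for siRNA/gene affinities
y = A @ x_true                              # observed pooled readout
x_hat = basis_pursuit(A, y)
```

With a Gaussian measurement matrix and k much smaller than m < n, basis pursuit typically recovers the sparse vector exactly; in the csRNAi setting the matrix would instead encode siRNA-mRNA binding affinities.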

  5. Molecular dynamics-based virtual screening: accelerating the drug discovery process by high-performance computing.

    PubMed

    Ge, Hu; Wang, Yu; Li, Chanjuan; Chen, Nanhao; Xie, Yufang; Xu, Mengyan; He, Yingyan; Gu, Xinchun; Wu, Ruibo; Gu, Qiong; Zeng, Liang; Xu, Jun

    2013-10-28

    High-performance computing (HPC) has become a national strategic technology in a number of countries. One hypothesis is that HPC can accelerate biopharmaceutical innovation. Our experimental data demonstrate that HPC can significantly accelerate biopharmaceutical innovation by employing molecular dynamics-based virtual screening (MDVS). Without HPC, MDVS for a 10K-compound library with tens of nanoseconds of MD simulations requires years of computer time. In contrast, a state-of-the-art HPC system can be 600 times faster than an eight-core PC server in screening a typical drug target (which contains about 40K atoms). Also, careful design of the GPU/CPU architecture can reduce HPC costs. However, the communication cost of parallel computing is a bottleneck that remains the main limit on further virtual screening improvements for drug innovation.

  6. Computational screening of oxetane monomers for novel hydroxy terminated polyethers.

    PubMed

    Sarangapani, Radhakrishnan; Ghule, Vikas D; Sikder, Arun K

    2014-06-01

    Energetic hydroxy-terminated polyether prepolymers are of paramount importance in the search for energetic binders for propellant applications. In the present study, density functional theory (DFT) was employed to screen various novel energetic oxetane derivatives, which typically form the backbone of these energetic polymers. Molecular structures were investigated at the B3LYP/6-31G* level, and isodesmic reactions were designed for calculating the gas-phase heats of formation. The condensed-phase heats of formation of the designed compounds were calculated by the Politzer approach using heats of sublimation. Among the designed oxetane derivatives, T4 and T5 possess condensed-phase heats of formation above 210 kJ mol(-1). The crystal packing densities of the designed oxetane derivatives varied from 1.2 to 1.6 g/cm(3). The detonation velocities and pressures were evaluated using the Kamlet-Jacobs equations from the predicted densities and condensed-phase heats of formation. Most of the designed oxetane derivatives were found to have detonation performance comparable to the monomers of benchmark energetic polymers, viz. NIMMO, AMMO, and BAMO. The strain energies (SE) of the oxetane derivatives were calculated using homodesmotic reactions, while intramolecular group interactions were predicted through disproportionation energies. The concept of chemical hardness was used to analyze the susceptibility of the designed compounds to reactivity and chemical transformations. The heats of formation, densities, and predicted performance imply that the designed molecules are candidates for polymer synthesis and potential energetic binders.
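
The Kamlet-Jacobs estimates referenced above take a simple closed form. The snippet below is a generic sketch of those relations as commonly stated in the literature, not the authors' implementation; the input values are hypothetical, RDX-like numbers for illustration. Here N is moles of gaseous detonation products per gram of explosive, M their average molar mass (g/mol), Q the heat of detonation (cal/g), and rho the loading density (g/cm^3).

```python
import math

def kamlet_jacobs(rho, N, M, Q):
    """Kamlet-Jacobs estimates: detonation velocity D (km/s) and pressure P (GPa)."""
    phi = N * math.sqrt(M) * math.sqrt(Q)          # characteristic parameter
    D = 1.01 * math.sqrt(phi) * (1.0 + 1.30 * rho)
    P = 1.558 * rho ** 2 * phi
    return D, P

# Illustrative inputs (hypothetical values, roughly RDX-like)
D, P = kamlet_jacobs(rho=1.80, N=0.034, M=27.0, Q=1500.0)
```

For these inputs the formulas give roughly D of 8.8 km/s and P of 35 GPa, in the range the abstract compares against benchmark energetic monomers.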

  7. Computational screening of organic materials towards improved photovoltaic properties

    NASA Astrophysics Data System (ADS)

    Dai, Shuo; Olivares-Amaya, Roberto; Amador-Bedolla, Carlos; Aspuru-Guzik, Alan; Borunda, Mario

    2015-03-01

    The world today faces an energy crisis that obstructs the development of human civilization. One of the most promising solutions is solar energy harvested by economical solar cells. As the third generation of solar cell materials, organic photovoltaic (OPV) materials are now under active development from both theoretical and experimental points of view. In this study, we constructed a parameter to select desirable molecules based on their optical spectra. We applied it to a large collection of potential OPV materials from the CEPDB database, set up by the Harvard Clean Energy Project. Time-dependent density functional theory (TD-DFT) modeling was used to calculate the absorption spectra of the molecules. Based on the parameter, we then identified the top-performing molecules for potential OPV use and suggested experimental efforts toward their synthesis. In addition, from those molecules we summarized the functional groups that confer particular spectral capabilities. We hope this information can guide the molecular design of OPV materials.

  8. The importance of lung cancer screening with low-dose computed tomography for Medicare beneficiaries.

    PubMed

    Wood, Douglas E

    2014-12-01

    The National Lung Screening Trial has provided convincing evidence of a substantial mortality benefit of lung cancer screening with low-dose computed tomography (CT) for current and former smokers at high risk. The United States Preventive Services Task Force has recommended screening, triggering coverage of low-dose CT by private health insurers under provisions of the Affordable Care Act. The Centers for Medicare & Medicaid Services (CMS) are currently evaluating coverage of lung cancer screening for Medicare beneficiaries. Since 70% of lung cancer occurs in patients 65 years or older, CMS should cover low-dose CT, thus avoiding the situation of at-risk patients being screened up to age 64 through private insurers and then abruptly ceasing screening at exactly the ages when their risk for developing lung cancer is increasing. Legitimate concerns include false-positive findings that lead to further testing and invasive procedures, overdiagnosis (detection of clinically unimportant cancers), the morbidity and mortality of surgery, and the overall costs of follow-up tests and procedures. These concerns can be mitigated by clear criteria for screening high-risk patients, disciplined management of abnormalities based on algorithms, and high-quality multidisciplinary care. Lung cancer screening with low-dose CT can lead to early diagnosis and cure for thousands of patients each year. Professional societies can help CMS responsibly implement a program that is patient-centered and minimizes unintended harms and costs.

  9. Reading from computer screen versus reading from paper: does it still make a difference?

    PubMed

    Köpper, Maja; Mayr, Susanne; Buchner, Axel

    2016-05-01

    Four experiments were conducted to test whether recent developments in display technology would suffice to eliminate the well-known disadvantages in reading from screen as compared with paper. Proofreading speed and performance were equal for a TFT-LCD and a paper display, but there were more symptoms of eyestrain in the screen condition accompanied by a strong preference for paper (Experiment 1). These results were replicated using a longer reading duration (Experiment 2). Additional experiments were conducted to test hypotheses about the reasons for the higher amount of eyestrain associated with reading from screen. Reduced screen luminance did not change the pattern of results (Experiment 3), but positioning both displays in equal inclination angles eliminated the differences in eyestrain symptoms and increased proofreading speed in the screen condition (Experiment 4). A paper-like positioning of TFT-LCDs seems to enable unimpaired reading without evidence of increased physical strain. Practitioner Summary: Given the developments in screen technology, a re-assessment of the differences in proofreading speed and performance, well-being, and preference between computer screen and paper was conducted. State-of-the-art TFT-LCDs enable unimpaired reading, but a book-like positioning of screens seems necessary to minimise eyestrain symptoms.

  10. Vertical 2D Heterostructures

    NASA Astrophysics Data System (ADS)

    Lotsch, Bettina V.

    2015-07-01

    Graphene's legacy has become an integral part of today's condensed matter science and has equipped a whole generation of scientists with an armory of concepts and techniques that open up new perspectives for the postgraphene era. In particular, the judicious combination of 2D building blocks into vertical heterostructures has recently been identified as a promising route to rationally engineer complex multilayer systems and artificial solids with intriguing properties. The present review highlights recent developments in the rapidly emerging field of 2D nanoarchitectonics from a materials chemistry perspective, with a focus on the types of heterostructures available, their assembly strategies, and their emerging properties. This overview is intended to bridge the gap between two major—yet largely disjunct—developments in 2D heterostructures, which are firmly rooted in solid-state chemistry or physics. Although the underlying types of heterostructures differ with respect to their dimensions, layer alignment, and interfacial quality, there is common ground, and future synergies between the various assembly strategies are to be expected.

  11. Performance of linear and nonlinear texture measures in 2D and 3D for monitoring architectural changes in osteoporosis using computer-generated models of trabecular bone

    NASA Astrophysics Data System (ADS)

    Boehm, Holger F.; Link, Thomas M.; Monetti, Roberto A.; Mueller, Dirk; Rummeny, Ernst J.; Raeth, Christoph W.

    2005-04-01

    Osteoporosis is a metabolic bone disease leading to demineralization and increased risk of fracture. The two major factors that determine the biomechanical competence of bone are the degree of mineralization and micro-architectural integrity. Today, modern imaging modalities (high-resolution MRI, micro-CT) are capable of depicting structural details of trabecular bone tissue. From the image data, structural properties obtained by quantitative measures are analysed with respect to the presence of osteoporotic fractures of the spine (in vivo) or correlated with biomechanical strength as derived from destructive testing (in vitro). Linear structural measures in 2D, originally adopted from standard histomorphometry, are fairly well established. Recently, non-linear techniques in 2D and 3D based on the scaling index method (SIM), the standard Hough transform (SHT), and the Minkowski Functionals (MF) have been introduced, which show excellent performance in predicting bone strength and fracture risk. However, little is known about the performance of the various parameters with respect to monitoring structural changes due to progression of osteoporosis or as a result of medical treatment. In this contribution, we generate models of trabecular bone with pre-defined structural properties which are exposed to simulated osteoclastic activity. We apply linear and non-linear texture measures to the models and analyse their performance with respect to detecting architectural changes. This study demonstrates that the texture measures are capable of monitoring structural changes in complex model data. The diagnostic potential varies for the different parameters and is found to depend on the topological composition of the model and the initial "bone density". In our models, non-linear texture measures tend to react more sensitively to small structural changes than linear measures. 
Best performance is observed for the 3rd and 4th Minkowski Functionals and for the scaling
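
The Minkowski Functionals referenced above have simple pixel-based estimators in 2D. The sketch below is an illustrative implementation (not the authors' code) of the three 2D functionals of a binary image: area, perimeter, and the Euler characteristic, the latter computed as V - E + F over the pixel (cubical) complex.

```python
import numpy as np

def minkowski_2d(img):
    """2D Minkowski functionals of a binary image: area, perimeter, Euler characteristic."""
    img = np.asarray(img, dtype=bool)
    p = np.pad(img, 1, constant_values=False)      # guarantee a background border
    area = int(img.sum())
    # perimeter: unit-length foreground/background transitions (4-neighbourhood)
    perim = int((p[1:, :] != p[:-1, :]).sum() + (p[:, 1:] != p[:, :-1]).sum())
    # Euler characteristic chi = V - E + F of the pixel complex
    F = area
    E = int(np.logical_or(p[1:, :], p[:-1, :]).sum()
            + np.logical_or(p[:, 1:], p[:, :-1]).sum())
    V = int(np.logical_or.reduce(
        [p[1:, 1:], p[1:, :-1], p[:-1, 1:], p[:-1, :-1]]).sum())
    return area, perim, V - E + F

# filled 10x10 square: one component, no holes
square = np.ones((10, 10))
area, perim, chi = minkowski_2d(square)
```

For the filled square this yields area 100, perimeter 40, and chi 1; punching a hole in the square drops chi to 0, which is the kind of topological change such measures track in trabecular networks.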

  12. Automated computational screening of the thiol reactivity of substituted alkenes

    NASA Astrophysics Data System (ADS)

    Smith, Jennifer M.; Rowley, Christopher N.

    2015-08-01

    Electrophilic olefins can react with the S-H moiety of cysteine side chains. The formation of a covalent adduct through this mechanism can result in the inhibition of an enzyme. The reactivity of an olefin towards cysteine depends on its functional groups. In this study, 325 reactions of thiol-Michael-type additions to olefins were modeled using density functional theory. All combinations of ethenes with hydrogen, methyl ester, amide, and cyano substituents were included. An automated workflow was developed to perform the construction, conformation search, minimization, and calculation of molecular properties for the reactant, carbanion intermediate, and thioether products for a model reaction of the addition of methanethiol to the electrophile. Known cysteine-reactive electrophiles present in the database were predicted to react exergonically with methanethiol through a carbanion with a stability in the 30-40 kcal mol(-1) range. 13 other compounds in our database that are also present in the PubChem database have similar properties. Natural bond orbital parameters were computed and regression analysis was used to determine the relationship between properties of the olefin electronic structure and the product and intermediate stability. The stability of the intermediates is very sensitive to electronic effects on the carbon where the anionic charge is centered. The stability of the products is more sensitive to steric factors.
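
The regression step described, relating electronic-structure descriptors to intermediate stability, can be sketched with ordinary least squares. The descriptors and coefficients below are synthetic placeholders, not the paper's data; the point is only the mechanics of fitting stability against computed descriptors.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 325                                     # one row per modeled reaction
# hypothetical NBO-style descriptors: charge at the anionic carbon, substituent bulk
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
beta_true = np.array([35.0, -4.0, 0.5])     # synthetic ground-truth coefficients
y = X @ beta_true + rng.normal(scale=0.1, size=n)   # stability, kcal/mol

# ordinary least squares fit of stability against the descriptors
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With hundreds of reactions and few descriptors, the fitted coefficients recover the generating values closely, which is what makes such a regression useful for ranking electronic versus steric contributions.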

  14. Computer Decision Support to Improve Autism Screening and Care in Community Pediatric Clinics

    ERIC Educational Resources Information Center

    Bauer, Nerissa S.; Sturm, Lynne A.; Carroll, Aaron E.; Downs, Stephen M.

    2013-01-01

    An autism module was added to an existing computer decision support system (CDSS) to facilitate adherence to recommended guidelines for screening for autism spectrum disorders in primary care pediatric clinics. User satisfaction was assessed by survey and informal feedback at monthly meetings between clinical staff and the software team. To assess…

  15. 2D semiconductor optoelectronics

    NASA Astrophysics Data System (ADS)

    Novoselov, Kostya

    The advent of graphene and related 2D materials has recently led to a new technology: heterostructures based on these atomically thin crystals. The paradigm proved itself extremely versatile and led to rapid demonstration of tunnelling diodes with negative differential resistance, tunnelling transistors, photovoltaic devices, etc. By taking the complexity and functionality of such van der Waals heterostructures to the next level we introduce quantum wells engineered with one atomic plane precision. Light emission from such quantum wells, quantum dots and polaritonic effects will be discussed.

  16. Does Patient Time Spent Viewing Computer-Tailored Colorectal Cancer Screening Materials Predict Patient-Reported Discussion of Screening with Providers?

    ERIC Educational Resources Information Center

    Sanders, Mechelle; Fiscella, Kevin; Veazie, Peter; Dolan, James G.; Jerant, Anthony

    2016-01-01

    The main aim is to examine whether patients' viewing time on information about colorectal cancer (CRC) screening before a primary care physician (PCP) visit is associated with discussion of screening options during the visit. We analyzed data from a multi-center randomized controlled trial of a tailored interactive multimedia computer program…

  17. Designing Multimedia Learning Application with Learning Theories: A Case Study on a Computer Science Subject with 2-D and 3-D Animated Versions

    ERIC Educational Resources Information Center

    Rias, Riaza Mohd; Zaman, Halimah Badioze

    2011-01-01

    Higher learning based instruction may be primarily concerned in most cases with the content of their academic lessons, and not very much with their instructional delivery. However, the effective application of learning theories and technology in higher education has an impact on student performance. With the rapid progress in the computer and…

  18. Patient Perspectives on Low-Dose Computed Tomography for Lung Cancer Screening, New Mexico, 2014

    PubMed Central

    Sussman, Andrew L.; Murrietta, Ambroshia M.; Getrich, Christina M.; Rhyne, Robert; Crowell, Richard E.; Taylor, Kathryn L.; Reifler, Ellen J.; Wescott, Pamela H.; Saeed, Ali I.; Hoffman, Richard M.

    2016-01-01

    Introduction National guidelines call for annual lung cancer screening for high-risk smokers using low-dose computed tomography (LDCT). The objective of our study was to characterize patient knowledge and attitudes about lung cancer screening, smoking cessation, and shared decision making by patient and health care provider. Methods We conducted semistructured qualitative interviews with patients with histories of heavy smoking who received care at a Federally Qualified Health Center (FQHC Clinic) and at a comprehensive cancer center-affiliated chest clinic (Chest Clinic) in Albuquerque, New Mexico. The interviews, conducted from February through September 2014, focused on perceptions about health screening, knowledge and attitudes about LDCT screening, and preferences regarding decision aids. We used a systematic iterative analytic process to identify preliminary and emergent themes and to create a coding structure. Results We reached thematic saturation after 22 interviews (10 at the FQHC Clinic, 12 at the Chest Clinic). Most patients were unaware of LDCT screening for lung cancer but were receptive to the test. Some smokers said they would consider quitting smoking if their screening result were positive. Concerns regarding screening were cost, radiation exposure, and transportation issues. To support decision making, most patients said they preferred one-on-one discussions with a provider. They also valued decision support tools (print materials, videos), but raised concerns about readability and Internet access. Conclusion Implementing lung cancer screening in sociodemographically diverse populations poses significant challenges. The value of tobacco cessation counseling cannot be overemphasized. Effective interventions for shared decision making to undergo lung cancer screening will need the active engagement of health care providers and will require the use of accessible decision aids designed for people with low health literacy. PMID:27536900

  19. Discovery of novel MDR-Mycobacterium tuberculosis inhibitor by new FRIGATE computational screen.

    PubMed

    Scheich, Christoph; Szabadka, Zoltán; Vértessy, Beáta; Pütter, Vera; Grolmusz, Vince; Schade, Markus

    2011-01-01

    With 1.6 million deaths annually and 2 billion people infected, tuberculosis is still one of the most pressing healthcare challenges. Here we report on the new computational docking algorithm FRIGATE, which unites continuous local optimization techniques (the conjugate gradient method) with an inherently discrete computational approach to forcefield computation, resulting in equal or better scoring accuracies than several benchmark docking programs. By utilizing FRIGATE for a virtual screen of the ZINC library against the Mycobacterium tuberculosis (Mtb) enzyme antigen 85C, we identified novel small molecule inhibitors of multiple drug-resistant Mtb, which bind in vitro to the catalytic site of antigen 85C.
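
The continuous local optimization component named above is the conjugate gradient method. A minimal CG solver for a symmetric positive-definite quadratic, a generic textbook sketch unrelated to FRIGATE's actual forcefield code, looks like this:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Minimize 0.5 x^T A x - b^T x (equivalently, solve A x = b) for SPD A."""
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x              # residual = negative gradient at x
    p = r.copy()               # initial search direction
    rs = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs / (p @ Ap)          # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p      # new direction, conjugate to previous ones
        rs = rs_new
    return x

rng = np.random.default_rng(2)
M = rng.normal(size=(30, 30))
A = M @ M.T + 30 * np.eye(30)          # well-conditioned SPD test matrix
b = rng.normal(size=30)
x = conjugate_gradient(A, b)
```

In exact arithmetic CG converges in at most n steps; in a docking energy model the quadratic would be replaced by the forcefield energy and its gradient.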

  20. Designing specific protein-protein interactions using computation, experimental library screening, or integrated methods.

    PubMed

    Chen, T Scott; Keating, Amy E

    2012-07-01

    Given the importance of protein-protein interactions for nearly all biological processes, the design of protein affinity reagents for use in research, diagnosis or therapy is an important endeavor. Engineered proteins would ideally have high specificities for their intended targets, but achieving interaction specificity by design can be challenging. There are two major approaches to protein design or redesign. Most commonly, proteins and peptides are engineered using experimental library screening and/or in vitro evolution. An alternative approach involves using protein structure and computational modeling to rationally choose sequences predicted to have desirable properties. Computational design has successfully produced novel proteins with enhanced stability, desired interactions and enzymatic function. Here we review the strengths and limitations of experimental library screening and computational structure-based design, giving examples where these methods have been applied to designing protein interaction specificity. We highlight recent studies that demonstrate strategies for combining computational modeling with library screening. The computational methods provide focused libraries predicted to be enriched in sequences with the properties of interest. Such integrated approaches represent a promising way to increase the efficiency of protein design and to engineer complex functionality such as interaction specificity.

  1. Cognitive context detection in UAS operators using eye-gaze patterns on computer screens

    NASA Astrophysics Data System (ADS)

    Mannaru, Pujitha; Balasingam, Balakumar; Pattipati, Krishna; Sibley, Ciara; Coyne, Joseph

    2016-05-01

    In this paper, we demonstrate the use of eye-gaze metrics of unmanned aerial systems (UAS) operators as effective indices of their cognitive workload. Our analyses are based on an experiment in which twenty participants performed pre-scripted UAS missions at three difficulty levels by interacting with two custom-designed graphical user interfaces (GUIs) displayed side by side. First, we compute several eye-gaze metrics, both traditional eye-movement metrics and newly proposed ones, and analyze their effectiveness as cognitive classifiers. Most of the eye-gaze metrics are computed by dividing the computer screen into "cells". Then, we perform several analyses in order to select metrics for effective cognitive context classification in our specific application; the objectives of these analyses are to (i) identify appropriate ways to divide the screen into cells; (ii) select appropriate metrics for training and classification of cognitive features; and (iii) identify a suitable classification method.
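
Cell-based gaze metrics of the kind described can be sketched with a 2D histogram. The screen size, grid, and gaze samples below are hypothetical, and the per-cell counts stand in for whatever dwell or fixation statistics the classifier would actually consume.

```python
import numpy as np

def gaze_counts_per_cell(gx, gy, screen_w, screen_h, n_cols, n_rows):
    """Count gaze samples falling in each cell of an n_rows x n_cols screen grid."""
    counts, _, _ = np.histogram2d(
        gy, gx,                             # rows index y, columns index x
        bins=[n_rows, n_cols],
        range=[[0, screen_h], [0, screen_w]],
    )
    return counts.astype(int)

# four synthetic gaze samples (pixels) on a 1920x1080 screen, 4x3 grid of cells
gx = np.array([100.0, 500.0, 1900.0, 1900.0])
gy = np.array([100.0, 100.0, 1000.0, 1010.0])
cells = gaze_counts_per_cell(gx, gy, 1920, 1080, n_cols=4, n_rows=3)
```

From such per-cell counts one could derive dwell-time fractions or cell-transition statistics as candidate workload features.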

  2. Comparison of 2D Radiographic Images and 3D Cone Beam Computed Tomography for Positioning Head-and-Neck Radiotherapy Patients

    SciTech Connect

    Li, Heng; Zhu, X. Ronald; Zhang, Lifei; Dong, Lei; Tung, Sam; Ahamad, Anesa; Chao, K. S. Clifford; Morrison, William H.; Rosenthal, David I.; Schwartz, David L.; Mohan, Radhe; Garden, Adam S.

    2008-07-01

    Purpose: To assess the positioning accuracy of two-dimensional kilovoltage (2DkV) imaging and three-dimensional cone beam CT (CBCT) in patients with head and neck (H and N) cancer receiving radiation therapy, and to assess the benefit of patient-specific headrests. Materials and Methods: All 21 patients studied were immobilized using thermoplastic masks with either a patient-specific vacuum bag (11 of 21, IMA) or standard clear plastic (10 of 21, IMB) headrests. Each patient was imaged with a pair of orthogonal 2DkV images in treatment position using onboard imaging before the CBCT procedure. The 2DkV and CBCT images were acquired weekly during the same session. The 2DkV images were reviewed by oncologists and also analyzed by a software tool based on mutual information (MI). Results: Ninety-eight assessable pairs of 2DkV-CBCT alignment sets were obtained. Systematic and random errors were <1.6 mm for both 2DkV and CBCT alignments. When we compared shifts determined by CBCT and 2DkV for the same patient setup, statistically significant correlations were observed in all three major directions. Among all CBCT couch shifts, 4.1% were ≥0.5 cm and 18.7% were ≥0.3 cm, whereas among all 2DkV (MI) shifts, 1.7% were ≥0.5 cm and 11.2% were ≥0.3 cm. A statistically significant difference between IMA and IMB was found in the anteroposterior direction with the CBCT alignment only. Conclusions: The differences between 2D and 3D alignments were mainly caused by the relative flexibility of certain H and N structures and possibly by rotation. Better immobilization of the flexible neck is required to further reduce setup errors for H and N patients receiving radiotherapy.

  3. Validation of a computer analysis to determine 3-D rotations and translations of the rib cage in upright posture from three 2-D digital images

    PubMed Central

    Harrison, Deed E.; Janik, Tadeusz J.; Cailliet, Rene; Normand, Martin C.; Perron, Denise L.; Ferrantelli, Joseph R

    2006-01-01

    Since thoracic cage posture affects lumbar spine coupling and loads on the spinal tissues and extremities, a scientific analysis of upright posture is needed. Common posture analyzers measure human posture as displacements from a plumb line, while the PosturePrint™ claims to measure head, rib cage, and pelvic postures as rotations and translations. In this study, we evaluated the validity of the PosturePrint™ Internet computer system's analysis of thoracic cage postures. In a university biomechanics laboratory, photographs of a mannequin thoracic cage were obtained in different postures on a stand in front of a digital camera. For each mannequin posture, three photographs were obtained (left lateral, right lateral, and AP). The mannequin thoracic cage was placed in 68 different single and combined postures (requiring 204 photographs) in five degrees of freedom: lateral translation (Tx), lateral flexion (Rz), axial rotation (Ry), flexion-extension (Rx), and anterior-posterior translation (Tz). The PosturePrint™ system requires 13 reflective markers to be placed on the subject (mannequin) during photography and 16 additional "click-on" markers added via computer mouse before a set of three photographs is analyzed by the PosturePrint™ computer system over the Internet. Errors were defined as the differences between the positioned mannequin posture and the positions calculated by the computer system. Average absolute errors were obtained by comparing the exact input posture with the values computed by the PosturePrint™ system. Mean and standard deviation of computational errors for sagittal displacements of the thoracic cage were Rx=0.3±0.1°, Tz=1.6±0.7 mm, and for frontal view displacements were Ry=1.2±1.0°, Rz=0.6±0.4°, and Tx=1.5±0.6 mm. The PosturePrint™ system is sufficiently accurate in measuring thoracic cage postures in five degrees of freedom on a mannequin, warranting further study on human subjects. PMID:16547756

  4. Optimisation and Assessment of Three Modern Touch Screen Tablet Computers for Clinical Vision Testing

    PubMed Central

    Tahir, Humza J.; Murray, Ian J.; Parry, Neil R. A.; Aslam, Tariq M.

    2014-01-01

    Technological advances have led to the development of powerful yet portable tablet computers whose touch-screen resolutions now permit the presentation of targets small enough to test the limits of normal visual acuity. Such devices have become ubiquitous in daily life and are moving into the clinical space. However, in order to produce clinically valid tests, it is important to identify the limits imposed by the screen characteristics, such as resolution, brightness uniformity, contrast linearity and the effect of viewing angle. Previously we have conducted such tests on the iPad 3. Here we extend our investigations to two other devices and outline a protocol for calibrating such screens, using standardised methods to measure the gamma function, warm-up time, screen uniformity and the effects of viewing angle and screen reflections. We demonstrate that all three devices manifest typical gamma functions for voltage and luminance with warm-up times of approximately 15 minutes. However, there were differences in homogeneity and reflectance among the displays. We suggest practical means to optimise quality of display for vision testing including screen calibration. PMID:24759774
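
The gamma-function measurement in such a calibration protocol can be sketched as a log-log fit of measured luminance against normalized drive level. The measurements below are synthetic, generated with a known exponent; a real calibration would substitute photometer readings.

```python
import numpy as np

def fit_gamma(drive, luminance):
    """Fit L = L_max * v**gamma by linear regression in log-log space."""
    v = np.asarray(drive, dtype=float)
    L = np.asarray(luminance, dtype=float)
    slope, intercept = np.polyfit(np.log(v), np.log(L), 1)
    return slope, np.exp(intercept)        # gamma exponent, peak luminance

v = np.linspace(0.1, 1.0, 10)              # normalized video levels
L = 250.0 * v ** 2.2                       # synthetic display: gamma 2.2, 250 cd/m^2 peak
gamma, L_max = fit_gamma(v, L)
```

The fitted exponent and peak luminance then parameterize the lookup table used to linearize the display for stimulus presentation.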

  6. Comparing Benefits from Many Possible Computed Tomography Lung Cancer Screening Programs: Extrapolating from the National Lung Screening Trial Using Comparative Modeling

    PubMed Central

    McMahon, Pamela M.; Meza, Rafael; Plevritis, Sylvia K.; Black, William C.; Tammemagi, C. Martin; Erdogan, Ayca; ten Haaf, Kevin; Hazelton, William; Holford, Theodore R.; Jeon, Jihyoun; Clarke, Lauren; Kong, Chung Yin; Choi, Sung Eun; Munshi, Vidit N.; Han, Summer S.; van Rosmalen, Joost; Pinsky, Paul F.; Moolgavkar, Suresh

    2014-01-01

    Background The National Lung Screening Trial (NLST) demonstrated that in current and former smokers aged 55 to 74 years, with at least 30 pack-years of cigarette smoking history and who had quit smoking no more than 15 years ago, 3 annual computed tomography (CT) screens reduced lung cancer-specific mortality by 20% relative to 3 annual chest X-ray screens. We compared the benefits achievable with 576 lung cancer screening programs that varied CT screen number and frequency, ages of screening, and eligibility based on smoking. Methods and Findings We used five independent microsimulation models with lung cancer natural history parameters previously calibrated to the NLST to simulate life histories of the US cohort born in 1950 under all 576 programs. ‘Efficient’ (within model) programs prevented the greatest number of lung cancer deaths, compared to no screening, for a given number of CT screens. Among 120 ‘consensus efficient’ (identified as efficient across models) programs, the average starting age was 55 years, the stopping age was 80 or 85 years, the average minimum pack-years was 27, and the maximum years since quitting was 20. Among consensus efficient programs, 11% to 40% of the cohort was screened, and 153 to 846 lung cancer deaths were averted per 100,000 people. In all models, annual screening based on age and smoking eligibility in NLST was not efficient; continuing screening to age 80 or 85 years was more efficient. Conclusions Consensus results from five models identified a set of efficient screening programs that include annual CT lung cancer screening using criteria like NLST eligibility but extended to older ages. Guidelines for screening should also consider harms of screening and individual patient characteristics. PMID:24979231

  7. 2D discrete Fourier transform on sliding windows.

    PubMed

    Park, Chun-Su

    2015-03-01

    Discrete Fourier transform (DFT) is the most widely used method for determining the frequency spectra of digital signals. In this paper, a 2D sliding DFT (2D SDFT) algorithm is proposed for fast implementation of the DFT on 2D sliding windows. The proposed 2D SDFT algorithm directly computes the DFT bins of the current window using the precalculated bins of the previous window. Since the proposed algorithm is designed to accelerate the sliding transform process of a 2D input signal, it can be directly applied to computer vision and image processing applications. The theoretical analysis shows that the computational requirement of the proposed 2D SDFT algorithm is the lowest among existing 2D DFT algorithms. Moreover, the output of the 2D SDFT is mathematically equivalent to that of the traditional DFT at all pixel positions.
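    The recurrence at the heart of such algorithms is easiest to see in the classic one-dimensional sliding DFT, which the paper's 2D SDFT generalizes to two dimensions. The sketch below is illustrative only (not the paper's algorithm); the function names `naive_dft` and `slide_dft` are assumptions. When the window shifts right by one sample, each bin is updated as X'[k] = (X[k] - x_out + x_in) * exp(j*2*pi*k/N), avoiding a full recomputation.

```python
import cmath

def naive_dft(x):
    """Direct O(N^2) DFT, used here only as a reference."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def slide_dft(bins, x_out, x_in):
    """Update all DFT bins for a window shifted right by one sample:
    X'[k] = (X[k] - x_out + x_in) * exp(j*2*pi*k/N)."""
    N = len(bins)
    return [(X - x_out + x_in) * cmath.exp(2j * cmath.pi * k / N)
            for k, X in enumerate(bins)]
```

    The update costs O(N) per shift instead of O(N^2), which is the kind of saving the 2D SDFT exploits across both image dimensions.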

  8. Three-dimensional textural analysis of products from the Vulcanian explosions of Soufrière Hills Volcano, Montserrat, 1997, using X-ray computed microtomography: comparison with 2D data and implications for vesiculation processes

    NASA Astrophysics Data System (ADS)

    Giachetti, T.; Burgisser, A.; Druitt, T. H.

    2009-12-01

    X-ray computed microtomography is a powerful, non-destructive method for imaging textures and for quantifying vesicle spatial relationships and size distributions. It was applied to one breadcrust bomb and three pumices from the 1997 Vulcanian explosions of the Soufrière Hills Volcano, Montserrat. A detailed 2D textural study of these same samples, including high-resolution vesicle size distributions, has already been carried out. In order to cover the wide range of vesicle and crystal sizes (a few µm to a few hundred µm), we acquired tomographic images at three magnifications (0.37, 4.0 and 17.4 µm/voxel) on each sample. Each magnification covers only part of the whole vesicle size range, but with large overlaps. Vesicle walls were reconstructed using the ImageJ and 3DMA_Rock software packages, and a Matlab code was written to combine the data from the three stacks and to calculate vesicle size and sphericity distributions for the whole sample. Comparison with the 2D results shows only minor differences in the vesicle size distributions. Cumulative vesicle number densities obtained by the two methods are of the same order of magnitude (~10^15 m^-3 of glass). Three vesicle populations are recognized in the samples, the same three recognized in 2D, although minor shifts in modes are observed. As in 2D, vesicle sphericity decreases with size, vesicle shape becoming progressively more complicated as the vesicle grows and coalesces with neighbours. Coalescence in all samples appears to occur between neighbouring vesicles of any size, but, as intuitively expected, the larger the vesicle, the more connected it is. Our results show that coalescence affected pre- and syn-eruptive bubbles indifferently. This suggests that the syn-eruptive gas was rapidly connected to large-scale pathways through which it could escape. Depending on clast permeability, this mechanism has the potential to double the amount of gas available for propulsion of the Vulcanian jet.

  9. Computation of three-phase capillary entry pressures and arc menisci configurations in pore geometries from 2D rock images: A combinatorial approach

    NASA Astrophysics Data System (ADS)

    Zhou, Yingfang; Helland, Johan Olav; Hatzignatiou, Dimitrios G.

    2014-07-01

    We present a semi-analytical, combinatorial approach to compute three-phase capillary entry pressures for gas invasion into pore throats with constant cross-sections of arbitrary shapes that are occupied by oil and/or water. For a specific set of three-phase capillary pressures, geometrically allowed gas/oil, oil/water and gas/water arc menisci are determined by moving two circles in opposite directions along the pore/solid boundary for each fluid pair such that the contact angle is defined at the front circular arcs. Intersections of the two circles determine the geometrically allowed arc menisci for each fluid pair. The resulting interfaces are combined systematically to allow for all geometrically possible three-phase configuration changes. The three-phase extension of the Mayer and Stowe-Princen (MS-P) method is adopted to calculate capillary entry pressures for all determined configuration candidates, from which the most favorable gas invasion configuration is determined. The model is validated by comparing computed three-phase capillary entry pressures and corresponding fluid configurations with analytical solutions in idealized triangular star-shaped pores. It is demonstrated that the model accounts for all scenarios that have been analyzed previously in these shapes. Finally, three-phase capillary entry pressures and associated fluid configurations are computed in throat cross-sections extracted from segmented SEM images of Bentheim sandstone. The computed gas/oil capillary entry pressures account for the expected dependence of oil/water capillary pressure in spreading and non-spreading fluid systems at the considered wetting conditions. Because these geometries are irregular and include constrictions, we introduce three-phase displacements that have not been identified previously in pore-network models that are based on idealized pore shapes.
However, in the limited number of pore geometries considered in this work, we find that the favorable displacements are

  10. Brain-Computer Interfaces for 1-D and 2-D Cursor Control: Designs Using Volitional Control of the EEG Spectrum or Steady-State Visual Evoked Potentials

    NASA Technical Reports Server (NTRS)

    Trejo, Leonard J.; Matthews, Bryan; Rosipal, Roman

    2005-01-01

    We have developed and tested two EEG-based brain-computer interfaces (BCI) for users to control a cursor on a computer display. Our system uses an adaptive algorithm, based on kernel partial least squares classification (KPLS), to associate patterns in multichannel EEG frequency spectra with cursor controls. Our first BCI, Target Practice, is a system for one-dimensional device control, in which participants use biofeedback to learn voluntary control of their EEG spectra. Target Practice uses a KPLS classifier to map power spectra of 30-electrode EEG signals to rightward or leftward position of a moving cursor on a computer display. Three subjects learned to control motion of a cursor on a video display in multiple blocks of 60 trials over periods of up to six weeks. The best subject's average skill in correct selection of the cursor direction grew from 58% to 88% after 13 training sessions. Target Practice also implements online control of two artifact sources: a) removal of ocular artifact by linear subtraction of wavelet-smoothed vertical and horizontal EOG signals, b) control of muscle artifact by inhibition of BCI training during periods of relatively high power in the 40-64 Hz band. The second BCI, Think Pointer, is a system for two-dimensional cursor control. Steady-state visual evoked potentials (SSVEP) are triggered by four flickering checkerboard stimuli located in narrow strips at each edge of the display. The user attends to one of the four beacons to initiate motion in the desired direction. The SSVEP signals are recorded from eight electrodes located over the occipital region. A KPLS classifier is individually calibrated to map multichannel frequency bands of the SSVEP signals to right-left or up-down motion of a cursor on a computer display. The display stops moving when the user attends to a central fixation point. 
As for Target Practice, Think Pointer also implements wavelet-based online removal of ocular artifact; however, in Think Pointer muscle

  11. Improved CUDA programs for GPU computing of Swendsen-Wang multi-cluster spin flip algorithm: 2D and 3D Ising, Potts, and XY models

    NASA Astrophysics Data System (ADS)

    Komura, Yukihiro; Okabe, Yutaka

    2016-03-01

    We present new versions of sample CUDA programs for GPU computing of the Swendsen-Wang multi-cluster spin flip algorithm. In this update, we add the GPU-based cluster-labeling method that avoids conventional iteration (Komura, 2015) to those programs. For high-precision calculations, we also add a random-number generator from the cuRAND library. Moreover, we fix several bugs and remove the extra usage of shared memory in the kernel functions.
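    The multi-cluster update itself is independent of any GPU implementation. Below is a minimal serial Python sketch of one Swendsen-Wang step for the 2D Ising model (J = 1, periodic boundaries, union-find cluster labeling); all function and variable names are illustrative assumptions, not taken from the CUDA programs described in the abstract.

```python
import math
import random

def swendsen_wang_step(spins, L, beta, rng):
    """One Swendsen-Wang multi-cluster update of an L x L periodic
    Ising lattice (J = 1). spins is a flat list of +1/-1 values."""
    n = L * L
    parent = list(range(n))

    def find(i):
        # Union-find root lookup with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Activate bonds between aligned neighbours with probability 1 - exp(-2*beta).
    p_bond = 1.0 - math.exp(-2.0 * beta)
    for x in range(L):
        for y in range(L):
            i = x * L + y
            # Right and down neighbours (periodic wrap-around).
            for j in (((x + 1) % L) * L + y, x * L + (y + 1) % L):
                if spins[i] == spins[j] and rng.random() < p_bond:
                    ri, rj = find(i), find(j)
                    if ri != rj:
                        parent[ri] = rj

    # Flip each labeled cluster independently with probability 1/2.
    flip = {}
    for i in range(n):
        r = find(i)
        if r not in flip:
            flip[r] = rng.random() < 0.5
        if flip[r]:
            spins[i] = -spins[i]
    return spins
```

    The GPU versions parallelize exactly these two phases (bond activation and cluster labeling), which is why an iteration-free labeling algorithm matters for performance.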

  12. Brain-computer interfaces for 1-D and 2-D cursor control: designs using volitional control of the EEG spectrum or steady-state visual evoked potentials.

    PubMed

    Trejo, Leonard J; Rosipal, Roman; Matthews, Bryan

    2006-06-01

    We have developed and tested two electroencephalogram (EEG)-based brain-computer interfaces (BCI) for users to control a cursor on a computer display. Our system uses an adaptive algorithm, based on kernel partial least squares classification (KPLS), to associate patterns in multichannel EEG frequency spectra with cursor controls. Our first BCI, Target Practice, is a system for one-dimensional device control, in which participants use biofeedback to learn voluntary control of their EEG spectra. Target Practice uses a KPLS classifier to map power spectra of 62-electrode EEG signals to rightward or leftward position of a moving cursor on a computer display. Three subjects learned to control motion of a cursor on a video display in multiple blocks of 60 trials over periods of up to six weeks. The best subject's average skill in correct selection of the cursor direction grew from 58% to 88% after 13 training sessions. Target Practice also implements online control of two artifact sources: 1) removal of ocular artifact by linear subtraction of wavelet-smoothed vertical and horizontal electrooculograms (EOG) signals, 2) control of muscle artifact by inhibition of BCI training during periods of relatively high power in the 40-64 Hz band. The second BCI, Think Pointer, is a system for two-dimensional cursor control. Steady-state visual evoked potentials (SSVEP) are triggered by four flickering checkerboard stimuli located in narrow strips at each edge of the display. The user attends to one of the four beacons to initiate motion in the desired direction. The SSVEP signals are recorded from 12 electrodes located over the occipital region. A KPLS classifier is individually calibrated to map multichannel frequency bands of the SSVEP signals to right-left or up-down motion of a cursor on a computer display. The display stops moving when the user attends to a central fixation point. As for Target Practice, Think Pointer also implements wavelet-based online removal of ocular

  13. Feasibility of Tablet Computer Screening for Opioid Abuse in the Emergency Department

    PubMed Central

    Weiner, Scott G.; Horton, Laura C.; Green, Traci C.; Butler, Stephen F.

    2015-01-01

    Introduction Tablet computer-based screening may have the potential for detecting patients at risk for opioid abuse in the emergency department (ED). Study objectives were a) to determine if the revised Screener and Opioid Assessment for Patients with Pain (SOAPP®-R), a 24-question previously paper-based screening tool for opioid abuse potential, could be administered on a tablet computer to an ED patient population; b) to demonstrate that >90% of patients can complete the electronic screener without assistance in <5 minutes; and c) to determine patient ease of use with screening on a tablet computer. Methods This was a cross-sectional convenience sample study of patients seen in an urban academic ED. SOAPP®-R was programmed on a tablet computer by study investigators. Inclusion criteria were patients ages ≥18 years who were being considered for discharge with a prescription for an opioid analgesic. Exclusion criteria included inability to understand English or physical disability preventing use of the tablet. Results 93 patients were approached for inclusion and 82 (88%) provided consent. Fifty-two percent (n=43) of subjects were male; 46% (n=38) of subjects were between 18–35 years, and 54% (n=44) were >35 years. One hundred percent of subjects completed the screener. Median time to completion was 148 (interquartile range 117.5–184.3) seconds, and 95% (n=78) completed in <5 minutes. 93% (n=76) rated ease of completion as very easy. Conclusions It is feasible to administer a screening tool to a cohort of ED patients on a tablet computer. The screener administration time is minimal and patient ease of use with this modality is high. PMID:25671003

  14. Concrete resource analysis of the quantum linear-system algorithm used to compute the electromagnetic scattering cross section of a 2D target

    NASA Astrophysics Data System (ADS)

    Scherer, Artur; Valiron, Benoît; Mau, Siun-Chuon; Alexander, Scott; van den Berg, Eric; Chapuran, Thomas E.

    2017-03-01

    We provide a detailed estimate for the logical resource requirements of the quantum linear-system algorithm (Harrow et al. in Phys Rev Lett 103:150502, 2009) including the recently described elaborations and application to computing the electromagnetic scattering cross section of a metallic target (Clader et al. in Phys Rev Lett 110:250504, 2013). Our resource estimates are based on the standard quantum-circuit model of quantum computation; they comprise circuit width (related to parallelism), circuit depth (total number of steps), the number of qubits and ancilla qubits employed, and the overall number of elementary quantum gate operations as well as more specific gate counts for each elementary fault-tolerant gate from the standard set {X, Y, Z, H, S, T, CNOT}. In order to perform these estimates, we used an approach that combines manual analysis with automated estimates generated via the Quipper quantum programming language and compiler. Our estimates pertain to the explicit example problem size N = 332,020,680, beyond which, according to a crude big-O complexity comparison, the quantum linear-system algorithm is expected to run faster than the best known classical linear-system solving algorithm. For this problem size, a desired calculation accuracy ε = 0.01 requires an approximate circuit width of 340 and a circuit depth of order 10^25 if oracle costs are excluded, and a circuit width and circuit depth of order 10^8 and 10^29, respectively, if the resource requirements of oracles are included, indicating that the commonly ignored oracle resources are considerable. In addition to providing detailed logical resource estimates, it is also the purpose of this paper to demonstrate explicitly (using a fine-grained approach rather than relying on coarse big-O asymptotic approximations) how these impressively large numbers arise with an actual circuit implementation of a quantum algorithm. While our estimates may prove to be conservative as more efficient

  15. Computational protein-ligand docking and virtual drug screening with the AutoDock suite

    PubMed Central

    Forli, Stefano; Huey, Ruth; Pique, Michael E.; Sanner, Michel; Goodsell, David S.; Olson, Arthur J.

    2016-01-01

    Computational docking can be used to predict bound conformations and free energies of binding for small molecule ligands to macromolecular targets. Docking is widely used for the study of biomolecular interactions and mechanisms, and is applied to structure-based drug design. The methods are fast enough to allow virtual screening of ligand libraries containing tens of thousands of compounds. This protocol covers the docking and virtual screening methods provided by the AutoDock suite of programs, including a basic docking of a drug molecule with an anticancer target, a virtual screen of this target with a small ligand library, docking with selective receptor flexibility, active site prediction, and docking with explicit hydration. The entire protocol will require approximately 5 hours. PMID:27077332

  16. Computed Tomography Imaging Spectrometer (CTIS) with 2D Reflective Grating for Ultraviolet to Long-Wave Infrared Detection Especially Useful for Surveying Transient Events

    NASA Technical Reports Server (NTRS)

    Wilson, Daniel W. (Inventor); Maker, Paul D. (Inventor); Muller, Richard E. (Inventor); Mouroulis, Pantazis Z. (Inventor)

    2003-01-01

    The optical system of this invention is a unique type of imaging spectrometer, i.e. an instrument that can determine the spectra of all points in a two-dimensional scene. The general type of imaging spectrometer under which this invention falls has been termed a computed-tomography imaging spectrometer (CTIS). CTIS's have the ability to perform spectral imaging of scenes containing rapidly moving objects or evolving features, hereafter referred to as transient scenes. This invention, a reflective CTIS with a unique two-dimensional reflective grating, can operate in any wavelength band from the ultraviolet through long-wave infrared. Although this spectrometer is especially useful for rapidly occurring events, it is also useful for investigation of some slow moving phenomena as in the life sciences.

  17. Computed tomography imaging spectrometer (CTIS) with 2D reflective grating for ultraviolet to long-wave infrared detection especially useful for surveying transient events

    NASA Technical Reports Server (NTRS)

    Wilson, Daniel W. (Inventor); Maker, Paul D. (Inventor); Muller, Richard E. (Inventor); Mouroulis, Pantazis Z. (Inventor)

    2003-01-01

    The optical system of this invention is a unique type of imaging spectrometer, i.e. an instrument that can determine the spectra of all points in a two-dimensional scene. The general type of imaging spectrometer under which this invention falls has been termed a computed-tomography imaging spectrometer (CTIS). CTIS's have the ability to perform spectral imaging of scenes containing rapidly moving objects or evolving features, hereafter referred to as transient scenes. This invention, a reflective CTIS with a unique two-dimensional reflective grating, can operate in any wavelength band from the ultraviolet through long-wave infrared. Although this spectrometer is especially useful for rapidly occurring events, it is also useful for investigation of some slow moving phenomena as in the life sciences.

  18. Comparison of 2D and 3D Computational Multiphase Fluid Flow Models of Oxygen Lancing of Pyrometallurgical Furnace Tap-Holes

    NASA Astrophysics Data System (ADS)

    Erwee, M. W.; Reynolds, Q. G.; Zietsman, J. H.

    2016-06-01

    Furnace tap-holes vary in design depending on the type of furnace and process involved, but they share one common trait: The tap-hole must be opened and closed periodically. In general, tap-holes are plugged with refractory clay after tapping, thereby stopping the flow of molten material. Once a furnace is ready to be tapped, drilling and/or lancing with oxygen are typically used to remove tap-hole clay from the tap-hole. Lancing with oxygen is an energy-intensive, mostly manual process, which affects the performance and longevity of the tap-hole refractory material as well as the processes inside the furnace. Computational modeling offers an opportunity to gain insight into the possible effects of oxygen lancing on various aspects of furnace operation.

  19. SPECT Imaging of 2-D and 3-D Distributed Sources with Near-Field Coded Aperture Collimation: Computer Simulation and Real Data Validation.

    PubMed

    Mu, Zhiping; Dobrucki, Lawrence W; Liu, Yi-Hwa

    The imaging of distributed sources with near-field coded aperture (CA) remains extremely challenging and is broadly considered unsuitable for single-photon emission computerized tomography (SPECT). This study proposes a novel CA SPECT reconstruction approach and evaluates the feasibilities of imaging and reconstructing distributed hot sources and cold lesions using near-field CA collimation and iterative image reconstruction. Computer simulations were designed to compare CA and pinhole collimations in two-dimensional radionuclide imaging. Digital phantoms were created and CA images of the phantoms were reconstructed using maximum likelihood expectation maximization (MLEM). Errors and the contrast-to-noise ratio (CNR) were calculated and image resolution was evaluated. An ex vivo rat heart with myocardial infarction was imaged using a micro-SPECT system equipped with a custom-made CA module and a commercial 5-pinhole collimator. Rat CA images were reconstructed via the three-dimensional (3-D) MLEM algorithm developed for CA SPECT with and without correction for a large projection angle, and 5-pinhole images were reconstructed using the commercial software provided by the SPECT system. Phantom images of CA were markedly improved in terms of image quality, quantitative root-mean-squared error, and CNR, as compared to pinhole images. CA and pinhole images yielded similar image resolution, while CA collimation resulted in fewer noise artifacts. CA and pinhole images of the rat heart were well reconstructed and the myocardial perfusion defects could be clearly discerned from 3-D CA and 5-pinhole SPECT images, whereas 5-pinhole SPECT images suffered from severe noise artifacts. Image contrast of CA SPECT was further improved after correction for the large projection angle used in the rat heart imaging. The computer simulations and small-animal imaging study presented herein indicate that the proposed 3-D CA SPECT imaging and reconstruction approaches worked reasonably

  20. Computer-Aided Design of Fragment Mixtures for NMR-Based Screening

    PubMed Central

    Arroyo, Xavier; Goldflam, Michael; Feliz, Miguel; Belda, Ignasi; Giralt, Ernest

    2013-01-01

    Fragment-based drug discovery is widely applied both in industrial and in academic screening programs. Several screening techniques rely on NMR to detect binding of a fragment to a target. NMR-based methods are among the most sensitive techniques and have the further advantage of yielding a low rate of false positives and negatives. However, NMR is intrinsically slower than other screening techniques; thus, to increase throughput in NMR-based screening, researchers often assay mixtures of fragments, rather than single fragments. Herein we present a fast and straightforward computer-aided method to design mixtures of fragments taken from a library that have minimized NMR signal overlap. This approach enables direct identification of one or several active fragments without the need for deconvolution. Our approach entails encoding of NMR spectra into a computer-readable format that we call a fingerprint, and minimizing the global signal overlap through a Monte Carlo algorithm. The scoring function used favors a homogenous distribution of the global signal overlap. The method does not require additional experimental work: the only data required are NMR spectra, which are generally recorded for each compound as a quality control measure before its insertion into the library. PMID:23516512
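    As a rough illustration of the overlap-minimization idea (not the authors' implementation), spectral fingerprints can be modeled as sets of discretized peak positions, and compounds reassigned between mixtures with a greedy Monte Carlo swap that keeps any move not increasing the total pairwise overlap. All names, the set-based fingerprint encoding, and the scoring details are assumptions.

```python
import random

def overlap_score(partition, fingerprints):
    """Total pairwise signal overlap summed over all mixtures."""
    total = 0
    for mix in partition:
        for i, a in enumerate(mix):
            for b in mix[i + 1:]:
                total += len(fingerprints[a] & fingerprints[b])
    return total

def design_mixtures(fingerprints, n_mixtures, steps=2000, seed=0):
    """Greedy Monte Carlo: swap compounds between mixtures, keeping
    swaps that do not increase the global overlap score."""
    rng = random.Random(seed)
    compounds = list(fingerprints)
    rng.shuffle(compounds)
    partition = [compounds[i::n_mixtures] for i in range(n_mixtures)]
    best = overlap_score(partition, fingerprints)
    for _ in range(steps):
        m1, m2 = rng.sample(range(n_mixtures), 2)
        if not partition[m1] or not partition[m2]:
            continue
        i = rng.randrange(len(partition[m1]))
        j = rng.randrange(len(partition[m2]))
        partition[m1][i], partition[m2][j] = partition[m2][j], partition[m1][i]
        score = overlap_score(partition, fingerprints)
        if score <= best:
            best = score  # keep the swap
        else:
            # Revert the swap.
            partition[m1][i], partition[m2][j] = partition[m2][j], partition[m1][i]
    return partition, best
```

    The paper's scoring function additionally favors a homogeneous distribution of overlap across mixtures; the sketch above only minimizes the global sum.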

  1. Application of a screening method in assessing occupational safety and health of computer workstations.

    PubMed

    Niskanen, Toivo; Lehtelä, Jouni; Länsikallio, Riina

    2014-01-01

    Employers and workers need concrete guidance to plan and implement changes in the ergonomics of computer workstations. The Näppärä method is a screening tool for identifying problems requiring further assessment and corrective actions. The aim of this study was to assess the work of occupational safety and health (OSH) government inspectors who used Näppärä as part of their OSH enforcement inspections (430 assessments) related to computer work. The modifications in workstation ergonomics involved mainly adjustments to the screen, mouse, keyboard, forearm supports, and chair. One output of the assessment is an index indicating the percentage of compliance items. The method can be considered an exposure assessment and ergonomics intervention, with its index serving as a benchmark for the level of ergonomics. Future research could examine the effectiveness of participatory ergonomics interventions carried out with Näppärä.

  2. A comparison of 3-D computed tomography versus 2-D radiography measurements of ulnar variance and ulnolunate distance during forearm rotation.

    PubMed

    Kawanishi, Y; Moritomo, H; Omori, S; Kataoka, T; Murase, T; Sugamoto, K

    2014-06-01

    Positive ulnar variance is associated with ulnar impaction syndrome and ulnar variance is reported to increase with pronation. However, radiographic measurement can be affected markedly by the incident angle of the X-ray beam. We performed three-dimensional (3-D) computed tomography measurements of ulnar variance and ulnolunate distance during forearm rotation and compared these with plain radiographic measurements in 15 healthy wrists. From supination to pronation, ulnar variance increased in all cases on the radiographs; mean ulnar variance increased significantly and mean ulnolunate distance decreased significantly. However on 3-D imaging, ulna variance decreased in 12 cases on moving into pronation and increased in three cases; neither the mean ulnar variance nor mean ulnolunate distance changed significantly. Our results suggest that the forearm position in which ulnar variance increased varies among individuals. This may explain why some patients with ulnar impaction syndrome complain of wrist pain exacerbated by forearm supination. It also suggests that standard radiographic assessments of ulnar variance are unreliable.

  3. Adaptive Computer-Assisted Mammography Training for Improved Breast Cancer Screening

    DTIC Science & Technology

    2013-10-01

    The project aims to improve the accuracy of screening mammography in breast cancer detection and to lower the mortality associated with the disease. It includes observer studies to collect reading data from radiology trainees, and extraction of human- and computer-based image features. Subject terms: mammography, radiology, education, user modeling. Specific aim 1.1: prepare the database of screening mammograms (year 1, months 1-6).

  4. High-throughput screening to estimate single or multiple enzymes involved in drug metabolism: microtitre plate assay using a combination of recombinant CYP2D6 and human liver microsomes.

    PubMed

    Yamamoto, T; Suzuki, A; Kohno, Y

    2003-08-01

    1. The purpose of this study was to readily estimate the involvement of single or multiple enzymes in the metabolism of a drug through inhibitory assessment. Inhibitory effects of various compounds on CYP2D6 activity, assayed by formation of a fluorescent metabolite from 3-[2-(N,N-diethyl-N-methyl-ammonium)ethyl]-7-methoxy-4-methyl-coumarin (AMMC), were assessed using microtitre plate (MTP) assays with a combination of recombinant CYP2D6 and human liver microsomes (HLM). 2. Among the various compounds studied, antipsychotic drugs extensively inhibited recombinant CYP2D6 activity and their IC50 values were generally lower than those of antidepressants and antiarrhythmic drugs. 3. After pre-incubation, the IC50 values of mianserin, chlorpromazine, risperidone, thioridazine, alprenolol, propafenone and dextromethorphan increased, but the values of timolol, S-metoprolol and propranolol substantially decreased compared with those in the case of co-incubation. 4. The IC50 values of typical substrates of CYP2D6 (bufuralol and dextromethorphan at lower substrate concentration) in inhibition studies using HLM were similar to those obtained with recombinant CYP2D6, but the values of compounds that are metabolized by multiple CYP forms (perphenazine and chlorpromazine) in HLM were much larger. 5. If the ratio (HLM/rCYP ratio) of IC50 values between HLM and recombinant CYP2D6 exceeds approximately 2, it suggests that other CYP forms in addition to CYP2D6 might be involved in the metabolism of the test compound. Given its advantages of speed, high throughput and ease of use, the MTP assay using a combination of recombinant CYP2D6 and HLM is useful for estimating the involvement of single or multiple enzymes in the metabolism of drugs at the stage of drug discovery.
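    The decision rule in point 5 can be expressed directly; the function name and the example IC50 values below are hypothetical, not drawn from the study's data.

```python
def multiple_cyp_involvement(ic50_hlm, ic50_rcyp2d6, threshold=2.0):
    """Rule of thumb from the abstract: an HLM/rCYP IC50 ratio above
    approximately 2 suggests CYP forms other than CYP2D6 also
    contribute to the test compound's metabolism."""
    return (ic50_hlm / ic50_rcyp2d6) > threshold
```

    For example, a compound with IC50 of 4 µM in HLM but 1 µM against recombinant CYP2D6 (ratio 4) would be flagged as likely metabolized by multiple CYP forms.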

  5. 2-d Finite Element Code Postprocessor

    SciTech Connect

    Sanford, L. A.; Hallquist, J. O.

    1996-07-15

    ORION is an interactive program that serves as a postprocessor for the analysis programs NIKE2D, DYNA2D, TOPAZ2D, and CHEMICAL TOPAZ2D. ORION reads binary plot files generated by the two-dimensional finite element codes currently used by the Methods Development Group at LLNL. Contour and color fringe plots of a large number of quantities may be displayed on meshes consisting of triangular and quadrilateral elements. ORION can compute strain measures, interface pressures along slide lines, reaction forces along constrained boundaries, and momentum. ORION has been applied to study the response of two-dimensional solids and structures undergoing finite deformations under a wide variety of large deformation transient dynamic and static problems and heat transfer analyses.

  6. Developing a computer touch-screen interactive colorectal screening decision aid for a low-literacy African American population: lessons learned.

    PubMed

    Bass, Sarah Bauerle; Gordon, Thomas F; Ruzek, Sheryl Burt; Wolak, Caitlin; Ruggieri, Dominique; Mora, Gabriella; Rovito, Michael J; Britto, Johnson; Parameswaran, Lalitha; Abedin, Zainab; Ward, Stephanie; Paranjape, Anuradha; Lin, Karen; Meyer, Brian; Pitts, Khaliah

    2013-07-01

    African Americans have higher colorectal cancer (CRC) mortality than White Americans and yet have lower rates of CRC screening. Increased screening aids in early detection and higher survival rates. Coupled with low literacy rates, the burden of CRC morbidity and mortality is exacerbated in this population, making it important to develop culturally and literacy appropriate aids to help low-literacy African Americans make informed decisions about CRC screening. This article outlines the development of a low-literacy computer touch-screen colonoscopy decision aid using an innovative marketing method called perceptual mapping and message vector modeling. This method was used to mathematically model key messages for the decision aid, which were then used to modify an existing CRC screening tutorial with different messages. The final tutorial was delivered through computer touch-screen technology to increase access and ease of use for participants. Testing showed users were not only more comfortable with the touch-screen technology but were also significantly more willing to have a colonoscopy compared with a "usual care group." Results confirm the importance of including participants in planning and that the use of these innovative mapping and message design methods can lead to significant CRC screening attitude change.

  7. Three-dimensional mapping of soil chemical characteristics at micrometric scale: Statistical prediction by combining 2D SEM-EDX data and 3D X-ray computed micro-tomographic images

    NASA Astrophysics Data System (ADS)

    Hapca, Simona

    2015-04-01

    Many soil properties and functions emerge from interactions of physical, chemical and biological processes at microscopic scales, which can be understood only by integrating techniques that traditionally are developed within separate disciplines. While recent advances in imaging techniques, such as X-ray computed tomography (X-ray CT), offer the possibility to reconstruct the 3D physical structure at fine resolutions, for the distribution of chemicals in soil, existing methods, based on scanning electron microscopy (SEM) and energy dispersive X-ray detection (EDX), allow for characterization of the chemical composition only on 2D surfaces. At present, direct 3D measurement techniques are still lacking; sequential sectioning of soils, followed by 2D mapping of chemical elements and interpolation to 3D, is an alternative explored in this study. Specifically, we develop an integrated experimental and theoretical framework which combines the 3D X-ray CT imaging technique with 2D SEM-EDX and use spatial statistics methods to map the chemical composition of soil in 3D. The procedure involves three stages: 1) scanning a resin-impregnated soil cube by X-ray CT, followed by precision cutting to produce parallel thin slices, the surfaces of which are scanned by SEM-EDX, 2) alignment of the 2D chemical maps within the internal 3D structure of the soil cube, and 3) development of spatial statistics methods to predict the chemical composition of 3D soil based on the observed 2D chemical and 3D physical data. Specifically, three statistical models consisting of a regression tree, a regression tree kriging and cokriging model were used to predict the 3D spatial distribution of carbon, silicon, iron and oxygen in soil, these chemical elements showing a good spatial agreement between the X-ray grayscale intensities and the corresponding 2D SEM-EDX data. 
Due to the spatial correlation between the physical and chemical data, the regression-tree model showed great potential

  8. iScreen: world's first cloud-computing web server for virtual screening and de novo drug design based on TCM database@Taiwan.

    PubMed

    Tsai, Tsung-Ying; Chang, Kai-Wei; Chen, Calvin Yu-Chian

    2011-06-01

    The rapidly advancing research on traditional Chinese medicine (TCM) has greatly intrigued pharmaceutical industries worldwide. To take the initiative in the next generation of drug development, we constructed a cloud-computing system for TCM intelligent screening (iScreen) based on TCM Database@Taiwan. iScreen is a compact web server for TCM docking followed by customized de novo drug design. We further implemented a protein preparation tool that both extracts the protein of interest from a raw input file and estimates the size of the ligand binding site. In addition, iScreen is designed with a user-friendly graphical interface for users who have less experience with command-line systems. For customized docking, multiple docking services, including standard, in-water, pH-environment, and flexible docking modes, are implemented. Users can download the first 200 TCM compounds of the best docking results. For TCM de novo drug design, iScreen provides multiple molecular descriptors for a user's interest. iScreen is the world's first web server that employs the world's largest TCM database for virtual screening and de novo drug design. We believe our web server can lead TCM research into a new era of drug development. The TCM docking and screening server is available at http://iScreen.cmu.edu.tw/.

  9. Computational approaches to screen candidate ligands with anti- Parkinson's activity using R programming.

    PubMed

    Jayadeepa, R M; Niveditha, M S

    2012-01-01

    It is estimated that by 2050 over 100 million people will be affected by Parkinson's disease (PD). We propose various computational approaches to screen suitable candidate ligands with anti-Parkinson's activity from phytochemicals. Five different types of dopamine receptors have been identified in the brain, D1-D5. Dopamine receptor D3 was selected as the target receptor. The D3 receptor exists in areas of the brain outside the basal ganglia, such as the limbic system, and thus may play a role in the cognitive and emotional changes noted in Parkinson's disease. A ligand library of 100 molecules with anti-Parkinson's activity was collected from a literature survey. Nature is the best combinatorial chemist and possibly has answers to all diseases of mankind. The failure of some synthetic drugs and their side effects have prompted many researchers to go back to ancient healing methods which use herbal medicines to give relief. Hence, the candidate ligands with anti-Parkinson's activity were selected from herbal sources through a literature survey. Lipinski rules were applied to screen suitable molecules for the study; the resulting 88 molecules were energy minimized and subjected to docking using AutoDock Vina. The top eleven molecules were screened according to the docking score generated by AutoDock Vina. The commercial drug ropinirole was docked similarly and compared with the scores of the 11 phytochemicals. The screened molecules were then subjected to toxicity analysis to verify the toxic properties of the phytochemicals. R programming was applied to remove bias from the top eleven molecules. Using cluster analysis and a confusion matrix, two phytochemicals were computationally selected, namely rosmarinic acid and ginkgolide A, for further studies on Parkinson's disease.
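The Lipinski rule-of-five filter used above to narrow the 100-molecule library to 88 candidates can be sketched as follows. The descriptor values below are illustrative placeholders, not data from the study:

```python
# Hedged sketch of a Lipinski rule-of-five filter over precomputed descriptors.
# Descriptor values are made-up examples, not the study's actual data.

def passes_lipinski(mol):
    """Return True if the molecule violates none of Lipinski's four rules."""
    return (mol["mol_weight"] <= 500        # molecular weight <= 500 Da
            and mol["logp"] <= 5            # octanol-water logP <= 5
            and mol["h_donors"] <= 5        # <= 5 hydrogen-bond donors
            and mol["h_acceptors"] <= 10)   # <= 10 hydrogen-bond acceptors

library = [
    {"name": "rosmarinic acid", "mol_weight": 360.3, "logp": 2.4,
     "h_donors": 5, "h_acceptors": 8},      # approximate literature values
    {"name": "oversized ligand", "mol_weight": 812.0, "logp": 6.1,
     "h_donors": 7, "h_acceptors": 12},     # fails several rules
]
screened = [m["name"] for m in library if passes_lipinski(m)]
print(screened)  # -> ['rosmarinic acid']
```

In practice the descriptors would come from a cheminformatics toolkit rather than being entered by hand; the filtering logic is the same.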

  10. A clinical screening protocol for the RSVP Keyboard™ brain-computer interface

    PubMed Central

    Fried-Oken, Melanie; Mooney, Aimee; Peters, Betts; Oken, Barry

    2013-01-01

    Purpose: To propose a screening protocol that identifies requisite sensory, motor, cognitive, and communication skills for people with locked-in syndrome (PLIS) to use the RSVP Keyboard™ brain-computer interface (BCI). Method: A multidisciplinary clinical team of seven individuals representing five disciplines identified requisite skills for the BCI. They chose questions and subtests from existing standardized instruments for auditory comprehension, reading, and spelling; modified them to accommodate nonverbal response modalities; and developed novel tasks to screen visual perception, sustained visual attention, and working memory. Questions were included about sensory skills, positioning, pain interference, and medications. The result is a compilation of questions, adapted subtests, and original tasks designed for this new BCI system. It was administered to 12 PLIS and six healthy controls. Results: Administration required one hour or less. Yes/no choices and eye gaze were adequate response modes for PLIS. Healthy controls and nine PLIS were 100% accurate on all tasks; three PLIS missed single items. Conclusions: The RSVP BCI screening protocol is a brief, repeatable technique for patients with different levels of LIS to identify the presence/absence of skills for BCI use. Widespread adoption of screening methods should be a clinical goal and will help standardize BCI implementation for research and intervention. PMID:24059536

  11. JAC2D: A two-dimensional finite element computer program for the nonlinear quasi-static response of solids with the conjugate gradient method; Yucca Mountain Site Characterization Project

    SciTech Connect

    Biffle, J.H.; Blanford, M.L.

    1994-05-01

    JAC2D is a two-dimensional finite element program designed to solve quasi-static nonlinear mechanics problems. A set of continuum equations describes the nonlinear mechanics involving large rotation and strain. A nonlinear conjugate gradient method is used to solve the equations. The method is implemented in a two-dimensional setting with various methods for accelerating convergence. Sliding interface logic is also implemented. A four-node Lagrangian uniform strain element is used with hourglass stiffness to control the zero-energy modes. This report documents the elastic and isothermal elastic/plastic material model. Other material models, documented elsewhere, are also available. The program is vectorized for efficient performance on Cray computers. Sample problems described are the bending of a thin beam, the rotation of a unit cube, and the pressurization and thermal loading of a hollow sphere.

  12. Multimodal evaluation of 2-D and 3-D ultrasound, computed tomography and magnetic resonance imaging in measurements of the thyroid volume using universally applicable cross-sectional imaging software: a phantom study.

    PubMed

    Freesmeyer, Martin; Wiegand, Steffen; Schierz, Jan-Henning; Winkens, Thomas; Licht, Katharina

    2014-07-01

    A precise estimate of thyroid volume is necessary for making adequate therapeutic decisions and planning, as well as for monitoring therapy response. The goal of this study was to compare the precision of different volumetry methods. Thyroid-shaped phantoms were subjected to volumetry via 2-D and 3-D ultrasonography (US), computed tomography (CT) and magnetic resonance imaging (MRI). The 3-D US scans were performed using sensor navigation and mechanical sweeping methods. Volume calculation was carried out with the conventional ellipsoid model and the manual tracing method. The study confirmed the superiority of manual tracing with CT and MRI volumetry of the thyroid, but extended this knowledge also to the superiority of the 3-D US method, regardless of whether sensor navigation or mechanical sweeping is used. A novel aspect was the successful use of the same universally applicable cross-sectional imaging software for all modalities.

  13. Identification of Serine Conformers by Matrix-Isolation IR Spectroscopy Aided by Near-Infrared Laser-Induced Conformational Change, 2D Correlation Analysis, and Quantum Mechanical Anharmonic Computations.

    PubMed

    Najbauer, Eszter E; Bazsó, Gábor; Apóstolo, Rui; Fausto, Rui; Biczysko, Malgorzata; Barone, Vincenzo; Tarczay, György

    2015-08-20

    The conformers of α-serine were investigated by matrix-isolation IR spectroscopy combined with NIR laser irradiation. This method, aided by 2D correlation analysis, enabled unambiguous grouping of the spectral lines into individual conformers. On the basis of comparison of at least nine experimentally observed vibrational transitions of each conformer with empirically scaled (SQM) and anharmonic (GVPT2) computed IR spectra, six conformers were identified. In addition, the presence of at least one more conformer in the Ar matrix was proved, and a short-lived conformer with a half-life of (3.7 ± 0.5) × 10^3 s in the N2 matrix was generated by NIR irradiation. The analysis of the NIR laser-induced conversions revealed that the excitation of the stretching overtone of both the side-chain and the carboxylic OH groups can effectively promote conformational changes, but remarkably different paths were observed for the two kinds of excitations.

  14. Large-Scale Computational Screening of Zeolites for Ethane/Ethene Separation

    SciTech Connect

    Kim, J; Lin, LC; Martin, RL; Swisher, JA; Haranczyk, M; Smit, B

    2012-08-14

    Large-scale computational screening of thirty thousand zeolite structures was conducted to find optimal structures for separation of ethane/ethene mixtures. Efficient grand canonical Monte Carlo (GCMC) simulations were performed with graphics processing units (GPUs) to obtain pure-component adsorption isotherms for both ethane and ethene. We have utilized the ideal adsorbed solution theory (IAST) to obtain the mixture isotherms, which were used to evaluate the performance of each zeolite structure based on its working capacity and selectivity. In our analysis, we have determined that specific arrangements of zeolite framework atoms create sites for the preferential adsorption of ethane over ethene. The majority of optimum separation materials can be identified by utilizing this knowledge, and screening structures for the presence of this feature will enable the efficient selection of promising candidate materials for ethane/ethene separation prior to performing molecular simulations.
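The IAST step above predicts mixture adsorption from pure-component isotherms by equating the reduced spreading pressures of all components. A minimal sketch for a binary ethane/ethene mixture, assuming single-site Langmuir pure-component isotherms with made-up parameters (not zeolite data):

```python
import math

# IAST sketch for a binary mixture with Langmuir isotherms q(P) = qsat*K*P/(1+K*P).
# The (qsat, K) parameters are hypothetical, not fitted to any zeolite.

def spreading_pressure(qsat, K, P):
    # Reduced spreading pressure of a Langmuir isotherm (analytic integral).
    return qsat * math.log(1.0 + K * P)

def iast_binary(P, y1, iso1, iso2, tol=1e-10):
    """Adsorbed-phase mole fraction x1 such that both components see equal
    spreading pressure at their hypothetical pure pressures P*y_i/x_i."""
    f = lambda x1: (spreading_pressure(*iso1, P * y1 / x1)
                    - spreading_pressure(*iso2, P * (1 - y1) / (1 - x1)))
    lo, hi = 1e-9, 1 - 1e-9          # f is positive at lo, negative at hi
    while hi - lo > tol:             # plain bisection on x1
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

ethane = (3.0, 0.8)   # (qsat in mol/kg, K in 1/bar) -- hypothetical
ethene = (3.0, 0.3)
x1 = iast_binary(1.0, 0.5, ethane, ethene)        # equimolar gas at 1 bar
selectivity = (x1 / (1 - x1)) / (0.5 / 0.5)       # adsorption selectivity
```

With equal saturation capacities the equal-spreading-pressure condition reduces to K1*y1/x1 = K2*y2/x2, so the selectivity approaches K1/K2; real screening uses fitted isotherms per structure.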

  15. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing.

    PubMed

    Fang, Ye; Ding, Yun; Feinstein, Wei P; Koppelman, David M; Moreno, Juana; Jarrell, Mark; Ramanujam, J; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249.

  16. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing

    PubMed Central

    Fang, Ye; Ding, Yun; Feinstein, Wei P.; Koppelman, David M.; Moreno, Juana; Jarrell, Mark; Ramanujam, J.; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249. PMID:27420300

  17. E-2D Advanced Hawkeye Aircraft (E-2D AHE)

    DTIC Science & Technology

    2015-12-01

    Selected Acquisition Report (SAR) RCS: DD-A&T(Q&A)823-364. E-2D Advanced Hawkeye Aircraft (E-2D AHE), as of FY 2017 President's Budget. Defense...Office Estimate. Abbreviations: RDT&E - Research, Development, Test, and Evaluation; SAR - Selected Acquisition Report; SCP - Service Cost Position; TBD - To Be Determined.

  18. Should computed tomographic colonography replace optical colonoscopy in screening for colorectal cancer?

    PubMed

    Veerappan, Ganesh R; Cash, Brooks D

    2009-04-01

    Clinical evidence amassed over the last several decades indicates that routine colorectal cancer (CRC) screening, compared to no screening, detects CRC at an earlier stage, reduces the incidence of CRC and the progression of early CRC through polypectomy, and reduces CRC mortality. Computed tomographic colonography (CTC) is a minimally invasive, structural evaluation of the entire colorectum that has recently been advocated by multiple American professional medical societies as an effective alternative for CRC screening. The potential advantages of CTC, including rapid image acquisition and processing, non-invasiveness, and decreased procedural risks of perforation, bleeding, and sedation complications, may serve to improve the low rates of colorectal cancer screening that are currently observed in our society. Several large studies of CTC as a CRC screening test have reported excellent results but have been criticized because of the expertise of the CTC interpreters participating in those trials. In response to these criticisms, the long-awaited results of the American College of Radiology Imaging Network (ACRIN) National CT Colonography Trial were recently published. The purpose of that study was to assess the accuracy of CTC in a "community-based" environment, to determine whether previous results obtained at expert sites could be replicated. All CTC results were confirmed by comparison with conventional colonoscopy, the gold-standard colorectal cancer screening test. For polyps >10 mm, the results obtained in the ACRIN trial were comparable to previous studies, with a mean CTC sensitivity of 90% and a mean CTC specificity of 86%. The sensitivity of CTC fell to 78% for lesions >6 mm, a value that some studies have suggested is comparable to the detection rate of conventional colonoscopy. 
This study adds to the body of literature regarding the efficacy of CTC and will likely be cited by many as evidence supporting CTC as an acceptable CRC screening test, in the same league as colonoscopy

  19. 2D microwave imaging reflectometer electronics

    SciTech Connect

    Spear, A. G.; Domier, C. W.; Hu, X.; Muscatello, C. M.; Ren, X.; Luhmann, N. C.; Tobias, B. J.

    2014-11-15

    A 2D microwave imaging reflectometer system has been developed to visualize electron density fluctuations on the DIII-D tokamak. Simultaneously illuminated at four probe frequencies, large aperture optics image reflections from four density-dependent cutoff surfaces in the plasma over an extended region of the DIII-D plasma. Localized density fluctuations in the vicinity of the plasma cutoff surfaces modulate the plasma reflections, yielding a 2D image of electron density fluctuations. Details are presented of the receiver down conversion electronics that generate the in-phase (I) and quadrature (Q) reflectometer signals from which 2D density fluctuation data are obtained. Also presented are details on the control system and backplane used to manage the electronics as well as an introduction to the computer based control program.

  20. 2D microwave imaging reflectometer electronics.

    PubMed

    Spear, A G; Domier, C W; Hu, X; Muscatello, C M; Ren, X; Tobias, B J; Luhmann, N C

    2014-11-01

    A 2D microwave imaging reflectometer system has been developed to visualize electron density fluctuations on the DIII-D tokamak. Simultaneously illuminated at four probe frequencies, large aperture optics image reflections from four density-dependent cutoff surfaces in the plasma over an extended region of the DIII-D plasma. Localized density fluctuations in the vicinity of the plasma cutoff surfaces modulate the plasma reflections, yielding a 2D image of electron density fluctuations. Details are presented of the receiver down conversion electronics that generate the in-phase (I) and quadrature (Q) reflectometer signals from which 2D density fluctuation data are obtained. Also presented are details on the control system and backplane used to manage the electronics as well as an introduction to the computer based control program.

  1. Implementing low-dose computed tomography screening for lung cancer in Canada: implications of alternative at-risk populations, screening frequency, and duration

    PubMed Central

    Evans, W.K.; Flanagan, W.M.; Miller, A.B.; Goffin, J.R.; Memon, S.; Fitzgerald, N.; Wolfson, M.C.

    2016-01-01

    Background: Low-dose computed tomography (ldct) screening has been shown to reduce mortality from lung cancer; however, the optimal screening duration and “at risk” population are not known. Methods: The Cancer Risk Management Model developed by Statistics Canada for the Canadian Partnership Against Cancer includes a lung screening module based on data from the U.S. National Lung Screening Trial (nlst). The base-case scenario reproduces nlst outcomes with high fidelity. The impact in Canada of annual screening on the number of incident cases and life-years gained, with a wider range of age and smoking history eligibility criteria and varied participation rates, was modelled to show the magnitude of clinical benefit nationally and by province. Life-years gained, costs (discounted and undiscounted), and resource requirements were also estimated. Results: In 2014, 1.4 million Canadians were eligible for screening according to nlst criteria. Over 10 years, screening would detect 12,500 more lung cancers than the expected 268,300 and would gain 9200 life-years. The computed tomography imaging requirement of 24,000–30,000 at program initiation would rise to between 87,000 and 113,000 by the 5th year of an annual nlst-like screening program. Costs would increase from approximately $75 million to $128 million at 10 years, and the cumulative cost nationally over 10 years would approach $1 billion, partially offset by a reduction in the costs of managing advanced lung cancer. Conclusions: Modelling various ways in which ldct might be implemented provides decision-makers with estimates of the effect on clinical benefit and on resource needs that clinical trial results are unable to provide. PMID:27330355

  2. Computer selection of oligonucleotide probes from amino acid sequences for use in gene library screening.

    PubMed

    Yang, J H; Ye, J H; Wallace, D C

    1984-01-11

    We present a computer program, FINPROBE, which utilizes known amino acid sequence data to deduce minimum redundancy oligonucleotide probes for use in screening cDNA or genomic libraries or in primer extension. The user enters the amino acid sequence of interest, the desired probe length, the number of probes sought, and the constraints on oligonucleotide synthesis. The computer generates a table of possible probes listed in increasing order of redundancy and provides the location of each probe in the protein and mRNA coding sequence. Activation of a next function provides the amino acid and mRNA sequences of each probe of interest as well as the complementary sequence and the minimum dissociation temperature of the probe. A final routine prints out the amino acid sequence of the protein in parallel with the mRNA sequence listing all possible codons for each amino acid.
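The core of the minimum-redundancy search FINPROBE performs can be sketched as follows: the redundancy of a candidate probe is the product of the codon degeneracies of the amino acids it spans, so the best window is the one minimizing that product. The degeneracy table is the standard genetic code; the peptide and window length are arbitrary examples, and real probes typically drop the ambiguous wobble base of the final codon:

```python
# Hedged sketch of back-translation redundancy scoring, in the spirit of
# FINPROBE (not its actual code). Standard genetic-code degeneracies.

CODON_DEGENERACY = {
    "A": 4, "R": 6, "N": 2, "D": 2, "C": 2, "Q": 2, "E": 2, "G": 4,
    "H": 2, "I": 3, "L": 6, "K": 2, "M": 1, "F": 2, "P": 4, "S": 6,
    "T": 4, "V": 4, "W": 1, "Y": 2,
}

def probe_redundancy(peptide):
    """Number of distinct oligonucleotides that can encode this peptide."""
    n = 1
    for aa in peptide:
        n *= CODON_DEGENERACY[aa]
    return n

def best_window(protein, n_residues):
    """Window of n_residues consecutive amino acids with lowest redundancy."""
    windows = [protein[i:i + n_residues]
               for i in range(len(protein) - n_residues + 1)]
    return min(windows, key=probe_redundancy)

protein = "MSLWKDQMFA"            # arbitrary example peptide
w = best_window(protein, 3)      # 3 residues -> a 9-mer candidate probe
```

Tryptophan (W) and methionine (M), each with a single codon, anchor the low-redundancy windows, which is exactly why probe-design tools hunt for stretches rich in them.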

  3. Lung cancer screening beyond low-dose computed tomography: the role of novel biomarkers.

    PubMed

    Hasan, Naveed; Kumar, Rohit; Kavuru, Mani S

    2014-10-01

    Lung cancer is the most common and lethal malignancy in the world. The landmark National Lung Screening Trial (NLST) showed a 20% relative reduction in mortality in high-risk individuals screened with low-dose computed tomography. However, the poor specificity and low prevalence of lung cancer in the NLST pose major limitations to its widespread use. Furthermore, a lung nodule on CT scan requires a nuanced and individualized approach towards management. In this regard, advances in high-throughput technology (molecular diagnostics, multi-gene chips, proteomics, and bronchoscopic techniques) have led to the discovery of lung cancer biomarkers that have shown potential to complement the current screening standards. Early detection of lung cancer can be achieved by analysis of biomarkers from tissue samples within the respiratory tract such as sputum, saliva, nasal/bronchial airway epithelial cells and exhaled breath condensate, or through peripheral biofluids such as blood, serum and urine. Autofluorescence bronchoscopy has been employed in research settings to identify pre-invasive lesions not identified on CT scan. Although these modalities are not yet commercially available in the clinic setting, they will be available in the near future, and clinicians who care for patients with lung cancer should be aware of them. In this review, we present the up-to-date state of biomarker development, discuss their clinical relevance and predict their future role in lung cancer management.

  4. Assessment of bone microarchitecture in chronic kidney disease: a comparison of 2D bone texture analysis and high-resolution peripheral quantitative computed tomography at the radius and tibia.

    PubMed

    Bacchetta, Justine; Boutroy, Stéphanie; Vilayphiou, Nicolas; Fouque-Aubert, Anne; Delmas, Pierre D; Lespessailles, Eric; Fouque, Denis; Chapurlat, Roland

    2010-11-01

    Bone microarchitecture can be studied noninvasively using high-resolution peripheral quantitative computed tomography (HR-pQCT). However, this technique is not widely available, so more simple techniques may be useful. BMA is a new 2D high-resolution digital X-ray device, allowing for bone texture analysis with a fractal parameter (H(mean)). The aims of this study were (1) to evaluate the reproducibility of BMA at two novel sites (radius and tibia) in addition to the conventional site (calcaneus), (2) to compare the results obtained with BMA at all of those sites, and (3) to study the relationship between H(mean) and trabecular microarchitecture measured with an in vivo 3D device (HR-pQCT) at the distal tibia and radius. BMA measurements were performed at three sites (calcaneus, distal tibia, and radius) in 14 healthy volunteers to measure the short-term reproducibility and in a group of 77 patients with chronic kidney disease to compare BMA results to HR-pQCT results. The coefficient of variation of H(mean) was 1.2, 2.1, and 4.7% at the calcaneus, radius, and tibia, respectively. We found significant associations between trabecular volumetric bone mineral density and microarchitectural variables measured by HR-pQCT and H(mean) at the three sites (e.g., Pearson correlation between radial trabecular number and radial H(mean) r = 0.472, P < 0.001). This study demonstrated a significant but moderate relationship between 2D bone texture and 3D trabecular microarchitecture. BMA is a new reproducible technique with few technical constraints. Thus, it may represent an interesting tool for evaluating bone structure, in association with biological parameters and DXA.
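The short-term reproducibility figures quoted above are coefficients of variation (CV = SD/mean, in %). A minimal sketch with hypothetical repeated H(mean) measurements, not the study's data:

```python
import statistics

# Coefficient of variation of repeated measurements; the H_mean values
# below are invented for illustration only.

def cv_percent(values):
    """Coefficient of variation of repeated measurements, in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

repeats = [0.512, 0.520, 0.508, 0.515]   # hypothetical repeated H_mean scans
cv = cv_percent(repeats)                 # roughly 1%, like the calcaneus CV
```

A CV near 1% (as at the calcaneus) means the measurement noise is an order of magnitude smaller than at the tibia (4.7%), which is why site choice matters for longitudinal monitoring.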

  5. ChemScreener: A Distributed Computing Tool for Scaffold based Virtual Screening.

    PubMed

    Karthikeyan, Muthukumarasamy; Pandit, Deepak; Vyas, Renu

    2015-01-01

    In this work we present ChemScreener, a Java-based application to perform virtual library generation combined with virtual screening in a platform-independent distributed computing environment. ChemScreener comprises a scaffold identifier, a distinct scaffold extractor, an interactive virtual library generator as well as a virtual screening module for subsequently selecting putative bioactive molecules. The virtual libraries are annotated with chemophore-, pharmacophore- and toxicophore-based information for compound prioritization. The hits selected can then be further processed using QSAR, docking and other in silico approaches which can all be interfaced within the ChemScreener framework. As a sample application, in this work scaffold selectivity, diversity, connectivity and promiscuity towards six important therapeutic classes have been studied. In order to illustrate the computational power of the application, 55 scaffolds extracted from 161 anti-psychotic compounds were enumerated to produce a virtual library comprising 118 million compounds (17 GB) and annotated with chemophore, pharmacophore and toxicophore based features in a single step which would be non-trivial to perform with many standard software tools today on libraries of this size.

  6. Computational generation and screening of RNA motifs in large nucleotide sequence pools

    PubMed Central

    Kim, Namhee; Izzo, Joseph A.; Elmetwaly, Shereef; Gan, Hin Hark; Schlick, Tamar

    2010-01-01

    Although identification of active motifs in large random sequence pools is central to RNA in vitro selection, no systematic computational equivalent of this process has yet been developed. We develop a computational approach that combines target pool generation, motif scanning and motif screening using secondary structure analysis for applications to 10^12–10^14-sequence pools; large pool sizes are made possible using program redesign and supercomputing resources. We use the new protocol to search for aptamer and ribozyme motifs in pools up to experimental pool size (10^14 sequences). We show that motif scanning, structure matching and flanking sequence analysis, respectively, reduce the initial sequence pool by 6–8, 1–2 and 1 orders of magnitude, consistent with the rare occurrence of active motifs in random pools. The final yields match the theoretical yields from probability theory for simple motifs and overestimate experimental yields, which constitute lower bounds, for aptamers because screening analyses beyond secondary structure information are not considered systematically. We also show that designed pools using our nucleotide transition probability matrices can produce higher yields for RNA ligase motifs than random pools. Our methods for generating, analyzing and designing large pools can help improve RNA design via simulation of aspects of in vitro selection. PMID:20448026
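The theoretical yields from probability theory mentioned above follow from counting expected exact-motif occurrences in uniform random sequences. A back-of-the-envelope sketch, with illustrative pool and length parameters rather than the paper's values:

```python
# Expected count of one exact contiguous motif in a uniform random pool:
# each of the (L - m + 1) positions matches with probability (1/4)^m.
# Pool size and sequence lengths below are illustrative assumptions.

def expected_motif_count(pool_size, region_len, motif_len):
    """E[occurrences] of a fixed m-nt motif across the whole pool."""
    positions = region_len - motif_len + 1
    per_sequence = positions * 0.25 ** motif_len
    return pool_size * per_sequence

# A fixed 12-nt motif in a 10^14-sequence pool with 60-nt random regions:
n = expected_motif_count(1e14, 60, 12)   # a few times 10^8 sequences
```

This simple estimate already shows why large pools are essential: lengthening the motif by one nucleotide cuts the expected yield fourfold, so complex motifs are vanishingly rare in small pools.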

  7. Effects of LED-backlit computer screen and emotional self-regulation on human melatonin production.

    PubMed

    Sroykham, Watchara; Wongsawat, Yodchanan

    2013-01-01

    Melatonin is a circadian hormone transmitted via the suprachiasmatic nucleus (SCN) in the hypothalamus and the sympathetic nervous system to the pineal gland. It is a hormone necessary for many human functions such as immune, cardiovascular, neuronal and sleep/wake functions. Since melatonin enhancement or suppression is reported to be closely related to the photic information from the retina, in this paper we aim to study both the lighting condition and emotional self-regulation under different lighting conditions, together with their effects on the production of human melatonin. In this experiment, five participants were exposed to three lighting conditions produced by an LED-backlit computer screen (no light, red light (∼650 nm) and blue light (∼470 nm)) for 30 minutes (8-8:30 pm), and saliva samples were collected both before and after the experiments. After each experiment, the participants were also asked to answer the emotional self-regulation questionnaires PANAS and BRUMS for the corresponding light exposure condition. The results show that the mean difference in positive mood on PANAS between no light and red light is significant, with p = 0.001. Tension, depression, fatigue, confusion and vigor from BRUMS did not change significantly, while a significant change was observed in anger. Finally, the blue light of the LED-backlit computer screen suppressed melatonin production significantly more (91%) than red light (78%) and no light (44%).

  8. Computer-Delivered Screening and Brief Intervention for Alcohol Use in Pregnancy: A Pilot Randomized Trial

    PubMed Central

    Ondersma, Steven J.; Beatty, Jessica R.; Svikis, Dace S.; Strickler, Ronald C.; Tzilos, Golfo K.; Chang, Grace; Divine, W.; Taylor, Andrew R.; Sokol, Robert J.

    2015-01-01

    Background: Although screening and brief intervention (SBI) for unhealthy alcohol use has demonstrated efficacy in some trials, its implementation has been limited. Technology-delivered approaches are a promising alternative, particularly during pregnancy when the importance of alcohol use is amplified. The present trial evaluated the feasibility and acceptability of an interactive, empathic, video-enhanced, and computer-delivered SBI (e-SBI) plus three separate tailored mailings, and estimated intervention effects. Methods: We recruited 48 pregnant women who screened positive for alcohol risk at an urban prenatal care clinic. Participants were randomly assigned to the e-SBI plus mailings or to a control session on infant nutrition, and were reevaluated during their postpartum hospitalization. The primary outcome was 90-day period-prevalence abstinence as measured by timeline follow-back interview. Results: Participants rated the intervention as easy to use and helpful (4.7-5.0 on a 5-point scale). Blinded follow-up evaluation at childbirth revealed medium-size intervention effects on 90-day period-prevalence abstinence (OR = 3.4); similarly, intervention effects on a combined healthy pregnancy outcome variable (live birth, normal birthweight, and no NICU stay) were also of moderate magnitude in favor of e-SBI participants (OR = 3.3). As expected in this intentionally under-powered pilot trial, these effects were non-significant (p = .19 and .09, respectively). Conclusions: This pilot trial demonstrated the acceptability and preliminary efficacy of a computer-delivered screening and brief intervention (e-SBI) plus tailored mailings for alcohol use in pregnancy. These findings mirror the promising results of other trials using a similar approach, and should be confirmed in a fully-powered trial. PMID:26010235

  9. Computational reverse chemical ecology: Virtual screening and predicting behaviorally active semiochemicals for Bactrocera dorsalis

    PubMed Central

    2014-01-01

    Background Semiochemical is a generic term used for a chemical substance that influences the behaviour of an organism. It is a common term used in the field of chemical ecology to encompass pheromones, allomones, kairomones, attractants and repellents. Insects have mastered the art of using semiochemicals as communication signals and rely on them to find mates, hosts or habitats. This dependency of insects on semiochemicals has allowed chemical ecologists to develop environment-friendly pest management strategies. However, discovering semiochemicals is a laborious process that involves a plethora of behavioural and analytical techniques, making it expensive and time-consuming. Recently, a reverse chemical ecology approach using odorant binding proteins (OBPs) as targets for elucidating behaviourally active compounds has been gaining eminence. In this scenario, we describe a “computational reverse chemical ecology” approach for rapid screening of potential semiochemicals. Results We illustrate the high prediction accuracy of our computational method. We screened 25 semiochemicals for their binding potential to a GOBP of B. dorsalis using molecular docking (in silico) and molecular dynamics. In parallel, compounds were subjected to fluorescence quenching assays (experimental). The correlation between in silico and experimental data was significant (r2 = 0.9408; P < 0.0001). Further, predicted compounds were subjected to behavioural bioassays and were found to be highly attractive to insects. Conclusions The present study provides a unique methodology for rapid screening and prediction of behaviourally active semiochemicals. This methodology may be developed into a viable approach for prospecting active semiochemicals for pest control, which otherwise is a laborious process. PMID:24640964
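    The reported agreement between docking scores and quenching assays is a coefficient-of-determination calculation. A minimal sketch with invented placeholder values (the paper's actual binding data are not reproduced here):

```python
import numpy as np

# Hypothetical values standing in for the paper's data: docking binding
# energies (kcal/mol) and experimentally measured binding affinities for
# the same set of ligands.
in_silico = np.array([-7.2, -6.8, -6.1, -5.9, -5.2, -4.8])
experimental = np.array([8.1, 7.6, 6.5, 6.4, 5.3, 5.1])

# Pearson correlation and coefficient of determination (r^2)
r = np.corrcoef(in_silico, experimental)[0, 1]
r_squared = r ** 2
print(round(r_squared, 3))
```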

  10. Extrinsic Cation Selectivity of 2D Membranes

    PubMed Central

    2017-01-01

    From a systematic study of the concentration driven diffusion of positive and negative ions across porous 2D membranes of graphene and hexagonal boron nitride (h-BN), we prove their cation selectivity. Using the current–voltage characteristics of graphene and h-BN monolayers separating reservoirs of different salt concentrations, we calculate the reversal potential as a measure of selectivity. We tune the Debye screening length by exchanging the salt concentrations and demonstrate that negative surface charge gives rise to cation selectivity. Surprisingly, h-BN and graphene membranes show similar characteristics, strongly suggesting a common origin of selectivity in aqueous solvents. For the first time, we demonstrate that the cation flux can be increased by using ozone to create additional pores in graphene while maintaining excellent selectivity. We discuss opportunities to exploit our scalable method to use 2D membranes for applications including osmotic power conversion. PMID:28157333

  11. Estimating Development Cost for a Tailored Interactive Computer Program to Enhance Colorectal Cancer Screening Compliance

    PubMed Central

    Lairson, David R.; Chang, Yu-Chia; Bettencourt, Judith L.; Vernon, Sally W.; Greisinger, Anthony

    2006-01-01

    The authors used an actual-work estimate method to estimate the cost of developing a tailored interactive computer education program to improve compliance with colorectal cancer screening guidelines in a large multi-specialty group medical practice. Resource use was prospectively collected from time logs, administrative records, and a design and computing subcontract. Sensitivity analysis was performed to examine the uncertainty of the overhead cost rate and other parameters. The cost of developing the system was $328,866. The development cost was $52.79 per patient when amortized over a 7-year period with a cohort of 1,000 persons. About 20% of the cost was incurred in defining the theoretic framework and supporting literature, constructing the variables and survey, and conducting focus groups. About 41% of the cost was for developing the messages, algorithms, and constructing program elements, and the remaining cost was to create and test the computer education program. About 69% of the cost was attributable to personnel expenses. Development cost is rarely estimated but is important for feasibility studies and ex-ante economic evaluations of alternative interventions. The findings from this study may aid decision makers in planning, assessing, budgeting, and pricing development of tailored interactive computer-based interventions. PMID:16799126
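    The per-patient figure follows from amortizing the development cost over the program's lifetime. A sketch of the arithmetic, assuming a standard capital-recovery formula and a 3% discount rate (the exact rate is not stated in the abstract):

```python
def annualized_cost(total_cost, rate, years):
    """Equivalent annual cost via the capital-recovery factor."""
    crf = rate / (1 - (1 + rate) ** -years)
    return total_cost * crf

total = 328_866   # reported development cost ($)
cohort = 1_000    # persons in the amortization cohort
per_patient = annualized_cost(total, 0.03, 7) / cohort
print(round(per_patient, 2))  # close to the reported $52.79
```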

  12. Estimating development cost for a tailored interactive computer program to enhance colorectal cancer screening compliance.

    PubMed

    Lairson, David R; Chang, Yu-Chia; Bettencourt, Judith L; Vernon, Sally W; Greisinger, Anthony

    2006-01-01

    The authors used an actual-work estimate method to estimate the cost of developing a tailored interactive computer education program to improve compliance with colorectal cancer screening guidelines in a large multi-specialty group medical practice. Resource use was prospectively collected from time logs, administrative records, and a design and computing subcontract. Sensitivity analysis was performed to examine the uncertainty of the overhead cost rate and other parameters. The cost of developing the system was $328,866. The development cost was $52.79 per patient when amortized over a 7-year period with a cohort of 1,000 persons. About 20% of the cost was incurred in defining the theoretic framework and supporting literature, constructing the variables and survey, and conducting focus groups. About 41% of the cost was for developing the messages, algorithms, and constructing program elements, and the remaining cost was to create and test the computer education program. About 69% of the cost was attributable to personnel expenses. Development cost is rarely estimated but is important for feasibility studies and ex-ante economic evaluations of alternative interventions. The findings from this study may aid decision makers in planning, assessing, budgeting, and pricing development of tailored interactive computer-based interventions.

  13. Evening exposure to a light-emitting diodes (LED)-backlit computer screen affects circadian physiology and cognitive performance.

    PubMed

    Cajochen, Christian; Frey, Sylvia; Anders, Doreen; Späti, Jakub; Bues, Matthias; Pross, Achim; Mager, Ralph; Wirz-Justice, Anna; Stefani, Oliver

    2011-05-01

    Many people spend an increasing amount of time in front of computer screens equipped with light-emitting diodes (LED) with a short wavelength (blue range). Thus we investigated the repercussions on melatonin (a marker of the circadian clock), alertness, and cognitive performance levels in 13 young male volunteers under controlled laboratory conditions in a balanced crossover design. A 5-h evening exposure to a white LED-backlit screen with more than twice as much 464 nm light emission [irradiance of 0.241 W/(sr × m²), 2.1 × 10¹³ photons/(cm² × s), in the wavelength range of 454-474 nm] than a white non-LED-backlit screen [irradiance of 0.099 W/(sr × m²), 0.7 × 10¹³ photons/(cm² × s), in the wavelength range of 454-474 nm] elicited a significant suppression of the evening rise in endogenous melatonin and of subjective as well as objective sleepiness, as indexed by a reduced incidence of slow eye movements and EEG low-frequency activity (1-7 Hz) in frontal brain regions. Concomitantly, sustained attention, as determined by the GO/NOGO task; working memory/attention, as assessed by "explicit timing"; and declarative memory performance in a word-learning paradigm were significantly enhanced in the LED-backlit screen condition compared with the non-LED condition. Screen quality and visual comfort were rated the same in both screen conditions, whereas the non-LED screen tended to be considered brighter. Our data indicate that the spectral profile of light emitted by computer screens impacts on circadian physiology, alertness, and cognitive performance levels. The challenge will be to design a computer screen with a spectral profile that can be individually programmed to add timed, essential light information to the circadian system in humans.
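    The photon-flux figures can be sanity-checked from first principles. A short sketch computing the energy of a single 464 nm photon; note that converting the radiance values in W/(sr × m²) to photons/(cm² × s) additionally requires the viewing geometry (solid angle), which the abstract does not specify:

```python
# Energy of one photon at the LED emission peak (464 nm), and the number
# of photons carried per joule at that wavelength.
h = 6.626e-34        # Planck constant (J·s)
c = 2.998e8          # speed of light (m/s)
wavelength = 464e-9  # m

photon_energy = h * c / wavelength     # ~4.3e-19 J per photon
photons_per_joule = 1.0 / photon_energy
print(f"{photon_energy:.2e} J/photon, {photons_per_joule:.2e} photons/J")
```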

  14. Computer-aided diagnostics of screening mammography using content-based image retrieval

    NASA Astrophysics Data System (ADS)

    Deserno, Thomas M.; Soiron, Michael; de Oliveira, Júlia E. E.; de A. Araújo, Arnaldo

    2012-03-01

    Breast cancer is one of the main causes of death among women in occidental countries. In the last years, screening mammography has been established worldwide for early detection of breast cancer, and computer-aided diagnostics (CAD) is being developed to assist physicians reading mammograms. A promising method for CAD is content-based image retrieval (CBIR). Recently, we have developed a classification scheme of suspicious tissue pattern based on the support vector machine (SVM). In this paper, we continue moving towards automatic CAD of screening mammography. The experiments are based on a total of 10,509 radiographs collected from different sources. Of these, 3,375 images carry one chain code annotation of cancerous regions and 430 radiographs carry more than one. In different experiments, this data is divided into 12 and 20 classes, distinguishing between four categories of tissue density, three categories of pathology and, in the 20-class problem, two categories of lesion type. Balancing the number of images in each class leaves 233 and 45 images in each of the 12 and 20 classes, respectively. Using a two-dimensional principal component analysis, features are extracted from small patches of 128 × 128 pixels and classified by means of an SVM. Overall, the accuracy of the raw classification was 61.6% and 52.1% for the 12- and the 20-class problem, respectively. The confusion matrices are assessed for detailed analysis. Furthermore, an implementation of an SVM-based CBIR system for CADx in screening mammography is presented. In conclusion, with smarter patch extraction, the CBIR approach might reach precision rates that are helpful for physicians. This, however, needs more comprehensive evaluation on clinical data.
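    The patch-classification pipeline (dimensionality reduction followed by an SVM) can be sketched as follows, using synthetic 128 × 128 "patches" in place of mammogram data and scikit-learn's ordinary PCA as a stand-in for the paper's two-dimensional PCA variant:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_per_class, n_classes, patch = 40, 3, 128
X, y = [], []
for c in range(n_classes):
    # each synthetic class gets a different mean intensity
    imgs = rng.normal(loc=c, scale=1.0, size=(n_per_class, patch * patch))
    X.append(imgs)
    y += [c] * n_per_class
X, y = np.vstack(X), np.array(y)

Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)
clf = make_pipeline(PCA(n_components=16), SVC(kernel="rbf"))
clf.fit(Xtr, ytr)
acc = clf.score(Xte, yte)
print(round(acc, 2))
```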

  15. Temporal analysis of laser beam propagation in the atmosphere using computer-generated long phase screens.

    PubMed

    Dios, Federico; Recolons, Jaume; Rodríguez, Alejandro; Batet, Oscar

    2008-02-04

    Temporal analysis of the irradiance at the detector plane is intended as the first step in the study of the mean fade time in a free optical communication system. In the present work this analysis has been performed for a Gaussian laser beam propagating in the atmospheric turbulence by means of computer simulation. To this end, we have adapted a previously known numerical method to the generation of long phase screens. The screens are displaced in a transverse direction as the wave is propagated, in order to simulate the wind effect. The amplitude of the temporal covariance and its power spectrum have been obtained at the optical axis, at the beam centroid and at a certain distance from these two points. Results have been worked out for weak, moderate and strong turbulence regimes and when possible they have been compared with theoretical models. These results show a significant contribution of beam wander to the temporal behaviour of the irradiance, even in the case of weak turbulence. We have also found that the spectral bandwidth of the covariance is hardly dependent on the Rytov variance.
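    The long-phase-screen method builds on the standard FFT-based generation of a single Kolmogorov screen. A minimal sketch of that base step, with an illustrative Fried parameter r0 and grid (the paper's long-screen extension and wind-driven transverse displacement are not reproduced here):

```python
import numpy as np

def phase_screen(n=256, delta=0.01, r0=0.1, seed=0):
    """n x n phase screen (rad) with Kolmogorov statistics, FFT method."""
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n, d=delta)
    fxx, fyy = np.meshgrid(fx, fx)
    f = np.sqrt(fxx**2 + fyy**2)
    f[0, 0] = 1e-9  # avoid division by zero at the piston term
    # Kolmogorov phase power spectral density: 0.023 r0^(-5/3) f^(-11/3)
    psd = 0.023 * r0 ** (-5 / 3) * f ** (-11 / 3)
    psd[0, 0] = 0.0  # remove piston
    df = 1.0 / (n * delta)
    cn = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) * np.sqrt(psd) * df
    return np.real(np.fft.ifft2(cn)) * n * n

screen = phase_screen()
print(screen.shape)
```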

  16. Pre- and postoperative neurocognitive deficits in brain tumor patients assessed by a computer based screening test.

    PubMed

    Hoffermann, Markus; Bruckmann, Lukas; Mahdy Ali, Kariem; Zaar, Karla; Avian, Alexander; von Campe, Gord

    2017-02-01

    Neurocognitive assessment is becoming increasingly important in neuro-oncology. The presence and degree of neurocognitive deficits in patients with brain tumors appear to be important not only as outcome measures but also in treatment planning and as possible prognostic markers for tumor progression. Common screening methods for neurocognitive deficits are often insufficient in uncovering subtle changes, or harbor the risk of being observer-dependent and time-consuming. We present data of brain tumor patients screened by a computer-based neurocognitive assessment tool before and after surgery. 196 patients with tumor resections were tested at our institution using the NeuroCog Fx® software 2 days before and 3-4 months after surgery. In addition to the test results, patient-related information, such as age, sex, handedness, level of education, pre- and postoperative neurological status, KPS, location and histopathological diagnosis, was recorded. These prospectively collected results were correlated in the retrospective study presented here. The majority of patients with malignant gliomas, metastases and meningiomas showed significant deficits in various neurocognitive domains; most of them improved or did not decline in their postoperative neurocognitive performance. Interestingly, there was no significant correlation between neurocognitive deficits and brain tumor location. In the future, standardized neuropsychological assessment should become an essential part of the management and care of patients with brain tumors to provide more personalized and tailored treatment. Further studies will improve the understanding of the influence of various treatment modalities on neurocognition.

  17. A computational screen for mammalian pseudouridylation guide H/ACA RNAs.

    PubMed

    Schattner, Peter; Barberan-Soler, Sergio; Lowe, Todd M

    2006-01-01

    The box H/ACA RNA gene family is one of the largest non-protein-coding gene families in eukaryotes and archaea. Recently, we developed snoGPS, a computational screening program for H/ACA snoRNAs, and applied it to Saccharomyces cerevisiae. We report here results of extending our method to screen for H/ACA RNAs in multiple large genomes of related species, and apply it to the human, mouse, and rat genomes. Because of the 250-fold larger search space compared to S. cerevisiae, significant enhancements to our algorithms were required. Complementing extensive cloning experiments performed by others, our findings include the detection and experimental verification of seven new mammalian H/ACA RNAs and the prediction of 23 new H/ACA RNA pseudouridine guide assignments. These assignments include four for H/ACA RNAs previously classified as orphan H/ACA RNAs with no known targets. We also determined systematic syntenic conservation among human and mouse H/ACA RNAs. With this work, 82 of 97 ribosomal RNA pseudouridines and 18 of 32 spliceosomal RNA pseudouridines in mammals have been linked to H/ACA guide RNAs.

  18. A Computational Screen for Regulators of Oxidative Phosphorylation Implicates SLIRP in Mitochondrial RNA Homeostasis

    PubMed Central

    Baughman, Joshua M.; Nilsson, Roland; Gohil, Vishal M.; Arlow, Daniel H.; Gauhar, Zareen; Mootha, Vamsi K.

    2009-01-01

    The human oxidative phosphorylation (OxPhos) system consists of approximately 90 proteins encoded by nuclear and mitochondrial genomes and serves as the primary cellular pathway for ATP biosynthesis. While the core protein machinery for OxPhos is well characterized, many of its assembly, maturation, and regulatory factors remain unknown. We exploited the tight transcriptional control of the genes encoding the core OxPhos machinery to identify novel regulators. We developed a computational procedure, termed expression screening, that integrates information from thousands of microarray data sets in a principled manner to identify genes that are consistently co-expressed with a target pathway across biological contexts. We applied expression screening to predict dozens of novel regulators of OxPhos. For two candidate genes, CHCHD2 and SLIRP, we show that silencing with RNAi results in destabilization of OxPhos complexes and a marked loss of OxPhos enzymatic activity. Moreover, we show that SLIRP plays an essential role in maintaining mitochondrial-localized mRNA transcripts that encode OxPhos protein subunits. Our findings provide a catalogue of potential novel OxPhos regulators that advance our understanding of the coordination between nuclear and mitochondrial genomes for the regulation of cellular energy metabolism. PMID:19680543
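    The expression-screening idea — score each candidate gene by its co-expression with the target pathway, aggregated across many independent datasets — can be illustrated with a toy example in which one gene is planted to co-express with the pathway (all sizes and indices here are invented):

```python
import numpy as np

rng = np.random.default_rng(1)
n_genes, n_samples, n_datasets = 50, 30, 5
pathway_genes = [0, 1, 2]  # indices of "known" pathway genes

scores = np.zeros(n_genes)
for _ in range(n_datasets):
    expr = rng.normal(size=(n_genes, n_samples))
    # plant gene 10 as co-expressed with the pathway in every dataset
    pathway_signal = expr[pathway_genes].mean(axis=0)
    expr[10] = pathway_signal + 0.1 * rng.normal(size=n_samples)
    # correlate every gene with the pathway's mean expression profile
    target = expr[pathway_genes].mean(axis=0)
    corr = np.array([np.corrcoef(expr[g], target)[0, 1] for g in range(n_genes)])
    scores += corr
scores /= n_datasets
print(int(np.argmax(scores)))  # the planted co-expressed gene ranks first
```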

  19. The promise of computer-assisted auscultation in screening for structural heart disease and clinical teaching.

    PubMed

    Zühlke, L; Myer, L; Mayosi, B M

    2012-08-01

    Cardiac auscultation has been the central clinical tool for the diagnosis of valvular and other structural heart diseases for over a century. Physicians acquire competence in this technique through considerable training and experience. In Africa, however, we face a shortage of physicians and have the lowest health personnel-to-population ratio in the world. One of the proposed solutions for tackling this crisis is the adoption of health technologies and product innovations to support different cadres of health workers as part of task shifting. Computer-assisted auscultation (CAA) uses a digital stethoscope combined with acoustic neural networking to provide a visual display of heart sounds and murmurs, and analyses the recordings to distinguish between innocent and pathological murmurs. In so doing, CAA may serve as an objective tool for the screening of structural heart disease and facilitate the teaching of cardiac auscultation. This article reviews potential clinical applications of CAA.

  20. Screening of photosynthetic pigments for herbicidal activity with a new computational molecular approach.

    PubMed

    Krishnaraj, R Navanietha; Chandran, Saravanan; Pal, Parimal; Berchmans, Sheela

    2013-12-01

    There is immense interest among researchers in identifying new herbicides that are effective against weeds without harming the environment. In this work, photosynthetic pigments are used as ligands to predict their herbicidal activity. The enzyme 5-enolpyruvylshikimate-3-phosphate (EPSP) synthase is a good target for herbicides. Homology modeling of the target enzyme is done using Modeler 9.11 and the model is validated. Docking studies were performed with the AutoDock Vina algorithm to predict the binding of the natural pigments β-carotene, chlorophyll a, chlorophyll b, phycoerythrin and phycocyanin to the target. β-carotene, phycoerythrin and phycocyanin have higher binding energies, indicating herbicidal activity of these pigments. This work reports a procedure to screen herbicides with a computational molecular approach. These pigments may serve as potential bioherbicides in the future.

  1. Fish freshness detection by a computer screen photoassisted based gas sensor array.

    PubMed

    Alimelli, Adriano; Pennazza, Giorgio; Santonico, Marco; Paolesse, Roberto; Filippini, Daniel; D'Amico, Arnaldo; Lundström, Ingemar; Di Natale, Corrado

    2007-01-23

    In recent years, a large number of different measurement methodologies have been applied to assess fish freshness. Among them, the connection between freshness and headspace composition has been considered by gas chromatographic analysis and, for the last two decades, by a number of sensors and biosensors aimed at measuring characteristic indicators (usually amines). More recently, so-called artificial olfaction systems, which gather together many non-specific sensors, have also shown a certain capability to transduce the global composition of the fish headspace, capturing the differences between fresh and spoiled products. One of the main motivations for introducing sensor systems, as opposed to analytical methods, is the claimed possibility of distributing freshness control, since sensors are expected to be "portable" and "simple". In spite of these objectives, sensor systems have so far not resulted in any tool that may be broadly distributed. In this paper, we present a chemical sensor array where the optical features of layers of chemicals, sensitive to volatile compounds typical of spoilage processes in fish, are interrogated by a very simple platform based on a computer screen and a web cam. An array of metalloporphyrins is used here to classify fillets of thawed fish according to their storage days and to monitor the spoilage of filleted anchovies over a period of 8 h. Results indicate a complete identification of the storage days of thawed fillets and a determination of the storage time of anchovies held at room temperature with a root mean square error of validation of about 30 min. The optical system produces a sort of spectral fingerprint containing information about both the absorbance and the emission of the sensitive layer. The system illustrated here, based on computer peripherals, can be easily scaled to any device endowed with a programmable screen and a camera such as cellular phones offering for the first time the

  2. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    NASA Astrophysics Data System (ADS)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    The first stage of production of any oil reservoir involves oil displacement by natural drive mechanisms such as solution gas drive, gas cap drive and gravity drainage. Typically, improved oil recovery (IOR) methods are applied to oil reservoirs that have been depleted naturally. In more recent years, IOR techniques are applied to reservoirs even before their natural energy drive is exhausted by primary depletion. Descriptive screening criteria for IOR methods are used to select the appropriate recovery technique according to the fluid and rock properties. This methodology helps in assessing the most suitable recovery process for field deployment of a candidate reservoir. However, the already published screening guidelines neither provide information about the expected reservoir performance nor suggest a set of project design parameters, which can be used towards the optimization of the process. In this study, artificial neural networks (ANN) are used to build a high-performance neuro-simulation tool for screening different improved oil recovery techniques: miscible injection (CO2 and N2), waterflooding and steam injection processes. The simulation tool consists of proxy models that implement a multilayer cascade feedforward back propagation network algorithm. The tool is intended to narrow the ranges of possible scenarios to be modeled using conventional simulation, reducing the extensive time and energy spent in dynamic reservoir modeling. A commercial reservoir simulator is used to generate the data to train and validate the artificial neural networks. The proxy models are built considering four different well patterns with different well operating conditions as the field design parameters. Different expert systems are developed for each well pattern. The screening networks predict oil production rate and cumulative oil production profiles for a given set of rock and fluid properties, and design parameters. The results of this study show that the networks are
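    The proxy-model idea above — train a network on simulator runs so it can stand in for the expensive reservoir simulator — can be sketched with scikit-learn's generic MLP in place of the paper's cascade feedforward network; the features and response below are synthetic stand-ins for rock/fluid properties and production figures:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 4))           # stand-ins for rock/fluid properties
y = X[:, 0] * X[:, 1] + 0.05 * X[:, 2]   # pretend simulator response
net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0)
net.fit(X[:150], y[:150])                # train on "simulator" runs
r2 = net.score(X[150:], y[150:])         # proxy accuracy on held-out runs
print(round(r2, 2))
```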

  3. Optoelectronics with 2D semiconductors

    NASA Astrophysics Data System (ADS)

    Mueller, Thomas

    2015-03-01

    Two-dimensional (2D) atomic crystals, such as graphene and layered transition-metal dichalcogenides, are currently receiving a lot of attention for applications in electronics and optoelectronics. In this talk, I will review our research activities on electrically driven light emission, photovoltaic energy conversion and photodetection in 2D semiconductors. In particular, WSe2 monolayer p-n junctions formed by electrostatic doping using a pair of split gate electrodes, type-II heterojunctions based on MoS2/WSe2 and MoS2/phosphorene van der Waals stacks, 2D multi-junction solar cells, and 3D/2D semiconductor interfaces will be presented. Upon optical illumination, conversion of light into electrical energy occurs in these devices. If an electrical current is driven, efficient electroluminescence is obtained. I will present measurements of the electrical characteristics, the optical properties, and the gate voltage dependence of the device response. In the second part of my talk, I will discuss photoconductivity studies of MoS2 field-effect transistors. We identify photovoltaic and photoconductive effects, which both show strong photoconductive gain. A model will be presented that reproduces our experimental findings, such as the dependence on optical power and gate voltage. We envision that the efficient photon conversion and light emission, combined with the advantages of 2D semiconductors, such as flexibility, high mechanical stability and low costs of production, could lead to new optoelectronic technologies.

  4. 2D vs. 3D mammography observer study

    NASA Astrophysics Data System (ADS)

    Fernandez, James Reza F.; Hovanessian-Larsen, Linda; Liu, Brent

    2011-03-01

    Breast cancer is the most common type of non-skin cancer in women. 2D mammography is a screening tool to aid in the early detection of breast cancer, but has diagnostic limitations of overlapping tissues, especially in dense breasts. 3D mammography has the potential to improve detection outcomes by increasing specificity, and a new 3D screening tool with a 3D display for mammography aims to improve performance and efficiency as compared to 2D mammography. An observer study using a mammography phantom was performed to compare traditional 2D mammography with this new 3D mammography technique. In comparing 3D and 2D mammography, there was no difference in calcification detection, and mass detection was better in 2D than in 3D. However, there was a significant decrease in reading time for masses, calcifications, and normals in 3D compared to 2D, as well as more favorable confidence levels in reading normal cases. Given the limitations of the mammography phantom used, a clearer comparison of 3D and 2D mammography may be acquired by incorporating human studies in the future.

  5. Effect of 3G cell phone exposure with computer controlled 2-D stepper motor on non-thermal activation of the hsp27/p38MAPK stress pathway in rat brain.

    PubMed

    Kesari, Kavindra Kumar; Meena, Ramovatar; Nirala, Jayprakash; Kumar, Jitender; Verma, H N

    2014-03-01

    Cell phone radiation exposure and its biological interaction is a present concern of debate. The present study aimed to investigate the effect of 3G cell phone exposure with a computer-controlled 2-D stepper motor on 45-day-old male Wistar rat brain. Animals were exposed for 2 h a day for 60 days using a mobile phone with angular movement from zero to 30°. The variation of the motor is restricted to 90° with respect to the horizontal plane, moving at a pre-determined rate of 2° per minute. Immediately after 60 days of exposure, animals were sacrificed and a number of parameters (DNA double-strand breaks, micronuclei, caspase 3, apoptosis, DNA fragmentation, expression of stress-responsive genes) were assessed. Results show that microwave radiation emitted from a 3G mobile phone significantly induced DNA strand breaks in brain. Meanwhile, a significant increase in micronuclei, caspase 3 and apoptosis was also observed in the exposed group (P < 0.05). Western blotting results show that 3G mobile phone exposure causes a transient increase in phosphorylation of hsp27, hsp70, and p38 mitogen-activated protein kinase (p38MAPK), which leads to mitochondrial dysfunction-mediated cytochrome c release and subsequent activation of caspases, involved in the process of radiation-induced apoptotic cell death. The study shows that oxidative stress is the main factor activating a variety of cellular signal transduction pathways, among which hsp27/p38MAPK is the principal stress-response pathway. The results indicate that 3G mobile phone radiation affects brain function and may cause several neurological disorders.

  6. Static & Dynamic Response of 2D Solids

    SciTech Connect

    Lin, Jerry

    1996-07-15

    NIKE2D is an implicit finite-element code for analyzing the finite deformation, static and dynamic response of two-dimensional, axisymmetric, plane strain, and plane stress solids. The code is fully vectorized and available on several computing platforms. A number of material models are incorporated to simulate a wide range of material behavior including elasto-plasticity, anisotropy, creep, thermal effects, and rate dependence. Slideline algorithms model gaps and sliding along material interfaces, including interface friction, penetration and single surface contact. Interactive graphics and rezoning are included for analyses with large mesh distortions. In addition to quasi-Newton and arc-length procedures, adaptive algorithms can be defined to solve the implicit equations using the solution language ISLAND. Each of these capabilities and more make NIKE2D a robust analysis tool.

  7. The UK Lung Cancer Screening Trial: a pilot randomised controlled trial of low-dose computed tomography screening for the early detection of lung cancer.

    PubMed Central

    Field, John K; Duffy, Stephen W; Baldwin, David R; Brain, Kate E; Devaraj, Anand; Eisen, Tim; Green, Beverley A; Holemans, John A; Kavanagh, Terry; Kerr, Keith M; Ledson, Martin; Lifford, Kate J; McRonald, Fiona E; Nair, Arjun; Page, Richard D; Parmar, Mahesh Kb; Rintoul, Robert C; Screaton, Nicholas; Wald, Nicholas J; Weller, David; Whynes, David K; Williamson, Paula R; Yadegarfar, Ghasem; Hansell, David M

    2016-01-01

    BACKGROUND Lung cancer kills more people than any other cancer in the UK (5-year survival < 13%). Early diagnosis can save lives. The USA-based National Lung Screening Trial reported a 20% relative reduction in lung cancer mortality and a 6.7% reduction in all-cause mortality in low-dose computed tomography (LDCT)-screened subjects. OBJECTIVES To (1) analyse LDCT lung cancer screening in a high-risk UK population, determine optimum recruitment, screening, reading and care pathway strategies; and (2) assess the psychological consequences and the health-economic implications of screening. DESIGN A pilot randomised controlled trial comparing intervention with usual care. A population-based risk questionnaire identified individuals who were at high risk of developing lung cancer (≥ 5% over 5 years). SETTING Thoracic centres with expertise in lung cancer imaging, respiratory medicine, pathology and surgery: Liverpool Heart & Chest Hospital, Merseyside, and Papworth Hospital, Cambridgeshire. PARTICIPANTS Individuals aged 50-75 years, at high risk of lung cancer, in the primary care trusts adjacent to the centres. INTERVENTIONS A thoracic LDCT scan. Follow-up computed tomography (CT) scans as per protocol. Referral to multidisciplinary team clinics was determined by nodule size criteria. MAIN OUTCOME MEASURES Population-based recruitment based on risk stratification; management of the trial through web-based database; optimal characteristics of CT scan readers (radiologists vs. radiographers); characterisation of CT-detected nodules utilising volumetric analysis; prevalence of lung cancer at baseline; sociodemographic factors affecting participation; psychosocial measures (cancer distress, anxiety, depression, decision satisfaction); and cost-effectiveness modelling. RESULTS A total of 247,354 individuals were approached to take part in the trial; 30.7% responded positively to the screening invitation. Recruitment of participants resulted in 2028 in the CT arm and 2027 in

  8. Realistic and efficient 2D crack simulation

    NASA Astrophysics Data System (ADS)

    Yadegar, Jacob; Liu, Xiaoqing; Singh, Abhishek

    2010-04-01

    Although numerical algorithms for 2D crack simulation have been studied in Modeling and Simulation (M&S) and computer graphics for decades, realism and computational efficiency are still major challenges. In this paper, we introduce a high-fidelity, scalable, adaptive and efficient run-time 2D crack/fracture simulation system by applying the mathematically elegant Peano-Cesaro triangular meshing/remeshing technique to model the generation of shards/fragments. The recursive fractal sweep associated with the Peano-Cesaro triangulation provides efficient local multi-resolution refinement to any level of detail. The generated binary decomposition tree also provides an efficient neighbor-retrieval mechanism, used for mesh element splitting and merging with minimal memory requirements, which is essential for realistic 2D fragment formation. Upon load impact/contact/penetration, a number of factors, including impact angle, impact energy and material properties, are taken into account to produce the criteria for crack initiation, propagation and termination, leading to realistic fractal-like rubble/fragment formation. The aforementioned parameters are used as variables of probabilistic models of crack/shard formation, making the proposed solution highly adaptive by allowing machine-learning mechanisms to learn the optimal values for the variables/parameters from prior benchmark data generated by off-line physics-based simulation solutions that produce accurate fractures/shards, though at a highly non-real-time pace. Crack/fracture simulation has been conducted for various load impacts with different initial locations at various impulse scales. The simulation results demonstrate that the proposed system has the capability to realistically and efficiently simulate 2D crack phenomena (such as window shattering and shard generation) with diverse potential in military and civil M&S applications such as training and mission planning.
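
The recursive subdivision at the heart of a Peano-Cesaro-style triangulation can be sketched as longest-edge bisection, where each split becomes a node of the binary decomposition tree. This is a minimal illustration under our own assumptions, not the authors' implementation; all names are ours.

```python
def bisect(tri, depth):
    """Recursively bisect a triangle across its longest edge, returning the
    leaf triangles of the binary decomposition tree (2**depth leaves)."""
    if depth == 0:
        return [tri]
    a, b, c = tri
    # Rotate the vertices so that (a, b) is the longest edge.
    d2 = lambda p, q: (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    edges = [(d2(a, b), (a, b, c)), (d2(b, c), (b, c, a)), (d2(c, a), (c, a, b))]
    _, (a, b, c) = max(edges)
    m = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)  # midpoint of the longest edge
    # The two children share the new vertex m: one binary-tree node split.
    return bisect((a, m, c), depth - 1) + bisect((m, b, c), depth - 1)

leaves = bisect(((0.0, 0.0), (1.0, 0.0), (0.0, 1.0)), 4)
print(len(leaves))  # 2**4 = 16 sub-triangles
```

Because each split produces exactly two children sharing the new midpoint vertex, neighbor lookup and merge operations can be driven from the tree structure alone, which is the property the abstract exploits for low-memory remeshing.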

  9. Highly crystalline 2D superconductors

    NASA Astrophysics Data System (ADS)

    Saito, Yu; Nojima, Tsutomu; Iwasa, Yoshihiro

    2016-12-01

    Recent advances in materials fabrication have enabled the manufacturing of ordered 2D electron systems, such as heterogeneous interfaces, atomic layers grown by molecular beam epitaxy, exfoliated thin flakes and field-effect devices. These 2D electron systems are highly crystalline, and some of them, despite their single-layer thickness, exhibit a sheet resistance more than an order of magnitude lower than that of conventional amorphous or granular thin films. In this Review, we explore recent developments in the field of highly crystalline 2D superconductors and highlight the unprecedented physical properties of these systems. In particular, we explore the quantum metallic state (or possible metallic ground state), the quantum Griffiths phase observed in out-of-plane magnetic fields and the superconducting state maintained in anomalously large in-plane magnetic fields. These phenomena are examined in the context of weakened disorder and/or broken spatial inversion symmetry. We conclude with a discussion of how these unconventional properties make highly crystalline 2D systems promising platforms for the exploration of new quantum physics and high-temperature superconductors.

  10. Extensions of 2D gravity

    SciTech Connect

    Sevrin, A.

    1993-06-01

    After reviewing some aspects of gravity in two dimensions, I show that non-trivial embeddings of sl(2) in a semi-simple (super) Lie algebra give rise to a very large class of extensions of 2D gravity. The induced action is constructed as a gauged WZW model and an exact expression for the effective action is given.

  11. Drug search for leishmaniasis: a virtual screening approach by grid computing

    NASA Astrophysics Data System (ADS)

    Ochoa, Rodrigo; Watowich, Stanley J.; Flórez, Andrés; Mesa, Carol V.; Robledo, Sara M.; Muskus, Carlos

    2016-07-01

    The trypanosomatid protozoa Leishmania is endemic in 100 countries, with infections causing 2 million new cases of leishmaniasis annually. Disease symptoms can include severe skin and mucosal ulcers, fever, anemia, splenomegaly, and death. Unfortunately, therapeutics approved to treat leishmaniasis are associated with potentially severe side effects, including death. Furthermore, drug-resistant Leishmania parasites have developed in most endemic countries. To address an urgent need for new, safe and inexpensive anti-leishmanial drugs, we utilized the IBM World Community Grid to complete computer-based drug discovery screens (Drug Search for Leishmaniasis) using unique leishmanial proteins and a database of 600,000 drug-like small molecules. Protein structures from different Leishmania species were selected for molecular dynamics (MD) simulations, and a series of conformational "snapshots" were chosen from each MD trajectory to simulate the protein's flexibility. A Relaxed Complex Scheme methodology was used to screen 2000 MD conformations against the small molecule database, producing >1 billion protein-ligand structures. For each protein target, a binding spectrum was calculated to identify compounds predicted to bind with highest average affinity to all protein conformations. Significantly, four different Leishmania protein targets were predicted to strongly bind small molecules, with the strongest binding interactions predicted to occur for dihydroorotate dehydrogenase (LmDHODH; PDB:3MJY). A number of predicted tight-binding LmDHODH inhibitors were tested in vitro and potent selective inhibitors of Leishmania panamensis were identified. These promising small molecules are suitable for further development using iterative structure-based optimization and in vitro/in vivo validation assays.
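
The "binding spectrum" step of a Relaxed Complex Scheme screen — averaging each compound's docking score over all MD snapshots and ranking by mean predicted affinity — can be sketched as follows. Compound names and scores here are invented for illustration, not taken from the screen described above.

```python
# Each compound is docked against every MD snapshot of the target; the
# binding spectrum ranks compounds by mean predicted affinity in kcal/mol
# (more negative = tighter binding). All values below are hypothetical.
scores = {
    "cmpd_A": [-9.1, -8.7, -9.4],
    "cmpd_B": [-6.2, -7.0, -6.5],
    "cmpd_C": [-8.8, -9.0, -8.5],
}

def binding_spectrum(scores):
    """Return (compound, mean affinity) pairs sorted tightest-first."""
    means = {c: sum(v) / len(v) for c, v in scores.items()}
    return sorted(means.items(), key=lambda kv: kv[1])

for name, mean in binding_spectrum(scores):
    print(f"{name}: {mean:.2f} kcal/mol")
```

Averaging over conformations, rather than docking against a single static structure, is what lets the scheme reward compounds that bind well across the protein's flexible ensemble.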

  12. The Study of Learners' Preference for Visual Complexity on Small Screens of Mobile Computers Using Neural Networks

    ERIC Educational Resources Information Center

    Wang, Lan-Ting; Lee, Kun-Chou

    2014-01-01

    Vision plays an important role in educational technologies because it produces and communicates important functions in teaching and learning. In this paper, learners' preference for visual complexity on the small screens of mobile computers is studied using neural networks. The visual complexity in this study is divided into five…

  13. Use of computed tomography renal angiography for screening feline renal transplant donors.

    PubMed

    Bouma, Jennifer L; Aronson, Lillian R; Keith, Dennis G; Saunders, H Mark

    2003-01-01

    Preoperative knowledge of the renal vascular anatomy is important for selection of the appropriate feline renal donor. Intravenous urograms (IVUs) have been performed routinely to screen potential donors at the Veterinary Hospital of the University of Pennsylvania (VHUP), but the vascular phase views lack sufficient detail of the renal vascular anatomy. Computed tomography angiography (CTA), which requires a helical computed tomography (CT) scanner, has been found to provide superior renal vascular anatomic information in prospective human renal donors. The specific aims of this study were to 1) develop the CTA technique for the feline patient and 2) obtain preliminary information on feline renal vessel anatomy in potential renal donors. Ten healthy, potential feline renal donors were anesthetized and imaged using a third-generation helical CT scanner. The time delay between i.v. contrast medium injection and image acquisition, as well as the slice collimation, slice interval, pitch, exposure settings and reconstruction algorithms, were varied to maximize contrast medium opacification of the renal vascular anatomy. Optimal CTA acquisition parameters were determined to be: 1) 10-sec delay post-i.v. bolus of iodinated contrast medium; 2) two serially acquired helical scans (corresponding to arterial and venous phases) through the renal vasculature; 3) pitch of 2 (4 mm/sec patient translation, 2 mm slice collimation); and 4) 120-kVp, 160-mA, and 1-sec exposure settings. Retrospectively reconstructed CTA transverse images obtained at a 2-mm slice width and a 1-mm slice interval, in combination with two-dimensional reformatted images and three-dimensional reconstructed images, were qualitatively evaluated for vascular anatomy; vascular anatomy was confirmed at surgery. Four cats had single renal arteries and veins bilaterally; four cats had double renal veins. One cat had a small accessory artery supplying the caudal pole of the left kidney. 
One cat had a

  14. Computed tomography screening: the international early lung cancer action program experience.

    PubMed

    Henschke, Claudia I; Boffetta, Paolo; Yankelevitz, David F; Altorki, Nasser

    2015-05-01

    The International Early Lung Cancer Action Program (I-ELCAP) used a novel study design that provided quantitative information about annual CT screening for lung cancer. The results stimulated additional studies of lung cancer screening and ultimately led to the National Lung Screening Trial (NLST) being initiated in 2002, as the initial report in 1999 was sufficiently compelling to reawaken interest in screening for lung cancer. The authors think that the I-ELCAP and NLST "story" provides a strong argument for relevant agencies to consider alternative study designs for the public funding of studies aimed at evaluating the effectiveness of screening and other medical trials.

  15. Predicting Structures of Ru-Centered Dyes: A Computational Screening Tool.

    PubMed

    Fredin, Lisa A; Allison, Thomas C

    2016-04-07

    Dye-sensitized solar cells (DSCs) represent a means for harvesting solar energy to produce electrical power. Though a number of light harvesting dyes are in use, the search continues for more efficient and effective compounds to make commercially viable DSCs a reality. Computational methods have been increasingly applied to understand the dyes currently in use and to aid in the search for improved light harvesting compounds. Semiempirical quantum chemistry methods have a well-deserved reputation for giving good quality results in a very short amount of computer time. The most recent semiempirical models such as PM6 and PM7 are parametrized for a wide variety of molecule types, including organometallic complexes similar to DSC chromophores. In this article, the performance of PM6 is tested against a set of 20 molecules whose geometries were optimized using a density functional theory (DFT) method. It is found that PM6 gives geometries that are in good agreement with the optimized DFT structures. In order to reduce the differences between geometries optimized using PM6 and geometries optimized using DFT, the PM6 basis set parameters have been optimized for a subset of the molecules. It is found that it is sufficient to optimize the basis set for Ru alone to improve the agreement between the PM6 results and the DFT results. When this optimized Ru basis set is used, the mean unsigned error in Ru-ligand bond lengths is reduced from 0.043 to 0.017 Å in the set of 20 test molecules. Though the magnitude of these differences is small, the effect on the calculated UV/vis spectra is significant. These results clearly demonstrate the value of using PM6 to screen DSC chromophores as well as the value of optimizing PM6 basis set parameters for a specific set of molecules.
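
The geometry comparison described above reduces to a mean unsigned error (MUE) over corresponding bond lengths in the semiempirical and DFT structures. A minimal sketch, using placeholder bond lengths rather than values from the paper:

```python
# Mean unsigned error (MUE) between semiempirical (PM6) and DFT Ru-ligand
# bond lengths. The lengths below are hypothetical, for illustration only.
pm6 = [2.10, 2.05, 1.98, 2.21]   # Angstrom, hypothetical PM6 geometry
dft = [2.06, 2.07, 2.00, 2.16]   # Angstrom, hypothetical DFT reference

mue = sum(abs(p - d) for p, d in zip(pm6, dft)) / len(pm6)
print(f"MUE = {mue:.3f} Å")
```

Re-optimizing the Ru basis-set parameters, as the paper does, amounts to minimizing exactly this kind of error over a training subset of structures.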

  16. A computer touch screen system and training procedure for use with primate infants: Results from pigtail monkeys (Macaca nemestrina).

    PubMed

    Mandell, Dorothy J; Sackett, Gene P

    2008-03-01

    Computerized cognitive and perceptual testing has produced many advances in understanding adult brain-behavior relations across a variety of abilities and species. However, there has been little migration of this technology to the assessment of very young primate subjects. We describe a training procedure and software developed to teach infant monkeys to interact with a touch screen computer. Eighteen infant pigtail macaques began training at 90 postnatal days and five began at 180 postnatal days. All animals were trained to reliably touch a stimulus presented on a computer screen, and no significant differences were found between the two age groups. The results demonstrate the feasibility of using computers to assess cognitive and perceptual abilities early in development.

  17. Spatial task for rats testing position recognition of an object displayed on a computer screen.

    PubMed

    Klement, Daniel; Levcik, David; Duskova, Lenka; Nekovarova, Tereza

    2010-03-05

    We developed two spatial tasks for rats that employ a computer monitor for stimulus presentation. Both tasks test rats' ability to recognize the position of a distant object. In the first task the object was stationary except for moments when it jumped from one position to another. In the second task it moved continuously across the screen. Rats were trained in an operant chamber located in front of the monitor. They responded to the object's position by pressing a lever for a food reward. Responses were reinforced when the object was displayed in a to-be-recognized position in the first task, and when it was passing through a to-be-recognized region in the second task. The to-be-recognized position, as well as the to-be-recognized region, had to be determined with respect to surrounding orientation cues. The response rate of well-trained rats depended negatively on the distance between the object and the to-be-recognized position/region. In the first task this relationship was apparent for a short time after the object changed its position, and it held even for newly presented, unfamiliar positions of the object. We conclude that in both tasks the rats recognize the position of the object by estimating the distance between the object and the to-be-recognized position/region. We also analyzed the contribution of timing behavior to the solution of the second task.

  18. Preparation of forefinger's sequence on keyboard orients ocular fixations on computer screen.

    PubMed

    Coutté, Alexandre; Olivier, Gérard; Faure, Sylvane; Baccino, Thierry

    2014-08-01

    This study examined the links between attention, hand movements and eye movements performed in different spatial areas. Participants performed a visual search task on a computer screen while preparing to press two keyboard keys sequentially with their index finger. Results showed that planning of the manual sequence influenced the latency of the first saccade and the placement of the first fixation. In particular, even if the first fixation placement was influenced by the combination of both components of the prepared manual sequence in some trials, it was affected principally by the first component. Moreover, the probability that the first fixation placement reflected a combination of both components of the manual sequence was correlated with the speed of the second component. This finding suggests that preparation of the second component of the sequence influences simultaneous oculomotor behavior when motor control of the manual sequence relies on proactive motor planning. These results are discussed in the context of the current debate on eye/hand coordination research.

  19. Large-Scale Computational Screening Identifies First in Class Multitarget Inhibitor of EGFR Kinase and BRD4

    PubMed Central

    Allen, Bryce K.; Mehta, Saurabh; Ember, Stewart W. J.; Schonbrunn, Ernst; Ayad, Nagi; Schürer, Stephan C.

    2015-01-01

    Inhibition of cancer-promoting kinases is an established therapeutic strategy for the treatment of many cancers, although resistance to kinase inhibitors is common. One way to overcome resistance is to target orthogonal cancer-promoting pathways. Bromodomain and Extra-Terminal (BET) domain proteins, which belong to the family of epigenetic readers, have recently emerged as promising therapeutic targets in multiple cancers. The development of multitarget drugs that inhibit kinase and BET proteins therefore may be a promising strategy to overcome tumor resistance and prolong therapeutic efficacy in the clinic. We developed a general computational screening approach to identify novel dual kinase/bromodomain inhibitors from millions of commercially available small molecules. Our method integrated machine learning on large datasets of kinase inhibitors with structure-based drug design. Here we describe the computational methodology, including validation and characterization of our models, and their application and integration into a scalable virtual screening pipeline. We screened over 6 million commercially available compounds and selected 24 for testing in BRD4 and EGFR biochemical assays. We identified several novel BRD4 inhibitors, among them a first-in-class dual EGFR-BRD4 inhibitor. Our studies suggest that this computational screening approach may be broadly applicable for identifying dual kinase/BET inhibitors with potential for treating various cancers. PMID:26596901

  20. Touch screen computer-assisted health-related quality of life and distress data collection in head and neck cancer patients.

    PubMed

    de Bree, R; Verdonck-de Leeuw, I M; Keizer, A L; Houffelaar, A; Leemans, C R

    2008-04-01

    Touch screen computer-assisted health-related quality of life and distress data collection in head and neck cancer patients is feasible. It can be used for scientific documentation as well as in the clinical setting. Patients are willing to complete the questionnaire on a touch screen and find the equipment easy to use. Compliance needs to be improved by instructing clinicians and nurses and by a better alert system.

  1. Computer aided screening and evaluation of herbal therapeutics against MRSA infections.

    PubMed

    Skariyachan, Sinosh; Krishnan, Rao Shruti; Siddapa, Snehapriya Bangalore; Salian, Chithra; Bora, Prerana; Sebastian, Denoj

    2011-01-01

    Methicillin-resistant Staphylococcus aureus (MRSA), a pathogenic bacterium that causes life-threatening outbreaks such as community-onset and nosocomial infections, has emerged as a 'superbug'. The organism has developed resistance to all classes of antibiotics, including the best-known, vancomycin (VRSA). Hence, there is a need to develop new therapeutic agents. This study evaluates the potential use of botanicals against MRSA infections. Computer-aided design is an initial platform to screen novel inhibitors, and the data find applications in drug development. The drug-likeness and efficiency of various herbal compounds were screened by ADMET and docking studies. The virulence factors of most MRSA-associated infections are Penicillin Binding Protein 2A (PBP2A) and Panton-Valentine Leukocidin (PVL). Hence, native structures of these proteins (PDB: 1VQQ and 1T5R) were used as the drug targets. The docking studies revealed that the active component of Aloe vera, β-sitosterol ((3S, 8S, 9S, 10R, 13R, 14S, 17R)-17-[(2R, 5R)-5-ethyl-6-methylheptan-2-yl]-10, 13-dimethyl-2, 3, 4, 7, 8, 9, 11, 12, 14, 15, 16, 17-dodecahydro-1H-cyclopenta[a]phenanthren-3-ol), showed the best binding energies of -7.40 kcal/mol and -6.34 kcal/mol for PBP2A and PVL toxin, respectively. Similarly, Meliantriol ((1S)-1-[(2R, 3R, 5R)-5-hydroxy-3-[(3S, 5R, 9R, 10R, 13S, 14S, 17S)-3-hydroxy-4, 4, 10, 13, 14-pentamethyl-2, 3, 5, 6, 9, 11, 12, 15, 16, 17-decahydro-1H-cyclopenta[a]phenanthren-17-yl]oxolan-2-yl]-2-methylpropane-1, 2-diol), an active compound in Azadirachta indica (Neem), showed binding energies of -6.02 kcal/mol for PBP2A and -8.94 kcal/mol for PVL toxin. Similar studies were conducted with selected herbal compounds based on pharmacokinetic properties. The in silico data, tested in vitro, suggest that herbal extracts of Aloe vera, Neem, Guava (Psidium guajava), Pomegranate (Punica granatum) and tea (Camellia sinensis) can be used as therapeutics against MRSA infections.
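
The drug-likeness prescreen mentioned above (ADMET filtering before docking) is commonly approximated by Lipinski's rule of five. A minimal sketch on precomputed descriptors; the descriptor values below are illustrative placeholders, not measured properties of these compounds:

```python
# Lipinski rule-of-five drug-likeness prescreen on precomputed descriptors:
# molecular weight, logP, H-bond donors, H-bond acceptors.
# All descriptor values are hypothetical, for illustration only.
compounds = {
    "beta_sitosterol": {"mw": 414.7, "logp": 8.0, "hbd": 1, "hba": 1},
    "meliantriol":     {"mw": 490.7, "logp": 3.1, "hbd": 4, "hba": 5},
}

def lipinski_violations(d):
    """Count rule-of-five violations (<= 1 is conventionally acceptable)."""
    return sum([d["mw"] > 500, d["logp"] > 5, d["hbd"] > 5, d["hba"] > 10])

for name, desc in compounds.items():
    print(name, "violations:", lipinski_violations(desc))
```

In a real pipeline these descriptors would come from a cheminformatics toolkit rather than being typed in by hand; only compounds passing the filter proceed to the docking stage.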

  3. WE-AB-BRA-07: Quantitative Evaluation of 2D-2D and 2D-3D Image Guided Radiation Therapy for Clinical Trial Credentialing, NRG Oncology/RTOG

    SciTech Connect

    Giaddui, T; Yu, J; Xiao, Y; Jacobs, P; Manfredi, D; Linnemann, N

    2015-06-15

    Purpose: 2D-2D kV image guided radiation therapy (IGRT) credentialing evaluation for clinical trial qualification was historically qualitative, relying on submitted screen captures of the fusion process. However, as quantitative DICOM 2D-2D and 2D-3D image registration tools are implemented in clinical practice for better precision, especially in centers that treat patients with protons, better IGRT credentialing techniques are needed. The aim of this work is to establish methodologies for quantitatively reviewing IGRT submissions based on DICOM 2D-2D and 2D-3D image registration and to test these methodologies in reviewing 2D-2D and 2D-3D IGRT submissions for RTOG/NRG Oncology clinical trial qualifications. Methods: DICOM 2D-2D and 2D-3D automated and manual image registration were tested using the Harmony tool in MIM software. 2D kV orthogonal portal images are fused with the reference digitally reconstructed radiographs (DRRs) in the 2D-2D registration, while the 2D portal images are fused with the DICOM planning CT image in the 2D-3D registration. The Harmony tool allows alignment of the two images used in the registration process and also calculates the required shifts. Shifts calculated using MIM are compared with those submitted by institutions for IGRT credentialing. Reported shifts are considered acceptable if differences are less than 3 mm. Results: Several tests have been performed on the 2D-2D and 2D-3D registration. The results indicated good agreement between submitted and calculated shifts. A workflow for reviewing these IGRT submissions has been developed and will eventually be used to review IGRT submissions. Conclusion: The IROC Philadelphia RTQA center has developed and tested a new workflow for reviewing DICOM 2D-2D and 2D-3D IGRT credentialing submissions made by different cancer clinical centers, especially proton centers. 
NRG Center for Innovation in Radiation Oncology (CIRO) and IROC RTQA center continue their collaborative efforts to enhance
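
The acceptance criterion described in this abstract — reported shifts agree with independently calculated shifts to within 3 mm — reduces to a per-axis tolerance check. A minimal sketch with invented shift values:

```python
# Hedged sketch of the IGRT credentialing review criterion: reported couch
# shifts are accepted when every axis differs from the independently
# calculated shift by less than 3 mm. Shift values below are invented.
TOLERANCE_MM = 3.0

def shifts_acceptable(submitted, calculated, tol=TOLERANCE_MM):
    """Compare per-axis (x, y, z) shifts in mm against the tolerance."""
    return all(abs(s - c) < tol for s, c in zip(submitted, calculated))

print(shifts_acceptable((1.0, -2.5, 0.0), (1.8, -1.9, 0.4)))   # True
print(shifts_acceptable((1.0, -2.5, 0.0), (5.2, -1.9, 0.4)))   # False
```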

  4. A computer touch-screen version of the North American Spine Society outcome assessment instrument for the lumbar spine.

    PubMed

    Schaeren, S; Bischoff-Ferrari, H A; Knupp, M; Dick, W; Huber, J F; Theiler, R

    2005-02-01

    We validated the North American Spine Society (NASS) outcome assessment instrument for the lumbar spine in a computerised touch-screen format and assessed patients' acceptance, taking into account previous computer experience, age and gender. Fifty consecutive patients with symptomatic and radiologically proven degenerative disease of the lumbar spine completed both the hard-copy (paper) and the computerised versions of the NASS questionnaire. Statistical analysis showed high agreement between the paper and the touch-screen computer formats for both subscales (intraclass correlation coefficient 0.94, 95% confidence interval 0.90 to 0.97), independent of computer experience, age and gender. In total, 55% of patients stated that the computer format was easier to use and 66% preferred it to the paper version (p < 0.0001 among subjects expressing a preference). Our data indicate that the touch-screen format is comparable to the paper form. It may improve follow-up in clinical practice and research by meeting patients' preferences and minimising administrative work.

  5. Computational detection and suppression of sequence-specific off-target phenotypes from whole genome RNAi screens

    PubMed Central

    Zhong, Rui; Kim, Jimi; Kim, Hyun Seok; Kim, Minsoo; Lum, Lawrence; Levine, Beth; Xiao, Guanghua; White, Michael A.; Xie, Yang

    2014-01-01

    A challenge for large-scale siRNA loss-of-function studies is the biological pleiotropy resulting from multiple modes of action of siRNA reagents. A major confounding feature of these reagents is the microRNA-like translational quelling resulting from short regions of oligonucleotide complementarity to many different messenger RNAs. We developed a computational approach, deconvolution analysis of RNAi screening data, for automated quantitation of off-target effects in RNAi screening data sets. Substantial reduction of off-target rates was experimentally validated in five distinct biological screens across different genome-wide siRNA libraries. A public-access graphical user interface has been constructed to facilitate application of this algorithm. PMID:24972830

  6. The interplay of attention economics and computer-aided detection marks in screening mammography

    NASA Astrophysics Data System (ADS)

    Schwartz, Tayler M.; Sridharan, Radhika; Wei, Wei; Lukyanchenko, Olga; Geiser, William; Whitman, Gary J.; Haygood, Tamara Miner

    2016-03-01

    Introduction: According to attention economists, overabundant information leads to decreased attention for individual pieces of information. Computer-aided detection (CAD) alerts radiologists to findings potentially associated with breast cancer but is notorious for creating an abundance of false-positive marks. We suspected that increased CAD marks do not lengthen mammogram interpretation time, as radiologists will selectively disregard these marks when present in larger numbers. We explore the relevance of attention economics in mammography by examining how the number of CAD marks affects interpretation time. Methods: We performed a retrospective review of bilateral digital screening mammograms obtained between January 1, 2011 and February 28, 2014, using only weekend interpretations to decrease distractions and the likelihood of trainee participation. We stratified data according to reader and used ANOVA to assess the relationship between the number of CAD marks and interpretation time. Results: Ten radiologists, with a median experience after residency of 12.5 years (range, 6 to 24), interpreted 1849 mammograms. When accounting for the number of images, Breast Imaging Reporting and Data System category, and breast density, an increasing number of CAD marks was correlated with longer interpretation time only for the three radiologists with the fewest years of experience (median, 7 years). Conclusion: For the 7 most experienced readers, increasing CAD marks did not lengthen interpretation time. We surmise that as CAD marks increase, the attention given to individual marks decreases. Experienced radiologists may rapidly dismiss larger numbers of CAD marks as false-positive, having learned that devoting extra attention to such marks does not improve clinical detection.
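
The ANOVA used above relates interpretation time to groups defined by the number of CAD marks. A minimal one-way F-statistic, computed from scratch on invented data (the study's actual model also adjusted for image count, BI-RADS category and density):

```python
# One-way ANOVA F-statistic relating interpretation time (seconds) to
# groups defined by the number of CAD marks. Group labels and times are
# invented for illustration.
groups = {
    "0-2 marks": [45.0, 52.0, 48.0, 50.0],
    "3-5 marks": [55.0, 60.0, 58.0, 57.0],
    "6+ marks":  [62.0, 65.0, 61.0, 66.0],
}

def one_way_f(groups):
    data = [x for g in groups.values() for x in g]
    grand = sum(data) / len(data)
    # Between-group and within-group sums of squares.
    ss_between = sum(len(g) * ((sum(g) / len(g)) - grand) ** 2
                     for g in groups.values())
    ss_within = sum((x - sum(g) / len(g)) ** 2
                    for g in groups.values() for x in g)
    df_between = len(groups) - 1
    df_within = len(data) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

print(f"F = {one_way_f(groups):.2f}")
```

A large F indicates that between-group variation (time differences across mark counts) dominates within-group noise; the p-value would then come from the F distribution with (df_between, df_within) degrees of freedom.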

  7. Novel Insight from Computational Virtual Screening Depict the Binding Potential of Selected Phytotherapeutics Against Probable Drug Targets of Clostridium difficile.

    PubMed

    Kamath, Suman; Skariyachan, Sinosh

    2017-02-20

    This study explores computational screening and molecular docking approaches to screen novel herbal therapeutics against probable drug targets of Clostridium difficile. Essential genes were predicted by comparative genome analysis of C. difficile and the best homologous organisms using a BLAST search against the Database of Essential Genes (DEG). The functions of these genes in various metabolic pathways were predicted, and some of these genes were considered potential targets. Three proteins were selected as putative targets: the permease IIC component, an ABC transporter and a histidine kinase. The three-dimensional structures of these targets were predicted by molecular modelling. Herbal bioactive compounds were screened by computer-aided virtual screening, and their binding potentials against the drug targets were predicted by molecular docking. Quercetin, present in Psidium guajava (binding energy -9.1 kcal/mol), ellagic acid, found in Punica granatum and Psidium guajava (binding energy -9.0 kcal/mol), and curcumin, present in Curcuma longa (binding energy -7.8 kcal/mol), demonstrated the lowest binding energies and the largest numbers of interacting residues with the drug targets. Further, a comparative study revealed that the phytoligands demonstrated better binding affinities to the drug targets than conventional ligands. Thus, this investigation explores the therapeutic potential of selected phytoligands against the putative drug targets of C. difficile.

  8. A computer-tailored intervention to promote informed decision making for prostate cancer screening among African-American men

    PubMed Central

    Allen, Jennifer D.; Mohllajee, Anshu P.; Shelton, Rachel C.; Drake, Bettina F.; Mars, Dana R.

    2010-01-01

    African-American men experience a disproportionate burden of prostate cancer (CaP) morbidity and mortality. National screening guidelines advise men to make individualized screening decisions through a process termed “informed decision making” (IDM). In this pilot study, a computer-tailored decision aid designed to promote IDM was evaluated using a pre-/post-test design. African-American men aged 40+ were recruited from a variety of community settings (n=108). At pre-test, 43% of men reported having made a screening decision; at post-test 47% reported this to be the case (p=0.39). Significant improvements were observed in scores (0-100%) for knowledge (54% vs 72%; p<0.001), decision self-efficacy (87% vs 89%; p<0.01), and decisional conflict (21% vs 13%; p<0.001). Men were also more likely to want an active role in decision making after using the tool (67% vs 75%; p=0.03). These results suggest that use of a computer-tailored decision aid is a promising strategy to promote IDM for CaP screening among African-American men. PMID:19477736

  9. A Combination of Screening and Computational Approaches for the Identification of Novel Compounds That Decrease Mast Cell Degranulation

    PubMed Central

    McShane, Marisa P.; Friedrichson, Tim; Giner, Angelika; Meyenhofer, Felix; Barsacchi, Rico; Bickle, Marc

    2015-01-01

    High-content screening of compound libraries poses various challenges in the early steps of drug discovery, such as gaining insight into the mode of action of the selected compounds. Here, we addressed these challenges by integrating two biological screens through bioinformatics and computational analysis. We screened a small-molecule library enriched in amphiphilic compounds in a degranulation assay in rat basophilic leukemia 2H3 (RBL-2H3) cells. The same library was rescreened in a high-content image-based endocytosis assay in HeLa cells. This assay was previously applied to a genome-wide RNAi screen that produced quantitative multiparametric phenotypic profiles for genes that directly or indirectly affect endocytosis. By correlating the endocytic profiles of the compounds with the genome-wide siRNA profiles, we identified candidate pathways that may be inhibited by the compounds. Among these, we focused on the Akt pathway and validated its inhibition in HeLa and RBL-2H3 cells. We further showed that the compounds inhibited the translocation of the Akt-PH domain to the plasma membrane. The approach performed here can be used to integrate chemical and functional genomics screens for investigating the mechanism of action of compounds. PMID:25838434
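
    The profile-correlation idea, matching a compound's multiparametric phenotypic profile against genome-wide siRNA profiles to suggest candidate pathways, can be sketched as below. The profiles, parameter counts, and gene names are made-up toy values for illustration, not data from the screen.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def rank_pathway_candidates(compound_profile, sirna_profiles):
    """Rank gene knockdowns by similarity of their phenotypic profile to the
    compound's profile (highest correlation first)."""
    scored = [(gene, pearson(compound_profile, prof))
              for gene, prof in sirna_profiles.items()]
    return sorted(scored, key=lambda kv: kv[1], reverse=True)

# Toy 4-parameter profiles (hypothetical numbers for illustration only).
compound = [1.2, -0.8, 0.5, -1.5]
sirna = {
    "AKT1":   [1.0, -0.9, 0.6, -1.4],   # similar profile -> candidate target
    "GENE_X": [-1.1, 0.7, -0.4, 1.3],   # anti-correlated profile
}
```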

  10. High throughput screening for mammography using a human-computer interface with rapid serial visual presentation (RSVP)

    NASA Astrophysics Data System (ADS)

    Hope, Chris; Sterr, Annette; Elangovan, Premkumar; Geades, Nicholas; Windridge, David; Young, Ken; Wells, Kevin

    2013-03-01

    The steady rise of the breast cancer screening population, coupled with data expansion produced by new digital screening technologies (tomosynthesis/CT) motivates the development of new, more efficient image screening processes. Rapid Serial Visual Presentation (RSVP) is a new fast-content recognition approach which uses electroencephalography to record brain activity elicited by fast bursts of image data. These brain responses are then subjected to machine classification methods to reveal the expert's `reflex' response to classify images according to their presence or absence of particular targets. The benefit of this method is that images can be presented at high temporal rates (~10 per second), faster than that required for fully conscious detection, facilitating a high throughput of image (screening) material. In this paper we present the first application of RSVP to medical image data, and demonstrate how cortically coupled computer vision can be successfully applied to breast cancer screening. Whilst prior RSVP work has utilised multichannel approaches, we also present the first RSVP results demonstrating a discriminatory response on a single electrode, with a ROC area under the curve of 0.62-0.86 using a simple Fisher discriminator for classification. This increases to 0.75-0.94 when multiple electrodes are used in combination.
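
    A minimal sketch of the classification machinery named above, a two-class Fisher linear discriminant plus a ROC area-under-curve estimate, is given below. The two-feature toy data stand in for EEG epoch features; everything here is an illustrative reconstruction, not the authors' pipeline.

```python
def fisher_weights(X0, X1):
    """Two-feature Fisher linear discriminant: w = Sw^-1 (m1 - m0)."""
    def mean(X):
        return [sum(col) / len(X) for col in zip(*X)]
    def scatter(X, m):
        s = [[0.0, 0.0], [0.0, 0.0]]
        for x in X:
            d = [x[0] - m[0], x[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
        return s
    m0, m1 = mean(X0), mean(X1)
    S0, S1 = scatter(X0, m0), scatter(X1, m1)
    Sw = [[S0[i][j] + S1[i][j] for j in range(2)] for i in range(2)]
    det = Sw[0][0] * Sw[1][1] - Sw[0][1] * Sw[1][0]
    inv = [[Sw[1][1] / det, -Sw[0][1] / det],
           [-Sw[1][0] / det, Sw[0][0] / det]]
    dm = [m1[0] - m0[0], m1[1] - m0[1]]
    return [inv[0][0] * dm[0] + inv[0][1] * dm[1],
            inv[1][0] * dm[0] + inv[1][1] * dm[1]]

def roc_auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a random target epoch outscores a random non-target
    epoch (ties count 1/2)."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```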

  11. Application of 2-D graphical representation of DNA sequence

    NASA Astrophysics Data System (ADS)

    Liao, Bo; Tan, Mingshu; Ding, Kequan

    2005-10-01

    Recently, we proposed a 2-D graphical representation of DNA sequence [Bo Liao, A 2-D graphical representation of DNA sequence, Chem. Phys. Lett. 401 (2005) 196-199]. Based on this representation, we consider properties of mutations and compute the similarities among 11 mitochondrial sequences belonging to different species. The elements of the similarity matrix are used to construct a phylogenetic tree. Unlike most existing phylogeny construction methods, the proposed method does not require multiple alignment.
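
    A 2-D graphical representation of this kind can be sketched as a cumulative walk in the plane: each base maps to a fixed unit step and the sequence traces a curve. The particular base-to-step assignment below is an assumed illustrative choice, not necessarily the one used in the cited representation.

```python
# Map each nucleotide to a fixed step in the plane (illustrative assignment).
STEP = {"A": (1, 1), "G": (1, -1), "C": (-1, 1), "T": (-1, -1)}

def walk(seq):
    """Return the list of (x, y) points visited by the 2-D walk of `seq`."""
    x = y = 0
    points = [(0, 0)]
    for base in seq.upper():
        dx, dy = STEP[base]
        x, y = x + dx, y + dy
        points.append((x, y))
    return points
```

    Quantities derived from such walks (e.g. leading eigenvalues of coordinate covariance matrices) can then feed a similarity matrix between sequences.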

  12. A simultaneous 2D/3D autostereo workstation

    NASA Astrophysics Data System (ADS)

    Chau, Dennis; McGinnis, Bradley; Talandis, Jonas; Leigh, Jason; Peterka, Tom; Knoll, Aaron; Sumer, Aslihan; Papka, Michael; Jellinek, Julius

    2012-03-01

    We present a novel immersive workstation environment that scientists can use for 3D data exploration and as their everyday 2D computer monitor. Our implementation is based on an autostereoscopic dynamic parallax barrier 2D/3D display, interactive input devices, and a software infrastructure that allows client/server software modules to couple the workstation to scientists' visualization applications. This paper describes the hardware construction and calibration, software components, and a demonstration of our system in nanoscale materials science exploration.

  13. Low-Dose Chest Computed Tomography for Lung Cancer Screening Among Hodgkin Lymphoma Survivors: A Cost-Effectiveness Analysis

    SciTech Connect

    Wattson, Daniel A.; Hunink, M.G. Myriam; DiPiro, Pamela J.; Das, Prajnan; Hodgson, David C.; Mauch, Peter M.; Ng, Andrea K.

    2014-10-01

    Purpose: Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Methods and Materials: Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Results: Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. Conclusions: HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening
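
    The discounting and incremental cost-effectiveness arithmetic underlying such a model can be sketched as follows. All costs and utilities below are hypothetical placeholders; only the 3% discount rate and the $50,000/QALY willingness-to-pay threshold come from the abstract.

```python
# Minimal sketch of the discounting / ICER arithmetic in a cost-effectiveness
# model. Transition structure, costs and utilities here are hypothetical;
# the published model draws them from population databases and HL literature.
def discounted_qalys(utilities, rate=0.03):
    """Sum yearly utilities discounted at the given annual rate."""
    return sum(u / (1 + rate) ** t for t, u in enumerate(utilities))

def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio in $ per QALY gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical example: screening adds cost but raises yearly utility.
q_screen = discounted_qalys([0.85] * 20)
q_none = discounted_qalys([0.83] * 20)
ratio = icer(28000, q_screen, 18000, q_none)
willing_to_pay = 50000  # $/QALY threshold used in the abstract
cost_effective = ratio < willing_to_pay
```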

  14. Excitons in van der Waals heterostructures: The important role of dielectric screening

    NASA Astrophysics Data System (ADS)

    Latini, S.; Olsen, T.; Thygesen, K. S.

    2015-12-01

    The existence of strongly bound excitons is one of the hallmarks of the newly discovered atomically thin semiconductors. While it is understood that the large binding energy is mainly due to the weak dielectric screening in two dimensions, a systematic investigation of the role of screening on two-dimensional (2D) excitons is still lacking. Here we provide a critical assessment of a widely used 2D hydrogenic exciton model, which assumes a dielectric function of the form ε(q) = 1 + 2παq, and we develop a quasi-2D model with a much broader applicability. Within the quasi-2D picture, electrons and holes are described as in-plane point charges with a finite extension in the perpendicular direction, and their interaction is screened by a dielectric function with a nonlinear q dependence which is computed ab initio. The screened interaction is used in a generalized Mott-Wannier model to calculate exciton binding energies in both isolated and supported 2D materials. For isolated 2D materials, the quasi-2D treatment yields results almost identical to those of the strict 2D model, and both are in good agreement with ab initio many-body calculations. On the other hand, for more complex structures such as supported layers or layers embedded in a van der Waals heterostructure, the size of the exciton in reciprocal space extends well beyond the linear regime of the dielectric function, and a quasi-2D description has to replace the 2D one. Our methodology has the merit of providing a seamless connection between the strict 2D limit of isolated monolayer materials and the more bulk-like screening characteristics of supported 2D materials or van der Waals heterostructures.
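
    The strict-2D model dielectric function discussed above, and the screened interaction it implies, can be written down directly. This is a sketch in atomic units, with `alpha` the 2D polarizability; the linear form below is exactly the model the abstract critiques for large q.

```python
import math

def eps_2d(q, alpha):
    """Strict-2D model dielectric function eps(q) = 1 + 2*pi*alpha*q."""
    return 1.0 + 2.0 * math.pi * alpha * q

def screened_coulomb_2d(q, alpha):
    """Screened 2-D Coulomb interaction W(q) = 2*pi / (q * eps(q)) in atomic
    units; with the linear eps(q) this is the Keldysh-type form."""
    return 2.0 * math.pi / (q * eps_2d(q, alpha))
```

    In the quasi-2D treatment of the paper, `eps_2d` would be replaced by an ab initio dielectric function with nonlinear q dependence.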

  15. 2D quasiperiodic plasmonic crystals

    PubMed Central

    Bauer, Christina; Kobiela, Georg; Giessen, Harald

    2012-01-01

    Nanophotonic structures with irregular symmetry, such as quasiperiodic plasmonic crystals, have gained an increasing amount of attention, in particular as potential candidates to enhance the absorption of solar cells in an angular insensitive fashion. To examine the photonic bandstructure of such systems that determines their optical properties, it is necessary to measure and model normal and oblique light interaction with plasmonic crystals. We determine the different propagation vectors and consider the interaction of all possible waveguide modes and particle plasmons in a 2D metallic photonic quasicrystal, in conjunction with the dispersion relations of a slab waveguide. Using a Fano model, we calculate the optical properties for normal and inclined light incidence. Comparing measurements of a quasiperiodic lattice to the modelled spectra for angle of incidence variation in both azimuthal and polar direction of the sample gives excellent agreement and confirms the predictive power of our model. PMID:23209871

  16. Valleytronics in 2D materials

    NASA Astrophysics Data System (ADS)

    Schaibley, John R.; Yu, Hongyi; Clark, Genevieve; Rivera, Pasqual; Ross, Jason S.; Seyler, Kyle L.; Yao, Wang; Xu, Xiaodong

    2016-11-01

    Semiconductor technology is currently based on the manipulation of electronic charge; however, electrons have additional degrees of freedom, such as spin and valley, that can be used to encode and process information. Over the past several decades, there has been significant progress in manipulating electron spin for semiconductor spintronic devices, motivated by potential spin-based information processing and storage applications. However, experimental progress towards manipulating the valley degree of freedom for potential valleytronic devices has been limited until very recently. We review the latest advances in valleytronics, which have largely been enabled by the isolation of 2D materials (such as graphene and semiconducting transition metal dichalcogenides) that host an easily accessible electronic valley degree of freedom, allowing for dynamic control.

  17. Unparticle example in 2D.

    PubMed

    Georgi, Howard; Kats, Yevgeny

    2008-09-26

    We discuss what can be learned about unparticle physics by studying simple quantum field theories in one space and one time dimension. We argue that the exactly soluble 2D theory of a massless fermion coupled to a massive vector boson, the Sommerfield model, is an interesting analog of a Banks-Zaks model, approaching a free theory at high energies and a scale-invariant theory with nontrivial anomalous dimensions at low energies. We construct a toy standard model coupling to the fermions in the Sommerfield model and study how the transition from unparticle behavior at low energies to free particle behavior at high energies manifests itself in interactions with the toy standard model particles.

  18. Quantum coherence selective 2D Raman–2D electronic spectroscopy

    PubMed Central

    Spencer, Austin P.; Hutson, William O.; Harel, Elad

    2017-01-01

    Electronic and vibrational correlations report on the dynamics and structure of molecular species, yet revealing these correlations experimentally has proved extremely challenging. Here, we demonstrate a method that probes correlations between states within the vibrational and electronic manifold with quantum coherence selectivity. Specifically, we measure a fully coherent four-dimensional spectrum which simultaneously encodes vibrational–vibrational, electronic–vibrational and electronic–electronic interactions. By combining near-impulsive resonant and non-resonant excitation, the desired fifth-order signal of a complex organic molecule in solution is measured free of unwanted lower-order contamination. A critical feature of this method is electronic and vibrational frequency resolution, enabling isolation and assignment of individual quantum coherence pathways. The vibronic structure of the system is then revealed within an otherwise broad and featureless 2D electronic spectrum. This method is suited for studying elusive quantum effects in which electronic transitions strongly couple to phonons and vibrations, such as energy transfer in photosynthetic pigment–protein complexes. PMID:28281541

  19. Quantum coherence selective 2D Raman-2D electronic spectroscopy

    NASA Astrophysics Data System (ADS)

    Spencer, Austin P.; Hutson, William O.; Harel, Elad

    2017-03-01

    Electronic and vibrational correlations report on the dynamics and structure of molecular species, yet revealing these correlations experimentally has proved extremely challenging. Here, we demonstrate a method that probes correlations between states within the vibrational and electronic manifold with quantum coherence selectivity. Specifically, we measure a fully coherent four-dimensional spectrum which simultaneously encodes vibrational-vibrational, electronic-vibrational and electronic-electronic interactions. By combining near-impulsive resonant and non-resonant excitation, the desired fifth-order signal of a complex organic molecule in solution is measured free of unwanted lower-order contamination. A critical feature of this method is electronic and vibrational frequency resolution, enabling isolation and assignment of individual quantum coherence pathways. The vibronic structure of the system is then revealed within an otherwise broad and featureless 2D electronic spectrum. This method is suited for studying elusive quantum effects in which electronic transitions strongly couple to phonons and vibrations, such as energy transfer in photosynthetic pigment-protein complexes.

  20. Quantum coherence selective 2D Raman-2D electronic spectroscopy.

    PubMed

    Spencer, Austin P; Hutson, William O; Harel, Elad

    2017-03-10

    Electronic and vibrational correlations report on the dynamics and structure of molecular species, yet revealing these correlations experimentally has proved extremely challenging. Here, we demonstrate a method that probes correlations between states within the vibrational and electronic manifold with quantum coherence selectivity. Specifically, we measure a fully coherent four-dimensional spectrum which simultaneously encodes vibrational-vibrational, electronic-vibrational and electronic-electronic interactions. By combining near-impulsive resonant and non-resonant excitation, the desired fifth-order signal of a complex organic molecule in solution is measured free of unwanted lower-order contamination. A critical feature of this method is electronic and vibrational frequency resolution, enabling isolation and assignment of individual quantum coherence pathways. The vibronic structure of the system is then revealed within an otherwise broad and featureless 2D electronic spectrum. This method is suited for studying elusive quantum effects in which electronic transitions strongly couple to phonons and vibrations, such as energy transfer in photosynthetic pigment-protein complexes.

  1. NIKE2D96. Static & Dynamic Response of 2D Solids

    SciTech Connect

    Raboin, P.; Engelmann, B.; Halquist, J.O.

    1992-01-24

    NIKE2D is an implicit finite-element code for analyzing the finite deformation, static and dynamic response of two-dimensional, axisymmetric, plane strain, and plane stress solids. The code is fully vectorized and available on several computing platforms. A number of material models are incorporated to simulate a wide range of material behavior including elasto-plasticity, anisotropy, creep, thermal effects, and rate dependence. Slideline algorithms model gaps and sliding along material interfaces, including interface friction, penetration and single surface contact. Interactive graphics and rezoning are included for analyses with large mesh distortions. In addition to quasi-Newton and arc-length procedures, adaptive algorithms can be defined to solve the implicit equations using the solution language ISLAND. Each of these capabilities and more make NIKE2D a robust analysis tool.

  2. Computer-aided detection of breast masses: Four-view strategy for screening mammography

    SciTech Connect

    Wei Jun; Chan Heangping; Zhou Chuan; Wu Yita; Sahiner, Berkman; Hadjiiski, Lubomir M.; Roubidoux, Marilyn A.; Helvie, Mark A.

    2011-04-15

    Purpose: To improve the performance of a computer-aided detection (CAD) system for mass detection by using four-view information in screening mammography. Methods: The authors developed a four-view CAD system that emulates radiologists' reading by using the craniocaudal and mediolateral oblique views of the ipsilateral breast to reduce false positives (FPs) and the corresponding views of the contralateral breast to detect asymmetry. The CAD system consists of four major components: (1) Initial detection of breast masses on individual views, (2) information fusion of the ipsilateral views of the breast (referred to as two-view analysis), (3) information fusion of the corresponding views of the contralateral breast (referred to as bilateral analysis), and (4) fusion of the four-view information with a decision tree. The authors collected two data sets for training and testing of the CAD system: A mass set containing 389 patients with 389 biopsy-proven masses and a normal set containing 200 normal subjects. All cases had four-view mammograms. The true locations of the masses on the mammograms were identified by an experienced MQSA radiologist. The authors randomly divided the mass set into two independent sets for cross validation training and testing. The overall test performance was assessed by averaging the free response receiver operating characteristic (FROC) curves of the two test subsets. The FP rates during the FROC analysis were estimated by using the normal set only. The jackknife free-response ROC (JAFROC) method was used to estimate the statistical significance of the difference between the test FROC curves obtained with the single-view and the four-view CAD systems. Results: Using the single-view CAD system, the breast-based test sensitivities were 58% and 77% at the FP rates of 0.5 and 1.0 per image, respectively. With the four-view CAD system, the breast-based test sensitivities were improved to 76% and 87% at the corresponding FP rates, respectively

  3. RNA folding pathways and kinetics using 2D energy landscapes.

    PubMed

    Senter, Evan; Dotu, Ivan; Clote, Peter

    2015-01-01

    RNA folding pathways play an important role in various biological processes, such as (i) the hok/sok (host-killing/suppression of killing) system in E. coli to check for sufficient plasmid copy number, (ii) the conformational switch in spliced leader (SL) RNA from Leptomonas collosoma, which controls trans splicing of a portion of the 5' exon, and (iii) riboswitches--portions of the 5' untranslated region of messenger RNA that regulate genes by allostery. Since RNA folding pathways are determined by the energy landscape, we describe a novel algorithm, FFTbor2D, which computes the 2D projection of the energy landscape for a given RNA sequence. Given two metastable secondary structures A, B for a given RNA sequence, FFTbor2D computes the Boltzmann probability p(x, y) = Z(x,y)/Z that a secondary structure has base pair distance x from A and distance y from B. Using polynomial interpolation with the fast Fourier transform, we compute p(x, y) in O(n^5) time and O(n^2) space, an improvement over an earlier method that runs in O(n^7) time and O(n^4) space. FFTbor2D has potential applications in synthetic biology, where one might wish to design bistable switches having target metastable structures A, B with favorable pathway kinetics. By inverting the transition probability matrix determined from FFTbor2D output, we show that L. collosoma spliced leader RNA has a larger mean first passage time from A to B on the 2D energy landscape than 97.145% of 20,000 sequences, each having metastable structures A, B. Source code and binaries are freely available for download at http://bioinformatics.bc.edu/clotelab/FFTbor2D. The program FFTbor2D is implemented in C++, with optional OpenMP parallelization primitives.
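
    The mean-first-passage-time computation mentioned above, solving (I - Q)t = 1 over the transient states of a Markov chain, can be sketched as follows. This is the generic textbook construction, not the FFTbor2D code itself.

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k]
                              for k in range(r + 1, n))) / M[r][r]
    return x

def mean_first_passage(P, target):
    """Expected hitting times of `target` for a Markov chain with
    row-stochastic transition matrix P: solve (I - Q) t = 1 over the
    remaining (transient) states."""
    states = [i for i in range(len(P)) if i != target]
    A = [[(1.0 if i == j else 0.0) - P[i][j] for j in states]
         for i in states]
    t = solve(A, [1.0] * len(states))
    return dict(zip(states, t))
```

    In the paper's setting, P would be the coarse-grained transition matrix over the (x, y) landscape cells and the target the cell containing structure B.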

  4. Simulation of Yeast Cooperation in 2D.

    PubMed

    Wang, M; Huang, Y; Wu, Z

    2016-03-01

    Evolution of cooperation has been an active research area in evolutionary biology for decades. An important type of cooperation develops from group selection, when individuals form spatial groups to protect themselves from foreign invasion. In this paper, we study the evolution of cooperation in a mixed population of cooperating and cheating yeast strains in 2D, with the interactions among the yeast cells restricted to their small neighborhoods. We conduct a computer simulation based on a game-theoretic model and show that cooperation is increased when the interactions are spatially restricted, whether the game is of a prisoner's dilemma, snowdrift, or mutual benefit type. We study the evolution of homogeneous groups of cooperators or cheaters and describe the conditions for them to sustain or expand in an opponent population. We show that under certain spatial restrictions, cooperator groups are able to sustain and expand as group sizes become large, while cheater groups fail to expand and cannot keep themselves from collapse.
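
    A minimal spatial game of the kind described, cooperators and cheaters on a 2-D grid interacting only with nearby cells, can be sketched with imitate-the-best dynamics. The payoff values and update rule below are illustrative prisoner's-dilemma choices, not the paper's model.

```python
# Spatial prisoner's dilemma sketch: each cell is a cooperator (1) or
# cheater (0); payoffs accumulate over the von Neumann neighbourhood on a
# periodic grid, then each cell copies its best-scoring neighbour (or keeps
# its own strategy if it scores best). Payoff constants are illustrative.
R, S, T, P = 3.0, 0.0, 5.0, 1.0   # reward, sucker, temptation, punishment

def payoff(a, b):
    return {(1, 1): R, (1, 0): S, (0, 1): T, (0, 0): P}[(a, b)]

def neighbours(i, j, n):
    return [((i + di) % n, (j + dj) % n)
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]

def step(grid):
    """One synchronous round: play against neighbours, then imitate-the-best."""
    n = len(grid)
    score = [[sum(payoff(grid[i][j], grid[x][y])
                  for x, y in neighbours(i, j, n))
              for j in range(n)] for i in range(n)]
    new = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            best = max(neighbours(i, j, n) + [(i, j)],
                       key=lambda c: score[c[0]][c[1]])
            new[i][j] = grid[best[0]][best[1]]
    return new
```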

  5. Predicting reading outcomes in the classroom using a computer-based phonological awareness screening and monitoring assessment (Com-PASMA).

    PubMed

    Carson, Karyn; Boustead, Therese; Gillon, Gail

    2014-12-01

    The screening and monitoring of phonological awareness (PA) in the classroom is of great importance to the early identification and prevention of reading disorder. This study investigated whether a time-efficient computer-based PA screening and monitoring assessment (Com-PASMA) could accurately predict end-of-year reading outcomes for 5-year-old children in the first year of schooling. A longitudinal design was employed where the Com-PASMA was used to measure the PA ability of 95 5-year-old children at the start, middle, and end of the first year of school. Of this group, 21 children presented with spoken language impairment. Reading outcomes were formally measured after 1 year of schooling. School-entry measures of PA using the Com-PASMA (p < .001), in conjunction with language ability (p = .004), accounted for 68.9% of the variance in end-of-year word decoding ability. Sensitivity and specificity calculations demonstrated that the Com-PASMA was 92% accurate at school-entry, and 94% accurate by the middle of the school year in predicting reading outcomes at 6 years of age. Results suggest that a time-efficient computer-based method of screening and monitoring PA can support the early identification of reading difficulties in the first year of schooling.
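
    The sensitivity and specificity figures referred to above follow directly from the screening confusion counts; a generic sketch (the counts in the test are hypothetical, not the study's data):

```python
def screening_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity and overall accuracy from screening counts
    (tp/fp/tn/fn = true/false positives and negatives)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }
```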

  6. An automated tuberculosis screening strategy combining X-ray-based computer-aided detection and clinical information

    NASA Astrophysics Data System (ADS)

    Melendez, Jaime; Sánchez, Clara I.; Philipsen, Rick H. H. M.; Maduskar, Pragnya; Dawson, Rodney; Theron, Grant; Dheda, Keertan; van Ginneken, Bram

    2016-04-01

    Lack of human resources and radiological interpretation expertise impair tuberculosis (TB) screening programmes in TB-endemic countries. Computer-aided detection (CAD) constitutes a viable alternative for chest radiograph (CXR) reading. However, no automated techniques that exploit the additional clinical information typically available during screening exist. To address this issue and optimally exploit this information, a machine learning-based combination framework is introduced. We have evaluated this framework on a database containing 392 patient records from suspected TB subjects prospectively recruited in Cape Town, South Africa. Each record comprised a CAD score, automatically computed from a CXR, and 12 clinical features. Comparisons with strategies relying on either CAD scores or clinical information alone were performed. Our results indicate that the combination framework outperforms the individual strategies in terms of the area under the receiver operating characteristic curve (0.84 versus 0.78 and 0.72), specificity at 95% sensitivity (49% versus 24% and 31%) and negative predictive value (98% versus 95% and 96%). Thus, it is believed that combining CAD and clinical information to estimate the risk of active disease is a promising tool for TB screening.
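
    A score-combination model of the general kind described, a logistic regression over the CAD score plus clinical features, can be sketched as below. The training data are synthetic and the choice of classifier is an assumption, since the abstract does not name the specific learner used.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, x):
    """Risk estimate for feature vector x; w's last entry is the bias."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + w[-1])

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Fit logistic-regression weights by plain stochastic gradient descent.
    Features would be the CAD score plus clinical variables, scaled to
    comparable ranges."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            err = predict(w, xi) - yi
            for k, xk in enumerate(xi):
                w[k] -= lr * err * xk
            w[-1] -= lr * err
    return w
```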

  7. Feasibility and Acceptability of an Audio Computer-Assisted Self-Interview Version of the Alcohol, Smoking, and Substance Involvement Screening Test (ASSIST) in Primary Care Patients

    PubMed Central

    Spear, Suzanne E.; Shedlin, Michele; Gilberti, Brian; Fiellin, Maya; McNeely, Jennifer

    2016-01-01

    Background This study explores the feasibility and acceptability of a computer self-administered approach to substance use screening from the perspective of primary care patients. Methods Forty-eight patients from a large safety net hospital in New York City completed an audio computer-assisted self-interview (ACASI) version of the Alcohol, Smoking, and Substance Involvement Screening Test (ASSIST) and a qualitative interview to assess feasibility and acceptability; comprehension; comfort with screening questions; and preferences for screening mode (interviewer or computer). Qualitative data analysis organized the participants’ feedback into major themes. Results Participants overwhelmingly reported being comfortable with the ACASI ASSIST. Mean administration time was 5.2 minutes (range 1.6 – 14.8). The major themes from the qualitative interviews were 1) ACASI ASSIST is feasible and acceptable to patients, 2) Social stigma around substance use is a barrier to patient disclosure, and 3) ACASI screening should not preclude personal interaction with providers. Conclusions The ACASI ASSIST is an appropriate and feasible approach to substance use screening in primary care. Because of the highly sensitive nature of substance use, screening tools must explain the purpose of screening, assure patients that their privacy is protected, and inform patients of the opportunity to discuss their screening results with their provider. PMID:26158798

  8. PERSONAL COMPUTER MONITORS: A SCREENING EVALUATION OF VOLATILE ORGANIC EMISSIONS FROM EXISTING PRINTED CIRCUIT BOARD LAMINATES AND POTENTIAL POLLUTION PREVENTION ALTERNATIVES

    EPA Science Inventory

    The report gives results of a screening evaluation of volatile organic emissions from printed circuit board laminates and potential pollution prevention alternatives. In the evaluation, printed circuit board laminates, without circuitry, commonly found in personal computer (PC) m...

  9. EEG-based cognitive load of processing events in 3D virtual worlds is lower than processing events in 2D displays.

    PubMed

    Dan, Alex; Reiner, Miriam

    2016-08-31

    Interacting with 2D displays, such as computer screens, smartphones, and TV, is currently a part of our daily routine; however, our visual system is built for processing 3D worlds. We examined the cognitive load associated with a simple and a complex task of learning paper-folding (origami) by observing 2D or stereoscopic 3D displays. While connected to an electroencephalogram (EEG) system, participants watched a 2D video of an instructor demonstrating the paper-folding tasks, followed by a stereoscopic 3D projection of the same instructor (a digital avatar) illustrating identical tasks. We recorded the power of alpha and theta oscillations and calculated the cognitive load index (CLI) as the ratio of the average power of frontal theta (Fz) and parietal alpha (Pz). The results showed a significantly higher cognitive load index associated with processing the 2D projection as compared to the 3D projection; additionally, changes in the average theta Fz power were larger for the 2D conditions as compared to the 3D conditions, while alpha average Pz power values were similar for 2D and 3D conditions for the less complex task and higher in the 3D state for the more complex task. The cognitive load index was lower for the easier task and higher for the more complex task in 2D and 3D. In addition, participants with lower spatial abilities benefited more from the 3D compared to the 2D display. These findings have implications for understanding cognitive processing associated with 2D and 3D worlds and for employing stereoscopic 3D technology over 2D displays in designing emerging virtual and augmented reality applications.
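
    The cognitive load index is simply a ratio of band powers. A self-contained sketch using a naive DFT (an FFT and Welch averaging would be used on real EEG data) with the conventional theta (4-8 Hz) and alpha (8-13 Hz) bands:

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Average power of the one-sided DFT bins with f_lo <= f < f_hi.
    Naive DFT, fine for short epochs; use an FFT for real data."""
    n = len(signal)
    powers = []
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f < f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n)
                     for t in range(n))
            im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n)
                      for t in range(n))
            powers.append((re * re + im * im) / n ** 2)
    return sum(powers) / len(powers)

def cognitive_load_index(frontal, parietal, fs):
    """CLI = average frontal theta (4-8 Hz) power over the average
    parietal alpha (8-13 Hz) power."""
    return band_power(frontal, fs, 4, 8) / band_power(parietal, fs, 8, 13)
```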

  10. Recovering 3D particle size distributions from 2D sections

    NASA Astrophysics Data System (ADS)

    Cuzzi, Jeffrey N.; Olson, Daniel M.

    2017-03-01

    We discuss different ways to convert observed, apparent particle size distributions from 2D sections (thin sections, SEM maps on planar surfaces, etc.) into true 3D particle size distributions. We give a simple, flexible, and practical method to do this; show which of these techniques gives the most faithful conversions; and provide (online) short computer codes to calculate both 2D-3D recoveries and simulations of 2D observations by random sectioning. The most important systematic bias of 2D sectioning, from the standpoint of most chondrite studies, is an overestimate of the abundance of the larger particles. We show that fairly good recoveries can be achieved from observed size distributions containing 100-300 individual measurements of apparent particle diameter.
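
    The simulation of 2D observations by random sectioning can be sketched with a small Monte-Carlo model of spheres cut by random planes. The sampling rule used here (intersection probability proportional to radius, cut offset uniform over the radius) is the standard Wicksell setup for spherical particles, an assumed simplification of the codes mentioned in the text; it reproduces the over-representation of larger particles noted above.

```python
import random

def section_sample(radii, n_sections, rng=random):
    """Draw apparent (section) radii from random planar cuts of spheres.
    A sphere of radius r is hit with probability proportional to r; given a
    hit, the plane offset h is uniform in [0, r], so the apparent radius is
    sqrt(r^2 - h^2)."""
    total = sum(radii)
    apparent = []
    for _ in range(n_sections):
        # pick a sphere with probability proportional to its radius
        u = rng.uniform(0, total)
        acc = 0.0
        for r in radii:
            acc += r
            if u <= acc:
                break
        h = rng.uniform(0, r)
        apparent.append((r * r - h * h) ** 0.5)
    return apparent
```

    Inverting the resulting apparent-size histogram back to the true 3D distribution is the recovery problem the paper's methods address.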

  11. ORION96. 2-d Finite Element Code Postprocessor

    SciTech Connect

    Sanford, L.A.; Hallquist, J.O.

    1992-02-02

    ORION is an interactive program that serves as a postprocessor for the analysis programs NIKE2D, DYNA2D, TOPAZ2D, and CHEMICAL TOPAZ2D. ORION reads binary plot files generated by the two-dimensional finite element codes currently used by the Methods Development Group at LLNL. Contour and color fringe plots of a large number of quantities may be displayed on meshes consisting of triangular and quadrilateral elements. ORION can compute strain measures, interface pressures along slide lines, reaction forces along constrained boundaries, and momentum. ORION has been applied to study the response of two-dimensional solids and structures undergoing finite deformations under a wide variety of large deformation transient dynamic and static problems and heat transfer analyses.

  12. Microwave Imaging with Infrared 2-D Lock-in Amplifier

    NASA Astrophysics Data System (ADS)

    Chiyo, Noritaka; Arai, Mizuki; Tanaka, Yasuhiro; Nishikata, Atsuhiro; Maeno, Takashi

    We have developed a 3-D electromagnetic field measurement system using 2-D lock-in amplifier. This system uses an amplitude modulated electromagnetic wave source to heat a resistive screen. A very small change of temperature on a screen illuminated with the modulated electromagnetic wave is measured using an infrared thermograph camera. In this paper, we attempted to apply our system to microwave imaging. By placing conductor patches in front of the resistive screen and illuminating with microwave, the shape of each conductor was clearly observed as the temperature difference image of the screen. In this way, the conductor pattern inside the non-contact type IC card could be visualized. Moreover, we could observe the temperature difference image reflecting the shape of a Konnyaku (a gelatinous food made from devil's-tongue starch) or a dried fishbone, both as non-conducting material resembling human body. These results proved that our method is applicable to microwave see-through imaging.

  13. Rotation invariance principles in 2D/3D registration

    NASA Astrophysics Data System (ADS)

    Birkfellner, Wolfgang; Wirth, Joachim; Burgstaller, Wolfgang; Baumann, Bernard; Staedele, Harald; Hammer, Beat; Gellrich, Niels C.; Jacob, Augustinus L.; Regazzoni, Pietro; Messmer, Peter

    2003-05-01

    2D/3D patient-to-computed tomography (CT) registration is a method to determine a transformation that maps two coordinate systems by comparing a projection image rendered from CT to a real projection image. Applications include exact patient positioning in radiation therapy, calibration of surgical robots, and pose estimation in computer-aided surgery. One of the problems associated with 2D/3D registration is the fact that finding a registration requires solving a minimization problem in six degrees of freedom of motion. This results in considerable time expense, since for each iteration step at least one volume rendering has to be computed. We show that by choosing an appropriate world coordinate system and by applying a 2D/2D registration method in each iteration step, the number of iterations can be grossly reduced from n^6 to n^5. Here, n is the number of discrete variations around a given coordinate. Depending on the configuration of the optimization algorithm, this reduces the total number of iterations necessary to at least 1/3 of its original value. The method was implemented and extensively tested on simulated x-ray images of a pelvis. We conclude that this hardware-independent optimization of 2D/3D registration is a step towards increasing the acceptance of this promising method for a wide number of clinical applications.
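    The iteration-count argument above can be made concrete with a quick back-of-the-envelope calculation (illustrative only, not the authors' code; the sample values of n are invented):

```python
# Back-of-the-envelope for the iteration bookkeeping in the abstract
# above (hypothetical, not the authors' implementation): a discrete
# search with n variations around each of six rigid-body parameters
# needs n**6 volume renderings per pass, while folding one degree of
# freedom into a fast 2D/2D registration at each step leaves a
# five-parameter search, i.e. n**5 renderings.
def renderings_full_search(n: int) -> int:
    """Renderings for an exhaustive discrete search over 6 parameters."""
    return n ** 6

def renderings_reduced_search(n: int) -> int:
    """Renderings when one parameter is resolved by 2D/2D registration."""
    return n ** 5

for n in (3, 5):
    print(n, renderings_full_search(n), renderings_reduced_search(n))
```

    The saving per pass is a factor of n, which grows with the granularity of the discrete search.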

  14. Designing Second Generation Anti-Alzheimer Compounds as Inhibitors of Human Acetylcholinesterase: Computational Screening of Synthetic Molecules and Dietary Phytochemicals

    PubMed Central

    Amat-ur-Rasool, Hafsa; Ahmed, Mehboob

    2015-01-01

    Alzheimer's disease (AD), a major cause of memory loss, is a progressive neurodegenerative disorder. The disease leads to irreversible loss of neurons that results in a reduced level of the neurotransmitter acetylcholine (ACh). The reduction of the ACh level impairs brain functioning. One aspect of AD therapy is to maintain the ACh level up to a safe limit by blocking acetylcholinesterase (AChE), the enzyme naturally responsible for its degradation. This research presents an in silico screening and design of hAChE inhibitors as potential anti-Alzheimer drugs. Molecular docking results for database-retrieved ligands (synthetic chemicals and dietary phytochemicals) and self-drawn ligands were compared with Food and Drug Administration (FDA)-approved drugs against AD as controls. Furthermore, computational ADME studies were performed on the hits to assess their safety. Human AChE was found to be the most appropriate target site as compared to the commonly used Torpedo AChE. Among the tested dietary phytochemicals, berberastine, berberine, yohimbine, sanguinarine, elemol, and naringenin are worth mentioning as potential anti-Alzheimer drugs. The synthetic leads were mostly dual-binding-site inhibitors with two binding subunits linked by a carbon chain, i.e., second-generation AD drugs. Fifteen new heterodimers were designed that were computationally more efficient inhibitors than previously reported compounds. Using computational methods, compounds present in online chemical databases can be screened to design more efficient and safer drugs against cognitive symptoms of AD. PMID:26325402

  15. Designing Second Generation Anti-Alzheimer Compounds as Inhibitors of Human Acetylcholinesterase: Computational Screening of Synthetic Molecules and Dietary Phytochemicals.

    PubMed

    Amat-Ur-Rasool, Hafsa; Ahmed, Mehboob

    2015-01-01

    Alzheimer's disease (AD), a major cause of memory loss, is a progressive neurodegenerative disorder. The disease leads to irreversible loss of neurons that results in a reduced level of the neurotransmitter acetylcholine (ACh). The reduction of the ACh level impairs brain functioning. One aspect of AD therapy is to maintain the ACh level up to a safe limit by blocking acetylcholinesterase (AChE), the enzyme naturally responsible for its degradation. This research presents an in silico screening and design of hAChE inhibitors as potential anti-Alzheimer drugs. Molecular docking results for database-retrieved ligands (synthetic chemicals and dietary phytochemicals) and self-drawn ligands were compared with Food and Drug Administration (FDA)-approved drugs against AD as controls. Furthermore, computational ADME studies were performed on the hits to assess their safety. Human AChE was found to be the most appropriate target site as compared to the commonly used Torpedo AChE. Among the tested dietary phytochemicals, berberastine, berberine, yohimbine, sanguinarine, elemol, and naringenin are worth mentioning as potential anti-Alzheimer drugs. The synthetic leads were mostly dual-binding-site inhibitors with two binding subunits linked by a carbon chain, i.e., second-generation AD drugs. Fifteen new heterodimers were designed that were computationally more efficient inhibitors than previously reported compounds. Using computational methods, compounds present in online chemical databases can be screened to design more efficient and safer drugs against cognitive symptoms of AD.

  16. Computer-Facilitated Substance Use Screening and Brief Advice for Teens in Primary Care: An International Trial

    PubMed Central

    Csémy, Ladislav; Sherritt, Lon; Starostova, Olga; Van Hook, Shari; Johnson, Julie; Boulter, Suzanne; Brooks, Traci; Carey, Peggy; Kossack, Robert; Kulig, John W.; Van Vranken, Nancy; Knight, John R.

    2012-01-01

    OBJECTIVE: Primary care providers need effective strategies for substance use screening and brief counseling of adolescents. We examined the effects of a new computer-facilitated screening and provider brief advice (cSBA) system. METHODS: We used a quasi-experimental, asynchronous study design in which each site served as its own control. From 2005 to 2008, 12- to 18-year-olds arriving for routine care at 9 medical offices in New England (n = 2096, 58% females) and 10 in Prague, Czech Republic (n = 589, 47% females) were recruited. Patients completed measurements only during the initial treatment-as-usual study phase. We then conducted 1-hour provider training, and initiated the cSBA phase. Before seeing the provider, all cSBA participants completed a computerized screen, and then viewed screening results, scientific information, and true-life stories illustrating substance use harms. Providers received screening results and “talking points” designed to prompt 2 to 3 minutes of brief advice. We examined alcohol and cannabis use, initiation, and cessation rates over the past 90 days at 3-month follow-up, and over the past 12 months at 12-month follow-up. RESULTS: Compared with treatment as usual, cSBA patients reported less alcohol use at follow-up in New England (3-month rates 15.5% vs 22.9%, adjusted relative risk ratio [aRRR] = 0.54, 95% confidence interval 0.38–0.77; 12-month rates 29.3% vs 37.5%, aRRR = 0.73, 0.57–0.92), and less cannabis use in Prague (3-month rates 5.5% vs 9.8%, aRRR = 0.37, 0.17–0.77; 12-month rates 17.0% vs 28.7%, aRRR = 0.47, 0.32–0.71). CONCLUSIONS: Computer-facilitated screening and provider brief advice appears promising for reducing substance use among adolescent primary care patients. PMID:22566420
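    The 3-month New England alcohol rates quoted above imply a crude (unadjusted) ratio that can be checked directly; as expected for an adjusted estimate, it differs from the adjusted aRRR the study reports:

```python
# Crude (unadjusted) risk ratio from the 3-month New England alcohol
# rates quoted above: 15.5% (cSBA) vs 22.9% (treatment as usual).
# The paper reports an *adjusted* relative risk ratio (aRRR = 0.54),
# so the crude value below differs from it by design.
csba_rate = 0.155
tau_rate = 0.229
crude_rr = csba_rate / tau_rate
print(round(crude_rr, 2))  # 0.68
```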

  17. Circulating microRNA signature as liquid-biopsy to monitor lung cancer in low-dose computed tomography screening.

    PubMed

    Sestini, Stefano; Boeri, Mattia; Marchiano, Alfonso; Pelosi, Giuseppe; Galeone, Carlotta; Verri, Carla; Suatoni, Paola; Sverzellati, Nicola; La Vecchia, Carlo; Sozzi, Gabriella; Pastorino, Ugo

    2015-10-20

    Liquid biopsies can detect biomarkers carrying information on the development and progression of cancer. We demonstrated that a 24-microRNA plasma-based signature classifier (MSC) was capable of increasing the specificity of low-dose computed tomography (LDCT) in a lung cancer screening trial. In the present study, we tested the prognostic performance of MSC and its ability to monitor disease status and recurrence in LDCT screening-detected lung cancers. Between 2000 and 2010, 3411 heavy smokers enrolled in two screening programmes underwent annual or biennial LDCT. During the first five years of screening, 84 lung cancer patients were classified according to one of the three MSC levels of risk: high, intermediate, or low. Kaplan-Meier survival analysis was performed according to MSC and clinico-pathological information. Follow-up MSC analysis was performed on longitudinal plasma samples (n = 100) collected from 31 patients before and after surgical resection. Five-year survival was 88.9% for low risk, 79.5% for intermediate risk, and 40.1% for high risk MSC (p = 0.001). The prognostic power of MSC persisted after adjusting for tumor stage (p = 0.02) and when the analysis was restricted to LDCT-detected cases after exclusion of interval cancers (p < 0.001). The MSC risk level decreased after surgery in 76% of the 25 high- or intermediate-risk subjects who remained disease free, whereas in relapsing patients an increase of the MSC risk level was observed at the time of detection of a second primary tumor or metastatic progression. These results encourage exploiting the MSC test for lung cancer monitoring in LDCT screening.

  18. Assessment of an Interactive Computer-Based Patient Prenatal Genetic Screening and Testing Education Tool

    ERIC Educational Resources Information Center

    Griffith, Jennifer M.; Sorenson, James R.; Bowling, J. Michael; Jennings-Grant, Tracey

    2005-01-01

    The Enhancing Patient Prenatal Education study tested the feasibility and educational impact of an interactive program for patient prenatal genetic screening and testing education. Patients at two private practices and one public health clinic participated (N = 207). The program collected knowledge and measures of anxiety before and after use of…

  19. A Multivariate Computational Method to Analyze High-Content RNAi Screening Data.

    PubMed

    Rameseder, Jonathan; Krismer, Konstantin; Dayma, Yogesh; Ehrenberger, Tobias; Hwang, Mun Kyung; Airoldi, Edoardo M; Floyd, Scott R; Yaffe, Michael B

    2015-09-01

    High-content screening (HCS) using RNA interference (RNAi) in combination with automated microscopy is a powerful investigative tool to explore complex biological processes. However, despite the plethora of data generated from these screens, little progress has been made in analyzing HC data using multivariate methods that exploit the full richness of multidimensional data. We developed a novel multivariate method for HCS, multivariate robust analysis method (M-RAM), integrating image feature selection with ranking of perturbations for hit identification, and applied this method to an HC RNAi screen to discover novel components of the DNA damage response in an osteosarcoma cell line. M-RAM automatically selects the most informative phenotypic readouts and time points to facilitate the more efficient design of follow-up experiments and enhance biological understanding. Our method outperforms univariate hit identification and identifies relevant genes that these approaches would have missed. We found that statistical cell-to-cell variation in phenotypic responses is an important predictor of hits in RNAi-directed image-based screens. Genes that we identified as modulators of DNA damage signaling in U2OS cells include B-Raf, a cancer driver gene in multiple tumor types, whose role in DNA damage signaling we confirm experimentally, and multiple subunits of protein kinase A.
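    The value of a multivariate readout over per-feature thresholds, which the abstract credits for catching hits that univariate analysis misses, can be sketched generically (an invented illustration using robust z-scores, not the authors' M-RAM implementation; all feature counts and values are made up):

```python
import numpy as np

# Generic illustration (not the authors' M-RAM code, numbers invented):
# a perturbation that shifts every phenotypic feature moderately can
# clear a multivariate threshold while staying under the usual
# per-feature 3-sigma cutoff used in univariate hit calling.
rng = np.random.default_rng(0)
controls = rng.normal(0.0, 1.0, size=(500, 3))   # 3 phenotypic readouts
hit = np.array([2.0, 2.0, 2.0])                  # modest shift in each feature

med = np.median(controls, axis=0)
mad = 1.4826 * np.median(np.abs(controls - med), axis=0)  # robust sigma estimate

z = (hit - med) / mad          # per-feature robust z-scores, each ~2
stat = float(np.sum(z ** 2))   # combined score, ~chi-square with 3 df under null

print("max per-feature |z|:", np.abs(z).max())   # below a 3-sigma cutoff
print("multivariate stat:", stat)                # above the chi2(3) 95% cutoff 7.81
```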

  20. A multivariate computational method to analyze high-content RNAi screening data

    PubMed Central

    Rameseder, Jonathan; Krismer, Konstantin; Dayma, Yogesh; Ehrenberger, Tobias; Hwang, Mun Kyung; Airoldi, Edoardo M.; Floyd, Scott R.; Yaffe, Michael B.

    2017-01-01

    High-content screening (HCS) using RNA interference (RNAi) in combination with automated microscopy is a powerful investigative tool to explore complex biological processes. However, despite the plethora of data generated from these screens, little progress has been made in analyzing HC data using multivariate methods that exploit the full richness of multidimensional data. We developed a novel multivariate method for HCS, Multivariate Robust Analysis Method (M-RAM), integrating image feature selection with ranking of perturbations for hit identification, and applied this method to an HC RNAi screen to discover novel components of the DNA damage response in an osteosarcoma cell line. M-RAM automatically selects the most informative phenotypic readouts and time points to facilitate the more efficient design of follow-up experiments and enhance biological understanding. Our method outperforms univariate hit identification and identifies relevant genes that these approaches would have missed. We found that statistical cell-to-cell variation in phenotypic responses is an important predictor of ‘hits’ in RNAi-directed image-based screens. Genes that we identified as modulators of DNA damage signaling in U2OS cells include B-Raf, a cancer driver gene in multiple tumor types, whose role in DNA damage signaling we confirm experimentally, and multiple subunits of protein kinase A. PMID:25918037

  1. NKG2D ligands as therapeutic targets

    PubMed Central

    Spear, Paul; Wu, Ming-Ru; Sentman, Marie-Louise; Sentman, Charles L.

    2013-01-01

    The Natural Killer Group 2D (NKG2D) receptor plays an important role in protecting the host from infections and cancer. By recognizing ligands induced on infected or tumor cells, NKG2D modulates lymphocyte activation and promotes immunity to eliminate ligand-expressing cells. Because these ligands are not widely expressed on healthy adult tissue, NKG2D ligands may present a useful target for immunotherapeutic approaches in cancer. Novel therapies targeting NKG2D ligands for the treatment of cancer have shown preclinical success and are poised to enter into clinical trials. In this review, the NKG2D receptor and its ligands are discussed in the context of cancer, infection, and autoimmunity. In addition, therapies targeting NKG2D ligands in cancer are also reviewed. PMID:23833565

  2. SCREENOP: A Computer Assisted Model for ASW (Anti-Submarine Warfare) Screen Design.

    DTIC Science & Technology

    1983-09-01

    a SCREEN data file is a major component of the data storage. SCREENOP was originally written to process propagation loss data in increments of one...miles. Larger ranges are accommodated by inputting a scale factor of 2 or 3. A scale factor of 2 results in processing propagation loss data at 2...Equivalently, setting the scale factor to 3 processes propagation loss data at three-mile increments to a maximum distance of 360 nautical miles. The

  3. Interobserver agreement and performance score comparison in quality control using a breast phantom: screen-film mammography vs computed radiography.

    PubMed

    Shimamoto, Kazuhiro; Ikeda, Mitsuru; Satake, Hiroko; Ishigaki, Satoko; Sawaki, Akiko; Ishigaki, Takeo

    2002-09-01

    Our objective was to evaluate interobserver agreement and to compare the performance score in quality control of screen-film mammography and computed radiography (CR) using a breast phantom. Eleven radiologists interpreted a breast phantom image (CIRS model X) by four viewing methods: (a) original screen-film; (b) soft-copy reading of the digitized film image; (c) hard-copy reading of CR using an imaging plate; and (d) soft-copy reading of CR. For the soft-copy reading, a 17-in. CRT monitor (1024 x 1536 x 8 bits) was used. The phantom image was evaluated using the scoring system outlined in the instruction manual, and observers judged each object on a three-point rating scale: clearly seen, barely seen, or not seen. For statistical analysis, the kappa statistic was employed. For "mass" depiction, interobserver agreement using CR was significantly lower than when using screen-film (p < 0.05). There was no significant difference in the kappa value for detecting "microcalcification"; however, the performance score for "microcalcification" on CR hard-copy was significantly lower than on the other three viewing methods (p < 0.05). Viewing methods (film or CR, soft-copy or hard-copy) can affect how the phantom image is judged. Paying special attention to viewing conditions is recommended for quality control of CR mammograms.

  4. Computational approaches for protein function prediction: a combined strategy from multiple sequence alignment to molecular docking-based virtual screening.

    PubMed

    Pierri, Ciro Leonardo; Parisi, Giovanni; Porcelli, Vito

    2010-09-01

    The functional characterization of proteins represents a daily challenge for biochemical, medical, and computational sciences. Although it must ultimately be proved at the bench, the function of a protein can be successfully predicted by computational approaches that guide the subsequent experimental assays. Current methods for comparative modeling allow the construction of accurate 3D models for proteins of unknown structure, provided that a crystal structure of a homologous protein is available. Binding regions can be proposed by using binding site predictors, data inferred from homologous crystal structures, and data provided by a careful interpretation of the multiple sequence alignment of the investigated protein and its homologs. Once the location of a binding site has been proposed, chemical ligands that have a high likelihood of binding can be identified by using ligand docking and structure-based virtual screening of chemical libraries. Most docking algorithms allow building a list of the library's ligands sorted by the energy of each ligand's lowest-energy docking configuration. In this review the state of the art of computational approaches in 3D protein comparative modeling and in the study of protein-ligand interactions is provided. Furthermore, a possible combined/concerted multistep strategy for protein function prediction, based on multiple sequence alignment, comparative modeling, binding region prediction, and structure-based virtual screening of chemical libraries, is described using suitable examples. As practical examples, Abl-kinase molecular modeling studies, HPV-E6 protein multiple sequence alignment analysis, and some other model docking-based characterization reports are briefly described to highlight the importance of computational approaches in protein function prediction.

  5. Mermin–Wagner fluctuations in 2D amorphous solids

    PubMed Central

    Illing, Bernd; Fritschi, Sebastian; Kaiser, Herbert; Klix, Christian L.; Maret, Georg; Keim, Peter

    2017-01-01

    In a recent commentary, J. M. Kosterlitz described how D. Thouless and he got motivated to investigate melting and superfluidity in two dimensions [Kosterlitz JM (2016) J Phys Condens Matter 28:481001]. It was due to the lack of broken translational symmetry in two dimensions—doubting the existence of 2D crystals—and the first computer simulations foretelling 2D crystals (at least in tiny systems). The lack of broken symmetries proposed by D. Mermin and H. Wagner is caused by long wavelength density fluctuations. These fluctuations have not only a structural impact but also a dynamical one: They cause the Lindemann criterion to fail in 2D in the sense that the mean squared displacement of atoms is not limited. Comparing experimental data from 3D and 2D amorphous solids with 2D crystals, we disentangle Mermin–Wagner fluctuations from glassy structural relaxations. Furthermore, we demonstrate with computer simulations the logarithmic increase of displacements with system size: Periodicity is not a requirement for Mermin–Wagner fluctuations, which conserve the homogeneity of space on long scales. PMID:28137872

  6. Sparse radar imaging using 2D compressed sensing

    NASA Astrophysics Data System (ADS)

    Hou, Qingkai; Liu, Yang; Chen, Zengping; Su, Shaoying

    2014-10-01

    Radar imaging is an ill-posed linear inverse problem, and compressed sensing (CS) has proved to have tremendous potential in this field. This paper surveys the theory of radar imaging and concludes that ISAR imaging can be formulated mathematically as a problem of 2D sparse decomposition. Based on CS, we propose a novel measurement strategy for ISAR imaging radar that uses random sub-sampling in both the range and azimuth dimensions, reducing the amount of sampled data tremendously. To handle the 2D reconstruction problem, the ordinary approach converts the 2D problem into 1D via the Kronecker product, which sharply increases the size of the dictionary and the computational cost. In this paper, we introduce the 2D-SL0 algorithm for the image reconstruction. It is shown that 2D-SL0 achieves results equivalent to other 1D reconstruction methods while reducing the computational complexity and memory usage significantly. Moreover, we present simulation results that demonstrate the effectiveness and feasibility of our method.
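    The dictionary-size penalty of the Kronecker-product vectorization mentioned above can be illustrated numerically (all sizes are invented; this is a sketch of the general vec/Kronecker identity, not the paper's 2D-SL0 code):

```python
import numpy as np

# Sketch of the dimensionality argument above (sizes illustrative):
# vectorizing the 2D model X = A @ S @ B.T into the 1D form
# vec(X) = kron(B, A) @ vec(S) requires the Kronecker-product
# dictionary, whose element count is the product of the two
# dictionaries' counts -- the storage cost 2D solvers avoid by
# working on A and B directly.
m1, n1 = 32, 64          # range-dimension measurement/dictionary sizes
m2, n2 = 32, 64          # azimuth-dimension sizes
A = np.random.default_rng(1).normal(size=(m1, n1))
B = np.random.default_rng(2).normal(size=(m2, n2))
S = np.random.default_rng(3).normal(size=(n1, n2))

# The 1D formulation reproduces the 2D model (vec = column stacking)...
K = np.kron(B, A)
lhs = A @ S @ B.T
rhs = (K @ S.flatten("F")).reshape(m1, m2, order="F")
assert np.allclose(lhs, rhs)

# ...but at a much larger storage cost:
print(K.size)            # 4194304 elements in the vectorized dictionary
print(A.size + B.size)   # 4096 elements when A and B are kept separate
```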

  7. Tools for building a comprehensive modeling system for virtual screening under real biological conditions: The Computational Titration algorithm.

    PubMed

    Kellogg, Glen E; Fornabaio, Micaela; Chen, Deliang L; Abraham, Donald J; Spyrakis, Francesca; Cozzini, Pietro; Mozzarelli, Andrea

    2006-05-01

    Computational tools utilizing a unique empirical modeling system based on the hydrophobic effect and the measurement of logP(o/w) (the partition coefficient for solvent transfer between 1-octanol and water) are described. The associated force field, Hydropathic INTeractions (HINT), contains much rich information about non-covalent interactions in the biological environment because of its basis in an experiment that measures interactions in solution. HINT is shown to be the core of an evolving virtual screening system that is capable of taking into account a number of factors often ignored such as entropy, effects of solvent molecules at the active site, and the ionization states of acidic and basic residues and ligand functional groups. The outline of a comprehensive modeling system for virtual screening that incorporates these features is described. In addition, a detailed description of the Computational Titration algorithm is provided. As an example, three complexes of dihydrofolate reductase (DHFR) are analyzed with our system and these results are compared with the experimental free energies of binding.

  8. Screening for Substance Use Disorder among Incarcerated Men with the Alcohol, Smoking, Substance Involvement Screening Test (ASSIST): A Comparative Analysis of Computer-administered and Interviewer-administered Modalities

    PubMed Central

    Wolff, Nancy; Shi, Jing

    2015-01-01

    Substance use disorders are overrepresented in incarcerated male populations. Cost-effective screening for alcohol and substance use problems among incarcerated populations is a necessary first step toward intervention. The Alcohol, Smoking, and Substance Involvement Screening Test (ASSIST) holds promise because it has strong psychometric properties, requires minimal training, is easy to score, and is available in the public domain but, because of complicated skip patterns, cannot be self-administered. This study tests the feasibility, reliability, and validity of using computer-administered self-interviewing (CASI) versus interviewer-administered interviewing (IAI) to screen for substance use problems among incarcerated men using the ASSIST. A 2 × 2 factorial design was used to randomly assign 396 incarcerated men to screening modality. Findings indicate that computer screening was feasible. Compared to IAI, CASI produced equally reliable screening information on substance use and symptom severity, with test-retest intraclass correlations for ASSIST total and substance-specific scores ranging from 0.7 to 0.9, and ASSIST substance-specific scores and a substance abuse disorder diagnosis based on the Structured Clinical Interview (SCID) were significantly correlated for both IAI and CASI. These findings indicate that data on substance use and symptom severity can be reliably and validly obtained from the ASSIST using CASI technology, increasing the efficiency with which incarcerated populations can be screened for substance use problems and those at risk identified for treatment. PMID:25659203

  9. 2D Hexagonal Boron Nitride (2D-hBN) Explored for the Electrochemical Sensing of Dopamine.

    PubMed

    Khan, Aamar F; Brownson, Dale A C; Randviir, Edward P; Smith, Graham C; Banks, Craig E

    2016-10-04

    Crystalline 2D hexagonal boron nitride (2D-hBN) nanosheets are explored as a potential electrocatalyst toward the electroanalytical sensing of dopamine (DA). The 2D-hBN nanosheets are electrically wired via a drop-casting modification process onto a range of commercially available carbon supporting electrodes, including glassy carbon (GC), boron-doped diamond (BDD), and screen-printed graphitic electrodes (SPEs). 2D-hBN has not previously been explored toward the electrochemical detection/electrochemical sensing of DA. We critically evaluate the potential electrocatalytic performance of 2D-hBN modified electrodes, the effect of supporting carbon electrode platforms, and the effect of "mass coverage" (which is commonly neglected in the 2D material literature) toward the detection of DA. The response of 2D-hBN modified electrodes is found to be largely dependent upon the interaction between 2D-hBN and the underlying supporting electrode material. For example, in the case of SPEs, modification with 2D-hBN (324 ng) improves the electrochemical response, decreasing the electrochemical oxidation potential of DA by ∼90 mV compared to an unmodified SPE. Conversely, modification of a GC electrode with 2D-hBN (324 ng) resulted in an increased oxidation potential of DA by ∼80 mV when compared to the unmodified electrode. We explore the underlying mechanisms of the aforementioned examples and infer that electrode surface interactions and roughness factors are critical considerations. 2D-hBN is utilized toward the sensing of DA in the presence of the common interferents ascorbic acid (AA) and uric acid (UA). 2D-hBN is found to be an effective electrocatalyst in the simultaneous detection of DA and UA at both pH 5.0 and 7.4. The peak separations/resolution between DA and UA increases by ∼70 and 50 mV (at pH 5.0 and 7.4, respectively, when utilizing 108 ng of 2D-hBN) compared to unmodified SPEs, with a particularly favorable response evident in pH 5.0, giving rise to a

  10. Computational Toxicology as Implemented by the U.S. EPA: Providing High Throughput Decision Support Tools for Screening and Assessing Chemical Exposure, Hazard and Risk

    EPA Science Inventory

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environ...

  11. Peptide ligands for pro-survival protein Bfl-1 from computationally guided library screening

    PubMed Central

    Dutta, Sanjib; Chen, T. Scott; Keating, Amy E.

    2013-01-01

    Pro-survival members of the Bcl-2 protein family inhibit cell death by binding short helical BH3 motifs in pro-apoptotic proteins. Mammalian pro-survival proteins Bcl-xL, Bcl-2, Bcl-w, Mcl-1 and Bfl-1 bind with varying affinities and specificities to native BH3 motifs, engineered peptides and small molecules. Biophysical studies have determined interaction patterns for these proteins, particularly for the most-studied family members Bcl-xL and Mcl-1. Bfl-1 is a pro-survival protein implicated in preventing apoptosis in leukemia, lymphoma and melanoma. Although Bfl-1 is a promising therapeutic target, relatively little is known about its binding preferences. We explored the binding of Bfl-1 to BH3-like peptides by screening a peptide library that was designed to sample a high degree of relevant sequence diversity. Screening using yeast-surface display led to several novel high-affinity Bfl-1 binders and to thousands of putative binders identified through deep sequencing. Further screening for specificity led to identification of a peptide that bound to Bfl-1 with Kd < 1 nM and very slow dissociation from Bfl-1 compared to other pro-survival Bcl-2 family members. A point mutation in this sequence gave a peptide with ~50 nM affinity for Bfl-1 that was selective for Bfl-1 in equilibrium binding assays. Analysis of engineered Bfl-1 binders deepens our understanding of how the binding profiles of pro-survival proteins differ, and may guide the development of targeted Bfl-1 inhibitors. PMID:23363053

  12. Peptide ligands for pro-survival protein Bfl-1 from computationally guided library screening.

    PubMed

    Dutta, Sanjib; Chen, T Scott; Keating, Amy E

    2013-04-19

    Pro-survival members of the Bcl-2 protein family inhibit cell death by binding short helical BH3 motifs in pro-apoptotic proteins. Mammalian pro-survival proteins Bcl-xL, Bcl-2, Bcl-w, Mcl-1, and Bfl-1 bind with varying affinities and specificities to native BH3 motifs, engineered peptides, and small molecules. Biophysical studies have determined interaction patterns for these proteins, particularly for the most-studied family members Bcl-xL and Mcl-1. Bfl-1 is a pro-survival protein implicated in preventing apoptosis in leukemia, lymphoma, and melanoma. Although Bfl-1 is a promising therapeutic target, relatively little is known about its binding preferences. We explored the binding of Bfl-1 to BH3-like peptides by screening a peptide library that was designed to sample a high degree of relevant sequence diversity. Screening using yeast-surface display led to several novel high-affinity Bfl-1 binders and to thousands of putative binders identified through deep sequencing. Further screening for specificity led to identification of a peptide that bound to Bfl-1 with Kd < 1 nM and very slow dissociation from Bfl-1 compared to other pro-survival Bcl-2 family members. A point mutation in this sequence gave a peptide with ~50 nM affinity for Bfl-1 that was selective for Bfl-1 in equilibrium binding assays. Analysis of engineered Bfl-1 binders deepens our understanding of how the binding profiles of pro-survival proteins differ and may guide the development of targeted Bfl-1 inhibitors.

  13. Radiative heat transfer in 2D Dirac materials.

    PubMed

    Rodriguez-López, Pablo; Tse, Wang-Kong; Dalvit, Diego A R

    2015-06-03

    We compute the radiative heat transfer between two sheets of 2D Dirac materials, including topological Chern insulators and graphene, within the framework of the local approximation for the optical response of these materials. In this approximation, which neglects spatial dispersion, we derive both numerically and analytically the short-distance asymptotic of the near-field heat transfer in these systems, and show that it scales as the inverse of the distance between the two sheets. Finally, we discuss the limitations to the validity of this scaling law imposed by spatial dispersion in 2D Dirac materials.

  14. Radiative heat transfer in 2D Dirac materials

    DOE PAGES

    Rodriguez-López, Pablo; Tse, Wang -Kong; Dalvit, Diego A. R.

    2015-05-12

    We compute the radiative heat transfer between two sheets of 2D Dirac materials, including topological Chern insulators and graphene, within the framework of the local approximation for the optical response of these materials. In this approximation, which neglects spatial dispersion, we derive both numerically and analytically the short-distance asymptotic of the near-field heat transfer in these systems, and show that it scales as the inverse of the distance between the two sheets. In conclusion, we discuss the limitations to the validity of this scaling law imposed by spatial dispersion in 2D Dirac materials.

  15. Quantitative 2D liquid-state NMR.

    PubMed

    Giraudeau, Patrick

    2014-06-01

Two-dimensional (2D) liquid-state NMR has a very high potential to simultaneously determine the absolute concentration of small molecules in complex mixtures, thanks to its capacity to separate overlapping resonances. However, it suffers from two main drawbacks that probably explain its relatively late development. First, the 2D NMR signal is strongly molecule-dependent and site-dependent; second, the long duration of 2D NMR experiments prevents its general use for high-throughput quantitative applications and affects its quantitative performance. Fortunately, the last 10 years have witnessed an increasing number of contributions where quantitative approaches based on 2D NMR were developed and applied to solve real analytical issues. This review aims at presenting these recent efforts to reach a high trueness and precision in quantitative measurements by 2D NMR. After highlighting the interest of 2D NMR for quantitative analysis, the different strategies to determine absolute concentrations from 2D NMR spectra are described and illustrated by recent applications. The last part of the manuscript concerns the recent development of fast quantitative 2D NMR approaches, aiming at reducing the experiment duration while preserving, or even increasing, the analytical performance. We hope that this comprehensive review will help readers to grasp the current landscape of quantitative 2D NMR, as well as the perspectives that may arise from it.

  16. Identification of Potent Chemotypes Targeting Leishmania major Using a High-Throughput, Low-Stringency, Computationally Enhanced, Small Molecule Screen

    PubMed Central

    Sharlow, Elizabeth R.; Close, David; Shun, Tongying; Leimgruber, Stephanie; Reed, Robyn; Mustata, Gabriela; Wipf, Peter; Johnson, Jacob; O'Neil, Michael; Grögl, Max; Magill, Alan J.; Lazo, John S.

    2009-01-01

    Patients with clinical manifestations of leishmaniasis, including cutaneous leishmaniasis, have limited treatment options, and existing therapies frequently have significant untoward liabilities. Rapid expansion in the diversity of available cutaneous leishmanicidal chemotypes is the initial step in finding alternative efficacious treatments. To this end, we combined a low-stringency Leishmania major promastigote growth inhibition assay with a structural computational filtering algorithm. After a rigorous assay validation process, we interrogated ∼200,000 unique compounds for L. major promastigote growth inhibition. Using iterative computational filtering of the compounds exhibiting >50% inhibition, we identified 553 structural clusters and 640 compound singletons. Secondary confirmation assays yielded 93 compounds with EC50s ≤ 1 µM, with none of the identified chemotypes being structurally similar to known leishmanicidals and most having favorable in silico predicted bioavailability characteristics. The leishmanicidal activity of a representative subset of 15 chemotypes was confirmed in two independent assay formats, and L. major parasite specificity was demonstrated by assaying against a panel of human cell lines. Thirteen chemotypes inhibited the growth of a L. major axenic amastigote-like population. Murine in vivo efficacy studies using one of the new chemotypes document inhibition of footpad lesion development. These results authenticate that low stringency, large-scale compound screening combined with computational structure filtering can rapidly expand the chemotypes targeting in vitro and in vivo Leishmania growth and viability. PMID:19888337

  17. 2D molybdenum disulphide (2D-MoS2) modified electrodes explored towards the oxygen reduction reaction.

    PubMed

    Rowley-Neale, Samuel J; Fearn, Jamie M; Brownson, Dale A C; Smith, Graham C; Ji, Xiaobo; Banks, Craig E

    2016-08-21

Two-dimensional molybdenum disulphide nanosheets (2D-MoS2) have proven to be an effective electrocatalyst, with particular attention being focused on their use towards increasing the efficiency of the reactions associated with hydrogen fuel cells. Whilst the majority of research has focused on the Hydrogen Evolution Reaction (HER), herein we explore the use of 2D-MoS2 as a potential electrocatalyst for the much less researched Oxygen Reduction Reaction (ORR). We stray from literature conventions and perform experiments in 0.1 M H2SO4 acidic electrolyte for the first time, evaluating the electrochemical performance of the ORR with 2D-MoS2 electrically wired/immobilised upon several carbon based electrodes (namely Boron Doped Diamond (BDD), Edge Plane Pyrolytic Graphite (EPPG), Glassy Carbon (GC) and Screen-Printed Electrodes (SPE)) whilst exploring a range of 2D-MoS2 coverages/masses. Consequently, the findings of this study are highly applicable to real world fuel cell applications. We show that significant improvements in ORR activity can be achieved through the careful selection of the underlying/supporting carbon materials that electrically wire the 2D-MoS2 and utilisation of an optimal mass of 2D-MoS2. The ORR onset is observed to be reduced to ca. +0.10 V for EPPG, GC and SPEs at 2D-MoS2 (1524 ng cm(-2) modification), which is far closer to Pt at +0.46 V compared to bare/unmodified EPPG, GC and SPE counterparts. This report is the first to demonstrate such beneficial electrochemical responses in acidic conditions using a 2D-MoS2 based electrocatalyst material on a carbon-based substrate (SPEs in this case). Investigation of the beneficial reaction mechanism reveals the ORR to occur via a 4-electron process in specific conditions; elsewhere a 2-electron process is observed. This work offers valuable insights for those wishing to design, fabricate and/or electrochemically test 2D-nanosheet materials towards the ORR.

  18. Graphene band structure and its 2D Raman mode

    NASA Astrophysics Data System (ADS)

    Narula, Rohit; Reich, Stephanie

    2014-08-01

High-precision simulations are used to generate the 2D Raman mode of graphene under a range of screening conditions and laser energies EL. We reproduce the decreasing trend of the 2D mode FWHM vs EL and the nearly linearly increasing dispersion ∂ω2D/∂EL seen experimentally in freestanding (unscreened) graphene, and propose relations between these experimentally accessible quantities and the local, two-dimensional gradients |∇| of the electronic and TO phonon bands. In light of state-of-the-art electronic structure calculations that acutely treat the long-range e-e interactions of isolated graphene and its experimentally observed 2D Raman mode, our calculations determine a 40% greater slope of the TO phonons about K than given by explicit phonon measurements performed in graphite or GW phonon calculations in graphene. We also deduce the variation of the broadening energy γ[EL] for freestanding graphene and find a nominal value γ ≈ 140 meV, showing a gradually increasing trend for the range of frequencies available experimentally.

  19. Volume Calculation of Venous Thrombosis Using 2D Ultrasound Images.

    PubMed

    Dhibi, M; Puentes, J; Bressollette, L; Guias, B; Solaiman, B

    2005-01-01

Venous thrombosis screening exams use 2D ultrasound images, from which medical experts obtain a rough idea of the thrombosis aspect and infer an approximate volume. Such estimation is essential to follow up the thrombosis evolution. This paper proposes a method to calculate venous thrombosis volume from non-parallel 2D ultrasound images, taking advantage of a priori knowledge about the thrombosis shape. An interactive ellipse fitting contour segmentation extracts the 2D thrombosis contours. Then, a Delaunay triangulation is applied to the set of 2D segmented contours positioned in 3D, and the area that each contour defines, to obtain a global thrombosis 3D surface reconstruction, with a dense triangulation inside the contours. Volume is calculated from the obtained surface and contours triangulation, using a maximum unit normal component approach. Preliminary results obtained on 3 plastic phantoms and 3 in vitro venous thromboses, as well as one in vivo case, are presented and discussed. A volume estimation error rate below 4.5% for the plastic phantoms and 3.5% for the in vitro venous thromboses was obtained.
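The final volume computation over a closed triangulated surface can be illustrated with the generic divergence-theorem formula, a sum of signed tetrahedron volumes. This is a sketch of that standard technique, not the authors' exact maximum-unit-normal-component implementation.

```python
import numpy as np

def mesh_volume(vertices, faces):
    """Volume enclosed by a closed, consistently oriented triangular
    mesh, via the divergence theorem: sum of v1 . (v2 x v3) / 6 over
    all triangles (v1, v2, v3)."""
    v = np.asarray(vertices, dtype=float)
    total = 0.0
    for i, j, k in faces:
        total += np.dot(v[i], np.cross(v[j], v[k]))
    return abs(total) / 6.0

# Unit cube, two outward-oriented triangles per face; volume should be 1.
verts = [(0,0,0),(1,0,0),(1,1,0),(0,1,0),(0,0,1),(1,0,1),(1,1,1),(0,1,1)]
faces = [(0,2,1),(0,3,2),   # bottom (z = 0)
         (4,5,6),(4,6,7),   # top    (z = 1)
         (0,1,5),(0,5,4),   # front  (y = 0)
         (1,2,6),(1,6,5),   # right  (x = 1)
         (2,3,7),(2,7,6),   # back   (y = 1)
         (3,0,4),(3,4,7)]   # left   (x = 0)
print(round(mesh_volume(verts, faces), 6))  # → 1.0
```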

  20. Annotated Bibliography of EDGE2D Use

    SciTech Connect

    J.D. Strachan and G. Corrigan

    2005-06-24

    This annotated bibliography is intended to help EDGE2D users, and particularly new users, find existing published literature that has used EDGE2D. Our idea is that a person can find existing studies which may relate to his intended use, as well as gain ideas about other possible applications by scanning the attached tables.

  1. The effects of on-screen, point of care computer reminders on processes and outcomes of care

    PubMed Central

    Shojania, Kaveh G; Jennings, Alison; Mayhew, Alain; Ramsay, Craig R; Eccles, Martin P; Grimshaw, Jeremy

    2014-01-01

    Background The opportunity to improve care by delivering decision support to clinicians at the point of care represents one of the main incentives for implementing sophisticated clinical information systems. Previous reviews of computer reminder and decision support systems have reported mixed effects, possibly because they did not distinguish point of care computer reminders from e-mail alerts, computer-generated paper reminders, and other modes of delivering ‘computer reminders’. Objectives To evaluate the effects on processes and outcomes of care attributable to on-screen computer reminders delivered to clinicians at the point of care. Search methods We searched the Cochrane EPOC Group Trials register, MEDLINE, EMBASE and CINAHL and CENTRAL to July 2008, and scanned bibliographies from key articles. Selection criteria Studies of a reminder delivered via a computer system routinely used by clinicians, with a randomised or quasi-randomised design and reporting at least one outcome involving a clinical endpoint or adherence to a recommended process of care. Data collection and analysis Two authors independently screened studies for eligibility and abstracted data. For each study, we calculated the median improvement in adherence to target processes of care and also identified the outcome with the largest such improvement. We then calculated the median absolute improvement in process adherence across all studies using both the median outcome from each study and the best outcome. Main results Twenty-eight studies (reporting a total of thirty-two comparisons) were included. Computer reminders achieved a median improvement in process adherence of 4.2% (interquartile range (IQR): 0.8% to 18.8%) across all reported process outcomes, 3.3% (IQR: 0.5% to 10.6%) for medication ordering, 3.8% (IQR: 0.5% to 6.6%) for vaccinations, and 3.8% (IQR: 0.4% to 16.3%) for test ordering. In a sensitivity analysis using the best outcome from each study, the median improvement was 5
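The review's summary statistic, a median improvement with its interquartile range across studies, can be sketched with Python's statistics module. The per-study values below are hypothetical, chosen only to illustrate the calculation (the review does not list them).

```python
import statistics

def median_and_iqr(improvements):
    """Median absolute improvement across studies, plus the
    interquartile range (Q1, Q3), as in the review's summaries."""
    q1, med, q3 = statistics.quantiles(improvements, n=4)
    return med, (q1, q3)

# Hypothetical per-study improvements in process adherence (percentage points)
med, iqr = median_and_iqr([0.5, 1.0, 3.0, 4.2, 6.0, 10.0, 18.8])
print(med, iqr)
```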

  2. Application of computer-extracted breast tissue texture features in predicting false-positive recalls from screening mammography

    NASA Astrophysics Data System (ADS)

    Ray, Shonket; Choi, Jae Y.; Keller, Brad M.; Chen, Jinbo; Conant, Emily F.; Kontos, Despina

    2014-03-01

Mammographic texture features have been shown to have value in breast cancer risk assessment. Previous models have also been developed that use computer-extracted mammographic features of breast tissue complexity to predict the risk of false-positive (FP) recall from breast cancer screening with digital mammography. This work details a novel locally adaptive parenchymal texture analysis algorithm that identifies and extracts mammographic features of local parenchymal tissue complexity potentially relevant for false-positive biopsy prediction. This algorithm has two important aspects: (1) the adaptive nature of automatically determining an optimal number of regions-of-interest (ROIs) in the image and each ROI's corresponding size based on the parenchymal tissue distribution over the whole breast region and (2) characterizing both the local and global mammographic appearances of the parenchymal tissue that could provide more discriminative information for FP biopsy risk prediction. Preliminary results show that this locally adaptive texture analysis algorithm, in conjunction with logistic regression, can predict the likelihood of false-positive biopsy with an ROC performance value of AUC=0.92 (p<0.001) with a 95% confidence interval [0.77, 0.94]. Significant texture feature predictors (p<0.05) included contrast, sum variance and difference average. Sensitivity for false-positives was 51% at the 100% cancer detection operating point. Although preliminary, clinical implications of using prediction models incorporating these texture features may include the future development of better tools and guidelines regarding personalized breast cancer screening recommendations. Further studies are warranted to prospectively validate our findings in larger screening populations and evaluate their clinical utility.

  3. An Official American Thoracic Society/American College of Chest Physicians Policy Statement: Implementation of Low-Dose Computed Tomography Lung Cancer Screening Programs in Clinical Practice

    PubMed Central

    Wiener, Renda Soylemez; Gould, Michael K.; Arenberg, Douglas A.; Au, David H.; Fennig, Kathleen; Lamb, Carla R.; Mazzone, Peter J.; Midthun, David E.; Napoli, Maryann; Ost, David E.; Powell, Charles A.; Rivera, M. Patricia; Slatore, Christopher G.; Tanner, Nichole T.; Vachani, Anil; Wisnivesky, Juan P.; Yoon, Sue H.

    2015-01-01

Rationale: Annual low-radiation-dose computed tomography (LDCT) screening for lung cancer has been shown to reduce lung cancer mortality among high-risk individuals and is now recommended by multiple organizations. However, LDCT screening is complex, and implementation requires careful planning to ensure benefits outweigh harms. Little guidance has been provided for sites wishing to develop and implement lung cancer screening programs. Objectives: To promote successful implementation of comprehensive LDCT screening programs that are safe, effective, and sustainable. Methods: The American Thoracic Society (ATS) and American College of Chest Physicians (CHEST) convened a committee with expertise in lung cancer screening, pulmonary nodule evaluation, and implementation science. The committee reviewed the evidence from systematic reviews, clinical practice guidelines, surveys, and the experience of early-adopting LDCT screening programs and summarized potential strategies to implement LDCT screening programs successfully. Measurements and Main Results: We address steps that sites should consider during the three main phases of developing an LDCT screening program: planning, implementation, and maintenance. We present multiple strategies to implement the nine core elements of comprehensive lung cancer screening programs enumerated in a recent CHEST/ATS statement, which will allow sites to select the strategy that best fits with their local context and workflow patterns. Although we do not comment on the cost-effectiveness of LDCT screening, we outline the necessary costs associated with starting and sustaining a high-quality LDCT screening program. Conclusions: Following the strategies delineated in this policy statement may help sites to develop comprehensive LDCT screening programs that are safe and effective. PMID:26426785

  4. Mean flow and anisotropic cascades in decaying 2D turbulence

    NASA Astrophysics Data System (ADS)

    Liu, Chien-Chia; Cerbus, Rory; Gioia, Gustavo; Chakraborty, Pinaki

    2015-11-01

Many large-scale atmospheric and oceanic flows are decaying 2D turbulent flows embedded in a non-uniform mean flow. Despite its importance for large-scale weather systems, the effect of non-uniform mean flows on decaying 2D turbulence remains unknown. In the absence of mean flow it is well known that decaying 2D turbulent flows exhibit the enstrophy cascade. More generally, for any 2D turbulent flow, all computational, experimental and field data amassed to date indicate that the spectra of longitudinal and transverse velocity fluctuations correspond to the same cascade, signifying isotropy of cascades. Here we report experiments on decaying 2D turbulence in soap films with a non-uniform mean flow. We find that the flow transitions from the usual isotropic enstrophy cascade to a series of unusual and, to our knowledge, never before observed or predicted, anisotropic cascades where the longitudinal and transverse spectra are mutually independent. We discuss implications of our results for decaying geophysical turbulence.

  5. Interobserver variations on interpretation of multislice CT lung cancer screening studies, and the implications for computer-aided diagnosis

    NASA Astrophysics Data System (ADS)

    Novak, Carol L.; Qian, JianZhong; Fan, Li; Ko, Jane P.; Rubinowitz, Ami N.; McGuinness, Georgeann; Naidich, David

    2002-04-01

    With low dose multi-slice CT for screening of lung cancer, physicians are now finding and examining increasingly smaller nodules. However as the size of detectable nodules becomes smaller, there may be greater differences among physicians as to what is detected and what constitutes a nodule. In this study, 10 CT screening studies of smokers were individually evaluated by three thoracic radiologists. After consensus to determine a gold standard, the number of nodules detected by individual radiologists ranged from 1.4 to 2.1 detections per patient. Each radiologist detected nodules missed by the other two. Although a total of 26 true nodules were detected by one or more radiologists, only 8 (31%) were detected by all three radiologists. The number of true nodules detected by an integrated automatic detection algorithm was 3.2 per patient after radiologist validation. Including these nodules in the gold standard set reduced the sensitivity of nodule detection by each radiologist to less than half. The sensitivity of nodule detection by the computer was better at 64%, proving especially efficacious for detecting smaller and more central nodules. Use of the automatic detection module would allow individual radiologists to increase the number of detected nodules by 114% to 207%.

  6. The Emerging Roles of Coronary Computed Tomographic Angiography: Acute Chest Pain Evaluation and Screening for Asymptomatic Individuals

    PubMed Central

    Chien, Ning; Wang, Tzung-Dau; Chang, Yeun-Chung; Lin, Po-Chih; Tseng, Yao-Hui; Lee, Yee-Fan; Ko, Wei-Chun; Lee, Bai-Chin; Lee, Wen-Jeng

    2016-01-01

    Coronary computed tomographic angiography (CCTA) has been widely available since 2004. After that, the diagnostic accuracy of CCTA has been extensively validated with invasive coronary angiography for detection of coronary arterial stenosis. In this paper, we reviewed the updated evidence of the role of CCTA in both scenarios including acute chest pain and screening in asymptomatic adults. Several large-scale studies have been conducted to evaluate the diagnostic value of CCTA in the context of acute chest pain patients. CCTA could play a role in delivering more efficient care. For risk stratification of asymptomatic patients using CCTA, latest studies have revealed incremental benefits. Future studies evaluating the totality of plaque characteristics may be useful for determining the role of noncalcified plaque for risk stratification in asymptomatic individuals. PMID:27122947

  7. A Study for Visual Realism of Designed Pictures on Computer Screens by Investigation and Brain-Wave Analyses.

    PubMed

    Wang, Lan-Ting; Lee, Kun-Chou

    2016-08-01

In this article, the visual realism of designed pictures on computer screens is studied by investigation and brain-wave analyses. The practical electroencephalogram (EEG) measurement is always time-varying and fluctuating, so that conventional statistical techniques are not adequate for analyses. This study proposes a new scheme based on "fingerprinting" to analyze the EEG. Fingerprinting is a technique of probabilistic pattern recognition used in electrical engineering, much like the identification of human fingerprints in a criminal investigation. The goal of this study was to assess whether subjective preference for pictures could be manifested physiologically by EEG fingerprinting analyses. The most important advantage of the fingerprinting technique is that it does not require accurate measurement. Instead, it uses probabilistic classification. Participants' preference for pictures can be assessed using fingerprinting analyses of physiological EEG measurements.

  8. [Phantom Study on Dose Reduction Using Iterative Reconstruction in Low-dose Computed Tomography for Lung Cancer Screening].

    PubMed

    Minehiro, Kaori; Takata, Tadanori; Hayashi, Hiroyuki; Sakuda, Keita; Nunome, Haruka; Kawashima, Hiroko; Sanada, Shigeru

    2015-12-01

We investigated the dose reduction ability of an iterative reconstruction technology for low-dose computed tomography (CT) for lung cancer screening. The Sinogram Affirmed Iterative Reconstruction (SAFIRE) provided in a multi-slice CT system, Somatom Definition Flash (Siemens Healthcare), was used. An anthropomorphic chest phantom (N-1, Kyoto Kagaku) was scanned at a volume CT dose index (CTDIvol) of 0.50-11.86 mGy with 120 kV. For noise (standard deviation) and contrast-to-noise ratio (CNR) measurements, the CTP486 and CTP515 modules in the Catphan (The Phantom Laboratory) were scanned. Radiological technologists participated in the perceptual comparison. SAFIRE reduced the SD values by approximately 50% compared with filtered back projection (FBP). The estimated dose reduction rate by SAFIRE determined from the perceptual comparison was approximately 23%, while a 75% dose reduction rate was expected from the SD value reduction of 50%.
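The link between a 50% noise reduction and an expected 75% dose reduction follows from the standard dose-noise relation for CT (quantum noise roughly proportional to the inverse square root of dose). A minimal sketch of that arithmetic, assuming this relation holds:

```python
def expected_dose_reduction(sd_ratio):
    """CT quantum noise scales roughly as 1/sqrt(dose), so matching the
    original FBP noise level after the SD drops to `sd_ratio` of its
    former value allows the dose to fall to sd_ratio**2 of the original."""
    return 1.0 - sd_ratio ** 2

# A 50% SD reduction (ratio 0.5) implies a 75% expected dose reduction.
print(expected_dose_reduction(0.5))  # → 0.75
```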

  9. Computational redesign of bacterial biotin carboxylase inhibitors using structure-based virtual screening of combinatorial libraries.

    PubMed

    Brylinski, Michal; Waldrop, Grover L

    2014-04-02

As the spread of antibiotic-resistant bacteria steadily increases, there is an urgent need for new antibacterial agents. Because fatty acid synthesis is only used for membrane biogenesis in bacteria, the enzymes in this pathway are attractive targets for antibacterial agent development. Acetyl-CoA carboxylase catalyzes the committed and regulated step in fatty acid synthesis. In bacteria, the enzyme is composed of three distinct protein components: biotin carboxylase, biotin carboxyl carrier protein, and carboxyltransferase. Fragment-based screening revealed that an amino-oxazole inhibits biotin carboxylase activity and also exhibits antibacterial activity against Gram-negative organisms. In this report, we redesigned previously identified lead inhibitors to expand the spectrum of bacteria sensitive to the amino-oxazole derivatives by including Gram-positive species. Using 9,411 small organic building blocks, we constructed a diverse combinatorial library of 1.2×10⁸ amino-oxazole derivatives. A subset of 9×10⁶ of these compounds was subjected to structure-based virtual screening against seven biotin carboxylase isoforms using similarity-based docking by eSimDock. Potentially broad-spectrum antibiotic candidates were selected based on the consensus ranking by several scoring functions, including non-linear statistical models implemented in eSimDock and traditional molecular mechanics force fields. The analysis of binding poses of the top-ranked compounds docked to biotin carboxylase isoforms suggests that: (1) binding of the amino-oxazole anchor is stabilized by a network of hydrogen bonds to residues 201, 202 and 204; (2) halogenated aromatic moieties attached to the amino-oxazole scaffold enhance interactions with a hydrophobic pocket formed by residues 157, 169, 171 and 203; and (3) larger substituents reach deeper into the binding pocket to form additional hydrogen bonds with the side chains of residues 209 and 233. These structural insights into drug

  10. Discovery of earth-abundant nitride semiconductors by computational screening and high-pressure synthesis

    PubMed Central

    Hinuma, Yoyo; Hatakeyama, Taisuke; Kumagai, Yu; Burton, Lee A.; Sato, Hikaru; Muraba, Yoshinori; Iimura, Soshi; Hiramatsu, Hidenori; Tanaka, Isao; Hosono, Hideo; Oba, Fumiyasu

    2016-01-01

    Nitride semiconductors are attractive because they can be environmentally benign, comprised of abundant elements and possess favourable electronic properties. However, those currently commercialized are mostly limited to gallium nitride and its alloys, despite the rich composition space of nitrides. Here we report the screening of ternary zinc nitride semiconductors using first-principles calculations of electronic structure, stability and dopability. This approach identifies as-yet-unreported CaZn2N2 that has earth-abundant components, smaller carrier effective masses than gallium nitride and a tunable direct bandgap suited for light emission and harvesting. High-pressure synthesis realizes this phase, verifying the predicted crystal structure and band-edge red photoluminescence. In total, we propose 21 promising systems, including Ca2ZnN2, Ba2ZnN2 and Zn2PN3, which have not been reported as semiconductors previously. Given the variety in bandgaps of the identified compounds, the present study expands the potential suitability of nitride semiconductors for a broader range of electronic, optoelectronic and photovoltaic applications. PMID:27325228

  11. Computational Systems Bioinformatics and Bioimaging for Pathway Analysis and Drug Screening

    PubMed Central

    Zhou, Xiaobo; Wong, Stephen T. C.

    2009-01-01

    The premise of today’s drug development is that the mechanism of a disease is highly dependent upon underlying signaling and cellular pathways. Such pathways are often composed of complexes of physically interacting genes, proteins, or biochemical activities coordinated by metabolic intermediates, ions, and other small solutes and are investigated with molecular biology approaches in genomics, proteomics, and metabonomics. Nevertheless, the recent declines in the pharmaceutical industry’s revenues indicate such approaches alone may not be adequate in creating successful new drugs. Our observation is that combining methods of genomics, proteomics, and metabonomics with techniques of bioimaging will systematically provide powerful means to decode or better understand molecular interactions and pathways that lead to disease and potentially generate new insights and indications for drug targets. The former methods provide the profiles of genes, proteins, and metabolites, whereas the latter techniques generate objective, quantitative phenotypes correlating to the molecular profiles and interactions. In this paper, we describe pathway reconstruction and target validation based on the proposed systems biologic approach and show selected application examples for pathway analysis and drug screening. PMID:20011613

  12. Computer based screening of compound databases: 1. Preselection of benzamidine-based thrombin inhibitors.

    PubMed

    Fox, T; Haaksma, E E

    2000-07-01

    We present a computational protocol which uses the known three-dimensional structure of a target enzyme to identify possible ligands from databases of compounds with low molecular weight. This is accomplished by first mapping the essential interactions in the binding site with the program GRID. The resulting regions of favorable interaction between target and ligand are translated into a database query, and with UNITY a flexible 3D database search is performed. The feasibility of this approach is calibrated with thrombin as the target. Our results show that the resulting hit lists are enriched with thrombin inhibitors compared to the total database.
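Hit-list enrichment, the figure of merit in the abstract's closing sentence, is conventionally quantified with an enrichment factor. A sketch with hypothetical counts (the abstract reports no numbers):

```python
def enrichment_factor(actives_in_hits, n_hits, actives_total, n_database):
    """EF = (actives_in_hits / n_hits) / (actives_total / n_database):
    how much denser true binders are in the hit list than in the whole
    database. EF > 1 indicates the query is a useful filter."""
    hit_rate = actives_in_hits / n_hits
    base_rate = actives_total / n_database
    return hit_rate / base_rate

# Hypothetical: 12 known thrombin inhibitors among 200 hits, drawn from
# a database of 50,000 compounds containing 30 inhibitors in total.
print(round(enrichment_factor(12, 200, 30, 50000), 1))  # → 100.0
```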

  13. Formulation pre-screening of inhalation powders using computational atom-atom systematic search method.

    PubMed

    Ramachandran, Vasuki; Murnane, Darragh; Hammond, Robert B; Pickering, Jonathan; Roberts, Kevin J; Soufian, Majeed; Forbes, Ben; Jaffari, Sara; Martin, Gary P; Collins, Elizabeth; Pencheva, Klimentina

    2015-01-05

The synthonic modeling approach provides a molecule-centered understanding of the surface properties of crystals. It has been applied extensively to understand crystallization processes. This study aimed to investigate the functional relevance of synthonic modeling to the formulation of inhalation powders by assessing the cohesivity of three active pharmaceutical ingredients (APIs): fluticasone propionate (FP), budesonide (Bud), and salbutamol base (SB), and the commonly used excipient, α-lactose monohydrate (LMH). It is found that FP (-11.5 kcal/mol) has a higher cohesive strength than Bud (-9.9 kcal/mol) or SB (-7.8 kcal/mol). The prediction correlated directly to cohesive strength measurements using laser diffraction, where the airflow pressure required for complete dispersion (CPP) was 3.5, 2.0, and 1.0 bar for FP, Bud, and SB, respectively. The highest cohesive strength was predicted for LMH (-15.9 kcal/mol), which did not correlate with the CPP value of 2.0 bar (i.e., ranking lower than FP). High FP-LMH adhesive forces (-11.7 kcal/mol) were predicted. However, aerosolization studies revealed that the FP-LMH blends consisted of agglomerated FP particles with a large median diameter (∼4-5 μm) that were not disrupted by LMH. Modeling of the crystal and surface chemistry of LMH identified high electrostatic and H-bond components of its cohesive energy due to the presence of water and hydroxyl groups in lactose, unlike the APIs. A direct comparison of the predicted and measured cohesive balance of LMH with APIs will require a more in-depth understanding of highly hydrogen-bonded systems with respect to the synthonic engineering modeling tool, as well as the influence of agglomerate structure on surface-surface contact geometry. Overall, this research has demonstrated the possible application and relevance of synthonic engineering tools for rapid pre-screening in drug formulation and design.
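The reported correlation between predicted cohesive strength and measured dispersion pressure for the three APIs amounts to a rank-order agreement, which can be checked directly from the values quoted in the abstract (LMH being the outlier discussed):

```python
# Sketch: the abstract's predicted cohesive strengths (kcal/mol, more
# negative = more cohesive) should rank-order the measured complete
# dispersion pressures (CPP, bar) for the three APIs.
cohesive = {"FP": -11.5, "Bud": -9.9, "SB": -7.8}
cpp = {"FP": 3.5, "Bud": 2.0, "SB": 1.0}

rank_predicted = sorted(cohesive, key=cohesive.get)       # most cohesive first
rank_measured = sorted(cpp, key=cpp.get, reverse=True)    # hardest to disperse first
print(rank_predicted == rank_measured)  # → True
```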

  14. Meta-analysis of two computer-assisted screening methods for diagnosing oral precancer and cancer.

    PubMed

    Ye, Xiaojing; Zhang, Jing; Tan, Yaqin; Chen, Guanying; Zhou, Gang

    2015-11-01

The early diagnosis of oral precancer and cancer is crucial and could have the highest impact on improving survival rates. A meta-analysis was conducted to compare the accuracy between the OralCDx brush biopsy and DNA-image cytometry in diagnosing both conditions. Bibliographic databases were systematically searched for original relevant studies on the early diagnosis of oral precancer and oral cancer. Study characteristics were evaluated to determine the accuracy of the two screening strategies. Thirteen studies (eight of OralCDx brush biopsy and five of DNA-image cytometry) were identified as having reported on 1981 oral mucosa lesions. The meta-analysis found that the areas under the summary receiver operating characteristic curves of the OralCDx brush biopsy and DNA-image cytometry were 0.8879 and 0.9885, respectively. The pooled sensitivity, specificity, and diagnostic odds ratio of the OralCDx brush biopsy were 86% (95% CI 81-90), 81% (95% CI 78-85), and 20.36 (95% CI 2.72-152.67), respectively, while those of DNA-image cytometry were 89% (95% CI 83-94), 99% (95% CI 97-100), and 446.08 (95% CI 73.36-2712.43), respectively. Results of a pairwise comparison between each modality demonstrated that the specificity, area under the curve (AUC), and Q(∗) index of DNA-image cytometry were significantly higher than those of the OralCDx brush biopsy (Z=2.821, p<0.05; Z=1.711, p<0.05; Z=1.727, p<0.05), but no significant difference in sensitivity was found (Z=1.520, p>0.05). In conclusion, the meta-analysis of the published studies indicated that DNA-image cytometry is more accurate than the OralCDx brush biopsy in diagnosing oral precancer and oral cancer.
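The diagnostic odds ratio combines sensitivity and specificity as DOR = [sens/(1-sens)]·[spec/(1-spec)]. Note that plugging the pooled sensitivity and specificity into this formula does not reproduce the meta-analytic pooled DOR quoted above, because the latter is pooled across individual studies rather than derived from pooled rates; the sketch below only illustrates the formula itself.

```python
def diagnostic_odds_ratio(sensitivity, specificity):
    """DOR = [sens/(1-sens)] * [spec/(1-spec)]: the odds of a positive
    test among the diseased divided by the odds among the healthy."""
    return (sensitivity / (1.0 - sensitivity)) * (specificity / (1.0 - specificity))

# Pooled values reported for DNA-image cytometry: sens 0.89, spec 0.99.
print(round(diagnostic_odds_ratio(0.89, 0.99), 1))  # → 801.0
```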

  15. MAGNUM2D. Radionuclide Transport in Porous Media

    SciTech Connect

    Langford, D.W.; Baca, R.G.

    1989-03-01

    MAGNUM2D was developed to analyze thermally driven fluid motion in the deep basalts below the Pasco Basin at the Westinghouse Hanford Site. It has been used in the Basalt Waste Isolation Project to simulate nonisothermal groundwater flow in a heterogeneous anisotropic medium and heat transport in a water/rock system near a high-level nuclear waste repository. The code allows three representations of the hydrogeologic system: an equivalent porous continuum; a system of discrete, unfilled, interconnecting fractures separated by impervious rock mass; and a low-permeability porous continuum with several discrete, unfilled fractures traversing the medium. The calculations assume local thermodynamic equilibrium between the rock and groundwater, nonisothermal Darcian flow in the continuum portions of the rock, and nonisothermal Poiseuille flow in discrete unfilled fractures. In addition, the code accounts for thermal loading within the elements, zero-normal-gradient and fixed boundary conditions for both temperature and hydraulic head, and independent simulation of temperature and flow. The Q2DGEOM preprocessor was developed to generate, modify, plot, and verify quadratic two-dimensional finite element geometries. The BCGEN preprocessor generates the boundary conditions for head and temperature, and ICGEN generates the initial conditions. The GRIDDER postprocessor interpolates nonregularly spaced nodal flow and temperature data onto a regular rectangular grid. CONTOUR plots and labels contour lines for a function of two variables, and PARAM plots cross sections and time histories for a function of time and one or two spatial variables. NPRINT generates data tables that display the data along horizontal or vertical cross sections. VELPLT differentiates the hydraulic head and buoyancy data and plots the velocity vectors. The PATH postprocessor plots flow paths and computes the corresponding travel times.
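
    The Darcian-flow assumption for the continuum portions can be illustrated with a one-line discrete flux calculation. A minimal sketch with invented numbers; MAGNUM2D itself solves the fully coupled, nonisothermal 2-D problem by finite elements:

    ```python
    # 1-D Darcy flux between two nodes: q = -K * dh/dx.
    # K: hydraulic conductivity [m/s]; h1, h2: heads [m]; dx: node spacing [m].
    def darcy_flux(K, h1, h2, dx):
        return -K * (h2 - h1) / dx

    # Head drops from 10.0 m to 9.5 m over 100 m of low-permeability rock
    q = darcy_flux(K=1e-6, h1=10.0, h2=9.5, dx=100.0)  # flux in m/s
    ```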

  16. MAGNUM2D. Radionuclide Transport in Porous Media

    SciTech Connect

    Langford, D.W.; Baca, R.G.

    1988-08-01

    MAGNUM2D was developed to analyze thermally driven fluid motion in the deep basalts below the Pasco Basin at the Westinghouse Hanford Site. It has been used in the Basalt Waste Isolation Project to simulate nonisothermal groundwater flow in a heterogeneous anisotropic medium and heat transport in a water/rock system near a high-level nuclear waste repository. The code allows three representations of the hydrogeologic system: an equivalent porous continuum; a system of discrete, unfilled, interconnecting fractures separated by impervious rock mass; and a low-permeability porous continuum with several discrete, unfilled fractures traversing the medium. The calculation assumes local thermodynamic equilibrium between the rock and groundwater, nonisothermal Darcian flow in the continuum portions of the rock, and nonisothermal Poiseuille flow in discrete unfilled fractures. In addition, the code accounts for thermal loading within the elements, zero-normal-gradient and fixed boundary conditions for both temperature and hydraulic head, and independent simulation of temperature and flow. The Q2DGEOM preprocessor was developed to generate, modify, plot, and verify quadratic two-dimensional finite element geometries. The BCGEN preprocessor generates the boundary conditions for head and temperature, and ICGEN generates the initial conditions. The GRIDDER postprocessor interpolates nonregularly spaced nodal flow and temperature data onto a regular rectangular grid. CONTOUR plots and labels contour lines for a function of two variables, and PARAM plots cross sections and time histories for a function of time and one or two spatial variables. NPRINT generates data tables that display the data along horizontal or vertical cross sections. VELPLT differentiates the hydraulic head and buoyancy data and plots the velocity vectors. The PATH postprocessor plots flow paths and computes the corresponding travel times.

  17. Ab Initio potential grid based docking: From High Performance Computing to In Silico Screening

    NASA Astrophysics Data System (ADS)

    de Jonge, Marc R.; Vinkers, H. Maarten; van Lenthe, Joop H.; Daeyaert, Frits; Bush, Ian J.; van Dam, Huub J. J.; Sherwood, Paul; Guest, Martyn F.

    2007-09-01

    We present a new and completely parallel method for protein-ligand docking. The potential of the docking target structure is obtained directly from the electron density derived through an ab initio computation. A large subregion of the crystal structure of isocitrate lyase was selected as the docking target. To allow the full ab initio treatment of this region, special care was taken to assign optimal basis functions. The electrostatic potential is tested by docking a small charged molecule (succinate) into the binding site. The ab initio grid yields a superior result by producing the best binding orientation and position, and by recognizing it as the best. In contrast, the same docking procedure using a classical point-charge-based potential produces a number of additional incorrect binding poses and does not recognize the correct pose as the best solution.
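
    The classical baseline the paper compares against, a point-charge electrostatic potential sampled on a grid, can be sketched in a few lines. Charges and positions below are invented for illustration (atomic units, bare Coulomb sum); the paper's ab initio grid instead derives the potential from the computed electron density:

    ```python
    # Point-charge electrostatic potential V(r) = sum_i q_i / |r - r_i|.
    import math

    def potential_grid(charges, coords, grid_points):
        """charges: [q_i]; coords: [(x,y,z)]; grid_points: [(x,y,z)] -> [V]."""
        grid = []
        for gx, gy, gz in grid_points:
            v = 0.0
            for q, (x, y, z) in zip(charges, coords):
                r = math.sqrt((gx - x)**2 + (gy - y)**2 + (gz - z)**2)
                v += q / r  # assumes grid point never coincides with a charge
            grid.append(v)
        return grid

    # Symmetric +1/-1 pair: potential vanishes at the midpoint
    V = potential_grid([1.0, -1.0], [(0, 0, 0), (2, 0, 0)], [(1.0, 0.0, 0.0)])
    ```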

  18. Computational Screening of Tip and Stalk Cell Behavior Proposes a Role for Apelin Signaling in Sprout Progression.

    PubMed

    Palm, Margriet M; Dallinga, Marchien G; van Dijk, Erik; Klaassen, Ingeborg; Schlingemann, Reinier O; Merks, Roeland M H

    2016-01-01

    Angiogenesis involves the formation of new blood vessels by sprouting or splitting of existing blood vessels. During sprouting, a highly motile type of endothelial cell, called the tip cell, migrates from the blood vessels, followed by stalk cells, an endothelial cell type that forms the body of the sprout. To gain more insight into how tip cells contribute to angiogenesis, we extended an existing computational model of vascular network formation based on the cellular Potts model with tip and stalk differentiation, without making a priori assumptions about the differences between tip cells and stalk cells. To predict potential differences, we looked for parameter values that make tip cells (a) move to the sprout tip, and (b) change the morphology of the angiogenic networks. The screening predicted that if tip cells respond less effectively to an endothelial chemoattractant than stalk cells, they move to the tips of the sprouts, which impacts the morphology of the networks. A comparison of this model prediction with genes expressed differentially in tip and stalk cells revealed that the endothelial chemoattractant Apelin and its receptor APJ may match the model prediction. To test the model prediction we inhibited Apelin signaling in our model and in an in vitro model of angiogenic sprouting, and found that in both cases inhibition of Apelin or of its receptor APJ reduces sprouting. Based on the prediction of the computational model, we propose that the differential expression of Apelin and APJ yields a "self-generated" gradient mechanism that accelerates the extension of the sprout.

  19. Computational screen and experimental validation of anti-influenza effects of quercetin and chlorogenic acid from traditional Chinese medicine

    NASA Astrophysics Data System (ADS)

    Liu, Zekun; Zhao, Junpeng; Li, Weichen; Shen, Li; Huang, Shengbo; Tang, Jingjing; Duan, Jie; Fang, Fang; Huang, Yuelong; Chang, Haiyan; Chen, Ze; Zhang, Ran

    2016-01-01

    The influenza A virus is a great threat to human health, and its various subtypes make it difficult to develop drugs. With the development of state-of-the-art computational chemistry, molecular docking can serve as a virtual screen for potential lead compounds. In this study, we performed molecular docking for influenza A H1N1 (A/PR/8/34) with small molecules such as quercetin and chlorogenic acid, which were derived from traditional Chinese medicine. The results showed that these small molecules have strong binding abilities with neuraminidase from H1N1 (A/PR/8/34). Further details showed that the structural features of the molecules might be helpful for further drug design and development. In vitro and in vivo experiments validated the anti-influenza effects of quercetin and chlorogenic acid, indicating protection comparable to that of zanamivir. Taken together, we propose that chlorogenic acid and quercetin could be employed as effective lead compounds against influenza A H1N1.

  20. Computational screen and experimental validation of anti-influenza effects of quercetin and chlorogenic acid from traditional Chinese medicine.

    PubMed

    Liu, Zekun; Zhao, Junpeng; Li, Weichen; Shen, Li; Huang, Shengbo; Tang, Jingjing; Duan, Jie; Fang, Fang; Huang, Yuelong; Chang, Haiyan; Chen, Ze; Zhang, Ran

    2016-01-12

    The influenza A virus is a great threat to human health, and its various subtypes make it difficult to develop drugs. With the development of state-of-the-art computational chemistry, molecular docking can serve as a virtual screen for potential lead compounds. In this study, we performed molecular docking for influenza A H1N1 (A/PR/8/34) with small molecules such as quercetin and chlorogenic acid, which were derived from traditional Chinese medicine. The results showed that these small molecules have strong binding abilities with neuraminidase from H1N1 (A/PR/8/34). Further details showed that the structural features of the molecules might be helpful for further drug design and development. In vitro and in vivo experiments validated the anti-influenza effects of quercetin and chlorogenic acid, indicating protection comparable to that of zanamivir. Taken together, we propose that chlorogenic acid and quercetin could be employed as effective lead compounds against influenza A H1N1.

  1. FASTWO - A 2-D interactive algebraic grid generator

    NASA Technical Reports Server (NTRS)

    Luh, Raymond Ching-Chung; Lombard, C. K.

    1988-01-01

    This paper presents a very simple and effective computational procedure, FASTWO, for generating patched composite finite difference grids in 2-D for any geometry. Major components of the interactive, graphics-based method, which is closely akin to and borrows many tools from transfinite interpolation, are highlighted. Several grids produced by FASTWO are shown to illustrate its capability. Comments on extending the methodology to 3-D are also given.
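
    The transfinite interpolation that FASTWO borrows from can be sketched compactly: an interior point of the unit parameter square is a blend of the four boundary curves minus a bilinear corner correction. This is the textbook TFI formula, not FASTWO's full patched-composite machinery; the boundary curves below are illustrative:

    ```python
    # Transfinite interpolation on the unit square:
    # P(u,v) = (1-v)B(u) + vT(u) + (1-u)L(v) + uR(v) - bilinear corner blend.
    def tfi(u, v, bottom, top, left, right):
        """bottom/top map u -> (x, y); left/right map v -> (x, y)."""
        c00, c10 = bottom(0.0), bottom(1.0)   # corners shared by the curves
        c01, c11 = top(0.0), top(1.0)
        pt = []
        for k in range(2):  # x and y components
            val = ((1 - v) * bottom(u)[k] + v * top(u)[k]
                   + (1 - u) * left(v)[k] + u * right(v)[k]
                   - ((1 - u) * (1 - v) * c00[k] + u * (1 - v) * c10[k]
                      + (1 - u) * v * c01[k] + u * v * c11[k]))
            pt.append(val)
        return tuple(pt)

    # Straight-sided unit square: TFI reduces to bilinear interpolation
    p = tfi(0.25, 0.5,
            bottom=lambda u: (u, 0.0), top=lambda u: (u, 1.0),
            left=lambda v: (0.0, v), right=lambda v: (1.0, v))
    ```

    Sweeping u and v over a lattice of values yields a structured grid that follows curved boundaries when the four curve functions are nontrivial.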

  2. [3D display of sequential 2D medical images].

    PubMed

    Lu, Yisong; Chen, Yazhu

    2003-12-01

    A detailed review is given in this paper of various current 3D display methods for sequential 2D medical images, along with new developments in 3D medical image display. True 3D display, surface rendering, volume rendering, 3D texture mapping, and distributed collaborative rendering are discussed in depth. Different 3D display methods are presented for two kinds of medical applications: real-time navigation systems and high-fidelity diagnosis in computer-aided surgery.

  3. 2D molybdenum disulphide (2D-MoS2) modified electrodes explored towards the oxygen reduction reaction

    NASA Astrophysics Data System (ADS)

    Rowley-Neale, Samuel J.; Fearn, Jamie M.; Brownson, Dale A. C.; Smith, Graham C.; Ji, Xiaobo; Banks, Craig E.

    2016-08-01

    Two-dimensional molybdenum disulphide nanosheets (2D-MoS2) have proven to be an effective electrocatalyst, with particular attention being focused on their use towards increasing the efficiency of the reactions associated with hydrogen fuel cells. Whilst the majority of research has focused on the Hydrogen Evolution Reaction (HER), herein we explore the use of 2D-MoS2 as a potential electrocatalyst for the much less researched Oxygen Reduction Reaction (ORR). We stray from literature conventions and perform experiments in 0.1 M H2SO4 acidic electrolyte for the first time, evaluating the electrochemical performance of the ORR with 2D-MoS2 electrically wired/immobilised upon several carbon based electrodes (namely; Boron Doped Diamond (BDD), Edge Plane Pyrolytic Graphite (EPPG), Glassy Carbon (GC) and Screen-Printed Electrodes (SPE)) whilst exploring a range of 2D-MoS2 coverages/masses. Consequently, the findings of this study are highly applicable to real world fuel cell applications. We show that significant improvements in ORR activity can be achieved through the careful selection of the underlying/supporting carbon materials that electrically wire the 2D-MoS2 and utilisation of an optimal mass of 2D-MoS2. The ORR onset is observed to be reduced to ca. +0.10 V for EPPG, GC and SPEs at 2D-MoS2 (1524 ng cm-2 modification), which is far closer to Pt at +0.46 V compared to bare/unmodified EPPG, GC and SPE counterparts. This report is the first to demonstrate such beneficial electrochemical responses in acidic conditions using a 2D-MoS2 based electrocatalyst material on a carbon-based substrate (SPEs in this case). Investigation of the beneficial reaction mechanism reveals the ORR to occur via a 4 electron process in specific conditions; elsewhere a 2 electron process is observed. This work offers valuable insights for those wishing to design, fabricate and/or electrochemically test 2D-nanosheet materials towards the ORR.

  4. Matrix models of 2d gravity

    SciTech Connect

    Ginsparg, P.

    1991-01-01

    These are introductory lectures for a general audience that give an overview of the subject of matrix models and their application to random surfaces, 2d gravity, and string theory. They are intentionally 1.5 years out of date.

  5. Matrix models of 2d gravity

    SciTech Connect

    Ginsparg, P.

    1991-12-31

    These are introductory lectures for a general audience that give an overview of the subject of matrix models and their application to random surfaces, 2d gravity, and string theory. They are intentionally 1.5 years out of date.

  6. Brittle damage models in DYNA2D

    SciTech Connect

    Faux, D.R.

    1997-09-01

    DYNA2D is an explicit Lagrangian finite element code used to model dynamic events where stress wave interactions influence the overall response of the system. DYNA2D is often used to model penetration problems involving ductile-to-ductile impacts; however, with the advent of the use of ceramics in the armor-anti-armor community and the need to model damage to laser optics components, good brittle damage models are now needed in DYNA2D. This report will detail the implementation of four brittle damage models in DYNA2D, three scalar damage models and one tensor damage model. These new brittle damage models are then used to predict experimental results from three distinctly different glass damage problems.
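
    The common structure of scalar damage models can be illustrated in a few lines: a damage variable D in [0, 1] degrades the elastic stress. This is a generic textbook sketch with invented material constants, not any of the four DYNA2D models:

    ```python
    # Generic scalar-damage sketch: sigma = (1 - D) * E * eps, with D growing
    # linearly once the strain exceeds a damage-initiation threshold.
    def damaged_stress(eps, E=70.0e9, eps0=0.001, eps_fail=0.003):
        """E: Young's modulus [Pa]; eps0/eps_fail: initiation/failure strains."""
        if eps <= eps0:
            d = 0.0                                   # undamaged, fully elastic
        else:
            d = min(1.0, (eps - eps0) / (eps_fail - eps0))
        return (1.0 - d) * E * eps, d

    sigma_elastic, d0 = damaged_stress(0.0005)  # below threshold: no damage
    sigma_damaged, d1 = damaged_stress(0.002)   # halfway to failure: D = 0.5
    ```

    In an explicit code such as DYNA2D, an evolution law of this kind would be evaluated element by element at each time step; tensor damage models replace the single scalar D with a directional damage measure.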

  7. 2D/3D switchable displays

    NASA Astrophysics Data System (ADS)

    Dekker, T.; de Zwart, S. T.; Willemsen, O. H.; Hiddink, M. G. H.; IJzerman, W. L.

    2006-02-01

    A prerequisite for a wide market acceptance of 3D displays is the ability to switch between 3D and full resolution 2D. In this paper we present a robust and cost effective concept for an auto-stereoscopic switchable 2D/3D display. The display is based on an LCD panel, equipped with switchable LC-filled lenticular lenses. We will discuss 3D image quality, with the focus on display uniformity. We show that slanting the lenticulars in combination with a good lens design can minimize non-uniformities in our 20" 2D/3D monitors. Furthermore, we introduce fractional viewing systems as a very robust concept to further improve uniformity in the case slanting the lenticulars and optimizing the lens design are not sufficient. We will discuss measurements and numerical simulations of the key optical characteristics of this display. Finally, we discuss 2D image quality, the switching characteristics and the residual lens effect.

  8. Real-time 2-D temperature imaging using ultrasound.

    PubMed

    Liu, Dalong; Ebbini, Emad S

    2010-01-01

    We have previously introduced methods for noninvasive estimation of temperature change using diagnostic ultrasound. The basic principle was validated both in vitro and in vivo by several groups worldwide. Some limitations remain, however, that have prevented these methods from being adopted in the monitoring and guidance of minimally invasive thermal therapies, e.g., RF ablation and high-intensity focused ultrasound (HIFU). In this letter, we present first results from a real-time system for 2-D imaging of temperature change using pulse-echo ultrasound. The front end of the system is a commercially available scanner equipped with a research interface, which allows control of the imaging sequence and access to the RF data in real time. A high-frame-rate 2-D RF acquisition mode, M2D, is used to capture the transients of tissue motion/deformation in response to pulsed HIFU. The M2D RF data is streamed to the back end of the system, where a 2-D temperature imaging algorithm based on speckle tracking is implemented on a graphics processing unit. The real-time images of temperature change are computed on the same spatial and temporal grid as the M2D RF data, i.e., with no decimation. Verification of the algorithm was performed by monitoring localized HIFU-induced heating of a tissue-mimicking elastography phantom. These results clearly demonstrate the repeatability and sensitivity of the algorithm. Furthermore, we present in vitro results demonstrating the possible use of this algorithm for imaging changes in tissue parameters due to HIFU-induced lesions. These results clearly demonstrate the value of real-time data streaming and processing in the monitoring and guidance of minimally invasive thermotherapy.
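
    The displacement-estimation core of speckle tracking can be sketched in one dimension: find the lag that maximizes the cross-correlation between a reference RF line and a post-heating line. Real 2-D temperature imaging adds subsample interpolation, 2-D search windows, and a shift-to-temperature calibration; the signals below are toy values:

    ```python
    # 1-D speckle-tracking sketch: integer-lag cross-correlation search.
    def best_lag(ref, shifted, max_lag):
        def score(lag):
            pairs = [(ref[i], shifted[i + lag]) for i in range(len(ref))
                     if 0 <= i + lag < len(shifted)]
            return sum(a * b for a, b in pairs)
        return max(range(-max_lag, max_lag + 1), key=score)

    ref = [0, 1, 3, 1, 0, 0, 0, 0]
    echo = [0, 0, 0, 1, 3, 1, 0, 0]   # same speckle pattern delayed 2 samples
    lag = best_lag(ref, echo, max_lag=3)
    ```

    The estimated echo shift (here, 2 samples) is what a thermal-strain method would then map to a local temperature change.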

  9. Chemical Approaches to 2D Materials.

    PubMed

    Samorì, Paolo; Palermo, Vincenzo; Feng, Xinliang

    2016-08-01

    Chemistry plays an ever-increasing role in the production, functionalization, processing and applications of graphene and other 2D materials. This special issue highlights a selection of enlightening chemical approaches to 2D materials, which nicely reflect the breadth of the field and convey the excitement of the individuals involved in it, who are trying to translate graphene and related materials from the laboratory into a real, high-impact technology.

  10. A Malaria Diagnostic Tool Based on Computer Vision Screening and Visualization of Plasmodium falciparum Candidate Areas in Digitized Blood Smears

    PubMed Central

    Walliander, Margarita; Mårtensson, Andreas; Diwan, Vinod; Rahtu, Esa; Pietikäinen, Matti; Lundin, Mikael; Lundin, Johan

    2014-01-01

    Introduction: Microscopy is the gold standard for diagnosis of malaria; however, manual evaluation of blood films is highly dependent on skilled personnel in a time-consuming, error-prone and repetitive process. In this study we propose a method using computer vision detection and visualization of only the diagnostically most relevant sample regions in digitized blood smears. Methods: Giemsa-stained thin blood films with P. falciparum ring-stage trophozoites (n = 27) and uninfected controls (n = 20) were digitally scanned with an oil immersion objective (0.1 µm/pixel) to capture approximately 50,000 erythrocytes per sample. Parasite candidate regions were identified based on color and object size, followed by extraction of image features (local binary patterns, local contrast and scale-invariant feature transform descriptors) used as input to a support vector machine classifier. The classifier was trained on digital slides from ten patients and validated on six samples. Results: The diagnostic accuracy was tested on 31 samples (19 infected and 12 controls). From each digitized area of a blood smear, a panel with the 128 most probable parasite candidate regions was generated. Two expert microscopists were asked to visually inspect the panel on a tablet computer and to judge whether the patient was infected with P. falciparum. Using the diagnostic tool, the two readers achieved diagnostic sensitivities of 95% and 90%, respectively, each with a specificity of 100%. Parasitemia was separately calculated by the automated system, and the correlation coefficient between manual and automated parasitemia counts was 0.97. Conclusion: We developed a decision support system for detecting malaria parasites using a computer vision algorithm combined with visualization of sample areas with the highest probability of malaria infection. The system provides a novel method for blood smear screening with a significantly reduced need for visual examination and
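
    One of the features named above, the local binary pattern (LBP), is easy to sketch: each of a pixel's eight neighbours contributes one bit, set when the neighbour is at least as bright as the centre. A plain-Python toy on an invented 3x3 patch (bit ordering is a convention; production code would use a library such as scikit-image):

    ```python
    # 8-neighbour LBP code of the pixel at (r, c) in a 2-D list of intensities.
    def lbp_code(img, r, c):
        center = img[r][c]
        # clockwise from top-left; neighbour >= center sets that bit
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                   (1, 1), (1, 0), (1, -1), (0, -1)]
        code = 0
        for bit, (dr, dc) in enumerate(offsets):
            if img[r + dr][c + dc] >= center:
                code |= 1 << bit
        return code

    patch = [[9, 9, 9],
             [1, 5, 1],
             [1, 1, 1]]   # bright top row -> only the first three bits set
    code = lbp_code(patch, 1, 1)
    ```

    Histograms of such codes over a candidate region form the texture descriptor fed to the classifier.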

  11. 2D/3D Image Registration using Regression Learning

    PubMed Central

    Chou, Chen-Rui; Frederick, Brandon; Mageras, Gig; Chang, Sha; Pizer, Stephen

    2013-01-01

    In computer vision and image analysis, image registration between 2D projections and a 3D image that achieves high accuracy and near real-time computation is challenging. In this paper, we propose a novel method that can rapidly detect an object's 3D rigid motion or deformation from a 2D projection image or a small set thereof. The method is called CLARET (Correction via Limited-Angle Residues in External Beam Therapy) and consists of two stages: registration preceded by shape space and regression learning. In the registration stage, linear operators are used to iteratively estimate the motion/deformation parameters based on the current intensity residue between the target projection(s) and the digitally reconstructed radiograph(s) (DRRs) of the estimated 3D image. The method determines the linear operators via a two-step learning process. First, it builds a low-order parametric model of the image region's motion/deformation shape space from its prior 3D images. Second, using learning-time samples produced from the 3D images, it formulates the relationships between the model parameters and the co-varying 2D projection intensity residues by multi-scale linear regressions. The calculated multi-scale regression matrices yield the coarse-to-fine linear operators used in estimating the model parameters from the 2D projection intensity residues in the registration. The method's application to Image-guided Radiation Therapy (IGRT) requires only a few seconds and yields good results in localizing a tumor under rigid motion in the head and neck and under respiratory deformation in the lung, using one treatment-time imaging 2D projection or a small set thereof. PMID:24058278
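
    The learning step can be reduced to a toy version: fit a linear operator that maps projection intensity residues to motion parameters from training samples, then apply it to a new residue. CLARET does this per scale and iterates during registration; this single-scale, single-step sketch uses purely synthetic data:

    ```python
    # Learn a residue -> parameter linear operator by least squares.
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(6, 2))          # hidden map: parameters -> residues
    P_train = rng.normal(size=(50, 2))   # training motion parameters
    X_train = P_train @ A.T              # corresponding projection residues

    # W solves min ||X_train @ W - P_train||: the learned linear operator
    W, *_ = np.linalg.lstsq(X_train, P_train, rcond=None)

    p_true = np.array([0.5, -1.0])       # unseen motion
    p_est = (A @ p_true) @ W             # one-step estimate from its residue
    ```

    Because the synthetic forward map is exactly linear, one application of W recovers the parameters; with real image residues the operator is only locally valid, which is why the method iterates from coarse to fine scales.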

  12. Gold silver alloy nanoparticles (GSAN): an imaging probe for breast cancer screening with dual-energy mammography or computed tomography

    NASA Astrophysics Data System (ADS)

    Naha, Pratap C.; Lau, Kristen C.; Hsu, Jessica C.; Hajfathalian, Maryam; Mian, Shaameen; Chhour, Peter; Uppuluri, Lahari; McDonald, Elizabeth S.; Maidment, Andrew D. A.; Cormode, David P.

    2016-07-01

    Earlier detection of breast cancer reduces mortality from this disease. As a result, the development of better screening techniques is a topic of intense interest. Contrast-enhanced dual-energy mammography (DEM) is a novel technique that has improved sensitivity for cancer detection. However, the development of contrast agents for this technique is in its infancy. We herein report gold-silver alloy nanoparticles (GSAN) that have potent DEM contrast properties and improved biocompatibility. GSAN formulations containing a range of gold : silver ratios and capped with m-PEG were synthesized and characterized using various analytical methods. DEM and computed tomography (CT) phantom imaging showed that GSAN produced robust contrast that was comparable to silver alone. Cell viability, reactive oxygen species generation and DNA damage results revealed that the formulations with 30% or higher gold content are cytocompatible to Hep G2 and J774A.1 cells. In vivo imaging was performed in mice with and without breast tumors. The results showed that GSAN produce strong DEM and CT contrast and accumulated in tumors. Furthermore, both in vivo imaging and ex vivo analysis indicated the excretion of GSAN via both urine and feces. In summary, GSAN produce strong DEM and CT contrast and have potential for both blood pool imaging and for breast cancer screening.

  13. Effect of screen-based computer simulation on knowledge and skill in nursing students' learning of preoperative and postoperative care management: a randomized controlled study.

    PubMed

    Durmaz, Aylin; Dicle, Aklime; Cakan, Emre; Cakir, Şen

    2012-04-01

    Screen-based computer simulations are considered a method of skill teaching in health education. This study examined the effect of screen-based computer simulation on knowledge, skill, and the clinical decision-making process in teaching preoperative and postoperative care management to second-year students in an undergraduate school of nursing. It is a randomized controlled study. The study sample was composed of 82 students, who received education in screen-based computer simulation (n = 41) or skill laboratories (n = 41). Three instruments were used: a preoperative and postoperative care management cognitive level assessment test, skill checklists for preoperative and postoperative care management, and the Clinical Decision Making in Nursing Scale. There was no significant difference between the students' post-education knowledge levels (P = .421), skills in providing deep breathing and coughing exercise education (P = .867), or clinical decision-making scale total and subscale scores (P = .065). However, a significant difference was found in the students' scores for the skill of admitting the patient to the surgical clinic after surgery (P = .04). Education provided in the screen-based computer simulation laboratory was equivalent to that provided in the skill laboratory.

  14. 2D Orthogonal Locality Preserving Projection for Image Denoising.

    PubMed

    Shikkenawis, Gitam; Mitra, Suman K

    2016-01-01

    Sparse representations using transform-domain techniques are widely used for better interpretation of the raw data. Orthogonal locality preserving projection (OLPP) is a linear technique that tries to preserve the local structure of data in the transform domain as well. The vectorized nature of OLPP requires high-dimensional data to be converted to vector format, and hence may lose the spatial neighborhood information of the raw data. On the other hand, processing 2D data directly not only preserves spatial information but also improves computational efficiency considerably. The 2D OLPP is expected to learn the transformation from 2D data itself. This paper derives the mathematical foundation for 2D OLPP. The proposed technique is used for the image denoising task. Recent state-of-the-art approaches for image denoising work on two major hypotheses, i.e., non-local self-similarity and sparse linear approximations of the data. The locality preserving nature of the proposed approach automatically takes care of the self-similarity present in the image while inferring a sparse basis. A global basis is adequate for the entire image. The proposed approach outperforms several state-of-the-art image denoising approaches for gray-scale, color, and texture images.

  15. ELLIPT2D: A Flexible Finite Element Code Written in Python

    SciTech Connect

    Pletzer, A.; Mollis, J.C.

    2001-03-22

    The use of the Python scripting language for scientific applications, and in particular to solve partial differential equations, is explored. It is shown that Python's rich data structures and object-oriented features can be exploited to write programs that are not only significantly more concise than their counterparts written in Fortran, C, or C++, but are also numerically efficient. To illustrate this, a two-dimensional finite element code (ELLIPT2D) has been written. ELLIPT2D provides a flexible and easy-to-use framework for solving a large class of second-order elliptic problems. The program allows for structured or unstructured meshes. All functions defining the elliptic operator are user supplied, and so are the boundary conditions, which can be of Dirichlet, Neumann, or Robbins type. ELLIPT2D makes extensive use of dictionaries (hash tables) as a way to represent sparse matrices. Other key features of the Python language that have been widely used include operator overloading, error handling, array slicing, and the Tkinter module for building graphical user interfaces. As an example of the utility of ELLIPT2D, a nonlinear solution of the Grad-Shafranov equation is computed using a Newton iterative scheme. A second application focuses on a solution of the toroidal Laplace equation coupled to a magnetohydrodynamic stability code, a problem arising in the context of magnetic fusion research.
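
    The dictionary-as-sparse-matrix idea the abstract mentions can be sketched in a few lines: keys are (row, col) index pairs, values are the nonzero entries, and a matrix-vector product just walks the dictionary. This is an illustrative sketch of the idea, not ELLIPT2D's actual class:

    ```python
    # Sparse matrix as a dict {(i, j): a_ij}; y = A @ x by iterating nonzeros.
    def matvec(sparse, x):
        y = [0.0] * len(x)
        for (i, j), a_ij in sparse.items():
            y[i] += a_ij * x[j]
        return y

    # 1-D Laplacian stencil [-1, 2, -1] on 4 unknowns, stored sparsely
    A = {(i, i): 2.0 for i in range(4)}
    A.update({(i, i + 1): -1.0 for i in range(3)})
    A.update({(i + 1, i): -1.0 for i in range(3)})
    y = matvec(A, [1.0, 1.0, 1.0, 1.0])
    ```

    Storage and work scale with the number of nonzeros rather than with n squared, which is what makes the hash-table representation attractive for finite element assembly.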

  16. Evaluating Computer Screen Time and Its Possible Link to Psychopathology in the Context of Age: A Cross-Sectional Study of Parents and Children

    PubMed Central

    Ross, Sharon; Silman, Zmira; Maoz, Hagai; Bloch, Yuval

    2015-01-01

    Background Several studies have suggested that high levels of computer use are linked to psychopathology. However, there is ambiguity about what should be considered normal or over-use of computers. Furthermore, the nature of the link between computer usage and psychopathology is controversial. The current study utilized the context of age to address these questions. Our hypothesis was that the context of age will be paramount for differentiating normal from excessive use, and that this context will allow a better understanding of the link to psychopathology. Methods In a cross-sectional study, 185 parents and children aged 3–18 years were recruited in clinical and community settings. They were asked to fill out questionnaires regarding demographics, functional and academic variables, computer use as well as psychiatric screening questionnaires. Using a regression model, we identified 3 groups of normal-use, over-use and under-use and examined known factors as putative differentiators between the over-users and the other groups. Results After modeling computer screen time according to age, factors linked to over-use were: decreased socialization (OR 3.24, Confidence interval [CI] 1.23–8.55, p = 0.018), difficulty to disengage from the computer (OR 1.56, CI 1.07–2.28, p = 0.022) and age, though borderline-significant (OR 1.1 each year, CI 0.99–1.22, p = 0.058). While psychopathology was not linked to over-use, post-hoc analysis revealed that the link between increased computer screen time and psychopathology was age-dependent and solidified as age progressed (p = 0.007). Unlike computer usage, the use of small-screens and smartphones was not associated with psychopathology. Conclusions The results suggest that computer screen time follows an age-based course. We conclude that differentiating normal from over-use as well as defining over-use as a possible marker for psychiatric difficulties must be performed within the context of age. If verified by

  17. Computational Screening of Tip and Stalk Cell Behavior Proposes a Role for Apelin Signaling in Sprout Progression

    PubMed Central

    Palm, Margriet M.; Dallinga, Marchien G.; Klaassen, Ingeborg; Schlingemann, Reinier O.

    2016-01-01

    Angiogenesis involves the formation of new blood vessels by sprouting or splitting of existing blood vessels. During sprouting, a highly motile type of endothelial cell, called the tip cell, migrates from the blood vessels followed by stalk cells, an endothelial cell type that forms the body of the sprout. To gain more insight into how tip cells contribute to angiogenesis, we extended an existing computational model of vascular network formation based on the cellular Potts model with tip and stalk differentiation, without making a priori assumptions about the differences between tip cells and stalk cells. To predict potential differences, we looked for parameter values that make tip cells (a) move to the sprout tip, and (b) change the morphology of the angiogenic networks. The screening predicted that if tip cells respond less effectively to an endothelial chemoattractant than stalk cells, they move to the tips of the sprouts, which impacts the morphology of the networks. A comparison of this model prediction with genes expressed differentially in tip and stalk cells revealed that the endothelial chemoattractant Apelin and its receptor APJ may match the model prediction. To test the model prediction we inhibited Apelin signaling in our model and in an in vitro model of angiogenic sprouting, and found that in both cases inhibition of Apelin or of its receptor APJ reduces sprouting. Based on the prediction of the computational model, we propose that the differential expression of Apelin and APJ yields a “self-generated” gradient mechanism that accelerates the extension of the sprout. PMID:27828952
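
The proposed "self-generated" gradient can be caricatured in one dimension: a tip cell secretes a decaying chemoattractant (standing in for Apelin), stalk cells chemotax up the local gradient and push the tip ahead of them, so the gradient travels with the sprout. Every parameter below is invented for illustration; the paper's actual model is a cellular Potts model.

```python
import math

# 1D caricature of a self-generated gradient (all values invented):
LAM = 5.0       # decay length of the secreted factor
K_STALK = 1.0   # chemotactic sensitivity of stalk cells
K_TIP = 0.05    # tip's own migration; tip cells respond less (model prediction)

def gradient(x, tip):
    """Gradient at x of C(x) = exp(-(tip - x)/LAM), secreted at the tip."""
    return math.exp(-(tip - x) / LAM) / LAM

def simulate(steps=100, dt=0.1):
    tip, stalk = 1.0, 0.0
    for _ in range(steps):
        stalk += K_STALK * gradient(stalk, tip) * dt  # stalk chemotaxes
        tip = max(tip, stalk + 1.0)   # stalk pushes the tip ahead of it
        tip += K_TIP * dt             # tip's own (weaker) migration
    return tip

# Chemotaxing stalk cells extend the sprout well beyond what the
# tip's own migration (1.0 + K_TIP * dt * steps = 1.5) would give.
print(f"sprout tip position: {simulate():.2f}")
```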

  18. Differential patterns of 2D location versus depth decoding along the visual hierarchy.

    PubMed

    Finlayson, Nonie J; Zhang, Xiaoli; Golomb, Julie D

    2017-02-15

    Visual information is initially represented as 2D images on the retina, but our brains are able to transform this input to perceive our rich 3D environment. While many studies have explored 2D spatial representations or depth perception in isolation, it remains unknown if or how these processes interact in human visual cortex. Here we used functional MRI and multi-voxel pattern analysis to investigate the relationship between 2D location and position-in-depth information. We stimulated different 3D locations in a blocked design: each location was defined by horizontal, vertical, and depth position. Participants remained fixated at the center of the screen while passively viewing the peripheral stimuli with red/green anaglyph glasses. Our results revealed a widespread, systematic transition throughout visual cortex. As expected, 2D location information (horizontal and vertical) could be strongly decoded in early visual areas, with reduced decoding higher along the visual hierarchy, consistent with known changes in receptive field sizes. Critically, we found that the decoding of position-in-depth information tracked inversely with the 2D location pattern, with the magnitude of depth decoding gradually increasing from intermediate to higher visual and category regions. Representations of 2D location information became increasingly location-tolerant in later areas, where depth information was also tolerant to changes in 2D location. We propose that spatial representations gradually transition from 2D-dominant to balanced 3D (2D and depth) along the visual hierarchy.
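
Multi-voxel pattern decoding of this kind can be sketched with a toy nearest-centroid classifier on synthetic "voxel" patterns (the study used fMRI data and standard MVPA tooling; everything below is simulated):

```python
import random

random.seed(0)
N_VOX, N_TRIALS = 50, 40  # voxels per pattern, trials per condition

# Each condition (e.g. two 2D locations) has a fixed mean voxel
# pattern plus independent trial-by-trial noise.
means = {c: [random.gauss(0, 1) for _ in range(N_VOX)] for c in ("left", "right")}

def trial(cond):
    return [m + random.gauss(0, 1) for m in means[cond]]

data = [(trial(c), c) for c in ("left", "right") for _ in range(N_TRIALS)]
random.shuffle(data)
train, test = data[:40], data[40:]

def centroid(patterns):
    return [sum(col) / len(col) for col in zip(*patterns)]

cents = {c: centroid([p for p, lab in train if lab == c]) for c in ("left", "right")}

def classify(p):
    # assign to the class whose training centroid is closest
    return min(cents, key=lambda c: sum((a - b) ** 2 for a, b in zip(p, cents[c])))

acc = sum(classify(p) == lab for p, lab in test) / len(test)
print(f"decoding accuracy: {acc:.2f}")
```

Above-chance accuracy on held-out trials is the operational meaning of "information could be decoded" in the abstract.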

  19. Targeting multiple types of tumors using NKG2D-coated iron oxide nanoparticles

    NASA Astrophysics Data System (ADS)

    Wu, Ming-Ru; Cook, W. James; Zhang, Tong; Sentman, Charles L.

    2014-11-01

    Iron oxide nanoparticles (IONPs) hold great potential for cancer therapy. Actively targeting IONPs to tumor cells can further increase therapeutic efficacy and decrease off-target side effects. To target tumor cells, a natural killer (NK) cell activating receptor, NKG2D, was utilized to develop pan-tumor targeting IONPs. NKG2D ligands are expressed on many tumor types but are not found on most normal tissues under steady-state conditions. The data showed that mouse and human fragment crystallizable (Fc)-fusion NKG2D (Fc-NKG2D) coated IONPs (NKG2D/NPs) can target multiple NKG2D ligand-positive tumor types in vitro in a dose-dependent manner by magnetic cell sorting. The tumor-targeting effect was robust even at a very low tumor cell to normal cell ratio, and targeting efficiency correlated with the NKG2D ligand expression level on tumor cells. Furthermore, the magnetic separation platform utilized to test NKG2D/NP specificity has the potential to be developed into high-throughput screening strategies to identify ideal fusion proteins or antibodies for targeting IONPs. In conclusion, NKG2D/NPs can be used to target multiple tumor types, and the magnetic separation platform can facilitate the proof-of-concept phase of tumor-targeting IONP development.

  20. A scanning-mode 2D shear wave imaging (s2D-SWI) system for ultrasound elastography.

    PubMed

    Qiu, Weibao; Wang, Congzhi; Li, Yongchuan; Zhou, Juan; Yang, Ge; Xiao, Yang; Feng, Ge; Jin, Qiaofeng; Mu, Peitian; Qian, Ming; Zheng, Hairong

    2015-09-01

    Ultrasound elastography is widely used for the non-invasive measurement of tissue elasticity properties. Shear wave imaging (SWI) is a quantitative method for assessing tissue stiffness. SWI has been demonstrated to be less operator-dependent than quasi-static elastography, and it can acquire quantitative elasticity information, in contrast with acoustic radiation force impulse (ARFI) imaging. However, traditional SWI implementations cannot acquire two-dimensional (2D) quantitative images of the tissue elasticity distribution. This study proposes and evaluates a scanning-mode 2D SWI (s2D-SWI) system. The hardware and image processing algorithms are presented in detail. Programmable devices are used to support flexible control of the system and the image processing algorithms. An analytic-signal-based cross-correlation method and a Radon-transformation-based shear wave speed determination method are proposed, both of which can be implemented using parallel computation. Imaging of tissue-mimicking phantoms, as well as in vitro and in vivo imaging tests, was conducted to demonstrate the performance of the proposed system. The s2D-SWI system represents a new choice for the quantitative mapping of tissue elasticity, and has great potential for implementation in commercial ultrasound scanners.
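
The time-of-flight idea behind shear wave speed estimation can be sketched with a plain cross-correlation between displacement traces at two lateral tracking positions (the paper's system uses an analytic-signal cross-correlation plus a Radon-transform speed fit; this is a simplified stand-in with synthetic signals):

```python
import math

FS = 10_000.0     # sampling rate (Hz)
DX = 0.004        # spacing between the two tracking positions (m)
TRUE_SPEED = 2.0  # ground-truth shear wave speed (m/s)

def pulse(t):     # smooth displacement pulse passing a tracking position
    return math.exp(-(((t - 0.005) / 0.001) ** 2))

n = 200
s1 = [pulse(i / FS) for i in range(n)]                    # position 1
s2 = [pulse(i / FS - DX / TRUE_SPEED) for i in range(n)]  # position 2, delayed

def xcorr_lag(a, b):
    """Non-negative lag (in samples) maximizing sum(a[i] * b[i + lag])."""
    best_lag, best = 0, float("-inf")
    for lag in range(len(a)):
        v = sum(a[i] * b[i + lag] for i in range(len(a) - lag))
        if v > best:
            best, best_lag = v, lag
    return best_lag

lag = xcorr_lag(s1, s2)
speed = DX / (lag / FS)   # distance over time-of-flight
print(f"estimated shear wave speed: {speed:.2f} m/s")
```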

  1. Hippocampal lesions in rats impair learning and memory for locations on a touch-sensitive computer screen: the "ASAT" task.

    PubMed

    Talpos, J C; Dias, R; Bussey, T J; Saksida, L M

    2008-10-10

    It has been repeatedly demonstrated across species that the hippocampus is critical for spatial learning and memory. Consequently, numerous paradigms have been created to study spatial learning in the rodent. Most of these tasks, such as the Morris water maze, 8-arm radial maze, and T-maze, are non-automated procedures. It was our goal to create an automated task in the rodent that is quickly learned, hippocampal-dependent, and minimizes the confounding variables present in most tests measuring hippocampal-dependent learning and memory. To accomplish this, we created a novel search task using a standard operant box fitted with a touch-sensitive computer monitor. Subjects were required to locate an S+ "hidden" amongst other identical stimuli on the monitor. In two versions of the task the S+ stayed in the same location within a session but shifted location between sessions. In a third version of the task the S+ was moved to a new location after every 10 trials. It was found that the location of the S+ was quickly acquired each day (within 10 trials), and that the hippocampal-lesion group was impaired when compared to their control cohort. With the benefits inherent in automation, these tasks confer significant advantages over traditional tasks used to study spatial learning and memory in the rodent. When combined with previously developed non-spatial cognitive tests that can also be run in the touch-screen apparatus, the result is a powerful cognitive test battery for the rodent.

  2. Beneficial Effects of Combining Computed Tomography Enteroclysis/Enterography with Capsule Endoscopy for Screening Tumor Lesions in the Small Intestine

    PubMed Central

    Shibata, Hiroaki; Hashimoto, Shinichi; Shimizu, Kensaku; Kawasato, Ryo; Shirasawa, Tomohiro; Yokota, Takayuki; Onoda, Hideko; Okamoto, Takeshi; Matsunaga, Naofumi; Sakaida, Isao

    2015-01-01

    Aim. To compare the efficacy of using computed tomography enteroclysis/enterography (CTE), capsule endoscopy (CE), and CTE with CE for diagnosing tumor lesions in the small intestine. Materials and Methods. We included 98 patients who underwent CE during the observation period and were subjected to CTE at our hospital from April 2008 to May 2014. Results. CTE had a significantly higher sensitivity than CE (84.6% versus 46.2%, P = 0.039), but there were no significant differences in specificity, positive or negative predictive values, or diagnostic accuracy rates. The sensitivity of CTE/CE was 100%, again significantly higher than that of CE (P = 0.002). The difference in specificity between CTE/CE and CE was not significant, but there were significant differences in positive predictive values (100% for CTE/CE versus 66.7% for CE, P = 0.012), negative predictive values (100% versus 92.1%, P = 0.008), and diagnostic accuracy rate (100% versus 89.8%, P = 0.001). The diagnostic accuracy rate was also significantly higher in CTE/CE versus CTE (100% versus 95.9%, P = 0.043). Conclusion. Our findings suggested that a combination of CTE and CE was useful for screening tumor lesions in the small intestine. This trial is registered with number UMIN000016154. PMID:25792979
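
The sensitivity, specificity, and predictive values reported here all derive from a 2x2 diagnostic table. A minimal helper (the counts below are hypothetical, merely chosen so the sensitivity lands near the reported 84.6% for CTE; the trial's actual tables are not reconstructed here):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV, and accuracy from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp) if tp + fp else float("nan"),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts for illustration: 11 of 13 tumors detected.
m = diagnostic_metrics(tp=11, fp=0, fn=2, tn=85)
print({k: round(v, 3) for k, v in m.items()})
```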

  3. Computer-aided detection of masses in full-field digital mammography using screen-film mammograms for training.

    PubMed

    Kallenberg, Michiel; Karssemeijer, Nico

    2008-12-07

    It would be of great value if available databases of screen-film mammography (SFM) images could be used to train full-field digital mammography (FFDM) computer-aided detection (CAD) systems, as compilation of new databases is costly. In this paper, we investigate this possibility. Firstly, we develop a method that converts an FFDM image into an SFM-like representation. In this conversion method, we establish a relation between exposure and optical density by simulation of an automatic exposure control unit. Secondly, we investigate the effects of using the SFM images as training samples compared to training with FFDM images. Our FFDM database consisted of 266 cases, of which 102 were biopsy-proven malignant masses and 164 normals. The images were acquired with systems of two different manufacturers. We found that, when we trained our FFDM CAD system with a small number of images, training with FFDM images, using a five-fold cross-validation procedure, outperformed training with SFM images. However, when the full SFM database, consisting of 348 abnormal cases (including 204 priors) and 810 normal cases, was used for training, SFM training outperformed FFDM training. These results show that an existing CAD system for detection of masses in SFM can be used for FFDM images without retraining.
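
The exposure-to-optical-density relation underlying such a conversion is conventionally described by a sigmoidal film characteristic (H&D) curve. A generic sketch follows; the parameters are illustrative, and the paper's method additionally simulates an automatic exposure control unit rather than assuming a fixed curve:

```python
import math

def film_od(exposure, d_min=0.2, d_max=3.0, slope=5.0, e_mid=1.0):
    """Map relative exposure to optical density via a sigmoidal
    characteristic (H&D) curve: flat toe, linear mid-section, flat
    shoulder. All parameter values here are illustrative only."""
    x = math.log10(exposure / e_mid)
    return d_min + (d_max - d_min) / (1 + math.exp(-slope * x))

for e in (0.1, 1.0, 10.0):   # toe, mid-gray, shoulder of the curve
    print(f"exposure {e:5.1f} -> OD {film_od(e):.2f}")
```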

  4. Novel inhibitor discovery against aromatase through virtual screening and molecular dynamic simulation: a computational approach in drug design.

    PubMed

    Mirzaie, Sako; Chupani, Latifeh; Asadabadi, Ebrahim Barzegari; Shahverdi, Ahmad Reza; Jamalan, Mostafa

    2013-01-01

    Inhibition of aromatase (CYP450), a key enzyme in estrogen biosynthesis, could result in regression of estrogen-dependent tumors and even prevent the promotion of breast cancer. Although potent steroidal and non-steroidal aromatase inhibitors are available today, isoflavanone derivatives, as natural compounds with the fewest side effects, have been described as candidates for a new generation of aromatase inhibitors. Compound 2a, an isoflavanone derivative synthesized by Bonfield et al. (2012[7]), is the most potent such inhibitor of aromatase. In our computational study, this compound was used as the template for virtual screening. Among 286 compounds selected for 70 % structural similarity to 2a, 150 showed lower docking energy in comparison with 2a. Compound 2a_1, with 11.2 kcal/mol, had the lowest docking energy. The interaction of 2a_1 with aromatase was further investigated and compared with 2a and androstenedione (ASD), a natural substrate of aromatase, through 20 ns of molecular dynamics simulation. Analysis of the trajectories showed that, while ASD interacts with aromatase through hydrogen bonds and 2a interacts only via hydrophobic forces, 2a_1 not only accommodates well in the hydrophobic active site of aromatase but also forms a stable coordination with the iron atom of the aromatase heme group via OB.

  5. Orthotropic Piezoelectricity in 2D Nanocellulose

    NASA Astrophysics Data System (ADS)

    García, Y.; Ruiz-Blanco, Yasser B.; Marrero-Ponce, Yovani; Sotomayor-Torres, C. M.

    2016-10-01

    The control of electromechanical responses within bonding regions is essential to face frontier challenges in nanotechnologies, such as molecular electronics and biotechnology. Here, we present Iβ-nanocellulose as a potentially new orthotropic 2D piezoelectric crystal. The predicted in-layer piezoelectricity originates from a sui generis hydrogen-bond pattern. Building on this fact, and by using a combination of ab initio and ad hoc models, we introduce a description of electrical profiles along chemical bonds. These developments yield a rationale for modelling the extended piezoelectric effect that originates at bond scales. The order of magnitude estimated for the 2D Iβ-nanocellulose piezoelectric response, ~pm V‑1, ranks this material at the level of currently used piezoelectric energy generators and new artificial 2D designs. Such a finding could be crucial for developing alternative materials to drive emerging nanotechnologies.

  6. Orthotropic Piezoelectricity in 2D Nanocellulose

    PubMed Central

    García, Y.; Ruiz-Blanco, Yasser B.; Marrero-Ponce, Yovani; Sotomayor-Torres, C. M.

    2016-01-01

    The control of electromechanical responses within bonding regions is essential to face frontier challenges in nanotechnologies, such as molecular electronics and biotechnology. Here, we present Iβ-nanocellulose as a potentially new orthotropic 2D piezoelectric crystal. The predicted in-layer piezoelectricity originates from a sui generis hydrogen-bond pattern. Building on this fact, and by using a combination of ab initio and ad hoc models, we introduce a description of electrical profiles along chemical bonds. These developments yield a rationale for modelling the extended piezoelectric effect that originates at bond scales. The order of magnitude estimated for the 2D Iβ-nanocellulose piezoelectric response, ~pm V−1, ranks this material at the level of currently used piezoelectric energy generators and new artificial 2D designs. Such a finding could be crucial for developing alternative materials to drive emerging nanotechnologies. PMID:27708364

  7. Orthotropic Piezoelectricity in 2D Nanocellulose.

    PubMed

    García, Y; Ruiz-Blanco, Yasser B; Marrero-Ponce, Yovani; Sotomayor-Torres, C M

    2016-10-06

    The control of electromechanical responses within bonding regions is essential to face frontier challenges in nanotechnologies, such as molecular electronics and biotechnology. Here, we present Iβ-nanocellulose as a potentially new orthotropic 2D piezoelectric crystal. The predicted in-layer piezoelectricity originates from a sui generis hydrogen-bond pattern. Building on this fact, and by using a combination of ab initio and ad hoc models, we introduce a description of electrical profiles along chemical bonds. These developments yield a rationale for modelling the extended piezoelectric effect that originates at bond scales. The order of magnitude estimated for the 2D Iβ-nanocellulose piezoelectric response, ~pm V(-1), ranks this material at the level of currently used piezoelectric energy generators and new artificial 2D designs. Such a finding could be crucial for developing alternative materials to drive emerging nanotechnologies.

  8. Large Area Synthesis of 2D Materials

    NASA Astrophysics Data System (ADS)

    Vogel, Eric

    Transition metal dichalcogenides (TMDs) have generated significant interest for numerous applications including sensors, flexible electronics, heterostructures and optoelectronics due to their interesting, thickness-dependent properties. Despite recent progress, the synthesis of high-quality and highly uniform TMDs on a large scale is still a challenge. In this talk, synthesis routes for WSe2 and MoS2 that achieve monolayer thickness uniformity across large area substrates with electrical properties equivalent to geological crystals will be described. Controlled doping of 2D semiconductors is also critically required. However, methods established for conventional semiconductors, such as ion implantation, are not easily applicable to 2D materials because of their atomically thin structure. Redox-active molecular dopants will be demonstrated which provide large changes in carrier density and workfunction through the choice of dopant, treatment time, and the solution concentration. Finally, several applications of these large-area, uniform 2D materials will be described including heterostructures, biosensors and strain sensors.

  9. Assessing 2D electrophoretic mobility spectroscopy (2D MOSY) for analytical applications.

    PubMed

    Fang, Yuan; Yushmanov, Pavel V; Furó, István

    2016-12-08

    Electrophoretic displacement of a charged entity phase-modulates the spectrum acquired in electrophoretic NMR experiments, and this modulation can be presented via 2D FT as 2D mobility spectroscopy (MOSY) spectra. We compare, in various mixed solutions, the chemical selectivity provided by 2D MOSY spectra with that provided by 2D diffusion-ordered spectroscopy (DOSY) spectra and demonstrate, under the conditions explored, a superior performance of the former method. 2D MOSY also compares favourably with closely related LC-NMR methods. The shape of 2D MOSY spectra in complex mixtures is strongly modulated by the pH of the sample, a feature that has potential for areas such as drug discovery and metabolomics. Copyright © 2016 The Authors. Magnetic Resonance in Chemistry published by John Wiley & Sons Ltd.
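
The core idea, a phase modulation proportional to electrophoretic mobility followed by a Fourier transform along the modulation dimension, can be demonstrated with a toy discrete FT. All numbers are arbitrary; this makes no attempt to model a real pulse sequence:

```python
import cmath

# Each increment k of the electric-field pulse area adds phase
# 2*pi*m*k/N to a species with mobility index m; the DFT along k
# then shows one peak per mobility (a MOSY-like spectrum).
N = 64
mobilities = [5, -3]   # two species; arbitrary "cycles per N steps"

signal = [sum(cmath.exp(2j * cmath.pi * m * k / N) for m in mobilities)
          for k in range(N)]

spectrum = [abs(sum(signal[k] * cmath.exp(-2j * cmath.pi * f * k / N)
                    for k in range(N))) / N
            for f in range(N)]

peaks = sorted(range(N), key=spectrum.__getitem__, reverse=True)[:2]
print(sorted(peaks))   # bin 61 is the negative mobility -3 wrapped to N - 3
```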

  10. 2D Distributed Sensing Via TDR

    DTIC Science & Technology

    2007-11-02

    [Briefing slides; little survives extraction beyond the stated vision of non-contact 2D distributed sensing: a VARTM/RTM resin-flow experimental setup sensed via the EM field of a transmission line, with TDR model validation against an "RTM flow" response (University of Delaware, 1 Jul 2003).]

  11. Inkjet printing of 2D layered materials.

    PubMed

    Li, Jiantong; Lemme, Max C; Östling, Mikael

    2014-11-10

    Inkjet printing of 2D layered materials, such as graphene and MoS2, has attracted great interest for emerging electronics. However, the incompatible rheology, low concentration, severe aggregation and toxicity of solvents constitute critical challenges that hamper manufacturing efficiency and product quality. Here, we introduce a simple and general technology concept (distillation-assisted solvent exchange) to efficiently overcome these challenges. By implementing the concept, we have demonstrated excellent jetting performance, ideal printing patterns and a variety of promising applications for inkjet printing of 2D layered materials.

  12. Extreme Growth of Enstrophy on 2D Bounded Domains

    NASA Astrophysics Data System (ADS)

    Protas, Bartosz; Sliwiak, Adam

    2016-11-01

    We study the vortex states responsible for the largest instantaneous growth of enstrophy possible in viscous incompressible flow on a 2D bounded domain. The goal is to compare these results with estimates obtained using mathematical analysis. This problem is closely related to analogous questions recently considered in the periodic setting on 1D, 2D and 3D domains. In addition to systematically characterizing the most extreme behavior, these problems are also closely related to the open question of finite-time singularity formation in the 3D Navier-Stokes system. We demonstrate how such extreme vortex states can be found as solutions of constrained variational optimization problems which, in the limit of small enstrophy, reduce to eigenvalue problems. Computational results will be presented for circular and square domains emphasizing the effect of geometric singularities (corners of the domain) on the structure of the extreme vortex states. Supported by an NSERC (Canada) Discovery Grant.

  13. Transition to chaos in an open unforced 2D flow

    NASA Technical Reports Server (NTRS)

    Pulliam, Thomas H.; Vastano, John A.

    1993-01-01

    The present numerical study of unsteady, low Reynolds number flow past a 2D airfoil attempts to ascertain the bifurcation sequence leading from simple periodic to complex aperiodic flow with rising Reynolds number, as well as to characterize the degree of chaos present in the aperiodic flow and assess the role of numerics in the modification and control of the observed bifurcation scenario. The ARC2D Navier-Stokes code is used in an unsteady time-accurate mode for most of these computations. The system undergoes a period-doubling bifurcation to chaos as the Reynolds number is increased from 800 to 1600; its chaotic attractors are characterized by estimates of the fractal dimension and partial Liapunov exponent spectra.
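
The period-doubling route to chaos and the Lyapunov-exponent diagnostics mentioned here are classically illustrated with the logistic map rather than a Navier-Stokes solver; a stdlib sketch of that standard analogy (not related to the ARC2D computations themselves):

```python
import math

def lyapunov(r, n=4000, burn=500, x0=0.3):
    """Average Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated from the log of the local stretching factor |f'(x)|."""
    x, acc = x0, 0.0
    for i in range(n):
        x = r * x * (1 - x)
        if i >= burn:
            acc += math.log(abs(r * (1 - 2 * x)))
    return acc / (n - burn)

# Negative exponent: periodic regime (analogue of the periodic flow);
# positive exponent: chaos (analogue of the aperiodic flow).
for r in (2.9, 3.5, 4.0):
    print(f"r = {r}: lambda = {lyapunov(r):+.3f}")
```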

  14. 2-D Magnetohydrodynamic Modeling of A Pulsed Plasma Thruster

    NASA Technical Reports Server (NTRS)

    Thio, Y. C. Francis; Cassibry, J. T.; Wu, S. T.; Rodgers, Stephen L. (Technical Monitor)

    2002-01-01

    Experiments are being performed on the NASA Marshall Space Flight Center (MSFC) MK-1 pulsed plasma thruster. Data produced from the experiments provide an opportunity to further understand the plasma dynamics in these thrusters via detailed computational modeling. The detailed and accurate understanding of the plasma dynamics in these devices holds the key towards extending their capabilities in a number of applications, including their applications as high power (greater than 1 MW) thrusters, and their use for producing high-velocity, uniform plasma jets for experimental purposes. For this study, the 2-D MHD modeling code, MACH2, is used to provide detailed interpretation of the experimental data. At the same time, a 0-D physics model of the plasma initial phase is developed to guide our 2-D modeling studies.

  15. FPCAS2D user's guide, version 1.0

    NASA Technical Reports Server (NTRS)

    Bakhle, Milind A.

    1994-01-01

    The FPCAS2D computer code has been developed for aeroelastic stability analysis of bladed disks such as those in fans, compressors, turbines, propellers, or propfans. The aerodynamic analysis used in this code is based on the unsteady two-dimensional full potential equation which is solved for a cascade of blades. The structural analysis is based on a two degree-of-freedom rigid typical section model for each blade. Detailed explanations of the aerodynamic analysis, the numerical algorithms, and the aeroelastic analysis are not given in this report. This guide can be used to assist in the preparation of the input data required by the FPCAS2D code. A complete description of the input data is provided in this report. In addition, four test cases, including inputs and outputs, are provided.

  16. 2D FEM Heat Transfer & E&M Field Code

    SciTech Connect

    1992-04-02

    TOPAZ and TOPAZ2D are two-dimensional implicit finite element computer codes for heat transfer analysis. TOPAZ2D can also be used to solve electrostatic and magnetostatic problems. The programs solve for the steady-state or transient temperature or electrostatic and magnetostatic potential field on two-dimensional planar or axisymmetric geometries. Material properties may be temperature or potential-dependent and either isotropic or orthotropic. A variety of time and temperature-dependent boundary conditions can be specified including temperature, flux, convection, and radiation. By implementing the user subroutine feature, users can model chemical reaction kinetics and allow for any type of functional representation of boundary conditions and internal heat generation. The programs can solve problems of diffuse and specular band radiation in an enclosure coupled with conduction in the material surrounding the enclosure. Additional features include thermal contact resistance across an interface, bulk fluids, phase change, and energy balances.
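
A minimal finite-difference analogue of the steady-state conduction problems TOPAZ2D solves (Jacobi iteration on a uniform grid with fixed-temperature boundaries; the real code is finite-element, with far more general materials and boundary conditions):

```python
# Steady-state heat conduction on a unit square: top edge held at
# 100, the other edges at 0; interior relaxes to the discrete
# Laplace solution via Jacobi iteration.
N = 20
T = [[0.0] * N for _ in range(N)]
for j in range(N):
    T[0][j] = 100.0   # hot top edge

for _ in range(2000):
    new = [row[:] for row in T]
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            new[i][j] = 0.25 * (T[i-1][j] + T[i+1][j] + T[i][j-1] + T[i][j+1])
    T = new

# By symmetry the centre of the continuous problem sits at 25 degrees.
print(f"centre temperature: {T[N // 2][N // 2]:.1f}")
```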

  17. Parallel-pipeline 2-D DCT/IDCT processor chip

    NASA Astrophysics Data System (ADS)

    Ruiz, G. A.; Michell, J. A.; Buron, A.

    2005-06-01

    This paper describes the architecture of an 8x8 2-D DCT/IDCT processor with high throughput and a cost-effective architecture. The 2-D DCT/IDCT is calculated using the separability property, so that its architecture is made up of two 1-D processors and a transpose buffer (TB) as intermediate memory. This transpose buffer presents a regular structure based on D-type flip-flops, with a double serial input/output data flow well suited to pipeline architectures. The processor has been designed with a parallel, pipelined architecture to attain high throughput, reduced hardware and maximum efficiency in all arithmetic elements. This architecture allows the processing elements and arithmetic units to work in parallel at half the data input rate, except for the normalization of the transform, which is done in a multiplier operating at maximum frequency. Moreover, it has been verified that the precision of the proposed processor meets the demands of IEEE Std. 1180-1990, used in the video codecs ITU-T H.261 and ITU-T H.263. The processor was conceived using a standard cell design methodology and manufactured in a 0.35-μm CMOS CSD 3M/2P 3.3 V process. It has an area of 6.25 mm2 (the core is 3 mm2) and contains a total of 11.7k gates, of which 5.8k are flip-flops. A data input rate of 300 MHz has been established, with a latency of 172 cycles for the 2-D DCT and 178 cycles for the 2-D IDCT. The computing time for a block is close to 580 ns. Its performance in both computing speed and hardware complexity indicates that the proposed design is suitable for HDTV applications.
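
The separability property the processor exploits (two 1-D DCT passes with a transpose buffer in between) can be checked directly; a reference implementation of the 8x8 case:

```python
import math

N = 8

def dct_1d(v):
    """Orthonormal 1-D DCT-II of an N-point vector."""
    out = []
    for k in range(N):
        c = math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N)
        out.append(c * sum(v[n] * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
                           for n in range(N)))
    return out

def transpose(m):
    return [list(col) for col in zip(*m)]

def dct_2d(block):
    rows = [dct_1d(r) for r in block]                       # first 1-D pass
    return transpose([dct_1d(r) for r in transpose(rows)])  # TB + second pass

flat = [[100.0] * N for _ in range(N)]   # a flat 8x8 block
coeffs = dct_2d(flat)
print(f"DC term: {coeffs[0][0]:.1f}")    # all energy lands in the DC coefficient
```

For a constant block the orthonormal transform concentrates all the energy in the DC term (here 8 * 100 = 800), with every other coefficient numerically zero.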

  18. 2D OR NOT 2D: THE EFFECT OF DIMENSIONALITY ON THE DYNAMICS OF FINGERING CONVECTION AT LOW PRANDTL NUMBER

    SciTech Connect

    Garaud, Pascale; Brummell, Nicholas

    2015-12-10

    Fingering convection (otherwise known as thermohaline convection) is an instability that occurs in stellar radiative interiors in the presence of unstable compositional gradients. Numerical simulations have been used in order to estimate the efficiency of mixing induced by this instability. However, fully three-dimensional (3D) computations in the parameter regime appropriate for stellar astrophysics (i.e., low Prandtl number) are prohibitively expensive. This raises the question of whether two-dimensional (2D) simulations could be used instead to achieve the same goals. In this work, we address this issue by comparing the outcome of 2D and 3D simulations of fingering convection at low Prandtl number. We find that 2D simulations are never appropriate. However, we also find that the required 3D computational domain does not have to be very wide: the third dimension only needs to contain a minimum of two wavelengths of the fastest-growing linearly unstable mode to capture the essentially 3D dynamics of small-scale fingering. Narrow domains, however, should still be used with caution since they could limit the subsequent development of any large-scale dynamics typically associated with fingering convection.

  19. Parallel Stitching of 2D Materials.

    PubMed

    Ling, Xi; Lin, Yuxuan; Ma, Qiong; Wang, Ziqiang; Song, Yi; Yu, Lili; Huang, Shengxi; Fang, Wenjing; Zhang, Xu; Hsu, Allen L; Bie, Yaqing; Lee, Yi-Hsien; Zhu, Yimei; Wu, Lijun; Li, Ju; Jarillo-Herrero, Pablo; Dresselhaus, Mildred; Palacios, Tomás; Kong, Jing

    2016-03-23

    Diverse parallel stitched 2D heterostructures, including metal-semiconductor, semiconductor-semiconductor, and insulator-semiconductor, are synthesized directly through selective "sowing" of aromatic molecules as the seeds in the chemical vapor deposition (CVD) method. The methodology enables the large-scale fabrication of lateral heterostructures, which offers tremendous potential for its application in integrated circuits.

  20. The basics of 2D DIGE.

    PubMed

    Beckett, Phil

    2012-01-01

    The technique of two-dimensional (2D) gel electrophoresis is a powerful tool for separating complex mixtures of proteins, but since its inception in the mid 1970s, it acquired the stigma of being a very difficult application to master and was generally used to its best effect by experts. The introduction of commercially available immobilized pH gradients in the early 1990s provided enhanced reproducibility and easier protocols, leading to a pronounced increase in popularity of the technique. However, gel-to-gel variation was still difficult to control without the use of technical replicates. In the mid 1990s (at the same time as the birth of "proteomics"), the concept of multiplexing fluorescently labeled proteins for 2D gel separation was realized by Jon Minden's group and has led to the ability to design experiments to virtually eliminate gel-to-gel variation, resulting in biological replicates being used for statistical analysis with the ability to detect very small changes in relative protein abundance. This technology is referred to as 2D difference gel electrophoresis (2D DIGE).

  1. Parallel stitching of 2D materials

    DOE PAGES

    Ling, Xi; Wu, Lijun; Lin, Yuxuan; ...

    2016-01-27

    Diverse parallel stitched 2D heterostructures, including metal–semiconductor, semiconductor–semiconductor, and insulator–semiconductor, are synthesized directly through selective “sowing” of aromatic molecules as the seeds in the chemical vapor deposition (CVD) method. Lastly, the methodology enables the large-scale fabrication of lateral heterostructures, which offers tremendous potential for its application in integrated circuits.

  2. Results of an Australian trial using SurePath liquid-based cervical cytology with FocalPoint computer-assisted screening technology.

    PubMed

    Bowditch, Ron C; Clarke, Joanne M; Baird, Phillip J; Greenberg, Merle L

    2012-12-01

    BD FocalPoint GS™ computer-assisted screening of BD SurePath® liquid-based cervical cytology slides (SP + FP) was compared with screening an accompanying conventional cervical Papanicolaou (Pap) smear (CON) in a split sample trial of 2,198 routine specimens. The rate of unsatisfactory specimens in the SP + FP arm was 0.2% compared with 4.1% in the conventional Pap smear, a significant reduction. There was no statistically significant difference between SP + FP and CON for the detection of histologically confirmed high-grade (HG) lesions in the routine split sample specimens (n = 9). To further test the sensitivity of SP + FP for HG lesions, 38 SurePath slides from confirmed HG cases, without an accompanying CON, were interpolated among the routine smears. In every one of the 47 confirmed HG cases, either HG cells were present in the microscope fields selected by FocalPointGS™ for review by the screening cytologist (46 of 47), or full screening of the slide was indicated by the FocalPointGS™ (1 of 47), confirming the effectiveness of SP + FP technology for primary screening. In a small number of cases, the screening cytologist did not recognize the abnormality even though on review HG cells were present in fields selected by FocalPointGS™. The overall detection rate was 93% for HG squamous lesions; 89% for known HG endocervical glandular lesions; and 91% for known endometrial carcinoma. In conclusion, the SP + FP detected 100% of HG abnormalities in the trial set; significantly reduced the rate of unsatisfactory specimens; and improved the overall screening rate of detection of HG abnormalities particularly of glandular lesions when compared with other screening technologies.

  3. Usability analysis of 2D graphics software for designing technical clothing.

    PubMed

    Teodoroski, Rita de Cassia Clark; Espíndola, Edilene Zilma; Silva, Enéias; Moro, Antônio Renato Pereira; Pereira, Vera Lucia D V

    2012-01-01

    With the advent of technology, the computer has become an increasingly common working tool in companies, intended to increase production and reduce the errors inherent in manual work. The aim of this study was to analyze the usability of 2D graphics software used by a professional to create clothing designs during his work. Mouse, keyboard, and graphical-tool movements were monitored in real time by the software Camtasia 7® installed on the user's computer. To register mouse and keyboard use, we used auxiliary software called MouseMeter®, which counts presses of the right, middle, and left mouse buttons and of the keyboard, as well as the distance in meters traveled by the cursor on the screen. Data were collected over periods of 15 minutes, 1 hour, and 8 hours, consecutively. The results showed that the job is repetitive and physically demanding, which can lead to repetitive strain injuries. Thus, to minimize operator effort and thereby enhance the usability of the examined tool, it becomes advisable to replace the mouse with a graphics tablet, which offers an electronic pen and a drawing platform for design development.

  4. Performance and Cost-Effectiveness of Computed Tomography Lung Cancer Screening Scenarios in a Population-Based Setting: A Microsimulation Modeling Analysis in Ontario, Canada

    PubMed Central

    ten Haaf, Kevin; Tammemägi, Martin C.; Bondy, Susan J.; van der Aalst, Carlijn M.; Gu, Sumei; de Koning, Harry J.

    2017-01-01

    Background The National Lung Screening Trial (NLST) results indicate that computed tomography (CT) lung cancer screening for current and former smokers with three annual screens can be cost-effective in a trial setting. However, the cost-effectiveness in a population-based setting with >3 screening rounds is uncertain. Therefore, the objective of this study was to estimate the cost-effectiveness of lung cancer screening in a population-based setting in Ontario, Canada, and evaluate the effects of screening eligibility criteria. Methods and Findings This study used microsimulation modeling informed by various data sources, including the Ontario Health Insurance Plan (OHIP), Ontario Cancer Registry, smoking behavior surveys, and the NLST. Persons born between 1940 and 1969 were examined from a third-party health care payer perspective across a lifetime horizon. Starting in 2015, 576 CT screening scenarios were examined, varying by age to start and end screening, smoking eligibility criteria, and screening interval. Among the examined outcome measures were lung cancer deaths averted, life-years gained, percentage ever screened, costs (in 2015 Canadian dollars), and overdiagnosis. The results of the base-case analysis indicated that annual screening was more cost-effective than biennial screening. Scenarios with eligibility criteria that required as few as 20 pack-years were dominated by scenarios that required higher numbers of accumulated pack-years. In general, scenarios that applied stringent smoking eligibility criteria (i.e., requiring higher levels of accumulated smoking exposure) were more cost-effective than scenarios with less stringent smoking eligibility criteria, with modest differences in life-years gained. Annual screening between ages 55–75 for persons who smoked ≥40 pack-years and who currently smoke or quit ≤10 y ago yielded an incremental cost-effectiveness ratio of $41,136 Canadian dollars ($33,825 in May 1, 2015, United States dollars) per
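
    The incremental cost-effectiveness ratio (ICER) reported above is a simple quotient of cost and effect differences between a screening scenario and its comparator. A minimal sketch; the numbers below are illustrative, not the study's Ontario estimates:

```python
def icer(cost_new, effect_new, cost_base, effect_base):
    """Incremental cost-effectiveness ratio: extra dollars paid per
    extra unit of effect (e.g., per life-year gained)."""
    d_cost = cost_new - cost_base
    d_effect = effect_new - effect_base
    if d_effect <= 0 and d_cost >= 0:
        return float("inf")  # dominated: costs more for no extra benefit
    return d_cost / d_effect

# A scenario costing $400 more per person but adding 0.5 life-years:
ratio = icer(1000.0, 2.0, 600.0, 1.5)  # 800.0 dollars per life-year
```

    A scenario is "dominated" in the sense used in the abstract when another scenario achieves at least as much effect at equal or lower cost.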

  5. The Privileged Chemical Space Predictor (PCSP): a computer program that identifies privileged chemical space from screens of modularly assembled chemical libraries.

    PubMed

    Seedhouse, Steven J; Labuda, Lucas P; Disney, Matthew D

    2010-02-15

    Modularly assembled combinatorial libraries are often used to identify ligands that bind to and modulate the function of a protein or a nucleic acid. Much of the data from screening these compounds, however, is not efficiently utilized to define structure-activity relationships (SAR). If SAR data are accurately constructed, it can enable the design of more potent binders. Herein, we describe a computer program called Privileged Chemical Space Predictor (PCSP) that statistically determines SAR from high-throughput screening (HTS) data and then identifies features in small molecules that predispose them for binding a target. Features are scored for statistical significance and can be utilized to design improved second generation compounds or more target-focused libraries. The program's utility is demonstrated through analysis of a modularly assembled peptoid library that previously was screened for binding to and inhibiting a group I intron RNA from the fungal pathogen Candida albicans.
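
    The statistical step of a privileged-space analysis can be illustrated with a hypergeometric enrichment test: a module is "privileged" if it occurs among the hits far more often than chance predicts. A minimal sketch, not the actual PCSP algorithm; the counts below are hypothetical:

```python
from math import comb

def enrichment_pvalue(hits_with, hits_total, lib_with, lib_total):
    """One-sided hypergeometric p-value for a feature appearing in
    `hits_with` of `hits_total` active compounds, when `lib_with` of the
    `lib_total` screened compounds carry the feature."""
    total = comb(lib_total, hits_total)
    p = 0.0
    for k in range(hits_with, min(hits_total, lib_with) + 1):
        p += comb(lib_with, k) * comb(lib_total - lib_with, hits_total - k)
    return p / total

# Feature present in 8 of 10 hits but only 20 of 200 library members:
p = enrichment_pvalue(8, 10, 20, 200)
```

    A small p-value flags the feature as a candidate for second-generation, target-focused designs.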

  6. Concurrent Validity of a Computer-Based Cognitive Screening Tool for Use in Adults with HIV Disease

    PubMed Central

    Dew, Mary Amanda; Aizenstein, Howard J.; Lopez, Oscar L.; Morrow, Lisa; Saxton, Judith

    2011-01-01

    Abstract As the incidence of HIV-associated dementia has decreased, the survival of HIV-infected individuals with milder forms of cognitive impairment has increased. Detecting this milder impairment in its earliest stages has great clinical and research importance. We report here the results of an initial evaluation of the Computer Assessment of Mild Cognitive Impairment (CAMCI®), a computerized screening tool designed to assess abnormal cognitive decline with reduced respondent and test administrator burden. Fifty-nine volunteers (29 HIV infected; age=50.9 years; education=14.9 years; 36/59 males) completed the CAMCI® and a battery of neuropsychological tests. The CAMCI was repeated 12 and 24 weeks later. The results from the CAMCI were compared to Global and Domain Impairment scores derived from the full neuropsychological test battery. The CAMCI detected mild impairment (compared with normal and borderline test performance) with a sensitivity of 0.72, specificity of 0.97, positive predictive rate of 0.93, and a negative predictive rate of 0.89. Median stability over 12 and 24 weeks of follow-up was 0.32 and 0.46, respectively. These rates did not differ as a function of serostatus. A discriminant function analysis correctly classified 90% of the subjects with respect to their overall Global Impairment Rating from six of the CAMCI scores. This preliminary study demonstrates that the CAMCI is sensitive to mild forms of cognitive impairment, and is stable over 24 weeks of follow-up. A larger trial to obtain risk-group appropriate normative data will be necessary to make the instrument useful in both clinical practice and research (e.g., clinical trials). PMID:21545295
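
    The four screening rates quoted above follow directly from a 2 × 2 confusion matrix. A minimal sketch; the counts below are hypothetical, chosen only to roughly reproduce the reported rates:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard screening-test statistics from true/false positives and negatives."""
    return {
        "sensitivity": tp / (tp + fn),  # impaired cases correctly flagged
        "specificity": tn / (tn + fp),  # unimpaired cases correctly passed
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

m = diagnostic_metrics(tp=13, fp=1, fn=5, tn=40)
```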

  7. Application of 2D Non-Graphene Materials and 2D Oxide Nanostructures for Biosensing Technology

    PubMed Central

    Shavanova, Kateryna; Bakakina, Yulia; Burkova, Inna; Shtepliuk, Ivan; Viter, Roman; Ubelis, Arnolds; Beni, Valerio; Starodub, Nickolaj; Yakimova, Rositsa; Khranovskyy, Volodymyr

    2016-01-01

    The discovery of graphene and its unique properties has inspired researchers to try to invent other two-dimensional (2D) materials. After considerable research effort, a distinct “beyond graphene” domain has been established, comprising the library of non-graphene 2D materials. It is significant that some 2D non-graphene materials possess solid advantages over their predecessor, such as having a direct band gap, and therefore are highly promising for a number of applications. These applications are not limited to nano- and opto-electronics, but have a strong potential in biosensing technologies, as one example. However, since most of the 2D non-graphene materials have been newly discovered, most of the research efforts are concentrated on material synthesis and the investigation of the properties of the material. Applications of 2D non-graphene materials are still at the embryonic stage, and the integration of 2D non-graphene materials into devices is scarcely reported. However, in recent years, numerous reports have blossomed about 2D material-based biosensors, evidencing the growing potential of 2D non-graphene materials for biosensing applications. This review highlights the recent progress in research on the potential of using 2D non-graphene materials and similar oxide nanostructures for different types of biosensors (optical and electrochemical). A wide range of biological targets, such as glucose, dopamine, cortisol, DNA, IgG, bisphenol, ascorbic acid, cytochrome and estradiol, has been reported to be successfully detected by biosensors with transducers made of 2D non-graphene materials. PMID:26861346

  9. Thoracic Temporal Subtraction Three Dimensional Computed Tomography (3D-CT): Screening for Vertebral Metastases of Primary Lung Cancers

    PubMed Central

    Iwano, Shingo; Ito, Rintaro; Umakoshi, Hiroyasu; Karino, Takatoshi; Inoue, Tsutomu; Li, Yuanzhong; Naganawa, Shinji

    2017-01-01

    Purpose We developed an original, computer-aided diagnosis (CAD) software that subtracts the initial thoracic vertebral three-dimensional computed tomography (3D-CT) image from the follow-up 3D-CT image. The aim of this study was to investigate the efficacy of this CAD software during screening for vertebral metastases on follow-up CT images of primary lung cancer patients. Materials and Methods The interpretation experiment included 30 sets of follow-up CT scans in primary lung cancer patients and was performed by two readers (readers A and B), who each had 2.5 years’ experience reading CT images. In 395 vertebrae from C6 to L3, 46 vertebral metastases were identified as follows: osteolytic metastases (n = 17), osteoblastic metastases (n = 14), combined osteolytic and osteoblastic metastases (n = 6), and pathological fractures (n = 9). Thirty-six lesions were in the anterior component (vertebral body), and 10 lesions were in the posterior component (vertebral arch, transverse process, and spinous process). The area under the curve (AUC) by receiver operating characteristic (ROC) curve analysis and the sensitivity and specificity for detecting vertebral metastases were compared with and without CAD for each observer. Results Reader A detected 47 abnormalities on CT images without CAD, and 33 of them were true-positive metastatic lesions. Using CAD, reader A detected 57 abnormalities, and 38 were true positives. The sensitivity increased from 0.717 to 0.826, and on ROC curve analysis, AUC with CAD was significantly higher than that without CAD (0.849 vs. 0.902, p = 0.021). Reader B detected 40 abnormalities on CT images without CAD, and 36 of them were true-positive metastatic lesions. Using CAD, reader B detected 44 abnormalities, and 39 were true positives. The sensitivity increased from 0.783 to 0.848, and AUC with CAD was nonsignificantly higher than that without CAD (0.889 vs. 0.910, p = 0.341). Both readers detected more osteolytic and osteoblastic

  10. Human- and computer-accessible 2D correlation data for a more reliable structure determination of organic compounds. Future roles of researchers, software developers, spectrometer managers, journal editors, reviewers, publisher and database managers toward artificial-intelligence analysis of NMR spectra.

    PubMed

    Jeannerat, Damien

    2017-01-01

    The introduction of a universal data format to report the correlation data of 2D NMR spectra such as COSY, HSQC and HMBC spectra will have a large impact on the reliability of structure determination of small organic molecules. These lists of assigned cross peaks will bridge signals found in NMR 1D and 2D spectra and the assigned chemical structure. The record could be very compact, human and computer readable so that it can be included in the supplementary material of publications and easily transferred into databases of scientific literature and chemical compounds. The records will allow authors, reviewers and future users to test the consistency and, in favorable situations, the uniqueness of the assignment of the correlation data to the associated chemical structures. Ideally, the data format of the correlation data should include direct links to the NMR spectra to make it possible to validate their reliability and allow direct comparison of spectra. To realize their full potential, the correlation data and the NMR spectra should therefore follow any manuscript in the review process and be stored in an open-access database after publication. Keeping all NMR spectra, correlation data and assigned structures together at all times will allow the future development of validation tools increasing the reliability of past and future NMR data. This will facilitate the development of artificial intelligence analysis of NMR spectra by providing a source of data that can be used efficiently because they have been validated or can be validated by future users. Copyright © 2016 John Wiley & Sons, Ltd.

  11. Identification of UV-protective activators of nuclear factor erythroid-derived 2-related factor 2 (Nrf2) by combining a chemical library screen with computer-based virtual screening.

    PubMed

    Lieder, Franziska; Reisen, Felix; Geppert, Tim; Sollberger, Gabriel; Beer, Hans-Dietmar; auf dem Keller, Ulrich; Schäfer, Matthias; Detmar, Michael; Schneider, Gisbert; Werner, Sabine

    2012-09-21

    Nuclear factor erythroid-derived 2-related factor 2 (Nrf2) is a master regulator of cellular antioxidant defense systems, and activation of this transcription factor is a promising strategy for protection of skin and other organs from environmental insults. To identify efficient Nrf2 activators in keratinocytes, we combined a chemical library screen with computer-based virtual screening. Among 14 novel Nrf2 activators, the most potent compound, a nitrophenyl derivative of 2-chloro-5-nitro-N-phenyl-benzamide, was characterized with regard to its molecular mechanism of action. This compound induced the expression of cytoprotective genes in keratinocytes isolated from wild-type but not from Nrf2-deficient mice. Most importantly, it showed low toxicity and protected primary human keratinocytes from UVB-induced cell death. Therefore, it represents a potential lead compound for the development of drugs for skin protection under stress conditions. Our study demonstrates that chemical library screening combined with advanced computational similarity searching is a powerful strategy for identification of bioactive compounds, and it points toward an innovative therapeutic approach against UVB-induced skin damage.

  12. Large-scale virtual high-throughput screening for the identification of new battery electrolyte solvents: computing infrastructure and collective properties.

    PubMed

    Husch, Tamara; Yilmazer, Nusret Duygu; Balducci, Andrea; Korth, Martin

    2015-02-07

    A volunteer computing approach is presented for the purpose of screening a large number of molecular structures with respect to their suitability as new battery electrolyte solvents. Collective properties like melting, boiling and flash points are evaluated using COSMOtherm and quantitative structure-property relationship (QSPR) based methods, while electronic structure theory methods are used for the computation of electrochemical stability window estimators. Two application examples are presented: first, the results of a previous large-scale screening test (PCCP, 2014, 16, 7919) are re-evaluated with respect to the mentioned collective properties. As a second application example, all reasonable nitrile solvents up to 12 heavy atoms are generated and used to illustrate a suitable filter protocol for picking Pareto-optimal candidates.
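
    The final filtering step, picking Pareto-optimal candidates, can be sketched as follows: a solvent is kept only if no other candidate is at least as good in every property and strictly better in at least one. A minimal illustration assuming two properties to maximize; the values are invented, not screening results:

```python
def pareto_front(candidates):
    """Keep candidates not dominated by any other (all objectives maximized)."""
    front = []
    for i, a in enumerate(candidates):
        dominated = any(
            all(b[k] >= a[k] for k in range(len(a))) and
            any(b[k] > a[k] for k in range(len(a)))
            for j, b in enumerate(candidates) if j != i
        )
        if not dominated:
            front.append(a)
    return front

# (electrochemical stability window in V, flash point in deg C), both maximized
solvents = [(4.2, 75.0), (3.9, 80.0), (3.5, 70.0)]
front = pareto_front(solvents)  # the third solvent is dominated by both others
```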

  13. Experimental validation of 2-D generalized geometric super resolved approach

    NASA Astrophysics Data System (ADS)

    Borkowski, Amikam; Zalevsky, Zeev; Cohen, Nadav; Hadas, Zadok; Marom, Emanuel; Javidi, Bahram

    2014-01-01

    In this paper, we generalize the method of using a 2-D moving binary random mask to overcome the geometrical resolution limitation of an imaging sensor. The spatial blurring is caused by the size of the imaging sensor pixels, which yields insufficient spatial sampling. The mask is placed in an intermediate image plane and can be shifted in any direction while keeping the sensor as well as all other optical components fixed. From the set of images that are captured and registered, a high-resolution image can be composed. In addition, the proposed approach reduces the amount of required computation and has improved robustness to spatial noise.

  14. Numerical Simulation of Supersonic Compression Corners and Hypersonic Inlet Flows Using the RPLUS2D Code

    NASA Technical Reports Server (NTRS)

    Kapoor, Kamlesh; Anderson, Bernhard H.; Shaw, Robert J.

    1994-01-01

    A two-dimensional computational code, RPLUS2D, which was developed for the reactive propulsive flows of ramjets and scramjets, was validated for two-dimensional shock-wave/turbulent-boundary-layer interactions. The problem of compression corners at supersonic speeds was solved using the RPLUS2D code. To validate the RPLUS2D code for hypersonic speeds, it was applied to a realistic hypersonic inlet geometry. Both the Baldwin-Lomax and the Chien two-equation turbulence models were used. Computational results showed that the RPLUS2D code compared very well with experimentally obtained data for supersonic compression corner flows, except in the case of large separated flows resulting from the interactions between the shock wave and turbulent boundary layer. The computational results also compared well with experimental results for a hypersonic NASA P8 inlet, with the Chien two-equation turbulence model performing better than the Baldwin-Lomax model.

  15. Building 3D scenes from 2D image sequences

    NASA Astrophysics Data System (ADS)

    Cristea, Paul D.

    2006-05-01

    Sequences of 2D images, taken by a single moving video receptor, can be fused to generate a 3D representation. This dynamic stereopsis exists in birds and reptiles, whereas the static binocular stereopsis is common in mammals, including humans. Most multimedia computer vision systems for stereo image capture, transmission, processing, storage and retrieval are based on the concept of binocularity. As a consequence, their main goal is to acquire, conserve and enhance pairs of 2D images able to generate a 3D visual perception in a human observer. Stereo vision in birds is based on the fusion of images captured by each eye, with previously acquired and memorized images from the same eye. The process goes on simultaneously and conjointly for both eyes and generates an almost complete all-around visual field. As a consequence, the baseline distance is no longer fixed, as in the case of binocular 3D view, but adjustable in accordance with the distance to the object of main interest, allowing a controllable depth effect. Moreover, the synthesized 3D scene can have a better resolution than each individual 2D image in the sequence. Compression of 3D scenes can be achieved, and stereo transmissions with lower bandwidth requirements can be developed.

  16. A novel point cloud registration using 2D image features

    NASA Astrophysics Data System (ADS)

    Lin, Chien-Chou; Tai, Yen-Chou; Lee, Jhong-Jin; Chen, Yong-Sheng

    2017-01-01

    Since a 3D scanner captures only one scene of a 3D object at a time, 3D registration of multiple scenes is the key issue of 3D modeling. This paper presents a novel and efficient 3D registration method based on 2D local feature matching. The proposed method transforms the point clouds into 2D bearing-angle images and then uses the 2D feature-based matching method, SURF, to find matching pixel pairs between two images. The corresponding points of the 3D point clouds can be obtained from those pixel pairs. Since the corresponding pairs are sorted by the distance between matching features, only the top half of the corresponding pairs are used to find the optimal rotation matrix by least squares approximation. In this paper, the optimal rotation matrix is derived by the orthogonal Procrustes method (an SVD-based approach). Therefore, the 3D model of an object can be reconstructed by aligning those point clouds with the optimal transformation matrix. Experimental results show that the accuracy of the proposed method is close to that of the ICP, but the computation cost is reduced significantly; the performance is six times faster than the generalized-ICP algorithm. Furthermore, while the ICP requires high alignment similarity between two scenes, the proposed method is robust to larger differences in viewing angle.
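
    The SVD-based orthogonal Procrustes step can be sketched as follows, assuming matched 3D point arrays are already available from the 2D feature pairs (this is the standard Kabsch construction, not the authors' exact code):

```python
import numpy as np

def procrustes_rotation(P, Q):
    """Least-squares rotation R such that R @ p_i approximates q_i,
    for matched point sets P, Q of shape (N, 3), after centering."""
    Pc = P - P.mean(axis=0)
    Qc = Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    D = np.eye(3)
    D[2, 2] = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    return Vt.T @ D @ U.T

# Rotate a toy cloud by a known rotation and recover it:
rng = np.random.default_rng(0)
P = rng.random((20, 3))
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
Q = P @ Rz.T
R = procrustes_rotation(P, Q)  # recovers Rz up to numerical precision
```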

  17. Design Application Translates 2-D Graphics to 3-D Surfaces

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Fabric Images Inc., specializing in the printing and manufacturing of fabric tension architecture for the retail, museum, and exhibit/tradeshow communities, designed software to translate 2-D graphics for 3-D surfaces prior to print production. Fabric Images' fabric-flattening design process models a 3-D surface based on computer-aided design (CAD) specifications. The surface geometry of the model is used to form a 2-D template, similar to a flattening process developed by NASA's Glenn Research Center. This template or pattern is then applied in the development of a 2-D graphic layout. Benefits of this process include 11.5 percent time savings per project, less material wasted, and the ability to improve upon graphic techniques and offer new design services. Partners include Exhibitgroup/Giltspur (end-user client: TAC Air, a division of Truman Arnold Companies Inc.), Jack Morton Worldwide (end-user client: Nickelodeon), as well as 3D Exhibits Inc., and MG Design Associates Corp.

  18. Compatible embedding for 2D shape animation.

    PubMed

    Baxter, William V; Barla, Pascal; Anjyo, Ken-Ichi

    2009-01-01

    We present new algorithms for the compatible embedding of 2D shapes. Such embeddings offer a convenient way to interpolate shapes having complex, detailed features. Compared to existing techniques, our approach requires less user input, and is faster, more robust, and simpler to implement, making it ideal for interactive use in practical applications. Our new approach consists of three parts. First, our boundary matching algorithm locates salient features using the perceptually motivated principles of scale-space and uses these as automatic correspondences to guide an elastic curve matching algorithm. Second, we simplify boundaries while maintaining their parametric correspondence and the embedding of the original shapes. Finally, we extend the mapping to shapes' interiors via a new compatible triangulation algorithm. The combination of our algorithms allows us to demonstrate 2D shape interpolation with instant feedback. The proposed algorithms exhibit a combination of simplicity, speed, and accuracy that has not been achieved in previous work.

  19. Schottky diodes from 2D germanane

    NASA Astrophysics Data System (ADS)

    Sahoo, Nanda Gopal; Esteves, Richard J.; Punetha, Vinay Deep; Pestov, Dmitry; Arachchige, Indika U.; McLeskey, James T.

    2016-07-01

    We report on the fabrication and characterization of a Schottky diode made using 2D germanane (hydrogenated germanene). When compared to germanium, the 2D structure has higher electron mobility, an optimal band-gap, and exceptional stability making germanane an outstanding candidate for a variety of opto-electronic devices. One-atom-thick sheets of hydrogenated puckered germanium atoms have been synthesized from a CaGe2 framework via intercalation and characterized by XRD, Raman, and FTIR techniques. The material was then used to fabricate Schottky diodes by suspending the germanane in benzonitrile and drop-casting it onto interdigitated metal electrodes. The devices demonstrate significant rectifying behavior and the outstanding potential of this material.

  20. Explicit 2-D Hydrodynamic FEM Program

    SciTech Connect

    Lin, Jerry

    1996-08-07

    DYNA2D* is a vectorized, explicit, two-dimensional, axisymmetric and plane strain finite element program for analyzing the large deformation dynamic and hydrodynamic response of inelastic solids. DYNA2D* contains 13 material models and 9 equations of state (EOS) to cover a wide range of material behavior. The material models implemented in all machine versions are: elastic, orthotropic elastic, kinematic/isotropic elastic plasticity, thermoelastoplastic, soil and crushable foam, linear viscoelastic, rubber, high explosive burn, isotropic elastic-plastic, temperature-dependent elastic-plastic. The isotropic and temperature-dependent elastic-plastic models determine only the deviatoric stresses. Pressure is determined by one of 9 equations of state including linear polynomial, JWL high explosive, Sack Tuesday high explosive, Gruneisen, ratio of polynomials, linear polynomial with energy deposition, ignition and growth of reaction in HE, tabulated compaction, and tabulated.
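
    The first equation of state in the list, the linear polynomial form, can be sketched as follows. This is the standard seven-coefficient expression used by DYNA-family codes; the coefficient values in the example are illustrative, not a material card:

```python
def linear_polynomial_eos(rho, rho0, E, C):
    """Pressure from the DYNA-style linear polynomial EOS:
        p = C0 + C1*mu + C2*mu**2 + C3*mu**3 + (C4 + C5*mu + C6*mu**2)*E
    with mu = rho/rho0 - 1 and E the internal energy per unit reference
    volume. (Production codes zero some terms in expansion, mu < 0,
    which this sketch omits.)"""
    C0, C1, C2, C3, C4, C5, C6 = C
    mu = rho / rho0 - 1.0
    return (C0 + C1 * mu + C2 * mu ** 2 + C3 * mu ** 3
            + (C4 + C5 * mu + C6 * mu ** 2) * E)

# With C1 acting as a bulk modulus and all other terms zero,
# 10% compression gives p = 0.1 * C1:
p = linear_polynomial_eos(1.1, 1.0, 0.0, (0.0, 2.0, 0.0, 0.0, 0.0, 0.0, 0.0))
```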

  1. Performance comparison of an active matrix flat panel imager, computed radiography system, and a screen-film system at four standard radiation qualities.

    PubMed

    Monnin, P; Gutierrez, D; Bulling, S; Lepori, D; Valley, J F; Verdun, F R

    2005-02-01

    Four standard radiation qualities (from RQA 3 to RQA 9) were used to compare the imaging performance of a computed radiography (CR) system (general purpose and high resolution phosphor plates of a Kodak CR 9000 system), a selenium-based direct flat panel detector (Kodak Direct View DR 9000), and a conventional screen-film system (Kodak T-MAT L/RA film with a 3M Trimax Regular screen of speed 400) in conventional radiography. Reference exposure levels were chosen according to the manufacturer's recommendations to be representative of clinical practice (exposure index of 1700 for digital systems and a film optical density of 1.4). With the exception of the RQA 3 beam quality, the exposure levels needed to produce a mean digital signal of 1700 were higher than those needed to obtain a mean film optical density of 1.4. In spite of intense developments in the field of digital detectors, screen-film systems are still very efficient detectors for most of the beam qualities used in radiology. An important outcome of this study is the behavior of the detective quantum efficiency of the digital radiography (DR) system as a function of beam energy. The practice of users to increase beam energy when switching from a screen-film system to a CR system, in order to improve the compromise between patient dose and image quality, might not be appropriate when switching from screen-film to selenium-based DR systems.

  2. Quasiparticle interference in unconventional 2D systems

    NASA Astrophysics Data System (ADS)

    Chen, Lan; Cheng, Peng; Wu, Kehui

    2017-03-01

    At present, research of 2D systems mainly focuses on two kinds of materials: graphene-like materials and transition-metal dichalcogenides (TMDs). Both of them host unconventional 2D electronic properties: pseudospin and the associated chirality of electrons in graphene-like materials, and spin-valley-coupled electronic structures in the TMDs. These exotic electronic properties have attracted tremendous interest for possible applications in nanodevices in the future. Investigation of the quasiparticle interference (QPI) in 2D systems is an effective way to uncover these properties. In this review, we will begin with a brief introduction to 2D systems, including their atomic structures and electronic bands. Then, we will discuss the formation of Friedel oscillation due to QPI in constant energy contours of electron bands, and show the basic concept of Fourier-transform scanning tunneling microscopy/spectroscopy (FT-STM/STS), which can resolve Friedel oscillation patterns in real space and consequently obtain the QPI patterns in reciprocal space. In the next two parts, we will summarize some pivotal results in the investigation of QPI in graphene and silicene, in which systems the low-energy quasiparticles are described by the massless Dirac equation. The FT-STM experiments show there are two different interference channels (intervalley and intravalley scattering) and backscattering suppression, which are associated with the Dirac cones and the chirality of quasiparticles. The monolayer and bilayer graphene on different substrates (SiC and metal surfaces), and the monolayer and multilayer silicene on a Ag(1 1 1) surface will be addressed. The fifth part will introduce the FT-STM research on QPI in TMDs (monolayer and bilayer of WSe2), which allows us to infer the spin texture of both conduction and valence bands, and present spin-valley coupling by tracking allowed and forbidden scattering channels.

  3. Compact 2-D graphical representation of DNA

    NASA Astrophysics Data System (ADS)

    Randić, Milan; Vračko, Marjan; Zupan, Jure; Novič, Marjana

    2003-05-01

    We present a novel 2-D graphical representation for DNA sequences which has an important advantage over the existing graphical representations of DNA in being very compact. It is based on: (1) use of binary labels for the four nucleic acid bases, and (2) use of the 'worm' curve as template on which binary codes are placed. The approach is illustrated on DNA sequences of the first exon of human β-globin and gorilla β-globin.
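
    The binary-labeling step can be sketched as follows, assuming one particular 2-bit assignment for the four bases (the paper's actual labels and worm-curve template may differ):

```python
# Map each nucleic acid base to a 2-bit label (one possible assignment).
CODES = {"A": (0, 0), "C": (0, 1), "G": (1, 0), "T": (1, 1)}

def encode(seq):
    """Flatten a DNA sequence into the bit string that would be placed
    along the template curve, two bits per base."""
    return [bit for base in seq.upper() for bit in CODES[base]]

bits = encode("ATGGTG")  # 6 bases -> 12 bits
```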

  4. 2D Metals by Repeated Size Reduction.

    PubMed

    Liu, Hanwen; Tang, Hao; Fang, Minghao; Si, Wenjie; Zhang, Qinghua; Huang, Zhaohui; Gu, Lin; Pan, Wei; Yao, Jie; Nan, Cewen; Wu, Hui

    2016-10-01

    A general and convenient strategy for manufacturing freestanding metal nanolayers is developed on large scale. By the simple process of repeatedly folding and calendering stacked metal sheets followed by chemical etching, free-standing 2D metal (e.g., Ag, Au, Fe, Cu, and Ni) nanosheets are obtained with thicknesses as small as 1 nm and with sizes of the order of several micrometers.

  5. Computational study of the transition state for H₂ addition to Vaska-type complexes (trans-Ir(L)₂(CO)X). Substituent effects on the energy barrier and the origin of the small H₂/D₂ kinetic isotope effect

    SciTech Connect

    Abu-Hasanayn, F.; Goldman, A.S.; Krogh-Jespersen, K.

    1993-06-03

    Ab initio molecular orbital methods have been used to study transition state properties for the concerted addition reaction of H₂ to Vaska-type complexes, trans-Ir(L)₂(CO)X, 1 (L = PH₃ and X = F, Cl, Br, I, CN, or H; L = NH₃ and X = Cl). Stationary points on the reaction path retaining the trans-L₂ arrangement were located at the Hartree-Fock level using relativistic effective core potentials and valence basis sets of double-ζ quality. The identities of the stationary points were confirmed by normal mode analysis. Activation energy barriers were calculated with electron correlation effects included via Møller-Plesset perturbation theory carried fully through fourth order, MP4(SDTQ). The more reactive complexes feature structurally earlier transition states and larger reaction exothermicities, in accord with the Hammond postulate. The experimentally observed increase in reactivity of Ir(PPh₃)₂(CO)X complexes toward H₂ addition upon going from X = F to X = I is reproduced well by the calculations and is interpreted to be a consequence of diminished halide-to-Ir π-donation by the heavier halogens. Computed activation barriers (L = PH₃) range from 6.1 kcal/mol (X = H) to 21.4 kcal/mol (X = F). Replacing PH₃ by NH₃ when X = Cl increases the barrier from 14.1 to 19.9 kcal/mol. Using conventional transition state theory, the kinetic isotope effects for H₂/D₂ addition are computed to lie between 1.1 and 1.7, with larger values corresponding to earlier transition states. Judging from the computational data presented here, tunneling appears to be unimportant for H₂ addition to these iridium complexes. 51 refs., 4 tabs.

  6. Engineering light outcoupling in 2D materials.

    PubMed

    Lien, Der-Hsien; Kang, Jeong Seuk; Amani, Matin; Chen, Kevin; Tosun, Mahmut; Wang, Hsin-Ping; Roy, Tania; Eggleston, Michael S; Wu, Ming C; Dubey, Madan; Lee, Si-Chen; He, Jr-Hau; Javey, Ali

    2015-02-11

    When light is incident on 2D transition metal dichalcogenides (TMDCs), it engages in multiple reflections within the underlying substrate, producing interferences that enhance or attenuate the incoming and outgoing light. Here, we report a simple method to engineer the light outcoupling in semiconducting TMDCs by modulating their dielectric surroundings. We show that by modulating the thicknesses of the underlying substrates and capping layers, the interference caused by the substrate can significantly enhance the light absorption and emission of WSe2, resulting in a ∼11 times increase in Raman signal and a ∼30 times increase in the photoluminescence (PL) intensity of WSe2. On the basis of the interference model, we also propose a strategy to control the photonic and optoelectronic properties of thin-layer WSe2. This work demonstrates the utilization of outcoupling engineering in 2D materials and offers a new route toward the realization of novel optoelectronic devices, such as 2D LEDs and solar cells.

  7. Irreversibility-inversions in 2D turbulence

    NASA Astrophysics Data System (ADS)

    Bragg, Andrew; de Lillo, Filippo; Boffetta, Guido

    2016-11-01

    We consider a recent theoretical prediction that for inertial particles in 2D turbulence, the nature of the irreversibility of their pair dispersion inverts when the particle inertia exceeds a certain value. In particular, when the particle Stokes number, St , is below a certain value, the forward-in-time (FIT) dispersion should be faster than the backward-in-time (BIT) dispersion, but for St above this value, this should invert so that BIT becomes faster than FIT dispersion. This non-trivial behavior arises because of the competition between two physically distinct irreversibility mechanisms that operate in different regimes of St . In 3D turbulence, both mechanisms act to produce faster BIT than FIT dispersion, but in 2D, the two mechanisms have opposite effects because of the inverse energy cascade in the turbulent velocity field. We supplement the qualitative argument given by Bragg et al. by deriving quantitative predictions of this effect in the short-time dispersion limit. These predictions are then confirmed by results of inertial particle dispersion in a direct numerical simulation of 2D turbulence.

  8. Using computational modeling to assess the impact of clinical decision support on cancer screening improvement strategies within the community health centers.

    PubMed

    Carney, Timothy Jay; Morgan, Geoffrey P; Jones, Josette; McDaniel, Anna M; Weaver, Michael; Weiner, Bryan; Haggstrom, David A

    2014-10-01

    Our conceptual model demonstrates our goal to investigate the impact of clinical decision support (CDS) utilization on cancer screening improvement strategies in the community health center (CHC) setting. We employed a dual modeling technique, using both statistical and computational modeling, to evaluate impact. Our statistical model used Spearman's rho to evaluate the strength of the relationship between our proximal outcome measures (CDS utilization) and our distal outcome measure (provider self-reported cancer screening improvement). Our computational model relied on network evolution theory and made use of a tool called Construct-TM to model the use of CDS as measured by the rate of organizational learning. We employed previously collected survey data from community health centers in the Cancer Health Disparities Collaborative (HDCC). Our intent is to demonstrate the added value gained by using a computational modeling tool in conjunction with a statistical analysis when evaluating the impact of a health information technology, in the form of CDS, on health care quality process outcomes such as facility-level screening improvement. Significant simulated disparities in organizational learning over time were observed between community health centers beginning the simulation with high and low clinical decision support capability.
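
The Spearman's rho statistic named above is simply the Pearson correlation of rank-transformed data. A minimal self-contained sketch (function names and sample data are illustrative; the study's survey variables are not reproduced here):

```python
# Sketch of the statistical-model step: Spearman's rho as the Pearson
# correlation of average ranks. Illustrative only; not the study's data.

def average_ranks(values):
    """1-based ranks, with ties assigned the average of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1  # extend over the run of tied values
        avg = (i + j) / 2.0 + 1.0  # average of the tied 0-based positions, +1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman rank correlation between two equal-length sequences."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Perfectly monotone "utilization" vs. "improvement" scores give rho ~ 1:
print(spearman_rho([3, 1, 4, 2, 5], [30, 10, 40, 20, 50]))
```

In practice one would use a library routine (e.g. `scipy.stats.spearmanr`, which also returns a p-value); the hand-rolled version above just makes the rank-then-correlate logic explicit.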

  9. Fully automated 2D-3D registration and verification.

    PubMed

    Varnavas, Andreas; Carrell, Tom; Penney, Graeme

    2015-12-01

    Clinical application of 2D-3D registration technology often requires a significant amount of human interaction during initialisation and result verification. This is one of the main barriers to more widespread clinical use of this technology. We propose novel techniques for automated initial pose estimation of the 3D data and verification of the registration result, and show how these techniques can be combined to enable fully automated 2D-3D registration, particularly in the case of a vertebra-based system. The initialisation method is based on preoperative computation of 2D templates over a wide range of 3D poses. These templates are used to apply the Generalised Hough Transform to the intraoperative 2D image, and the sought 3D pose is selected with the combined use of the generated accumulator arrays and a Gradient Difference Similarity Measure. On the verification side, two algorithms are proposed: one using normalised features based on the similarity value, and the other based on the pose agreement between multiple vertebra-based registrations. The proposed methods are employed here for CT-to-fluoroscopy registration and are trained and tested with data from 31 clinical procedures with 417 low-dose (i.e., low-quality, high-noise) interventional fluoroscopy images. When similarity-value-based verification is used, the fully automated system achieves a 95.73% correct registration rate, whereas a 'no registration' result is produced for the remaining 4.27% of cases (i.e., the incorrect registration rate is 0%). The system also automatically detects input images outside its operating range.

  10. 2D superconductivity by ionic gating

    NASA Astrophysics Data System (ADS)

    Iwasa, Yoshi

    2D superconductivity is attracting renewed interest due to the discoveries of new highly crystalline 2D superconductors in the past decade. Superconductivity at oxide interfaces, triggered by LaAlO3/SrTiO3, has become one of the promising routes to the creation of new 2D superconductors. MBE-grown metallic monolayers, including FeSe, also offer a new platform of 2D superconductors. In the last two years, a variety of monolayer/bilayer superconductors fabricated by CVD or mechanical exfoliation have appeared. Among these, electric-field-induced superconductivity in the electric double layer transistor (EDLT) is a unique platform for 2D superconductivity, because of its ability to accumulate charge at high density, and also because of its versatility in terms of materials, ranging from oxides to organics and layered chalcogenides. In this presentation, the following issues of electric-field-induced superconductivity will be addressed: (1) tunable carrier density, (2) weak pinning, (3) absence of inversion symmetry. (1) Since the sheet carrier density is quasi-continuously tunable from 0 to the order of 10¹⁴ cm⁻², one is able to establish an electronic phase diagram of superconductivity, which will be compared with that of bulk superconductors. (2) The thickness of the superconducting layer can be estimated as 2-10 nm, depending on the material, and is much smaller than the in-plane coherence length. Such a thin yet low-resistance normal state results in extremely weak pinning, beyond the dirty-boson model of amorphous metallic films. (3) Due to the electric field, inversion symmetry is inherently broken in the EDLT. This feature appears as an enhancement of the Pauli limit of the upper critical field for in-plane magnetic fields. In a transition metal dichalcogenide with substantial spin-orbit interaction, we were able to confirm the stabilization of Cooper pairs due to spin-valley locking. This work has been supported by Grant-in-Aid for Specially

  11. Electron-Phonon Scattering in Atomically Thin 2D Perovskites.

    PubMed

    Guo, Zhi; Wu, Xiaoxi; Zhu, Tong; Zhu, Xiaoyang; Huang, Libai

    2016-11-22

    Two-dimensional (2D) atomically thin perovskites with strongly bound excitons are highly promising for optoelectronic applications. However, the nature of the nonradiative processes that limit the photoluminescence (PL) efficiency remains elusive. Here, we present time-resolved and temperature-dependent PL studies to systematically address the intrinsic exciton relaxation pathways in layered (C4H9NH3)2(CH3NH3)n-1PbnI3n+1 (n = 1, 2, 3) structures. Our results show that scattering via the deformation potential of acoustic and homopolar optical phonons is the main scattering mechanism for excitons in ultrathin single exfoliated flakes, with scattering rates exhibiting a T^γ (γ = 1.3 to 1.9) temperature dependence. We attribute the absence of polar optical phonon and defect scattering to efficient screening of the Coulomb potential, similar to what has been observed in 3D perovskites. These results establish an understanding of the origins of nonradiative pathways and provide guidelines for optimizing the PL efficiencies of atomically thin 2D perovskites.
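
A T^γ dependence of the kind quoted above is typically extracted by fitting rates on log-log axes, where a power law becomes a straight line of slope γ. A generic least-squares sketch (the data below are synthetic, not the paper's measurements):

```python
# Sketch: recover the exponent gamma in rate = A * T**gamma by linear
# regression of log(rate) against log(T). Synthetic data for illustration.
from math import exp, log

def fit_power_law(temps, rates):
    """Least-squares fit of rates = A * temps**gamma; returns (A, gamma)."""
    xs = [log(t) for t in temps]
    ys = [log(r) for r in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    gamma = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))  # slope on log-log axes
    A = exp(my - gamma * mx)                    # intercept -> prefactor
    return A, gamma

# Rates generated with A = 2, gamma = 1.5 are recovered by the fit:
temps = [50.0, 100.0, 200.0, 300.0]
rates = [2.0 * t ** 1.5 for t in temps]
print(fit_power_law(temps, rates))
```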

  12. Computational Analysis and In silico Predictive Modeling for Inhibitors of PhoP Regulon in S. typhi on High-Throughput Screening Bioassay Dataset.

    PubMed

    Kaur, Harleen; Ahmad, Mohd; Scaria, Vinod

    2016-03-01

    Multidrug-resistant Salmonella enterica serotype Typhi is emerging in pandemic proportions throughout the world; there is therefore a need to speed up the discovery of novel molecules with different modes of action, less prone to resistance formation, that could be used as drugs for the treatment of salmonellosis, particularly typhoid fever. The PhoP regulon is well studied and has been shown to be a critical regulator of the expression of a number of genes required for the intracellular survival of S. enterica and the pathophysiology of diseases such as typhoid. The evident roles of the two-component PhoP-/PhoQ-regulated products in Salmonella virulence have motivated attempts to target them therapeutically. Although the discovery of biologically active compounds for the treatment of typhoid relies on hit-finding procedures, high-throughput screening technology alone is very expensive, as well as time consuming when performed on large scales. With recent advances in combinatorial chemistry and contemporary compound-synthesis techniques, more and more compounds are available, giving ample growth of diverse compound libraries, but the time and effort required to screen these unfocused, massive, and diverse libraries have been only slightly reduced in the past years. Hence, there is a demand to improve hit quality and the success rate of high-throughput screening, which requires compound libraries focused and biased toward the particular target. We therefore still need an advantageous and expedient method to prioritize the molecules to be used in biological screens, one that saves time and is also inexpensive. In this context, in silico methods such as machine learning are widely applicable techniques used to build computational models for high-throughput virtual screens that prioritize molecules for further study. Furthermore, in computational analysis, we extended our study to identify the common enriched

  13. Contributions of the Computer-Administered Neuropsychological Screen for Mild Cognitive Impairment (CANS-MCI) for the diagnosis of MCI in Brazil.

    PubMed

    Memória, Cláudia M; Yassuda, Mônica S; Nakano, Eduardo Y; Forlenza, Orestes V

    2014-05-07

    Background: The Computer-Administered Neuropsychological Screen for Mild Cognitive Impairment (CANS-MCI) is a computer-based cognitive screening instrument that involves automated administration and scoring and immediate analysis of test sessions. The objective of this study was to translate and culturally adapt the Brazilian Portuguese version of the CANS-MCI (CANS-MCI-BR) and to evaluate its reliability and validity for the diagnostic screening of MCI and dementia due to Alzheimer's disease. Methods: The test was administered to 97 older adults (mean age 73.41 ± 5.27 years) with at least four years of formal education (mean education 12.23 ± 4.48 years). Participants were classified into three diagnostic groups according to global cognitive status (normal controls, n = 41; MCI, n = 35; AD, n = 21) based on clinical data and formal neuropsychological assessments. Results: The results indicated high internal consistency (Cronbach's α = 0.77) in the total sample. Three-month test-retest reliability correlations were significant and robust (0.875; p < 0.001). A moderate level of concurrent validity was attained relative to a screening test for MCI (MoCA test, r = 0.76, p < 0.001). Confirmatory factor analysis supported the three-factor model of the original test, i.e., memory, language/spatial fluency, and executive function/mental control. Goodness-of-fit indicators were strong (Bentler Comparative Fit Index = 0.96, Root Mean Square Error of Approximation = 0.09). Receiver operating characteristic curve analyses suggested high sensitivity and specificity (81% and 73%, respectively) for screening possible MCI cases. Conclusions: The CANS-MCI-BR maintains adequate psychometric characteristics that make it suitable for identifying older adults with probable cognitive impairment for whom a more extensive evaluation with formal neuropsychological tests may be required.

  14. The Application of Exploratory Data Analysis Methods in Computing Screening Intervals for Selected Study Measures of Effectiveness

    DTIC Science & Technology

    1990-05-30

    intervals for MOE data. These screening intervals will be incorporated into the rule-based AI system under development. The AI system is being designed to cite...data values for selected MOE according to the study experience of senior analysts. The distribution of MOE output data may be considered to be bounded ...regarding the concept of screening intervals primarily surfaced the notion of confidence intervals for estimation of population parameters using

  15. Rapid Computer Aided Ligand Design and Screening of Precious Metal Extractants from TRUEX Raffinate with Experimental Validation

    SciTech Connect

    Clark, Aurora Sue; Wall, Nathalie; Benny, Paul

    2015-11-16

    through the design of a software program that uses state-of-the-art computational combinatorial chemistry, and is developed and validated with experimental data acquisition; the resulting tool allows for rapid design and screening of new ligands for the extraction of precious metals from SNF. This document describes the software that has been produced, ligands that have been designed, and fundamental new understandings of the extraction process of Rh(III) as a function of solution phase conditions (pH, nature of acid, etc.).

  16. CYP2D6 Is Inducible by Endogenous and Exogenous Corticosteroids.

    PubMed

    Farooq, Muhammad; Kelly, Edward J; Unadkat, Jashvant D

    2016-05-01

    Although cytochrome P450 (CYP) 2D6 has been widely considered to be noninducible on the basis of human hepatocyte studies, in vivo data suggest that it is inducible by endo- and xenobiotics. Therefore, we investigated whether the experimental conditions routinely used in human hepatocyte studies may be a confounding factor in the lack of in vitro induction of CYP2D6. Sandwich-cultured human hepatocytes (SCHH) were preincubated with or without dexamethasone (100 nM) for 72 hours before incubation with 1 μM endogenous (cortisol or corticosterone) or exogenous (dexamethasone or prednisolone) corticosteroids. At 72 hours, CYP2D6 mRNA, protein, and activity were quantified by real-time quantitative polymerase chain reaction, quantitative proteomics, and formation of dextrorphan from dextromethorphan, respectively. In the absence of supplemental dexamethasone, CYP2D6 activity, mRNA, and protein were significantly and robustly (>10-fold) induced by all four corticosteroids. However, this CYP2D6 induction was abolished in cells preincubated with supplemental dexamethasone. These data show, for the first time, that CYP2D6 is inducible in vitro but that the routine presence of 100 nM dexamethasone in the culture medium masks this induction. Our cortisol data are in agreement with the clinical observation that CYP2D6 is inducible during the third trimester of pregnancy, when the plasma concentration of cortisol increases to ∼1 μM. These findings, if confirmed in vivo, have implications for predicting CYP2D6-mediated drug-drug interactions and call for re-evaluation of regulatory guidelines on screening for CYP2D6 induction by xenobiotics. Our findings also suggest that cortisol may be a causative factor in the in vivo induction of CYP2D6 during pregnancy.

  17. A 2D histogram representation of images for pooling

    NASA Astrophysics Data System (ADS)

    Yu, Xinnan; Zhang, Yu-Jin

    2011-03-01

    Designing a suitable image representation is one of the most fundamental issues of computer vision. There are three steps in the popular Bag-of-Words image representation: feature extraction, coding, and pooling. In the final step, current methods degrade an M × K encoded feature matrix to a K-dimensional vector (histogram), where M is the number of features and K is the size of the codebook: information is lost dramatically here. In this paper, a novel pooling method, based on a 2-D histogram representation, is proposed to retain more information from the encoded image features. This pooling method can be easily incorporated into state-of-the-art computer vision system frameworks. Experiments show that our approach improves on current pooling methods, and can achieve satisfactory performance in image classification and image reranking even when using a small codebook and a costless linear SVM.
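
The information loss described above can be made concrete with a toy contrast between ordinary sum-pooling and a 2-D-histogram pooling. The paper's exact binning scheme is not given in this abstract, so the sketch below bins coefficient magnitudes as an illustrative assumption:

```python
# Sketch: sum-pooling collapses an M x K coding matrix to K numbers, while
# a 2-D-histogram pooling keeps, per codeword, a distribution over
# coefficient values. The magnitude binning is an illustrative assumption.

def sum_pool(codes):
    """Standard pooling: M x K coding matrix -> K-dimensional histogram."""
    K = len(codes[0])
    return [sum(row[k] for row in codes) for k in range(K)]

def hist2d_pool(codes, bins=4, vmax=1.0):
    """2-D pooling: M x K coding matrix -> K x bins table of counts."""
    K = len(codes[0])
    table = [[0] * bins for _ in range(K)]
    for row in codes:
        for k, v in enumerate(row):
            b = min(int(v / vmax * bins), bins - 1)  # clamp v == vmax
            table[k][b] += 1
    return table

# Two features, two codewords: the 2-D histogram records that codeword 0
# only ever fires weakly and codeword 1 strongly; sum-pooling hides this.
print(hist2d_pool([[0.1, 0.9], [0.2, 0.8]]))  # -> [[2, 0, 0, 0], [0, 0, 0, 2]]
```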

  18. Predicting non-square 2D dice probabilities

    NASA Astrophysics Data System (ADS)

    Pender, G. A. T.; Uhrin, M.

    2014-07-01

    The prediction of the final state probabilities of a general cuboid randomly thrown onto a surface is a problem that naturally arises in the minds of men and women familiar with regular cubic dice and the basic concepts of probability. Indeed, it was considered by Newton in 1664 (Newton 1967 The Mathematical Papers of Isaac Newton vol I (Cambridge: Cambridge University Press) pp 60-1). In this paper we make progress on the 2D problem (which can be realized in 3D by considering a long cuboid, or alternatively a rectangular cross-sectioned dreidel). For the two-dimensional case we suggest that the ratio of the probabilities of landing on each of the two sides is given by ((sqrt(k² + l²) - k)/(sqrt(k² + l²) - l)) × (arctan(l/k)/arctan(k/l)), where k and l are the lengths of the two sides. We test this theory both experimentally and computationally, and find good agreement between our theory, experimental and computational results. Our theory is known, from its derivation, to be an approximation for particularly bouncy or 'grippy' surfaces where the die rolls through many revolutions before settling. On real surfaces we would expect (and we observe) that the true probability ratio for a 2D die is somewhat closer to unity than predicted by our theory. This problem may also have wider relevance in the testing of physics engines.
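
The predicted ratio is easy to evaluate numerically. A short sketch transcribing the abstract's formula as written (the function name is ours):

```python
# Sketch: the paper's predicted side-probability ratio for a 2-D die with
# side lengths k and l, transcribed directly from the abstract.
from math import atan, sqrt

def side_ratio(k, l):
    """(sqrt(k^2+l^2)-k)/(sqrt(k^2+l^2)-l) * arctan(l/k)/arctan(k/l)."""
    hyp = sqrt(k * k + l * l)
    return ((hyp - k) / (hyp - l)) * (atan(l / k) / atan(k / l))

# A square cross-section must give even odds:
print(side_ratio(1.0, 1.0))  # -> 1.0
```

Note the built-in consistency check: swapping k and l inverts the ratio, so side_ratio(k, l) * side_ratio(l, k) = 1 for any pair of side lengths.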

  19. [Lung cancer screening].

    PubMed

    Sánchez González, M

    2014-01-01

    Lung cancer is a very important disease, curable in its early stages. For decades there have been trials trying to show the utility of chest X-ray or computed tomography in lung cancer screening. In 2011, the National Lung Screening Trial results were published, showing a 20% reduction in lung cancer mortality in patients screened for three years with low-dose computed tomography. These results are very promising, and several scientific societies have included lung cancer screening in their guidelines. Nevertheless, we have to be aware of the risks of lung cancer screening, such as overdiagnosis, radiation, and false positive results. Moreover, there are many issues to be solved, including choosing the appropriate group to be screened, the duration of the screening program, the intervals between screenings, and its cost-effectiveness. Ongoing trials will probably answer some of these questions. This article reviews the current evidence on lung cancer screening.

  20. Periodically sheared 2D Yukawa systems

    SciTech Connect

    Kovács, Anikó Zsuzsa; Hartmann, Peter; Donkó, Zoltán

    2015-10-15

    We present non-equilibrium molecular dynamics simulation studies on the dynamic (complex) shear viscosity of a 2D Yukawa system. We have identified a non-monotonic frequency dependence of the viscosity at high frequencies and shear rates, an energy absorption maximum (local resonance) at the Einstein frequency of the system at medium shear rates, an enhanced collective wave activity, when the excitation is near the plateau frequency of the longitudinal wave dispersion, and the emergence of significant configurational anisotropy at small frequencies and high shear rates.

  1. ENERGY LANDSCAPE OF 2D FLUID FORMS

    SciTech Connect

    Y. JIANG; ET AL

    2000-04-01

    The equilibrium states of 2D non-coarsening fluid foams, which consist of bubbles with fixed areas, correspond to local minima of the total perimeter. (1) The authors find an approximate value of the global minimum, and determine directly from an image how far a foam is from its ground state. (2) For (small) area disorder, small bubbles tend to sort inwards and large bubbles outwards. (3) Topological charges of the same sign repel while charges of opposite sign attract. (4) They discuss boundary conditions and the uniqueness of the pattern for fixed topology.

  2. Codon Constraints on Closed 2D Shapes,

    DTIC Science & Technology

    2014-09-26

    Abstract: Codons are simple primitives for describing plane... outlines, if figure and ground are ignored. Later, we will address the problem of indexing identical codon descriptors that have different figure

  3. TOPAZ2D heat transfer code users manual and thermal property data base

    SciTech Connect

    Shapiro, A.B.; Edwards, A.L.

    1990-05-01

    TOPAZ2D is a two-dimensional implicit finite element computer code for heat transfer analysis. This user's manual provides information on the structure of a TOPAZ2D input file. Also included is a material thermal property data base. This manual is supplemented by the TOPAZ2D Theoretical Manual and the TOPAZ2D Verification Manual. TOPAZ2D has been implemented on the CRAY, SUN, and VAX computers. TOPAZ2D can be used to solve for the steady-state or transient temperature field on two-dimensional planar or axisymmetric geometries. Material properties may be temperature dependent and either isotropic or orthotropic. A variety of time- and temperature-dependent boundary conditions can be specified, including temperature, flux, convection, and radiation. Time- or temperature-dependent internal heat generation can be defined locally by element or globally by material. TOPAZ2D can solve problems of diffuse and specular band radiation in an enclosure coupled with conduction in the material surrounding the enclosure. Additional features include thermally controlled reactive chemical mixtures, thermal contact resistance across an interface, bulk fluid flow, phase change, and energy balances. Thermal stresses can be calculated using the solid mechanics code NIKE2D, which reads the temperature state data calculated by TOPAZ2D. A three-dimensional version of the code, TOPAZ3D, is available. The material thermal property data base, Chapter 4, included in this manual was originally published in 1969 by Art Edwards for use with his TRUMP finite difference heat transfer code. The format of the data has been altered to be compatible with TOPAZ2D. Bob Bailey is responsible for adding the high explosive thermal property data.
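
As a much-reduced illustration of the kind of problem such a code solves, here is one implicit (backward-Euler) transient heat-conduction step on a planar grid with fixed-temperature boundaries. TOPAZ2D is a finite element code; this finite-difference sketch with made-up grid and material numbers only shows the shape of an implicit update, not TOPAZ2D's formulation:

```python
# Sketch: one backward-Euler step of dT/dt = alpha * laplacian(T) on a 2-D
# grid, Dirichlet boundaries held fixed, relaxed in place by Gauss-Seidel.
# Grid size, alpha, dt, and h below are illustrative values.

def implicit_heat_step(T, alpha, dt, h, sweeps=200):
    """Relax (1 + 4r) * T_new - r * (sum of 4 neighbors) = T_old in place,
    where r = alpha * dt / h**2; boundary entries of T are never touched."""
    n, m = len(T), len(T[0])
    r = alpha * dt / (h * h)
    T_old = [row[:] for row in T]
    for _ in range(sweeps):
        for i in range(1, n - 1):
            for j in range(1, m - 1):
                T[i][j] = (T_old[i][j] + r * (T[i - 1][j] + T[i + 1][j] +
                                              T[i][j - 1] + T[i][j + 1])) / (1 + 4 * r)
    return T

# A cold 3x3 interior inside 100-degree walls warms toward the walls:
grid = ([[100.0] * 5]
        + [[100.0, 0.0, 0.0, 0.0, 100.0] for _ in range(3)]
        + [[100.0] * 5])
implicit_heat_step(grid, alpha=1.0, dt=1.0, h=1.0)
```

Unlike an explicit update, the implicit step stays stable for any time step size, which is why production heat-transfer codes solve a linear system at each step.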

  4. Remarks on thermalization in 2D CFT

    NASA Astrophysics Data System (ADS)

    de Boer, Jan; Engelhardt, Dalit

    2016-12-01

    We revisit certain aspects of thermalization in 2D conformal field theory (CFT). In particular, we consider similarities and differences between the time dependence of correlation functions in various states in rational and non-rational CFTs. We also consider the distinction between global and local thermalization and explain how states obtained by acting with a diffeomorphism on the ground state can appear locally thermal, and we review why the time-dependent expectation value of the energy-momentum tensor is generally a poor diagnostic of global thermalization. Since all 2D CFTs have an infinite set of commuting conserved charges, generic initial states might be expected to give rise to a generalized Gibbs ensemble rather than a pure thermal ensemble at late times. We construct the holographic dual of the generalized Gibbs ensemble and show that, to leading order, it is still described by a Banados-Teitelboim-Zanelli black hole. The extra conserved charges, while rendering c <1 theories essentially integrable, therefore seem to have little effect on large-c conformal field theories.

  5. Microwave Assisted 2D Materials Exfoliation

    NASA Astrophysics Data System (ADS)

    Wang, Yanbin

    Two-dimensional materials have emerged as extremely important materials, with applications ranging from energy and environmental science to electronics and biology. Here we report our discovery of a universal, ultrafast, green, solvo-thermal technology for producing excellent-quality, few-layered nanosheets in the liquid phase from well-known 2D materials such as hexagonal boron nitride (h-BN), graphite, and MoS2. We start by mixing the uniform bulk layered material with a common organic solvent that matches its surface energy, to reduce the van der Waals attractive interactions between the layers; next, the solutions are heated in a commercial microwave oven to overcome the energy barrier between the bulk and few-layer states. We discovered that the minutes-long rapid exfoliation process is highly temperature dependent, requiring precise thermal management to obtain high-quality inks. We hypothesize a possible mechanism for this solvo-thermal process; our theory confirms the basis of this novel technique for the exfoliation of high-quality layered 2D materials via an as-yet-unknown role of the solvent.

  6. 2-D or not 2-D, that is the question: A Northern California test

    SciTech Connect

    Mayeda, K; Malagnini, L; Phillips, W S; Walter, W R; Dreger, D

    2005-06-06

    Reliable estimates of the seismic source spectrum are necessary for accurate magnitude, yield, and energy estimation. In particular, how seismic radiated energy scales with increasing earthquake size has been the focus of recent debate within the community and has direct implications for earthquake source physics studies as well as hazard mitigation. The 1-D coda methodology of Mayeda et al. has provided the lowest-variance estimate of the source spectrum when compared against traditional approaches that use direct S-waves, thus making it ideal for networks that have sparse station distribution. The 1-D coda methodology has been mostly confined to regions of approximately uniform complexity. For larger, more geophysically complicated regions, 2-D path corrections may be required. The complicated tectonics of the northern California region, coupled with high-quality broadband seismic data, provide for an ideal "apples-to-apples" test of 1-D and 2-D path assumptions on direct waves and their coda. Using the same station and event distribution, we compared 1-D and 2-D path corrections and observed the following results: (1) 1-D coda results reduced the amplitude variance relative to direct S-waves by roughly a factor of 8 (800%); (2) applying a 2-D correction to the coda resulted in up to 40% variance reduction from the 1-D coda results; (3) 2-D direct S-wave results, though better than 1-D direct waves, were significantly worse than the 1-D coda. We found that coda-based moment-rate source spectra derived from the 2-D approach were essentially identical to those from the 1-D approach for frequencies less than ≈0.7 Hz; however, for the high frequencies (0.7 ≤ f ≤ 8.0 Hz), the 2-D approach resulted in inter-station scatter that was generally 10-30% smaller. For complex regions where data are plentiful, a 2-D approach can significantly improve upon the simple 1-D assumption. In regions where only a 1-D coda correction is available it is still preferable over 2

  7. A review of 3D/2D registration methods for image-guided interventions.

    PubMed

    Markelj, P; Tomaževič, D; Likar, B; Pernuš, F

    2012-04-01

    Registration of pre- and intra-interventional data is one of the key technologies for image-guided radiation therapy, radiosurgery, minimally invasive surgery, endoscopy, and interventional radiology. In this paper, we survey those 3D/2D data registration methods that utilize 3D computer tomography or magnetic resonance images as the pre-interventional data and 2D X-ray projection images as the intra-interventional data. The 3D/2D registration methods are reviewed with respect to image modality, image dimensionality, registration basis, geometric transformation, user interaction, optimization procedure, subject, and object of registration.

  8. Application of a 2-D atomic force microscope system to metrology

    NASA Astrophysics Data System (ADS)

    Nyyssonen, Diana; Landstein, Laszlo; Coombs, E.

    1992-02-01

    This paper describes a 2-D atomic force microprobe (AFM) system designed specifically for accurate submicron critical dimension (CD) metrology. The system includes 2-D AFM sensing, 3-D position interferometry with 1.25 nm sensitivity, and a special tip design. Unlike conventional AFM scanning systems, the system operates like a nanorobot, moving from point to point under computer control and sensing surfaces without making contact. The system design, operating characteristics, and application to metrology are discussed.

  9. Validation of a Computer-Administered Version of the Digits-in-Noise Test for Hearing Screening in the United States

    PubMed Central

    Folmer, Robert L.; Vachhani, Jay; McMillan, Garnett P.; Watson, Charles; Kidd, Gary R.; Feeney, M. Patrick

    2016-01-01

    Background The sooner people receive treatment for hearing loss, the quicker they are able to recognize speech and to master hearing aid technology. Unfortunately, a majority of people with hearing loss wait until their impairments have progressed from moderate to severe levels before seeking auditory rehabilitation. In order to increase the number of individuals with hearing loss who pursue and receive auditory rehabilitation, it is necessary to improve methods for identifying and informing these people via widely accessible hearing screening procedures. Screening for hearing loss is the first in a chain of events that must take place in order to increase the number of patients who enter the hearing healthcare system. New methods for hearing screening should be readily accessible through a common medium (e.g., telephone or computer) and should be relatively easy and quick for people to self-administer. Purpose The purpose of this study was to assess a digits-in-noise (DIN) hearing screening test that was delivered via personal computer. Research Design Participants completed the Hearing Handicap Inventory for Adults (HHIA) questionnaire, audiometric testing in a sound booth, and computerized DIN testing. During the DIN test, sequences of 3 spoken digits were presented in noise via headphones at varying signal-to-noise ratios. Participants entered each three-digit sequence they heard using an on-screen keypad. Study Sample Forty adults (16 females, 24 males) participated in the study, 20 of whom had normal hearing and 20 of whom had hearing loss (pure-tone average [PTA] thresholds for 0.5, 1, 2, and 4 kHz >25 dB HL). Data Collection and Analysis DIN signal-to-noise ratio (SNR) and PTA data were analyzed and compared for each ear tested. Receiver operating characteristic (ROC) curves based on these data were plotted. A measure of the overall accuracy of a screening test is the area under the receiver operating characteristic curve (AUC), which measures the average true positive rate across the full range of decision thresholds.
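The AUC described above can be computed directly as a rank statistic: the probability that a randomly chosen impaired ear has a worse (higher) DIN threshold than a randomly chosen normal-hearing ear. A minimal sketch with hypothetical SNR values (the study's actual data are not reproduced here):

```python
# Minimal AUC sketch for a screening test. The DIN speech-reception
# thresholds below are hypothetical illustration values, not study data.

def auc(scores_pos, scores_neg):
    """AUC = probability that an impaired ear scores higher (worse DIN
    SNR) than a normal-hearing ear; ties count as half a win."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical DIN thresholds in dB SNR; higher means worse performance.
snr_hearing_loss = [-4.0, -2.5, -1.0, 0.5, 1.2]    # ears with PTA > 25 dB HL
snr_normal       = [-9.5, -8.0, -7.2, -6.1, -5.0]  # normal-hearing ears

print(auc(snr_hearing_loss, snr_normal))  # perfectly separated groups -> 1.0
```

An AUC of 1.0 means the test separates the two groups perfectly at some cutoff; 0.5 means chance-level screening.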

  10. 2D Covalent Metals: A New Materials Domain of Electrochemical CO2 Conversion with Broken Scaling Relationship.

    PubMed

    Shin, Hyeyoung; Ha, Yoonhoo; Kim, Hyungjun

    2016-10-04

    Toward a sustainable carbon cycle, electrochemical conversion of CO2 into valuable fuels has drawn much attention. However, sluggish kinetics and a substantial overpotential, originating from the strong correlation between the adsorption energies of intermediates and products, are key obstacles of electrochemical CO2 conversion. Here we show that 2D covalent metals with a zero band gap can overcome the intrinsic limitation of conventional metals and metal alloys and thereby substantially decrease the overpotential for CO2 reduction because of their covalent characteristics. From first-principles-based high-throughput screening results on 61 2D covalent metals, we find that the strong correlation between the adsorption energies of COOH and CO can be entirely broken. This leads to the computational design of CO2-to-CO and CO2-to-CH4 conversion catalysts in addition to hydrogen-evolution-reaction catalysts. Toward efficient electrochemical catalysts for CO2 reduction, this work suggests a new materials domain having two contradictory properties in a single material: covalent nature and electrical conductance.

  11. A randomized trial of computer-based reminders and audit and feedback to improve HIV screening in a primary care setting.

    PubMed

    Sundaram, V; Lazzeroni, L C; Douglass, L R; Sanders, G D; Tempio, P; Owens, D K

    2009-08-01

    Despite recommendations for voluntary HIV screening, few medical centres have implemented screening programmes. The objective of the study was to determine whether an intervention with computer-based reminders and feedback would increase screening for HIV in a Department of Veterans Affairs (VA) health-care system. The design of the study was a randomized controlled trial at five primary care clinics at the VA Palo Alto Health Care System. All primary care providers were eligible to participate in the study. The study intervention was computer-based reminders to either assess HIV risk behaviours or offer HIV testing; feedback on adherence to the reminders was provided. The main outcome measure was the difference in HIV testing rates between intervention and control group providers. The control group providers tested 1.0% (n = 67) and 1.4% (n = 106) of patients in the preintervention and intervention periods, respectively; intervention providers tested 1.8% (n = 98) and 1.9% (n = 114), respectively (P = 0.75). In our random sample of 753 untested patients, 204 (27%) had documented risk behaviours. Providers were more likely to adhere to reminders to test than to reminders to perform risk assessment (11% versus 5%, P < 0.01). Sixty-one percent of providers felt that lack of time prevented risk assessment. In conclusion, in primary care clinics in our setting, HIV testing rates were low. Providers were unaware of the high rates of risky behaviour in their patient population and perceived important barriers to testing. Low-intensity clinical reminders and feedback did not increase rates of screening.

  12. 2-D Model for Normal and Sickle Cell Blood Microcirculation

    NASA Astrophysics Data System (ADS)

    Tekleab, Yonatan; Harris, Wesley

    2011-11-01

    Sickle cell disease (SCD) is a genetic disorder that alters the red blood cell (RBC) structure and function such that hemoglobin (Hb) cannot effectively bind and release oxygen. Previous computational models have been designed to study the microcirculation for insight into blood disorders such as SCD. Our novel 2-D computational model represents a fast, time-efficient method developed to analyze flow dynamics, O2 diffusion, and cell deformation in the microcirculation. The model uses a finite difference, Crank-Nicolson scheme to compute the flow and O2 concentration, and the level set computational method to advect the RBC membrane on a staggered grid. Several sets of initial and boundary conditions were tested. Simulation data indicate a few parameters to be significant in the perturbation of the blood flow and O2 concentration profiles. Specifically, the Hill coefficient, arterial O2 partial pressure, O2 partial pressure at 50% Hb saturation, and cell membrane stiffness are significant factors. Results were found to be consistent with those of Le Floch [2010] and Secomb [2006].
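The Crank-Nicolson scheme mentioned above averages explicit and implicit finite differences, which makes it unconditionally stable in time. A minimal sketch of the idea on a 1D diffusion problem with fixed endpoints (pure Python, arbitrary grid and coefficient choices; the paper's coupled 2-D flow/O2/membrane model is far more involved):

```python
# Illustrative Crank-Nicolson step for 1D diffusion u_t = D u_xx with
# Dirichlet endpoints; r = D*dt/(2*dx^2) is an arbitrary demo value.
# This is a sketch of the scheme only, not the authors' code.

def thomas(a, b, c, d):
    """Solve a tridiagonal system; a, b, c are sub-, main, super-diagonals."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def crank_nicolson_step(u, r):
    """One CN step: implicit left-hand side, explicit right-hand side."""
    n = len(u)
    a, b, c = [-r] * n, [1 + 2 * r] * n, [-r] * n
    b[0] = b[-1] = 1.0                    # Dirichlet rows hold endpoints
    a[0] = c[0] = a[-1] = c[-1] = 0.0
    d = ([u[0]] +
         [r * u[i - 1] + (1 - 2 * r) * u[i] + r * u[i + 1]
          for i in range(1, n - 1)] +
         [u[-1]])
    return thomas(a, b, c, d)

u = [0.0] * 21
u[10] = 1.0                               # initial concentration spike
for _ in range(50):
    u = crank_nicolson_step(u, 0.4)       # spike diffuses symmetrically
```

Because the scheme is centered in time it is second-order accurate, which is why it is a common choice for diffusion-type sub-problems like the O2 transport step described in the abstract.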

  13. Transition to turbulence: 2D directed percolation

    NASA Astrophysics Data System (ADS)

    Chantry, Matthew; Tuckerman, Laurette; Barkley, Dwight

    2016-11-01

    The transition to turbulence in simple shear flows has been studied for well over a century, yet in the last few years has seen major leaps forward. In pipe flow, this transition shows the hallmarks of (1 + 1)D directed percolation, a universality class of continuous phase transitions. In spanwise-confined Taylor-Couette flow the same class is found, suggesting the phenomenon is generic to shear flows. However, in plane Couette flow the largest simulations and experiments to date find evidence for a discrete transition. Here we study a planar shear flow, called Waleffe flow, devoid of walls yet showing the fundamentals of the planar transition to turbulence. Working with a quasi-2D yet Navier-Stokes-derived model of this flow, we are able to attack the (2 + 1)D transition problem. Going beyond the system sizes previously possible, we find all of the required scalings of directed percolation and thus establish planar shear flows in this class.

  14. 2D quantum gravity from quantum entanglement.

    PubMed

    Gliozzi, F

    2011-01-21

    In quantum systems with many degrees of freedom the replica method is a useful tool to study the entanglement of arbitrary spatial regions. We apply it in a way that allows these regions to backreact. As a consequence, they become dynamical subsystems whose position, form, and extension are determined by their interaction with the whole system. We analyze, in particular, quantum spin chains described at criticality by a conformal field theory. Its coupling to the Gibbs ensemble of all possible subsystems is relevant and drives the system into a new fixed point, which is argued to be that of 2D quantum gravity coupled to this system. Numerical experiments on the critical Ising model show that the new critical exponents agree with those predicted by the formula of Knizhnik, Polyakov, and Zamolodchikov.

  15. 2D Electrostatic Actuation of Microshutter Arrays

    NASA Technical Reports Server (NTRS)

    Burns, Devin E.; Oh, Lance H.; Li, Mary J.; Jones, Justin S.; Kelly, Daniel P.; Zheng, Yun; Kutyrev, Alexander S.; Moseley, Samuel H.

    2015-01-01

    An electrostatically actuated microshutter array consisting of rotational microshutters (shutters that rotate about a torsion bar) was designed and fabricated through the use of models and experiments. Design iterations focused on minimizing the torsional stiffness of the microshutters while maintaining their structural integrity. Mechanical and electromechanical test systems were constructed to measure the static and dynamic behavior of the microshutters. The torsional stiffness was reduced by a factor of four over initial designs without sacrificing durability. Analysis of the resonant behavior of the microshutter arrays demonstrates that the first resonant mode is a torsional mode occurring around 3000 Hz. At low vacuum pressures, this resonant mode can be used to significantly reduce the drive voltage necessary for actuation, requiring as little as 25 V. 2D electrostatic latching and addressing was demonstrated using both resonant and pulsed addressing schemes.

  16. Graphene suspensions for 2D printing

    NASA Astrophysics Data System (ADS)

    Soots, R. A.; Yakimchuk, E. A.; Nebogatikova, N. A.; Kotin, I. A.; Antonova, I. V.

    2016-04-01

    It is shown that, by processing a graphite suspension in ethanol or water by ultrasound and centrifuging, it is possible to obtain particles with thicknesses within 1-6 nm and, in the most interesting cases, 1-1.5 nm. Analogous treatment of a graphite suspension in an organic solvent eventually yields thicker particles (up to 6-10 nm thick) even upon long-term treatment. Using the proposed ink based on graphene and aqueous ethanol with ethylcellulose and terpineol additives for 2D printing, thin (~5 nm thick) films with a sheet resistance upon annealing of ~30 MΩ/□ were obtained. With the ink based on an aqueous graphene suspension, the sheet resistance was ~5-12 kΩ/□ for 6- to 15-nm-thick layers with a carrier mobility of ~30-50 cm2/(V s).

  17. Canard configured aircraft with 2-D nozzle

    NASA Technical Reports Server (NTRS)

    Child, R. D.; Henderson, W. P.

    1978-01-01

    A closely-coupled canard fighter with vectorable two-dimensional nozzle was designed for enhanced transonic maneuvering. The HiMAT maneuver goal of a sustained 8g turn at a free-stream Mach number of 0.9 and 30,000 feet was the primary design consideration. The aerodynamic design process was initiated with a linear theory optimization minimizing the zero percent suction drag including jet effects and refined with three-dimensional nonlinear potential flow techniques. Allowances were made for mutual interference and viscous effects. The design process to arrive at the resultant configuration is described, and the design of a powered 2-D nozzle model to be tested in the LRC 16-foot Propulsion Wind Tunnel is shown.

  18. Numerical Evaluation of 2D Ground States

    NASA Astrophysics Data System (ADS)

    Kolkovska, Natalia

    2016-02-01

    A ground state is defined as the positive radial solution of the multidimensional nonlinear problem -Δu + u = f(u), with the function f being either f(u) = a|u|^(p-1)u or f(u) = a|u|^p u + b|u|^(2p) u. The numerical evaluation of ground states is based on the shooting method applied to an equivalent dynamical system. A combination of the fourth-order Runge-Kutta method and a Hermite extrapolation formula is applied to solve the resulting initial value problem. The efficiency of this procedure is demonstrated in the 1D case, where the maximal difference between the exact and numerical solutions is ≈ 10^-11 for a discretization step of 0.00025. As a major application, we evaluate numerically the critical energy constant. This constant is defined as a functional of the ground state and is used in the study of the 2D Boussinesq equations.
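A minimal sketch of the shooting approach described above, for the 1D problem u'' - u + |u|^(p-1)u = 0 with a = 1 assumed for illustration: plain RK4 integration with bisection on the initial amplitude u(0), omitting the paper's Hermite extrapolation refinement. For p = 3 the exact ground-state amplitude is u(0) = √2:

```python
# Hedged sketch of the shooting method for the 1D ground-state problem
# u'' - u + |u|**(p-1) * u = 0, u'(0) = 0, u -> 0 at infinity (a = 1).
# RK4 plus bisection on u(0); not the paper's exact algorithm.

def rk4_shoot(A, p, r_max=15.0, h=0.005):
    """Integrate from r = 0 with u(0) = A, u'(0) = 0; stop early if the
    shot crosses zero (overshoot). Returns the final value of u."""
    def rhs(u, v):
        return v, u - abs(u) ** (p - 1) * u        # u'' = u - f(u)
    u, v, r = A, 0.0, 0.0
    while r < r_max and u > 0.0:
        k1u, k1v = rhs(u, v)
        k2u, k2v = rhs(u + 0.5 * h * k1u, v + 0.5 * h * k1v)
        k3u, k3v = rhs(u + 0.5 * h * k2u, v + 0.5 * h * k2v)
        k4u, k4v = rhs(u + h * k3u, v + h * k3v)
        u += h * (k1u + 2 * k2u + 2 * k3u + k4u) / 6
        v += h * (k1v + 2 * k2v + 2 * k3v + k4v) / 6
        r += h
    return u

def ground_state_amplitude(p, lo=0.1, hi=3.0):
    """Bisection on u(0): shots that cross zero started too high; shots
    that keep oscillating above zero started too low."""
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        if rk4_shoot(mid, p) < 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

print(ground_state_amplitude(3))   # close to sqrt(2) ~ 1.41421
```

The bisection converges because an overshooting amplitude makes the trajectory cross zero while an undershooting one oscillates without decaying, so each shot cleanly classifies one side of the true amplitude.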

  19. Stereoscopic Vascular Models of the Head and Neck: A Computed Tomography Angiography Visualization

    ERIC Educational Resources Information Center

    Cui, Dongmei; Lynch, James C.; Smith, Andrew D.; Wilson, Timothy D.; Lehman, Michael N.

    2016-01-01

    Computer-assisted 3D models are used in some medical and allied health science schools; however, they are often limited to online use and 2D flat screen-based imaging. Few schools take advantage of 3D stereoscopic learning tools in anatomy education and clinically relevant anatomical variations when teaching anatomy. A new approach to teaching…

  20. Metrology for graphene and 2D materials

    NASA Astrophysics Data System (ADS)

    Pollard, Andrew J.

    2016-09-01

    The application of graphene, a one-atom-thick honeycomb lattice of carbon atoms with superlative properties, such as electrical conductivity, thermal conductivity and strength, has already shown that it can be used to benefit metrology itself as a new quantum standard for resistance. However, there are many application areas where graphene and other 2D materials, such as molybdenum disulphide (MoS2) and hexagonal boron nitride (h-BN), may be disruptive, such as flexible electronics, nanocomposites, sensing and energy storage. Applying metrology to the area of graphene is now critical to enable the new, emerging global graphene commercial world and bridge the gap between academia and industry. Measurement capabilities and expertise in a wide range of scientific areas are required to address this challenge. The combined and complementary approach of varied characterisation methods for structural, chemical, electrical and other properties will allow the real-world issues of commercialising graphene and other 2D materials to be addressed. Here, examples of metrology challenges that have been overcome through a multi-technique or new approach are discussed. Firstly, the structural characterisation of defects in both graphene and MoS2 via Raman spectroscopy is described, and how nanoscale mapping of vacancy defects in graphene is also possible using tip-enhanced Raman spectroscopy (TERS). Furthermore, the chemical characterisation and removal of polymer residue on chemical vapour deposition (CVD) grown graphene via secondary ion mass spectrometry (SIMS) is detailed, as well as the chemical characterisation of iron films used to grow large-domain single-layer h-BN through CVD growth, revealing how contamination of the substrate itself plays a role in the resulting h-BN layer. In addition, the role of international standardisation in this area is described, outlining the current work ongoing in both the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC).

  1. Necessity of organized low-dose computed tomography screening for lung cancer: From epidemiologic comparisons between China and the Western nations

    PubMed Central

    Gou, Hong-Feng; Liu, Yang; Yang, Tian-Xia; Zhou, Cheng; Chen, Xin-Zu

    2017-01-01

    Objectives To compare the proportion of stage I lung cancer and population mortality in China with those in the U.S. and Europe, where lung cancer screening by low-dose computed tomography (LDCT) is already well practiced. Methods The proportions of stage I lung cancer in the LDCT screening populations of the U.S. and Europe were retrieved from the NLST and NELSON trials. The general proportion of stage I lung cancer in China was retrieved from a rapid meta-analysis, based on a literature search in the China National Knowledge Infrastructure database. The lung cancer mortality and prevalence of China, the U.S. and Europe were retrieved from the Globocan 2012 fact sheet. The mortality-to-prevalence ratio (MPR) was applied to compare the population survival outcome of lung cancer. Results The estimated proportion of stage I lung cancer in China is merely 20.8% among the hospital-based cross-sectional population, with relative ratios (RRs) of 2.40 (95% CI 2.18–2.65) and 2.98 (95% CI 2.62–3.38) relative to the LDCT-screened populations in the U.S. and European trials, respectively. The MPR of lung cancer is as high as 58.9% in China, with RRs of 0.46 (95% CI 0.31–0.67) and 0.58 (95% CI 0.39–0.85) relative to the U.S. and Europe, respectively. Conclusions By epidemiological inference, LDCT mass screening might be associated with an increasing proportion of stage I lung cancer and therefore an improved population survival outcome. How to translate the experiences of lung cancer screening by LDCT from developed countries to China in a cost-effective manner needs to be further investigated. PMID:27705946
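The mortality-to-prevalence ratio used above is a simple quotient; a toy illustration with hypothetical counts (not the Globocan figures) shows how an MPR and a relative ratio between regions are formed:

```python
# Toy illustration (hypothetical counts, not Globocan data) of the
# mortality-to-prevalence ratio (MPR) used as a proxy for population
# survival, and of a relative ratio (RR) comparing two regions.

def mpr(mortality, prevalence):
    """MPR = annual deaths / prevalent cases; higher implies worse survival."""
    return mortality / prevalence

mpr_a = mpr(5900, 10000)              # hypothetical region with poor survival
mpr_b = mpr(2700, 10000)              # hypothetical region with better survival
rr = mpr_b / mpr_a                    # RR < 1 favors region B
print(round(mpr_a, 2), round(rr, 2))  # 0.59 0.46
```

Note that the confidence intervals reported in the abstract require the underlying counts, which a simple point-estimate sketch like this does not reproduce.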

  2. Evaluation of Promotional Materials To Promote Low-Dose Computed Tomography (LDCT) Screening to High-Risk Consumers and Health Care Providers.

    PubMed

    Hudson, Janella N; Quinn, Gwendolyn P; Wilson, Lauren E; Simmons, Vani N

    2017-03-11

    Low-dose computed tomography (LDCT) screening is a promising screening modality for increasing the detection rate of early stage lung cancers among high-risk individuals. Despite being recommended by the US Preventive Services Task Force, uptake of LDCT remains low. The objective of the current study was to gather feedback from high-risk consumers and health care providers on LDCT promotional materials. Focus group discussions were conducted with high-risk individuals (8 focus groups; N = 38) and primary care providers (9 focus groups; N = 23). Participants reviewed existing LDCT promotional materials to assess their perceptions of media materials created to publicize LDCT. Data were analyzed using the constant comparative method. Several key themes emerged from focus groups that can be used to inform development of future LDCT promotional materials. High-risk (HR) participants expressed greater receptivity for promotional materials that did not further stigmatize lung cancer and/or smoking and expressed preferences for materials that clearly outlined the risks/benefits of screening. Primary care providers (PCPs) offered suggestions to facilitate the referral process, such as diagnostic codes, and requested a design that clearly outlined eligibility criteria. A clear and thorough explanation of LDCT eligibility, cost, harms, and benefits was of chief importance for both PCP and HR audiences. Given that PCPs and HR audiences are not well informed on the specifics of LDCT screening eligibility and insurance coverage, creating provider and patient education opportunities will aid in shared decision-making opportunities. Promotional materials that meet the needs of the target audience are needed to facilitate discussions of risks/benefits of screening with HR individuals.

  3. 2D to 3D conversion implemented in different hardware

    NASA Astrophysics Data System (ADS)

    Ramos-Diaz, Eduardo; Gonzalez-Huitron, Victor; Ponomaryov, Volodymyr I.; Hernandez-Fragoso, Araceli

    2015-02-01

    Conversion of available 2D data for release as 3D content is a topic of great interest to providers and is central to the success of 3D applications in general. It relies entirely on synthesizing a virtual second view from the original 2D video. Disparity map (DM) estimation is a central task in 3D generation, but rendering novel images precisely remains a difficult problem. Different approaches to DM reconstruction exist; among them, manual and semiautomatic methods can produce high-quality DMs, but they are time consuming and computationally expensive. In this paper, several hardware implementations of frameworks for automatic 3D color video generation from real 2D video sequences are proposed. The novel framework includes simultaneous processing of stereo pairs using the following blocks: CIE L*a*b* color space conversion, stereo matching via a pyramidal scheme, color segmentation by k-means on the a*b* color plane, DM estimation using stereo matching between left and right images (or neighboring frames in a video), adaptive post-filtering, and, finally, anaglyph 3D scene generation. The technique has been implemented on a TMS320DM648 DSP, in Matlab's Simulink on a Windows 7 PC, and on a graphics card (NVIDIA Quadro K2000), demonstrating that the proposed approach can be applied in real-time processing mode. The processing times and the mean Structural Similarity Index Measure (SSIM) and Bad Matching Pixels (B) values for the different hardware implementations (GPU, single CPU, and DSP) are reported in this paper.
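The final block of the pipeline, anaglyph generation, is simple enough to sketch: the red channel is taken from the left view and the green/blue channels from the right view, so red-cyan glasses route each view to one eye. A minimal, library-free illustration (not the authors' implementation; real code would operate on decoded video frames):

```python
# Minimal red-cyan anaglyph sketch from a rectified left/right pair.
# Images here are plain nested lists of (R, G, B) tuples for illustration.

def anaglyph(left, right):
    """Red channel from the left view, green/blue from the right view."""
    return [
        [(lpx[0], rpx[1], rpx[2]) for lpx, rpx in zip(lrow, rrow)]
        for lrow, rrow in zip(left, right)
    ]

# Tiny 1x2 example: left is red-dominant, right is cyan-dominant.
left  = [[(200, 10, 10), (180, 20, 20)]]
right = [[(10, 150, 150), (20, 130, 130)]]
print(anaglyph(left, right))  # [[(200, 150, 150), (180, 130, 130)]]
```

The quality of the depth effect depends entirely on the disparity map used to synthesize the second view, which is why DM estimation dominates the pipeline's complexity.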

  4. Computational toxicology as implemented by the U.S. EPA: providing high throughput decision support tools for screening and assessing chemical exposure, hazard and risk.

    PubMed

    Kavlock, Robert; Dix, David

    2010-02-01

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce and to contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models. The models and underlying data are being made publicly available.

  5. High-Throughput Computational Screening of Electrical and Phonon Properties of Two-Dimensional Transition Metal Dichalcogenides

    NASA Astrophysics Data System (ADS)

    Williamson, Izaak; Hernandez, Andres Correa; Wong-Ng, Winnie; Li, Lan

    2016-10-01

    Two-dimensional transition metal dichalcogenides (2D-TMDs) are of growing research interest due to their novel physical, electrical, and thermoelectric properties. Having the chemical formula MX2, where M is a transition metal and X is a chalcogen, there are many possible combinations to consider for materials-by-design exploration. By identifying novel compositions and utilizing the lower dimensionality, which allows for improved thermoelectric performance (e.g., increased Seebeck coefficients without sacrificing electron concentration), MX2 materials are promising candidates for thermoelectric applications. However, to develop these materials into wide-scale use, it is crucial to comprehensively understand the compositional effects. This work investigates the structure, electronic, and phonon properties of 18 different MX2 material compositions as a benchmark to explore the impact of various elements. There is significant correlation between the properties of the constituent transition metals (atomic mass and radius) and the structure/properties of the corresponding 2D-TMDs. As the mass of M increases, the n-type power factor and phonon frequency gap increase. Similarly, increases in the radius of M lead to increased layer thickness and Seebeck coefficient S. Our results identify key factors for optimizing MX2 compositions for desired performance.

  6. 2D Radiative Transfer in Magnetically Confined Structures

    NASA Astrophysics Data System (ADS)

    Heinzel, P.; Anzer, U.

    2003-01-01

    Magnetically confined structures in the solar atmosphere exhibit a large complexity in their shapes and physical conditions. As an example, we show the case of so-called magnetic dips in prominences, which are in magnetohydrostatic equilibrium. For such models we solve the 2D non-LTE multilevel problem for hydrogen with PRD in the Lyman resonance lines. The iterative technique used is based on the MALI approach with a simple diagonal ALO and an SC formal solver. To compute the hydrogen ionization balance, the preconditioned MALI equations are linearized with respect to the atomic level populations and electron density and solved iteratively using the Newton-Raphson scheme. Two additional problems are addressed: (i) an adequate iteration method for cases when the column-mass scale is used in one of the two dimensions but varies along the other dimension (which has a geometrical scaling); and (ii) the possibility of using AMR (Adaptive Mesh Refinement) algorithms to account for steep 2D gradients of selected variables (temperature, density, etc.).

  7. Reconstruction of a 2D seismic wavefield by seismic gradiometry

    NASA Astrophysics Data System (ADS)

    Maeda, Takuto; Nishida, Kiwamu; Takagi, Ryota; Obara, Kazushige

    2016-12-01

    We reconstructed a 2D seismic wavefield and obtained its propagation properties by using the seismic gradiometry method together with dense observations from the Hi-net seismograph network in Japan. The seismic gradiometry method estimates the wave amplitude and its spatial derivative coefficients at any location from a discrete station record by using a Taylor series approximation. From the spatial derivatives in the horizontal directions, the properties of a propagating wave packet, including the arrival direction, slowness, geometrical spreading, and radiation pattern, can be obtained. In addition, by using the spatial derivatives together with free-surface boundary conditions, the 2D vector elastic wavefield can be decomposed into divergence and rotation components. First, as a feasibility test, we performed an analysis with a synthetic seismogram dataset computed by a numerical simulation for a realistic 3D medium and the actual Hi-net station layout. We confirmed that the wave amplitude and its spatial derivatives were very well reproduced for period bands longer than 25 s. Application to a real large earthquake showed that the amplitude and phase of the wavefield were well reconstructed, along with the slowness vector. The slowness of the reconstructed wavefield showed a clear contrast between body and surface waves and regional non-great-circle-path wave propagation, possibly owing to scattering. Slowness vectors, together with divergence and rotation decomposition, are expected to be useful for determining the constituents of observed wavefields in inhomogeneous media.
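The gradiometry estimate described above can be sketched as a first-order Taylor least-squares fit: each nearby station's amplitude is modeled as u0 + ux·Δx + uy·Δy relative to the target point, and the three coefficients are solved from the normal equations. A minimal sketch with synthetic station data (not Hi-net records):

```python
# Sketch of the gradiometry idea: recover the amplitude and its two
# horizontal derivatives at a target point (here the origin) from
# nearby station amplitudes via a least-squares Taylor fit.
# Station coordinates and amplitudes below are synthetic.

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            M[r] = [mr - f * mi for mr, mi in zip(M[r], M[i])]
    x = [0.0] * 3
    for i in range(2, -1, -1):
        x[i] = (M[i][3] - sum(M[i][j] * x[j] for j in range(i + 1, 3))) / M[i][i]
    return x

def gradiometry_fit(stations, amps):
    """Least squares for u(x, y) ~ u0 + ux*x + uy*y via normal equations."""
    G = [[1.0, x, y] for x, y in stations]           # design matrix rows
    GtG = [[sum(G[k][i] * G[k][j] for k in range(len(G))) for j in range(3)]
           for i in range(3)]
    Gtd = [sum(G[k][i] * amps[k] for k in range(len(G))) for i in range(3)]
    return solve3(GtG, Gtd)

stations = [(-1.0, 0.0), (1.0, 0.5), (0.5, -1.0), (0.2, 1.2)]
amps = [2.0 + 0.3 * x - 0.5 * y for x, y in stations]  # synthetic planar field
u0, ux, uy = gradiometry_fit(stations, amps)           # recovers 2.0, 0.3, -0.5
```

With a perfectly planar synthetic field the fit is exact; with real, noisy records the least-squares residual absorbs the higher-order Taylor terms, which is why the method's accuracy degrades at short periods, as noted in the abstract.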

  8. 2D Quantum Transport Modeling in Nanoscale MOSFETs

    NASA Technical Reports Server (NTRS)

    Svizhenko, Alexei; Anantram, M. P.; Govindan, T. R.; Biegel, B.

    2001-01-01

    We have developed physical approximations and computer code capable of realistically simulating 2-D nanoscale transistors, using the non-equilibrium Green's function (NEGF) method. This is the most accurate full quantum model yet applied to 2-D device simulation. Open boundary conditions, oxide tunneling, and phase-breaking scattering are treated on an equal footing. The electron bandstructure is treated within the anisotropic effective mass approximation. We present the results of our simulations of the MIT 25 and 90 nm "well-tempered" MOSFETs and compare them to those of classical and quantum-corrected models. An important feature of the quantum model is the smaller slope of the Id-Vg curve and, consequently, a higher threshold voltage. These results are consistent with 1D Schroedinger-Poisson calculations. The effect of gate length on gate-oxide leakage and subthreshold current has been studied. The shorter gate length device has an order of magnitude smaller leakage current than the longer gate length device, without a significant trade-off in on-current.

  9. Facial biometrics based on 2D vector geometry

    NASA Astrophysics Data System (ADS)

    Malek, Obaidul; Venetsanopoulos, Anastasios; Androutsos, Dimitrios

    2014-05-01

    The main challenge of facial biometrics is its robustness and ability to adapt to changes in position, orientation, facial expression, and illumination. This research addresses the predominant deficiencies in this regard and systematically investigates a facial authentication system in the Euclidean domain. In the proposed method, Euclidean geometry in 2D vector space is constructed for feature extraction and authentication. In particular, each assigned point of a candidate's biometric features is considered to be a 2D geometrical coordinate in the Euclidean vector space. Algebraic shapes of the extracted candidate features are also computed and compared. The proposed authentication method is tested on images from the public "Put Face Database". Its performance is evaluated based on the Correct Recognition Rate (CRR), False Acceptance Rate (FAR), and False Rejection Rate (FRR). The theoretical foundation of the proposed method, along with the experimental results, is also presented in this paper. The experimental results demonstrate the effectiveness of the proposed method.

  10. Comparison of 2D and 3D Displays and Sensor Fusion for Threat Detection, Surveillance, and Telepresence

    DTIC Science & Technology

    2003-05-19

    Comparison of 2D and 3D displays and sensor fusion for threat detection, surveillance, and telepresence. T. Meitzler, D. Bednarz, K. … The detection of camouflaged threats is compared on a two-dimensional (2D) display and a three-dimensional (3D) display. A 3D display is compared alongside a 2D … Technologies that take advantage of 3D and sensor fusion will be discussed. Computer-driven interactive 3D imaging has made …

  11. Very Fast Algorithms and Detection Performance of Multi-Channel and 2-D Parametric Adaptive Matched Filters for Airborne Radar

    DTIC Science & Technology

    2007-06-05

    tive to the AMF, [1] and [5] discovered that multi-channel and two-dimensional parametric estimation approaches could (1) reduce the computational...dimensional (2-D) parametric estimation using the 2-D least-squares-based lattice algorithm [4]. The specifics of the inverse are found in the next...non- parametric estimation techniques • Least square error (LSE) vs mean square error (MSE) • Primarily multi-channel (M-C) structures; also try 2-D

  12. Random forest learning of ultrasonic statistical physics and object spaces for lesion detection in 2D sonomammography

    NASA Astrophysics Data System (ADS)

    Sheet, Debdoot; Karamalis, Athanasios; Kraft, Silvan; Noël, Peter B.; Vag, Tibor; Sadhu, Anup; Katouzian, Amin; Navab, Nassir; Chatterjee, Jyotirmoy; Ray, Ajoy K.

    2013-03-01

    Breast cancer is the most common form of cancer in women. Early diagnosis can significantly improve life expectancy and allow different treatment options. Clinicians favor 2D ultrasonography for breast tissue abnormality screening due to its high sensitivity and specificity compared to competing technologies. However, inter- and intra-observer variability in the visual assessment and reporting of lesions often handicaps its performance. Existing Computer Assisted Diagnosis (CAD) systems, though able to detect solid lesions, are often restricted in performance: they are unable to (1) detect lesions of multiple sizes and shapes, and (2) differentiate hypo-echoic lesions from their posterior acoustic shadowing. In this work we present a completely automatic system for detection and segmentation of breast lesions in 2D ultrasound images. We employ random forests to learn a tissue-specific primal that discriminates breast lesions from surrounding normal tissues. This enables the system to detect lesions of multiple shapes and sizes, as well as to discriminate hypo-echoic lesions from associated posterior acoustic shadowing. The primal comprises (i) multiscale estimates of ultrasonic statistical physics and (ii) scale-space characteristics. The random forest learns the lesion-vs.-background primal from a database of 2D ultrasound images with labeled lesions. For segmentation, the posterior probabilities of lesion pixels estimated by the learnt random forest are hard thresholded to provide a random walks segmentation stage with starting seeds. Our method achieves detection with 99.19% accuracy and segmentation with mean contour-to-contour error < 3 pixels on a set of 40 images with 49 lesions.

  13. E-2D Advanced Hawkeye: primary flight display

    NASA Astrophysics Data System (ADS)

    Paolillo, Paul W.; Saxena, Ragini; Garruba, Jonathan; Tripathi, Sanjay; Blanchard, Randy

    2006-05-01

    This paper is a response to the challenge of providing a large area avionics display for the E-2D AHE aircraft. The resulting display design provides a pilot with high-resolution visual information content covering an image area of almost three square feet (Active Area of Samsung display = 33.792cm x 27.0336 cm = 13.304" x 10.643" = 141.596 square inches = 0.983 sq. ft x 3 = 2.95 sq. ft). The avionics display application, design and performance being described is the Primary Flight Display for the E-2D Advanced Hawkeye aircraft. This cockpit display has a screen diagonal size of 17 inches. Three displays, with minimum bezel width, just fit within the available instrument panel area. The significant design constraints of supporting an upgrade installation have been addressed. These constraints include a display image size that is larger than the mounting opening in the instrument panel. This, therefore, requires that the Electromagnetic Interference (EMI) window, LCD panel and backlight all fit within the limited available bezel depth. High brightness and a wide dimming range are supported with a dual mode Cold Cathode Fluorescent Tube (CCFT) and LED backlight. Packaging constraints dictated the use of multiple U shaped fluorescent lamps in a direct view backlight design for a maximum display brightness of 300 foot-Lamberts. The low intensity backlight levels are provided by remote LEDs coupled through a fiber optic mesh. This architecture generates luminous uniformity within a minimum backlight depth. Cross-cockpit viewing is supported by ultra-wide field-of-view performance, including the contrast and color stability of an advanced LCD cell design. Display system design tradeoffs gave priority to high optical efficiency for minimum power and weight.

  14. Comparing the Effect of Two Types of Computer Screen Background Lighting on Students' Reading Engagement and Achievement

    ERIC Educational Resources Information Center

    Botello, Jennifer A.

    2014-01-01

    With increased dependence on computer-based standardized tests to assess academic achievement, technological literacy has become an essential skill. Yet, because students have unequal access to technology, they may not have equal opportunities to perform well on these computer-based tests. The researcher had observed students taking the STAR…

  15. 2D/3D Visual Tracker for Rover Mast

    NASA Technical Reports Server (NTRS)

    Bajracharya, Max; Madison, Richard W.; Nesnas, Issa A.; Bandari, Esfandiar; Kunz, Clayton; Deans, Matt; Bualat, Maria

    2006-01-01

    A visual-tracker computer program controls an articulated mast on a Mars rover to keep a designated feature (a target) in view while the rover drives toward the target, avoiding obstacles. Several prior visual-tracker programs have been tested on rover platforms; most require very small and well-estimated motion between consecutive image frames, a requirement that is not realistic for a rover on rough terrain. The present visual-tracker program is designed to handle large image motions that lead to significant changes in feature geometry and photometry between frames. When a point is selected in one of the images acquired from stereoscopic cameras on the mast, a stereo triangulation algorithm computes a three-dimensional (3D) location for the target. As the rover moves, its body-mounted cameras feed images to a visual-odometry algorithm, which tracks two-dimensional (2D) corner features and computes their old and new 3D locations. The algorithm rejects points whose 3D motions are inconsistent with a rigid-world constraint, and then computes the apparent change in the rover pose (i.e., translation and rotation). The mast pan and tilt angles needed to keep the target centered in the field-of-view of the cameras (thereby minimizing the area over which the 2D-tracking algorithm must operate) are computed from the estimated change in the rover pose, the 3D position of the target feature, and a model of the kinematics of the mast. If the motion between consecutive frames is still large (i.e., 3D tracking was unsuccessful), an adaptive view-based matching technique is applied to the new image. This technique uses correlation-based template matching, in which a feature template is scaled by the ratio between the depth in the original template and the depth of pixels in the new image. This is repeated over the entire search window, and the best correlation results indicate the appropriate match.
The program could be a core for building application programs for systems
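
The depth-scaled template matching step described above can be sketched as follows (a minimal illustration under assumed details, not the flight code; nearest-neighbor resampling stands in for whatever interpolation the tracker actually uses):

```python
import numpy as np

# Hedged sketch of depth-scaled matching: scale the feature template by the
# ratio of its original depth to the depth at a candidate pixel, then score
# it against an image patch with normalized cross-correlation (NCC).
def ncc(patch, templ):
    p = patch - patch.mean()
    t = templ - templ.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom else 0.0

def scale_template(templ, depth_templ, depth_pixel):
    s = depth_templ / depth_pixel          # nearer surface -> larger template
    h = max(1, int(round(templ.shape[0] * s)))
    w = max(1, int(round(templ.shape[1] * s)))
    rows = (np.arange(h) * templ.shape[0] / h).astype(int)
    cols = (np.arange(w) * templ.shape[1] / w).astype(int)
    return templ[np.ix_(rows, cols)]       # nearest-neighbor resampling
```

A full tracker would repeat this over the entire search window, scaling the template at each pixel's depth and keeping the best-scoring location.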

  16. Learning from graphically integrated 2D and 3D representations improves retention of neuroanatomy

    NASA Astrophysics Data System (ADS)

    Naaz, Farah

    Visualizations in the form of computer-based learning environments are highly encouraged in science education, especially for teaching spatial material. Some spatial material, such as sectional neuroanatomy, is very challenging to learn. It involves learning the two dimensional (2D) representations that are sampled from the three dimensional (3D) object. In this study, a computer-based learning environment was used to explore the hypothesis that learning sectional neuroanatomy from a graphically integrated 2D and 3D representation will lead to better learning outcomes than learning from a sequential presentation. The integrated representation explicitly demonstrates the 2D-3D transformation and should lead to effective learning. This study was conducted using a computer graphical model of the human brain. There were two learning groups: Whole then Sections, and Integrated 2D3D. Both groups learned whole anatomy (3D neuroanatomy) before learning sectional anatomy (2D neuroanatomy). The Whole then Sections group then learned sectional anatomy using 2D representations only. The Integrated 2D3D group learned sectional anatomy from a graphically integrated 3D and 2D model. A set of tests for generalization of knowledge to interpreting biomedical images was conducted immediately after learning was completed. The order of presentation of the tests of generalization of knowledge was counterbalanced across participants to explore a secondary hypothesis of the study: preparation for future learning. If the computer-based instruction programs used in this study are effective tools for teaching anatomy, the participants should continue learning neuroanatomy with exposure to new representations. A test of long-term retention of sectional anatomy was conducted 4-8 weeks after learning was completed. The Integrated 2D3D group was better than the Whole then Sections

  17. WormAssay: a novel computer application for whole-plate motion-based screening of macroscopic parasites.

    PubMed

    Marcellino, Chris; Gut, Jiri; Lim, K C; Singh, Rahul; McKerrow, James; Sakanari, Judy

    2012-01-01

    Lymphatic filariasis is caused by filarial nematode parasites, including Brugia malayi. Adult worms live in the lymphatic system and cause a strong immune reaction that leads to the obstruction of lymph vessels and swelling of the extremities. Chronic disease leads to the painful and disfiguring condition known as elephantiasis. Current drug therapy is effective against the microfilariae (larval stage) of the parasite, but no drugs are effective against the adult worms. One of the major stumbling blocks toward developing effective macrofilaricides to kill the adult worms is the lack of a high throughput screening method for candidate drugs. Current methods utilize systems that measure one well at a time and are time consuming and often expensive. We have developed a low-cost and simple visual imaging system to automate and quantify screening entire plates based on parasite movement. This system can be applied to the study of many macroparasites as well as other macroscopic organisms.
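
The motion-based scoring idea can be sketched in a few lines (assumed details; the actual WormAssay application processes camera video of entire plates, not synthetic arrays):

```python
import numpy as np

# Minimal sketch of whole-plate motion scoring (assumed details, not the
# WormAssay source): score each well by the mean absolute intensity change
# between consecutive video frames inside that well's mask.
def motion_score(frames, well_mask):
    frames = np.asarray(frames, dtype=float)
    diffs = np.abs(np.diff(frames, axis=0))     # |frame[t+1] - frame[t]|
    return float(diffs[:, well_mask].mean())    # average over time and pixels

# A still well scores 0; movement in the well raises the score.
still = [np.zeros((8, 8)), np.zeros((8, 8))]
mask = np.ones((8, 8), dtype=bool)
print(motion_score(still, mask))  # -> 0.0
```

Scoring every well from the same full-plate frames is what makes this approach whole-plate rather than one-well-at-a-time.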

  18. WormAssay: A Novel Computer Application for Whole-Plate Motion-based Screening of Macroscopic Parasites

    PubMed Central

    Marcellino, Chris; Gut, Jiri; Lim, K. C.; Singh, Rahul; McKerrow, James; Sakanari, Judy

    2012-01-01

    Lymphatic filariasis is caused by filarial nematode parasites, including Brugia malayi. Adult worms live in the lymphatic system and cause a strong immune reaction that leads to the obstruction of lymph vessels and swelling of the extremities. Chronic disease leads to the painful and disfiguring condition known as elephantiasis. Current drug therapy is effective against the microfilariae (larval stage) of the parasite, but no drugs are effective against the adult worms. One of the major stumbling blocks toward developing effective macrofilaricides to kill the adult worms is the lack of a high throughput screening method for candidate drugs. Current methods utilize systems that measure one well at a time and are time consuming and often expensive. We have developed a low-cost and simple visual imaging system to automate and quantify screening entire plates based on parasite movement. This system can be applied to the study of many macroparasites as well as other macroscopic organisms. PMID:22303493

  19. Advecting Procedural Textures for 2D Flow Animation

    NASA Technical Reports Server (NTRS)

    Kao, David; Pang, Alex; Moran, Pat (Technical Monitor)

    2001-01-01

    This paper proposes the use of specially generated 3D procedural textures for visualizing steady-state 2D flow fields. We use the flow field to advect and animate the texture over time. However, using standard texture advection techniques and arbitrary textures will introduce some undesirable effects, such as: (a) expanding texture from a critical source point, (b) a streaking pattern from the boundary of the flow field, (c) crowding of advected textures near an attracting spiral or sink, and (d) absence or lack of textures in some regions of the flow. This paper proposes a number of strategies to solve these problems. We demonstrate how the technique works using both synthetic data and computational fluid dynamics data.
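
A minimal sketch of the underlying texture advection step, assuming a simple backward-Euler particle trace (the paper's actual technique adds the corrective strategies listed above):

```python
import numpy as np

# Hedged sketch of texture advection for a steady 2D flow (not the paper's
# implementation): trace each pixel backward along the velocity field and
# sample the procedural texture at the upstream position.
def advect_texture(texture_fn, velocity_fn, xs, ys, t, dt=0.05):
    x, y = np.meshgrid(xs, ys)
    steps = round(t / dt)
    for _ in range(steps):               # backward Euler integration
        u, v = velocity_fn(x, y)
        x, y = x - dt * u, y - dt * v
    return texture_fn(x, y)

# Example: uniform rightward flow carries a sinusoidal texture downstream.
tex = lambda x, y: np.sin(8 * x)
vel = lambda x, y: (np.ones_like(x), np.zeros_like(y))
frame = advect_texture(tex, vel, np.linspace(0, 1, 64), np.linspace(0, 1, 64), t=0.5)
```

Rendering `frame` for increasing `t` animates the texture; the artifacts (a)-(d) above arise precisely because this naive trace does nothing special at sources, sinks, or domain boundaries.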

  20. The Anatomy of High-Performance 2D Similarity Calculations

    PubMed Central

    Haque, Imran S.; Pande, Vijay S.

    2011-01-01

    Similarity measures based on the comparison of dense bit-vectors of two-dimensional chemical features are a dominant method in chemical informatics. For large-scale problems, including compound selection and machine learning, computing the intersection between two dense bit-vectors is the overwhelming bottleneck. We describe efficient implementations of this primitive, as well as example applications, using features of modern CPUs that allow 20-40x performance increases relative to typical code. Specifically, we describe fast methods for population count on modern x86 processors and cache-efficient matrix traversal and leader clustering algorithms that alleviate memory bandwidth bottlenecks in similarity matrix construction and clustering. The speed of our 2D comparison primitives is within a small factor of that obtained on GPUs, and does not require specialized hardware. PMID:21854053
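
The bottleneck primitive is compact to state; the sketch below uses Python's arbitrary-precision integers in place of the paper's packed x86 words and hardware POPCNT instruction:

```python
# Sketch of the core primitive: Tanimoto similarity of dense fingerprint
# bit-vectors, reduced to three population counts. Python ints stand in
# for the hardware bit-vectors; the paper's fast path uses x86 POPCNT.
def popcount(x: int) -> int:
    return bin(x).count("1")

def tanimoto(a: int, b: int) -> float:
    inter = popcount(a & b)
    union = popcount(a) + popcount(b) - inter
    return inter / union if union else 1.0  # define empty/empty as identical

print(tanimoto(0b101101, 0b100101))  # -> 0.75 (3 shared bits, 4 in union)
```

The cache-efficient matrix traversal the paper describes matters because, at this per-pair cost, memory bandwidth rather than arithmetic dominates large similarity-matrix construction.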

  1. Grid generation for general 2-D regions using hyperbolic equations

    NASA Technical Reports Server (NTRS)

    Cordova, Jeffrey Q.; Barth, Timothy J.

    1988-01-01

    A method for applying a hyperbolic grid generation scheme to the construction of meshes in general 2-D regions has been developed. This approach, which follows the theory developed by Steger and Chaussee (1980) and the algorithm outlined by Kinsey and Barth (1984), is based on improving local grid control. This is accomplished by adding an angle control source term to the equations and using a new algorithm for computing the volume source term. These modifications lead to superior methods for fixing the 'local' problems of hyperbolic grid generation, namely, propagation of initial discontinuities and formation of grid shocks (crossing grid lines). More importantly, a method for solving the global problem of constraining the grid with more than one boundary (internal grid generation) has been developed. These algorithms have been implemented in an interactive grid generation program and the results for several geometries are presented and discussed.

  2. The adaptive computer-aided diagnosis system based on tumor sizes for the classification of breast tumors detected at screening ultrasound.

    PubMed

    Moon, Woo Kyung; Chen, I-Ling; Chang, Jung Min; Shin, Sung Ui; Lo, Chung-Ming; Chang, Ruey-Feng

    2017-04-01

    Screening ultrasound (US) is increasingly used as a supplement to mammography in women with dense breasts, and more than 80% of cancers detected by US alone are 1 cm or smaller. An adaptive computer-aided diagnosis (CAD) system based on tumor size was proposed to classify breast tumors detected at screening US images using quantitative morphological and textural features. In the present study, a database containing 156 tumors (78 benign and 78 malignant) was separated into two subsets of different tumor sizes (<1 cm and ⩾1 cm) to explore the improvement in the performance of the CAD system. After adaptation, the accuracies, sensitivities, specificities and Az values of the CAD for the entire database increased from 73.1% (114/156), 73.1% (57/78), 73.1% (57/78), and 0.790 to 81.4% (127/156), 83.3% (65/78), 79.5% (62/78), and 0.852, respectively. In the data subset of tumors larger than 1 cm, the performance improved from 66.2% (51/77), 68.3% (28/41), 63.9% (23/36), and 0.703 to 81.8% (63/77), 85.4% (35/41), 77.8% (28/36), and 0.855, respectively. The proposed CAD system can be helpful to classify breast tumors detected at screening US.

  3. Computer-aided screening system for cervical precancerous cells based on field emission scanning electron microscopy and energy dispersive x-ray images and spectra

    NASA Astrophysics Data System (ADS)

    Jusman, Yessi; Ng, Siew-Cheok; Hasikin, Khairunnisa; Kurnia, Rahmadi; Osman, Noor Azuan Bin Abu; Teoh, Kean Hooi

    2016-10-01

    The capability of field emission scanning electron microscopy and energy dispersive x-ray spectroscopy (FE-SEM/EDX) to scan material structures at the microlevel and characterize materials by their elemental properties has inspired this research, which has developed an FE-SEM/EDX-based cervical cancer screening system. The developed computer-aided screening system consists of two parts: automatic feature extraction and classification. For the automatic feature extraction part, an algorithm for extracting the discriminant features of FE-SEM/EDX data from the images and spectra of cervical cells was introduced. The system automatically extracted two types of features based on FE-SEM/EDX images and FE-SEM/EDX spectra. Textural features were extracted from the FE-SEM/EDX image using a gray level co-occurrence matrix technique, while the FE-SEM/EDX spectra features were calculated from peak heights and the corrected area under the peaks. A discriminant analysis technique was employed to predict the cervical precancerous stage in three classes: normal, low-grade squamous intraepithelial lesion (LSIL), and high-grade squamous intraepithelial lesion (HSIL). The capability of the developed screening system was tested using 700 FE-SEM/EDX spectra (300 normal, 200 LSIL, and 200 HSIL cases). The accuracy, sensitivity, and specificity were 98.2%, 99.0%, and 98.0%, respectively.
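
The gray level co-occurrence matrix step can be sketched as follows (illustrative parameters; the offsets, quantization levels, and full feature set used in the actual system are not specified here):

```python
import numpy as np

# Hypothetical sketch of the textural side of the pipeline: a gray-level
# co-occurrence matrix (GLCM) for one pixel offset, plus a contrast feature.
def glcm(img, levels=8, dx=1, dy=0):
    lo, hi = img.min(), img.max()
    q = ((img - lo) * (levels - 1) // max(hi - lo, 1)).astype(int)  # quantize
    m = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[q[y, x], q[y + dy, x + dx]] += 1
    return m / m.sum()                      # normalize to joint probabilities

def contrast(m):
    i, j = np.indices(m.shape)
    return float(((i - j) ** 2 * m).sum())  # a standard Haralick feature

# A constant image has zero contrast: all co-occurrences lie on the diagonal.
print(contrast(glcm(np.full((16, 16), 7))))  # -> 0.0
```

Features like contrast (and its siblings: energy, homogeneity, correlation) computed from such matrices would then feed the discriminant analysis classifier.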

  4. Ab initio modeling of 2D layered organohalide lead perovskites

    NASA Astrophysics Data System (ADS)

    Fraccarollo, Alberto; Cantatore, Valentina; Boschetto, Gabriele; Marchese, Leonardo; Cossi, Maurizio

    2016-04-01

    A number of 2D layered perovskites A2PbI4 and BPbI4, with A and B mono- and divalent ammonium and imidazolium cations, have been modeled with different theoretical methods. The periodic structures have been optimized (both in monoclinic and in triclinic systems, corresponding to eclipsed and staggered arrangements of the inorganic layers) at the DFT level, with hybrid functionals, Gaussian-type orbitals and dispersion energy corrections. With the same methods, the various contributions to the solid stabilization energy have been discussed, separating electrostatic and dispersion energies, organic-organic intralayer interactions and H-bonding effects, when applicable. Then the electronic band gaps have been computed with plane waves, at the DFT level with scalar and full relativistic potentials, and including the correlation energy through the GW approximation. Spin orbit coupling and GW effects have been combined in an additive scheme, validated by comparing the computed gap with well known experimental and theoretical results for a model system. Finally, various contributions to the computed band gaps have been discussed on some of the studied systems, by varying some geometrical parameters and by substituting one cation in another's place.

  5. SALE2D. General Transient Fluid Flow Algorithm

    SciTech Connect

    Amsden, A.A.; Ruppel, H.M.; Hirt, C.W.

    1981-06-01

    SALE2D calculates two-dimensional fluid flows at all speeds, from the incompressible limit to highly supersonic. An implicit treatment of the pressure calculation similar to that in the Implicit Continuous-fluid Eulerian (ICE) technique provides this flow speed flexibility. In addition, the computing mesh may move with the fluid in a typical Lagrangian fashion, be held fixed in an Eulerian manner, or move in some arbitrarily specified way to provide a continuous rezoning capability. This latitude results from use of an Arbitrary Lagrangian-Eulerian (ALE) treatment of the mesh. The partial differential equations solved are the Navier-Stokes equations and the mass and internal energy equations. The fluid pressure is determined from an equation of state and supplemented with an artificial viscous pressure for the computation of shock waves. The computing mesh consists of a two-dimensional network of quadrilateral cells for either cylindrical or Cartesian coordinates, and a variety of user-selectable boundary conditions are provided in the program.

  6. Persistence Measures for 2d Soap Froth

    NASA Astrophysics Data System (ADS)

    Feng, Y.; Ruskin, H. J.; Zhu, B.

    Soap froths, as typical disordered cellular structures exhibiting spatial and temporal evolution, have been studied through their distributions and topological properties. Recently, persistence measures, which permit representation of the froth as a two-phase system, have been introduced to study froth dynamics at different length scales. Several aspects of the dynamics may be considered, and cluster persistence has been observed in froth experiments. Using a direct simulation method, we have investigated persistence properties in 2D froth, both by monitoring the persistence of survivor cells, a topologically independent measure, and in terms of cluster persistence. It appears that the area-fraction behavior for both survivor and cluster persistence is similar for the Voronoi froth and the uniform froth (with defects). Survivor and cluster persistent fractions are also similar for a uniform froth, particularly when geometries are constrained, but the differences observed for the Voronoi case appear to be attributable to the strong topological dependency inherent in cluster persistence. Survivor persistence, on the other hand, depends on the number rather than the size and position of the remaining bubbles and does not exhibit the characteristic decay to zero.

  7. SEM signal emulation for 2D patterns

    NASA Astrophysics Data System (ADS)

    Sukhov, Evgenii; Muelders, Thomas; Klostermann, Ulrich; Gao, Weimin; Braylovska, Mariya

    2016-03-01

    The application of accurate and predictive physical resist simulation is seen as one important use model for fast and efficient exploration of new patterning technology options, especially if fully qualified OPC models are not yet available at an early pre-production stage. The methodology of using top-down CD-SEM metrology to extract 3D resist profile information, such as the critical dimension (CD) at various resist heights, rests on a series of presumptions which may introduce small but systematic CD errors. Ideally, the metrology effects should be carefully minimized during the measurement process or, if possible, be taken into account through proper metrology modeling. In this paper we discuss the application of a fast SEM signal emulation describing SEM image formation. The algorithm is applied to simulated 3D resist profiles and produces emulated SEM image results for 1D and 2D patterns. It allows estimating resist simulation quality by comparing CDs extracted from the emulated and the measured SEM images. Moreover, SEM emulation is applied to resist model calibration to capture subtle error signatures through dose and defocus. Finally, it should be noted that our SEM emulation methodology is based on an approximation of the physical phenomena taking place in real SEM image formation. This approximation achieves better speed performance than a fully physical model.

  8. Competing coexisting phases in 2D water

    NASA Astrophysics Data System (ADS)

    Zanotti, Jean-Marc; Judeinstein, Patrick; Dalla-Bernardina, Simona; Creff, Gaëlle; Brubach, Jean-Blaise; Roy, Pascale; Bonetti, Marco; Ollivier, Jacques; Sakellariou, Dimitrios; Bellissent-Funel, Marie-Claire

    2016-05-01

    The properties of bulk water come from a delicate balance of interactions on length scales encompassing several orders of magnitude: i) the Hydrogen Bond (HBond) at the molecular scale and ii) the extension of this HBond network up to the macroscopic level. Here, we address the physics of water when the three-dimensional extension of the HBond network is frustrated, so that the water molecules are forced to organize in only two dimensions. We account for the large-scale fluctuating HBond network by an analytical mean-field percolation model. This approach provides a coherent interpretation of the different events experimentally detected (calorimetry, neutron, NMR, near and far infra-red spectroscopies) in interfacial water at 160, 220 and 250 K. Starting from an amorphous state of water at low temperature, these transitions are respectively interpreted as the onset of the creation of transient low-density patches of 4-HBonded molecules at 160 K, the percolation of these domains at 220 K, and finally the total invasion of the surface by them at 250 K. The source of this surprising behaviour in 2D is the frustration of the natural bulk tetrahedral local geometry and the underlying very significant increase in entropy of the interfacial water molecules.

  9. Competing coexisting phases in 2D water

    PubMed Central

    Zanotti, Jean-Marc; Judeinstein, Patrick; Dalla-Bernardina, Simona; Creff, Gaëlle; Brubach, Jean-Blaise; Roy, Pascale; Bonetti, Marco; Ollivier, Jacques; Sakellariou, Dimitrios; Bellissent-Funel, Marie-Claire

    2016-01-01

    The properties of bulk water come from a delicate balance of interactions on length scales encompassing several orders of magnitude: i) the Hydrogen Bond (HBond) at the molecular scale and ii) the extension of this HBond network up to the macroscopic level. Here, we address the physics of water when the three-dimensional extension of the HBond network is frustrated, so that the water molecules are forced to organize in only two dimensions. We account for the large-scale fluctuating HBond network by an analytical mean-field percolation model. This approach provides a coherent interpretation of the different events experimentally detected (calorimetry, neutron, NMR, near and far infra-red spectroscopies) in interfacial water at 160, 220 and 250 K. Starting from an amorphous state of water at low temperature, these transitions are respectively interpreted as the onset of the creation of transient low-density patches of 4-HBonded molecules at 160 K, the percolation of these domains at 220 K, and finally the total invasion of the surface by them at 250 K. The source of this surprising behaviour in 2D is the frustration of the natural bulk tetrahedral local geometry and the underlying very significant increase in entropy of the interfacial water molecules. PMID:27185018

  10. Radiofrequency Spectroscopy and Thermodynamics of Fermi Gases in the 2D to Quasi-2D Dimensional Crossover

    NASA Astrophysics Data System (ADS)

    Cheng, Chingyun; Kangara, Jayampathi; Arakelyan, Ilya; Thomas, John

    2016-05-01

    We tune the dimensionality of a strongly interacting degenerate 6Li Fermi gas from 2D to quasi-2D, by adjusting the radial confinement of pancake-shaped clouds to control the radial chemical potential. In the 2D regime with weak radial confinement, the measured pair binding energies are in agreement with 2D-BCS mean field theory, which predicts dimer pairing energies in the many-body regime. In the quasi-2D regime obtained with increased radial confinement, the measured pairing energy deviates significantly from 2D-BCS theory. In contrast to the pairing energy, the measured radii of the cloud profiles are not fit by 2D-BCS theory in either the 2D or quasi-2D regimes, but are fit in both regimes by a beyond-mean-field polaron model of the free energy. Supported by DOE, ARO, NSF, and AFOSR.

  11. A novel method for measuring the 2D information of burst strong flashing object in space

    NASA Astrophysics Data System (ADS)

    Zhong, P.; Jin, Ye

    2009-11-01

    Burst strong-flash events in space, such as a strong explosion in the low atmosphere, occur at random times and positions, and their duration is very short. In this paper, a photoelectric measuring device, namely a 2D angle localizer, is presented for measuring the 2D angle of a burst flashing object appearing at a random position. It mainly comprises a detecting head with a narrow slot, a cylindrical silicon photoelectric receiver, an absolute photoelectric encoder, and a computer. It measures the 2D information, namely the azimuth and pitching angles, of the center position of a spatial flashing object. The principle of angle measurement and the basic structure of the measuring device are introduced, and its critical parts are briefly described. A contrast experiment measuring the sun's 2D angle with the 2D angle localizer and a theodolite was made, and the measuring results and accuracy analysis are given. Equipped with variable-gain amplifiers and three silicon photoelectric receivers with cylindrical surfaces, the 2D angle localizer offers a wide dynamic measurement range and omnidirectional angle measurement. Its measuring accuracy is better than 2 mil, and a measurement can be completed in 0.5 s.
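
The geometry of the reported 2D information can be sketched independently of the hardware (a hypothetical illustration; the device's actual angle computation from encoder and receiver signals is not reproduced here):

```python
import math

# Hedged sketch of the output geometry only: convert a flash direction
# vector (x, y, z) into the azimuth and pitch angles that a 2D angle
# localizer would report.
def azimuth_pitch(x, y, z):
    azimuth = math.atan2(y, x)               # angle in the horizontal plane
    pitch = math.atan2(z, math.hypot(x, y))  # elevation above that plane
    return math.degrees(azimuth), math.degrees(pitch)

print(azimuth_pitch(1.0, 1.0, 0.0))  # ≈ (45.0, 0.0)
print(azimuth_pitch(0.0, 1.0, 1.0))  # ≈ (90.0, 45.0)
```

Using `atan2` rather than `atan` keeps the azimuth unambiguous over the full 360°, matching the omnidirectional measurement the abstract claims.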

  12. Methods for 2-D and 3-D Endobronchial Ultrasound Image Segmentation.

    PubMed

    Zang, Xiaonan; Bascom, Rebecca; Gilbert, Christopher; Toth, Jennifer; Higgins, William

    2016-07-01

    Endobronchial ultrasound (EBUS) is now commonly used for cancer-staging bronchoscopy. Unfortunately, EBUS is challenging to use and interpreting EBUS video sequences is difficult. Other ultrasound imaging domains, hampered by related difficulties, have benefited from computer-based image-segmentation methods. Yet, so far, no such methods have been proposed for EBUS. We propose image-segmentation methods for 2-D EBUS frames and 3-D EBUS sequences. Our 2-D method adapts the fast-marching level-set process, anisotropic diffusion, and region growing to the problem of segmenting 2-D EBUS frames. Our 3-D method builds upon the 2-D method while also incorporating the geodesic level-set process for segmenting EBUS sequences. Tests with lung-cancer patient data showed that the methods ran fully automatically for nearly 80% of test cases. For the remaining cases, the only user-interaction required was the selection of a seed point. When compared to ground-truth segmentations, the 2-D method achieved an overall Dice index = 90.0% ± 4.9%, while the 3-D method achieved an overall Dice index = 83.9% ± 6.0%. In addition, the computation time (2-D, 0.070 s/frame; 3-D, 0.088 s/frame) was two orders of magnitude faster than interactive contour definition. Finally, we demonstrate the potential of the methods for EBUS localization in a multimodal image-guided bronchoscopy system.

  13. New 2D diffraction model and its applications to terahertz parallel-plate waveguide power splitters

    NASA Astrophysics Data System (ADS)

    Zhang, Fan; Song, Kaijun; Fan, Yong

    2017-02-01

    A two-dimensional (2D) diffraction model for calculating the diffraction field in 2D space, and its applications to terahertz parallel-plate waveguide power splitters, are proposed in this paper. Compared with the Huygens-Fresnel principle in three-dimensional (3D) space, the proposed model provides an approximate analytical expression for the diffraction field in 2D space, where the diffraction field is regarded as a superposition integral. The calculated results obtained from the proposed diffraction model agree well with those from the software HFSS, which is based on the finite element method (FEM). Based on the proposed 2D diffraction model, two parallel-plate waveguide power splitters are presented. Each splitter consists of a transmitting horn antenna, reflectors, and a receiving antenna array. The reflector is cylindrical-parabolic with superimposed surface relief to efficiently couple the transmitted wave into the receiving antenna array; it acts as a computer-generated hologram that matches the transformed field to the receiving antenna aperture field. The power splitters were optimized by a modified real-coded genetic algorithm. The computed results for the splitters, which agreed well with those obtained using HFSS, verify the design method and show the good application prospects of the proposed 2D diffraction model.
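
The superposition-integral view of 2D diffraction can be illustrated numerically (a hedged sketch, not the authors' analytical expression): each aperture sample radiates a cylindrical wavelet whose amplitude decays as 1/sqrt(r) in 2D, and the field at an observation point is the sum of all wavelets.

```python
import numpy as np

# Hypothetical numeric sketch of a 2D Huygens-type superposition: sum
# cylindrical wavelets exp(ikr)/sqrt(r) from each aperture sample at each
# observation point.
def field_2d(src_xy, src_amp, obs_xy, k):
    out = np.zeros(len(obs_xy), dtype=complex)
    for (sx, sy), a in zip(src_xy, src_amp):
        r = np.hypot(obs_xy[:, 0] - sx, obs_xy[:, 1] - sy)
        out += a * np.exp(1j * k * r) / np.sqrt(r)
    return out

# Single source: field magnitude decays as 1/sqrt(distance) in 2D.
obs = np.array([[1.0, 0.0], [4.0, 0.0]])
f = field_2d([(0.0, 0.0)], [1.0], obs, k=2 * np.pi)
print(abs(f[0]) / abs(f[1]))  # ≈ 2.0 (sqrt(4)/sqrt(1))
```

The 1/sqrt(r) falloff is the essential difference from the 3D Huygens-Fresnel case, where spherical wavelets decay as 1/r.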

  14. New 2D diffraction model and its applications to terahertz parallel-plate waveguide power splitters

    PubMed Central

    Zhang, Fan; Song, Kaijun; Fan, Yong

    2017-01-01

    A two-dimensional (2D) diffraction model for calculating the diffraction field in 2D space, and its applications to terahertz parallel-plate waveguide power splitters, are proposed in this paper. Compared with the Huygens-Fresnel principle in three-dimensional (3D) space, the proposed model provides an approximate analytical expression for the diffraction field in 2D space, which is expressed as a superposition integral over the aperture. The calculated results obtained from the proposed diffraction model agree well with those computed by the software HFSS, which is based on the finite element method (FEM). Based on the proposed 2D diffraction model, two parallel-plate waveguide power splitters are presented. The splitters consist of a transmitting horn antenna, reflectors, and a receiving antenna array. The reflectors are cylindrical parabolic surfaces with superimposed surface relief that efficiently couple the transmitted wave into the receiving antenna array; they act as computer-generated holograms that match the transformed field to the receiving antenna aperture field. The power splitters were optimized by a modified real-coded genetic algorithm. The computed results for the splitters agree well with those obtained by HFSS, verifying the design method and demonstrating the promising applications of the proposed 2D diffraction model. PMID:28181514

  15. New 2D diffraction model and its applications to terahertz parallel-plate waveguide power splitters.

    PubMed

    Zhang, Fan; Song, Kaijun; Fan, Yong

    2017-02-09

    A two-dimensional (2D) diffraction model for calculating the diffraction field in 2D space, and its applications to terahertz parallel-plate waveguide power splitters, are proposed in this paper. Compared with the Huygens-Fresnel principle in three-dimensional (3D) space, the proposed model provides an approximate analytical expression for the diffraction field in 2D space, which is expressed as a superposition integral over the aperture. The calculated results obtained from the proposed diffraction model agree well with those computed by the software HFSS, which is based on the finite element method (FEM). Based on the proposed 2D diffraction model, two parallel-plate waveguide power splitters are presented. The splitters consist of a transmitting horn antenna, reflectors, and a receiving antenna array. The reflectors are cylindrical parabolic surfaces with superimposed surface relief that efficiently couple the transmitted wave into the receiving antenna array; they act as computer-generated holograms that match the transformed field to the receiving antenna aperture field. The power splitters were optimized by a modified real-coded genetic algorithm. The computed results for the splitters agree well with those obtained by HFSS, verifying the design method and demonstrating the promising applications of the proposed 2D diffraction model.

  16. Multirate-based fast parallel algorithms for 2-D DHT-based real-valued discrete Gabor transform.

    PubMed

    Tao, Liang; Kwan, Hon Keung

    2012-07-01

    Novel algorithms for the multirate and fast parallel implementation of the 2-D discrete Hartley transform (DHT)-based real-valued discrete Gabor transform (RDGT) and its inverse transform are presented in this paper. A 2-D multirate-based analysis convolver bank is designed for the 2-D RDGT, and a 2-D multirate-based synthesis convolver bank is designed for the 2-D inverse RDGT. The parallel channels in each of the two convolver banks have a unified structure and can apply the 2-D fast DHT algorithm to speed up their computations. The computational complexity of each parallel channel is low and is independent of the Gabor oversampling rate. All the 2-D RDGT coefficients of an image are computed in parallel during the analysis process and can be reconstructed in parallel during the synthesis process. The computational complexity and time of the proposed parallel algorithms are analyzed and compared with those of the existing fastest algorithms for 2-D discrete Gabor transforms. The results indicate that the proposed algorithms are the fastest, making them attractive for real-time image processing.
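
The real-valued transforms above rest on the discrete Hartley transform, which involves no complex arithmetic of its own and can be obtained from a standard FFT via H_k = Re(X_k) − Im(X_k); a 1-D sketch:

```python
import numpy as np

def dht(x):
    """Discrete Hartley transform via the FFT: H_k = Re(X_k) - Im(X_k),
    equivalently H_k = sum_n x_n * cas(2*pi*k*n/N), cas = cos + sin."""
    X = np.fft.fft(x)
    return X.real - X.imag

def idht(h):
    """The DHT is, up to a factor 1/N, its own inverse."""
    return dht(h) / len(h)

x = np.arange(8.0)
assert np.allclose(idht(dht(x)), x)  # round-trip recovers the signal
```

Note the 2-D DHT is not directly separable (cas does not factor the way the complex exponential does), which is why the dedicated 2-D fast DHT algorithms cited above matter.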

  17. ELRIS2D: A MATLAB Package for the 2D Inversion of DC Resistivity/IP Data

    NASA Astrophysics Data System (ADS)

    Akca, Irfan

    2016-04-01

    ELRIS2D is an open source code written in MATLAB for the two-dimensional inversion of direct current resistivity (DCR) and time domain induced polarization (IP) data. The user interface of the program is designed for functionality and ease of use. All available settings of the program can be reached from the main window. The subsurface is discretized using a hybrid mesh generated by the combination of structured and unstructured meshes, which reduces the computational cost of the whole inversion procedure. The inversion routine is based on the smoothness-constrained least squares method. In order to verify the program, responses of two test models and field data sets were inverted. The models inverted from the synthetic data sets are consistent with the original test models in both the DC resistivity and IP cases. A field data set acquired in an archaeological site is also used to verify the outcomes of the program in comparison with the excavation results.
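
The smoothness-constrained least squares step at the heart of such inversion codes can be written as m = (GᵀG + λLᵀL)⁻¹Gᵀd, with L a roughness (first-difference) operator; a toy dense-matrix sketch (G, d, and λ below are illustrative, not ELRIS2D's actual operators):

```python
import numpy as np

def smoothness_constrained_ls(G, d, lam):
    """Solve min ||G m - d||^2 + lam ||L m||^2, where L is the
    first-difference roughness operator (the smoothness constraint)."""
    n = G.shape[1]
    L = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]   # rows: m[i+1] - m[i]
    A = G.T @ G + lam * L.T @ L
    return np.linalg.solve(A, G.T @ d)

rng = np.random.default_rng(0)
G = rng.normal(size=(30, 10))                  # toy forward operator
m_true = np.linspace(1.0, 2.0, 10)             # smooth "true" model
d = G @ m_true + 0.01 * rng.normal(size=30)    # noisy data
m = smoothness_constrained_ls(G, d, lam=0.1)
# m closely tracks the smooth m_true; the penalty damps noise-driven roughness
```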

  18. gpuSPHASE-A shared memory caching implementation for 2D SPH using CUDA

    NASA Astrophysics Data System (ADS)

    Winkler, Daniel; Meister, Michael; Rezavand, Massoud; Rauch, Wolfgang

    2017-04-01

    Smoothed particle hydrodynamics (SPH) is a meshless Lagrangian method that has been successfully applied to computational fluid dynamics (CFD), solid mechanics and many other multi-physics problems. Using the method to solve transport phenomena in process engineering requires the simulation of several days to weeks of physical time. Given the high computational demand of CFD, such simulations in 3D would require years of computation time, making a reduction to a 2D domain inevitable. In this paper gpuSPHASE, a new open-source 2D SPH solver implementation for graphics devices, is developed. It is optimized for simulations that must be executed at thousands of frames per second to be computed in reasonable time. A novel caching algorithm for Compute Unified Device Architecture (CUDA) shared memory is proposed and implemented. The software is validated and the performance is evaluated for the well-established dam-break test case.
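
The kernel-weighted neighbour summation that dominates an SPH solver's run time, and that shared-memory caching schemes like the one above accelerate, looks as follows in serial form; a brute-force 2D sketch with the standard cubic-spline kernel (the particle setup is a hypothetical sanity check, not gpuSPHASE's implementation):

```python
import numpy as np

def cubic_spline_w(r, h):
    """Standard 2D cubic-spline SPH kernel, normalization 10/(7*pi*h^2)."""
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h * h)
    w = np.where(q < 1.0, 1 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2 - q)**3, 0.0))
    return sigma * w

def density_summation(pos, mass, h):
    """rho_i = sum_j m_j W(|r_i - r_j|, h); O(N^2) neighbour loop for clarity."""
    rho = np.zeros(len(pos))
    for i in range(len(pos)):
        r = np.linalg.norm(pos - pos[i], axis=1)
        rho[i] = np.sum(mass * cubic_spline_w(r, h))
    return rho

# Uniform lattice of unit-density particles: interior density should be ~1
dx = 0.1
xs, ys = np.meshgrid(np.arange(0, 2, dx), np.arange(0, 2, dx))
pos = np.column_stack([xs.ravel(), ys.ravel()])
mass = np.full(len(pos), dx * dx)        # m = rho0 * dx^2 with rho0 = 1
rho = density_summation(pos, mass, h=1.3 * dx)
rho_interior = rho[10 * 20 + 10]         # particle at (1.0, 1.0), far from edges
```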

  19. Generates 2D Input for DYNA NIKE & TOPAZ

    SciTech Connect

    Hallquist, J. O.; Sanford, Larry

    1996-07-15

    MAZE is an interactive program that serves as an input and two-dimensional mesh generator for DYNA2D, NIKE2D, TOPAZ2D, and CHEMICAL TOPAZ2D. MAZE also generates a basic template for ISLAND input. MAZE has been applied to the generation of input data to study the response of two-dimensional solids and structures undergoing finite deformations under a wide variety of large deformation transient dynamic and static problems and heat transfer analyses.

  20. MAZE96. Generates 2D Input for DYNA NIKE & TOPAZ

    SciTech Connect

    Sanford, L.; Hallquist, J.O.

    1992-02-24

    MAZE is an interactive program that serves as an input and two-dimensional mesh generator for DYNA2D, NIKE2D, TOPAZ2D, and CHEMICAL TOPAZ2D. MAZE also generates a basic template for ISLAND input. MAZE has been applied to the generation of input data to study the response of two-dimensional solids and structures undergoing finite deformations under a wide variety of large deformation transient dynamic and static problems and heat transfer analyses.

  1. Stability of two-dimensional (2D) natural convection flows in air-filled differentially heated cavities: 2D/3D disturbances

    NASA Astrophysics Data System (ADS)

    Xin, Shihe; Le Quéré, Patrick

    2012-06-01

    Following our previous two-dimensional (2D) studies of flows in differentially heated cavities filled with air, we studied the stability of 2D natural convection flows in these cavities with respect to 3D periodic perturbations. The basis of the numerical methods is a time-stepping code using the Chebyshev spectral collocation method and the direct Uzawa method for velocity-pressure coupling. Newton's iteration, Arnoldi's method and the continuation method have been used in order to, respectively, compute the 2D steady-state base solution, estimate the leading eigenmodes of the Jacobian and perform linear stability analysis. Differentially heated air-filled cavities of aspect ratios from 1 to 7 were investigated. Neutral curves (Rayleigh number versus wave number) have been obtained. It turned out that only for aspect ratio 7 does 3D stationary instability occur at slightly higher Rayleigh numbers than the onset of 2D time-dependent flow; for the other aspect ratios, 3D instability always takes place before 2D time-dependent flow. The 3D unstable modes are stationary and anti-centro-symmetric. 3D nonlinear simulations revealed that the corresponding pitchfork bifurcations are supercritical and that 3D instability leads only to weak flow in the third direction. Further 3D computations were also performed at higher Rayleigh numbers in order to understand the effects of the weak 3D fluid motion on the onset of time-dependent flow. The 3D flow structures are responsible for the onset of time-dependent flow for aspect ratios 1, 2 and 3, while for larger aspect ratios they do not alter the transition scenario observed in the 2D cases, in which the vertical boundary layers become unstable to traveling waves.
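
The Arnoldi step mentioned above needs only Jacobian-vector products, which is what makes it practical for large discretizations; a compact sketch (the diagonal test operator is a stand-in with a known spectrum, not a Navier-Stokes Jacobian):

```python
import numpy as np

def arnoldi_leading_eigs(matvec, n, m=30, seed=0):
    """m-step Arnoldi iteration: Ritz values of the upper Hessenberg matrix H
    approximate the leading eigenvalues of the operator given by matvec."""
    rng = np.random.default_rng(seed)
    Q = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    q = rng.normal(size=n)
    Q[:, 0] = q / np.linalg.norm(q)
    for j in range(m):
        v = matvec(Q[:, j])
        for i in range(j + 1):              # modified Gram-Schmidt
            H[i, j] = Q[:, i] @ v
            v -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        if H[j + 1, j] < 1e-12:             # happy breakdown
            m = j + 1
            break
        Q[:, j + 1] = v / H[j + 1, j]
    ritz = np.linalg.eigvals(H[:m, :m])
    return sorted(ritz, key=abs, reverse=True)

# Operator with known spectrum 1..100: the top Ritz value converges to 100
A = np.diag(np.arange(1.0, 101.0))
lead = arnoldi_leading_eigs(lambda x: A @ x, 100, m=40)
print(lead[0].real)  # close to 100, the dominant eigenvalue
```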

  2. Representativeness of 2D models to simulate 3D unstable variable density flow in porous media

    NASA Astrophysics Data System (ADS)

    Knorr, Bastian; Xie, Yueqing; Stumpp, Christine; Maloszewski, Piotr; Simmons, Craig T.

    2016-11-01

    Variable density flow in porous media has been studied primarily using numerical models because it is a semi-chaotic and transient process. Most of these studies have been 2D, owing to the computational restrictions on 3D simulations, and the ability to observe variable density flow in 2D experimentation. However, it is recognised that variable density flow is a three-dimensional process. A 3D system may cause weaker variable density flow than a 2D system due to stronger dispersion, but may also result in bigger fingers and hence stronger variable density flow because of more space for fingers to coalesce. This study aimed to determine the representativeness of 2D modelling to simulate 3D variable density flow. 3D homogeneous sand column experiments were conducted at three different water flow velocities with three different bromide tracer solutions mixed with methanol resulting in different density ratios. Both 2D axisymmetric and 3D numerical simulations were performed to reproduce experimental data. Experimental results showed that the magnitude of variable density flow increases with decreasing flow rates and decreasing density ratios. The shapes of the observed breakthrough curves differed significantly from those produced by 2D axisymmetric and 3D simulations. Compared to 2D simulations, the onset of instabilities was delayed but the growth was more pronounced in 3D simulations. Despite this difference, both 2D axisymmetric and 3D models successfully simulated mass recovery with high efficiency (between 77% and 99%). This study indicates that 2D simulations are sufficient to understand integrated features of variable density flow in homogeneous sand column experiments.

  3. Calculating tissue shear modulus and pressure by 2D Log-Elastographic methods

    PubMed Central

    McLaughlin, Joyce R; Zhang, Ning; Manduca, Armando

    2010-01-01

    Shear modulus imaging, often called elastography, enables detection and characterization of tissue abnormalities. In this paper, the data are two displacement components obtained from successive MR or ultrasound data sets acquired while the tissue is excited mechanically. A 2D plane strain elastic model is assumed to govern the 2D displacement, u. The shear modulus, μ, is unknown, and whether or not the first Lamé parameter, λ, is known, the pressure p = λ∇ · u, which is present in the plane strain model, cannot be measured; it is unreliably computed from measured data and can be shown to be an order-one quantity in units of kPa. So here we present a 2D Log-Elastographic inverse algorithm that: (1) simultaneously reconstructs the shear modulus, μ, and p, which together satisfy a first order partial differential equation system, with the goal of imaging μ; (2) controls potential exponential growth in the numerical error; and (3) reliably reconstructs the quantity p in the inverse algorithm as compared to the same quantity computed with a forward algorithm. This work generalizes the Log-Elastographic algorithm in [20], which uses one displacement component, is derived assuming the component satisfies the wave equation, and is tested on synthetic data computed with the wave equation model. The 2D Log-Elastographic algorithm is tested on 2D synthetic data and 2D in-vivo data from the Mayo Clinic. We also exhibit examples to show that the 2D Log-Elastographic algorithm improves the quality of the recovered images as compared to the Log-Elastographic and Direct Inversion algorithms. PMID:21822349
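
The pressure term p = λ∇ · u is exactly the quantity that is fragile to compute from data, since it differentiates the measured displacement; a finite-difference sketch on a synthetic field with a known answer (all values hypothetical):

```python
import numpy as np

def pressure_plane_strain(u1, u2, dx, lam):
    """p = lam * div(u) = lam * (d(u1)/dx + d(u2)/dy), central differences."""
    du1_dx = np.gradient(u1, dx, axis=1)   # u1 varies along columns (x)
    du2_dy = np.gradient(u2, dx, axis=0)   # u2 varies along rows (y)
    return lam * (du1_dx + du2_dy)

# Analytic check: u = (x, y) has div u = 2, so p = 2*lam everywhere
n, dx, lam = 50, 0.1, 3.0
y, x = np.meshgrid(np.arange(n) * dx, np.arange(n) * dx, indexing="ij")
p = pressure_plane_strain(x, y, dx, lam)
print(np.allclose(p, 2 * lam))  # True: finite differences are exact here
```

On a linear field the differences are exact; on noisy measured displacement the same operation amplifies noise, which is the difficulty the Log-Elastographic algorithm addresses.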

  4. Framework for 2D-3D image fusion of infrared thermography with preoperative MRI.

    PubMed

    Hoffmann, Nico; Weidner, Florian; Urban, Peter; Meyer, Tobias; Schnabel, Christian; Radev, Yordan; Schackert, Gabriele; Petersohn, Uwe; Koch, Edmund; Gumhold, Stefan; Steiner, Gerald; Kirsch, Matthias

    2017-01-23

    Multimodal medical image fusion combines information from one or more images in order to improve the diagnostic value. While previous applications mainly focus on merging images from computed tomography, magnetic resonance imaging (MRI), ultrasound and single-photon emission computed tomography, we propose a novel approach for the registration and fusion of preoperative 3D MRI with intraoperative 2D infrared thermography. Image-guided neurosurgeries are based on neuronavigation systems, which further allow us to track the position and orientation of arbitrary cameras. Hereby, we are able to relate the 2D coordinate system of the infrared camera to the 3D MRI coordinate system. The registered image data are then combined by calibration-based image fusion in order to map our intraoperative 2D thermographic images onto the respective brain surface recovered from preoperative MRI. In extensive accuracy measurements, we found that the proposed framework achieves a mean accuracy of 2.46 mm.
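
Relating a tracked 2D camera frame to a 3D volume frame, as described above, amounts to a pinhole projection with tracked extrinsics, x ~ K[R|t]X; a sketch (the intrinsics and pose values below are hypothetical, not the paper's calibration):

```python
import numpy as np

def project_points(points_3d, R, t, K):
    """Project 3D points into the 2D image plane: x ~ K (R X + t)."""
    cam = points_3d @ R.T + t           # world (e.g. MRI) -> camera frame
    pix = cam @ K.T                     # apply camera intrinsics
    return pix[:, :2] / pix[:, 2:3]     # perspective divide

K = np.array([[800.0, 0, 320],          # focal lengths and principal point
              [0, 800.0, 240],
              [0, 0, 1]])
R, t = np.eye(3), np.zeros(3)           # identity pose for the check
pts = np.array([[0.0, 0.0, 2.0],        # on the optical axis
                [0.1, 0.0, 2.0]])       # slightly off-axis
print(project_points(pts, R, t, K))     # first point maps to (320, 240)
```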

  5. FRANC2D: A two-dimensional crack propagation simulator. Version 2.7: User's guide

    NASA Technical Reports Server (NTRS)

    Wawrzynek, Paul; Ingraffea, Anthony

    1994-01-01

    FRANC2D (FRacture ANalysis Code, 2 Dimensions) is a menu-driven, interactive finite element computer code that performs fracture mechanics analyses of 2-D structures. The code has an automatic mesh generator for triangular and quadrilateral elements. FRANC2D calculates the stress intensity factor using linear elastic fracture mechanics and evaluates crack extension using several methods that may be selected by the user. The code features a mesh refinement and adaptive mesh generation capability that is automatically applied according to the predicted crack extension direction and length. The code also has unique features that permit the analysis of layered structures with load transfer through simulated mechanical fasteners or bonded joints. The code was written for UNIX workstations with X-windows graphics and may be executed on the following computers: DEC DECstation 3000 and 5000 series, IBM RS/6000 series, Hewlett-Packard 9000/700 series, Sun SPARC stations, and most Silicon Graphics models.
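
For reference, the stress intensity factor that such codes compute numerically has a textbook closed form in the simplest configuration, a center crack of half-length a in an infinite plate under remote tension σ; a sketch (the values are illustrative):

```python
import math

def k_i_center_crack(sigma_mpa, a_m):
    """Mode-I stress intensity factor for a center crack in an infinite
    plate under remote tension: K_I = sigma * sqrt(pi * a), MPa*sqrt(m)."""
    return sigma_mpa * math.sqrt(math.pi * a_m)

# 100 MPa remote stress, 10 mm half crack length
print(round(k_i_center_crack(100.0, 0.01), 2))  # 17.72 MPa*sqrt(m)
```

Finite geometries require a correction factor to this form, which is where a numerical code like FRANC2D earns its keep.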

  6. Digital phase-stepping holographic interferometry in measuring 2-D density fields

    NASA Astrophysics Data System (ADS)

    Lanen, T. A. W. M.; Nebbeling, C.; van Ingen, J. L.

    1990-06-01

    This paper presents a holographic interferometer technique for measuring transparent (2-D or quasi 2-D) density fields. To be able to study the realization of such a field at a certain moment of time, the field is “frozen” on a holographic plate. During the reconstruction of the density field from the hologram the length of the path traversed by the reconstruction beam is diminished in equal steps by applying a computer controlled voltage to a piezo-electric crystal that translates a mirror. Four phase-stepped interferograms resulting from this pathlength variation are digitized and serve as input to an algorithm for computing the phase surface. The method is illustrated by measuring the basically 2-D density field existing around a heated horizontal cylinder in free convection.
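
The four phase-stepped interferograms described above determine the phase through the standard four-step formula φ = atan2(I4 − I2, I1 − I3); a synthetic sketch with a known phase map:

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Four-step phase-shifting: frames stepped by pi/2 each, so
    I_k = a + b*cos(phi + (k-1)*pi/2)  =>  phi = atan2(I4 - I2, I1 - I3)."""
    return np.arctan2(i4 - i2, i1 - i3)

# Synthetic fringes: background a, modulation b, known phase ramp
phi = np.linspace(-np.pi + 0.1, np.pi - 0.1, 100)
a, b = 2.0, 1.0
frames = [a + b * np.cos(phi + k * np.pi / 2) for k in range(4)]
rec = four_step_phase(*frames)
print(np.allclose(rec, phi))  # True: phase recovered exactly
```

The background a and modulation b cancel in the differences, which is why the method is insensitive to illumination nonuniformity; real data additionally needs phase unwrapping across the ±π jumps.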

  7. Measurement of astrophysical S factors and electron screening potentials for d(d, n){sup 3}He reaction In ZrD{sub 2}, TiD{sub 2}, D{sub 2}O, and CD{sub 2} targets in the ultralow energy region using plasma accelerators

    SciTech Connect

    Bystritsky, V. M.; Bystritskii, Vit. M.; Dudkin, G. N.; Filipowicz, M.; Gazi, S.; Huran, J.; Kobzev, A. P.; Mesyats, G. A.; Nechaev, B. A.; Padalko, V. N.; Parzhitskii, S. S.; Pen'kov, F. M.; Philippov, A. V.; Kaminskii, V. L.; Tuleushev, Yu. Zh.; Wozniak, J.

    2012-01-15

    The paper is devoted to studying the influence of the electron screening effect on the rate of the d(d, n){sup 3}He reaction in the ultralow deuteron collision energy range in deuterated polyethylene (CD{sub 2}), frozen heavy water (D{sub 2}O) and deuterated metals (ZrD{sub 2} and TiD{sub 2}). The ZrD{sub 2} and TiD{sub 2} targets were fabricated via magnetron sputtering of titanium and zirconium in a deuterium gas environment. The experiments were carried out using a high-current pulsed plasma accelerator forming an inverse Z-pinch (HCEIRAS, Russia) and a pulsed Hall plasma accelerator (NPI at TPU, Russia). Neutrons with an energy of 2.5 MeV from the dd reaction were detected with plastic scintillation spectrometers. As a result of the experiments, the energy dependence of the astrophysical S factor for the dd reaction in the deuteron collision energy range of 2-7 keV and the values of the electron screening potential U{sub e} of the interacting deuterons were measured for the targets indicated above: U{sub e}(CD{sub 2}) ≤ 40 eV; U{sub e}(D{sub 2}O) ≤ 26 eV; U{sub e}(ZrD{sub 2}) = 157 ± 43 eV; U{sub e}(TiD{sub 2}) = 125 ± 34 eV. The value of the astrophysical S factor extrapolated to zero deuteron collision energy was found in the experiments with the D{sub 2}O target: S{sub b}(0) = 58.6 ± 3.6 keV b. The paper compares our results with other available published experimental and calculated data.
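
The screening potential U_e reported above enters through the commonly used enhancement factor f(E) ≈ exp(πη U_e/E), where η is the Sommerfeld parameter; a sketch for the d+d system using the standard keV/amu parametrization 2πη = 31.29 Z₁Z₂ √(μ/E) (the example energy is illustrative):

```python
import numpy as np

def sommerfeld_2pi_eta(e_kev, z1=1, z2=1, mu_amu=1.0068):
    """2*pi*eta = 31.29 * Z1*Z2 * sqrt(mu/E), mu in amu, E in keV
    (standard parametrization; mu_amu is the d+d reduced mass)."""
    return 31.29 * z1 * z2 * np.sqrt(mu_amu / e_kev)

def cross_section(e_kev, s_kev_b):
    """Bare cross-section from the astrophysical S factor:
    sigma(E) = S(E)/E * exp(-2*pi*eta), in barn for S in keV*b."""
    return s_kev_b / e_kev * np.exp(-sommerfeld_2pi_eta(e_kev))

def screening_enhancement(e_kev, ue_ev):
    """f(E) ~ exp(pi*eta*Ue/E): rate gain from electron screening."""
    return np.exp(0.5 * sommerfeld_2pi_eta(e_kev) * (ue_ev / 1000.0) / e_kev)

# At E = 4 keV, the ZrD2 value Ue = 157 eV boosts the dd rate noticeably
print(screening_enhancement(4.0, 157.0))   # roughly a 36% enhancement
```

The steep growth of f(E) toward low energies is what makes ultralow-energy measurements like those above sensitive to U_e at all.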

  8. Lung cancer screening.

    PubMed

    Tanoue, Lynn T; Tanner, Nichole T; Gould, Michael K; Silvestri, Gerard A

    2015-01-01

    The United States Preventive Services Task Force recommends lung cancer screening with low-dose computed tomography (LDCT) in adults of age 55 to 80 years who have a 30 pack-year smoking history and are currently smoking or have quit within the past 15 years. This recommendation is largely based on the findings of the National Lung Screening Trial. Both policy-level and clinical decision-making about LDCT screening must consider the potential benefits of screening (reduced mortality from lung cancer) and possible harms. Effective screening requires an appreciation that screening should be limited to individuals at high risk of death from lung cancer, and that the risk of harm related to false positive findings, overdiagnosis, and unnecessary invasive testing is real. A comprehensive understanding of these aspects of screening will inform appropriate implementation, with the objective that an evidence-based and systematic approach to screening will help to reduce the enormous mortality burden of lung cancer.

  9. Seeing through the Screen: Is Evaluative Feedback Communicated More Effectively in Face-to-Face or Computer-Mediated Exchanges?

    ERIC Educational Resources Information Center

    Hebert, Brenda G.; Vorauer, Jacquie D.

    2003-01-01

    Describes a study of college students that examined how the use of computer mediated communication affected the transmission of performance and interpersonal appraisal information. Examined whether interpersonal judgments obtained through face-to-face communication resulted in greater positivity, but compromised accuracy, relative to…

  10. Newborn Screening

    MedlinePlus

    ... Activities Importance of Newborn Screening Newborn Screening and Molecular Biology Branch Pulse Oximetry Screening for CCHDs Sickle Cell Disease Laboratory SCID Quality Assurance Training and Resources ...

  11. CYP2D7 Sequence Variation Interferes with TaqMan CYP2D6*15 and *35 Genotyping

    PubMed Central

    Riffel, Amanda K.; Dehghani, Mehdi; Hartshorne, Toinette; Floyd, Kristen C.; Leeder, J. Steven; Rosenblatt, Kevin P.; Gaedigk, Andrea

    2016-01-01

    TaqMan™ genotyping assays are widely used to genotype CYP2D6, which encodes a major drug metabolizing enzyme. Assay design for CYP2D6 can be challenging owing to the presence of two pseudogenes, CYP2D7 and CYP2D8, structural and copy number variation and numerous single nucleotide polymorphisms (SNPs) some of which reflect the wild-type sequence of the CYP2D7 pseudogene. The aim of this study was to identify the mechanism causing false-positive CYP2D6*15 calls and remediate those by redesigning and validating alternative TaqMan genotype assays. Among 13,866 DNA samples genotyped by the CompanionDx® lab on the OpenArray platform, 70 samples were identified as heterozygotes for 137Tins, the key SNP of CYP2D6*15. However, only 15 samples were confirmed when tested with the Luminex xTAG CYP2D6 Kit and sequencing of CYP2D6-specific long range (XL)-PCR products. Genotype and gene resequencing of CYP2D6 and CYP2D7-specific XL-PCR products revealed a CC>GT dinucleotide SNP in exon 1 of CYP2D7 that reverts the sequence to CYP2D6 and allows a TaqMan assay PCR primer to bind. Because CYP2D7 also carries a Tins, a false-positive mutation signal is generated. This CYP2D7 SNP was also responsible for generating false-positive signals for rs769258 (CYP2D6*35) which is also located in exon 1. Although alternative CYP2D6*15 and *35 assays resolved the issue, we discovered a novel CYP2D6*15 subvariant in one sample that carries additional SNPs preventing detection with the alternate assay. The frequency of CYP2D6*15 was 0.1% in this ethnically diverse U.S. population sample. In addition, we also discovered linkage between the CYP2D7 CC>GT dinucleotide SNP and the 77G>A (rs28371696) SNP of CYP2D6*43. The frequency of this tentatively functional allele was 0.2%. Taken together, these findings emphasize that regardless of how careful genotyping assays are designed and evaluated before being commercially marketed, rare or unknown SNPs underneath primer and/or probe regions can impact

  12. CYP2D7 Sequence Variation Interferes with TaqMan CYP2D6 (*) 15 and (*) 35 Genotyping.

    PubMed

    Riffel, Amanda K; Dehghani, Mehdi; Hartshorne, Toinette; Floyd, Kristen C; Leeder, J Steven; Rosenblatt, Kevin P; Gaedigk, Andrea

    2015-01-01

    TaqMan™ genotyping assays are widely used to genotype CYP2D6, which encodes a major drug metabolizing enzyme. Assay design for CYP2D6 can be challenging owing to the presence of two pseudogenes, CYP2D7 and CYP2D8, structural and copy number variation and numerous single nucleotide polymorphisms (SNPs) some of which reflect the wild-type sequence of the CYP2D7 pseudogene. The aim of this study was to identify the mechanism causing false-positive CYP2D6 (*) 15 calls and remediate those by redesigning and validating alternative TaqMan genotype assays. Among 13,866 DNA samples genotyped by the CompanionDx® lab on the OpenArray platform, 70 samples were identified as heterozygotes for 137Tins, the key SNP of CYP2D6 (*) 15. However, only 15 samples were confirmed when tested with the Luminex xTAG CYP2D6 Kit and sequencing of CYP2D6-specific long range (XL)-PCR products. Genotype and gene resequencing of CYP2D6 and CYP2D7-specific XL-PCR products revealed a CC>GT dinucleotide SNP in exon 1 of CYP2D7 that reverts the sequence to CYP2D6 and allows a TaqMan assay PCR primer to bind. Because CYP2D7 also carries a Tins, a false-positive mutation signal is generated. This CYP2D7 SNP was also responsible for generating false-positive signals for rs769258 (CYP2D6 (*) 35) which is also located in exon 1. Although alternative CYP2D6 (*) 15 and (*) 35 assays resolved the issue, we discovered a novel CYP2D6 (*) 15 subvariant in one sample that carries additional SNPs preventing detection with the alternate assay. The frequency of CYP2D6 (*) 15 was 0.1% in this ethnically diverse U.S. population sample. In addition, we also discovered linkage between the CYP2D7 CC>GT dinucleotide SNP and the 77G>A (rs28371696) SNP of CYP2D6 (*) 43. The frequency of this tentatively functional allele was 0.2%. Taken together, these findings emphasize that regardless of how careful genotyping assays are designed and evaluated before being commercially marketed, rare or unknown SNPs underneath primer

  13. 2D Quantum Mechanical Study of Nanoscale MOSFETs

    NASA Technical Reports Server (NTRS)

    Svizhenko, Alexei; Anantram, M. P.; Govindan, T. R.; Biegel, B.; Kwak, Dochan (Technical Monitor)

    2000-01-01

    With the onset of quantum confinement in the inversion layer in nanoscale MOSFETs, the behavior of the resonant level inevitably determines all device characteristics. While most classical device simulators take quantization into account in some simplified manner, the important details of electrostatics are missing. Our work addresses this shortcoming and provides: (a) a framework to quantitatively explore device physics issues such as the source-drain and gate leakage currents, DIBL, and threshold voltage shift due to quantization, and (b) a means of benchmarking quantum corrections to semiclassical models (such as density-gradient and quantum-corrected MEDICI). We have developed physical approximations and computer code capable of realistically simulating 2-D nanoscale transistors, using the non-equilibrium Green's function (NEGF) method. This is the most accurate full quantum model yet applied to 2-D device simulation. Open boundary conditions and oxide tunneling are treated on an equal footing. Electrons in the ellipsoids of the conduction band are treated within the anisotropic effective mass approximation. We present the results of our simulations of the MIT 25, 50 and 90 nm "well-tempered" MOSFETs and compare them to those of classical and quantum-corrected models. An important feature of the quantum model is the smaller slope of the Id-Vg curve and, consequently, a higher threshold voltage. Surprisingly, the self-consistent potential profile shows a lower injection barrier in the channel in the quantum case. These results are qualitatively consistent with 1D Schroedinger-Poisson calculations. The effect of gate length on gate-oxide leakage and subthreshold current has been studied. The shorter gate length device has an order of magnitude smaller current at zero gate bias than the longer gate length device without a significant trade-off in on-current. This should be a device design consideration.
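
The NEGF machinery referenced above reduces, for a 1D tight-binding toy model, to G = [(E + i0)I − H − Σ_L − Σ_R]⁻¹ and T(E) = Tr(Γ_L G Γ_R G†); a sketch with closed-form lead self-energies (parameters hypothetical, a toy wire rather than the paper's 2-D MOSFET model):

```python
import numpy as np

def transmission(E, H, t_lead, eta=1e-9):
    """Landauer transmission of a 1D tight-binding device coupled to two
    semi-infinite 1D leads: T = Tr(Gamma_L G Gamma_R G^dagger).

    Sigma = t^2 * g_surface, using the closed-form surface Green's function
    of a semi-infinite chain with hopping t_lead and onsite energy 0."""
    n = H.shape[0]
    z = E + 1j * eta
    # retarded surface GF: g = (z - sqrt(z^2 - 4 t^2)) / (2 t^2)
    sq = np.sqrt(z**2 - 4 * t_lead**2 + 0j)
    if sq.imag < 0:                     # pick the retarded branch (Im g <= 0)
        sq = -sq
    g_surf = (z - sq) / (2 * t_lead**2)
    sig_l = np.zeros((n, n), dtype=complex)
    sig_r = np.zeros((n, n), dtype=complex)
    sig_l[0, 0] = t_lead**2 * g_surf    # lead attaches to the first site
    sig_r[-1, -1] = t_lead**2 * g_surf  # and to the last site
    G = np.linalg.inv(z * np.eye(n) - H - sig_l - sig_r)
    gam_l = 1j * (sig_l - sig_l.conj().T)
    gam_r = 1j * (sig_r - sig_r.conj().T)
    return np.trace(gam_l @ G @ gam_r @ G.conj().T).real

# Perfect wire: device identical to the leads -> one ballistic channel, T = 1
t = 1.0
N = 5
H = -t * (np.eye(N, k=1) + np.eye(N, k=-1))
print(transmission(0.5, H, t))  # ~1.0 inside the band
```

Adding a barrier or disorder to H drops T below one, which is the toy analogue of the source-drain barrier physics discussed above.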

  14. Thermocapillary bubble dynamics in a 2D axis swirl domain

    NASA Astrophysics Data System (ADS)

    Alhendal, Yousuf; Turan, Ali

    2014-09-01

    The lack of significant buoyancy effects in zero-gravity conditions poses an issue with fluid transfer in a stagnant liquid. In this paper, bubble movement in a stagnant liquid is analysed and presented nu