Science.gov

Sample records for 2d computer screen

  1. MAGNUM-2D computer code: user's guide

    SciTech Connect

    England, R.L.; Kline, N.W.; Ekblad, K.J.; Baca, R.G.

    1985-01-01

    Information relevant to the general use of the MAGNUM-2D computer code is presented. This computer code was developed for the purpose of modeling (i.e., simulating) the thermal and hydraulic conditions in the vicinity of a waste package emplaced in a deep geologic repository. The MAGNUM-2D computer code computes (1) the temperature field surrounding the waste package as a function of the heat generation rate of the nuclear waste and thermal properties of the basalt and (2) the hydraulic head distribution and associated groundwater flow fields as a function of the temperature gradients and hydraulic properties of the basalt. MAGNUM-2D is a two-dimensional numerical model for transient or steady-state analysis of coupled heat transfer and groundwater flow in a fractured porous medium. The governing equations consist of a set of coupled, quasi-linear partial differential equations that are solved using a Galerkin finite-element technique. A Newton-Raphson algorithm is embedded in the Galerkin functional to formulate the problem in terms of the incremental changes in the dependent variables. Both triangular and quadrilateral finite elements are used to represent the continuum portions of the spatial domain. Line elements may be used to represent discrete conduits. 18 refs., 4 figs., 1 tab.

  2. 2D NMR-spectroscopic screening reveals polyketides in ladybugs

    PubMed Central

    Deyrup, Stephen T.; Eckman, Laura E.; McCarthy, Patrick H.; Smedley, Scott R.; Meinwald, Jerrold; Schroeder, Frank C.

    2011-01-01

    Small molecules of biological origin continue to yield the most promising leads for drug design, but systematic approaches for exploring nature’s cache of structural diversity are lacking. Here, we demonstrate the use of 2D NMR spectroscopy to screen a library of biorationally selected insect metabolite samples for partial structures indicating the presence of new chemical entities. This NMR-spectroscopic survey enabled detection of novel compounds in complex metabolite mixtures without prior fractionation or isolation. Our screen led to discovery and subsequent isolation of two families of tricyclic pyrones in Delphastus catalinae, a tiny ladybird beetle that is employed commercially as a biological pest control agent. The D. catalinae pyrones are based on 23-carbon polyketide chains forming 1,11-dioxo-2,6,10-trioxaanthracene and 4,8-dioxo-1,9,13-trioxaanthracene derivatives, representing ring systems not previously found in nature. This study highlights the utility of 2D NMR-spectroscopic screening for exploring nature’s structure space and suggests that insect metabolomes remain vastly underexplored. PMID:21646540

  3. 2D NMR-spectroscopic screening reveals polyketides in ladybugs.

    PubMed

    Deyrup, Stephen T; Eckman, Laura E; McCarthy, Patrick H; Smedley, Scott R; Meinwald, Jerrold; Schroeder, Frank C

    2011-06-14

    Small molecules of biological origin continue to yield the most promising leads for drug design, but systematic approaches for exploring nature's cache of structural diversity are lacking. Here, we demonstrate the use of 2D NMR spectroscopy to screen a library of biorationally selected insect metabolite samples for partial structures indicating the presence of new chemical entities. This NMR-spectroscopic survey enabled detection of novel compounds in complex metabolite mixtures without prior fractionation or isolation. Our screen led to discovery and subsequent isolation of two families of tricyclic pyrones in Delphastus catalinae, a tiny ladybird beetle that is employed commercially as a biological pest control agent. The D. catalinae pyrones are based on 23-carbon polyketide chains forming 1,11-dioxo-2,6,10-trioxaanthracene and 4,8-dioxo-1,9,13-trioxaanthracene derivatives, representing ring systems not previously found in nature. This study highlights the utility of 2D NMR-spectroscopic screening for exploring nature's structure space and suggests that insect metabolomes remain vastly underexplored. PMID:21646540

  4. Computational Design of 2D materials for Energy Applications

    NASA Astrophysics Data System (ADS)

    Sun, Qiang

    2015-03-01

    Since the successful synthesis of graphene, tremendous efforts have been devoted to two-dimensional monolayers such as boron nitride (BN), silicene and MoS2. These 2D materials exhibit a large variety of physical and chemical properties with unprecedented applications. Here we report our recent studies of computational design of 2D materials for fuel cell applications which include hydrogen storage, CO2 capture, CO conversion and O2 reduction.

  5. 2-D and 3-D computations of curved accelerator magnets

    SciTech Connect

    Turner, L.R.

    1991-01-01

    In order to save computer memory, a long accelerator magnet may be computed by treating the long central region and the end regions separately. The dipole magnets for the injector synchrotron of the Advanced Photon Source (APS), now under construction at Argonne National Laboratory (ANL), employ magnet iron consisting of parallel laminations, stacked with a uniform radius of curvature of 33.379 m. Laplace's equation for the magnetic scalar potential has a different form for a straight magnet (x-y coordinates), a magnet with surfaces curved about a common center (r-{theta} coordinates), and a magnet with parallel laminations like the APS injector dipole. Yet pseudo 2-D computations for the three geometries give basically identical results, even for a much more strongly curved magnet. Hence 2-D (x-y) computations of the central region and 3-D computations of the end regions can be combined to determine the overall magnetic behavior of the magnets. 1 ref., 6 figs.

  6. Validation and testing of the VAM2D computer code

    SciTech Connect

    Kool, J.B.; Wu, Y.S. )

    1991-10-01

    This document describes two modeling studies conducted by HydroGeoLogic, Inc. for the US NRC under contract no. NRC-04089-090, entitled "Validation and Testing of the VAM2D Computer Code." VAM2D is a two-dimensional, variably saturated flow and transport code, with applications for performance assessment of nuclear waste disposal. The computer code itself is documented in a separate NUREG document (NUREG/CR-5352, 1989). The studies presented in this report involve application of the VAM2D code to two diverse subsurface modeling problems. The first one involves modeling of infiltration and redistribution of water and solutes in an initially dry, heterogeneous field soil. This application involves detailed modeling over a relatively short, 9-month time period. The second problem pertains to the application of VAM2D to the modeling of a waste disposal facility in a fractured clay, over much larger space and time scales and with particular emphasis on the applicability and reliability of using an equivalent porous medium approach for simulating flow and transport in fractured geologic media. Reflecting the separate and distinct nature of the two problems studied, this report is organized in two separate parts. 61 refs., 31 figs., 9 tabs.

  7. Screening and transport in 2D semiconductor systems at low temperatures

    PubMed Central

    Das Sarma, S.; Hwang, E. H.

    2015-01-01

    Low temperature carrier transport properties in 2D semiconductor systems can be theoretically well-understood within RPA-Boltzmann theory as being limited by scattering from screened Coulomb disorder arising from random quenched charged impurities in the environment. In this work, we derive a number of analytical formulas, supported by realistic numerical calculations, for the relevant density, mobility, and temperature range where 2D transport should manifest strong intrinsic (i.e., arising purely from electronic effects) metallic temperature dependence in different semiconductor materials arising entirely from the 2D screening properties, thus providing an explanation for why the strong temperature dependence of the 2D resistivity can only be observed in high-quality and low-disorder 2D samples and also why some high-quality 2D materials manifest much weaker metallicity than other materials. We also discuss effects of interaction and disorder on the 2D screening properties in this context as well as compare 2D and 3D screening functions to comment on why such a strong intrinsic temperature dependence arising from screening cannot occur in 3D metallic carrier transport. Experimentally verifiable predictions are made about the quantitative magnitude of the maximum possible low-temperature metallicity in 2D systems and the scaling behavior of the temperature scale controlling the quantum to classical crossover. PMID:26572738

  8. Computing 2D constrained delaunay triangulation using the GPU.

    PubMed

    Qi, Meng; Cao, Thanh-Tung; Tan, Tiow-Seng

    2013-05-01

    We propose the first graphics processing unit (GPU) solution to compute the 2D constrained Delaunay triangulation (CDT) of a planar straight line graph (PSLG) consisting of points and edges. There are many existing CPU algorithms to solve the CDT problem in computational geometry, yet there has been no prior approach to solve this problem efficiently using the parallel computing power of the GPU. For the special case of the CDT problem where the PSLG consists of just points, which is simply the normal Delaunay triangulation (DT) problem, a hybrid approach using the GPU together with the CPU to partially speed up the computation has already been presented in the literature. Our work, on the other hand, accelerates the entire computation on the GPU. Our implementation using the CUDA programming model on NVIDIA GPUs is numerically robust, and runs up to an order of magnitude faster than the best sequential implementations on the CPU. This result is reflected in our experiment with both randomly generated PSLGs and real-world GIS data having millions of points and edges. PMID:23492377
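
    For the unconstrained special case mentioned above (a plain Delaunay triangulation of points), a minimal CPU reference is easy to write with SciPy. This is only a baseline for comparison, not the authors' GPU-CDT implementation, and the point count is arbitrary.

    ```python
    # Minimal CPU reference for the unconstrained special case (plain Delaunay
    # triangulation of a point set); the paper's GPU algorithm additionally
    # honors constraint edges and runs on CUDA. Assumes NumPy/SciPy.
    import numpy as np
    from scipy.spatial import Delaunay

    rng = np.random.default_rng(0)
    points = rng.random((10_000, 2))          # random PSLG vertices (no edges here)

    tri = Delaunay(points)                    # CPU Delaunay triangulation
    print("triangles:", len(tri.simplices))   # each row holds 3 vertex indices
    ```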

  9. Prestack depth migration for complex 2D structure using phase-screen propagators

    SciTech Connect

    Roberts, P.; Huang, Lian-Jie; Burch, C.; Fehler, M.; Hildebrand, S.

    1997-11-01

    We present results for the phase-screen propagator method applied to prestack depth migration of the Marmousi synthetic data set. The data were migrated as individual common-shot records and the resulting partial images were superposed to obtain the final complete image. Tests were performed to determine the minimum number of frequency components required to achieve the best quality image and this in turn provided estimates of the minimum computing time. Running on a single processor SUN SPARC Ultra I, high quality images were obtained in as little as 8.7 CPU hours and adequate images were obtained in as little as 4.4 CPU hours. Different methods were tested for choosing the reference velocity used for the background phase-shift operation and for defining the slowness perturbation screens. Although the depths of some of the steeply dipping, high-contrast features were shifted slightly, the overall image quality was fairly insensitive to the choice of the reference velocity. Our tests show the phase-screen method to be a reliable and fast algorithm for imaging complex geologic structures, at least for complex 2D synthetic data where the velocity model is known.

  10. Preconditioning 2D Integer Data for Fast Convex Hull Computations.

    PubMed

    Cadenas, José Oswaldo; Megson, Graham M; Luengo Hendriks, Cris L

    2016-01-01

    In order to accelerate computing the convex hull on a set of n points, a heuristic procedure is often applied to reduce the number of points to a set of s points, s ≤ n, which also contains the same hull. We present an algorithm to precondition 2D data with integer coordinates bounded by a box of size p × q before building a 2D convex hull, with three distinct advantages. First, we prove that under the condition min(p, q) ≤ n the algorithm executes in time within O(n); second, no explicit sorting of data is required; and third, the reduced set of s points forms a simple polygonal chain and thus can be directly pipelined into an O(n) time convex hull algorithm. This paper empirically evaluates and quantifies the speed up gained by preconditioning a set of points by a method based on the proposed algorithm before using common convex hull algorithms to build the final hull. A speedup factor of at least four is consistently found from experiments on various datasets when the condition min(p, q) ≤ n holds; the smaller the ratio min(p, q)/n is in the dataset, the greater the speedup factor achieved. PMID:26938221
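
    The abstract does not spell out the preconditioning procedure, but one common way to realize this kind of integer-grid reduction is to keep, for every occupied x-column of the p × q box, only the points with extreme y values; interior points of a column can never be hull vertices. The sketch below (NumPy/SciPy, with a hypothetical precondition helper) illustrates that idea rather than the authors' exact algorithm.

    ```python
    # Hedged sketch of integer-grid preconditioning before a convex hull run:
    # for every occupied x-column keep only the extreme-y points, since a point
    # strictly between them lies on the segment joining them and cannot be a
    # hull vertex. Not the authors' exact algorithm, but the same bucket idea.
    import numpy as np
    from scipy.spatial import ConvexHull

    def precondition(points):                      # points: (n, 2) integer array
        reduced = {}
        for x, y in points:
            lo, hi = reduced.get(x, (y, y))
            reduced[x] = (min(lo, y), max(hi, y))
        return np.array([(x, y) for x, (lo, hi) in reduced.items()
                         for y in {lo, hi}])       # <= 2 points per column

    rng = np.random.default_rng(1)
    pts = rng.integers(0, 1000, size=(100_000, 2))
    subset = precondition(pts)
    hull = ConvexHull(subset)                      # hull of the reduced set
    print(len(pts), "->", len(subset), "points;", len(hull.vertices), "hull vertices")
    ```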

  11. Preconditioning 2D Integer Data for Fast Convex Hull Computations

    PubMed Central

    2016-01-01

    In order to accelerate computing the convex hull on a set of n points, a heuristic procedure is often applied to reduce the number of points to a set of s points, s ≤ n, which also contains the same hull. We present an algorithm to precondition 2D data with integer coordinates bounded by a box of size p × q before building a 2D convex hull, with three distinct advantages. First, we prove that under the condition min(p, q) ≤ n the algorithm executes in time within O(n); second, no explicit sorting of data is required; and third, the reduced set of s points forms a simple polygonal chain and thus can be directly pipelined into an O(n) time convex hull algorithm. This paper empirically evaluates and quantifies the speed up gained by preconditioning a set of points by a method based on the proposed algorithm before using common convex hull algorithms to build the final hull. A speedup factor of at least four is consistently found from experiments on various datasets when the condition min(p, q) ≤ n holds; the smaller the ratio min(p, q)/n is in the dataset, the greater the speedup factor achieved. PMID:26938221

  12. Comparison of 2D versus 3D mammography with screening cases: an observer study

    NASA Astrophysics Data System (ADS)

    Fernandez, James Reza; Deshpande, Ruchi; Hovanessian-Larsen, Linda; Liu, Brent

    2012-02-01

    Breast cancer is the most common type of non-skin cancer in women. 2D mammography is a screening tool to aid in the early detection of breast cancer, but has diagnostic limitations of overlapping tissues, especially in dense breasts. 3D mammography has the potential to improve detection outcomes by increasing specificity, and a new 3D screening tool with a 3D display for mammography aims to improve performance and efficiency as compared to 2D mammography. An observer study using collected clinical screening cases was performed to compare traditional 2D mammography with this new 3D mammography technique. A prior study using a mammography phantom revealed no difference in calcification detection, but improved mass detection in 2D as compared to 3D. There was a significant decrease in reading time for masses, calcifications, and normals in 3D compared to 2D, however, as well as more favorable confidence levels in reading normal cases. Data for the current study are still being obtained, and a full report should be available in the next few weeks.

  13. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates.

    PubMed

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include:
    • The implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms (see the sketch below).
    • The modular approach, along with the lookup tables implemented, helps avoid the issue of indeterminate results which may occur when attempting to directly evaluate the transform.
    • The concept also helps prevent unnecessary computation of already known transforms, thereby saving memory and processing time. PMID:26150988
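
    The decomposition the toolbox exploits is the classical one: expand the function in an angular Fourier series and take the n-th order Hankel transform of each radial coefficient. The sketch below illustrates that identity numerically with NumPy/SciPy (the paper's toolbox does this symbolically); the sampling grid, truncation order and Fourier sign convention are assumptions.

    ```python
    # Hedged numerical illustration of the polar-coordinate decomposition:
    # expand f(r, theta) in an angular Fourier series, then take the n-th order
    # Hankel transform of each radial coefficient (up to FT sign/normalization
    # convention). The paper's toolbox performs this symbolically.
    import numpy as np
    from scipy.special import jv
    from scipy.integrate import simpson

    def polar_ft(f, r, n_max=8, rho=1.0):
        """Return F_n(rho) for |n| <= n_max, with
        F_n(rho) = 2*pi*(-1j)**n * int_0^inf f_n(r) J_n(rho*r) r dr."""
        theta = np.linspace(0.0, 2*np.pi, 256, endpoint=False)
        R, T = np.meshgrid(r, theta, indexing="ij")
        f_n = np.fft.fft(f(R, T), axis=1) / theta.size   # angular coefficients f_n(r)
        out = {}
        for n in range(-n_max, n_max + 1):
            integrand = f_n[:, n] * jv(n, rho * r) * r
            out[n] = 2*np.pi * (-1j)**n * simpson(integrand, x=r)
        return out

    r = np.linspace(0.0, 10.0, 2000)
    coeffs = polar_ft(lambda R, T: np.exp(-R**2) * np.cos(T), r)
    print(abs(coeffs[1]))   # only the n = +/-1 terms are non-negligible here
    ```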

  14. Computation of 2D Navier-Stokes equations

    NASA Astrophysics Data System (ADS)

    Chakrabartty, Sunil Kumar

    Two schemes for computing two-dimensional Navier-Stokes equations are described and applied to laminar flow over a flat plate and viscous flow over a NACA0012 airfoil. The variation of local skin-friction coefficient with local Reynolds number is compared with the Blasius solution and that of Swanson and Turkel (1985). The effect of free-stream Mach number on the temperature profile is shown, and a comparison is made of velocity profile at M(infinity) = 0.50 and Re = 500, with no artificial viscosity used for stability. Pressure distributions, local skin friction distributions, and velocity profiles on the airfoil and wake are presented.

  15. Cytochrome P450-2D6 Screening Among Elderly Using Antidepressants (CYSCE)

    ClinicalTrials.gov

    2015-12-09

    Depression; Depressive Disorder; Poor Metabolizer Due to Cytochrome P450 CYP2D6 Variant; Intermediate Metabolizer Due to Cytochrome P450 CYP2D6 Variant; Ultrarapid Metabolizer Due to Cytochrome P450 CYP2D6 Variant

  16. 8. DETAIL OF COMPUTER SCREEN AND CONTROL BOARDS: LEFT SCREEN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. DETAIL OF COMPUTER SCREEN AND CONTROL BOARDS: LEFT SCREEN TRACKS RESIDUAL CHLORINE; INDICATES AMOUNT OF SUNLIGHT WHICH ENABLES OPERATOR TO ESTIMATE NEEDED CHLORINE; CENTER SCREEN SHOWS TURNOUT STRUCTURES; RIGHT SCREEN SHOWS INDICATORS OF ALUMINUM SULFATE TANK FARM. - F. E. Weymouth Filtration Plant, 700 North Moreno Avenue, La Verne, Los Angeles County, CA

  17. Fast Computation of Wideband Beam Pattern for Designing Large-Scale 2-D Arrays.

    PubMed

    Chi, Cheng; Li, Zhaohui

    2016-06-01

    For real-time and high-resolution 3-D ultrasound imaging, the design of sparse distribution and weights of elements of a large-scale wideband 2-D array is needed to reduce hardware cost and achieve better directivity. However, due to the high time consumption of computing the wideband beam pattern, the design methods that need massive iterations have rarely been applied to design large-scale wideband 2-D arrays by directly computing the wideband beam pattern. In this paper, a fast method is proposed to realize the computation of a wideband beam pattern of arbitrary 2-D arrays in the far field in order to design large-scale wideband 2-D arrays. The proposed fast method exploits two important techniques: 1) nonuniform fast Fourier transform (FFT) and 2) short inverse FFT. Compared with the commonly used ultrasound simulator Field II, two orders of magnitude improvement in computation speed is achieved with comparable accuracy. The proposed fast method enables massive iterations of direct wideband beam pattern computation of arbitrary large-scale 2-D arrays. A design example in this paper demonstrates that the proposed fast method can help achieve better performance in designing large-scale wideband 2-D arrays. PMID:27046870
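
    As a point of reference, the quantity being accelerated can be written as a direct sum over elements and frequencies. The sketch below evaluates one common definition of a far-field wideband beam pattern (spectrum-weighted sum of narrowband patterns) by brute force in NumPy; the array layout, pulse spectrum and scan grid are illustrative assumptions, and this direct O(elements × directions × frequencies) cost is exactly what the nonuniform-FFT approach avoids.

    ```python
    # Brute-force far-field wideband beam pattern of a planar 2-D array: sum the
    # narrowband element responses over frequency, weighted by the pulse
    # spectrum. Array geometry, apodization and spectrum are made-up examples.
    import numpy as np

    c = 1500.0                                   # speed of sound in water, m/s
    pitch = 0.3e-3
    xe, ye = np.meshgrid(np.arange(32), np.arange(32))
    elem = np.stack([xe.ravel(), ye.ravel()], axis=1) * pitch   # element positions
    w = np.ones(elem.shape[0])                   # apodization weights

    freqs = np.linspace(2e6, 8e6, 61)            # wideband pulse support
    spectrum = np.exp(-0.5 * ((freqs - 5e6) / 1e6) ** 2)        # Gaussian spectrum

    theta = np.linspace(-np.pi/2, np.pi/2, 181)  # scan in one azimuth plane
    ux = np.sin(theta)                           # direction cosines (uy = 0)

    bp = np.zeros(theta.size)
    for f, s in zip(freqs, spectrum):
        k = 2 * np.pi * f / c
        phase = np.exp(1j * k * elem[:, 0][None, :] * ux[:, None])
        bp += s * np.abs(phase @ w)              # coherent sum over elements
    bp /= bp.max()
    print("normalized pattern at broadside:", bp[90])
    ```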

  18. GEO2D - Two-Dimensional Computer Model of a Ground Source Heat Pump System

    DOE Data Explorer

    James Menart

    2013-06-07

    This file contains a zipped file that contains many files required to run GEO2D. GEO2D is a computer code for simulating ground source heat pump (GSHP) systems in two dimensions. GEO2D performs a detailed finite difference simulation of the heat transfer occurring within the working fluid, the tube wall, the grout, and the ground. Both horizontal and vertical wells can be simulated with this program, but it should be noted that the vertical well is modeled as a single tube. This program also models the heat pump in conjunction with the heat transfer occurring. GEO2D simulates the heat pump and ground loop as a system. Many results are produced by GEO2D as a function of time and position, such as heat transfer rates, temperatures and heat pump performance. On top of this information, GEO2D provides an economic comparison between the simulated geothermal system and a comparable air heat pump system, or a comparable gas, oil or propane heating system with a vapor compression air conditioner. The version of GEO2D in the attached file has been coupled to the DOE heating and cooling load software called ENERGYPLUS. This is a great convenience for the user because heating and cooling loads are an input to GEO2D. GEO2D is a user friendly program that uses a graphical user interface for inputs and outputs. These make entering data simple and they produce many plotted results that are easy to understand. In order to run GEO2D, access to MATLAB is required. If this program is not available on your computer, you can download the program MCRInstaller.exe, the 64 bit version, from the MATLAB website or from this geothermal repository. This is a free download which will enable you to run GEO2D.

  19. Quantum computational capability of a 2D valence bond solid phase

    SciTech Connect

    Miyake, Akimasa

    2011-07-15

    Highlights:
    > Our model is the 2D valence bond solid phase of a quantum antiferromagnet.
    > Universal quantum computation is processed by measurements of quantum correlations.
    > An intrinsic complexity of strongly-correlated quantum systems could be a resource.
    Abstract: Quantum phases of naturally-occurring systems exhibit distinctive collective phenomena as a manifestation of their many-body correlations, in contrast to our persistent technological challenge to engineer at will such strong correlations artificially. Here we show theoretically that quantum correlations exhibited in the 2D valence bond solid phase of a quantum antiferromagnet, modeled by Affleck, Kennedy, Lieb, and Tasaki (AKLT) as a precursor of spin liquids and topological orders, are sufficiently complex yet structured enough to simulate universal quantum computation when every single spin can be measured individually. This unveils that an intrinsic complexity of naturally-occurring 2D quantum systems, which has been a long-standing challenge for traditional computers, could be tamed as a computationally valuable resource, even if we are not allowed to create new entanglement during computation. Our constructive protocol leverages a novel way to herald the correlations suitable for deterministic quantum computation through a random sampling, and may be extensible to other ground states of various 2D valence bond phases beyond the AKLT state.

  20. Topological evolutionary computing in the optimal design of 2D and 3D structures

    NASA Astrophysics Data System (ADS)

    Burczynski, T.; Poteralski, A.; Szczepanik, M.

    2007-10-01

    An application of evolutionary algorithms and the finite-element method to the topology optimization of 2D structures (plane stress, bending plates, and shells) and 3D structures is described. The basis of the topological evolutionary optimization is the direct control of the density material distribution (or thickness for 2D structures) by the evolutionary algorithm. The structures are optimized for stress, mass, and compliance criteria. The numerical examples demonstrate that this method is an effective technique for solving problems in computer-aided optimal design.

  1. Using Membrane Computing for Obtaining Homology Groups of Binary 2D Digital Images

    NASA Astrophysics Data System (ADS)

    Christinal, Hepzibah A.; Díaz-Pernil, Daniel; Jurado, Pedro Real

    Membrane Computing is a new paradigm inspired by cellular communication. Until now, P systems have been used in research areas like modeling chemical processes, ecosystems, etc. In this paper, we apply P systems to computational topology in the context of digital images. We work with a variant of P systems called tissue-like P systems to calculate, in a general maximally parallel manner, the homology groups of 2D images. In fact, homology computation for binary pixel-based 2D digital images can be reduced to connected component labeling of white and black regions. Finally, we use software called Tissue Simulator to show with some examples how these systems work.
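
    As the abstract notes, for binary 2D images the homology computation reduces to connected component labeling: the rank of H0 is the number of foreground components and the rank of H1 is the number of holes (background components that do not touch the border). A hedged sketch using plain labeling with SciPy, rather than the tissue-like P systems of the paper:

    ```python
    # For a binary 2D image, the Betti numbers are b0 = number of foreground
    # components and b1 = number of holes (background components not touching
    # the border). Plain connected component labeling with scipy.ndimage,
    # standing in for the paper's P-system construction.
    import numpy as np
    from scipy import ndimage

    def betti_numbers(img):                       # img: 2D boolean array
        eight = np.ones((3, 3), dtype=int)        # 8-connectivity for foreground
        four = ndimage.generate_binary_structure(2, 1)   # 4-connectivity for background
        _, b0 = ndimage.label(img, structure=eight)
        bg_labels, n_bg = ndimage.label(~img, structure=four)
        border = np.unique(np.concatenate([bg_labels[0], bg_labels[-1],
                                           bg_labels[:, 0], bg_labels[:, -1]]))
        b1 = n_bg - np.count_nonzero(border)      # background components not on border
        return b0, b1

    ring = np.zeros((9, 9), dtype=bool)
    ring[2:7, 2:7] = True
    ring[4, 4] = False                            # punch a hole
    print(betti_numbers(ring))                    # (1, 1): one component, one hole
    ```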

  2. Numerical computation of 2D Sommerfeld integrals - Decomposition of the angular integral

    NASA Astrophysics Data System (ADS)

    Dvorak, Steven L.; Kuester, Edward F.

    1992-02-01

    The computational efficiency of 2D Sommerfeld integrals is improved through novel ways of computing the inner angular integral in polar representations. It is shown that the angular integral can be decomposed into a finite number of incomplete Lipschitz-Hankel integrals; these can in turn be calculated through a series of expansions, so that the angular integral can be computed by summing a series rather than applying a standard numerical integration algorithm. The technique is most efficient and accurate when piecewise-sinusoidal basis functions are employed to analyze a printed strip-dipole antenna in a layered medium.

  3. Computational efficient segmentation of cell nuclei in 2D and 3D fluorescent micrographs

    NASA Astrophysics Data System (ADS)

    De Vylder, Jonas; Philips, Wilfried

    2011-02-01

    This paper proposes a new technique developed for the segmentation of cell nuclei in both 2D and 3D fluorescent micrographs. The proposed method can deal with both blurred edges and touching nuclei. Using a dual scan-line algorithm, it is both memory- and computationally efficient, making it interesting for the analysis of images coming from high-throughput systems or of 3D microscopic images. Experiments show good results, i.e., a recall of over 0.98.

  4. Automatic computation of 2D cardiac measurements from B-mode echocardiography

    NASA Astrophysics Data System (ADS)

    Park, JinHyeong; Feng, Shaolei; Zhou, S. Kevin

    2012-03-01

    We propose a robust and fully automatic algorithm which computes the 2D echocardiography measurements recommended by the American Society of Echocardiography. The algorithm employs knowledge-based imaging technologies which can learn expert knowledge from training images and expert annotations. Based on the models constructed in the learning stage, the algorithm searches for the initial locations of the landmark points for the measurements by utilizing the heart structure of the left ventricle, including the mitral valve and aortic valve. It employs a pseudo-anatomic M-mode image, generated by accumulating the line images of the 2D parasternal long-axis view over time, to refine the measurement landmark points. Experimental results with a large volume of data show that the algorithm runs fast and is robust, with accuracy comparable to experts.

  5. Fast Acceleration of 2D Wave Propagation Simulations Using Modern Computational Accelerators

    PubMed Central

    Wang, Wei; Xu, Lifan; Cavazos, John; Huang, Howie H.; Kay, Matthew

    2014-01-01

    Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case-study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than speedup above the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs of at least faster than the sequential implementation and faster than a parallelized OpenMP implementation. An implementation of OpenMP on Intel MIC coprocessor provided speedups of with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other computational models of wave propagation in
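
    The kernel that all three approaches (OpenACC, OpenCL, OpenMP) parallelize is an explicit 2D stencil update applied at every time step. As a stand-in for the cardiac action potential model, the following NumPy sketch shows a generic 2D wave-equation stencil of the same structure; grid size, time step and initial condition are arbitrary.

    ```python
    # Generic explicit 2D stencil update of the kind accelerated in the paper:
    # a second-order wave-equation step with a 5-point Laplacian. This is NOT
    # the cardiac action potential model; it only shows the loop structure.
    import numpy as np

    nx = ny = 512
    c, dx, dt = 1.0, 1.0, 0.5                      # dt below the 2D CFL limit
    u_prev = np.zeros((ny, nx))
    u = np.zeros((ny, nx))
    u[ny // 2, nx // 2] = 1.0                      # initial impulse

    coef = (c * dt / dx) ** 2
    for _ in range(200):
        lap = (u[:-2, 1:-1] + u[2:, 1:-1] + u[1:-1, :-2] + u[1:-1, 2:]
               - 4.0 * u[1:-1, 1:-1])              # 5-point Laplacian
        u_next = u.copy()
        u_next[1:-1, 1:-1] = 2*u[1:-1, 1:-1] - u_prev[1:-1, 1:-1] + coef * lap
        u_prev, u = u, u_next

    print("energy proxy:", float(np.sum(u**2)))
    ```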

  6. Large-scale systematic analysis of 2D fingerprint methods and parameters to improve virtual screening enrichments.

    PubMed

    Sastry, Madhavi; Lowrie, Jeffrey F; Dixon, Steven L; Sherman, Woody

    2010-05-24

    A systematic virtual screening study on 11 pharmaceutically relevant targets has been conducted to investigate the interrelation between 8 two-dimensional (2D) fingerprinting methods, 13 atom-typing schemes, 13 bit scaling rules, and 12 similarity metrics using the new cheminformatics package Canvas. In total, 157 872 virtual screens were performed to assess the ability of each combination of parameters to identify actives in a database screen. In general, fingerprint methods, such as MOLPRINT2D, Radial, and Dendritic that encode information about local environment beyond simple linear paths outperformed other fingerprint methods. Atom-typing schemes with more specific information, such as Daylight, Mol2, and Carhart were generally superior to more generic atom-typing schemes. Enrichment factors across all targets were improved considerably with the best settings, although no single set of parameters performed optimally on all targets. The size of the addressable bit space for the fingerprints was also explored, and it was found to have a substantial impact on enrichments. Small bit spaces, such as 1024, resulted in many collisions and in a significant degradation in enrichments compared to larger bit spaces that avoid collisions. PMID:20450209
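
    Canvas is a commercial package, but one cell of this parameter grid can be reproduced with open-source tools: circular Morgan fingerprints (comparable to the "Radial" method) scored with Tanimoto similarity and summarized as an enrichment factor. The sketch below uses RDKit; the SMILES strings and the 50% screening cutoff are placeholders.

    ```python
    # Open-source analogue of one parameter combination from the study: circular
    # (Morgan/ECFP-like) fingerprints with Tanimoto similarity, ranking a small
    # active/decoy list against a query. Uses RDKit, not the Canvas package;
    # molecules and cutoff are illustrative only.
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem

    def fingerprint(smiles, n_bits=2048):
        mol = Chem.MolFromSmiles(smiles)
        return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=n_bits)

    query = fingerprint("CC(=O)Oc1ccccc1C(=O)O")          # query active (aspirin)
    library = {"CC(=O)Nc1ccc(O)cc1": True,                # active
               "c1ccccc1": False, "CCO": False,           # decoys
               "OC(=O)c1ccccc1O": True}                   # active
    n_act = sum(library.values())

    scores = sorted(((DataStructs.TanimotoSimilarity(query, fingerprint(s)), s, act)
                     for s, act in library.items()), reverse=True)
    top = scores[: len(scores) // 2]                      # screen the top 50%
    ef = (sum(a for _, _, a in top) / len(top)) / (n_act / len(scores))
    print("enrichment factor at 50%:", ef)
    ```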

  7. Health risks from computed tomographic screening.

    PubMed

    Krantz, Seth B; Meyers, Bryan F

    2015-05-01

    Results of the recent National Lung Cancer Screening Trial show a significant survival benefit for annual screening with a low-dose computed tomographic (CT) scan in high-risk individuals. This result has led the US Preventive Services Task Force to recommend annual low-dose CT scans for this at-risk population. Less well characterized are the risks from screening. The primary risks from screening are radiation exposure, false-positive results and unnecessary diagnostic and therapeutic procedures, overdiagnosis and overtreatment, and increased psychological distress. This article reviews these risks, which must be considered and weighed against the benefits when discussing enrollment with patients. PMID:25901559

  8. GPU computing with OpenCL to model 2D elastic wave propagation: exploring memory usage

    NASA Astrophysics Data System (ADS)

    Iturrarán-Viveros, Ursula; Molero-Armenta, Miguel

    2015-01-01

    Graphics processing units (GPUs) have become increasingly powerful in recent years. Programs exploring the advantages of this architecture could achieve large performance gains and this is the aim of new initiatives in high performance computing. The objective of this work is to develop an efficient tool to model 2D elastic wave propagation on parallel computing devices. To this end, we implement the elastodynamic finite integration technique, using the industry open standard open computing language (OpenCL) for cross-platform, parallel programming of modern processors, and an open-source toolkit called [Py]OpenCL. The code written with [Py]OpenCL can run on a wide variety of platforms; it can be used on AMD or NVIDIA GPUs as well as classical multicore CPUs, adapting to the underlying architecture. Our main contribution is its implementation with local and global memory and the performance analysis using five different computing devices (including Kepler, one of the fastest and most efficient high performance computing technologies) with various operating systems.
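
    The host/device pattern that [Py]OpenCL imposes (create a context, move data into buffers, build a kernel from source, launch, copy back) is the same regardless of kernel size. A minimal toy example follows; the real elastodynamic finite integration kernels are of course much larger, and the kernel here merely squares an array.

    ```python
    # Minimal [Py]OpenCL host/device pattern: pick whatever device is available
    # (GPU or multicore CPU), move data to device buffers, build a kernel from
    # source, launch, copy the result back. Toy kernel only; not the EFIT code.
    import numpy as np
    import pyopencl as cl

    a = np.arange(1024, dtype=np.float32)

    ctx = cl.create_some_context()                 # picks a platform/device
    queue = cl.CommandQueue(ctx)
    mf = cl.mem_flags
    a_dev = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    out_dev = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    program = cl.Program(ctx, """
    __kernel void square(__global const float *a, __global float *out) {
        int gid = get_global_id(0);
        out[gid] = a[gid] * a[gid];
    }
    """).build()

    program.square(queue, a.shape, None, a_dev, out_dev)   # global size = 1024
    result = np.empty_like(a)
    cl.enqueue_copy(queue, result, out_dev)
    print(result[:4])                               # [0. 1. 4. 9.]
    ```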

  9. Screen-printed ultrasonic 2-D matrix array transducers for microparticle manipulation.

    PubMed

    Qiu, Yongqiang; Wang, Han; Gebhardt, Sylvia; Bolhovitins, Aleksandrs; Démoré, Christine E M; Schönecker, Andreas; Cochran, Sandy

    2015-09-01

    This paper reports the development of a two-dimensional thick film lead zirconate titanate (PZT) ultrasonic transducer array, operating at a frequency of approximately 7.5 MHz, to demonstrate the potential of this fabrication technique for microparticle manipulation. All layers of the array are screen-printed and then sintered on an alumina substrate without any subsequent patterning processes. The thickness of the thick film PZT is 139±2 μm, the element pitch of the array is 2.3 mm, and the dimension of each individual PZT element is 2×2 mm² with a top electrode of 1.7×1.7 mm². The measured relative dielectric constant of the PZT is 2250±100 and the dielectric loss is 0.09±0.005 at 10 kHz. Finite element analysis was used to predict the behaviour of the array and to optimise its configuration. Electrical impedance spectroscopy and laser vibrometry were used to characterise the array experimentally. The measured surface motion of a single element is on the order of tens of nanometres with a 10 V peak continuous sinusoidal excitation. Particle manipulation has been demonstrated with the array by manipulating Ø10 μm polystyrene microspheres in degassed water. The simplified array fabrication process and the bulk production capability of screen-printing suggest potential for the commercialisation of multilayer planar resonant devices for ultrasonic particle manipulation. PMID:26026870

  10. Numerical computation of 2D Sommerfeld integrals - A novel asymptotic extraction technique

    NASA Astrophysics Data System (ADS)

    Dvorak, Steven L.; Kuester, Edward F.

    1992-02-01

    The accurate and efficient computation of the elements in the impedance matrix is a crucial step in the application of Galerkin's method to the analysis of planar structures. As was demonstrated in a previous paper, it is possible to decompose the angular integral, in the polar representation for the 2D Sommerfeld integrals, in terms of incomplete Lipschitz-Hankel integrals (ILHIs) when piecewise sinusoidal basis functions are employed. Since Bessel series expansions can be used to compute these ILHIs, a numerical integration of the inner angular integral is not required. This technique provides an efficient method for the computation of the inner angular integral; however, the outer semi-infinite integral still converges very slowly when a real axis integration is applied. Therefore, it is very difficult to compute the impedance elements accurately and efficiently. In this paper, it is shown that this problem can be overcome by using the ILHI representation for the angular integral to develop a novel asymptotic extraction technique for the outer semi-infinite integral. The usefulness of this asymptotic extraction technique is demonstrated by applying it to the analysis of a printed strip dipole antenna in a layered medium.

  11. Breast density measurement: 3D cone beam computed tomography (CBCT) images versus 2D digital mammograms

    NASA Astrophysics Data System (ADS)

    Han, Tao; Lai, Chao-Jen; Chen, Lingyun; Liu, Xinming; Shen, Youtao; Zhong, Yuncheng; Ge, Shuaiping; Yi, Ying; Wang, Tianpeng; Yang, Wei T.; Shaw, Chris C.

    2009-02-01

    Breast density has been recognized as one of the major risk factors for breast cancer. However, breast density is currently estimated using mammograms which are intrinsically 2D in nature and cannot accurately represent the real breast anatomy. In this study, a novel technique for measuring breast density based on the segmentation of 3D cone beam CT (CBCT) images was developed and the results were compared to those obtained from 2D digital mammograms. 16 mastectomy breast specimens were imaged with a bench top flat-panel based CBCT system. The reconstructed 3D CT images were corrected for the cupping artifacts and then filtered to reduce the noise level, followed by using threshold-based segmentation to separate the dense tissue from the adipose tissue. For each breast specimen, volumes of the dense tissue structures and the entire breast were computed and used to calculate the volumetric breast density. BI-RADS categories were derived from the measured breast densities and compared with those estimated from conventional digital mammograms. The results show that in 10 of 16 cases the BI-RADS categories derived from the CBCT images were lower than those derived from the mammograms by one category. Thus, breasts considered as dense in mammographic examinations may not be considered as dense with the CBCT images. This result indicates that the relation between breast cancer risk and true (volumetric) breast density needs to be further investigated.
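
    The volumetric density computation described above is conceptually simple once the volume is reconstructed and cupping-corrected: filter, threshold into dense versus adipose voxels inside the breast mask, and take the volume ratio. A hedged NumPy/SciPy sketch with an arbitrary threshold and a synthetic volume standing in for the CBCT reconstruction:

    ```python
    # Volumetric breast density as described: smooth the reconstructed volume,
    # threshold it into dense vs. adipose voxels inside the breast mask, and take
    # the dense-to-total volume ratio. Threshold and filtering are illustrative;
    # the paper applies cupping correction before this step.
    import numpy as np
    from scipy import ndimage

    def volumetric_density(ct_volume, breast_mask, threshold):
        smoothed = ndimage.gaussian_filter(ct_volume, sigma=1.0)   # noise reduction
        dense = (smoothed > threshold) & breast_mask
        return dense.sum() / breast_mask.sum()

    rng = np.random.default_rng(0)
    vol = rng.normal(0.0, 1.0, size=(64, 64, 64))        # stand-in reconstruction
    mask = np.ones_like(vol, dtype=bool)
    print(f"volumetric density: {volumetric_density(vol, mask, 0.1):.3f}")
    ```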

  12. Computer program BL2D for solving two-dimensional and axisymmetric boundary layers

    NASA Technical Reports Server (NTRS)

    Iyer, Venkit

    1995-01-01

    This report presents the formulation, validation, and user's manual for the computer program BL2D. The program is a fourth-order-accurate solution scheme for solving two-dimensional or axisymmetric boundary layers in speed regimes that range from low subsonic to hypersonic Mach numbers. A basic implementation of the transition zone and turbulence modeling is also included. The code is a result of many improvements made to the program VGBLP, which is described in NASA TM-83207 (February 1982), and can effectively supersede it. The code BL2D is designed to be modular, user-friendly, and portable to any machine with a standard fortran77 compiler. The report contains the new formulation adopted and the details of its implementation. Five validation cases are presented. A detailed user's manual with the input format description and instructions for running the code is included. Adequate information is presented in the report to enable the user to modify or customize the code for specific applications.

  13. The spine in 3D. Computed tomographic reformation from 2D axial sections.

    PubMed

    Virapongse, C; Gmitro, A; Sarwar, M

    1986-01-01

    A new program (3D83, General Electric) was used to reformat three-dimensional (3D) images from two-dimensional (2D) computed tomographic axial scans in 18 patients who had routine scans of the spine. The 3D spine images were extremely true to life and could be rotated around all three principal axes (constituting a movie), so that an illusion of head-motion parallax was created. The benefit of 3D reformation with this program is primarily for preoperative planning. It appears that 3D can also effectively determine the patency of foraminal stenosis by reformatting in hemisections. Currently this program is subject to several drawbacks that require user interaction and long reconstruction time. With further improvement, 3D reformation will find increasing clinical applicability. PMID:3787319

  14. Experimental and Computational Study of Multiphase Flow Hydrodynamics in 2D Trickle Bed Reactors

    NASA Astrophysics Data System (ADS)

    Nadeem, H.; Ben Salem, I.; Kurnia, J. C.; Rabbani, S.; Shamim, T.; Sassi, M.

    2014-12-01

    Trickle bed reactors are largely used in the refining processes. Co-current heavy oil and hydrogen gas flow downward on a catalytic particle bed. Fine particles in the heavy oil and/or soot formed by the exothermic catalytic reactions deposit on the bed and clog the flow channels. This work is funded by the refining company of Abu Dhabi and aims at mitigating pressure buildup due to fine deposition in the TBR. In this work, we focus on meso-scale experimental and computational investigations of the interplay between flow regimes and the various parameters that affect them. A 2D experimental apparatus has been built to investigate the flow regimes with an average pore diameter close to the values encountered in trickle beds. A parametric study is done for the development of flow regimes and the transition between them when the geometry and arrangement of the particles within the porous medium are varied. Liquid and gas flow velocities have also been varied to capture the different flow regimes. Real-time images of the multiphase flow are captured using a high speed camera, which were then used to characterize the transition between the different flow regimes. A diffused light source was used behind the 2D Trickle Bed Reactor to enhance visualizations. Experimental data shows very good agreement with the published literature. The computational study focuses on the hydrodynamics of multiphase flow and on identifying the flow regime developed inside TBRs using the ANSYS Fluent software package. Multiphase flow inside TBRs is investigated using the "discrete particle" approach together with Volume of Fluid (VoF) multiphase flow modeling. The effects of the bed particle diameter, spacing, and arrangement are presented, which may be used to provide guidelines for designing trickle bed reactors.

  15. Adiabatic and Hamiltonian computing on a 2D lattice with simple two-qubit interactions

    NASA Astrophysics Data System (ADS)

    Lloyd, Seth; Terhal, Barbara M.

    2016-02-01

    We show how to perform universal Hamiltonian and adiabatic computing using a time-independent Hamiltonian on a 2D grid describing a system of hopping particles which string together and interact to perform the computation. In this construction, the movement of one particle is controlled by the presence or absence of other particles, an effective quantum field effect transistor that allows the construction of controlled-NOT and controlled-rotation gates. The construction translates into a model for universal quantum computation with time-independent two-qubit ZZ and XX+YY interactions on an (almost) planar grid. The effective Hamiltonian is arrived at by a single use of first-order perturbation theory avoiding the use of perturbation gadgets. The dynamics and spectral properties of the effective Hamiltonian can be fully determined as it corresponds to a particular realization of a mapping between a quantum circuit and a Hamiltonian called the space-time circuit-to-Hamiltonian construction. Because of the simple interactions required, and because no higher-order perturbation gadgets are employed, our construction is potentially realizable using superconducting or other solid-state qubits.

  16. Diverse Geological Applications For Basil: A 2d Finite-deformation Computational Algorithm

    NASA Astrophysics Data System (ADS)

    Houseman, Gregory A.; Barr, Terence D.; Evans, Lynn

    Geological processes are often characterised by large finite-deformation continuum strains, on the order of 100% or greater. Microstructural processes cause deformation that may be represented by a viscous constitutive mechanism, with viscosity that may depend on temperature, pressure, or strain-rate. We have developed an effective computational algorithm for the evaluation of 2D deformation fields produced by Newtonian or non-Newtonian viscous flow. With the implementation of this algorithm as a computer program, Basil, we have applied it to a range of diverse applications in Earth Sciences. Viscous flow fields in 2D may be defined for the thin-sheet case or, using a velocity-pressure formulation, for the plane-strain case. Flow fields are represented using 2D triangular elements with quadratic interpolation for velocity components and linear for pressure. The main matrix equation is solved by an efficient and compact conjugate gradient algorithm with iteration for non-Newtonian viscosity. Regular grids may be used, or grids based on a random distribution of points. Definition of the problem requires that velocities, tractions, or some combination of the two, are specified on all external boundary nodes. Compliant boundaries may also be defined, based on the idea that traction is opposed to and proportional to boundary displacement rate. Internal boundary segments, allowing fault-like displacements within a viscous medium have also been developed, and we find that the computed displacement field around the fault tip is accurately represented for Newtonian and non-Newtonian viscosities, in spite of the stress singularity at the fault tip. Basil has been applied by us and colleagues to problems that include: thin sheet calculations of continental collision, Rayleigh-Taylor instability of the continental mantle lithosphere, deformation fields around fault terminations at the outcrop scale, stress and deformation fields in and around porphyroblasts, and

  17. Parallel computation of optimized arrays for 2-D electrical imaging surveys

    NASA Astrophysics Data System (ADS)

    Loke, M. H.; Wilkinson, P. B.; Chambers, J. E.

    2010-12-01

    Modern automatic multi-electrode survey instruments have made it possible to use non-traditional arrays to maximize the subsurface resolution from electrical imaging surveys. Previous studies have shown that one of the best methods for generating optimized arrays is to select the set of array configurations that maximizes the model resolution for a homogeneous earth model. The Sherman-Morrison Rank-1 update is used to calculate the change in the model resolution when a new array is added to a selected set of array configurations. This method had the disadvantage that it required several hours of computer time even for short 2-D survey lines. The algorithm was modified to calculate the change in the model resolution rather than the entire resolution matrix. This reduces the computer time and memory required as well as the computational round-off errors. The matrix-vector multiplications for a single add-on array were replaced with matrix-matrix multiplications for 28 add-on arrays to further reduce the computer time. The temporary variables were stored in the double-precision Single Instruction Multiple Data (SIMD) registers within the CPU to minimize computer memory access. A further reduction in the computer time is achieved by using the computer graphics card Graphics Processor Unit (GPU) as a highly parallel mathematical coprocessor. This makes it possible to carry out the calculations for 512 add-on arrays in parallel using the GPU. The changes reduce the computer time by more than two orders of magnitude. The algorithm used to generate an optimized data set adds a specified number of new array configurations after each iteration to the existing set. The resolution of the optimized data set can be increased by adding a smaller number of new array configurations after each iteration. Although this increases the computer time required to generate an optimized data set with the same number of data points, the new fast numerical routines have made this practical on
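
    The rank-1 bookkeeping mentioned above is the Sherman-Morrison identity: when one candidate array contributes one extra row g to the sensitivity matrix G, the inverse of G^T G is updated directly instead of being refactorized. The sketch below shows only that core identity in NumPy; the paper's full model-resolution update and damping scheme are omitted, and the matrix sizes are arbitrary.

    ```python
    # Sherman-Morrison rank-1 update used when one candidate array (one extra
    # sensitivity row g) is added to the selected set: update inv(G^T G) without
    # refactorizing. Only the core identity; not the paper's full resolution
    # bookkeeping.
    import numpy as np

    def add_measurement(A_inv, g):
        """Return inv(G^T G + g g^T) given A_inv = inv(G^T G) and new row g."""
        Ag = A_inv @ g
        return A_inv - np.outer(Ag, Ag) / (1.0 + g @ Ag)

    rng = np.random.default_rng(0)
    G = rng.normal(size=(50, 20))                 # sensitivities of selected arrays
    A = G.T @ G + 1e-6 * np.eye(20)               # lightly damped for stability
    A_inv = np.linalg.inv(A)
    g = rng.normal(size=20)                       # candidate array's sensitivity row

    updated = add_measurement(A_inv, g)
    direct = np.linalg.inv(A + np.outer(g, g))
    print("max abs difference:", np.max(np.abs(updated - direct)))   # ~1e-12
    ```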

  18. A comparative study for 2D and 3D computer-aided diagnosis methods for solitary pulmonary nodules.

    PubMed

    Yeh, Chinson; Wang, Jen-Feng; Wu, Ming-Ting; Yen, Chen-Wen; Nagurka, Mark L; Lin, Chen-Liang

    2008-06-01

    Many computer-aided diagnosis (CAD) methods, including 2D and 3D approaches, have been proposed for solitary pulmonary nodules (SPNs). However, the detection and diagnosis of SPNs remain challenging in many clinical circumstances. One goal of this work is to investigate the relative diagnostic accuracy of 2D and 3D methods. An additional goal is to develop a two-stage approach that combines the simplicity of 2D and the accuracy of 3D methods. The experimental results show statistically significant differences between the diagnostic accuracy of 2D and 3D methods. The results also show that with a very minor drop in diagnostic performance the two-stage approach can significantly reduce the number of nodules needed to be processed by the 3D method, streamlining the computational demand. PMID:18313899

  19. Computational Study and Analysis of Structural Imperfections in 1D and 2D Photonic Crystals

    SciTech Connect

    K.R. Maskaly

    2005-06-01

    increasing RMS roughness. Again, the homogenization approximation is able to predict these results. The problem of surface scratches on 1D photonic crystals is also addressed. Although the reflectivity decreases are lower in this study, up to a 15% change in reflectivity is observed in certain scratched photonic crystal structures. However, this reflectivity change can be significantly decreased by adding a low index protective coating to the surface of the photonic crystal. Again, application of homogenization theory to these structures confirms its predictive power for this type of imperfection as well. Additionally, the problem of circular pores in 2D photonic crystals is investigated, showing that almost a 50% change in reflectivity can occur for some structures. Furthermore, this study reveals trends that are consistent with the 1D simulations: parameter changes that increase the absolute reflectivity of the photonic crystal will also increase its tolerance to structural imperfections. Finally, experimental reflectance spectra from roughened 1D photonic crystals are compared to the results predicted computationally in this thesis. Both the computed and experimental spectra correlate favorably, validating the findings presented herein.

  20. A computational model of the short-cut rule for 2D shape decomposition.

    PubMed

    Luo, Lei; Shen, Chunhua; Liu, Xinwang; Zhang, Chunyuan

    2015-01-01

    We propose a new 2D shape decomposition method based on the short-cut rule. The short-cut rule originates from cognition research, and states that the human visual system prefers to partition an object into parts using the shortest possible cuts. We propose and implement a computational model for the short-cut rule and apply it to the problem of shape decomposition. The model we proposed generates a set of cut hypotheses passing through the points on the silhouette, which represent the negative minima of curvature. We then show that most part-cut hypotheses can be eliminated by analysis of local properties of each. Finally, the remaining hypotheses are evaluated in ascending length order, which guarantees that of any pair of conflicting cuts only the shortest will be accepted. We demonstrate that, compared with state-of-the-art shape decomposition methods, the proposed approach achieves decomposition results, which better correspond to human intuition as revealed in psychological experiments. PMID:25438318
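
    A strongly simplified sketch of the model's ingredients: candidate cuts join concave (negative-curvature) silhouette vertices, and candidates are evaluated in ascending length so that of any pair of conflicting cuts only the shorter survives. Here "conflict" is reduced to sharing an endpoint, and the interiority and local-property tests of the paper are omitted; the polygon is a made-up example.

    ```python
    # Simplified short-cut-rule sketch: candidate cuts join concave vertices of a
    # CCW polygon; accept cuts in ascending length, skipping any that share an
    # endpoint with an already accepted cut. Much simpler than the paper's model.
    import numpy as np

    def concave_vertices(poly):                   # poly: (n, 2) CCW polygon vertices
        prev, nxt = np.roll(poly, 1, axis=0), np.roll(poly, -1, axis=0)
        e1, e2 = poly - prev, nxt - poly
        cross = e1[:, 0]*e2[:, 1] - e1[:, 1]*e2[:, 0]
        return np.flatnonzero(cross < 0)          # negative turn = concavity (CCW)

    def short_cuts(poly):
        idx = concave_vertices(poly)
        cands = sorted(((np.linalg.norm(poly[i] - poly[j]), i, j)
                        for a, i in enumerate(idx) for j in idx[a+1:]))
        used, cuts = set(), []
        for length, i, j in cands:                # shortest conflicting cut wins
            if i not in used and j not in used:
                cuts.append((int(i), int(j))); used.update((i, j))
        return cuts

    # Rectangle with a notch cut into its top and bottom edges (CCW order):
    poly = np.array([[0, 0], [2, 0], [2, 1], [4, 1], [4, 0], [6, 0],
                     [6, 3], [4, 3], [4, 2], [2, 2], [2, 3], [0, 3]], float)
    print(short_cuts(poly))   # two short vertical cuts across the narrow waist
    ```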

  1. An algorithm for computing the 2D structure of fast rotating stars

    NASA Astrophysics Data System (ADS)

    Rieutord, Michel; Espinosa Lara, Francisco; Putigny, Bertrand

    2016-08-01

    Stars may be understood as self-gravitating masses of a compressible fluid whose radiative cooling is compensated by nuclear reactions or gravitational contraction. The understanding of their time evolution requires the use of detailed models that account for a complex microphysics including that of opacities, equation of state and nuclear reactions. The present stellar models are essentially one-dimensional, namely spherically symmetric. However, the interpretation of recent data like the surface abundances of elements or the distribution of internal rotation has reached the limits of validity of one-dimensional models because of their very simplified representation of large-scale fluid flows. In this article, we describe the ESTER code, which is the first code able to compute in a consistent way a two-dimensional model of a fast rotating star including its large-scale flows. Compared to classical 1D stellar evolution codes, many numerical innovations have been introduced to deal with this complex problem. First, a spectral discretization based on spherical harmonics and Chebyshev polynomials is used to represent the 2D axisymmetric fields. A nonlinear mapping maps the spheroidal star and allows a smooth spectral representation of the fields. The properties of Picard and Newton iterations for solving the nonlinear partial differential equations of the problem are discussed. It turns out that the Picard scheme is efficient for the computation of simple polytropic stars, but the Newton algorithm is unsurpassed when stellar models include complex microphysics. Finally, we discuss the numerical efficiency of our solver of Newton iterations. This linear solver combines the iterative Conjugate Gradient Squared algorithm with an LU-factorization serving as a preconditioner of the Jacobian matrix.
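
    The trade-off between the two nonlinear iteration schemes can already be seen on a scalar toy problem: Picard (fixed-point) iteration converges linearly and needs many sweeps, while Newton converges quadratically once close enough. The sketch below solves x = cos(x) rather than the stellar-structure equations, purely as an illustration of the two schemes being compared.

    ```python
    # Toy comparison of the two nonlinear iteration schemes discussed: Picard
    # (fixed-point) vs. Newton, on the scalar equation x = cos(x), i.e.
    # F(x) = x - cos(x) = 0, instead of the stellar-structure PDEs.
    import numpy as np

    def picard(x0, tol=1e-12, max_iter=200):
        x, n = x0, 0
        while abs(np.cos(x) - x) > tol and n < max_iter:
            x, n = np.cos(x), n + 1                 # x_{k+1} = g(x_k)
        return x, n

    def newton(x0, tol=1e-12, max_iter=200):
        x, n = x0, 0
        while abs(x - np.cos(x)) > tol and n < max_iter:
            x, n = x - (x - np.cos(x)) / (1.0 + np.sin(x)), n + 1   # F / F'
        return x, n

    print("Picard:", picard(1.0))    # linear convergence: many iterations
    print("Newton:", newton(1.0))    # quadratic convergence: a handful
    ```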

  2. Icarus: A 2-D Direct Simulation Monte Carlo (DSMC) Code for Multi-Processor Computers

    SciTech Connect

    BARTEL, TIMOTHY J.; PLIMPTON, STEVEN J.; GALLIS, MICHAIL A.

    2001-10-01

    Icarus is a 2D Direct Simulation Monte Carlo (DSMC) code which has been optimized for the parallel computing environment. The code is based on the DSMC method of Bird [11.1] and models from free-molecular to continuum flowfields in either cartesian (x, y) or axisymmetric (z, r) coordinates. Computational particles, representing a given number of molecules or atoms, are tracked as they have collisions with other particles or surfaces. Multiple species, internal energy modes (rotation and vibration), chemistry, and ion transport are modeled. A new trace species methodology for collisions and chemistry is used to obtain statistics for small species concentrations. Gas phase chemistry is modeled using steric factors derived from Arrhenius reaction rates or in a manner similar to continuum modeling. Surface chemistry is modeled with surface reaction probabilities; an optional site density, energy dependent, coverage model is included. Electrons are modeled by either a local charge neutrality assumption or as discrete simulational particles. Ion chemistry is modeled with electron impact chemistry rates and charge exchange reactions. Coulomb collision cross-sections are used instead of Variable Hard Sphere values for ion-ion interactions. The electrostatic fields can either be externally input, computed from a Langmuir-Tonks model, or obtained from a Green's Function (Boundary Element) based Poisson solver. Icarus has been used for subsonic to hypersonic, chemically reacting, and plasma flows. The Icarus software package includes the grid generation, parallel processor decomposition, post-processing, and restart software. The commercial graphics package, Tecplot, is used for graphics display. All of the software packages are written in standard Fortran.

  3. Computation of neutron fluxes in clusters of fuel pins arranged in hexagonal assemblies (2D and 3D)

    SciTech Connect

    Prabha, H.; Marleau, G.

    2012-07-01

    For computations of fluxes, we have used Carlvik's method of collision probabilities. This method requires tracking algorithms. An algorithm to compute tracks (in 2D and 3D) has been developed for seven hexagonal geometries with clusters of fuel pins. This has been implemented in the NXT module of the code DRAGON. The flux distribution in clusters of pins has been computed by using this code. For testing, the results are compared when possible with those of the EXCELT module of the code DRAGON. Tracks are plotted in the NXT module using MATLAB; these plots are also presented here. Results are presented with an increasing number of lines to show the convergence of these results. We have numerically computed volumes, surface areas and the percentage errors in these computations. These results show that 2D results converge faster than 3D results. Accuracy in the computation of fluxes up to the second decimal is achieved with fewer lines. (authors)

  4. PC2D simulation and optimization of the selective emitter solar cells fabricated by screen printing phosphoric paste method

    NASA Astrophysics Data System (ADS)

    Jia, Xiaojie; Ai, Bin; Deng, Youjun; Xu, Xinxiang; Peng, Hua; Shen, Hui

    2015-08-01

    Starting from a PC2D simulation that closely reproduces the measured current density versus voltage (J-V) curve of the best selective emitter (SE) solar cell fabricated by the CSG Company using the screen printing phosphoric paste method, we systematically investigated the effect of the gridline, base, selective emitter, back surface field (BSF) layer and surface recombination rate parameters on the performance of the SE solar cell. Among these parameters, we identified the base minority carrier lifetime, the front and back surface recombination rates, and the ratio of the sheet resistances of the heavily and lightly doped regions as the four factors with the largest effect on efficiency. If all the parameters take ideal values, an SE solar cell fabricated on a p-type monocrystalline silicon wafer can reach an efficiency of 20.45%. In addition, the simulation also shows that combining finer, more closely spaced gridlines with an increased number of bus bars, while keeping the metallized area ratio low, offers further ways to improve the efficiency.

  5. MULTI2D - a computer code for two-dimensional radiation hydrodynamics

    NASA Astrophysics Data System (ADS)

    Ramis, R.; Meyer-ter-Vehn, J.; Ramírez, J.

    2009-06-01

    Simulation of radiation hydrodynamics in two spatial dimensions is developed, having in mind, in particular, target design for indirectly driven inertial fusion energy (IFE) and the interpretation of related experiments. Intense radiation pulses by laser or particle beams heat high-Z target configurations of different geometries and lead to a regime which is optically thick in some regions and optically thin in others. A diffusion description is inadequate in this situation. A new numerical code has been developed which describes hydrodynamics in two spatial dimensions (cylindrical R-Z geometry) and radiation transport along rays in three dimensions, with the 4π solid angle discretized in direction. Matter moves on a non-structured mesh composed of trilateral and quadrilateral elements. Radiation flux of a given direction enters on two (one) sides of a triangle and leaves on the opposite side(s) in proportion to the viewing angles depending on the geometry. This scheme allows sharply edged beams to be propagated without ray tracing, though at the price of some lateral diffusion. The algorithm treats both the optically thin and optically thick regimes correctly. A symmetric semi-implicit (SSI) method is used to guarantee numerical stability. Program summary: Program title: MULTI2D. Catalogue identifier: AECV_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECV_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 151 098. No. of bytes in distributed program, including test data, etc.: 889 622. Distribution format: tar.gz. Programming language: C. Computer: PC (32-bit architecture). Operating system: Linux/Unix. RAM: 2 Mbytes. Word size: 32 bits. Classification: 19.7. External routines: X-window standard library (libX11.so) and corresponding heading files (X11/*.h) are

  6. Computer vision for high content screening.

    PubMed

    Kraus, Oren Z; Frey, Brendan J

    2016-01-01

    High Content Screening (HCS) technologies that combine automated fluorescence microscopy with high throughput biotechnology have become powerful systems for studying cell biology and drug screening. These systems can produce more than 100 000 images per day, making their success dependent on automated image analysis. In this review, we describe the steps involved in quantifying microscopy images and different approaches for each step. Typically, individual cells are segmented from the background using a segmentation algorithm. Each cell is then quantified by extracting numerical features, such as area and intensity measurements. As these feature representations are typically high dimensional (>500), modern machine learning algorithms are used to classify, cluster and visualize cells in HCS experiments. Machine learning algorithms that learn feature representations, in addition to the classification or clustering task, have recently advanced the state of the art on several benchmarking tasks in the computer vision community. These techniques have also recently been applied to HCS image analysis. PMID:26806341
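
    As a minimal sketch of the pipeline described above (segment cells from the background, extract numerical features per cell, then classify the feature vectors), the following Python fragment uses a crude intensity threshold, connected-component labelling and a random forest; the threshold rule, feature set, classifier and synthetic training data are illustrative choices, not those of any particular HCS platform.

        import numpy as np
        from scipy import ndimage
        from sklearn.ensemble import RandomForestClassifier

        def quantify_image(img):
            """Segment cells from background and extract simple per-cell features."""
            mask = img > img.mean() + 2.0 * img.std()        # crude intensity threshold
            labels, n_cells = ndimage.label(mask)            # connected-component segmentation
            feats = []
            for cell_id in range(1, n_cells + 1):
                cell = labels == cell_id
                feats.append([cell.sum(),                     # area (pixels)
                              img[cell].mean(),               # mean intensity
                              img[cell].max()])               # peak intensity
            return np.array(feats)

        # Hypothetical training data: feature vectors with known phenotype labels
        rng = np.random.default_rng(0)
        X_train = rng.normal(size=(200, 3))
        y_train = rng.integers(0, 2, size=200)
        clf = RandomForestClassifier(n_estimators=100).fit(X_train, y_train)

        test_img = rng.normal(loc=1.0, size=(128, 128))
        features = quantify_image(test_img)
        if len(features):
            print(clf.predict(features))                      # phenotype class per segmented cell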

  7. A BENCHMARKING ANALYSIS FOR FIVE RADIONUCLIDE VADOSE ZONE MODELS (CHAIN, MULTIMED_DP, FECTUZ, HYDRUS, AND CHAIN 2D) IN SOIL SCREENING LEVEL CALCULATIONS

    EPA Science Inventory

    Five radionuclide vadose zone models with different degrees of complexity (CHAIN, MULTIMED_DP, FECTUZ, HYDRUS, and CHAIN 2D) were selected for use in soil screening level (SSL) calculations. A benchmarking analysis between the models was conducted for a radionuclide (99Tc) rele...

  8. Orbit computation of the TELECOM-2D satellite with a Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Deleflie, Florent; Coulot, David; Vienne, Alain; Decosta, Romain; Richard, Pascal; Lasri, Mohammed Amjad

    2014-07-01

    In order to test a preliminary orbit determination method, we fit an orbit of the geostationary satellite TELECOM-2D as if no a priori information on its trajectory were known. The method is based on a genetic algorithm coupled to an analytical propagator of the trajectory, applied over a couple of days to a set of altazimuthal data acquired by the tracking network made up of the two TAROT telescopes. The adjusted orbit is then compared to a numerical reference. The method is described, and the results are analyzed, as a step towards an operational method of preliminary orbit determination for uncatalogued objects.
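
    The genetic-algorithm part of such a fit can be sketched on a toy propagator and synthetic observations as follows; the parameterization, the operators (tournament selection, blend crossover, Gaussian mutation, elitism) and all numerical values are illustrative assumptions, not the authors' implementation.

        import numpy as np

        rng = np.random.default_rng(1)

        def predict(params, t):
            """Toy stand-in for the analytical propagator: maps orbit-like
            parameters to an observable sampled at times t."""
            a, e, phase = params
            return a * (1.0 - e * np.cos(t + phase))

        t_obs = np.linspace(0.0, 10.0, 50)
        true_params = np.array([7.0, 0.1, 0.3])
        y_obs = predict(true_params, t_obs) + rng.normal(scale=0.01, size=t_obs.size)

        def fitness(pop):
            """Negative RMS residual between predicted and 'observed' values."""
            return np.array([-np.sqrt(np.mean((predict(p, t_obs) - y_obs) ** 2)) for p in pop])

        pop = rng.uniform([5.0, 0.0, 0.0], [10.0, 0.5, 1.0], size=(100, 3))
        for generation in range(200):
            fit = fitness(pop)
            best = pop[np.argmax(fit)].copy()
            i, j = rng.integers(0, len(pop), size=(2, len(pop)))
            parents = np.where((fit[i] > fit[j])[:, None], pop[i], pop[j])   # tournament selection
            mates = parents[rng.permutation(len(parents))]
            alpha = rng.uniform(size=(len(parents), 1))
            children = alpha * parents + (1.0 - alpha) * mates               # blend crossover
            children += rng.normal(scale=0.01, size=children.shape)          # Gaussian mutation
            children[0] = best                                               # elitism
            pop = children

        print("best-fit parameters:", pop[np.argmax(fitness(pop))])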

  9. Auto-masked 2D/3D image registration and its validation with clinical cone-beam computed tomography

    NASA Astrophysics Data System (ADS)

    Steininger, P.; Neuner, M.; Weichenberger, H.; Sharp, G. C.; Winey, B.; Kametriser, G.; Sedlmayer, F.; Deutschmann, H.

    2012-07-01

    Image-guided alignment procedures in radiotherapy aim at minimizing discrepancies between the planned and the real patient setup. For that purpose, we developed a 2D/3D approach which rigidly registers a computed tomography (CT) with two x-rays by maximizing the agreement in pixel intensity between the x-rays and the corresponding reconstructed radiographs from the CT. Moreover, the algorithm selects regions of interest (masks) in the x-rays based on 3D segmentations from the pre-planning stage. For validation, orthogonal x-ray pairs from different viewing directions of 80 pelvic cone-beam CT (CBCT) raw data sets were used. The 2D/3D results were compared to corresponding standard 3D/3D CBCT-to-CT alignments. Outcome over 8400 2D/3D experiments showed that parametric errors in root mean square were <0.18° (rotations) and <0.73 mm (translations), respectively, using rank correlation as intensity metric. This corresponds to a mean target registration error, related to the voxels of the lesser pelvis, of <2 mm in 94.1% of the cases. From the results we conclude that 2D/3D registration based on sequentially acquired orthogonal x-rays of the pelvis is a viable alternative to CBCT-based approaches if rigid alignment on bony anatomy is sufficient, no volumetric intra-interventional data set is required and the expected error range fits the individual treatment prescription.

  10. Auto-masked 2D/3D image registration and its validation with clinical cone-beam computed tomography.

    PubMed

    Steininger, P; Neuner, M; Weichenberger, H; Sharp, G C; Winey, B; Kametriser, G; Sedlmayer, F; Deutschmann, H

    2012-07-01

    Image-guided alignment procedures in radiotherapy aim at minimizing discrepancies between the planned and the real patient setup. For that purpose, we developed a 2D/3D approach which rigidly registers a computed tomography (CT) with two x-rays by maximizing the agreement in pixel intensity between the x-rays and the corresponding reconstructed radiographs from the CT. Moreover, the algorithm selects regions of interest (masks) in the x-rays based on 3D segmentations from the pre-planning stage. For validation, orthogonal x-ray pairs from different viewing directions of 80 pelvic cone-beam CT (CBCT) raw data sets were used. The 2D/3D results were compared to corresponding standard 3D/3D CBCT-to-CT alignments. Outcome over 8400 2D/3D experiments showed that parametric errors in root mean square were <0.18° (rotations) and <0.73 mm (translations), respectively, using rank correlation as intensity metric. This corresponds to a mean target registration error, related to the voxels of the lesser pelvis, of <2 mm in 94.1% of the cases. From the results we conclude that 2D/3D registration based on sequentially acquired orthogonal x-rays of the pelvis is a viable alternative to CBCT-based approaches if rigid alignment on bony anatomy is sufficient, no volumetric intra-interventional data set is required and the expected error range fits the individual treatment prescription. PMID:22705709

  11. Coupling 2-D cylindrical and 3-D x-y-z transport computations

    SciTech Connect

    Abu-Shumays, I.K.; Yehnert, C.E.; Pitcairn, T.N.

    1998-06-30

    This paper describes a new two-dimensional (2-D) cylindrical geometry to three-dimensional (3-D) rectangular x-y-z splice option for multi-dimensional discrete ordinates solutions to the neutron (photon) transport equation. Of particular interest are the simple transformations developed and applied in order to carry out the required spatial and angular interpolations. The spatial interpolations are linear and equivalent to those applied elsewhere. The angular interpolations are based on a high order spherical harmonics representation of the angular flux. Advantages of the current angular interpolations over previous work are discussed. An application to an intricate streaming problem is provided to demonstrate the advantages of the new method for efficient and accurate prediction of particle behavior in complex geometries.

  12. Distributed Computing Architecture for Image-Based Wavefront Sensing and 2 D FFTs

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey S.; Dean, Bruce H.; Haghani, Shadan

    2006-01-01

    Image-based wavefront sensing (WFS) provides significant advantages over interferometric wavefront sensors, such as optical design simplicity and stability. However, the image-based approach is computationally intensive, and therefore specialized high-performance computing architectures are required in applications utilizing it. The development and testing of these high-performance computing architectures are essential to such missions as the James Webb Space Telescope (JWST), Terrestrial Planet Finder-Coronagraph (TPF-C and CorSpec), and Spherical Primary Optical Telescope (SPOT). These specialized computing architectures require numerous two-dimensional Fourier transforms, which necessitate an all-to-all communication when implemented on a distributed computational architecture. Several solutions for distributed computing are presented with an emphasis on a 64-node cluster of DSPs, multiple DSP FPGAs, and an application of low-diameter graph theory. Timing results and performance analysis are presented. The solutions offered could be applied to other all-to-all communication problems and other computationally complex scientific problems.
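
    The all-to-all requirement arises because a two-dimensional FFT is commonly decomposed into 1-D FFTs along rows, a global transpose (the all-to-all exchange), and 1-D FFTs along the former columns. A single-process numpy sketch of that decomposition, standing in for the distributed DSP/FPGA hardware, is:

        import numpy as np

        def fft2_row_column(x):
            """2-D FFT as 1-D FFTs over rows, a transpose, then 1-D FFTs over the
            former columns. On a distributed machine each node owns a block of
            rows, and the transpose becomes the all-to-all exchange."""
            stage1 = np.fft.fft(x, axis=1)          # every node transforms its local rows
            exchanged = stage1.T                    # stands in for the all-to-all communication
            stage2 = np.fft.fft(exchanged, axis=1)  # transform the redistributed rows
            return stage2.T                         # restore the original data layout

        x = np.random.rand(64, 64)
        assert np.allclose(fft2_row_column(x), np.fft.fft2(x))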

  13. Cytochrome P450 bio-affinity detection coupled to gradient HPLC: on-line screening of affinities to cytochrome P4501A2 and 2D6.

    PubMed

    Kool, Jeroen; van Liempd, Sebastiaan M; Harmsen, Stefan; Beckman, Joran; van Elswijk, Danny; Commandeur, Jan N M; Irth, Hubertus; Vermeulen, Nico P E

    2007-10-15

    Here we describe novel on-line human CYP1A2 and CYP2D6 Enzyme Affinity Detection (EAD) systems coupled to gradient HPLC. The systems are used to detect individual inhibitory ligands in mixtures (e.g. metabolic mixtures or herbal extracts) towards two relevant drug metabolizing human CYPs. The systems can rapidly detect individual compounds in mixtures with affinities to CYP1A2 or 2D6. The HPLC-EAD systems were first evaluated and validated in flow injection analysis mode. IC50 values of known ligands for both CYPs, tested both in flow injection and in HPLC mode, agreed well with those measured in microplate reader formats. Both EAD systems were also connected to gradient HPLC and used to screen known compound mixtures for the presence of CYP1A2 and 2D6 inhibitors. Finally, the on-line CYP2D6 EAD system was used to screen for the inhibitory activities of stereoisomers of a mixture of five methylenedioxy-alkylamphetamines (XTC analogs) on a chiral analytical column. PMID:17826363

  14. Novel low-cost 2D/3D switchable autostereoscopic system for notebook computers and other portable devices

    NASA Astrophysics Data System (ADS)

    Eichenlaub, Jesse B.

    1995-03-01

    Mounting a lenticular lens in front of a flat panel display is a well known, inexpensive, and easy way to create an autostereoscopic system. Such a lens produces half resolution 3D images because half the pixels on the LCD are seen by the left eye and half by the right eye. This may be acceptable for graphics, but it makes full resolution text, as displayed by common software, nearly unreadable. Very fine alignment tolerances normally preclude the possibility of removing and replacing the lens in order to switch between 2D and 3D applications. Lenticular lens based displays are therefore limited to use as dedicated 3D devices. DTI has devised a technique which removes this limitation, allowing switching between full resolution 2D and half resolution 3D imaging modes. A second element, in the form of a concave lenticular lens array whose shape is exactly the negative of the first lens, is mounted on a hinge so that it can be swung down over the first lens array. When so positioned, the two lenses cancel optically, allowing the user to see full resolution 2D for text or numerical applications. The two lenses, having complementary shapes, naturally tend to nestle together and snap into perfect alignment when pressed together, thus obviating any need for user operated alignment mechanisms. This system represents an ideal solution for laptop and notebook computer applications. It was devised to meet the stringent requirements of a laptop computer manufacturer, including very compact size, very low cost, little impact on existing manufacturing or assembly procedures, and compatibility with existing full resolution 2D text-oriented software as well as 3D graphics. Similar requirements apply to high-end electronic calculators, several models of which now use LCDs for the display of graphics.

  15. Geometric Neural Computing for 2D Contour and 3D Surface Reconstruction

    NASA Astrophysics Data System (ADS)

    Rivera-Rovelo, Jorge; Bayro-Corrochano, Eduardo; Dillmann, Ruediger

    In this work we present an algorithm to approximate the surface of 2D or 3D objects combining concepts from geometric algebra and artificial neural networks. Our approach is based on the self-organized neural network called Growing Neural Gas (GNG), incorporating versors of the geometric algebra in its neural units; such versors are the transformations that are determined during the training stage and then applied to a point to approximate the surface of the object. We also incorporate the information given by the generalized gradient vector flow to select the input patterns automatically, and also in the learning stage in order to improve the performance of the net. Several examples using medical images are presented, as well as images from automatic visual inspection. We compared the results obtained using snakes against the GSOM incorporating the gradient information and using versors. These results confirm that our approach is very promising. As a second application, a kind of morphing or registration procedure is shown; namely, the algorithm can be used to transform one model at time t1 into another at time t2. We also include examples applying the same procedure, now extended to models based on spheres.

  16. The PR2D (Place, Route in 2-Dimensions) automatic layout computer program handbook

    NASA Technical Reports Server (NTRS)

    Edge, T. M.

    1978-01-01

    Place, Route in 2-Dimensions is a standard cell automatic layout computer program for generating large scale integrated/metal oxide semiconductor arrays. The program was utilized successfully for a number of years in both government and private sectors but until now was undocumented. The compilation, loading, and execution of the program on a Sigma V CP-V operating system is described.

  17. Lattice Boltzmann methods for some 2-D nonlinear diffusion equations:Computational results

    SciTech Connect

    Elton, B.H.; Rodrigue, G.H. . Dept. of Applied Science Lawrence Livermore National Lab., CA ); Levermore, C.D. . Dept. of Mathematics)

    1990-01-01

    In this paper we examine two lattice Boltzmann methods (that are a derivative of lattice gas methods) for computing solutions to two two-dimensional nonlinear diffusion equations of the form ∂u/∂t = ν (∂/∂x [D(u) ∂u/∂x] + ∂/∂y [D(u) ∂u/∂y]), where u = u(x, t) with x ∈ R², ν is a constant, and D(u) is a nonlinear term that arises from a Chapman-Enskog asymptotic expansion. In particular, we provide computational evidence supporting recent results showing that the methods are second order convergent (in the L₁ norm), conservative, conditionally monotone finite difference methods. Solutions computed via the lattice Boltzmann methods are compared with those computed by other explicit, second order, conservative, monotone finite difference methods. Results are reported for both the L₁ and L∞ norms.
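
    For orientation, a standard D2Q5 lattice Boltzmann relaxation scheme for a nonlinear diffusion equation of this form can be sketched as below; the diffusivity D(u), the grid, and the textbook relation D = c_s^2 (tau - 1/2) are illustrative choices, not the specific methods analyzed in the paper.

        import numpy as np

        nx = ny = 64
        cs2 = 1.0 / 3.0                                      # lattice "sound speed" squared
        w = np.array([1/3, 1/6, 1/6, 1/6, 1/6])              # D2Q5 weights
        cx = np.array([0, 1, -1, 0, 0])
        cy = np.array([0, 0, 0, 1, -1])

        def D(u):
            return 0.05 + 0.1 * u                            # example nonlinear diffusivity D(u)

        # initial condition: a smooth bump, with distributions at equilibrium
        x, y = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
        u = np.exp(-((x - nx / 2) ** 2 + (y - ny / 2) ** 2) / 50.0)
        f = w[:, None, None] * u[None, :, :]

        for step in range(500):
            u = f.sum(axis=0)                                # macroscopic concentration
            tau = D(u) / cs2 + 0.5                           # local relaxation time: D = cs2*(tau - 1/2)
            feq = w[:, None, None] * u[None, :, :]
            f -= (f - feq) / tau[None, :, :]                 # BGK collision step
            for i in range(5):                               # streaming step, periodic boundaries
                f[i] = np.roll(np.roll(f[i], cx[i], axis=0), cy[i], axis=1)

        print("total mass conserved:", np.isclose(f.sum(), u.sum()))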

  18. Identification of the wave speed and the second viscosity of cavitation flows with 2D RANS computations - Part I

    NASA Astrophysics Data System (ADS)

    Decaix, J.; Alligné, S.; Nicolet, C.; Avellan, F.; Münch, C.

    2015-12-01

    1D hydro-electric models are useful to predict dynamic behaviour of hydro-power plants. Regarding vortex rope and cavitation surge in Francis turbines, the 1D models require some inputs that can be provided by numerical simulations. In this paper, a 2D cavitating Venturi is considered. URANS computations are performed to investigate the dynamic behaviour of the cavitation sheet depending on the frequency variation of the outlet pressure. The results are used to calibrate and to assess the reliability of the 1D models.

  19. Eye-screen distance monitoring for computer use.

    PubMed

    Eastwood-Sutherland, Caillin; Gale, Timothy J

    2011-01-01

    The extended period many people now spend looking at computer screens is thought to affect eyesight over the long term. In this paper we are concerned with the development and initial evaluation of a wireless camera-based tracking system providing quantitative assessment of computer screen interaction. The system utilizes a stereo camera pair and wireless XBee-based infrared markers, and enables unobtrusive monitoring. Preliminary results indicate that the system is an excellent method of monitoring eye-screen distance. This type of system will enable future studies of eye-screen distance for computer users. PMID:22254767
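
    The underlying geometry is simple: for a rectified stereo pair, the distance of a tracked infrared marker follows from its disparity as Z = f*B/d. A tiny sketch with made-up focal length, baseline and pixel coordinates (not the paper's calibration) is:

        def eye_screen_distance(x_left_px, x_right_px, focal_px=800.0, baseline_m=0.12):
            """Depth of a marker from a rectified stereo pair: Z = f * B / disparity."""
            disparity = x_left_px - x_right_px               # pixels
            if disparity <= 0:
                raise ValueError("marker must appear further right in the left image")
            return focal_px * baseline_m / disparity         # metres

        # e.g. marker imaged at x = 512 px (left camera) and x = 320 px (right camera)
        print(f"{eye_screen_distance(512.0, 320.0):.2f} m")  # 0.50 m from the camera rig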

  20. Quantitative comparison of dose distribution in radiotherapy plans using 2D gamma maps and X-ray computed tomography

    PubMed Central

    Balosso, Jacques

    2016-01-01

    Background The advanced dose calculation algorithms implemented in treatment planning systems (TPS) have remarkably improved the accuracy of dose calculation, especially the modeling of electron transport in low-density media. The purpose of this study is to evaluate the use of the 2D gamma (γ) index to quantify and evaluate the impact of electron transport modeling on dose distribution for lung radiotherapy. Methods X-ray computed tomography images were used to calculate the dose for twelve radiotherapy treatment plans. The doses were originally calculated with the Modified Batho (MB) 1D density correction method, and recalculated with the anisotropic analytical algorithm (AAA), using the same prescribed dose. Dose parameters derived from dose volume histograms (DVH) and target coverage indices were compared. To compare dose distributions, the 2D γ-index was applied with criteria ranging from 1%/1 mm to 6%/6 mm. The results were displayed using 2D γ-maps. Correlation between DVH metrics and γ passing rates was tested using Spearman's rank test, and the Wilcoxon paired test was used to calculate P values. Results The plans generated with AAA predicted a more heterogeneous dose distribution inside the target, with P<0.05. However, MB overestimated the dose, predicting more coverage of the target by the prescribed dose. The γ analysis showed that the difference between MB and AAA could reach up to ±10%. The 2D γ-maps illustrated that AAA predicted more dose to organs at risk, as well as a lower dose to the target, compared to MB. Conclusions Accounting for electron transport in radiotherapy plans has a significant impact on the delivered dose and dose distribution. If AAA is considered to represent the true cumulative dose, the prescribed dose should be readjusted and the plan optimized to protect the organs at risk in order to obtain a better clinical outcome. PMID:27429908
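
    The γ-index used above combines a dose-difference criterion with a distance-to-agreement (DTA) criterion, and a point passes if γ <= 1. A brute-force numpy sketch on a common grid, with illustrative criteria, spacing and test doses, is:

        import numpy as np

        def gamma_2d(ref, ev, spacing_mm=1.0, dd_percent=3.0, dta_mm=3.0):
            """Brute-force 2D gamma index of an evaluated dose 'ev' against a
            reference dose 'ref' on the same grid; gamma <= 1 means the point passes."""
            dd = dd_percent / 100.0 * ref.max()              # global dose-difference criterion
            search = int(np.ceil(2 * dta_mm / spacing_mm))   # half-width of the search window
            ny, nx = ref.shape
            gamma = np.full(ref.shape, np.inf)
            for j in range(ny):
                for i in range(nx):
                    for dj in range(-search, search + 1):
                        for di in range(-search, search + 1):
                            jj, ii = j + dj, i + di
                            if not (0 <= jj < ny and 0 <= ii < nx):
                                continue
                            dist2 = (dj * dj + di * di) * spacing_mm ** 2
                            dose2 = (ev[jj, ii] - ref[j, i]) ** 2
                            gamma[j, i] = min(gamma[j, i],
                                              np.sqrt(dist2 / dta_mm ** 2 + dose2 / dd ** 2))
            return gamma

        ref = np.random.rand(32, 32)
        ev = 1.02 * ref                                       # evaluated dose 2% hotter everywhere
        print("gamma passing rate:", (gamma_2d(ref, ev) <= 1.0).mean())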

  1. The Roles of Endstopped and Curvature Tuned Computations in a Hierarchical Representation of 2D Shape

    PubMed Central

    Rodríguez-Sánchez, Antonio J.; Tsotsos, John K.

    2012-01-01

    That shape is important for perception has been known for almost a thousand years (thanks to Alhazen in 1083) and has been a subject of study ever since by scientists and philosophers (such as Descartes, Helmholtz or the Gestalt psychologists). Shapes are important object descriptors. If there was any remote doubt regarding the importance of shape, recent experiments have shown that intermediate areas of primate visual cortex such as V2, V4 and TEO are involved in analyzing shape features such as corners and curvatures. The primate brain appears to perform a wide variety of complex tasks by means of simple operations. These operations are applied across several layers of neurons, representing increasingly complex, abstract intermediate processing stages. Recently, new models have attempted to emulate the human visual system. However, the role of intermediate representations in the visual cortex and their importance have not been adequately studied in computational modeling. This paper proposes a model of shape-selective neurons whose shape-selectivity is achieved through intermediate layers of visual representation not previously fully explored. We hypothesize that hypercomplex (also known as endstopped) neurons play a critical role in achieving shape selectivity and show how shape-selective neurons may be modeled by integrating endstopping and curvature computations. This model, a representational and computational system for the detection of 2-dimensional object silhouettes that we term 2DSIL, provides a highly accurate fit with neural data and replicates responses from neurons in area V4 with an average of 83% accuracy. We successfully test a biologically plausible hypothesis on how to connect early representations based on Gabor or Difference of Gaussian filters and later representations closer to object categories without the need of a learning phase as in most recent models. PMID:22912683

  2. Emergent Power-Law Phase in the 2D Heisenberg Windmill Antiferromagnet: A Computational Experiment.

    PubMed

    Jeevanesan, Bhilahari; Chandra, Premala; Coleman, Piers; Orth, Peter P

    2015-10-23

    In an extensive computational experiment, we test Polyakov's conjecture that under certain circumstances an isotropic Heisenberg model can develop algebraic spin correlations. We demonstrate the emergence of a multispin U(1) order parameter in a Heisenberg antiferromagnet on interpenetrating honeycomb and triangular lattices. The correlations of this relative phase angle are observed to decay algebraically at intermediate temperatures in an extended critical phase. Using finite-size scaling we show that both phase transitions are of the Berezinskii-Kosterlitz-Thouless type, and at lower temperatures we find long-range Z(6) order. PMID:26551137

  3. Manifest: A computer program for 2-D flow modeling in Stirling machines

    NASA Technical Reports Server (NTRS)

    Gedeon, David

    1989-01-01

    A computer program named Manifest is discussed. Manifest is a program one might want to use to model the fluid dynamics in the manifolds commonly found between the heat exchangers and regenerators of Stirling machines; but not just in the manifolds, in the regenerators as well, and in all sorts of other places too, such as in heaters or coolers, or perhaps even in cylinder spaces. There are probably non-Stirling uses for Manifest also. In broad strokes, Manifest will: (1) model oscillating internal compressible laminar fluid flow in a wide range of two-dimensional regions, either filled with porous materials or empty; (2) present a graphics-based user-friendly interface, allowing easy selection and modification of region shape and boundary condition specification; (3) run on a personal computer, or optionally (in the case of its number-crunching module) on a supercomputer; and (4) allow interactive examination of the solution output so the user can view vector plots of flow velocity, contour plots of pressure and temperature at various locations and tabulate energy-related integrals of interest.

  4. A numerical method for computing unsteady 2-D boundary layer flows

    NASA Technical Reports Server (NTRS)

    Krainer, Andreas

    1988-01-01

    A numerical method for computing unsteady two-dimensional boundary layers in incompressible laminar and turbulent flows is described and applied to a single airfoil changing its incidence angle in time. The solution procedure adopts a first-order panel method with a simple wake model to solve for the inviscid part of the flow, and an implicit finite difference method for the viscous part of the flow. Both procedures integrate in time in a step-by-step fashion, in the course of which each step involves the solution of the elliptic Laplace equation and the solution of the parabolic boundary layer equations. The Reynolds shear stress term of the boundary layer equations is modeled by an algebraic eddy viscosity closure. The location of transition is predicted by an empirical data correlation originating from Michel. Since transition and turbulence modeling are key factors in the prediction of viscous flows, their accuracy has a dominant influence on the overall results.

  5. Computing Aerodynamic Performance of a 2D Iced Airfoil: Blocking Topology and Grid Generation

    NASA Technical Reports Server (NTRS)

    Chi, X.; Zhu, B.; Shih, T. I.-P.; Slater, J. W.; Addy, H. E.; Choo, Yung K.; Lee, Chi-Ming (Technical Monitor)

    2002-01-01

    The ice accreted on airfoils can have enormously complicated shapes with multiple protruded horns and feathers. In this paper, several blocking topologies are proposed and evaluated on their ability to produce high-quality structured multi-block grid systems. A transition layer grid is introduced to ensure that jaggedness in the ice-surface geometry does not propagate into the domain. This is important for grid-generation methods based on hyperbolic PDEs (Partial Differential Equations) and algebraic transfinite interpolation. A 'thick' wrap-around grid is introduced to ensure that grid lines clustered next to solid walls do not propagate as streaks of tightly packed grid lines into the interior of the domain along block boundaries. For ice shapes that are not too complicated, a method is presented for generating high-quality single-block grids. To demonstrate the usefulness of the methods developed, grids and CFD solutions were generated for two iced airfoils: the NLF0414 airfoil with and without the 623-ice shape and the B575/767 airfoil with and without the 145m-ice shape. To validate the computations, the computed lift coefficients as a function of angle of attack were compared with available experimental data. The ice shapes and the blocking topologies were prepared by NASA Glenn's SmaggIce software. The grid systems were generated by using a four-boundary method based on Hermite interpolation with controls on clustering, orthogonality next to walls, and C continuity across block boundaries. The flow was modeled by the ensemble-averaged compressible Navier-Stokes equations, closed by the shear-stress transport turbulence model in which the integration is to the wall. All solutions were generated by using the NPARC WIND code.

  6. A Practical Deconvolution Computation Algorithm to Extract 1D Spectra from 2D Images of Optical Fiber Spectroscopy

    NASA Astrophysics Data System (ADS)

    Guangwei, Li; Haotong, Zhang; Zhongrui, Bai

    2015-06-01

    Bolton & Schlegel presented a promising deconvolution method to extract one-dimensional (1D) spectra from a two-dimensional (2D) optical fiber spectral CCD (charge-coupled device) image. The method can eliminate the PSF (point-spread function) difference between fibers, extract spectra down to the photon noise level, and improve the resolution, but it is limited by its huge computational requirements and thus cannot be used in routine data reduction. In this article, we develop a practical computational method to solve this problem. The new method can deconvolve a 2D fiber spectral image of any size with actual PSFs, which may vary with position. Our method does not require large amounts of memory and can extract a 4 k × 4 k noise-free CCD image with 250 fibers in 2 hr. To make our method more practical, we further consider the influence of noise, an intrinsic source of ill-posedness in deconvolution algorithms, and modify our method with a Tikhonov regularization term to suppress the noise induced by the method. We perform a series of simulations to test how our method behaves in more realistic situations with Poisson noise and extreme cross talk. Compared with the results of traditional extraction methods, i.e., the Aperture Extraction Method and the Profile Fitting Method, our method has the smallest residuals and is least affected by cross talk. For the noise-added image, the computation speed does not depend very much on fiber distance, the signal-to-noise ratio converges in 2-4 iterations, and the computation times are about 3.5 hr for the extreme fiber distance and about 2 hr for nonextreme cases. A better balance between computation time and result precision can be achieved by setting the precision threshold close to the noise level. Finally, we apply our method to real LAMOST (Large sky Area Multi-Object fiber Spectroscopic Telescope; a.k.a. Guo Shou Jing Telescope) data. We find that the 1D spectrum extracted by our
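
    The regularization step amounts to replacing the ill-conditioned normal equations A^T A f = A^T c of the deconvolution with (A^T A + λI) f = A^T c. A toy 1-D numpy sketch, with a Gaussian response matrix standing in for the real fibre PSFs, is:

        import numpy as np

        n = 200
        pix = np.arange(n)

        # Toy "PSF" matrix: each column is the Gaussian response of one spectral element
        A = np.exp(-0.5 * ((pix[:, None] - pix[None, :]) / 2.5) ** 2)
        A /= A.sum(axis=0)

        truth = np.zeros(n)
        truth[40::30] = 1.0                                   # a few emission "lines"
        ccd = A @ truth + np.random.default_rng(0).normal(scale=0.01, size=n)

        lam = 1e-3                                            # Tikhonov regularization strength
        lhs = A.T @ A + lam * np.eye(n)                       # (A^T A + lambda I)
        rhs = A.T @ ccd
        spectrum = np.linalg.solve(lhs, rhs)                  # regularized deconvolution

        print("residual RMS:", np.sqrt(np.mean((A @ spectrum - ccd) ** 2)))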

  7. Computational Amide I 2D IR Spectroscopy as a Probe of Protein Structure and Dynamics.

    PubMed

    Reppert, Mike; Tokmakoff, Andrei

    2016-05-27

    Two-dimensional infrared spectroscopy of amide I vibrations is increasingly being used to study the structure and dynamics of proteins and peptides. Amide I, a primarily carbonyl stretching vibration of the protein backbone, provides information on secondary structures as a result of vibrational couplings and on hydrogen-bonding contacts when isotope labeling is used to isolate specific sites. In parallel with experiments, computational models of amide I spectra that use atomistic structures from molecular dynamics simulations have evolved to calculate experimental spectra. Mixed quantum-classical models use spectroscopic maps to translate the structural information into a quantum-mechanical Hamiltonian for the spectroscopically observed vibrations. This allows one to model the spectroscopy of large proteins, disordered states, and protein conformational dynamics. With improvements in amide I models, quantitative modeling of time-dependent structural ensembles and of direct feedback between experiments and simulations is possible. We review the advances in developing these models, their theoretical basis, and current and future applications. PMID:27023758

  8. Computational Amide I 2D IR Spectroscopy as a Probe of Protein Structure and Dynamics

    NASA Astrophysics Data System (ADS)

    Reppert, Mike; Tokmakoff, Andrei

    2016-05-01

    Two-dimensional infrared spectroscopy of amide I vibrations is increasingly being used to study the structure and dynamics of proteins and peptides. Amide I, a primarily carbonyl stretching vibration of the protein backbone, provides information on secondary structures as a result of vibrational couplings and on hydrogen-bonding contacts when isotope labeling is used to isolate specific sites. In parallel with experiments, computational models of amide I spectra that use atomistic structures from molecular dynamics simulations have evolved to calculate experimental spectra. Mixed quantum-classical models use spectroscopic maps to translate the structural information into a quantum-mechanical Hamiltonian for the spectroscopically observed vibrations. This allows one to model the spectroscopy of large proteins, disordered states, and protein conformational dynamics. With improvements in amide I models, quantitative modeling of time-dependent structural ensembles and of direct feedback between experiments and simulations is possible. We review the advances in developing these models, their theoretical basis, and current and future applications.

  9. Scaffold hopping through virtual screening using 2D and 3D similarity descriptors: ranking, voting, and consensus scoring.

    PubMed

    Zhang, Qiang; Muegge, Ingo

    2006-03-01

    The ability to find novel bioactive scaffolds in compound similarity-based virtual screening experiments has been studied comparing Tanimoto-based, ranking-based, voting, and consensus scoring protocols. Ligand sets for seven well-known drug targets (CDK2, COX2, estrogen receptor, neuraminidase, HIV-1 protease, p38 MAP kinase, thrombin) have been assembled such that each ligand represents its own unique chemotype, thus ensuring that each similarity recognition event between ligands constitutes a scaffold hopping event. In a series of virtual screening studies involving 9969 MDDR compounds as negative controls it has been found that atom pair descriptors and 3D pharmacophore fingerprints combined with ranking, voting, and consensus scoring strategies perform well in finding novel bioactive scaffolds. In addition, often superior performance has been observed for similarity-based virtual screening compared to structure-based methods. This finding suggests that information about a target obtained from known bioactive ligands is as valuable as knowledge of the target structures for identifying novel bioactive scaffolds through virtual screening. PMID:16509572
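
    The Tanimoto-and-ranking machinery behind such a screen can be sketched in a few lines on binary fingerprints stored as Python sets of on-bits, with a simple rank-sum consensus over several query ligands; the fingerprints and compound names below are invented and are not the descriptors used in the study.

        def tanimoto(fp_a, fp_b):
            """Tanimoto coefficient of two fingerprints given as sets of on-bits."""
            union = len(fp_a | fp_b)
            return len(fp_a & fp_b) / union if union else 0.0

        def consensus_rank(queries, library):
            """Rank library compounds by summing their ranks against each query ligand."""
            rank_sums = {name: 0 for name in library}
            for q in queries:
                ordered = sorted(library, key=lambda n: tanimoto(q, library[n]), reverse=True)
                for rank, name in enumerate(ordered):
                    rank_sums[name] += rank
            return sorted(rank_sums, key=rank_sums.get)       # lowest rank sum (best consensus) first

        # Hypothetical fingerprints: indices of bits switched on for each molecule
        queries = [{1, 4, 9, 15}, {1, 4, 10, 22}]
        library = {"cmpd_A": {1, 4, 9, 30}, "cmpd_B": {2, 5, 7}, "cmpd_C": {1, 4, 15, 22}}
        print(consensus_rank(queries, library))               # compound names, most similar overall first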

  10. 2-D FDTD computation of seismoelectric fields excited by an underground double couple in a horizontally layered formation

    NASA Astrophysics Data System (ADS)

    Wang, Z.; Guan, W.; Gao, Y.; Hu, H.

    2012-04-01

    Electromagnetic signals have been recorded during earthquakes (e.g. Karakelian et al., 2002). One important mechanism for the coupling between the elastic and the electromagnetic energies is the electrokinetic effect. Gao and Hu (2010) simulated the electromagnetic fields excited by a double couple by solving analytically the set of equations derived by Pride (1994), which combines the Biot equations with the Maxwell equations. However, an analytical solution is not available when the geological structure is complex. Numerical methods are thus needed to solve for the seismoelectric fields. In the present work, seismoelectric fields excited by an underground double couple in a horizontally layered geological structure are computed by solving the Pride equations with a finite-difference time-domain (FDTD) algorithm on 2-D grids. A double couple source represents a small fault and is not axisymmetric. However, as the layered formation is axisymmetric, we only need to solve a 2-D problem by Fourier transforming the seismoelectric fields from the azimuthal angle θ domain to the corresponding wavenumber m domain in cylindrical coordinates. Further, we can prove that m ≤ 2 for a double couple source. A 2-D FDTD grid is developed, and the perfectly matched layer technique (Guan and Hu, 2008) is applied to truncate the computational region. The radiation pattern of the double couple is computed. The seismic and the electromagnetic fields on the surface of the layered formation are obtained and compared to the analytical results given by Hu and Gao (2011). Good agreement between the FDTD results and the analytical solutions shows the validity of our FDTD algorithm. Extension to a general 3-D problem is under way. A key issue involved in our modeling of the earthquake source in a porous medium is to find out the body forces in the Pride equations. We point out that if the Biot (1956) theory (which is one basis of the Pride equations) is used, no equivalent force should be

  11. A computational model that recovers the 3D shape of an object from a single 2D retinal representation.

    PubMed

    Li, Yunfeng; Pizlo, Zygmunt; Steinman, Robert M

    2009-05-01

    Human beings perceive 3D shapes veridically, but the underlying mechanisms remain unknown. The problem of producing veridical shape percepts is computationally difficult because the 3D shapes have to be recovered from 2D retinal images. This paper describes a new model, based on a regularization approach, that does this very well. It uses a new simplicity principle composed of four shape constraints: viz., symmetry, planarity, maximum compactness and minimum surface. Maximum compactness and minimum surface have never been used before. The model was tested with random symmetrical polyhedra. It recovered their 3D shapes from a single randomly-chosen 2D image. Neither learning, nor depth perception, was required. The effectiveness of the maximum compactness and the minimum surface constraints were measured by how well the aspect ratio of the 3D shapes was recovered. These constraints were effective; they recovered the aspect ratio of the 3D shapes very well. Aspect ratios recovered by the model were compared to aspect ratios adjusted by four human observers. They also adjusted aspect ratios very well. In those rare cases, in which the human observers showed large errors in adjusted aspect ratios, their errors were very similar to the errors made by the model. PMID:18621410

  12. Computer-assisted assignment of 2D 1H NMR spectra of proteins: basic algorithms and application to phoratoxin B.

    PubMed

    Kleywegt, G J; Boelens, R; Cox, M; Llinás, M; Kaptein, R

    1991-05-01

    A suite of computer programs (CLAIRE) is described which can be of assistance in the process of assigning 2D 1H NMR spectra of proteins. The programs embody a software implementation of the sequential assignment approach first developed by Wüthrich and co-workers (K. Wüthrich, G. Wider, G. Wagner and W. Braun (1982) J. Mol. Biol. 155, 311). After data-abstraction (peakpicking), the software can be used to detect patterns (spin systems), to find cross peaks between patterns in 2D NOE data sets and to generate assignments that are consistent with all available data and which satisfy a number of constraints imposed by the user. An interactive graphics program called CONPAT is used to control the entire assignment process as well as to provide the essential feedback from the experimental NMR spectra. The algorithms are described in detail and the approach is demonstrated on a set of spectra from the mistletoe protein phoratoxin B, a homolog of crambin. The results obtained compare well with those reported earlier based entirely on a manual assignment process. PMID:1841687

  13. Motivational Screen Design Guidelines for Effective Computer-Mediated Instruction.

    ERIC Educational Resources Information Center

    Lee, Sung Heum; Boling, Elizabeth

    Screen designers for computer-mediated instruction (CMI) products must consider the motivational appeal of their designs. Although learners may be motivated to use CMI programs initially because of their novelty, this effect wears off and the instruction must stand on its own. Instructional screens must provide effective and efficient instruction,…

  14. Systematic E2 screening reveals a UBE2D-RNF138-CtIP axis promoting DNA repair.

    PubMed

    Schmidt, Christine K; Galanty, Yaron; Sczaniecka-Clift, Matylda; Coates, Julia; Jhujh, Satpal; Demir, Mukerrem; Cornwell, Matthew; Beli, Petra; Jackson, Stephen P

    2015-11-01

    Ubiquitylation is crucial for proper cellular responses to DNA double-strand breaks (DSBs). If unrepaired, these highly cytotoxic lesions cause genome instability, tumorigenesis, neurodegeneration or premature ageing. Here, we conduct a comprehensive, multilayered screen to systematically profile all human ubiquitin E2 enzymes for impacts on cellular DSB responses. With a widely applicable approach, we use an exemplary E2 family, UBE2Ds, to identify ubiquitylation-cascade components downstream of E2s. Thus, we uncover the nuclear E3 ligase RNF138 as a key homologous recombination (HR)-promoting factor that functions with UBE2Ds in cells. Mechanistically, UBE2Ds and RNF138 accumulate at DNA-damage sites and act at early resection stages by promoting CtIP ubiquitylation and accrual. This work supplies insights into regulation of DSB repair by HR. Moreover, it provides a rich information resource on E2s that can be exploited by follow-on studies. PMID:26502057

  15. Systematic E2 screening reveals a UBE2D-RNF138-CtIP axis promoting DNA repair

    PubMed Central

    Sczaniecka-Clift, Matylda; Coates, Julia; Jhujh, Satpal; Demir, Mukerrem; Cornwell, Matthew; Beli, Petra; Jackson, Stephen P

    2016-01-01

    Ubiquitylation is crucial for proper cellular responses to DNA double-strand breaks (DSBs). If unrepaired, these highly cytotoxic lesions cause genome instability, tumourigenesis, neurodegeneration or premature ageing. Here, we conduct a comprehensive, multilayered screen to systematically profile all human ubiquitin E2-enzymes for impacts on cellular DSB responses. Applying a widely applicable approach, we use an exemplary E2 family, UBE2Ds, to identify ubiquitylation-cascade components downstream of E2s. Thus, we uncover the nuclear E3-ligase RNF138 as a key homologous recombination (HR)-promoting factor that functions with UBE2Ds in cells. Mechanistically, UBE2Ds and RNF138 accumulate at DNA-damage sites and act at early resection stages by promoting CtIP ubiquitylation and accrual. This work supplies insights into regulation of DSB repair by HR. Moreover, it provides a rich information resource on E2s that can be exploited by follow-on studies. PMID:26502057

  16. SCREENING CHEMICALS FOR ESTROGEN RECEPTOR BIOACTIVITY USING A COMPUTATIONAL MODEL

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) is considering the use of high-throughput and computational methods for regulatory applications in the Endocrine Disruptor Screening Program (EDSP). To use these new tools for regulatory decision making, computational methods must be a...

  17. ASIC-based architecture for the real-time computation of 2D convolution with large kernel size

    NASA Astrophysics Data System (ADS)

    Shao, Rui; Zhong, Sheng; Yan, Luxin

    2015-12-01

    Bidimensional convolution is a low-level processing algorithm of interest in many areas, but its high computational cost constrains the size of the kernels, especially in real-time embedded systems. This paper presents a hardware architecture for the ASIC-based implementation of 2-D convolution with medium-to-large kernels. Aiming to improve the efficiency of on-chip storage resources and to reduce the off-chip memory bandwidth, a data-cache reuse scheme is proposed: multi-block SPRAM caches image blocks across accesses, and an on-chip ping-pong operation takes full advantage of data reuse in the convolution calculation; a new ASIC data-scheduling scheme and overall architecture are designed around it. Experimental results show that the structure can perform real-time convolution with kernel (template) sizes up to 40 × 32 and improves the utilization of on-chip memory bandwidth and on-chip memory resources; the structure satisfies the conditions needed to maximize data throughput while reducing the required off-chip memory bandwidth.
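
    A software analogue of the data-reuse idea is a line buffer: only the kernel-height window of image rows is kept on chip at any time, so each input row is fetched from external memory exactly once. A plain Python/numpy sketch illustrating the reuse pattern (not the paper's ASIC architecture) is:

        import numpy as np
        from collections import deque

        def convolve_with_line_buffer(image, kernel):
            """'Valid' 2-D convolution (cross-correlation convention) that streams the
            image row by row through a line buffer holding only kernel-height rows,
            mimicking on-chip data reuse: each input row is read exactly once."""
            kh, kw = kernel.shape
            cols = image.shape[1] - kw + 1
            buf = deque(maxlen=kh)                  # the "on-chip" line buffer
            out_rows = []
            for row in image:                       # each row fetched from "off-chip" memory once
                buf.append(row)
                if len(buf) == kh:
                    window = np.stack(buf)          # kh x width block currently held on chip
                    out_rows.append([np.sum(window[:, c:c + kw] * kernel) for c in range(cols)])
            return np.array(out_rows)

        img = np.random.rand(48, 48)
        ker = np.random.rand(5, 5)
        ref = np.array([[np.sum(img[r:r + 5, c:c + 5] * ker) for c in range(44)] for r in range(44)])
        assert np.allclose(convolve_with_line_buffer(img, ker), ref)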

  18. Computationally inexpensive identification of noninformative model parameters by sequential screening

    NASA Astrophysics Data System (ADS)

    Mai, Juliane; Cuntz, Matthias; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis

    2016-04-01

    Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.

  19. Computationally inexpensive identification of noninformative model parameters by sequential screening

    NASA Astrophysics Data System (ADS)

    Cuntz, Matthias; Mai, Juliane; Zink, Matthias; Thober, Stephan; Kumar, Rohini; Schäfer, David; Schrön, Martin; Craven, John; Rakovec, Oldrich; Spieler, Diana; Prykhodko, Vladyslav; Dalmasso, Giovanni; Musuuza, Jude; Langenberg, Ben; Attinger, Sabine; Samaniego, Luis

    2015-08-01

    Environmental models tend to require increasing computational time and resources as physical process descriptions are improved or new descriptions are incorporated. Many-query applications such as sensitivity analysis or model calibration usually require a large number of model evaluations leading to high computational demand. This often limits the feasibility of rigorous analyses. Here we present a fully automated sequential screening method that selects only informative parameters for a given model output. The method requires a number of model evaluations that is approximately 10 times the number of model parameters. It was tested using the mesoscale hydrologic model mHM in three hydrologically unique European river catchments. It identified around 20 informative parameters out of 52, with different informative parameters in each catchment. The screening method was evaluated with subsequent analyses using all 52 as well as only the informative parameters. Subsequent Sobol's global sensitivity analysis led to almost identical results yet required 40% fewer model evaluations after screening. mHM was calibrated with all and with only informative parameters in the three catchments. Model performances for daily discharge were equally high in both cases with Nash-Sutcliffe efficiencies above 0.82. Calibration using only the informative parameters needed just one third of the number of model evaluations. The universality of the sequential screening method was demonstrated using several general test functions from the literature. We therefore recommend the use of the computationally inexpensive sequential screening method prior to rigorous analyses on complex environmental models.

  20. GRID2D/3D: A computer program for generating grid systems in complex-shaped two- and three-dimensional spatial domains. Part 1: Theory and method

    NASA Technical Reports Server (NTRS)

    Shih, T. I.-P.; Bailey, R. T.; Nguyen, H. L.; Roelke, R. J.

    1990-01-01

    An efficient computer program, called GRID2D/3D was developed to generate single and composite grid systems within geometrically complex two- and three-dimensional (2- and 3-D) spatial domains that can deform with time. GRID2D/3D generates single grid systems by using algebraic grid generation methods based on transfinite interpolation in which the distribution of grid points within the spatial domain is controlled by stretching functions. All single grid systems generated by GRID2D/3D can have grid lines that are continuous and differentiable everywhere up to the second-order. Also, grid lines can intersect boundaries of the spatial domain orthogonally. GRID2D/3D generates composite grid systems by patching together two or more single grid systems. The patching can be discontinuous or continuous. For continuous composite grid systems, the grid lines are continuous and differentiable everywhere up to the second-order except at interfaces where different single grid systems meet. At interfaces where different single grid systems meet, the grid lines are only differentiable up to the first-order. For 2-D spatial domains, the boundary curves are described by using either cubic or tension spline interpolation. For 3-D spatial domains, the boundary surfaces are described by using either linear Coon's interpolation, bi-hyperbolic spline interpolation, or a new technique referred to as 3-D bi-directional Hermite interpolation. Since grid systems generated by algebraic methods can have grid lines that overlap one another, GRID2D/3D contains a graphics package for evaluating the grid systems generated. With the graphics package, the user can generate grid systems in an interactive manner with the grid generation part of GRID2D/3D. GRID2D/3D is written in FORTRAN 77 and can be run on any IBM PC, XT, or AT compatible computer. In order to use GRID2D/3D on workstations or mainframe computers, some minor modifications must be made in the graphics part of the program; no
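
    The transfinite interpolation underlying such algebraic grid generation blends the four boundary curves and subtracts a bilinear corner correction. A minimal 2-D numpy sketch, with illustrative boundary curves and no stretching functions, is:

        import numpy as np

        def tfi_grid(bottom, top, left, right, ni=21, nj=11):
            """2-D transfinite interpolation. bottom/top are functions of xi in [0, 1],
            left/right are functions of eta in [0, 1]; each returns an (x, y) point."""
            xi = np.linspace(0.0, 1.0, ni)
            eta = np.linspace(0.0, 1.0, nj)
            grid = np.zeros((ni, nj, 2))
            for i, s in enumerate(xi):
                for j, t in enumerate(eta):
                    grid[i, j] = ((1 - t) * np.array(bottom(s)) + t * np.array(top(s))
                                  + (1 - s) * np.array(left(t)) + s * np.array(right(t))
                                  - (1 - s) * (1 - t) * np.array(bottom(0))
                                  - (1 - s) * t * np.array(top(0))
                                  - s * (1 - t) * np.array(bottom(1))
                                  - s * t * np.array(top(1)))
            return grid

        # Example: a channel whose lower wall carries a sinusoidal bump
        grid = tfi_grid(bottom=lambda s: (s, 0.1 * np.sin(np.pi * s)),
                        top=lambda s: (s, 1.0),
                        left=lambda t: (0.0, t),
                        right=lambda t: (1.0, t))
        print(grid.shape)   # (21, 11, 2): x, y coordinates of each grid point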

  1. Verification and benchmarking of MAGNUM-2D: a finite element computer code for flow and heat transfer in fractured porous media

    SciTech Connect

    Eyler, L.L.; Budden, M.J.

    1985-03-01

    The objective of this work is to assess prediction capabilities and features of the MAGNUM-2D computer code in relation to its intended use in the Basalt Waste Isolation Project (BWIP). This objective is accomplished through a code verification and benchmarking task. Results are documented which support correctness of prediction capabilities in areas of intended model application. 10 references, 43 figures, 11 tables.

  2. Logistical Consideration in Computer-Based Screening of Astronaut Applicants

    NASA Technical Reports Server (NTRS)

    Galarza, Laura

    2000-01-01

    This presentation reviews the logistical, ergonomic, and psychometric issues and data related to the development and operational use of a computer-based system for the psychological screening of astronaut applicants. The Behavioral Health and Performance Group (BHPG) at the Johnson Space Center upgraded its astronaut psychological screening and selection procedures for the 1999 astronaut applicants and subsequent astronaut selection cycles. The questionnaires, tests, and inventories were upgraded from a paper-and-pencil system to a computer-based system. Members of the BHPG and a computer programmer designed and developed the needed interfaces (screens, buttons, etc.) and programs for the astronaut psychological assessment system. This intranet-based system included user-friendly computer-based administration of tests, test scoring, generation of reports, the integration of test administration and test output into a single system, and a complete database for past, present, and future selection data. Upon completion of the system development phase, four beta and usability tests were conducted with the newly developed system. The first three tests included 1 to 3 participants each. The final system test was conducted with 23 participants tested simultaneously. Usability and ergonomic data were collected from the system (beta) test participants and from 1999 astronaut applicants who volunteered the information in exchange for anonymity. Beta and usability test data were analyzed to examine operational, ergonomic, programming, test administration, and scoring issues related to computer-based testing. Results showed a preference for computer-based testing over paper-and-pencil procedures. The data also reflected specific ergonomic, usability, psychometric, and logistical concerns that should be taken into account in future selection cycles. Conclusion. Psychological, psychometric, human and logistical factors must be examined and considered carefully when developing and

  3. A Benchmarking Analysis for Five Radionuclide Vadose Zone Models (Chain, Multimed_DP, Fectuz, Hydrus, and Chain 2D) in Soil Screening Level Calculations

    SciTech Connect

    Chen, J-S.; Drake, R.; Lin, Z.; Jewett, D. G.

    2002-02-26

    Five vadose zone models with different degrees of complexity (CHAIN, MULTIMED_DP, FECTUZ, HYDRUS, and CHAIN 2D) were selected for use in radionuclide soil screening level (SSL) calculations. A benchmarking analysis between the models was conducted for a radionuclide (99Tc) release scenario at the Las Cruces Trench Site in New Mexico. The sensitivity of three model outputs to the input parameters was evaluated and compared among the models. The three outputs were peak contaminant concentrations, time to peak concentrations at the water table, and time to exceed the contaminant's maximum critical level at a representative receptor well. Model parameters investigated include soil properties such as bulk density, water content, soil water retention parameters and hydraulic conductivity. Chemical properties examined include distribution coefficient, radionuclide half-life, dispersion coefficient, and molecular diffusion. Other soil characteristics, such as recharge rate, also were examined. Model sensitivity was quantified in the form of sensitivity and relative sensitivity coefficients. Relative sensitivities were used to compare the sensitivities of different parameters. The analysis indicates that soil water content, recharge rate, saturated soil water content, and the soil retention parameter β have a great influence on model outputs. In general, the sensitivities and relative sensitivities obtained with the five models are similar for a specific scenario. Slight differences were observed in predicted peak contaminant concentrations due to different mathematical treatments among the models. The results of the benchmarking and sensitivity analysis should facilitate model selection and application in SSL calculations.
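
    The relative sensitivity coefficients compared above have the dimensionless form S_rel = (dy/dx)*(x/y). A small finite-difference sketch, in which the surrogate model and nominal parameter values are invented stand-ins for an actual vadose zone model, is:

        def relative_sensitivity(model, params, name, delta=0.01):
            """Dimensionless relative sensitivity of the model output to one parameter:
            S_rel = (dy/dx) * (x / y), estimated with a forward finite difference."""
            y0 = model(params)
            perturbed = dict(params)
            perturbed[name] = params[name] * (1.0 + delta)
            y1 = model(perturbed)
            return ((y1 - y0) / (params[name] * delta)) * (params[name] / y0)

        # Illustrative surrogate for peak concentration at the water table
        def toy_model(p):
            return p["recharge"] * p["water_content"] ** -1.5 * 2.0 ** (-p["half_life_factor"])

        nominal = {"recharge": 0.1, "water_content": 0.3, "half_life_factor": 1.0}
        for key in nominal:
            print(key, round(relative_sensitivity(toy_model, nominal, key), 3))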

  4. A review of automated image understanding within 3D baggage computed tomography security screening.

    PubMed

    Mouton, Andre; Breckon, Toby P

    2015-01-01

    Baggage inspection is the principal safeguard against the transportation of prohibited and potentially dangerous materials at airport security checkpoints. Although traditionally performed by 2D X-ray based scanning, increasingly stringent security regulations have led to a growing demand for more advanced imaging technologies. The role of X-ray Computed Tomography is thus rapidly expanding beyond the traditional materials-based detection of explosives. The development of computer vision and image processing techniques for the automated understanding of 3D baggage-CT imagery is, however, complicated by poor image resolutions, image clutter and high levels of noise and artefacts. We discuss the recent and most pertinent advancements and identify topics for future research within the challenging domain of automated image understanding for baggage security screening CT. PMID:26409422

  5. Decision trees and integrated features for computer aided mammographic screening

    SciTech Connect

    Kegelmeyer, W.P. Jr.; Groshong, B.; Allmen, M.; Woods, K.

    1997-02-01

    Breast cancer is a serious problem, which in the United States causes 43,000 deaths a year, eventually striking 1 in 9 women. Early detection is the only effective countermeasure, and mass mammography screening is the only reliable means for early detection. Mass screening has many shortcomings which could be addressed by a computer-aided mammographic screening system. Accordingly, we have applied the pattern recognition methods developed in earlier investigations of spiculated lesions in mammograms to the detection of microcalcifications and circumscribed masses, generating new, more rigorous and uniform methods for the detection of both those signs. We have also improved the pattern recognition methods themselves, through the development of a new approach to combinations of multiple classifiers.

  6. Screening Chemicals for Estrogen Receptor Bioactivity Using a Computational Model.

    PubMed

    Browne, Patience; Judson, Richard S; Casey, Warren M; Kleinstreuer, Nicole C; Thomas, Russell S

    2015-07-21

    The U.S. Environmental Protection Agency (EPA) is considering high-throughput and computational methods to evaluate the endocrine bioactivity of environmental chemicals. Here we describe a multistep, performance-based validation of new methods and demonstrate that these new tools are sufficiently robust to be used in the Endocrine Disruptor Screening Program (EDSP). Results from 18 estrogen receptor (ER) ToxCast high-throughput screening assays were integrated into a computational model that can discriminate bioactivity from assay-specific interference and cytotoxicity. Model scores range from 0 (no activity) to 1 (bioactivity of 17β-estradiol). ToxCast ER model performance was evaluated for reference chemicals, as well as results of EDSP Tier 1 screening assays in current practice. The ToxCast ER model accuracy was 86% to 93% when compared to reference chemicals and predicted results of EDSP Tier 1 guideline and other uterotrophic studies with 84% to 100% accuracy. The performance of high-throughput assays and ToxCast ER model predictions demonstrates that these methods correctly identify active and inactive reference chemicals, provide a measure of relative ER bioactivity, and rapidly identify chemicals with potential endocrine bioactivities for additional screening and testing. EPA is accepting ToxCast ER model data for 1812 chemicals as alternatives for EDSP Tier 1 ER binding, ER transactivation, and uterotrophic assays. PMID:26066997

  7. Computer Simulation Of Radiographic Screen-Film Images

    NASA Astrophysics Data System (ADS)

    Metter, Richard V.; Dillon, Peter L.; Huff, Kenneth E.; Rabbani, Majid

    1986-06-01

    A method is described for computer simulation of radiographic screen-film images. This method is based on a previously published model of the screen-film imaging process [1]. The x-ray transmittance of a test object is sampled at a pitch of 50 μm by scanning a high-resolution, low-noise direct-exposure radiograph. This transmittance is then used, along with the x-ray exposure incident upon the object, to determine the expected number of quanta per pixel incident upon the screen. The random nature of x-ray arrival and absorption, x-ray quantum to light photon conversion, and photon absorption by the film is simulated by appropriate random number generation. Standard FFT techniques are used for computing the effects of scattering. Finally, the computed film density for each pixel is produced on a high-resolution, low-noise output film by a scanning printer. The simulation allows independent specification of x-ray exposure, x-ray quantum absorption, light conversion statistics, light scattering, and film characteristics (sensitometry and granularity). Each of these parameters is independently measured for radiographic systems of interest. The simulator is tested by comparing actual radiographic images with simulated images resulting from the independently measured parameters. Images are also shown illustrating the effects of changes in these parameters on image quality. Finally, comparison is made with a "perfect" imaging system where information content is only limited by the finite number of x-rays.
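
    As a rough illustration of the simulation chain described above (Poisson x-ray arrival, quantum absorption in the screen, light-photon conversion, FFT-based scattering, film response), the Python sketch below strings the same stochastic stages together; the function name, parameter values, and the toy film response are illustrative assumptions, not the authors' implementation.

      import numpy as np

      rng = np.random.default_rng(0)

      def simulate_screen_film(q_mean, absorption=0.5, gain=100.0, psf=None):
          """q_mean: expected x-ray quanta per pixel (2D array)."""
          q_arrived  = rng.poisson(q_mean)                   # x-ray arrival statistics
          q_absorbed = rng.binomial(q_arrived, absorption)   # quantum absorption in the screen
          light      = rng.poisson(gain * q_absorbed)        # x-ray quantum to light-photon conversion
          if psf is not None:                                # light scattering as an FFT convolution
              light = np.real(np.fft.ifft2(np.fft.fft2(light) * np.fft.fft2(psf, light.shape)))
              light = np.clip(light, 0.0, None)
          return np.log10(1.0 + light)                       # toy stand-in for the film's sensitometric curve

      density = simulate_screen_film(np.full((256, 256), 50.0))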

  8. Validity of computational hemodynamics in human arteries based on 3D time-of-flight MR angiography and 2D electrocardiogram gated phase contrast images

    NASA Astrophysics Data System (ADS)

    Yu, Huidan (Whitney); Chen, Xi; Chen, Rou; Wang, Zhiqiang; Lin, Chen; Kralik, Stephen; Zhao, Ye

    2015-11-01

    In this work, we demonstrate the validity of 4-D patient-specific computational hemodynamics (PSCH) based on 3-D time-of-flight (TOF) MR angiography (MRA) and 2-D electrocardiogram (ECG) gated phase contrast (PC) images. The mesoscale lattice Boltzmann method (LBM) is employed to segment morphological arterial geometry from TOF MRA, to extract velocity profiles from ECG PC images, and to simulate fluid dynamics on a unified GPU accelerated computational platform. Two healthy volunteers are recruited to participate in the study. For each volunteer, a 3-D high resolution TOF MRA image and 10 2-D ECG gated PC images are acquired to provide the morphological geometry and the time-varying flow velocity profiles as the necessary inputs to the PSCH. Validation results will be presented through comparisons of LBM vs. 4D Flow Software for flow rates and LBM simulation vs. MRA measurement for blood flow velocity maps. Indiana University Health (IUH) Values Fund.
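
    For context, the single-relaxation-time (BGK) form of the lattice Boltzmann update referred to above is, in its standard textbook statement (the record does not specify which collision operator or GPU kernel layout was actually used),

      f_i(\mathbf{x} + \mathbf{c}_i \Delta t,\; t + \Delta t)
        = f_i(\mathbf{x}, t)
        - \frac{\Delta t}{\tau} \left[ f_i(\mathbf{x}, t) - f_i^{\mathrm{eq}}(\rho, \mathbf{u}) \right],
      \qquad
      \rho = \sum_i f_i, \quad \rho \mathbf{u} = \sum_i \mathbf{c}_i f_i,

    where the f_i are particle distribution functions along discrete lattice velocities c_i and τ sets the fluid viscosity; in the record, the TOF MRA supplies the arterial geometry and the ECG-gated PC images supply the time-varying velocity profiles.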

  9. Affinity-Based Screening of Tetravalent Peptides Identifies Subtype-Selective Neutralizers of Shiga Toxin 2d, a Highly Virulent Subtype, by Targeting a Unique Amino Acid Involved in Its Receptor Recognition.

    PubMed

    Mitsui, Takaaki; Watanabe-Takahashi, Miho; Shimizu, Eiko; Zhang, Baihao; Funamoto, Satoru; Yamasaki, Shinji; Nishikawa, Kiyotaka

    2016-09-01

    Shiga toxin (Stx), a major virulence factor of enterohemorrhagic Escherichia coli (EHEC), can be classified into two subgroups, Stx1 and Stx2, each consisting of various closely related subtypes. Stx2 subtypes Stx2a and Stx2d are highly virulent and linked with serious human disorders, such as acute encephalopathy and hemolytic-uremic syndrome. Through affinity-based screening of a tetravalent peptide library, we previously developed peptide neutralizers of Stx2a in which the structure was optimized to bind to the B-subunit pentamer. In this study, we identified Stx2d-selective neutralizers by targeting Asn16 of the B subunit, an amino acid unique to Stx2d that plays an essential role in receptor binding. We synthesized a series of tetravalent peptides on a cellulose membrane in which the core structure was exactly the same as that of peptides in the tetravalent library. A total of nine candidate motifs were selected to synthesize tetravalent forms of the peptides by screening two series of the tetravalent peptides. Five of the tetravalent peptides effectively inhibited the cytotoxicity of Stx2a and Stx2d, and notably, two of the peptides selectively inhibited Stx2d. These two tetravalent peptides bound to the Stx2d B subunit with high affinity dependent on Asn16. The mechanism of binding to the Stx2d B subunit differed from that of binding to Stx2a in that the peptides covered a relatively wide region of the receptor-binding surface. Thus, this highly optimized screening technique enables the development of subtype-selective neutralizers, which may lead to more sophisticated treatments of infections by Stx-producing EHEC. PMID:27382021

  10. Analysis of 2D Torus and Hub Topologies of 100Mb/s Ethernet for the Whitney Commodity Computing Testbed

    NASA Technical Reports Server (NTRS)

    Pedretti, Kevin T.; Fineberg, Samuel A.; Kutler, Paul (Technical Monitor)

    1997-01-01

    A variety of different network technologies and topologies are currently being evaluated as part of the Whitney Project. This paper reports on the implementation and performance of a Fast Ethernet network configured in a 4x4 2D torus topology in a testbed cluster of 'commodity' Pentium Pro PCs. Several benchmarks were used for performance evaluation: an MPI point to point message passing benchmark, an MPI collective communication benchmark, and the NAS Parallel Benchmarks version 2.2 (NPB2). Our results show that for point to point communication on an unloaded network, the hub and 1 hop routes on the torus have about the same bandwidth and latency. However, the bandwidth decreases and the latency increases on the torus for each additional route hop. Collective communication benchmarks show that the torus provides roughly four times more aggregate bandwidth and eight times faster MPI barrier synchronizations than a hub based network for 16 processor systems. Finally, the SOAPBOX benchmarks, which simulate real-world CFD applications, generally demonstrated substantially better performance on the torus than on the hub. In the few cases the hub was faster, the difference was negligible. In total, our experimental results lead to the conclusion that for Fast Ethernet networks, the torus topology has better performance and scales better than a hub based network.
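
    The bandwidth and latency trends reported above follow directly from routing distance: through a hub every pair of nodes is effectively one hop apart, whereas on a torus the minimal hop count is the wrap-around Manhattan distance. A small Python check of that distance for the 4x4 torus (illustrative only, not the authors' benchmark code):

      from itertools import product

      def torus_hops(a, b, n=4):
          """Minimal hop count between nodes a and b on an n x n torus."""
          return sum(min(abs(ai - bi), n - abs(ai - bi)) for ai, bi in zip(a, b))

      nodes = list(product(range(4), repeat=2))
      pairs = [(a, b) for a in nodes for b in nodes if a != b]
      avg = sum(torus_hops(a, b) for a, b in pairs) / len(pairs)
      print(f"average hops on a 4x4 torus: {avg:.2f}")  # about 2.13, versus a single hop through a hub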

  11. A novel approach of computer-aided detection of focal ground-glass opacity in 2D lung CT images

    NASA Astrophysics Data System (ADS)

    Li, Song; Liu, Xiabi; Yang, Ali; Pang, Kunpeng; Zhou, Chunwu; Zhao, Xinming; Zhao, Yanfeng

    2013-02-01

    Focal Ground-Glass Opacity (fGGO) plays an important role in the diagnosis of lung cancer. This paper proposes a novel approach for detecting fGGOs in 2D lung CT images. The approach consists of two stages: extracting regions of interest (ROIs) and labeling each ROI as fGGO or non-fGGO. In the first stage, we use the techniques of Otsu thresholding and mathematical morphology to segment lung parenchyma from lung CT images and extract ROIs in the lung parenchyma. In the second stage, a Bayesian classifier is constructed based on Gaussian Mixture Modeling (GMM) of the distribution of visual features of fGGOs to perform ROI identification. The parameters of the classifier are estimated from training data by the discriminative learning method of Max-Min posterior Pseudo-probabilities (MMP). A genetic algorithm is further developed to select compact and discriminative features for the classifier. We evaluated the proposed fGGO detection approach through 5-fold cross-validation experiments on a set of 69 lung CT scans that contain 70 fGGOs. The proposed approach achieves a detection sensitivity of 85.7% at a false positive rate of 2.5 per scan, which proves its effectiveness. We also demonstrate the usefulness of our genetic algorithm based feature selection method and the MMP discriminative learning method by comparing them with a no-selection strategy and with Support Vector Machines (SVMs), respectively, in the experiments.
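
    A minimal sketch of the second-stage classifier idea, assuming synthetic ROI feature vectors and ordinary maximum-likelihood GMM fitting in place of the paper's MMP discriminative training (which is not reproduced here):

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(1)
      X_fggo     = rng.normal(loc=1.0, scale=0.5, size=(200, 4))   # synthetic fGGO feature vectors
      X_non_fggo = rng.normal(loc=0.0, scale=1.0, size=(800, 4))   # synthetic non-fGGO feature vectors

      gmm_pos = GaussianMixture(n_components=3, random_state=0).fit(X_fggo)
      gmm_neg = GaussianMixture(n_components=3, random_state=0).fit(X_non_fggo)

      def classify(roi_features, prior_pos=0.2):
          """Label ROIs as fGGO when the class-conditional GMM posterior favours the positive class."""
          log_pos = gmm_pos.score_samples(roi_features) + np.log(prior_pos)
          log_neg = gmm_neg.score_samples(roi_features) + np.log(1.0 - prior_pos)
          return log_pos > log_neg

      print(classify(rng.normal(1.0, 0.5, size=(5, 4))).astype(int))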

  12. Fault-tolerant quantum computation and communication on a distributed 2D array of small local systems

    SciTech Connect

    Fujii, K.; Yamamoto, T.; Imoto, N.; Koashi, M.

    2014-12-04

    We propose a scheme for distributed quantum computation with small local systems connected via noisy quantum channels. We show that the proposed scheme tolerates errors with probabilities of ∼30% and ∼0.1% in quantum channels and local operations, respectively, both of which are substantially improved compared to previous works.

  13. 3D computations of flow field in a guide vane blading designed by means of 2D model for a low head hydraulic turbine

    NASA Astrophysics Data System (ADS)

    Krzemianowski, Z.; Puzyrewski, R.

    2014-08-01

    The paper presents the main parameters of the flow field behind the guide vane cascade designed by means of a 2D inverse problem, followed by a check by means of the 3D commercial program ANSYS/Fluent applied to the direct problem. This approach of using different models reflects the contemporary design procedure for a non-standardized turbomachinery stage. Depending on the model, the set of conservation equations to be solved differs, although the physical background remains the same. An example of computations for a guide vane cascade for a low head hydraulic turbine is presented.

  14. Coupled 2-dimensional cascade theory for noise and unsteady aerodynamics of blade row interaction in turbofans. Volume 2: Documentation for computer code CUP2D

    NASA Technical Reports Server (NTRS)

    Hanson, Donald B.

    1994-01-01

    A two dimensional linear aeroacoustic theory for rotor/stator interaction with unsteady coupling was derived and explored in Volume 1 of this report. Computer program CUP2D has been written in FORTRAN embodying the theoretical equations. This volume (Volume 2) describes the structure of the code, installation and running, preparation of the input file, and interpretation of the output. A sample case is provided with printouts of the input and output. The source code is included with comments linking it closely to the theoretical equations in Volume 1.

  15. FACET: a radiation view factor computer code for axisymmetric, 2D planar, and 3D geometries with shadowing

    SciTech Connect

    Shapiro, A.B.

    1983-08-01

    The computer code FACET calculates the radiation geometric view factor (alternatively called shape factor, angle factor, or configuration factor) between surfaces for axisymmetric, two-dimensional planar and three-dimensional geometries with interposed third surface obstructions. FACET was developed to calculate view factors for input to finite-element heat-transfer analysis codes. The first section of this report is a brief review of previous radiation-view-factor computer codes. The second section presents the defining integral equation for the geometric view factor between two surfaces and the assumptions made in its derivation. Also in this section are the numerical algorithms used to integrate this equation for the various geometries. The third section presents the algorithms used to detect self-shadowing and third-surface shadowing between the two surfaces for which a view factor is being calculated. The fourth section provides a user's input guide followed by several example problems.
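
    The defining integral equation for the geometric view factor mentioned above is, in its standard diffuse-surface form (quoted from common radiation heat-transfer usage rather than from the report itself),

      F_{1 \rightarrow 2}
        = \frac{1}{A_1} \int_{A_1} \int_{A_2}
          \frac{\cos\theta_1 \, \cos\theta_2}{\pi r^2} \, dA_2 \, dA_1,

    where r is the distance between the two differential areas and θ1, θ2 are the angles between r and the respective surface normals; the later sections of the report describe the numerical quadrature used for each geometry and the shadowing tests applied to it.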

  16. VFLOW2D - A Vortex-Based Code for Computing Flow Over Elastically Supported Tubes and Tube Arrays

    SciTech Connect

    WOLFE,WALTER P.; STRICKLAND,JAMES H.; HOMICZ,GREGORY F.; GOSSLER,ALBERT A.

    2000-10-11

    A numerical flow model is developed to simulate two-dimensional fluid flow past immersed, elastically supported tube arrays. This work is motivated by the objective of predicting forces and motion associated with both deep-water drilling and production risers in the oil industry. This work has other engineering applications including simulation of flow past tubular heat exchangers or submarine-towed sensor arrays and the flow about parachute ribbons. In the present work, a vortex method is used for solving the unsteady flow field. This method demonstrates inherent advantages over more conventional grid-based computational fluid dynamics. The vortex method is non-iterative, does not require artificial viscosity for stability, displays minimal numerical diffusion, can easily treat moving boundaries, and allows a greatly reduced computational domain since vorticity occupies only a small fraction of the fluid volume. A gridless approach is used in the flow sufficiently distant from surfaces. A Lagrangian remap scheme is used near surfaces to calculate diffusion and convection of vorticity. A fast multipole technique is utilized for efficient calculation of velocity from the vorticity field. The ability of the method to correctly predict lift and drag forces on simple stationary geometries over a broad range of Reynolds numbers is presented.
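
    As an illustration of the velocity-from-vorticity step that the fast multipole technique accelerates, the Python fragment below performs the direct O(N²) 2D Biot-Savart summation for point vortices; it is a minimal sketch under standard vortex-method assumptions, not the VFLOW2D implementation.

      import numpy as np

      def induced_velocity(targets, vortex_pos, gamma, core=1e-3):
          """Velocity at `targets` induced by 2D point vortices of circulation `gamma`."""
          dx = targets[:, None, 0] - vortex_pos[None, :, 0]
          dy = targets[:, None, 1] - vortex_pos[None, :, 1]
          r2 = dx**2 + dy**2 + core**2                     # small core radius regularizes the singularity
          u = np.sum(-gamma * dy / (2.0 * np.pi * r2), axis=1)
          v = np.sum( gamma * dx / (2.0 * np.pi * r2), axis=1)
          return np.stack([u, v], axis=1)

      pts = np.random.default_rng(2).uniform(-1.0, 1.0, size=(100, 2))
      vel = induced_velocity(pts, vortex_pos=np.array([[0.0, 0.0]]), gamma=np.array([1.0]))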

  17. Computational Studies of Condensed Matter Systems: Manganese Vanadium Oxide and 2D attractive Hubbard model with spin-dependent disorder

    NASA Astrophysics Data System (ADS)

    Nanguneri, Ravindra

    The finite-temperature phase diagram for the 2D attractive fermion Hubbard model with spin-dependent disorder is considered within BdG mean-field theory. Three types of disorder are studied. In the first, only one species is coupled to a random site energy; in the second, the two species both move in random site-energy landscapes of the same amplitude but different realizations; and in the third, the disorder is in the hopping rather than the site energy. For all three cases we find that, unlike the case of spin-symmetric randomness, where the energy gap and average order parameter do not vanish as the disorder strength increases, a critical disorder strength exists separating distinct phases. In fact, the energy gap and the average order parameter vanish at distinct transitions, Vc,gap and Vc,op, allowing for a gapless superconducting (gSC) phase. The gSC phase becomes smaller with increasing temperature, until it vanishes at a temperature T*.

  18. Investigation of mechanical strength of 2D nanoscale structures using a molecular dynamics based computational intelligence approach

    NASA Astrophysics Data System (ADS)

    Garg, A.; Vijayaraghavan, V.; Wong, C. H.; Tai, K.; Singru, Pravin M.; Mahapatra, S. S.; Sangwan, K. S.

    2015-09-01

    A molecular dynamics (MD) based computational intelligence (CI) approach is proposed to investigate the Young's modulus of two graphene sheets: armchair and zigzag. In this approach, the effects of aspect ratio, temperature, number of atomic planes and vacancy defects on the Young's modulus of the two graphene sheets are first analyzed using MD simulation. The data obtained from the MD simulation are then fed into a CI cluster comprising genetic programming, which was specifically designed to formulate an explicit relationship for the Young's modulus of the two graphene structures. We find that the MD-based CI model captures the Young's modulus of the two graphene structures very well, in good agreement with experimental results from the literature. Additionally, we conducted sensitivity and parametric analyses and found that the number of defects has the most dominant influence on the Young's modulus of the two graphene structures.

  19. Defining the RNA internal loops preferred by benzimidazole derivatives via 2D combinatorial screening and computational analysis.

    PubMed

    Velagapudi, Sai Pradeep; Seedhouse, Steven J; French, Jonathan; Disney, Matthew D

    2011-07-01

    RNA is an important therapeutic target; however, RNA targets are generally underexploited due to a lack of understanding of the small molecules that bind RNA and the RNA motifs that bind small molecules. Herein, we describe the identification of the RNA internal loops derived from a 4096 member 3 × 3 nucleotide loop library that are the most specific and highest affinity binders to a series of four designer, druglike benzimidazoles. These studies establish a potentially general protocol to define the highest affinity and most specific RNA motif targets for heterocyclic small molecules. Such information could be used to target functionally important RNAs in genomic sequence. PMID:21604752

  20. A Computational Methodology to Screen Activities of Enzyme Variants

    PubMed Central

    Hediger, Martin R.; De Vico, Luca; Svendsen, Allan; Besenmatter, Werner; Jensen, Jan H.

    2012-01-01

    We present a fast computational method to efficiently screen enzyme activity. In the presented method, the effect of mutations on the barrier height of an enzyme-catalysed reaction can be computed within 24 hours on roughly 10 processors. The methodology is based on the PM6 and MOZYME methods as implemented in MOPAC2009, and is tested on the first step of the amide hydrolysis reaction catalyzed by the Candida antarctica lipase B (CalB) enzyme. The barrier heights are estimated using adiabatic mapping and are shown to agree to within 3 kcal/mol with B3LYP/6-31G(d)//RHF/3-21G results for a small model system. Relatively strict convergence criteria (0.5 kcal/(mol·Å)), long NDDO cutoff distances within the MOZYME method (15 Å) and single point evaluations using conventional PM6 are needed for reliable results. The generation of mutant structures and subsequent setup of the semiempirical calculations are automated so that the effect on barrier heights can be estimated for hundreds of mutants in a matter of weeks using high performance computing. PMID:23284627
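
    A conceptual sketch of the adiabatic mapping step described above: the reaction coordinate is stepped through fixed values, the remaining degrees of freedom are relaxed at each value, and the barrier is taken as the highest relaxed energy relative to the reactant state. The `constrained_minimize` callable is a hypothetical stand-in for the automated MOPAC/PM6 calls, not a real API.

      import numpy as np

      def adiabatic_barrier(structure, coordinate_values, constrained_minimize):
          """Scan a constrained reaction coordinate and return the barrier height."""
          energies = []
          for r in coordinate_values:
              # hypothetical call: relax everything except the constrained coordinate
              structure, energy = constrained_minimize(structure, reaction_coordinate=r)
              energies.append(energy)
          energies = np.array(energies)
          return energies.max() - energies[0]   # units follow whatever the backend reports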

  1. Microplate based biosensing with a computer screen aided technique.

    PubMed

    Filippini, Daniel; Andersson, Tony P M; Svensson, Samuel P S; Lundström, Ingemar

    2003-10-30

    Melanophores, dark pigment cells from the frog Xenopus laevis, have the ability to change light absorbance upon stimulation by different biological agents. Hormone exposure (e.g. melatonin or alpha-melanocyte stimulating hormone) has been used here as a reversible stimulus to test a new compact microplate reading platform. As an application, the detection of the asthma drug formoterol in blood plasma samples is demonstrated. The present system utilizes a computer screen as a (programmable) large-area light source and a standard web camera as the recording medium, enabling even kinetic microplate reading with a versatile and broadly available platform that suffices to evaluate numerous bioassays. Especially in the context of point-of-care testing or self-testing applications, these possibilities become advantageous compared with highly dedicated and comparatively expensive commercial systems. PMID:14558996

  2. Local finite element enrichment strategies for 2D contact computations and a corresponding post-processing scheme

    NASA Astrophysics Data System (ADS)

    Sauer, Roger A.

    2013-08-01

    Recently an enriched contact finite element formulation has been developed that substantially increases the accuracy of contact computations while keeping the additional numerical effort at a minimum, as reported by Sauer (Int J Numer Meth Eng 87:593-616, 2011). Two enrichment strategies were proposed, one based on local p-refinement using Lagrange interpolation and one based on Hermite interpolation that produces C1-smoothness on the contact surface. Both classes, which were initially considered for the frictionless Signorini problem, are extended here to friction and contact between deformable bodies. For this, a symmetric contact formulation is used that allows the unbiased treatment of both contact partners. This paper also proposes a post-processing scheme for contact quantities like the contact pressure. The scheme, which provides a more accurate representation than the raw data, is based on an averaging procedure that is inspired by mortar formulations. The properties of the enrichment strategies and the corresponding post-processing scheme are illustrated by several numerical examples considering sliding and peeling contact in the presence of large deformations.

  3. Computer simulation of topological evolution in 2-d grain growth using a continuum diffuse-interface field model

    SciTech Connect

    Fan, D.; Geng, C.; Chen, L.Q.

    1997-03-01

    The local kinetics and topological phenomena during normal grain growth were studied in two dimensions by computer simulations employing a continuum diffuse-interface field model. The relationships between topological class and individual grain growth kinetics were examined, and compared with results obtained previously from analytical theories, experimental results and Monte Carlo simulations. It was shown that both the grain-size and grain-shape (side) distributions are time-invariant and the linear relationship between the mean radii of individual grains and topological class n was reproduced. The moments of the shape distribution were determined, and the differences among the data from soap froth, the Potts model, and the present simulation were discussed. In the limit when the grain size goes to zero, the average number of grain edges per grain is shown to be between 4 and 5, implying the direct vanishing of 4- and 5-sided grains, which seems to be consistent with recent experimental observations on thin films. Based on the simulation results, the conditions for the applicability of the familiar Mullins-Von Neumann law and Hillert's equation were discussed.
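
    The familiar Mullins-Von Neumann law referred to above is the standard 2D result that a grain's area changes at a rate set only by its number of sides n (quoted here in its usual textbook form, which the simulation tests rather than derives):

      \frac{dA_n}{dt} = \frac{\pi M \gamma}{3} \, (n - 6),

    so grains with more than six sides grow and grains with fewer than six sides shrink; here M is the grain-boundary mobility and γ the grain-boundary energy.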

  4. Documentation of computer program VS2D to solve the equations of fluid flow in variably saturated porous media

    USGS Publications Warehouse

    Lappala, E.G.; Healy, R.W.; Weeks, E.P.

    1987-01-01

    This report documents FORTRAN computer code for solving problems involving variably saturated single-phase flow in porous media. The flow equation is written with total hydraulic potential as the dependent variable, which allows straightforward treatment of both saturated and unsaturated conditions. The spatial derivatives in the flow equation are approximated by central differences, and time derivatives are approximated either by a fully implicit backward-difference or by a centered-difference scheme. Nonlinear conductance and storage terms may be linearized using either an explicit method or an implicit Newton-Raphson method. Relative hydraulic conductivity is evaluated at cell boundaries by using either full upstream weighting, the arithmetic mean, or the geometric mean of values from adjacent cells. Nonlinear boundary conditions treated by the code include infiltration, evaporation, and seepage faces. Extraction by plant roots that is caused by atmospheric demand is included as a nonlinear sink term. These nonlinear boundary and sink terms are linearized implicitly. The code has been verified against several one-dimensional linear problems for which analytical solutions exist and against two nonlinear problems that have been simulated with other numerical models. A complete listing of data-entry requirements, along with the data entry and results for three example problems, is provided. (USGS)
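
    The flow equation referred to above, written with the total hydraulic potential H = h + z as the dependent variable, takes the familiar head-based Richards form shown below; this is a standard statement consistent with the report's description, not a transcription from it:

      C(h) \, \frac{\partial H}{\partial t}
        = \nabla \cdot \left[ K(h) \, \nabla H \right] - q,

    where C(h) is the specific moisture capacity (augmented by specific storage under saturated conditions), K(h) the hydraulic conductivity, and q the sink term representing evaporation and root extraction; C and K are the nonlinear storage and conductance terms that the code linearizes explicitly or by Newton-Raphson.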

  5. 2-D computer modeling of oil generation and migration in a Transect of the Eastern Venezuela Basin

    SciTech Connect

    Gallango, O. ); Parnaud, F. )

    1993-02-01

    The aim of the study was a two-dimensional computer simulation of the basin evolution based on available geological, geophysical, geochemical, geothermal, and hydrodynamic data, with the main purpose of determining the hydrocarbon generation and migration history. The modeling was done on two geological sections (platform and pre-thrusting) located along the Chacopata-Uverito Transect in the Eastern Venezuelan Basin. In the platform section a hypothetical source rock equivalent to the Guayuta Group was considered in order to simulate the migration of hydrocarbons. The thermal history reconstruction of this hypothetical source rock confirms that it does not reach the oil window before the middle Miocene and that the maturity in this sector is due to the sedimentation of the Freites, La Pica, and Mesa-Las Piedras formations. Oil expulsion and migration from this hypothetical source rock began after middle Miocene time. The expulsion of the hydrocarbons took place mainly along the Oligocene-Miocene reservoir and does not at present reach zones located beyond the Oritupano field, which implies that the oil accumulated in the southern part of the basin was generated by a source rock located to the north, in the present deformation zone. Since 17 m.y. ago, a north-to-south water migration pattern has been observed in this section. In the pre-thrusting section, hydrocarbon expulsion started during the early Tertiary and took place mainly toward the lower Cretaceous (El Cantil and Barranquín formations). At the end of the passive margin period, the main migration occurred across the Merecure reservoir, through which the hydrocarbons migrated toward the Onado sector before thrusting.

  6. Applications of the computer codes FLUX2D and PHI3D for the electromagnetic analysis of compressed magnetic field generators and power flow channels

    SciTech Connect

    Hodgdon, M.L.; Oona, H.; Martinez, A.R.; Salon, S.; Wendling, P.; Krahenbuhl, L.; Nicolas, A.; Nicolas, L.

    1989-01-01

    We present herein the results of three electromagnetic field problems for compressed magnetic field generators and their associated power flow channels. The first problem is the computation of the transient magnetic field in a two-dimensional model of a helical generator during loading. The second problem is the three-dimensional eddy current patterns in a section of an armature beneath a bifurcation point of a helical winding. Our third problem is the calculation of the three-dimensional electrostatic fields in a region known as the post-hole convolute, in which a rod connects the inner and outer walls of a system of three concentric cylinders through a hole in the middle cylinder. While analytic solutions exist for many electromagnetic field problems in cases of special and ideal geometries, the solutions of these and similar problems for the proper analysis and design of compressed magnetic field generators and their related hardware require computer simulations. In earlier studies, computer models have been proposed, several based on research-oriented hydrocodes to which uncoupled or partially coupled Maxwell's equations solvers are added. Although the hydrocode models address the problem of moving, deformable conductors, they are not useful for electromagnetic analysis, nor can they be considered design tools. For our studies, we take advantage of the commercial electromagnetic computer-aided design software packages FLUX2D and PHI3D that were developed for the motor manufacturing and utilities industries. 4 refs., 6 figs.

  7. Electroencephalography (EEG)-based brain-computer interface (BCI): a 2-D virtual wheelchair control based on event-related desynchronization/synchronization and state control.

    PubMed

    Huang, Dandan; Qian, Kai; Fei, Ding-Yu; Jia, Wenchuan; Chen, Xuedong; Bai, Ou

    2012-05-01

    This study aims to propose an effective and practical paradigm for a brain-computer interface (BCI)-based 2-D virtual wheelchair control. The paradigm was based on the multi-class discrimination of the spatiotemporally distinguishable phenomenon of event-related desynchronization/synchronization (ERD/ERS) in electroencephalogram signals associated with motor execution/imagery of right/left hand movement. Compared with the traditional method using ERD only, where bilateral ERDs appear during left/right hand mental tasks, the 2-D control exhibited high accuracy within a short time, as incorporating ERS into the paradigm hypothetically enhanced the spatiotemporal feature contrast of ERS versus ERD. We also expected users to experience ease of control by including a noncontrol state. In this study, the control command was sent discretely whereas the virtual wheelchair was moving continuously. We tested five healthy subjects in a single visit with two sessions, i.e., motor execution and motor imagery. Each session included a 20 min calibration and two sets of games that were less than 30 min. Average target hit rate was as high as 98.4% with motor imagery. Every subject achieved 100% hit rate in the second set of wheelchair control games. The average time to hit a target 10 m away was about 59 s, with 39 s for the best set. The superior control performance in subjects without intensive BCI training suggested a practical wheelchair control paradigm for BCI users. PMID:22498703
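
    A minimal sketch of the ERD/ERS feature underlying the discrimination above: band power in a mu/beta band is compared against a pre-task baseline, with a negative relative change indicating ERD and a positive change indicating ERS. The sampling rate, band edges, and epoch lengths below are illustrative assumptions, not the study's protocol.

      import numpy as np
      from scipy.signal import welch

      FS = 250  # Hz, assumed sampling rate

      def band_power(epoch, fs=FS, band=(8.0, 13.0)):
          """Mean power spectral density of one single-channel EEG epoch within a band."""
          freqs, psd = welch(epoch, fs=fs, nperseg=fs)
          mask = (freqs >= band[0]) & (freqs <= band[1])
          return psd[mask].mean()

      def erd_ers(task_epoch, baseline_epoch, band=(8.0, 13.0)):
          """Relative band-power change: negative = ERD (desynchronization), positive = ERS."""
          p_task = band_power(task_epoch, band=band)
          p_base = band_power(baseline_epoch, band=band)
          return (p_task - p_base) / p_base

      rng = np.random.default_rng(3)
      baseline = rng.normal(size=2 * FS)        # 2 s of synthetic resting EEG on one channel
      task     = 0.5 * rng.normal(size=2 * FS)  # attenuated power mimicking ERD during imagery
      print(f"relative band-power change: {erd_ers(task, baseline):+.2f}")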

  8. Comparative inhibitory potential of selected dietary bioactive polyphenols, phytosterols on CYP3A4 and CYP2D6 with fluorometric high-throughput screening.

    PubMed

    Vijayakumar, Thangavel Mahalingam; Kumar, Ramasamy Mohan; Agrawal, Aruna; Dubey, Govind Prasad; Ilango, Kaliappan

    2015-07-01

    Cytochrome P450 (CYP450) inhibition by the bioactive molecules of dietary supplements or herbal products leads to a greater potential for toxicity of co-administered drugs. The present study aimed to compare the inhibitory potential of selected common dietary bioactive molecules (Gallic acid, Ellagic acid, β-Sitosterol, Stigmasterol, Quercetin and Rutin) on CYP3A4 and CYP2D6, to assess safety through their inhibitory potency and to predict interaction potential with co-administered drugs. A CYP450-CO complex assay was carried out for all the selected dietary bioactive molecules in isolated rat microsomes. The CYP450 concentration of the rat liver microsomes was found to be 0.474 nmol/mg protein; quercetin in DMSO showed the greatest inhibition of CYP450 (51.02 ± 1.24 %), but less than the positive control (79.02 ± 1.61 %). In the high-throughput fluorometric assay, the IC50 values of quercetin (49.08 ± 1.02-54.36 ± 0.85 μg/ml) and gallic acid (78.46 ± 1.32-83.84 ± 1.06 μg/ml) were lower than those of the other bioactive compounds on CYP3A4 and CYP2D6, respectively, but higher than those of the positive controls (6.28 ± 1.76-7.74 ± 1.32 μg/ml). Given this comparison of in vitro inhibitory potential on CYP3A4 and CYP2D6, unrestricted consumption of foods, herbal products, or dietary supplements containing quercetin and gallic acid should be considered carefully when drugs with a narrow therapeutic index are administered together. PMID:26139922
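
    The IC50 values compared above are typically obtained by fitting a four-parameter logistic (Hill) curve to the fluorometric dose-response data; the short fit below is a generic sketch of that step with made-up concentrations and responses, not the study's assay data.

      import numpy as np
      from scipy.optimize import curve_fit

      def hill(conc, bottom, top, ic50, slope):
          """Four-parameter logistic dose-response curve."""
          return bottom + (top - bottom) / (1.0 + (conc / ic50) ** slope)

      # made-up data: % remaining CYP activity vs. inhibitor concentration (ug/ml)
      conc     = np.array([1.0, 3.0, 10.0, 30.0, 100.0, 300.0])
      activity = np.array([98.0, 92.0, 75.0, 48.0, 22.0, 8.0])

      params, _ = curve_fit(hill, conc, activity, p0=[0.0, 100.0, 30.0, 1.0])
      print(f"fitted IC50 = {params[2]:.1f} ug/ml")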

  9. Development of a V79 cell line expressing human cytochrome P450 2D6 and its application as a metabolic screening tool.

    PubMed

    Rauschenbach, R; Gieschen, H; Salomon, B; Kraus, C; Kühne, G; Hildebrand, M

    1997-02-15

    Expression of human cytochrome P450 (CYP) in heterologous cells is a means of specifically studying the role of these enzymes in drug metabolism. The complete cDNA encoding CYP2D6-VAL(374) was inserted into an expression vector containing the strong myeloproliferative sarcoma virus promoter in combination with the enhancer of the cytomegalovirus and stably expressed in V79 Chinese hamster cells. The presence of genomically integrated CYP2D6 cDNA was confirmed by polymerase chain reaction analysis. The protein expression was shown by Western blotting. Functional expression could be demonstrated by O-demethylation of dextromethorphan to dextrorphan in live cells. The enzymatic activity of 154 ± 16 pmol min(-1) mg(-1) protein was comparable with dextromethorphan-O-demethylation activities of human liver. The metabolism of two dopaminergic ergoline derivatives was investigated in whole recombinant V79 cells. Both lisuride and terguride were monodeethylated; in the case of lisuride, a correlation to the in vivo situation was demonstrated by comparing poor and extensive metabolizers. PMID:21781755

  10. CUDA programs for the GPU computing of the Swendsen-Wang multi-cluster spin flip algorithm: 2D and 3D Ising, Potts, and XY models

    NASA Astrophysics Data System (ADS)

    Komura, Yukihiro; Okabe, Yutaka

    2014-03-01

    We present sample CUDA programs for the GPU computing of the Swendsen-Wang multi-cluster spin flip algorithm. We deal with the classical spin models; the Ising model, the q-state Potts model, and the classical XY model. As for the lattice, both the 2D (square) lattice and the 3D (simple cubic) lattice are treated. We already reported the idea of the GPU implementation for 2D models (Komura and Okabe, 2012). We here explain the details of sample programs, and discuss the performance of the present GPU implementation for the 3D Ising and XY models. We also show the calculated results of the moment ratio for these models, and discuss phase transitions. Catalogue identifier: AERM_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AERM_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 5632 No. of bytes in distributed program, including test data, etc.: 14688 Distribution format: tar.gz Programming language: C, CUDA. Computer: System with an NVIDIA CUDA enabled GPU. Operating system: System with an NVIDIA CUDA enabled GPU. Classification: 23. External routines: NVIDIA CUDA Toolkit 3.0 or newer Nature of problem: Monte Carlo simulation of classical spin systems. Ising, q-state Potts model, and the classical XY model are treated for both two-dimensional and three-dimensional lattices. Solution method: GPU-based Swendsen-Wang multi-cluster spin flip Monte Carlo method. The CUDA implementation for the cluster-labeling is based on the work by Hawick et al. [1] and that by Kalentev et al. [2]. Restrictions: The system size is limited depending on the memory of a GPU. Running time: For the parameters used in the sample programs, it takes about a minute for each program. Of course, it depends on the system size, the number of Monte Carlo steps, etc. References: [1] K

  11. Low-dose computed tomography screening for lung cancer in a clinical setting: essential elements of a screening program.

    PubMed

    McKee, Brady J; McKee, Andrea B; Kitts, Andrea Borondy; Regis, Shawn M; Wald, Christoph

    2015-03-01

    The purpose of this article is to review clinical computed tomography (CT) lung screening program elements essential to safely and effectively manage the millions of Americans at high risk for lung cancer expected to enroll in lung cancer screening programs over the next 3 to 5 years. To optimize the potential net benefit of CT lung screening and facilitate medical audits benchmarked to national quality standards, radiologists should interpret these examinations using a validated structured reporting system such as Lung-RADS. Patient and physician educational outreach should be enacted to support an informed and shared decision-making process without creating barriers to screening access. Programs must integrate smoking cessation interventions to maximize the clinical efficacy and cost-effectiveness of screening. At an institutional level, budgets should account for the necessary expense of hiring and/or training qualified support staff and equipping them with information technology resources adequate to enroll and track patients accurately over decades of future screening evaluation. At a national level, planning should begin on ways to accommodate the upcoming increased demand for physician services in fields critical to the success of CT lung screening such as diagnostic radiology and thoracic surgery. Institutions with programs that follow these specifications will be well equipped to meet the significant oncoming demand for CT lung screening services and bestow clinical benefits on their patients equal to or beyond what was observed in the National Lung Screening Trial. PMID:25658476

  12. Short interfering RNA guide strand modifiers from computational screening.

    PubMed

    Onizuka, Kazumitsu; Harrison, Jason G; Ball-Jones, Alexi A; Ibarra-Soza, José M; Zheng, Yuxuan; Ly, Diana; Lam, Walter; Mac, Stephanie; Tantillo, Dean J; Beal, Peter A

    2013-11-13

    Short interfering RNAs (siRNAs) are promising drug candidates for a wide range of targets including those previously considered "undruggable". However, properties associated with the native RNA structure limit drug development, and chemical modifications are necessary. Here we describe the structure-guided discovery of functional modifications for the guide strand 5'-end using computational screening with the high-resolution structure of human Ago2, the key nuclease on the RNA interference pathway. Our results indicate the guide strand 5'-end nucleotide need not engage in Watson-Crick (W/C) H-bonding but must fit the general shape of the 5'-end binding site in MID/PIWI domains of hAgo2 for efficient knockdown. 1,2,3-Triazol-4-yl bases formed from the CuAAC reaction of azides and 1-ethynylribose, which is readily incorporated into RNA via the phosphoramidite, perform well at the guide strand 5'-end. In contrast, purine derivatives with modified Hoogsteen faces or N2 substituents are poor choices for 5'-end modifications. Finally, we identified a 1,2,3-triazol-4-yl base incapable of W/C H-bonding that performs well at guide strand position 12, where base pairing to target was expected to be important. This work expands the repertoire of functional nucleotide analogues for siRNAs. PMID:24152142

  13. Short Interfering RNA Guide Strand Modifiers from Computational Screening

    PubMed Central

    Onizuka, Kazumitsu; Harrison, Jason G.; Ball-Jones, Alexi A.; Ibarra-Soza, José M.; Zheng, Yuxuan; Ly, Diana; Lam, Walter; Mac, Stephanie; Tantillo, Dean J.; Beal, Peter A.

    2013-01-01

    Short interfering RNAs (siRNAs) are promising drug candidates for a wide range of targets including those previously considered “undruggable”. However, properties associated with the native RNA structure limit drug development and chemical modifications are necessary. Here we describe the structure-guided discovery of functional modifications for the guide strand 5’ end using computational screening with the high resolution structure of human Ago2, the key nuclease on the RNA interference pathway. Our results indicate the guide strand 5’-end nucleotide need not engage in Watson-Crick (W/C) H-bonding but must fit the general shape of the 5’-end binding site in MID/PIWI domains of hAgo2 for efficient knockdown. 1,2,3-Triazol-4-yl bases formed from the CuAAC reaction of azides and 1-ethynylribose, which is readily incorporated into RNA via the phosphoramidite, perform well at the guide strand 5’-end. In contrast, purine derivatives with modified Hoogsteen faces or N2 substituents are poor choices for 5’-end modifications. Finally, we identified a 1,2,3-triazol-4-yl base incapable of W/C H-bonding that performs well at guide strand position 12, where base pairing to target was expected to be important. This work expands the repertoire of functional nucleotide analogs for siRNAs. PMID:24152142

  14. A 2D-Computer Model of Atrial Tissue Based on Histographs Describes the Electro-Anatomical Impact of Microstructure on Endocardiac Potentials and Electric Near-Fields

    PubMed Central

    Campos, Fernando O.; Wiener, Thomas; Prassl, Anton J.; Ahammer, Helmut; Plank, Gernot; dos Santos, Rodrigo Weber; Sánchez-Quintana, Damián; Hofer, Ernst

    2014-01-01

    In experiments with cardiac tissue, local conduction is described by waveform analysis of the derivative of the extracellular potential Φ̇e and by the loop morphology of the near-field strength E (the components of the electric field parallel and very close to the tissue surface). The question arises whether the features of these signals can be used to quantify the degree of fibrosis in the heart. A computer model allows us to study the behavior of electric signals at the endocardium with respect to known configurations of microstructure which can not be detected during the electrophysiological experiments. This work presents a 2D-computer model with sub-cellular resolution of atrial micro-conduction in the rabbit heart. It is based on the monodomain equations and digitized histographs from tissue slices obtained post experimentum. It could be shown that excitation spread in densely coupled regions produces uniform and anisotropic conduction. In contrast, zones with parallel fibers separated by uncoupling interstitial space or connective tissue may show uniform or complex signals depending on pacing site. These results suggest that the analysis of Φ̇e and E combined with multi-site pacing could be used to characterize the type and the size of fibrosis. PMID:21096441
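
    The monodomain equations mentioned above are quoted here in their standard form (the record does not state which membrane model supplies the ionic current):

      \beta \left( C_m \frac{\partial V_m}{\partial t} + I_{\mathrm{ion}}(V_m, \mathbf{s}) \right)
        = \nabla \cdot \left( \boldsymbol{\sigma} \, \nabla V_m \right),

    where V_m is the transmembrane potential, β the surface-to-volume ratio, C_m the membrane capacitance, I_ion the ionic current of the membrane model with state variables s, and σ the conductivity tensor; in the model above, the digitized histographs presumably set the local conductivity pattern, including the uncoupling interstitial and connective-tissue regions.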

  15. Icarus: A 2D direct simulation Monte Carlo (DSMC) code for parallel computers. User`s manual - V.3.0

    SciTech Connect

    Bartel, T.; Plimpton, S.; Johannes, J.; Payne, J.

    1996-10-01

    Icarus is a 2D Direct Simulation Monte Carlo (DSMC) code which has been optimized for the parallel computing environment. The code is based on the DSMC method of Bird and models from free-molecular to continuum flowfields in either cartesian (x, y) or axisymmetric (z, r) coordinates. Computational particles, representing a given number of molecules or atoms, are tracked as they have collisions with other particles or surfaces. Multiple species, internal energy modes (rotation and vibration), chemistry, and ion transport are modelled. A new trace species methodology for collisions and chemistry is used to obtain statistics for small species concentrations. Gas phase chemistry is modelled using steric factors derived from Arrhenius reaction rates. Surface chemistry is modelled with surface reaction probabilities. The electron number density is either a fixed external generated field or determined using a local charge neutrality assumption. Ion chemistry is modelled with electron impact chemistry rates and charge exchange reactions. Coulomb collision cross-sections are used instead of Variable Hard Sphere values for ion-ion interactions. The electrostatic fields can either be externally input or internally generated using a Langmuir-Tonks model. The Icarus software package includes the grid generation, parallel processor decomposition, postprocessing, and restart software. The commercial graphics package, Tecplot, is used for graphics display. The majority of the software packages are written in standard Fortran.

  16. Combination of transient 2D-IR experiments and ab initio computations sheds light on the formation of the charge-transfer state in photoexcited carbonyl carotenoids.

    PubMed

    Di Donato, Mariangela; Segado Centellas, Mireia; Lapini, Andrea; Lima, Manuela; Avila, Francisco; Santoro, Fabrizio; Cappelli, Chiara; Righini, Roberto

    2014-08-14

    The excited state dynamics of carbonyl carotenoids is very complex because of the coupling of single- and doubly excited states and the possible involvement of intramolecular charge-transfer (ICT) states. In this contribution we employ ultrafast infrared spectroscopy and theoretical computations to investigate the relaxation dynamics of trans-8'-apo-β-carotenal occurring on the picosecond time scale, after excitation in the S2 state. In a (slightly) polar solvent like chloroform, one-dimensional (T1D-IR) and two-dimensional (T2D-IR) transient infrared spectroscopy reveal spectral components with characteristic frequencies and lifetimes that are not observed in nonpolar solvents (cyclohexane). Combining experimental evidence with an analysis of CASPT2//CASSCF ground and excited state minima and energy profiles, complemented with TDDFT calculations in gas phase and in solvent, we propose a photochemical decay mechanism for this system where only the bright single-excited 1Bu(+) and the dark double-excited 2Ag(-) states are involved. Specifically, the initially populated 1Bu(+) relaxes toward 2Ag(-) in 200 fs. In a nonpolar solvent 2Ag(-) decays to the ground state (GS) in 25 ps. In polar solvents, distortions along twisting modes of the chain promote a repopulation of the 1Bu(+) state which then quickly relaxes to the GS (18 ps in chloroform). The 1Bu(+) state has a high electric dipole and is the main contributor to the charge-transfer state involved in the dynamics in polar solvents. The 2Ag(-) → 1Bu(+) population transfer is evidenced by a cross peak on the T2D-IR map revealing that the motions along the same stretching of the conjugated chain on the 2Ag(-) and 1Bu(+) states are coupled. PMID:25050938

  17. Virtual screening in drug discovery -- a computational perspective.

    PubMed

    Reddy, A Srinivas; Pati, S Priyadarshini; Kumar, P Praveen; Pradeep, H N; Sastry, G Narahari

    2007-08-01

    Virtual screening has emerged as an important tool in our quest to access novel drug-like compounds. There is a wide range of comparable and contrasting methodological protocols available for screening databases for lead compounds. The number of methods and software packages which employ target- and ligand-based virtual screening is increasing at a rapid pace. However, the general understanding of the applicability and limitations of these methodologies is not emerging as fast as the development of the various methods. Therefore, it is extremely important to compare and contrast various protocols with practical examples to gauge the strength and applicability of the various methods. The review provides a comprehensive appraisal of several of the virtual screening methods available to date. Recent developments in docking and similarity-based methods are discussed, alongside descriptor selection and pharmacophore-based searching. The review touches upon the application of statistical and graph-theory-based methods and machine learning tools in virtual screening and combinatorial library design. Finally, several case studies are presented where virtual screening technology has been applied successfully. A critical analysis of these case studies provides a good platform to estimate the applicability of various virtual screening methods in new lead identification and optimization. PMID:17696867

  18. Screening for lung cancer with low-dose computed tomography: a review of current status

    PubMed Central

    Bowman, Rayleen V.; Yang, Ian A.; Fong, Kwun M.; Berg, Christine D.

    2013-01-01

    Screening using low-dose computed tomography (CT) represents an exciting new development in the struggle to improve outcomes for people with lung cancer. Randomised controlled evidence demonstrating a 20% relative lung cancer mortality benefit has led to endorsement of screening by several expert bodies in the US and funding by healthcare providers. Despite this pivotal result, many questions remain regarding technical and logistical aspects of screening, cost-effectiveness and generalizability to other settings. This review discusses the rationale behind screening, the results of on-going trials, potential harms of screening and current knowledge gaps. PMID:24163745

  19. Computed Tomography Screening for Lung Cancer in the National Lung Screening Trial

    PubMed Central

    Black, William C.

    2016-01-01

    The National Lung Screening Trial (NLST) demonstrated that screening with low-dose CT versus chest radiography reduced lung cancer mortality by 16% to 20%. More recently, a cost-effectiveness analysis (CEA) of CT screening for lung cancer versus no screening in the NLST was performed. The CEA conformed to the reference-case recommendations of the US Panel on Cost-Effectiveness in Health and Medicine, including the use of the societal perspective and an annual discount rate of 3%. The CEA was based on several important assumptions. In this paper, I review the methods and assumptions used to obtain the base case estimate of $81,000 per quality-adjusted life-year gained. In addition, I show how this estimate varied widely among different subsets and when some of the base case assumptions were changed and speculate on the cost-effectiveness of CT screening for lung cancer outside the NLST. PMID:25635704

  20. Image fusion of Ultrasound Computer Tomography volumes with X-ray mammograms using a biomechanical model based 2D/3D registration.

    PubMed

    Hopp, T; Duric, N; Ruiter, N V

    2015-03-01

    Ultrasound Computer Tomography (USCT) is a promising breast imaging modality under development. Comparison to a standard method like mammography is essential for further development. Due to significant differences in image dimensionality and compression state of the breast, correlating USCT images and X-ray mammograms is challenging. In this paper we present a 2D/3D registration method to improve the spatial correspondence and allow direct comparison of the images. It is based on biomechanical modeling of the breast and simulation of the mammographic compression. We investigate the effect of including patient-specific material parameters estimated automatically from USCT images. The method was systematically evaluated using numerical phantoms and in-vivo data. The average registration accuracy using the automated registration was 11.9 mm. Based on the registered images, a method for analysis of the diagnostic value of the USCT images was developed and initially applied to analyze sound speed and attenuation images with X-ray mammograms as ground truth. Combining sound speed and attenuation allows differentiating lesions from surrounding tissue. Overlaying this information on mammograms combines quantitative and morphological information for multimodal diagnosis. PMID:25456144

  1. VIBA-Lab 3.0: Computer program for simulation and semi-quantitative analysis of PIXE and RBS spectra and 2D elemental maps

    NASA Astrophysics Data System (ADS)

    Orlić, Ivica; Mekterović, Darko; Mekterović, Igor; Ivošević, Tatjana

    2015-11-01

    VIBA-Lab is a computer program originally developed by the author and co-workers at the National University of Singapore (NUS) as an interactive software package for the simulation of Particle Induced X-ray Emission and Rutherford Backscattering spectra. The original program has been redeveloped into VIBA-Lab 3.0, in which the user can perform semi-quantitative analysis by comparing simulated and measured spectra as well as simulate 2D elemental maps for a given 3D sample composition. The latest version has a new and more versatile user interface. It also has the latest data set of fundamental parameters, such as Coster-Kronig transition rates, fluorescence yields, mass absorption coefficients and ionization cross sections for K and L lines, in a wider energy range than the original program. Our short-term plan is to introduce a routine for quantitative analysis for multiple PIXE and XRF excitations. VIBA-Lab is an excellent teaching tool for students and researchers learning to use PIXE and RBS techniques. At the same time the program helps when planning an experiment and when optimizing experimental parameters such as incident ions, their energy, detector specifications, filters, geometry, etc. By "running" a virtual experiment the user can test various scenarios until the optimal PIXE and BS spectra are obtained and in this way save a lot of expensive machine time.

  2. Elastic Deformations in 2D van der Waals Heterostructures and their Impact on Optoelectronic Properties: Predictions from a Multiscale Computational Approach

    NASA Astrophysics Data System (ADS)

    Kumar, Hemant; Er, Dequan; Dong, Liang; Li, Junwen; Shenoy, Vivek B.

    2015-06-01

    Recent technological advances in the isolation and transfer of different 2-dimensional (2D) materials have led to renewed interest in stacked van der Waals (vdW) heterostructures. Interlayer interactions and lattice mismatch between two different monolayers cause elastic strains, which significantly affect their electronic properties. Using a multiscale computational method, we demonstrate that significant in-plane strains and out-of-plane displacements are introduced in three different bilayer structures, namely graphene-hBN, MoS2-WS2 and MoSe2-WSe2, due to interlayer interactions, which can cause bandgap changes of up to ~300 meV. Furthermore, the magnitude of the elastic deformations can be controlled by changing the relative rotation angle between the two layers. The magnitude of the out-of-plane displacements in graphene agrees well with that observed in experiments and can explain the experimentally observed bandgap opening in graphene. Upon increasing the relative rotation angle between the two lattices from 0° to 10°, the magnitude of the out-of-plane displacements decreases, while the in-plane strain peaks when the angle is ~6°. For large misorientation angles (>10°), the out-of-plane displacements become negligible. We further predict the deformation fields for MoS2-WS2 and MoSe2-WSe2 heterostructures that have been recently synthesized experimentally and estimate the effect of these deformation fields on near-gap states.

  3. A 2-D spectral-element method for computing spherical-earth seismograms-II. Waves in solid-fluid media

    NASA Astrophysics Data System (ADS)

    Nissen-Meyer, Tarje; Fournier, Alexandre; Dahlen, F. A.

    2008-09-01

    We present a dedicated spectral-element method to solve the elastodynamic wave equation in spherically symmetric earth models at the computational cost of a 2-D domain. Using this method, 3-D wavefields of arbitrary resolution may be computed to obtain Fréchet sensitivity kernels, especially for diffracted arrivals. The meshing process is presented for varying frequencies in terms of its efficiency as measured by the total number of elements, their spacing variations and stability criteria. We assess the mesh quantitatively by defining these numerical parameters in a general non-dimensionalized form such that comparisons to other grid-based methods are straightforward. Efficient mesh generation for the PREM example and a minimum-messaging domain decomposition and parallelization strategy lay the foundations for waveforms up to frequencies of 1 Hz on moderate PC clusters. The discretization of fluid, solid and respective boundary regions is similar to previous spectral-element implementations, save for a fluid potential formulation that incorporates the density, thereby yielding identical boundary terms on the fluid and solid sides. We compare the second-order Newmark time extrapolation scheme with a newly implemented fourth-order symplectic scheme and argue in favour of the latter in cases of propagation over many wavelengths, due to drastic accuracy improvements. Various validation examples such as full moment-tensor seismograms, wavefield snapshots, and energy conservation illustrate the favourable behaviour and potential of the method.
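
    The fourth-order symplectic time scheme favoured above can be illustrated independently of the elastodynamic solver. The following is a minimal sketch, assuming a standard Yoshida/Forest-Ruth composition applied to a single harmonic oscillator; the coefficients are textbook values and nothing here is taken from the paper's implementation. The point of the example is the property argued for above: the energy error stays bounded over long propagation times.

      import numpy as np

      def yoshida4_step(q, p, dt, omega2):
          """One 4th-order symplectic (Yoshida/Forest-Ruth) step for q'' = -omega2*q."""
          w1 = 1.0 / (2.0 - 2.0 ** (1.0 / 3.0))
          w0 = -(2.0 ** (1.0 / 3.0)) * w1
          c = [w1 / 2.0, (w0 + w1) / 2.0, (w0 + w1) / 2.0, w1 / 2.0]  # drift coefficients
          d = [w1, w0, w1]                                            # kick coefficients
          for i in range(3):
              q = q + c[i] * dt * p            # drift
              p = p - d[i] * dt * omega2 * q   # kick (force = -omega2*q, unit mass)
          q = q + c[3] * dt * p                # final drift
          return q, p

      # Propagate over many periods; a symplectic scheme keeps the energy error bounded.
      q, p, omega2, dt = 1.0, 0.0, 1.0, 0.1
      for _ in range(10000):
          q, p = yoshida4_step(q, p, dt, omega2)
      print("energy error:", abs(0.5 * p**2 + 0.5 * omega2 * q**2 - 0.5))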

  4. The New Screen Time: Computers, Tablets, and Smartphones Enter the Equation

    ERIC Educational Resources Information Center

    Wiles, Bradford B.; Schachtner, Laura; Pentz, Julie L.

    2016-01-01

    Emerging technologies attract children and push parents' and caregivers' abilities to attend to their families. This article presents recommendations related to the new version of screen time, which includes time with computers, tablets, and smartphones. Recommendations are provided for screen time for very young children and those in middle and…

  5. DockScreen: A database of in silico biomolecular interactions to support computational toxicology

    EPA Science Inventory

    We have developed DockScreen, a database of in silico biomolecular interactions designed to enable rational molecular toxicological insight within a computational toxicology framework. This database is composed of chemical/target (receptor and enzyme) binding scores calculated by...

  6. COMPUTATIONAL TOXICOLOGY - OBJECTIVE 2: DEVELOPING APPROACHES FOR PRIORITIZING CHEMICALS FOR SUBSEQUENT SCREENING AND TESTING

    EPA Science Inventory

    One of the strategic objectives of the Computational Toxicology Program is to develop approaches for prioritizing chemicals for subsequent screening and testing. Approaches currently available for this process require extensive resources. Therefore, less costly and time-extensi...

  7. High-throughput screening, predictive modeling and computational embryology

    EPA Science Inventory

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to profile thousands of chemical compounds for biological activity and potential toxicity. EPA’s ToxCast™ project, and the broader Tox21 consortium, in addition to projects worldwide,...

  8. High-throughput screening, predictive modeling and computational embryology - Abstract

    EPA Science Inventory

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to chemical profiling to address sensitivity and specificity of molecular targets, biological pathways, cellular and developmental processes. EPA’s ToxCast project is testing 960 uniq...

  9. Automatic multimodal 2D/3D image fusion of ultrasound computer tomography and x-ray mammography for breast cancer diagnosis

    NASA Astrophysics Data System (ADS)

    Hopp, Torsten; Duric, Neb; Ruiter, Nicole V.

    2012-03-01

    Breast cancer is the most common cancer among women. The established screening method to detect breast cancer at an early stage is X-ray mammography. However, X-ray imaging frequently provides limited contrast for tumors located within glandular tissue. A new imaging approach is Ultrasound Computer Tomography (USCT), which generates three-dimensional volumes of the breast. Three different images are available: reflectivity, attenuation and speed of sound. The correlation of USCT volumes with X-ray mammograms is of interest for evaluation of the new imaging modality as well as for multimodal diagnosis. Yet, both modalities differ in image dimensionality, patient positioning and deformation state of the breast. In earlier work we proposed a methodology based on the Finite Element Method to register speed of sound images with the corresponding mammogram. In this work, we enhanced the methodology to register all three image types provided by USCT. Furthermore, the methodology is now completely automated, using image similarity measures to estimate rotations in the datasets. A fusion methodology is proposed which combines the information of the three USCT image types with the X-ray mammogram via semitransparent overlay images. The evaluation was done using 13 datasets from a clinical study. The registration accuracy was measured by the displacement of the center of a lesion marked in both modalities. Using the automated rotation estimation, a mean displacement of 10.4 mm was achieved. Due to the clinically relevant registration accuracy, the methodology provides a basis for evaluation of the new imaging device USCT as well as for multimodal diagnosis.
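
    As an illustration of the automated rotation estimation mentioned above, the sketch below scores candidate in-plane rotations of a 2D image against a reference using a histogram-based mutual information measure and keeps the best one. The function names, the exhaustive angle search and the bin count are assumptions for illustration only, not the authors' implementation.

      import numpy as np
      from scipy import ndimage

      def mutual_information(a, b, bins=32):
          """Histogram-based mutual information between two equally sized images."""
          hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
          pxy = hist / hist.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

      def estimate_rotation(moving, reference, angles=np.arange(-30.0, 31.0, 1.0)):
          """Pick the in-plane rotation angle that maximises image similarity."""
          scores = [mutual_information(ndimage.rotate(moving, a, reshape=False), reference)
                    for a in angles]
          return angles[int(np.argmax(scores))]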

  10. The Use of Geometric Properties of 2D Arrays across Development

    ERIC Educational Resources Information Center

    Gibson, Brett M.; Leichtman, Michelle D.; Costa, Rachel; Bemis, Rhyannon

    2009-01-01

    Four- to 10-year-old children (n = 50) participated in a 2D search task that included geometry (with- and without lines) and feature conditions. During each of 27 trials, participants watched as a cartoon character hid behind one of three landmarks arranged in a triangle on a computer screen. During feature condition trials, participants could use…

  11. GRID2D/3D: A computer program for generating grid systems in complex-shaped two- and three-dimensional spatial domains. Part 2: User's manual and program listing

    NASA Technical Reports Server (NTRS)

    Bailey, R. T.; Shih, T. I.-P.; Nguyen, H. L.; Roelke, R. J.

    1990-01-01

    An efficient computer program, called GRID2D/3D, was developed to generate single and composite grid systems within geometrically complex two- and three-dimensional (2- and 3-D) spatial domains that can deform with time. GRID2D/3D generates single grid systems by using algebraic grid generation methods based on transfinite interpolation in which the distribution of grid points within the spatial domain is controlled by stretching functions. All single grid systems generated by GRID2D/3D can have grid lines that are continuous and differentiable everywhere up to the second-order. Also, grid lines can intersect boundaries of the spatial domain orthogonally. GRID2D/3D generates composite grid systems by patching together two or more single grid systems. The patching can be discontinuous or continuous. For continuous composite grid systems, the grid lines are continuous and differentiable everywhere up to the second-order except at interfaces where different single grid systems meet. At interfaces where different single grid systems meet, the grid lines are only differentiable up to the first-order. For 2-D spatial domains, the boundary curves are described by using either cubic or tension spline interpolation. For 3-D spatial domains, the boundary surfaces are described by using either linear Coon's interpolation, bi-hyperbolic spline interpolation, or a new technique referred to as 3-D bi-directional Hermite interpolation. Since grid systems generated by algebraic methods can have grid lines that overlap one another, GRID2D/3D contains a graphics package for evaluating the grid systems generated. With the graphics package, the user can generate grid systems in an interactive manner with the grid generation part of GRID2D/3D. GRID2D/3D is written in FORTRAN 77 and can be run on any IBM PC, XT, or AT compatible computer. In order to use GRID2D/3D on workstations or mainframe computers, some minor modifications must be made in the graphics part of the program; no
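
    A minimal sketch of the algebraic grid generation idea described above is given below: bilinear transfinite (Coons) interpolation of interior grid points from four boundary curves. It omits the stretching functions and composite patching that GRID2D/3D provides and is written in Python rather than FORTRAN 77, purely for illustration.

      import numpy as np

      def transfinite_grid(bottom, top, left, right):
          """Bilinear transfinite (Coons) interpolation of a structured grid from four
          boundary curves, each an (n, 2) array of (x, y) points. bottom/top run along
          the i index, left/right along the j index, and the curves share corners."""
          ni, nj = len(bottom), len(left)
          s = np.linspace(0.0, 1.0, ni)[:, None, None]   # i parameter
          t = np.linspace(0.0, 1.0, nj)[None, :, None]   # j parameter
          b, tp = bottom[:, None, :], top[:, None, :]
          le, ri = left[None, :, :], right[None, :, :]
          grid = ((1 - t) * b + t * tp                   # blend bottom and top curves
                  + (1 - s) * le + s * ri                # blend left and right curves
                  - (1 - s) * (1 - t) * bottom[0] - s * (1 - t) * bottom[-1]   # subtract
                  - (1 - s) * t * top[0] - s * t * top[-1])                    # corners
          return grid[..., 0], grid[..., 1]              # x(i,j), y(i,j)

      # Example: unit square with a curved top boundary.
      i = np.linspace(0, 1, 21); j = np.linspace(0, 1, 11)
      bottom = np.column_stack([i, np.zeros_like(i)])
      top    = np.column_stack([i, 1.0 + 0.1 * np.sin(np.pi * i)])
      left   = np.column_stack([np.zeros_like(j), j])
      right  = np.column_stack([np.ones_like(j), j])
      x, y = transfinite_grid(bottom, top, left, right)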

  12. Accommodative and convergence response to computer screen and printed text

    NASA Astrophysics Data System (ADS)

    Ferreira, Andreia; Lira, Madalena; Franco, Sandra

    2011-05-01

    The aim of this work was to find out if differences exist in accommodative and convergence response for different computer monitors' and a printed text. It was also tried to relate the horizontal heterophoria value and accommodative response with the symptoms associated with computer use. Two independents experiments were carried out in this study. The first experiment was measuring the accommodative response on 89 subjects using the Grand Seiko WAM-5500 (Grand Seiko Co., Ltd., Japan). The accommodative response was measured using three computer monitors: a 17-inch cathode ray tube (CRT), two liquid crystal displays LCDs, one 17-inch (LCD17) and one 15 inches (LCD15) and a printed text. The text displayed was always the same for all the subjects and tests. A second experiment aimed to measure the value of habitual horizontal heterophoria on 80 subjects using the Von Graefe technique. The measurements were obtained using the same target presented on two different computer monitors, one 19-inch cathode ray tube (CRT) and other 19 inches liquid crystal displays (LCD) and printed on paper. A small survey about the incidence and prevalence of symptoms was performed similarly in both experiments. In the first experiment, the accommodation response was higher in the CRT and LCD's than for paper. There were not found significantly different response for both LCD monitors'. The second experiment showed that, the heterophoria values were similar for all the stimuli. On average, participants presented a small exophoria. In both experiments, asthenopia was the symptom that presented higher incidence. There are different accommodative responses when reading on paper or on computer monitors. This difference is more significant for CRT monitors. On the other hand, there was no difference in the values of convergence for the computer monitors' and paper. The symptoms associated with the use of computers are not related with the increase in accommodation and with the horizontal

  13. Protein engineering by highly parallel screening of computationally designed variants.

    PubMed

    Sun, Mark G F; Seo, Moon-Hyeong; Nim, Satra; Corbi-Verge, Carles; Kim, Philip M

    2016-07-01

    Current combinatorial selection strategies for protein engineering have been successful at generating binders against a range of targets; however, the combinatorial nature of the libraries and their vast undersampling of sequence space inherently limit these methods due to the difficulty in finely controlling protein properties of the engineered region. Meanwhile, great advances in computational protein design that can address these issues have largely been underutilized. We describe an integrated approach that computationally designs thousands of individual protein binders for high-throughput synthesis and selection to engineer high-affinity binders. We show that a computationally designed library enriches for tight-binding variants by many orders of magnitude as compared to conventional randomization strategies. We thus demonstrate the feasibility of our approach in a proof-of-concept study and successfully obtain low-nanomolar binders using in vitro and in vivo selection systems. PMID:27453948

  14. Protein engineering by highly parallel screening of computationally designed variants

    PubMed Central

    Sun, Mark G. F.; Seo, Moon-Hyeong; Nim, Satra; Corbi-Verge, Carles; Kim, Philip M.

    2016-01-01

    Current combinatorial selection strategies for protein engineering have been successful at generating binders against a range of targets; however, the combinatorial nature of the libraries and their vast undersampling of sequence space inherently limit these methods due to the difficulty in finely controlling protein properties of the engineered region. Meanwhile, great advances in computational protein design that can address these issues have largely been underutilized. We describe an integrated approach that computationally designs thousands of individual protein binders for high-throughput synthesis and selection to engineer high-affinity binders. We show that a computationally designed library enriches for tight-binding variants by many orders of magnitude as compared to conventional randomization strategies. We thus demonstrate the feasibility of our approach in a proof-of-concept study and successfully obtain low-nanomolar binders using in vitro and in vivo selection systems. PMID:27453948

  15. Computer Simulation Study of Graphene Oxide Supercapacitors: Charge Screening Mechanism.

    PubMed

    Park, Sang-Won; DeYoung, Andrew D; Dhumal, Nilesh R; Shim, Youngseon; Kim, Hyung J; Jung, YounJoon

    2016-04-01

    Graphene oxide supercapacitors in the parallel plate configuration are studied via molecular dynamics (MD) simulations. The full range of electrode oxidation from 0 to 100% is examined by oxidizing the graphene surface with hydroxyl groups. Two different electrolytes, 1-ethyl-3-methylimidazolium tetrafluoroborate (EMI(+)BF4(-)) as an ionic liquid and its 1.3 M solution in acetonitrile as an organic electrolyte, are considered. While the area-specific capacitance tends to decrease with increasing electrode oxidation for both electrolytes, its details show interesting differences between the organic electrolyte and ionic liquid, including the extent of decrease. For detailed insight into these differences, the screening mechanisms of electrode charges by electrolytes and their variations with electrode oxidation are analyzed with special attention paid to the aspects shared by and the contrasts between the organic electrolyte and ionic liquid. PMID:26966918

  16. School Students and Computer Games with Screen Violence

    ERIC Educational Resources Information Center

    Fedorov, A. V.

    2005-01-01

    In this article, the author describes how, these days, school students from low-income strata of the population in Russia spend hours sitting in computer rooms and Internet clubs, where, for a relatively small fee, they can play interactive video games. To determine what games they prefer, the author conducted a content analysis of eighty-seven…

  17. Staring 2-D Hadamard transform spectral imager

    DOEpatents

    Gentry, Stephen M.; Wehlburg, Christine M.; Wehlburg, Joseph C.; Smith, Mark W.; Smith, Jody L.

    2006-02-07

    A staring imaging system inputs a 2D spatial image containing multi-frequency spectral information. This image is encoded in one dimension with a cyclic Hadamard S-matrix. The resulting image is detected with a spatial 2D detector, and a computer applies a Hadamard transform to recover the encoded image.
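
    The encode/decode cycle described above can be sketched with a small S-matrix. The example below builds an S-matrix from a Sylvester Hadamard matrix (the patent uses a cyclic S-matrix mask, but the inversion formula is the same), encodes one spectral column and recovers it with the standard S-matrix inverse; all sizes and data are illustrative.

      import numpy as np

      def sylvester_hadamard(m):
          """Hadamard matrix of order 2**m by the Sylvester construction."""
          H = np.array([[1]])
          for _ in range(m):
              H = np.block([[H, H], [H, -H]])
          return H

      def s_matrix(m):
          """n x n S-matrix (n = 2**m - 1): drop the first row/column of a
          normalized Hadamard matrix and map +1 -> 0, -1 -> 1."""
          H = sylvester_hadamard(m)
          return ((1 - H[1:, 1:]) // 2).astype(float)

      S = s_matrix(3)                            # n = 7 spectral channels
      n = S.shape[0]
      spectrum = np.random.rand(n)               # one spatial pixel's spectral column
      measured = S @ spectrum                    # multiplexed (encoded) measurements
      S_inv = (2.0 / (n + 1)) * (2.0 * S.T - np.ones((n, n)))
      recovered = S_inv @ measured               # inverse S-matrix ("Hadamard") transform
      assert np.allclose(recovered, spectrum)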

  18. Lung Cancer Screening with Low-Dose Computed Tomography for Primary Care Providers

    PubMed Central

    Richards, Thomas B.; White, Mary C.; Caraballo, Ralph S.

    2015-01-01

    This review provides an update on lung cancer screening with low-dose computed tomography (LDCT) and its implications for primary care providers. One of the unique features of lung cancer screening is the potential complexity in patient management if an LDCT scan reveals a small pulmonary nodule. Additional tests, consultation with multiple specialists, and follow-up evaluations may be needed to evaluate whether lung cancer is present. Primary care providers should know the resources available in their communities for lung cancer screening with LDCT and smoking cessation, and the key points to be addressed in informed and shared decision-making discussions with patients. PMID:24830610

  19. Automatic classification of pulmonary peri-fissural nodules in computed tomography using an ensemble of 2D views and a convolutional neural network out-of-the-box.

    PubMed

    Ciompi, Francesco; de Hoop, Bartjan; van Riel, Sarah J; Chung, Kaman; Scholten, Ernst Th; Oudkerk, Matthijs; de Jong, Pim A; Prokop, Mathias; van Ginneken, Bram

    2015-12-01

    In this paper, we tackle the problem of automatic classification of pulmonary peri-fissural nodules (PFNs). The classification problem is formulated as a machine learning approach, where detected nodule candidates are classified as PFNs or non-PFNs. Supervised learning is used, where a classifier is trained to label the detected nodule. The classification of the nodule in 3D is formulated as an ensemble of classifiers trained to recognize PFNs based on 2D views of the nodule. In order to describe nodule morphology in 2D views, we use the output of a pre-trained convolutional neural network known as OverFeat. We compare our approach with a recently presented descriptor of pulmonary nodule morphology, namely Bag of Frequencies, and illustrate the advantages offered by the two strategies, achieving performance of AUC = 0.868, which is close to that of human experts. PMID:26458112

  20. Computational screening of oxetane monomers for novel hydroxy terminated polyethers.

    PubMed

    Sarangapani, Radhakrishnan; Ghule, Vikas D; Sikder, Arun K

    2014-06-01

    Energetic hydroxy-terminated polyether prepolymers are of paramount importance in the search for energetic binders for propellant applications. In the present study, density functional theory (DFT) has been employed to screen various novel energetic oxetane derivatives, which typically form the backbone of these energetic polymers. Molecular structures were investigated at the B3LYP/6-31G* level, and isodesmic reactions were designed for calculating the gas-phase heats of formation. The condensed-phase heats of formation for the designed compounds were calculated by the Politzer approach using heats of sublimation. Among the designed oxetane derivatives, T4 and T5 possess condensed-phase heats of formation above 210 kJ mol(-1). The crystal packing density of the designed oxetane derivatives varied from 1.2 to 1.6 g/cm(3). The detonation velocities and pressures were evaluated using the Kamlet-Jacobs equations, utilizing the predicted densities and condensed-phase heats of formation. It was found that most of the designed oxetane derivatives have detonation performance comparable to the monomers of benchmark energetic polymers, viz. NIMMO, AMMO, and BAMO. The strain energies (SE) of the oxetane derivatives were calculated using homodesmotic reactions, while intramolecular group interactions were predicted through the disproportionation energies. The concept of chemical hardness is used to analyze the susceptibility of the designed compounds to reactivity and chemical transformations. The heats of formation, density, and predicted performance imply that the designed molecules are expected to be candidates for polymer synthesis and potential molecules for energetic binders. PMID:24863529
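
    For reference, the Kamlet-Jacobs estimates mentioned above take the following form; the sketch below uses the commonly cited coefficients and placeholder inputs of roughly RDX-like magnitude, not values from this study.

      import math

      def kamlet_jacobs(rho, N, Mbar, Q):
          """Kamlet-Jacobs estimates of detonation velocity D (km/s) and pressure P (GPa).

          rho  : loading density (g/cm^3)
          N    : moles of gaseous detonation products per gram of explosive
          Mbar : mean molecular mass of those gaseous products (g/mol)
          Q    : heat of detonation (cal/g)
          """
          phi = N * math.sqrt(Mbar) * math.sqrt(Q)
          D = 1.01 * math.sqrt(phi) * (1.0 + 1.30 * rho)
          P = 1.558 * rho ** 2 * phi
          return D, P

      # Illustrative numbers only, not values from the paper.
      D, P = kamlet_jacobs(rho=1.80, N=0.034, Mbar=27.2, Q=1500.0)
      print(f"D = {D:.2f} km/s, P = {P:.1f} GPa")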

  1. Computational screening of organic materials towards improved photovoltaic properties

    NASA Astrophysics Data System (ADS)

    Dai, Shuo; Olivares-Amaya, Roberto; Amador-Bedolla, Carlos; Aspuru-Guzik, Alan; Borunda, Mario

    2015-03-01

    The world today faces an energy crisis that is an obstruction to the development of human civilization. One of the most promising solutions is solar energy harvested by economical solar cells. As the third generation of solar cell materials, organic photovoltaic (OPV) materials are now under active development from both theoretical and experimental points of view. In this study, we constructed a parameter to select desirable molecules based on their optical spectra. We applied it to investigate a large collection of potential OPV materials drawn from the CEPDB database set up by the Harvard Clean Energy Project. Time-dependent density functional theory (TD-DFT) modeling was used to calculate the absorption spectra of the molecules. Based on this parameter, we then screened out the top-performing molecules for potential OPV usage and suggested experimental efforts toward their synthesis. In addition, from those molecules we summarized the functional groups that give molecules particular spectral capabilities. It is hoped that useful information can be mined to provide hints for the molecular design of OPV materials.

  2. Threshold perception performance with computed and screen-film radiography: implications for chest radiography.

    PubMed

    Dobbins, J T; Rice, J J; Beam, C A; Ravin, C E

    1992-04-01

    Images of a phantom obtained with computed radiography and standard screen-film imaging were compared to evaluate observer threshold perception performance with a modified contrast-detail technique. Optimum exposure necessary for performance with the imaging plate technique to match that with screen-film techniques was determined, as was comparative performance with variation in kilovoltages, plate type, spatial enhancement, and hard-copy interpolation method. It was found that computed radiography necessitates about 75%-100% more exposure than screen-film radiography to optimally match performance with Ortho-C film with Lanex regular or medium screens (Eastman Kodak, Rochester, NY) for detection of objects 0.05-2.0 cm in diameter. However, only minimal loss of detection performance (approximately 10% overall) was experienced if standard screen-film exposures were used with computed radiography. Little change in observer performance was found with variation in plate type, spatial enhancement, or method of hard-copy interpolation. However, perception performance with computed radiographic images was better at lower kilovoltages. PMID:1549669

  3. The importance of lung cancer screening with low-dose computed tomography for Medicare beneficiaries.

    PubMed

    Wood, Douglas E

    2014-12-01

    The National Lung Screening Trial has provided convincing evidence of a substantial mortality benefit of lung cancer screening with low-dose computed tomography (CT) for current and former smokers at high risk. The United States Preventive Services Task Force has recommended screening, triggering coverage of low-dose CT by private health insurers under provisions of the Affordable Care Act. The Centers for Medicare & Medicaid Services (CMS) are currently evaluating coverage of lung cancer screening for Medicare beneficiaries. Since 70% of lung cancer occurs in patients 65 years or older, CMS should cover low-dose CT, thus avoiding the situation of at-risk patients being screened up to age 64 through private insurers and then abruptly ceasing screening at exactly the ages when their risk for developing lung cancer is increasing. Legitimate concerns include false-positive findings that lead to further testing and invasive procedures, overdiagnosis (detection of clinically unimportant cancers), the morbidity and mortality of surgery, and the overall costs of follow-up tests and procedures. These concerns can be mitigated by clear criteria for screening high-risk patients, disciplined management of abnormalities based on algorithms, and high-quality multidisciplinary care. Lung cancer screening with low-dose CT can lead to early diagnosis and cure for thousands of patients each year. Professional societies can help CMS responsibly implement a program that is patient-centered and minimizes unintended harms and costs. PMID:25317992

  4. Reading from computer screen versus reading from paper: does it still make a difference?

    PubMed

    Köpper, Maja; Mayr, Susanne; Buchner, Axel

    2016-05-01

    Four experiments were conducted to test whether recent developments in display technology would suffice to eliminate the well-known disadvantages in reading from screen as compared with paper. Proofreading speed and performance were equal for a TFT-LCD and a paper display, but there were more symptoms of eyestrain in the screen condition accompanied by a strong preference for paper (Experiment 1). These results were replicated using a longer reading duration (Experiment 2). Additional experiments were conducted to test hypotheses about the reasons for the higher amount of eyestrain associated with reading from screen. Reduced screen luminance did not change the pattern of results (Experiment 3), but positioning both displays in equal inclination angles eliminated the differences in eyestrain symptoms and increased proofreading speed in the screen condition (Experiment 4). A paper-like positioning of TFT-LCDs seems to enable unimpaired reading without evidence of increased physical strain. Practitioner Summary: Given the developments in screen technology, a re-assessment of the differences in proofreading speed and performance, well-being, and preference between computer screen and paper was conducted. State-of-the-art TFT-LCDs enable unimpaired reading, but a book-like positioning of screens seems necessary to minimise eyestrain symptoms. PMID:26736059

  5. Cost-Effectiveness of Computed Tomographic Colonography Screening for Colorectal Cancer in the Medicare Population

    PubMed Central

    Lansdorp-Vogelaar, Iris; Rutter, Carolyn M.; Savarino, James E.; van Ballegooijen, Marjolein; Kuntz, Karen M.; Zauber, Ann G.

    2010-01-01

    Background The Centers for Medicare and Medicaid Services (CMS) considered whether to reimburse computed tomographic colonography (CTC) for colorectal cancer screening of Medicare enrollees. To help inform its decision, we evaluated the reimbursement rate at which CTC screening could be cost-effective compared with the colorectal cancer screening tests that are currently reimbursed by CMS and are included in most colorectal cancer screening guidelines, namely annual fecal occult blood test (FOBT), flexible sigmoidoscopy every 5 years, flexible sigmoidoscopy every 5 years in conjunction with annual FOBT, and colonoscopy every 10 years. Methods We used three independently developed microsimulation models to assess the health outcomes and costs associated with CTC screening and with currently reimbursed colorectal cancer screening tests among the average-risk Medicare population. We assumed that CTC was performed every 5 years (using test characteristics from either a Department of Defense CTC study or the National CTC Trial) and that individuals with findings of 6 mm or larger were referred to colonoscopy. We computed incremental cost-effectiveness ratios for the currently reimbursed screening tests and calculated the maximum cost per scan (ie, the threshold cost) for the CTC strategy to lie on the efficient frontier. Sensitivity analyses were performed on key parameters and assumptions. Results Assuming perfect adherence with all tests, the undiscounted number of life-years gained from CTC screening ranged from 143 to 178 per 1000 65-year-olds, which was slightly less than the number of life-years gained from 10-yearly colonoscopy (152–185 per 1000 65-year-olds) and comparable to that from 5-yearly sigmoidoscopy with annual FOBT (149–177 per 1000 65-year-olds). If CTC screening was reimbursed at $488 per scan (slightly less than the reimbursement for a colonoscopy without polypectomy), it would be the most costly strategy. CTC screening could be cost-effective at
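
    The incremental cost-effectiveness ratios referred to above are computed between successive non-dominated strategies on the efficient frontier. The sketch below shows that bookkeeping for illustrative (cost, effectiveness) tuples; it handles only strong dominance, and none of the names or numbers are from the study.

      def icers(strategies):
          """Incremental cost-effectiveness ratios along the efficient frontier.

          strategies: list of (name, cost, effectiveness) tuples, e.g. cost per person
          and life-years gained per 1000 persons. Strong dominance only; illustrative."""
          ordered = sorted(strategies, key=lambda s: s[2])   # by increasing effectiveness
          frontier = []
          for s in ordered:
              while frontier and s[1] <= frontier[-1][1]:    # cheaper (or equal) and more effective
                  frontier.pop()                             # drop the dominated strategy
              frontier.append(s)
          return [(cur[0], (cur[1] - prev[1]) / (cur[2] - prev[2]))
                  for prev, cur in zip(frontier, frontier[1:])]

      # Hypothetical strategies A-D (placeholder data only).
      print(icers([("A", 0, 0), ("B", 1200, 150), ("C", 2500, 160), ("D", 2400, 170)]))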

  6. Aniso2D

    2005-07-01

    Aniso2d is a two-dimensional seismic forward modeling code. The earth is parameterized by an X-Z plane in which the seismic properties can have monoclinic symmetry with respect to the x-z plane. The program uses a user-defined time-domain wavelet to produce synthetic seismograms anywhere within the two-dimensional medium.

  7. Automated computational screening of the thiol reactivity of substituted alkenes.

    PubMed

    Smith, Jennifer M; Rowley, Christopher N

    2015-08-01

    Electrophilic olefins can react with the S-H moiety of cysteine side chains. The formation of a covalent adduct through this mechanism can result in the inhibition of an enzyme. The reactivity of an olefin towards cysteine depends on its functional groups. In this study, 325 reactions of thiol-Michael-type additions to olefins were modeled using density functional theory. All combinations of ethenes with hydrogen, methyl ester, amide, and cyano substituents were included. An automated workflow was developed to perform the construction, conformation search, minimization, and calculation of molecular properties for the reactant, carbanion intermediate, and thioether products for a model reaction of the addition of methanethiol to the electrophile. Known cysteine-reactive electrophiles present in the database were predicted to react exergonically with methanethiol through a carbanion with a stability in the 30-40 kcal mol(-1) range. 13 other compounds in our database that are also present in the PubChem database have similar properties. Natural bond orbital parameters were computed and regression analysis was used to determine the relationship between properties of the olefin electronic structure and the product and intermediate stability. The stability of the intermediates is very sensitive to electronic effects on the carbon where the anionic charge is centered. The stability of the products is more sensitive to steric factors. PMID:26159564

  8. An algorithm for computing screened Coulomb scattering in GEANT4

    NASA Astrophysics Data System (ADS)

    Mendenhall, Marcus H.; Weller, Robert A.

    2005-01-01

    An algorithm has been developed for the GEANT4 Monte-Carlo package for the efficient computation of screened Coulomb interatomic scattering. It explicitly integrates the classical equations of motion for scattering events, resulting in precise tracking of both the projectile and the recoil target nucleus. The algorithm permits the user to plug in an arbitrary screening function, such as Lenz-Jensen screening, which is good for backscattering calculations, or Ziegler-Biersack-Littmark screening, which is good for nuclear straggling and implantation problems. This will allow many of the applications of the TRIM and SRIM codes to be extended into the much more general GEANT4 framework where nuclear and other effects can be included.
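
    As an illustration of the kind of screening function such an algorithm lets the user plug in, the sketch below evaluates the Ziegler-Biersack-Littmark universal screening function and the corresponding screened Coulomb potential. It is a standalone formula sketch with standard published coefficients, not code from the GEANT4 implementation.

      import numpy as np

      A = [0.1818, 0.5099, 0.2802, 0.02817]     # ZBL universal screening coefficients
      B = [3.2, 0.9423, 0.4029, 0.2016]

      def zbl_screening(x):
          """Ziegler-Biersack-Littmark universal screening function phi(r/a_U)."""
          x = np.asarray(x, dtype=float)
          return sum(a * np.exp(-b * x) for a, b in zip(A, B))

      def screened_coulomb_potential(r_angstrom, Z1, Z2):
          """Screened Coulomb potential V(r) in eV for nuclei Z1, Z2; r in Angstrom."""
          a_u = 0.8854 * 0.529177 / (Z1 ** 0.23 + Z2 ** 0.23)   # universal screening length
          e2 = 14.3996                                          # e^2/(4*pi*eps0) in eV*Angstrom
          return Z1 * Z2 * e2 / r_angstrom * zbl_screening(r_angstrom / a_u)

      # Example: He ion near a Si nucleus at r = 0.5 Angstrom.
      print(screened_coulomb_potential(0.5, 2, 14))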

  9. Computationally efficient autoregressive method for generating phase screens with frozen flow and turbulence in optical simulations.

    PubMed

    Srinath, Srikar; Poyneer, Lisa A; Rudy, Alexander R; Ammons, S Mark

    2015-12-28

    We present a sample-based, autoregressive (AR) method for the generation and time evolution of atmospheric phase screens that is computationally efficient and uses a single parameter per Fourier mode to vary the power contained in the frozen flow and stochastic components. We address limitations of Fourier-based methods such as screen periodicity and low spatial frequency power content. Comparisons of adaptive optics (AO) simulator performance when fed AR phase screens and translating phase screens reveal significantly elevated residual closed-loop temporal power for small increases in added stochastic content at each time step, thus displaying the importance of properly modeling atmospheric "boiling". We present preliminary evidence that our model fits to AO telemetry are better reflections of real conditions than the pure frozen flow assumption. PMID:26831998
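
    The general idea of an autoregressive per-Fourier-mode update can be sketched as follows: each mode is advected by a frozen-flow phase ramp and mixed with fresh noise through a single coefficient alpha, so alpha = 1 recovers pure frozen flow and smaller alpha adds "boiling". The power-law amplitude, parameter values and function names below are illustrative assumptions, not the published model.

      import numpy as np

      def make_modes(n, dx):
          """Fourier grid and a simple Kolmogorov-like amplitude per mode (illustrative)."""
          fx = np.fft.fftfreq(n, d=dx)
          kx, ky = np.meshgrid(fx, fx, indexing="ij")
          k = np.hypot(kx, ky)
          amp = np.where(k > 0, (k + 1e-12) ** (-11.0 / 6.0), 0.0)   # ~sqrt of Kolmogorov PSD
          return kx, ky, amp

      def ar1_screen_step(psi_hat, kx, ky, amp, alpha, v, dt, rng):
          """One AR(1) update per Fourier mode: frozen-flow translation plus boiling."""
          shift = np.exp(-2j * np.pi * (kx * v[0] + ky * v[1]) * dt)   # frozen-flow phase ramp
          noise = (rng.standard_normal(psi_hat.shape)
                   + 1j * rng.standard_normal(psi_hat.shape)) * amp
          return alpha * shift * psi_hat + np.sqrt(1.0 - alpha ** 2) * noise

      rng = np.random.default_rng(0)
      n, dx = 64, 0.1
      kx, ky, amp = make_modes(n, dx)
      psi_hat = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) * amp
      for _ in range(100):
          psi_hat = ar1_screen_step(psi_hat, kx, ky, amp, alpha=0.99, v=(5.0, 0.0), dt=1e-3, rng=rng)
      screen = np.fft.ifft2(psi_hat).real   # phase screen (unnormalised), radians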

  10. 3D face reconstruction from 2D pictures: first results of a web-based computer aided system for aesthetic procedures.

    PubMed

    Oliveira-Santos, Thiago; Baumberger, Christian; Constantinescu, Mihai; Olariu, Radu; Nolte, Lutz-Peter; Alaraibi, Salman; Reyes, Mauricio

    2013-05-01

    The human face is a vital component of our identity and many people undergo medical aesthetics procedures in order to achieve an ideal or desired look. However, communication between physician and patient is fundamental to understand the patient's wishes and to achieve the desired results. To date, most plastic surgeons rely on either "free hand" 2D drawings on picture printouts or computerized picture morphing. Alternatively, hardware dependent solutions allow facial shapes to be created and planned in 3D, but they are usually expensive or complex to handle. To offer a simple and hardware independent solution, we propose a web-based application that uses 3 standard 2D pictures to create a 3D representation of the patient's face on which facial aesthetic procedures such as filling, skin clearing or rejuvenation, and rhinoplasty are planned in 3D. The proposed application couples a set of well-established methods together in a novel manner to optimize 3D reconstructions for clinical use. Face reconstructions performed with the application were evaluated by two plastic surgeons and also compared to ground truth data. Results showed the application can provide accurate 3D face representations to be used in clinics (within an average of 2 mm error) in less than 5 min. PMID:23319167

  11. An Evaluation of Student Perceptions of Screen Presentations in Computer-based Laboratory Simulations.

    ERIC Educational Resources Information Center

    Edward, Norrie S.

    1997-01-01

    Evaluates the importance of realism in the screen presentation of the plant in computer-based laboratory simulations for part-time engineering students. Concludes that simulations are less effective than actual laboratories but that realism minimizes the disadvantages. The schematic approach was preferred for ease of use. (AIM)

  12. Computer Decision Support to Improve Autism Screening and Care in Community Pediatric Clinics

    ERIC Educational Resources Information Center

    Bauer, Nerissa S.; Sturm, Lynne A.; Carroll, Aaron E.; Downs, Stephen M.

    2013-01-01

    An autism module was added to an existing computer decision support system (CDSS) to facilitate adherence to recommended guidelines for screening for autism spectrum disorders in primary care pediatric clinics. User satisfaction was assessed by survey and informal feedback at monthly meetings between clinical staff and the software team. To assess…

  13. Mesh2d

    SciTech Connect

    Greg Flach, Frank Smith

    2011-12-31

    Mesh2d is a Fortran90 program designed to generate two-dimensional structured grids of the form [x(i),y(i,j)] where [x,y] are grid coordinates identified by indices (i,j). The x(i) coordinates alone can be used to specify a one-dimensional grid. Because the x-coordinates vary only with the i index, a two-dimensional grid is composed in part of straight vertical lines. However, the nominally horizontal y(i,j0) coordinates along index i are permitted to undulate or otherwise vary. Mesh2d also assigns an integer material type to each grid cell, mtyp(i,j), in a user-specified manner. The complete grid is specified through three separate input files defining the x(i), y(i,j), and mtyp(i,j) variations.

  14. Mesh2d

    2011-12-31

    Mesh2d is a Fortran90 program designed to generate two-dimensional structured grids of the form [x(i),y(i,j)] where [x,y] are grid coordinates identified by indices (i,j). The x(i) coordinates alone can be used to specify a one-dimensional grid. Because the x-coordinates vary only with the i index, a two-dimensional grid is composed in part of straight vertical lines. However, the nominally horizontal y(i,j0) coordinates along index i are permitted to undulate or otherwise vary. Mesh2d also assigns an integer material type to each grid cell, mtyp(i,j), in a user-specified manner. The complete grid is specified through three separate input files defining the x(i), y(i,j), and mtyp(i,j) variations.
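
    A minimal sketch of the grid layout described above (straight vertical lines from x(i), undulating y(i,j), and an integer material type per cell) is shown below in Python; it mimics the data structures only and is unrelated to the Fortran90 program's input files.

      import numpy as np

      ni, nj = 41, 21
      x = np.linspace(0.0, 10.0, ni)                      # x(i): straight vertical grid lines
      y = np.empty((ni, nj))
      for j in range(nj):
          base = 0.5 * j                                  # nominal horizontal line
          y[:, j] = base + 0.2 * np.sin(2 * np.pi * x / 10.0) * j / (nj - 1)  # undulation

      # Integer material type per cell (ni-1 by nj-1), assigned by a simple user rule.
      mtyp = np.ones((ni - 1, nj - 1), dtype=int)
      mtyp[:, (nj - 1) // 2:] = 2                         # upper half gets material 2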

  15. Vertical 2D Heterostructures

    NASA Astrophysics Data System (ADS)

    Lotsch, Bettina V.

    2015-07-01

    Graphene's legacy has become an integral part of today's condensed matter science and has equipped a whole generation of scientists with an armory of concepts and techniques that open up new perspectives for the postgraphene area. In particular, the judicious combination of 2D building blocks into vertical heterostructures has recently been identified as a promising route to rationally engineer complex multilayer systems and artificial solids with intriguing properties. The present review highlights recent developments in the rapidly emerging field of 2D nanoarchitectonics from a materials chemistry perspective, with a focus on the types of heterostructures available, their assembly strategies, and their emerging properties. This overview is intended to bridge the gap between two major—yet largely disjunct—developments in 2D heterostructures, which are firmly rooted in solid-state chemistry or physics. Although the underlying types of heterostructures differ with respect to their dimensions, layer alignment, and interfacial quality, there is common ground, and future synergies between the various assembly strategies are to be expected.

  16. Mean sojourn time and effectiveness of mortality reduction for lung cancer screening with computed tomography.

    PubMed

    Chien, Chun-Ru; Chen, Tony Hsiu-Hsi

    2008-06-01

    This study aimed to estimate the mean sojourn time (MST) and sensitivity for asymptomatic lung cancer (ALC) detected by computed tomography (CT) or chest X-ray (CXR). The translation of early diagnosis into mortality reduction by the two detection modalities and by inter-screening interval was projected using a Markov model. On the basis of a systematic literature review, data from 6 prospective CT screening studies were retrieved. The MST associated with the natural history of lung cancer, depicted by a 3-state Markov model, was estimated with a Bayesian approach. To project the mortality reduction attributable to screening, the model was further extended to 5 health states to include a prognostic part. The analysis was run with a 10-year time horizon of follow-up, mimicking the Dutch-Belgian randomized lung cancer screening trial (NELSON). Screening for lung cancer with CT had high sensitivity (median: 97%) and may advance detection of ALC by 1 year relative to CXR. By simulating a scenario similar to the NELSON study, CT screening may gain an extra 0.019 years of life expectancy per person and yield a 15% mortality reduction (relative risk [RR]: 0.85; 95% confidence interval [CI]: 0.58-1.01). An approximately 23% mortality reduction (RR: 0.77; 95% CI: 0.43-0.98) would be achieved by an annual CT screening program. The mortality findings, in conjunction with the higher sensitivity and shorter MST estimated from data on prevalent and incident (second) screens, may provide tentative evidence suggesting that annual CT screening may be required in order to be effective in reducing mortality, before the results of randomized controlled studies become available. PMID:18302157

  17. Patient Perspectives on Low-Dose Computed Tomography for Lung Cancer Screening, New Mexico, 2014

    PubMed Central

    Sussman, Andrew L.; Murrietta, Ambroshia M.; Getrich, Christina M.; Rhyne, Robert; Crowell, Richard E.; Taylor, Kathryn L.; Reifler, Ellen J.; Wescott, Pamela H.; Saeed, Ali I.; Hoffman, Richard M.

    2016-01-01

    Introduction National guidelines call for annual lung cancer screening for high-risk smokers using low-dose computed tomography (LDCT). The objective of our study was to characterize patient knowledge and attitudes about lung cancer screening, smoking cessation, and shared decision making by patient and health care provider. Methods We conducted semistructured qualitative interviews with patients with histories of heavy smoking who received care at a Federally Qualified Health Center (FQHC Clinic) and at a comprehensive cancer center-affiliated chest clinic (Chest Clinic) in Albuquerque, New Mexico. The interviews, conducted from February through September 2014, focused on perceptions about health screening, knowledge and attitudes about LDCT screening, and preferences regarding decision aids. We used a systematic iterative analytic process to identify preliminary and emergent themes and to create a coding structure. Results We reached thematic saturation after 22 interviews (10 at the FQHC Clinic, 12 at the Chest Clinic). Most patients were unaware of LDCT screening for lung cancer but were receptive to the test. Some smokers said they would consider quitting smoking if their screening result were positive. Concerns regarding screening were cost, radiation exposure, and transportation issues. To support decision making, most patients said they preferred one-on-one discussions with a provider. They also valued decision support tools (print materials, videos), but raised concerns about readability and Internet access. Conclusion Implementing lung cancer screening in sociodemographically diverse populations poses significant challenges. The value of tobacco cessation counseling cannot be overemphasized. Effective interventions for shared decision making to undergo lung cancer screening will need the active engagement of health care providers and will require the use of accessible decision aids designed for people with low health literacy. PMID:27536900

  18. Performance of linear and nonlinear texture measures in 2D and 3D for monitoring architectural changes in osteoporosis using computer-generated models of trabecular bone

    NASA Astrophysics Data System (ADS)

    Boehm, Holger F.; Link, Thomas M.; Monetti, Roberto A.; Mueller, Dirk; Rummeny, Ernst J.; Raeth, Christoph W.

    2005-04-01

    Osteoporosis is a metabolic bone disease leading to de-mineralization and increased risk of fracture. The two major factors that determine the biomechanical competence of bone are the degree of mineralization and the micro-architectural integrity. Today, modern imaging modalities (high resolution MRI, micro-CT) are capable of depicting structural details of trabecular bone tissue. From the image data, structural properties obtained by quantitative measures are analysed with respect to the presence of osteoporotic fractures of the spine (in-vivo) or correlated with biomechanical strength as derived from destructive testing (in-vitro). Fairly well established are linear structural measures in 2D that are originally adopted from standard histo-morphometry. Recently, non-linear techniques in 2D and 3D based on the scaling index method (SIM), the standard Hough transform (SHT), and the Minkowski Functionals (MF) have been introduced, which show excellent performance in predicting bone strength and fracture risk. However, little is known about the performance of the various parameters with respect to monitoring structural changes due to progression of osteoporosis or as a result of medical treatment. In this contribution, we generate models of trabecular bone with pre-defined structural properties which are exposed to simulated osteoclastic activity. We apply linear and non-linear texture measures to the models and analyse their performance with respect to detecting architectural changes. This study demonstrates that the texture measures are capable of monitoring structural changes of complex model data. The diagnostic potential varies for the different parameters and is found to depend on the topological composition of the model and initial "bone density". In our models, non-linear texture measures tend to react more sensitively to small structural changes than linear measures. Best performance is observed for the 3rd and 4th Minkowski Functionals and for the scaling
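
    The Minkowski Functionals referred to above can be illustrated on a 2D binary structure, where they reduce to area, boundary length and the Euler characteristic (the 3D case adds an integral mean curvature term). The sketch below uses scikit-image on a thresholded toy pattern; it is purely illustrative and assumes a recent scikit-image release that provides measure.euler_number.

      import numpy as np
      from scipy import ndimage
      from skimage import measure

      def minkowski_2d(binary):
          """The three 2D Minkowski functionals of a binary structure:
          area, perimeter and Euler characteristic."""
          binary = binary.astype(bool)
          area = int(binary.sum())
          perimeter = measure.perimeter(binary)                # boundary length estimate
          euler = measure.euler_number(binary, connectivity=2)
          return area, perimeter, euler

      # Toy "trabecular" pattern: thresholded smoothed noise (not real bone data).
      rng = np.random.default_rng(1)
      img = ndimage.gaussian_filter(rng.standard_normal((128, 128)), sigma=3)
      print(minkowski_2d(img > 0.0))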

  19. Two-Dimensional Metal Dichalcogenides and Oxides for Hydrogen Evolution: A Computational Screening Approach.

    PubMed

    Pandey, Mohnish; Vojvodic, Aleksandra; Thygesen, Kristian S; Jacobsen, Karsten W

    2015-05-01

    We explore the possibilities of hydrogen evolution by basal planes of 2D metal dichalcogenides and oxides in the 2H and 1T class of structures using the hydrogen binding energy as a computational activity descriptor. For some groups of systems like the Ti, Zr, and Hf dichalcogenides the hydrogen bonding to the 2H structure is stronger than that to the 1T structure, while for the Cr, Mo, and W dichalcogenides the behavior is opposite. This is rationalized by investigating shifts in the chalcogenide p levels comparing the two structures. We find that usually for a given material only at most one of the two phases will be active for the hydrogen evolution reaction; however, in most cases the two phases are very close in formation energy, opening up the possibility for stabilizing the active phase. The study points to many new possible 2D HER materials beyond the few that are already known. PMID:26263317
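
    Descriptor-based screening of this kind amounts to ranking candidate basal planes by how close the hydrogen adsorption free energy is to zero. A minimal sketch is given below; the 0.24 eV zero-point/entropy correction is a commonly used literature value, and the dE_H entries are placeholders, not results from this work.

      # dG_H ~= dE_H + 0.24 eV is a commonly used approximation (assumed, not from the paper).
      CORRECTION = 0.24  # eV

      candidates = {
          # material: (phase, dE_H in eV) -- placeholder values for illustration only
          "MoS2": ("2H", -0.10),
          "TiS2": ("1T",  0.35),
          "WSe2": ("2H",  0.05),
      }

      def hydrogen_dG(dE_H):
          return dE_H + CORRECTION

      # Rank by |dG_H|: the closer to zero, the better the predicted HER activity.
      ranked = sorted(candidates.items(), key=lambda kv: abs(hydrogen_dG(kv[1][1])))
      for name, (phase, dE) in ranked:
          print(f"{name} ({phase}): dG_H = {hydrogen_dG(dE):+.2f} eV")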

  20. Designing specific protein–protein interactions using computation, experimental library screening, or integrated methods

    PubMed Central

    Chen, T Scott; Keating, Amy E

    2012-01-01

    Given the importance of protein–protein interactions for nearly all biological processes, the design of protein affinity reagents for use in research, diagnosis or therapy is an important endeavor. Engineered proteins would ideally have high specificities for their intended targets, but achieving interaction specificity by design can be challenging. There are two major approaches to protein design or redesign. Most commonly, proteins and peptides are engineered using experimental library screening and/or in vitro evolution. An alternative approach involves using protein structure and computational modeling to rationally choose sequences predicted to have desirable properties. Computational design has successfully produced novel proteins with enhanced stability, desired interactions and enzymatic function. Here we review the strengths and limitations of experimental library screening and computational structure-based design, giving examples where these methods have been applied to designing protein interaction specificity. We highlight recent studies that demonstrate strategies for combining computational modeling with library screening. The computational methods provide focused libraries predicted to be enriched in sequences with the properties of interest. Such integrated approaches represent a promising way to increase the efficiency of protein design and to engineer complex functionality such as interaction specificity. PMID:22593041

  1. Reviewing risks and benefits of low-dose computed tomography screening for lung cancer.

    PubMed

    Chopra, Ishveen; Chopra, Avijeet; Bias, Thomas K

    2016-01-01

    Lung cancer is the third most common cancer among men and women and is one of the leading causes of cancer-related mortality. Diagnosis at an early stage has been suggested crucial for improving survival in individuals at high-risk of lung cancer. One potential facilitator to early diagnosis is low-dose computed tomography (LDCT). The United States Preventive Services Task Force guidelines call for annual LDCT screening for individuals at high-risk of lung cancer. This recommendation was based on the effectiveness of LDCT in early diagnosis of lung cancer, as indicated by the findings from the National Lung Screening Trial conducted in 2011. Although lung cancer accounts for more than a quarter of all cancer deaths in the United States and LDCT screening shows promising results regarding early lung cancer diagnosis, screening for lung cancer remains controversial. There is uncertainty about risks, cost-effectiveness, adequacy of evidence, and application of screening in a clinical setting. This narrative review provides an overview of risks and benefits of LDCT screening for lung cancer. Further, this review discusses the potential for implementation of LDCT in clinical setting. PMID:26680693

  2. Optimisation and Assessment of Three Modern Touch Screen Tablet Computers for Clinical Vision Testing

    PubMed Central

    Tahir, Humza J.; Murray, Ian J.; Parry, Neil R. A.; Aslam, Tariq M.

    2014-01-01

    Technological advances have led to the development of powerful yet portable tablet computers whose touch-screen resolutions now permit the presentation of targets small enough to test the limits of normal visual acuity. Such devices have become ubiquitous in daily life and are moving into the clinical space. However, in order to produce clinically valid tests, it is important to identify the limits imposed by the screen characteristics, such as resolution, brightness uniformity, contrast linearity and the effect of viewing angle. Previously we have conducted such tests on the iPad 3. Here we extend our investigations to 2 other devices and outline a protocol for calibrating such screens, using standardised methods to measure the gamma function, warm up time, screen uniformity and the effects of viewing angle and screen reflections. We demonstrate that all three devices manifest typical gamma functions for voltage and luminance with warm up times of approximately 15 minutes. However, there were differences in homogeneity and reflectance among the displays. We suggest practical means to optimise quality of display for vision testing including screen calibration. PMID:24759774
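
    The gamma measurement mentioned in the calibration protocol can be summarized as fitting luminance readings taken at a series of gray levels to a power-law model, and inverting the fit whenever linear contrast steps are needed. The sketch below does this with hypothetical photometer readings; none of the numbers are from the study.

      import numpy as np
      from scipy.optimize import curve_fit

      def gamma_model(level, L_black, L_max, gamma):
          """Simple display gamma model: luminance vs. normalized gray level."""
          return L_black + (L_max - L_black) * level ** gamma

      # Hypothetical photometer readings (cd/m^2) at gray levels 0..255.
      levels = np.array([0, 32, 64, 96, 128, 160, 192, 224, 255]) / 255.0
      lum = np.array([0.5, 2.1, 7.8, 17.5, 32.0, 52.0, 78.0, 110.0, 148.0])

      popt, _ = curve_fit(gamma_model, levels, lum, p0=[0.5, 150.0, 2.2])
      print(f"fitted gamma = {popt[2]:.2f}")
      # To linearize a target contrast, invert the fit:
      # level = ((L - L_black) / (L_max - L_black)) ** (1.0 / gamma)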

  3. 2-D Animation's Not Just for Mickey Mouse.

    ERIC Educational Resources Information Center

    Weinman, Lynda

    1995-01-01

    Discusses characteristics of two-dimensional (2-D) animation; highlights include character animation, painting issues, and motion graphics. Sidebars present Silicon Graphics animations tools and 2-D animation programs for the desktop computer. (DGM)

  4. Overdiagnosis in Low-Dose Computed Tomography Screening for Lung Cancer

    PubMed Central

    Patz, Edward F.; Pinsky, Paul; Gatsonis, Constantine; Sicks, JoRean D.; Kramer, Barnett S.; Tammemägi, Martin C.; Chiles, Caroline; Black, William C.; Aberle, Denise R.

    2014-01-01

    IMPORTANCE Screening for lung cancer has the potential to reduce mortality, but in addition to detecting aggressive tumors, screening will also detect indolent tumors that otherwise may not cause clinical symptoms. These overdiagnosis cases represent an important potential harm of screening because they incur additional cost, anxiety, and morbidity associated with cancer treatment. OBJECTIVE To estimate overdiagnosis in the National Lung Screening Trial (NLST). DESIGN, SETTING, AND PARTICIPANTS We used data from the NLST, a randomized trial comparing screening using low-dose computed tomography (LDCT) vs chest radiography (CXR) among 53 452 persons at high risk for lung cancer observed for 6.4 years, to estimate the excess number of lung cancers in the LDCT arm of the NLST compared with the CXR arm. MAIN OUTCOMES AND MEASURES We calculated 2 measures of overdiagnosis: the probability that a lung cancer detected by screening with LDCT is an overdiagnosis (PS), defined as the excess lung cancers detected by LDCT divided by all lung cancers detected by screening in the LDCT arm; and the number of cases that were considered overdiagnosis relative to the number of persons needed to screen to prevent 1 death from lung cancer. RESULTS During follow-up, 1089 lung cancers were reported in the LDCT arm and 969 in the CXR arm of the NLST. The probability is 18.5% (95% CI, 5.4%–30.6%) that any lung cancer detected by screening with LDCT was an overdiagnosis, 22.5% (95% CI, 9.7%–34.3%) that a non-small cell lung cancer detected by LDCT was an overdiagnosis, and 78.9% (95% CI, 62.2%–93.5%) that a bronchioalveolar lung cancer detected by LDCT was an overdiagnosis. The number of cases of overdiagnosis found among the 320 participants who would need to be screened in the NLST to prevent 1 death from lung cancer was 1.38. CONCLUSIONS AND RELEVANCE More than 18% of all lung cancers detected by LDCT in the NLST seem to be indolent, and overdiagnosis should be considered when

  5. Feasibility of Tablet Computer Screening for Opioid Abuse in the Emergency Department

    PubMed Central

    Weiner, Scott G.; Horton, Laura C.; Green, Traci C.; Butler, Stephen F.

    2015-01-01

    Introduction Tablet computer-based screening may have the potential for detecting patients at risk for opioid abuse in the emergency department (ED). Study objectives were a) to determine if the revised Screener and Opioid Assessment for Patients with Pain (SOAPP®-R), a 24-question previously paper-based screening tool for opioid abuse potential, could be administered on a tablet computer to an ED patient population; b) to demonstrate that >90% of patients can complete the electronic screener without assistance in <5 minutes and; c) to determine patient ease of use with screening on a tablet computer. Methods This was a cross-sectional convenience sample study of patients seen in an urban academic ED. SOAPP®-R was programmed on a tablet computer by study investigators. Inclusion criteria were patients ages ≥18 years who were being considered for discharge with a prescription for an opioid analgesic. Exclusion criteria included inability to understand English or physical disability preventing use of the tablet. Results 93 patients were approached for inclusion and 82 (88%) provided consent. Fifty-two percent (n=43) of subjects were male; 46% (n=38) of subjects were between 18–35 years, and 54% (n=44) were >35 years. One hundred percent of subjects completed the screener. Median time to completion was 148 (interquartile range 117.5–184.3) seconds, and 95% (n=78) completed in <5 minutes. 93% (n=76) rated ease of completion as very easy. Conclusions It is feasible to administer a screening tool to a cohort of ED patients on a tablet computer. The screener administration time is minimal and patient ease of use with this modality is high. PMID:25671003

  6. Designing Multimedia Learning Application with Learning Theories: A Case Study on a Computer Science Subject with 2-D and 3-D Animated Versions

    ERIC Educational Resources Information Center

    Rias, Riaza Mohd; Zaman, Halimah Badioze

    2011-01-01

    Higher learning based instruction may be primarily concerned in most cases with the content of their academic lessons, and not very much with their instructional delivery. However, the effective application of learning theories and technology in higher education has an impact on student performance. With the rapid progress in the computer and…

  7. Comparison of 2D Radiographic Images and 3D Cone Beam Computed Tomography for Positioning Head-and-Neck Radiotherapy Patients

    SciTech Connect

    Li, Heng; Zhu, X. Ronald; Zhang, Lifei; Dong, Lei; Tung, Sam; Ahamad, Anesa; Chao, K. S. Clifford; Morrison, William H.; Rosenthal, David I.; Schwartz, David L.; Mohan, Radhe; Garden, Adam S.

    2008-07-01

    Purpose: To assess the positioning accuracy using two-dimensional kilovoltage (2DkV) imaging and three-dimensional cone beam CT (CBCT) in patients with head and neck (H and N) cancer receiving radiation therapy, and to assess the benefit of patient-specific headrests. Materials and Methods: All 21 patients studied were immobilized using thermoplastic masks with either a patient-specific vacuum bag (11 of 21, IMA) or standard clear plastic (10 of 21, IMB) headrests. Each patient was imaged with a pair of orthogonal 2DkV images in treatment position using onboard imaging before the CBCT procedure. The 2DkV and CBCT images were acquired weekly during the same session. The 2DkV images were reviewed by oncologists and also analyzed by a software tool based on mutual information (MI). Results: Ninety-eight pairs of assessable 2DkV-CBCT alignment sets were obtained. Systematic and random errors were <1.6 mm for both 2DkV and CBCT alignments. When we compared shifts determined by CBCT and 2DkV for the same patient setup, statistically significant correlations were observed in all three major directions. Among all CBCT couch shifts, 4.1% were ≥0.5 cm and 18.7% were ≥0.3 cm, whereas among all 2DkV (MI) shifts, 1.7% were ≥0.5 cm and 11.2% were ≥0.3 cm. A statistically significant difference was found in the anteroposterior direction between IMA and IMB with the CBCT alignment only. Conclusions: The differences between 2D and 3D alignments were mainly caused by the relative flexibility of certain H and N structures and possibly by rotation. Better immobilization of the flexible neck is required to further reduce the setup errors for H and N patients receiving radiotherapy.

  8. Increasing Chemical Space Coverage by Combining Empirical and Computational Fragment Screens

    PubMed Central

    2015-01-01

    Most libraries for fragment-based drug discovery are restricted to 1,000–10,000 compounds, but over 500,000 fragments are commercially available and potentially accessible by virtual screening. Whether this larger set would increase chemotype coverage, and whether a computational screen can pragmatically prioritize them, is debated. To investigate this question, a 1281-fragment library was screened by nuclear magnetic resonance (NMR) against AmpC β-lactamase, and hits were confirmed by surface plasmon resonance (SPR). Nine hits with novel chemotypes were confirmed biochemically with Ki values from 0.2 to low mM. We also computationally docked 290,000 purchasable fragments with chemotypes unrepresented in the empirical library, finding 10 that had Ki values from 0.03 to low mM. Though less novel than those discovered by NMR, the docking-derived fragments filled chemotype holes from the empirical library. Crystal structures of nine of the fragments in complex with AmpC β-lactamase revealed new binding sites and explained the relatively high affinity of the docking-derived fragments. The existence of chemotype holes is likely a general feature of fragment libraries, as calculation suggests that representing the fragment substructures of even known biogenic molecules would demand a library of at least 32,000 fragments. Combining computational and empirical fragment screens enables the discovery of unexpected chemotypes, here by the NMR screen, while capturing chemotypes missing from the empirical library and tailored to the target, with little extra cost in resources. PMID:24807704

  9. Computational high-throughput screening of fluid permeability in heterogeneous fiber materials.

    PubMed

    Röding, Magnus; Schuster, Erich; Logg, Katarina; Lundman, Malin; Bergström, Per; Hanson, Charlotta; Gebäck, Tobias; Lorén, Niklas

    2016-07-20

    We explore computational high-throughput screening as a design strategy for heterogeneous, isotropic fiber materials. Fluid permeability, a key property in the design of soft porous materials, is systematically studied using a multi-scale lattice Boltzmann framework. After characterizing microscopic permeability as a function of solid volume fraction in the microstructure, we perform high-throughput computational screening of in excess of 35 000 macrostructures consisting of a continuous bulk interrupted by spherical/elliptical domains with either lower or higher microscopic permeability (hence with two distinct microscopic solid volume fractions and therefore two distinct microscopic permeabilities) to assess which parameters determine macroscopic permeability for a fixed average solid volume fraction. We conclude that the fractions of bulk and domains and the distribution of solid volume fraction between them are the primary determinants of macroscopic permeability, and that a substantial increase in permeability compared to the corresponding homogeneous material is attainable. PMID:27367292

  10. Improved CUDA programs for GPU computing of Swendsen-Wang multi-cluster spin flip algorithm: 2D and 3D Ising, Potts, and XY models

    NASA Astrophysics Data System (ADS)

    Komura, Yukihiro; Okabe, Yutaka

    2016-03-01

    We present new versions of sample CUDA programs for the GPU computing of the Swendsen-Wang multi-cluster spin flip algorithm. In this update, we add the method of GPU-based cluster-labeling algorithm without the use of conventional iteration (Komura, 2015) to those programs. For high-precision calculations, we also add a random-number generator in the cuRAND library. Moreover, we fix several bugs and remove the extra usage of shared memory in the kernel functions.
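
    As a point of reference for the record above, the following is a pure-Python/NumPy sketch of one Swendsen-Wang sweep for the 2D Ising model, i.e., the serial analogue of the multi-cluster update that the CUDA programs parallelize on the GPU; it is illustrative only and is not taken from the published code.

    ```python
    import numpy as np

    def swendsen_wang_sweep(spins, beta, J=1.0, rng=None):
        """One Swendsen-Wang multi-cluster update of an L x L Ising lattice of +/-1 spins."""
        rng = np.random.default_rng() if rng is None else rng
        L = spins.shape[0]
        p_bond = 1.0 - np.exp(-2.0 * beta * J)     # bond activation probability
        parent = np.arange(L * L)                   # union-find over lattice sites

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]       # path halving
                i = parent[i]
            return i

        # Activate bonds between aligned nearest neighbours (periodic boundaries).
        for x in range(L):
            for y in range(L):
                i = x * L + y
                for dx, dy in ((1, 0), (0, 1)):
                    nx, ny = (x + dx) % L, (y + dy) % L
                    if spins[x, y] == spins[nx, ny] and rng.random() < p_bond:
                        ri, rj = find(i), find(nx * L + ny)
                        if ri != rj:
                            parent[rj] = ri

        # Flip every cluster independently with probability 1/2.
        flip = {}
        for x in range(L):
            for y in range(L):
                root = find(x * L + y)
                if root not in flip:
                    flip[root] = rng.random() < 0.5
                if flip[root]:
                    spins[x, y] = -spins[x, y]
        return spins

    # Usage: a few sweeps near the critical coupling beta_c = ln(1 + sqrt(2)) / 2.
    spins = np.random.default_rng(0).choice([-1, 1], size=(32, 32))
    for _ in range(10):
        swendsen_wang_sweep(spins, beta=0.4407)
    ```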

  11. Brain-Computer Interfaces for 1-D and 2-D Cursor Control: Designs Using Volitional Control of the EEG Spectrum or Steady-State Visual Evoked Potentials

    NASA Technical Reports Server (NTRS)

    Trejo, Leonard J.; Matthews, Bryan; Rosipal, Roman

    2005-01-01

    We have developed and tested two EEG-based brain-computer interfaces (BCI) for users to control a cursor on a computer display. Our system uses an adaptive algorithm, based on kernel partial least squares classification (KPLS), to associate patterns in multichannel EEG frequency spectra with cursor controls. Our first BCI, Target Practice, is a system for one-dimensional device control, in which participants use biofeedback to learn voluntary control of their EEG spectra. Target Practice uses a KPLS classifier to map power spectra of 30-electrode EEG signals to rightward or leftward position of a moving cursor on a computer display. Three subjects learned to control motion of a cursor on a video display in multiple blocks of 60 trials over periods of up to six weeks. The best subject's average skill in correct selection of the cursor direction grew from 58% to 88% after 13 training sessions. Target Practice also implements online control of two artifact sources: a) removal of ocular artifact by linear subtraction of wavelet-smoothed vertical and horizontal EOG signals, b) control of muscle artifact by inhibition of BCI training during periods of relatively high power in the 40-64 Hz band. The second BCI, Think Pointer, is a system for two-dimensional cursor control. Steady-state visual evoked potentials (SSVEP) are triggered by four flickering checkerboard stimuli located in narrow strips at each edge of the display. The user attends to one of the four beacons to initiate motion in the desired direction. The SSVEP signals are recorded from eight electrodes located over the occipital region. A KPLS classifier is individually calibrated to map multichannel frequency bands of the SSVEP signals to right-left or up-down motion of a cursor on a computer display. The display stops moving when the user attends to a central fixation point. As for Target Practice, Think Pointer also implements wavelet-based online removal of ocular artifact; however, in Think Pointer muscle
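
    A rough sketch of the classification pipeline described above (per-channel EEG power spectra mapped to a cursor direction). The array shapes, labels, and data are placeholders, and scikit-learn's ordinary PLS regression stands in for the kernel PLS (KPLS) classifier used in the study.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    n_trials, n_channels, n_samples = 120, 30, 256   # placeholder EEG dimensions
    eeg = rng.standard_normal((n_trials, n_channels, n_samples))
    direction = rng.integers(0, 2, n_trials)         # 0 = left, 1 = right (placeholder labels)

    # Features: per-channel power spectra, flattened to one vector per trial.
    power = np.abs(np.fft.rfft(eeg, axis=-1)) ** 2
    features = power.reshape(n_trials, -1)

    pls = PLSRegression(n_components=5).fit(features, direction)
    predicted = (pls.predict(features).ravel() > 0.5).astype(int)   # thresholded cursor command
    ```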

  12. Touch screen computer health assessment in Australian general practice patients: a cross-sectional study protocol

    PubMed Central

    Carey, Mariko Leanne; Sanson-Fisher, Robert William; Russell, Grant; Mazza, Danielle; Makeham, Meredith; Paul, Christine Louise; Inder, Kerry Jane; D'Este, Catherine

    2012-01-01

    Introduction Cardiovascular disease (CVD) and cancer are leading causes of death globally. Early detection of cancer and risk factors for CVD may improve health outcomes and reduce mortality. General practitioners (GPs) are accessed by the majority of the population and play a key role in the prevention and early detection of chronic disease risk factors. This cross-sectional study aims to assess the acceptability of an electronic method of data collection in general practice patients. The study will describe the proportion screened in line with guidelines for CVD risk factors and cancer as well as report the prevalence of depression, lifestyle risk factors, level of provision of preconception care, cervical cancer vaccination and bone density testing. Lastly, the study will assess the level of agreement between GPs' and patients' perceptions regarding the presence of risk factors and screening. Methods and analysis The study has been designed to maximise recruitment of GPs by including practitioners in the research team, minimising participation burden on GPs and offering remuneration for participation. Patient recruitment will be carried out by a research assistant located in general practice waiting rooms. Participants will be asked about the acceptability of the touch screen computer and to report on a range of health risk and preventive behaviours using the touch screen computer. GPs will complete a one-page survey indicating their perception of the presence of risk behaviours in their patients. Descriptive statistics will be generated to describe the acceptability of the touch screen and prevalence of health risk behaviours. Cohen's κ will be used to assess agreement between GP and patient perceptions of the presence of health risk behaviours. Ethics and dissemination This study has been approved by the human research committees in participating universities. Findings will be disseminated via peer-reviewed publications, conference presentations as well as practice

  13. Computed Tomography Imaging Spectrometer (CTIS) with 2D Reflective Grating for Ultraviolet to Long-Wave Infrared Detection Especially Useful for Surveying Transient Events

    NASA Technical Reports Server (NTRS)

    Wilson, Daniel W. (Inventor); Maker, Paul D. (Inventor); Muller, Richard E. (Inventor); Mouroulis, Pantazis Z. (Inventor)

    2003-01-01

    The optical system of this invention is a unique type of imaging spectrometer, i.e. an instrument that can determine the spectra of all points in a two-dimensional scene. The general type of imaging spectrometer under which this invention falls has been termed a computed-tomography imaging spectrometer (CTIS). CTIS's have the ability to perform spectral imaging of scenes containing rapidly moving objects or evolving features, hereafter referred to as transient scenes. This invention, a reflective CTIS with a unique two-dimensional reflective grating, can operate in any wavelength band from the ultraviolet through long-wave infrared. Although this spectrometer is especially useful for rapidly occurring events, it is also useful for investigation of some slow moving phenomena as in the life sciences.

  14. Comparison of 2D and 3D Computational Multiphase Fluid Flow Models of Oxygen Lancing of Pyrometallurgical Furnace Tap-Holes

    NASA Astrophysics Data System (ADS)

    Erwee, M. W.; Reynolds, Q. G.; Zietsman, J. H.

    2016-03-01

    Furnace tap-holes vary in design depending on the type of furnace and process involved, but they share one common trait: The tap-hole must be opened and closed periodically. In general, tap-holes are plugged with refractory clay after tapping, thereby stopping the flow of molten material. Once a furnace is ready to be tapped, drilling and/or lancing with oxygen are typically used to remove tap-hole clay from the tap-hole. Lancing with oxygen is an energy-intensive, mostly manual process, which affects the performance and longevity of the tap-hole refractory material as well as the processes inside the furnace. Computational modeling offers an opportunity to gain insight into the possible effects of oxygen lancing on various aspects of furnace operation.

  15. Computed tomography imaging spectrometer (CTIS) with 2D reflective grating for ultraviolet to long-wave infrared detection especially useful for surveying transient events

    NASA Technical Reports Server (NTRS)

    Wilson, Daniel W. (Inventor); Maker, Paul D. (Inventor); Muller, Richard E. (Inventor); Mouroulis, Pantazis Z. (Inventor)

    2003-01-01

    The optical system of this invention is a unique type of imaging spectrometer, i.e. an instrument that can determine the spectra of all points in a two-dimensional scene. The general type of imaging spectrometer under which this invention falls has been termed a computed-tomography imaging spectrometer (CTIS). CTIS's have the ability to perform spectral imaging of scenes containing rapidly moving objects or evolving features, hereafter referred to as transient scenes. This invention, a reflective CTIS with a unique two-dimensional reflective grating, can operate in any wavelength band from the ultraviolet through long-wave infrared. Although this spectrometer is especially useful for rapidly occurring events, it is also useful for investigation of some slow moving phenomena as in the life sciences.

  16. Comparison of 2D and 3D Computational Multiphase Fluid Flow Models of Oxygen Lancing of Pyrometallurgical Furnace Tap-Holes

    NASA Astrophysics Data System (ADS)

    Erwee, M. W.; Reynolds, Q. G.; Zietsman, J. H.

    2016-06-01

    Furnace tap-holes vary in design depending on the type of furnace and process involved, but they share one common trait: The tap-hole must be opened and closed periodically. In general, tap-holes are plugged with refractory clay after tapping, thereby stopping the flow of molten material. Once a furnace is ready to be tapped, drilling and/or lancing with oxygen are typically used to remove tap-hole clay from the tap-hole. Lancing with oxygen is an energy-intensive, mostly manual process, which affects the performance and longevity of the tap-hole refractory material as well as the processes inside the furnace. Computational modeling offers an opportunity to gain insight into the possible effects of oxygen lancing on various aspects of furnace operation.

  17. Light field morphing using 2D features.

    PubMed

    Wang, Lifeng; Lin, Stephen; Lee, Seungyong; Guo, Baining; Shum, Heung-Yeung

    2005-01-01

    We present a 2D feature-based technique for morphing 3D objects represented by light fields. Existing light field morphing methods require the user to specify corresponding 3D feature elements to guide morph computation. Since slight errors in 3D specification can lead to significant morphing artifacts, we propose a scheme based on 2D feature elements that is less sensitive to imprecise marking of features. First, 2D features are specified by the user in a number of key views in the source and target light fields. Then the two light fields are warped view by view as guided by the corresponding 2D features. Finally, the two warped light fields are blended together to yield the desired light field morph. Two key issues in light field morphing are feature specification and warping of light field rays. For feature specification, we introduce a user interface for delineating 2D features in key views of a light field, which are automatically interpolated to other views. For ray warping, we describe a 2D technique that accounts for visibility changes and present a comparison to the ideal morphing of light fields. Light field morphing based on 2D features makes it simple to incorporate previous image morphing techniques such as nonuniform blending, as well as to morph between an image and a light field. PMID:15631126

  18. A brief measure of Smokers' knowledge of lung cancer screening with low-dose computed tomography.

    PubMed

    Lowenstein, Lisa M; Richards, Vincent F; Leal, Viola B; Housten, Ashley J; Bevers, Therese B; Cantor, Scott B; Cinciripini, Paul M; Cofta-Woerpel, Ludmila M; Escoto, Kamisha H; Godoy, Myrna C B; Linder, Suzanne K; Munden, Reginald F; Volk, Robert J

    2016-12-01

    We describe the development and psychometric properties of a new, brief measure of smokers' knowledge of lung cancer screening with low-dose computed tomography (LDCT). Content experts identified key facts smokers should know in making an informed decision about lung cancer screening. Sample questions were drafted and iteratively refined based on feedback from content experts and cognitive testing with ten smokers. The resulting 16-item knowledge measure was completed by 108 heavy smokers in Houston, Texas, recruited from 12/2014 to 09/2015. Item difficulty, item discrimination, internal consistency and test-retest reliability were assessed. Group differences based upon education levels and smoking history were explored. Several items were dropped due to ceiling effects or overlapping constructs, resulting in a 12-item knowledge measure. Additional items with high item uncertainty were retained because of their importance in informed decision making about lung cancer screening. Internal consistency reliability of the final scale was acceptable (KR-20 = 0.66) and test-retest reliability of the overall scale was 0.84 (intraclass correlation). Knowledge scores differed across education levels (F = 3.36, p = 0.04), while no differences were observed between current and former smokers (F = 1.43, p = 0.24) or among participants who met or did not meet the 30-pack-year screening eligibility criterion (F = 0.57, p = 0.45). The new measure provides a brief, valid and reliable indicator of smokers' knowledge of key concepts central to making an informed decision about lung cancer screening with LDCT, and can be part of a broader assessment of the quality of smokers' decision making about lung cancer screening. PMID:27512650

  19. Creative Computing with Landlab: Open-Source Python Software for Building and Exploring 2D Models of Earth-Surface Dynamics

    NASA Astrophysics Data System (ADS)

    Tucker, G. E.; Hobley, D. E.; Gasparini, N. M.; Hutton, E.; Istanbulluoglu, E.; Nudurupati, S.; Adams, J. M.

    2013-12-01

    Computer models help us explore the consequences of scientific hypotheses at a level of precision and quantification that is impossible for our unaided minds. The process of writing and debugging the necessary code is often time-consuming, however, and this cost can inhibit progress. The code-development barrier can be especially problematic when a field is rapidly unearthing new data and new ideas, as is presently the case in surface dynamics. To help meet the need for rapid, flexible model development, we have written a prototype software framework for two-dimensional numerical modeling of planetary surface processes. The Landlab software can be used to develop new models from scratch, to create models from existing components, or a combination of the two. Landlab provides a gridding module that allows you to create and configure a model grid in just a few lines of code. Grids can be regular or unstructured, and can readily be used to implement staggered-grid numerical solutions to equations for various types of geophysical flow. The gridding module provides built-in functions for common numerical operations, such as calculating gradients and integrating fluxes around the perimeter of cells. Landlab is written in Python, a high-level language that enables rapid code development and takes advantage of a wealth of libraries for scientific computing and graphical output. Landlab also provides a framework for assembling new models from combinations of pre-built components. This capability is illustrated with several examples, including flood inundation, long-term landscape evolution, impact cratering, post-wildfire erosion, and ecohydrology. Interoperability with the Community Surface Dynamics Modeling System (CSDMS) Model-Coupling Framework allows models created in Landlab to be combined with other CSDMS models, which helps to bring frontier problems in landscape and seascape dynamics within closer theoretical reach.
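
    A minimal sketch of the gridding workflow described above, assuming the current Landlab 2.x API (grid construction, a node field, and built-in gradient/flux-divergence operations); the field values and spacing are illustrative.

    ```python
    import numpy as np
    from landlab import RasterModelGrid

    grid = RasterModelGrid((4, 5), xy_spacing=10.0)            # 4 x 5 nodes, 10 m spacing
    z = grid.add_zeros("topographic__elevation", at="node")    # elevation field on grid nodes
    z += 0.01 * grid.x_of_node                                  # impose a gentle eastward slope

    grad = grid.calc_grad_at_link(z)                 # gradients on links
    divergence = grid.calc_flux_div_at_node(-grad)   # divergence of a simple downslope flux
    ```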

  20. 2D electronic materials for army applications

    NASA Astrophysics Data System (ADS)

    O'Regan, Terrance; Perconti, Philip

    2015-05-01

    The record electronic properties achieved in monolayer graphene and related 2D materials such as molybdenum disulfide and hexagonal boron nitride show promise for revolutionary high-speed and low-power electronic devices. Heterogeneous 2D-stacked materials may create enabling technology for future communication and computation applications to meet soldier requirements. For instance, transparent, flexible and even wearable systems may become feasible. With soldier and squad level electronic power demands increasing, the Army is committed to developing and harnessing graphene-like 2D materials for compact low size-weight-and-power-cost (SWAP-C) systems. This paper will review developments in 2D electronic materials at the Army Research Laboratory over the last five years and discuss directions for future army applications.

  1. 2-d Finite Element Code Postprocessor

    1996-07-15

    ORION is an interactive program that serves as a postprocessor for the analysis programs NIKE2D, DYNA2D, TOPAZ2D, and CHEMICAL TOPAZ2D. ORION reads binary plot files generated by the two-dimensional finite element codes currently used by the Methods Development Group at LLNL. Contour and color fringe plots of a large number of quantities may be displayed on meshes consisting of triangular and quadrilateral elements. ORION can compute strain measures, interface pressures along slide lines, reaction forces along constrained boundaries, and momentum. ORION has been applied to study the response of two-dimensional solids and structures undergoing finite deformations under a wide variety of large deformation transient dynamic and static problems and heat transfer analyses.

  2. A clinical screening protocol for the RSVP Keyboard™ brain-computer interface

    PubMed Central

    Fried-Oken, Melanie; Mooney, Aimee; Peters, Betts; Oken, Barry

    2013-01-01

    Purpose To propose a screening protocol that identifies requisite sensory, motor, cognitive, and communication skills for people with locked-in syndrome (PLIS) to use the RSVP Keyboard™ brain-computer interface (BCI). Method A multidisciplinary clinical team of seven individuals representing five disciplines identified requisite skills for the BCI. They chose questions and subtests from existing standardized instruments for auditory comprehension, reading, and spelling; modified them to accommodate nonverbal response modalities; and developed novel tasks to screen visual perception, sustained visual attention, and working memory. Questions were included about sensory skills, positioning, pain interference, and medications. The result is a compilation of questions, adapted subtests and original tasks designed for this new BCI system. It was administered to 12 PLIS and six healthy controls. Results Administration required one hour or less. Yes/no choices and eye gaze were adequate response modes for PLIS. Healthy controls and 9 PLIS were 100% accurate on all tasks; three PLIS missed single items. Conclusions The RSVP BCI screening protocol is a brief, repeatable technique for patients with different levels of LIS to identify the presence/absence of skills for BCI use. Widespread adoption of screening methods should be a clinical goal and will help standardize BCI implementation for research and intervention. PMID:24059536

  3. Does patient time spent viewing computer-tailored colorectal cancer screening materials predict patient-reported discussion of screening with providers?

    PubMed

    Sanders, Mechelle; Fiscella, Kevin; Veazie, Peter; Dolan, James G; Jerant, Anthony

    2016-08-01

    The main aim is to examine whether patients' viewing time on information about colorectal cancer (CRC) screening before a primary care physician (PCP) visit is associated with discussion of screening options during the visit. We analyzed data from a multi-center randomized controlled trial of a tailored interactive multimedia computer program (IMCP) to activate patients to undergo CRC screening, deployed in primary care offices immediately before a visit. We employed usage time information stored in the IMCP to examine the association of patient time spent using the program with patient-reported discussion of screening during the visit, adjusting for previous CRC screening recommendation and reading speed. On average, patients spent 33 minutes on the program. In adjusted analyses, 30 minutes spent using the program was associated with a 41% increase in the odds of the patient having a discussion with their PCP (95% CI 1.04 to 1.59). In a separate analysis of the tailoring modules, the modules encouraging adherence to the tailored screening recommendation and discussion with the patient's PCP yielded significant results. Other predictors of screening discussion included better self-reported physical health and increased patient activation. Time spent on the program predicted greater patient-physician discussion of screening during a linked visit. Usage time information gathered automatically by IMCPs offers promise for objectively assessing patient engagement around a topic and predicting likelihood of discussion between patients and their clinician. PMID:27343254

  4. Three-dimensional mapping of soil chemical characteristics at micrometric scale: Statistical prediction by combining 2D SEM-EDX data and 3D X-ray computed micro-tomographic images

    NASA Astrophysics Data System (ADS)

    Hapca, Simona

    2015-04-01

    Many soil properties and functions emerge from interactions of physical, chemical and biological processes at microscopic scales, which can be understood only by integrating techniques that traditionally are developed within separate disciplines. While recent advances in imaging techniques, such as X-ray computed tomography (X-ray CT), offer the possibility to reconstruct the 3D physical structure at fine resolutions, for the distribution of chemicals in soil, existing methods, based on scanning electron microscope (SEM) and energy dispersive X-ray detection (EDX), allow for characterization of the chemical composition only on 2D surfaces. At present, direct 3D measurement techniques are still lacking, sequential sectioning of soils, followed by 2D mapping of chemical elements and interpolation to 3D, being an alternative which is explored in this study. Specifically, we develop an integrated experimental and theoretical framework which combines the 3D X-ray CT imaging technique with 2D SEM-EDX and use spatial statistics methods to map the chemical composition of soil in 3D. The procedure involves three stages: 1) scanning a resin impregnated soil cube by X-ray CT, followed by precision cutting to produce parallel thin slices, the surfaces of which are scanned by SEM-EDX, 2) alignment of the 2D chemical maps within the internal 3D structure of the soil cube, and 3) development of spatial statistics methods to predict the chemical composition of 3D soil based on the observed 2D chemical and 3D physical data. Specifically, three statistical models consisting of a regression tree, a regression tree kriging and cokriging model were used to predict the 3D spatial distribution of carbon, silicon, iron and oxygen in soil, these chemical elements showing a good spatial agreement between the X-ray grayscale intensities and the corresponding 2D SEM-EDX data. Due to the spatial correlation between the physical and chemical data, the regression-tree model showed a great potential
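
    Not the authors' implementation: a stand-in for the regression-tree step described above, predicting a chemical map from co-registered X-ray CT grayscale intensities on a single slice (the kriging/cokriging of residuals is omitted, and the data here are synthetic).

    ```python
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    rng = np.random.default_rng(1)
    ct_grayscale = rng.uniform(0, 255, size=(256, 256))                    # synthetic aligned CT slice
    edx_iron = 0.4 * ct_grayscale + rng.normal(0, 10, ct_grayscale.shape)  # synthetic SEM-EDX iron map

    X = ct_grayscale.reshape(-1, 1)     # predictor: grayscale intensity per pixel
    y = edx_iron.ravel()                # response: measured chemical signal
    tree = DecisionTreeRegressor(max_depth=6, min_samples_leaf=50).fit(X, y)

    # The fitted tree can then be applied slice by slice to build up the 3D chemical map.
    predicted_iron = tree.predict(X).reshape(ct_grayscale.shape)
    ```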

  5. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing

    PubMed Central

    Fang, Ye; Ding, Yun; Feinstein, Wei P.; Koppelman, David M.; Moreno, Juana; Jarrell, Mark; Ramanujam, J.; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249. PMID:27420300

  6. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing.

    PubMed

    Fang, Ye; Ding, Yun; Feinstein, Wei P; Koppelman, David M; Moreno, Juana; Jarrell, Mark; Ramanujam, J; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249. PMID:27420300
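
    The two GeauxDock records above describe a Monte Carlo docking engine. Below is a generic Metropolis acceptance loop of the kind such engines are built on; the pose perturbation and the actual GeauxDock scoring function (physics-based energy terms plus knowledge-based potentials) are represented only by placeholder callables.

    ```python
    import math
    import random

    def metropolis_dock(initial_pose, score, perturb, n_steps=10_000, temperature=1.0):
        """Minimise a docking score by Metropolis Monte Carlo (lower score = better pose)."""
        pose, energy = initial_pose, score(initial_pose)
        best_pose, best_energy = pose, energy
        for _ in range(n_steps):
            candidate = perturb(pose)          # e.g. random translation/rotation/torsion move
            e_new = score(candidate)
            # Accept downhill moves always, uphill moves with Boltzmann probability.
            if e_new <= energy or random.random() < math.exp(-(e_new - energy) / temperature):
                pose, energy = candidate, e_new
                if energy < best_energy:
                    best_pose, best_energy = pose, energy
        return best_pose, best_energy
    ```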

  7. Traditional and computer-based screening and diagnosis of reading disabilities in Greek.

    PubMed

    Protopapas, Athanassios; Skaloumbakas, Christos

    2007-01-01

    In this study, we examined the characteristics of reading disability (RD) in the seventh grade of the Greek educational system and the corresponding diagnostic practice. We presented a clinically administered assessment battery, composed of typically employed tasks, and a fully automated, computer-based assessment battery that evaluates some of the same constructs. In all, 261 children ages 12 to 14 were tested. The results of the traditional assessment indicated that RD concerns primarily slow reading and secondarily poor reading and spelling accuracy. This pattern was matched in the domains most attended to in expert student evaluation. Automatic (computer-based) screening for RD in the target age range matched expert judgment in validity and reliability in the absence of a full clinical evaluation. It is proposed that the educational needs of the middle and high school population in Greece will be best served by concentrating on reading and spelling performance--particularly fluency--employing widespread computer-based screening to partially make up for expert-personnel shortage. PMID:17274545

  8. Implementing low-dose computed tomography screening for lung cancer in Canada: implications of alternative at-risk populations, screening frequency, and duration

    PubMed Central

    Evans, W.K.; Flanagan, W.M.; Miller, A.B.; Goffin, J.R.; Memon, S.; Fitzgerald, N.; Wolfson, M.C.

    2016-01-01

    Background Low-dose computed tomography (LDCT) screening has been shown to reduce mortality from lung cancer; however, the optimal screening duration and “at risk” population are not known. Methods The Cancer Risk Management Model developed by Statistics Canada for the Canadian Partnership Against Cancer includes a lung screening module based on data from the U.S. National Lung Screening Trial (NLST). The base-case scenario reproduces NLST outcomes with high fidelity. The impact in Canada of annual screening on the number of incident cases and life-years gained, with a wider range of age and smoking history eligibility criteria and varied participation rates, was modelled to show the magnitude of clinical benefit nationally and by province. Life-years gained, costs (discounted and undiscounted), and resource requirements were also estimated. Results In 2014, 1.4 million Canadians were eligible for screening according to NLST criteria. Over 10 years, screening would detect 12,500 more lung cancers than the expected 268,300 and would gain 9200 life-years. The computed tomography imaging requirement of 24,000–30,000 scans at program initiation would rise to between 87,000 and 113,000 by the 5th year of an annual NLST-like screening program. Costs would increase from approximately $75 million to $128 million at 10 years, and the cumulative cost nationally over 10 years would approach $1 billion, partially offset by a reduction in the costs of managing advanced lung cancer. Conclusions Modelling various ways in which LDCT might be implemented provides decision-makers with estimates of the effect on clinical benefit and on resource needs that clinical trial results are unable to provide. PMID:27330355
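
    A back-of-the-envelope check on the national figures quoted above; this ignores the stated offset from avoided advanced-cancer treatment costs and any discounting, so it is only a rough upper bound on the cost per life-year.

    ```python
    cumulative_cost = 1.0e9       # CAD, "approach $1 billion" over 10 years
    life_years_gained = 9_200
    print(round(cumulative_cost / life_years_gained))   # ~109,000 CAD per life-year gained
    ```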

  9. JAC2D: A two-dimensional finite element computer program for the nonlinear quasi-static response of solids with the conjugate gradient method; Yucca Mountain Site Characterization Project

    SciTech Connect

    Biffle, J.H.; Blanford, M.L.

    1994-05-01

    JAC2D is a two-dimensional finite element program designed to solve quasi-static nonlinear mechanics problems. A set of continuum equations describes the nonlinear mechanics involving large rotation and strain. A nonlinear conjugate gradient method is used to solve the equations. The method is implemented in a two-dimensional setting with various methods for accelerating convergence. Sliding interface logic is also implemented. A four-node Lagrangian uniform strain element is used with hourglass stiffness to control the zero-energy modes. This report documents the elastic and isothermal elastic/plastic material model. Other material models, documented elsewhere, are also available. The program is vectorized for efficient performance on Cray computers. Sample problems described are the bending of a thin beam, the rotation of a unit cube, and the pressurization and thermal loading of a hollow sphere.

  10. Identification of Serine Conformers by Matrix-Isolation IR Spectroscopy Aided by Near-Infrared Laser-Induced Conformational Change, 2D Correlation Analysis, and Quantum Mechanical Anharmonic Computations.

    PubMed

    Najbauer, Eszter E; Bazsó, Gábor; Apóstolo, Rui; Fausto, Rui; Biczysko, Malgorzata; Barone, Vincenzo; Tarczay, György

    2015-08-20

    The conformers of α-serine were investigated by matrix-isolation IR spectroscopy combined with NIR laser irradiation. This method, aided by 2D correlation analysis, enabled unambiguously grouping the spectral lines to individual conformers. On the basis of comparison of at least nine experimentally observed vibrational transitions of each conformer with empirically scaled (SQM) and anharmonic (GVPT2) computed IR spectra, six conformers were identified. In addition, the presence of at least one more conformer in Ar matrix was proved, and a short-lived conformer with a half-life of (3.7 ± 0.5) × 10^3 s in N2 matrix was generated by NIR irradiation. The analysis of the NIR laser-induced conversions revealed that the excitation of the stretching overtone of both the side chain and the carboxylic OH groups can effectively promote conformational changes, but remarkably different paths were observed for the two kinds of excitations. PMID:26201050

  11. Redesign of a computerized clinical reminder for colorectal cancer screening: a human-computer interaction evaluation

    PubMed Central

    2011-01-01

    Background Based on barriers to the use of computerized clinical decision support (CDS) learned in an earlier field study, we prototyped design enhancements to the Veterans Health Administration's (VHA's) colorectal cancer (CRC) screening clinical reminder to compare against the VHA's current CRC reminder. Methods In a controlled simulation experiment, 12 primary care providers (PCPs) used prototypes of the current and redesigned CRC screening reminder in a within-subject comparison. Quantitative measurements were based on a usability survey, workload assessment instrument, and workflow integration survey. We also collected qualitative data on both designs. Results Design enhancements to the VHA's existing CRC screening clinical reminder positively impacted aspects of usability and workflow integration but not workload. The qualitative analysis revealed broad support across participants for the design enhancements with specific suggestions for improving the reminder further. Conclusions This study demonstrates the value of a human-computer interaction evaluation in informing the redesign of information tools to foster uptake, integration into workflow, and use in clinical practice. PMID:22126324

  12. Computer-administration of questionnaires: a health screening system (HSS) developed for veterans.

    PubMed

    Kovera, C A; Anger, W K; Campbell, K A; Binder, L M; Storzbach, D; Davis, K L; Rohlman, D S

    1996-01-01

    The introduction of microcomputers in psychological research has spawned a burgeoning number of tests of psychological or behavioral function, but few computerized systems for administering questionnaires have been developed. A Health Screening System (HSS) is described that combines the benefits of the paper-and-pencil format (e.g., convenient navigation within test questions) and the added benefits of computer-implementation (e.g., efficiency, automated scoring). The HSS features: a) appealing test appearance (e.g., text in large-size fonts, color backgrounds); b) clear wording of tests and instructions (identical wording as original tests except when clarity is served by changes); c) limiting need for Examiner-Subject interaction (e.g., continuously available on-line training, navigation within test questions, answer review capability, durable 9-button response unit); d) options (e.g., question skipping, spoken instructions, test questions, and answers on command); e) modification capabilities (e.g., color, text, test layout editing, control of test order, automated breaks, addition of tests to system); and f) extras (e.g., kernel of main instruction on each test screen, digitized video, audio message from Examiner in training, copyright notification on each screen, raw and summary data outputs in spreadsheet format). Ten HSS tests were administered to 22 US military veterans, who took slightly longer to complete them than did 10 veterans who were administered the same tests in their original paper-and-pencil format. User reaction to the computerized HSS was positive. PMID:8866546

  13. ATARiS: Computational quantification of gene suppression phenotypes from multisample RNAi screens

    PubMed Central

    Shao, Diane D.; Tsherniak, Aviad; Gopal, Shuba; Weir, Barbara A.; Tamayo, Pablo; Stransky, Nicolas; Schumacher, Steven E.; Zack, Travis I.; Beroukhim, Rameen; Garraway, Levi A.; Margolin, Adam A.; Root, David E.; Hahn, William C.; Mesirov, Jill P.

    2013-01-01

    Genome-scale RNAi libraries enable the systematic interrogation of gene function. However, the interpretation of RNAi screens is complicated by the observation that RNAi reagents designed to suppress the mRNA transcripts of the same gene often produce a spectrum of phenotypic outcomes due to differential on-target gene suppression or perturbation of off-target transcripts. Here we present a computational method, Analytic Technique for Assessment of RNAi by Similarity (ATARiS), that takes advantage of patterns in RNAi data across multiple samples in order to enrich for RNAi reagents whose phenotypic effects relate to suppression of their intended targets. By summarizing only such reagent effects for each gene, ATARiS produces quantitative, gene-level phenotype values, which provide an intuitive measure of the effect of gene suppression in each sample. This method is robust for data sets that contain as few as 10 samples and can be used to analyze screens of any number of targeted genes. We used this analytic approach to interrogate RNAi data derived from screening more than 100 human cancer cell lines and identified HNF1B as a transforming oncogene required for the survival of cancer cells that harbor HNF1B amplifications. ATARiS is publicly available at http://broadinstitute.org/ataris. PMID:23269662

  14. Attitudes and Beliefs of Primary Care Providers in New Mexico About Lung Cancer Screening Using Low-Dose Computed Tomography

    PubMed Central

    Hoffman, Richard M.; Sussman, Andrew L.; Getrich, Christina M.; Rhyne, Robert L.; Crowell, Richard E.; Taylor, Kathryn L.; Reifler, Ellen J.; Wescott, Pamela H.; Murrietta, Ambroshia M.; Saeed, Ali I.

    2015-01-01

    Introduction On the basis of results from the National Lung Screening Trial (NLST), national guidelines now recommend using low-dose computed tomography (LDCT) to screen high-risk smokers for lung cancer. Our study objective was to characterize the knowledge, attitudes, and beliefs of primary care providers about implementing LDCT screening. Methods We conducted semistructured interviews with primary care providers practicing in New Mexico clinics for underserved minority populations. The interviews, conducted from February through September 2014, focused on providers’ tobacco cessation efforts, lung cancer screening practices, perceptions of NLST and screening guidelines, and attitudes about informed decision making for cancer screening. Investigators iteratively reviewed transcripts to create a coding structure. Results We reached thematic saturation after interviewing 10 providers practicing in 6 urban and 4 rural settings; 8 practiced at federally qualified health centers. All 10 providers promoted smoking cessation, some screened with chest x-rays, and none screened with LDCT. Not all were aware of NLST results or current guideline recommendations. Providers viewed study results skeptically, particularly the 95% false-positive rate, the need to screen 320 patients to prevent 1 lung cancer death, and the small proportion of minority participants. Providers were uncertain whether New Mexico had the necessary infrastructure to support high-quality screening, and worried about access barriers and financial burdens for rural, underinsured populations. Providers noted the complexity of discussing benefits and harms of screening and surveillance with their patient population. Conclusion Providers have several concerns about the feasibility and appropriateness of implementing LDCT screening. Effective lung cancer screening programs will need to educate providers and patients to support informed decision making and to ensure that high-quality screening can be

  15. High divergent 2D grating

    NASA Astrophysics Data System (ADS)

    Wang, Jin; Ma, Jianyong; Zhou, Changhe

    2014-11-01

    A 3×3 high divergent 2D-grating with a period of 3.842 μm at a wavelength of 850 nm under normal incidence is designed and fabricated in this paper. This high divergent 2D-grating is designed by the vector theory. The Rigorous Coupled Wave Analysis (RCWA) in association with simulated annealing (SA) is adopted to calculate and optimize this 2D-grating. The properties of this grating are also investigated by the RCWA. The diffraction angles are more than 10 degrees in the whole wavelength band, which are larger than those of traditional 2D-gratings. In addition, the small period of the grating increases the difficulty of fabrication. So we fabricate the 2D-gratings by direct laser writing (DLW) instead of traditional manufacturing methods. Then the method of ICP etching is used to obtain the high divergent 2D-grating.
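
    The quoted divergence can be checked with the grating equation sin(θ) = mλ/d applied to the 3.842 μm period at 850 nm; the short calculation below is a consistency check, not part of the paper.

    ```python
    import math

    wavelength_um, period_um = 0.850, 3.842
    theta_10 = math.degrees(math.asin(wavelength_um / period_um))                  # (±1, 0) orders
    theta_11 = math.degrees(math.asin(math.sqrt(2) * wavelength_um / period_um))   # (±1, ±1) orders
    print(round(theta_10, 1), round(theta_11, 1))   # ~12.8 and ~18.2 degrees, i.e. > 10 degrees
    ```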

  16. Discovery of new [Formula: see text] proteasome inhibitors using a knowledge-based computational screening approach.

    PubMed

    Mehra, Rukmankesh; Chib, Reena; Munagala, Gurunadham; Yempalla, Kushalava Reddy; Khan, Inshad Ali; Singh, Parvinder Pal; Khan, Farrah Gul; Nargotra, Amit

    2015-11-01

    Mycobacterium tuberculosis bacteria cause deadly infections in patients [Corrected]. The rise of multidrug resistance associated with tuberculosis further makes the situation worse in treating the disease. The M. tuberculosis proteasome is necessary for the pathogenesis of the bacterium and is a validated anti-tubercular target, thus making it an attractive enzyme for designing Mtb inhibitors. In this study, a computational screening approach was applied to identify new proteasome inhibitor candidates from a library of 50,000 compounds. This chemical library was procured from the ChemBridge (20,000 compounds) and the ChemDiv (30,000 compounds) databases. After a detailed analysis of the computational screening results, 50 in silico hits were retrieved and tested in vitro, finding 15 compounds with [Formula: see text] values ranging from 35.32 to 64.15 μM on lysate. A structural analysis of these hits revealed that 14 of these compounds probably have a non-covalent mode of binding to the target and have not been reported for anti-tubercular or anti-proteasome activity. The binding interactions of all the 14 protein-inhibitor complexes were analyzed using molecular docking studies. Further, molecular dynamics simulations of the protein in complex with the two most promising hits were carried out so as to identify the key interactions and validate the structural stability. PMID:26232029

  17. ChemScreener: A Distributed Computing Tool for Scaffold based Virtual Screening.

    PubMed

    Karthikeyan, Muthukumarasamy; Pandit, Deepak; Vyas, Renu

    2015-01-01

    In this work we present ChemScreener, a Java-based application to perform virtual library generation combined with virtual screening in a platform-independent distributed computing environment. ChemScreener comprises a scaffold identifier, a distinct scaffold extractor, an interactive virtual library generator as well as a virtual screening module for subsequently selecting putative bioactive molecules. The virtual libraries are annotated with chemophore-, pharmacophore- and toxicophore-based information for compound prioritization. The hits selected can then be further processed using QSAR, docking and other in silico approaches which can all be interfaced within the ChemScreener framework. As a sample application, in this work scaffold selectivity, diversity, connectivity and promiscuity towards six important therapeutic classes have been studied. In order to illustrate the computational power of the application, 55 scaffolds extracted from 161 anti-psychotic compounds were enumerated to produce a virtual library comprising 118 million compounds (17 GB) and annotated with chemophore, pharmacophore and toxicophore based features in a single step which would be non-trivial to perform with many standard software tools today on libraries of this size. PMID:26138574

  18. Computational reverse chemical ecology: Virtual screening and predicting behaviorally active semiochemicals for Bactrocera dorsalis

    PubMed Central

    2014-01-01

    Background Semiochemical is a generic term used for a chemical substance that influences the behaviour of an organism. It is a common term used in the field of chemical ecology to encompass pheromones, allomones, kairomones, attractants and repellents. Insects have mastered the art of using semiochemicals as communication signals and rely on them to find mates, hosts or habitats. This dependency of insects on semiochemicals has allowed chemical ecologists to develop environment-friendly pest management strategies. However, discovering semiochemicals is a laborious process that involves a plethora of behavioural and analytical techniques, making it extremely time consuming. Recently, a reverse chemical ecology approach using odorant binding proteins (OBPs) as targets for elucidating behaviourally active compounds has been gaining eminence. In this scenario, we describe a “computational reverse chemical ecology” approach for rapid screening of potential semiochemicals. Results We illustrate the high prediction accuracy of our computational method. We screened 25 semiochemicals for their binding potential to a GOBP of B. dorsalis using molecular docking (in silico) and molecular dynamics. In parallel, compounds were subjected to fluorescent quenching assays (experimental). The correlation between in silico and experimental data was significant (r2 = 0.9408; P < 0.0001). Further, predicted compounds were subjected to behavioral bioassays and were found to be highly attractive to insects. Conclusions The present study provides a unique methodology for rapid screening and predicting behaviorally active semiochemicals. This methodology may be developed as a viable approach for prospecting active semiochemicals for pest control, which otherwise is a laborious process. PMID:24640964

  19. Estimating development cost for a tailored interactive computer program to enhance colorectal cancer screening compliance.

    PubMed

    Lairson, David R; Chang, Yu-Chia; Bettencourt, Judith L; Vernon, Sally W; Greisinger, Anthony

    2006-01-01

    The authors used an actual-work estimate method to estimate the cost of developing a tailored interactive computer education program to improve compliance with colorectal cancer screening guidelines in a large multi-specialty group medical practice. Resource use was prospectively collected from time logs, administrative records, and a design and computing subcontract. Sensitivity analysis was performed to examine the uncertainty of the overhead cost rate and other parameters. The cost of developing the system was $328,866. The development cost was $52.79 per patient when amortized over a 7-year period with a cohort of 1,000 persons. About 20% of the cost was incurred in defining the theoretic framework and supporting literature, constructing the variables and survey, and conducting focus groups. About 41% of the cost was for developing the messages, algorithms, and constructing program elements, and the remaining cost was to create and test the computer education program. About 69% of the cost was attributable to personnel expenses. Development cost is rarely estimated but is important for feasibility studies and ex-ante economic evaluations of alternative interventions. The findings from this study may aid decision makers in planning, assessing, budgeting, and pricing development of tailored interactive computer-based interventions. PMID:16799126

  20. Estimating Development Cost for a Tailored Interactive Computer Program to Enhance Colorectal Cancer Screening Compliance

    PubMed Central

    Lairson, David R.; Chang, Yu-Chia; Bettencourt, Judith L.; Vernon, Sally W.; Greisinger, Anthony

    2006-01-01

    The authors used an actual-work estimate method to estimate the cost of developing a tailored interactive computer education program to improve compliance with colorectal cancer screening guidelines in a large multi-specialty group medical practice. Resource use was prospectively collected from time logs, administrative records, and a design and computing subcontract. Sensitivity analysis was performed to examine the uncertainty of the overhead cost rate and other parameters. The cost of developing the system was $328,866. The development cost was $52.79 per patient when amortized over a 7-year period with a cohort of 1,000 persons. About 20% of the cost was incurred in defining the theoretic framework and supporting literature, constructing the variables and survey, and conducting focus groups. About 41% of the cost was for developing the messages, algorithms, and constructing program elements, and the remaining cost was to create and test the computer education program. About 69% of the cost was attributable to personnel expenses. Development cost is rarely estimated but is important for feasibility studies and ex-ante economic evaluations of alternative interventions. The findings from this study may aid decision makers in planning, assessing, budgeting, and pricing development of tailored interactive computer-based interventions. PMID:16799126
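
    The per-patient figure in the two records above follows from amortizing the development cost with a capital-recovery factor. The abstracts do not state the discount rate; an assumed 3% annual rate reproduces the reported $52.79.

    ```python
    principal, years, cohort, rate = 328_866, 7, 1_000, 0.03         # 3% rate is an assumption
    annual_payment = principal * rate / (1 - (1 + rate) ** -years)   # capital-recovery factor
    print(round(annual_payment / cohort, 2))   # ~52.79 dollars per patient per year
    ```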

  1. 2D microwave imaging reflectometer electronics

    SciTech Connect

    Spear, A. G.; Domier, C. W.; Hu, X.; Muscatello, C. M.; Ren, X.; Luhmann, N. C.; Tobias, B. J.

    2014-11-15

    A 2D microwave imaging reflectometer system has been developed to visualize electron density fluctuations on the DIII-D tokamak. Simultaneously illuminated at four probe frequencies, large aperture optics image reflections from four density-dependent cutoff surfaces in the plasma over an extended region of the DIII-D plasma. Localized density fluctuations in the vicinity of the plasma cutoff surfaces modulate the plasma reflections, yielding a 2D image of electron density fluctuations. Details are presented of the receiver down conversion electronics that generate the in-phase (I) and quadrature (Q) reflectometer signals from which 2D density fluctuation data are obtained. Also presented are details on the control system and backplane used to manage the electronics as well as an introduction to the computer based control program.

  2. 2D microwave imaging reflectometer electronics

    NASA Astrophysics Data System (ADS)

    Spear, A. G.; Domier, C. W.; Hu, X.; Muscatello, C. M.; Ren, X.; Tobias, B. J.; Luhmann, N. C.

    2014-11-01

    A 2D microwave imaging reflectometer system has been developed to visualize electron density fluctuations on the DIII-D tokamak. Simultaneously illuminated at four probe frequencies, large aperture optics image reflections from four density-dependent cutoff surfaces in the plasma over an extended region of the DIII-D plasma. Localized density fluctuations in the vicinity of the plasma cutoff surfaces modulate the plasma reflections, yielding a 2D image of electron density fluctuations. Details are presented of the receiver down conversion electronics that generate the in-phase (I) and quadrature (Q) reflectometer signals from which 2D density fluctuation data are obtained. Also presented are details on the control system and backplane used to manage the electronics as well as an introduction to the computer based control program.

  3. 2D microwave imaging reflectometer electronics.

    PubMed

    Spear, A G; Domier, C W; Hu, X; Muscatello, C M; Ren, X; Tobias, B J; Luhmann, N C

    2014-11-01

    A 2D microwave imaging reflectometer system has been developed to visualize electron density fluctuations on the DIII-D tokamak. Simultaneously illuminated at four probe frequencies, large aperture optics image reflections from four density-dependent cutoff surfaces in the plasma over an extended region of the DIII-D plasma. Localized density fluctuations in the vicinity of the plasma cutoff surfaces modulate the plasma reflections, yielding a 2D image of electron density fluctuations. Details are presented of the receiver down conversion electronics that generate the in-phase (I) and quadrature (Q) reflectometer signals from which 2D density fluctuation data are obtained. Also presented are details on the control system and backplane used to manage the electronics as well as an introduction to the computer based control program. PMID:25430247
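
    A minimal sketch of how the in-phase/quadrature pairs produced by the down-conversion electronics are typically turned into fluctuation data: the complex signal I + jQ gives an amplitude and a phase whose variation tracks the reflecting cutoff layer. This is a generic illustration, not the DIII-D processing chain.

    ```python
    import numpy as np

    def iq_to_fluctuations(i_signal, q_signal):
        """Convert sampled I/Q time series (per imaging channel) to amplitude and phase."""
        reflection = i_signal + 1j * q_signal
        amplitude = np.abs(reflection)
        phase = np.unwrap(np.angle(reflection), axis=-1)   # unwrap along the time axis
        return amplitude, phase
    ```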

  4. MOLA: a bootable, self-configuring system for virtual screening using AutoDock4/Vina on computer clusters

    PubMed Central

    2010-01-01

    Background Virtual screening of small molecules using molecular docking has become an important tool in drug discovery. However, large scale virtual screening is time demanding and usually requires dedicated computer clusters. There are a number of software tools that perform virtual screening using AutoDock4 but they require access to dedicated Linux computer clusters. Also, no software is available for performing virtual screening with Vina using computer clusters. In this paper we present MOLA, an easy-to-use graphical user interface tool that automates parallel virtual screening using AutoDock4 and/or Vina in bootable non-dedicated computer clusters. Implementation MOLA automates several tasks including: ligand preparation, parallel AutoDock4/Vina job distribution and result analysis. When the virtual screening project finishes, an OpenOffice spreadsheet file opens with the ligands ranked by binding energy and distance to the active site. All results files can automatically be recorded on a USB flash drive or on the hard-disk drive using VirtualBox. MOLA works inside a customized Live CD GNU/Linux operating system, developed by us, that bypasses the original operating system installed on the computers used in the cluster. This operating system boots from a CD on the master node and then clusters other computers as slave nodes via ethernet connections. Conclusion MOLA is an ideal virtual screening tool for non-experienced users, with a limited number of multi-platform heterogeneous computers available and no access to dedicated Linux computer clusters. When a virtual screening project finishes, the computers can just be restarted to their original operating system. The originality of MOLA lies in the fact that any platform-independent computer available can be added to the cluster, without ever using the computer hard-disk drive and without interfering with the installed operating system. With a cluster of 10 processors, and a potential maximum speed-up of 10x
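
    Not MOLA itself, but a sketch of the kind of per-ligand AutoDock Vina job it distributes across cluster nodes, using standard Vina command-line options; the receptor, ligand paths, and search box are placeholders.

    ```python
    import subprocess
    from concurrent.futures import ThreadPoolExecutor
    from pathlib import Path

    def dock_one(ligand: Path) -> None:
        """Run a single Vina docking job for one prepared ligand (PDBQT)."""
        subprocess.run([
            "vina",
            "--receptor", "receptor.pdbqt",
            "--ligand", str(ligand),
            "--center_x", "10.0", "--center_y", "12.0", "--center_z", "5.0",
            "--size_x", "20", "--size_y", "20", "--size_z", "20",
            "--out", str(ligand.with_suffix(".docked.pdbqt")),
            "--cpu", "1",
        ], check=True)

    ligands = sorted(Path("ligands").glob("*.pdbqt"))
    with ThreadPoolExecutor(max_workers=4) as pool:   # one worker per available core/node slot
        list(pool.map(dock_one, ligands))
    ```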

  5. Short-Term Outcomes of Screening Mammography Using Computer-Aided Detection

    PubMed Central

    Fenton, Joshua J.; Xing, Guibo; Elmore, Joann G.; Bang, Heejung; Chen, Steven L.; Lindfors, Karen K.; Baldwin, Laura-Mae

    2013-01-01

    Background Computer-aided detection (CAD) has rapidly diffused into screening mammography practice despite limited and conflicting data on its clinical effect. Objective To determine associations between CAD use during screening mammography and the incidence of ductal carcinoma in situ (DCIS) and invasive breast cancer, invasive cancer stage, and diagnostic testing. Design Retrospective cohort study. Setting Medicare program. Participants Women aged 67 to 89 years having screening mammography between 2001 and 2006 in U.S. SEER (Surveillance, Epidemiology and End Results) regions (409 459 mammograms from 163 099 women). Measurements Incident DCIS and invasive breast cancer within 1 year after mammography, invasive cancer stage, and diagnostic testing within 90 days after screening among women without breast cancer. Results From 2001 to 2006, CAD prevalence increased from 3.6% to 60.5%. Use of CAD was associated with greater DCIS incidence (adjusted odds ratio [OR], 1.17 [95% CI, 1.11 to 1.23]) but no difference in invasive breast cancer incidence (adjusted OR, 1.00 [CI, 0.97 to 1.03]). Among women with invasive cancer, CAD was associated with greater likelihood of stage I to II versus III to IV cancer (adjusted OR, 1.27 [CI, 1.14 to 1.41]). In women without breast cancer, CAD was associated with increased odds of diagnostic mammography (adjusted OR, 1.28 [CI, 1.27 to 1.29]), breast ultrasonography (adjusted OR, 1.07 [CI, 1.06 to 1.09]), and breast biopsy (adjusted OR, 1.10 [CI, 1.08 to 1.12]). Limitation Short follow-up for cancer stage, potential unmeasured confounding, and uncertain generalizability to younger women. Conclusion Use of CAD during screening mammography among Medicare enrollees is associated with increased DCIS incidence, the diagnosis of invasive breast cancer at earlier stages, and increased diagnostic testing among women without breast cancer. Primary Funding Source Center for Healthcare Policy and Research, University of California, Davis. PMID

  6. Randomized Trial of A Lay Health Advisor and Computer Intervention to Increase Mammography Screening in African American Women

    PubMed Central

    Russell, Kathleen M.; Champion, Victoria L.; Monahan, Patrick O.; Millon-Underwood, Sandra; Zhao, Qianqian; Spacey, Nicole; Rush, Nathan L.; Paskett, Electra D.

    2009-01-01

    Background Low-income African American women face numerous barriers to mammography screening. We tested the efficacy of a combined interactive computer program and lay health advisor (LHA) intervention to increase mammography screening. Methods In this randomized, single blind study, participants were 181 African American female health center patients ages 41-75, ≤250% of poverty level with no breast cancer history and no screening mammogram in the past 15 months. They were assigned to either (a) a low dose comparison group consisting of a culturally appropriate mammography screening pamphlet or (b) interactive, tailored computer instruction at baseline and 4 monthly LHA counseling sessions. Self-reported screening data were collected at baseline and 6 months and verified by medical record. Results For intent-to-treat analysis of primary outcome (medical-record-verified mammography screening, available on all but two participants), the intervention group had increased screening to 51% (45/89) compared to 18% (16/90) for the comparison group at 6 months. When adjusted for employment status, disability, first-degree relatives with breast cancer, health insurance, and previous breast biopsies, the intervention group was three times more likely (adjusted relative risk [RR]=2.7 [95% CI: 1.8, 3.7], p<.0001) to get screened than the low dose comparison group. Similar results were found for self-reported mammography stage of screening adoption. Conclusions The combined intervention was efficacious in improving mammography screening in low-income African American women, with an unadjusted effect size (RR = 2.84) significantly higher (p < .05) than previous studies of each intervention alone. PMID:20056639

  7. Beyond lung cancer: a strategic approach to interpreting screening computed tomography scans on the basis of mortality data from the National Lung Screening Trial.

    PubMed

    Chiles, Caroline; Paul, Narinder S

    2013-11-01

    Low-dose computed tomography screening in older patients with a heavy-smoking history can be viewed as an opportunity to screen for smoking-related illnesses and not just for lung cancer. Within the National Lung Screening Trial, 24.1% of all deaths were attributed to lung cancer, but there were significant competing causes of mortality in this patient population. Cardiovascular illness caused 24.8% of deaths. Other neoplasms were listed as the cause of death in 22.3%, and respiratory illness was the cause of death in 10.4%. All of these illnesses might be attributed to smoking. Low-dose computed tomography of the thorax may provide information about these diseases, which could be used to guide therapeutic intervention and, hopefully, alter the courses of these diseases. Information about coronary artery calcification, chronic obstructive pulmonary disease, and potential extrapulmonary malignancy should be provided in the report of the screening examination. This must be balanced against the risk of the burden of false-positive findings and the costs, both psychological and financial, associated with additional investigative evaluations. PMID:24071622

  8. The Role of Screening Sinus Computed Tomography in Pediatric Hematopoietic Stem Cell Transplant Patients

    PubMed Central

    Zamora, Carlos A.; Oppenheimer, Avi G.; Dave, Hema; Symons, Heather; Huisman, Thierry A. G. M.; Izbudak, Izlem

    2015-01-01

    Objective The objective of this study was to evaluate pretransplant sinus computed tomography (CT) as a predictor of post–hematopoietic stem cell transplant sinusitis. Methods We evaluated pretransplant and posttransplant CT findings in 100 children using the Lund-Mackay system and “common-practice” radiology reporting and correlated these with the presence of acute sinusitis. Results Fourteen percent of patients with normal screening CT developed posttransplant sinusitis, compared with 23% of those with radiographic abnormalities and 22% of those with clinical sinusitis alone; these differences were not statistically significant. Sensitivity of CT findings for clinical sinusitis ranged between 19% and 56%. Except for mucosal thickening (71% specificity), other findings had high specificity, between 92% and 97%, particularly when combined. A Lund-Mackay score change of 10 or greater from baseline was associated with a 2.8-fold increased likelihood of having sinusitis (P < 0.001). Conclusions Screening CT can serve as a baseline, with a Lund-Mackay score change of 10 or greater constituting a significant threshold. The strongest correlation with the presence of acute sinusitis was seen with combined CT findings. PMID:25474147

  9. A Computational Screen for Regulators of Oxidative Phosphorylation Implicates SLIRP in Mitochondrial RNA Homeostasis

    PubMed Central

    Baughman, Joshua M.; Nilsson, Roland; Gohil, Vishal M.; Arlow, Daniel H.; Gauhar, Zareen; Mootha, Vamsi K.

    2009-01-01

    The human oxidative phosphorylation (OxPhos) system consists of approximately 90 proteins encoded by nuclear and mitochondrial genomes and serves as the primary cellular pathway for ATP biosynthesis. While the core protein machinery for OxPhos is well characterized, many of its assembly, maturation, and regulatory factors remain unknown. We exploited the tight transcriptional control of the genes encoding the core OxPhos machinery to identify novel regulators. We developed a computational procedure, which we call expression screening, which integrates information from thousands of microarray data sets in a principled manner to identify genes that are consistently co-expressed with a target pathway across biological contexts. We applied expression screening to predict dozens of novel regulators of OxPhos. For two candidate genes, CHCHD2 and SLIRP, we show that silencing with RNAi results in destabilization of OxPhos complexes and a marked loss of OxPhos enzymatic activity. Moreover, we show that SLIRP plays an essential role in maintaining mitochondrial-localized mRNA transcripts that encode OxPhos protein subunits. Our findings provide a catalogue of potential novel OxPhos regulators that advance our understanding of the coordination between nuclear and mitochondrial genomes for the regulation of cellular energy metabolism. PMID:19680543
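
    The core idea of expression screening, scoring genes by how consistently they co-express with a target pathway across many independent datasets, can be sketched roughly as below. This toy version uses a plain average of Pearson correlations on random stand-in data; the published procedure integrates the datasets in a more principled statistical manner.

```python
# Toy sketch of "expression screening": rank genes by how consistently they
# co-express with a target pathway across many expression datasets.
# Data layout (genes x samples per dataset) and gene names are hypothetical.
import numpy as np

def coexpression_scores(datasets, gene_names, target_genes):
    """Average, across datasets, each gene's correlation with the mean
    expression profile of the target pathway."""
    target_idx = [gene_names.index(g) for g in target_genes]
    per_dataset = []
    for X in datasets:                          # X: (n_genes, n_samples), normalized
        pathway_profile = X[target_idx].mean(axis=0)
        # Pearson correlation of every gene with the pathway profile
        Xc = X - X.mean(axis=1, keepdims=True)
        pc = pathway_profile - pathway_profile.mean()
        corr = (Xc @ pc) / (np.linalg.norm(Xc, axis=1) * np.linalg.norm(pc) + 1e-12)
        per_dataset.append(corr)
    return np.mean(per_dataset, axis=0)         # consistency across contexts

# Example with random data standing in for real microarrays
rng = np.random.default_rng(0)
genes = [f"g{i}" for i in range(500)]
data = [rng.standard_normal((500, 40)) for _ in range(5)]
scores = coexpression_scores(data, genes, target_genes=["g1", "g2", "g3"])
top = sorted(zip(genes, scores), key=lambda t: -t[1])[:10]
print(top)
```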

  10. A computational design approach for virtual screening of peptide interactions across K+ channel families☆

    PubMed Central

    Doupnik, Craig A.; Parra, Katherine C.; Guida, Wayne C.

    2014-01-01

    Ion channels represent a large family of membrane proteins with many being well established targets in pharmacotherapy. The ‘druggability’ of heteromeric channels comprised of different subunits remains obscure, due largely to a lack of channel-specific probes necessary to delineate their therapeutic potential in vivo. Our initial studies, reported here, investigated the family of inwardly rectifying potassium (Kir) channels given the availability of high resolution crystal structures for the eukaryotic constitutively active Kir2.2 channel. We describe a ‘limited’ homology modeling approach that can yield chimeric Kir channels having an outer vestibule structure representing nearly any known vertebrate or invertebrate channel. These computationally-derived channel structures were tested in silico for ‘docking’ to NMR structures of tertiapin (TPN), a 21 amino acid peptide found in bee venom. TPN is a highly selective and potent blocker for the epithelial rat Kir1.1 channel, but does not block human or zebrafish Kir1.1 channel isoforms. Our Kir1.1 channel-TPN docking experiments recapitulated published in vitro findings for TPN-sensitive and TPN-insensitive channels. Additionally, in silico site-directed mutagenesis identified ‘hot spots’ within the channel outer vestibule that mediate energetically favorable docking scores and correlate with sites previously identified with in vitro thermodynamic mutant-cycle analysis. These ‘proof-of-principle’ results establish a framework for virtual screening of re-engineered peptide toxins for interactions with computationally derived Kir channels that currently lack channel-specific blockers. When coupled with electrophysiological validation, this virtual screening approach may accelerate the drug discovery process, and can be readily applied to other ion channel families where high resolution structures are available. PMID:25709757

  11. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    NASA Astrophysics Data System (ADS)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    The first stage of production of any oil reservoir involves oil displacement by natural drive mechanisms such as solution gas drive, gas cap drive and gravity drainage. Typically, improved oil recovery (IOR) methods are applied to oil reservoirs that have been depleted naturally. In more recent years, IOR techniques are applied to reservoirs even before their natural energy drive is exhausted by primary depletion. Descriptive screening criteria for IOR methods are used to select the appropriate recovery technique according to the fluid and rock properties. This methodology helps in assessing the most suitable recovery process for field deployment of a candidate reservoir. However, the already published screening guidelines neither provide information about the expected reservoir performance nor suggest a set of project design parameters, which can be used towards the optimization of the process. In this study, artificial neural networks (ANN) are used to build a high-performance neuro-simulation tool for screening different improved oil recovery techniques: miscible injection (CO2 and N2), waterflooding and steam injection processes. The simulation tool consists of proxy models that implement a multilayer cascade feedforward back propagation network algorithm. The tool is intended to narrow the ranges of possible scenarios to be modeled using conventional simulation, reducing the extensive time and energy spent in dynamic reservoir modeling. A commercial reservoir simulator is used to generate the data to train and validate the artificial neural networks. The proxy models are built considering four different well patterns with different well operating conditions as the field design parameters. Different expert systems are developed for each well pattern. The screening networks predict oil production rate and cumulative oil production profiles for a given set of rock and fluid properties, and design parameters. The results of this study show that the networks are
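
    A minimal stand-in for such a proxy model is sketched below: a small feedforward network (scikit-learn's MLPRegressor rather than the cascade feedforward backpropagation architecture described) fitted to synthetic reservoir descriptors and a synthetic production response. The feature set, value ranges, and response function are invented for illustration only.

```python
# Minimal proxy-model sketch: train a feedforward network to map reservoir
# descriptors to a production response. Features and data are illustrative;
# the study trained cascade feedforward networks on commercial-simulator output.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Hypothetical inputs: porosity, permeability (mD), oil viscosity (cP),
# net pay (ft), injection rate (bbl/day)
X = rng.uniform([0.05, 1, 0.5, 10, 500], [0.35, 2000, 50, 200, 5000], size=(2000, 5))
# Synthetic stand-in for simulator-generated cumulative oil production
y = 1e4 * X[:, 0] * np.log1p(X[:, 1]) * X[:, 3] / (1 + X[:, 2]) + rng.normal(0, 50, 2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
scaler = StandardScaler().fit(X_tr)

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
model.fit(scaler.transform(X_tr), y_tr)
print("R^2 on held-out scenarios:", model.score(scaler.transform(X_te), y_te))
```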

  12. AnisWave 2D

    2004-08-01

    AnisWave2D is a 2D finite-difference code for simulating seismic wave propagation in fully anisotropic materials. The code is implemented to run in parallel over multiple processors and is fully portable. A mesh refinement algorithm has been utilized to allow the grid spacing to be tailored to the velocity model, avoiding the over-sampling of high-velocity materials that usually occurs in fixed-grid schemes.
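
    For orientation, the sketch below shows the explicit time-stepping idea behind finite-difference wave propagation on a fixed grid, reduced to the isotropic scalar (acoustic) case. AnisWave2D itself solves the fully anisotropic elastic problem in parallel with mesh refinement; the velocity model, source, and grid parameters here are arbitrary.

```python
# Bare-bones 2D acoustic finite-difference wave propagation (isotropic scalar case),
# illustrating explicit time stepping on a fixed grid only.
import numpy as np

nx, nz, nt = 200, 200, 500
dx, dt = 5.0, 5e-4                 # grid spacing (m), time step (s); CFL-stable here
c = np.full((nz, nx), 2000.0)      # velocity model (m/s); uniform for simplicity
c[100:, :] = 3500.0                # a faster half-space

p_prev = np.zeros((nz, nx))
p = np.zeros((nz, nx))
src_z, src_x = 50, 100

for it in range(nt):
    lap = np.zeros_like(p)
    lap[1:-1, 1:-1] = (p[2:, 1:-1] + p[:-2, 1:-1] + p[1:-1, 2:] + p[1:-1, :-2]
                       - 4.0 * p[1:-1, 1:-1]) / dx**2
    p_next = 2 * p - p_prev + (c * dt) ** 2 * lap
    # Ricker-like point source injected at (src_z, src_x)
    t = it * dt
    arg = (np.pi * 25 * (t - 0.04)) ** 2
    p_next[src_z, src_x] += (1 - 2 * arg) * np.exp(-arg)
    p_prev, p = p, p_next

print("peak absolute pressure on final step:", np.abs(p).max())
```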

  13. Resource Utilization and Costs during the Initial Years of Lung Cancer Screening with Computed Tomography in Canada

    PubMed Central

    Lam, Stephen; Tammemagi, Martin C.; Evans, William K.; Leighl, Natasha B.; Regier, Dean A.; Bolbocean, Corneliu; Shepherd, Frances A.; Tsao, Ming-Sound; Manos, Daria; Liu, Geoffrey; Atkar-Khattra, Sukhinder; Cromwell, Ian; Johnston, Michael R.; Mayo, John R.; McWilliams, Annette; Couture, Christian; English, John C.; Goffin, John; Hwang, David M.; Puksa, Serge; Roberts, Heidi; Tremblay, Alain; MacEachern, Paul; Burrowes, Paul; Bhatia, Rick; Finley, Richard J.; Goss, Glenwood D.; Nicholas, Garth; Seely, Jean M.; Sekhon, Harmanjatinder S.; Yee, John; Amjadi, Kayvan; Cutz, Jean-Claude; Ionescu, Diana N.; Yasufuku, Kazuhiro; Martel, Simon; Soghrati, Kamyar; Sin, Don D.; Tan, Wan C.; Urbanski, Stefan; Xu, Zhaolin; Peacock, Stuart J.

    2014-01-01

    Background: It is estimated that millions of North Americans would qualify for lung cancer screening and that billions of dollars of national health expenditures would be required to support population-based computed tomography lung cancer screening programs. The decision to implement such programs should be informed by data on resource utilization and costs. Methods: Resource utilization data were collected prospectively from 2059 participants in the Pan-Canadian Early Detection of Lung Cancer Study using low-dose computed tomography (LDCT). Participants who had 2% or greater lung cancer risk over 3 years using a risk prediction tool were recruited from seven major cities across Canada. A cost analysis was conducted from the Canadian public payer’s perspective for resources that were used for the screening and treatment of lung cancer in the initial years of the study. Results: The average per-person cost for screening individuals with LDCT was $453 (95% confidence interval [CI], $400–$505) for the initial 18-months of screening following a baseline scan. The screening costs were highly dependent on the detected lung nodule size, presence of cancer, screening intervention, and the screening center. The mean per-person cost of treating lung cancer with curative surgery was $33,344 (95% CI, $31,553–$34,935) over 2 years. This was lower than the cost of treating advanced-stage lung cancer with chemotherapy, radiotherapy, or supportive care alone, ($47,792; 95% CI, $43,254–$52,200; p = 0.061). Conclusion: In the Pan-Canadian study, the average cost to screen individuals with a high risk for developing lung cancer using LDCT and the average initial cost of curative intent treatment were lower than the average per-person cost of treating advanced stage lung cancer which infrequently results in a cure. PMID:25105438

  14. Computer-aided structural engineering (CASE) project: Application of finite-element, grid generation, and scientific visualization techniques to 2-D and 3-d seepage and ground-water modeling. Final report

    SciTech Connect

    Tracy, F.T.

    1991-09-01

    This report describes new advances in the computational modeling of ground water and seepage using the finite element method (FEM) in conjunction with tools and techniques typically used by the aerospace engineers. The unsolved environmental issues regarding our hazardous and toxic waste problems must be resolved, and significant resources must be placed on this effort. Some military bases are contaminated with hazardous waste that has entered the groundwater domain. A groundwater model that takes into account contaminant flow is therefore critical. First, an extension of the technique of generating an orthogonal structured grid (using the Cauchy-Riemann equations) to automatically generate a flow net for two-dimensional (2-D) steady-state seepage problems is presented for various boundary conditions. Second, a complete implementation of a three-dimensional (3-D) seepage package is described where (1) grid generation is accomplished using the EAGLE program, (2) the seepage and groundwater analysis for either confined or unconfined steady-state flow, homogeneous or inhomogeneous media, and isotropic or anisotropic soil is accomplished with no restriction on the FE grid or requirement of an initial guess of the free surface for unconfined flow problems, and (3) scientific visualization is accomplished using the program FAST developed by NASA.

  15. The UK Lung Cancer Screening Trial: a pilot randomised controlled trial of low-dose computed tomography screening for the early detection of lung cancer.

    PubMed Central

    Field, John K; Duffy, Stephen W; Baldwin, David R; Brain, Kate E; Devaraj, Anand; Eisen, Tim; Green, Beverley A; Holemans, John A; Kavanagh, Terry; Kerr, Keith M; Ledson, Martin; Lifford, Kate J; McRonald, Fiona E; Nair, Arjun; Page, Richard D; Parmar, Mahesh Kb; Rintoul, Robert C; Screaton, Nicholas; Wald, Nicholas J; Weller, David; Whynes, David K; Williamson, Paula R; Yadegarfar, Ghasem; Hansell, David M

    2016-01-01

    BACKGROUND Lung cancer kills more people than any other cancer in the UK (5-year survival < 13%). Early diagnosis can save lives. The USA-based National Lung Cancer Screening Trial reported a 20% relative reduction in lung cancer mortality and 6.7% all-cause mortality in low-dose computed tomography (LDCT)-screened subjects. OBJECTIVES To (1) analyse LDCT lung cancer screening in a high-risk UK population, determine optimum recruitment, screening, reading and care pathway strategies; and (2) assess the psychological consequences and the health-economic implications of screening. DESIGN A pilot randomised controlled trial comparing intervention with usual care. A population-based risk questionnaire identified individuals who were at high risk of developing lung cancer (≥ 5% over 5 years). SETTING Thoracic centres with expertise in lung cancer imaging, respiratory medicine, pathology and surgery: Liverpool Heart & Chest Hospital, Merseyside, and Papworth Hospital, Cambridgeshire. PARTICIPANTS Individuals aged 50-75 years, at high risk of lung cancer, in the primary care trusts adjacent to the centres. INTERVENTIONS A thoracic LDCT scan. Follow-up computed tomography (CT) scans as per protocol. Referral to multidisciplinary team clinics was determined by nodule size criteria. MAIN OUTCOME MEASURES Population-based recruitment based on risk stratification; management of the trial through web-based database; optimal characteristics of CT scan readers (radiologists vs. radiographers); characterisation of CT-detected nodules utilising volumetric analysis; prevalence of lung cancer at baseline; sociodemographic factors affecting participation; psychosocial measures (cancer distress, anxiety, depression, decision satisfaction); and cost-effectiveness modelling. RESULTS A total of 247,354 individuals were approached to take part in the trial; 30.7% responded positively to the screening invitation. Recruitment of participants resulted in 2028 in the CT arm and 2027 in

  16. Roton Excitations and the Fluid-Solid Phase Transition in Superfluid 2D Yukawa Bosons

    NASA Astrophysics Data System (ADS)

    Molinelli, S.; Galli, D. E.; Reatto, L.; Motta, M.

    2016-05-01

    We compute several ground-state properties and the dynamical structure factor of a zero-temperature system of bosons interacting with the 2D screened Coulomb (2D-SC) potential. We resort to the exact shadow path integral ground state (SPIGS) quantum Monte Carlo method to compute the imaginary-time correlation function of the model, and to the genetic algorithm via falsification of theories (GIFT) to retrieve the dynamical structure factor. We provide a detailed comparison of ground-state properties and collective excitations of 2D-SC and ⁴He atoms. The roton energy of the 2D-SC system is an increasing function of density, and not a decreasing one as in ⁴He. This result is in contrast with the view that the roton is the soft mode of the fluid-solid transition. We uncover a remarkable quasi-universality of backflow and of other properties when expressed in terms of the amount of short-range order as quantified by the height of the first peak of the static structure factor.
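
    For orientation, one commonly used convention for the screened Coulomb (Yukawa) pair potential underlying such systems is written below; the prefactor and screening-length conventions adopted in the paper may differ.

```latex
% One common convention for the screened Coulomb (Yukawa) pair potential;
% \epsilon sets the interaction strength and \lambda the screening length.
% The exact prefactor/units used in the study may differ from this form.
V(r) = \epsilon \, \frac{e^{-r/\lambda}}{r/\lambda}
```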

  17. Drug search for leishmaniasis: a virtual screening approach by grid computing.

    PubMed

    Ochoa, Rodrigo; Watowich, Stanley J; Flórez, Andrés; Mesa, Carol V; Robledo, Sara M; Muskus, Carlos

    2016-07-01

    The trypanosomatid protozoa Leishmania is endemic in ~100 countries, with infections causing ~2 million new cases of leishmaniasis annually. Disease symptoms can include severe skin and mucosal ulcers, fever, anemia, splenomegaly, and death. Unfortunately, therapeutics approved to treat leishmaniasis are associated with potentially severe side effects, including death. Furthermore, drug-resistant Leishmania parasites have developed in most endemic countries. To address an urgent need for new, safe and inexpensive anti-leishmanial drugs, we utilized the IBM World Community Grid to complete computer-based drug discovery screens (Drug Search for Leishmaniasis) using unique leishmanial proteins and a database of 600,000 drug-like small molecules. Protein structures from different Leishmania species were selected for molecular dynamics (MD) simulations, and a series of conformational "snapshots" were chosen from each MD trajectory to simulate the protein's flexibility. A Relaxed Complex Scheme methodology was used to screen ~2000 MD conformations against the small molecule database, producing >1 billion protein-ligand structures. For each protein target, a binding spectrum was calculated to identify compounds predicted to bind with highest average affinity to all protein conformations. Significantly, four different Leishmania protein targets were predicted to strongly bind small molecules, with the strongest binding interactions predicted to occur for dihydroorotate dehydrogenase (LmDHODH; PDB:3MJY). A number of predicted tight-binding LmDHODH inhibitors were tested in vitro and potent selective inhibitors of Leishmania panamensis were identified. These promising small molecules are suitable for further development using iterative structure-based optimization and in vitro/in vivo validation assays. PMID:27438595
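
    The "binding spectrum" step of a Relaxed Complex Scheme, averaging each compound's docking score over an ensemble of MD-derived receptor conformations and ranking by the mean, can be sketched as follows. The score matrix here is random placeholder data rather than output from the actual screen.

```python
# Sketch of ensemble ("relaxed complex") ranking: average each compound's docking
# score over many MD-derived receptor snapshots and rank by the mean.
import numpy as np

rng = np.random.default_rng(7)
n_compounds, n_conformations = 1000, 200     # the study used ~2000 MD snapshots
compounds = [f"cmpd_{i:04d}" for i in range(n_compounds)]

# scores[i, j] = predicted binding energy (kcal/mol) of compound i vs snapshot j
scores = rng.normal(loc=-6.0, scale=1.5, size=(n_compounds, n_conformations))

binding_spectrum = scores.mean(axis=1)       # average affinity across the ensemble
order = np.argsort(binding_spectrum)         # most negative (tightest) first

for i in order[:5]:
    print(f"{compounds[i]}  mean={binding_spectrum[i]:6.2f}  "
          f"best={scores[i].min():6.2f}  worst={scores[i].max():6.2f}")
```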

  18. Usability testing of a respiratory interface using computer screen and facial expressions videos.

    PubMed

    Oliveira, Ana; Pinho, Cátia; Monteiro, Sandra; Marcos, Ana; Marques, Alda

    2013-12-01

    Computer screen videos (CSVs) and users' facial expressions videos (FEVs) are recommended to evaluate systems' performance. However, software combining both methods is often non-accessible in clinical research fields. The Observer-XT software is commonly used for clinical research to assess human behaviours. Thus, this study reports on the combination of CSVs and FEVs to evaluate a graphical user interface (GUI). Eight physiotherapists entered clinical information in the GUI while CSVs and FEVs were collected. The frequency and duration of a list of behaviours found in FEVs were analysed using the Observer-XT-10.5. Simultaneously, the frequency and duration of usability problems of CSVs were manually registered. CSVs and FEVs timelines were also matched to verify combinations. The analysis of FEVs revealed that the category most frequently observed in users' behaviour was eye contact with the screen (ECS, 32±9), whilst verbal communication achieved the highest duration (14.8±6.9 min). Regarding the CSVs, 64 problems, related to the interface (73%) and the user (27%), were found. In total, 135 usability problems were identified by combining both methods. The majority were reported through verbal communication (45.8%) and ECS (40.8%). "False alarms" and "misses" did not cause quantifiable reactions, and the facial expression problems were mainly related to the lack of familiarity (55.4%) felt by users when interacting with the interface. These findings encourage the use of Observer-XT-10.5 to conduct small usability sessions, as it identifies emergent groups of problems by combining methods. However, final versions of systems should be further validated using specialized software. PMID:24290937

  19. 2D vs. 3D mammography observer study

    NASA Astrophysics Data System (ADS)

    Fernandez, James Reza F.; Hovanessian-Larsen, Linda; Liu, Brent

    2011-03-01

    Breast cancer is the most common type of non-skin cancer in women. 2D mammography is a screening tool to aid in the early detection of breast cancer, but it has the diagnostic limitation of overlapping tissues, especially in dense breasts. 3D mammography has the potential to improve detection outcomes by increasing specificity, and a new 3D screening tool with a 3D display for mammography aims to improve performance and efficiency as compared to 2D mammography. An observer study using a mammography phantom was performed to compare traditional 2D mammography with this new 3D mammography technique. In comparing 3D and 2D mammography, there was no difference in calcification detection, and mass detection was better in 2D as compared to 3D. However, there was a significant decrease in reading time for masses, calcifications, and normals in 3D compared to 2D, as well as more favorable confidence levels in reading normal cases. Given the limitations of the mammography phantom used, a clearer comparison of 3D and 2D mammography may be obtained by incorporating human studies in the future.

  20. Effect of Screen Reading and Reading from Printed Out Material on Student Success and Permanency in Introduction to Computer Lesson

    ERIC Educational Resources Information Center

    Tuncer, Murat; Bahadir, Ferdi

    2014-01-01

    In this study, the effect of screen reading and reading from printed-out material on student success and permanency in the Introduction to Computer Lesson is investigated. The study group of the research consists of 78 freshman students registered in the Erzincan University Refahiye Vocational School Post Service department. Study groups of research consist…

  1. Effects of Text Display Variables on Reading Tasks: Computer Screen vs. Hard Copy. CDC Technical Report No. 3.

    ERIC Educational Resources Information Center

    Haas, Christina; Hayes, John R.

    Two studies were conducted to compare subjects' performance reading texts displayed on a computer terminal screen and on paper. In the first study, 10 graduate students read a 1,000-word article on knee injuries from "Science 83" magazine and were tested for recall of information on eight items. While subjects in the control condition (reading…

  2. The Study of Learners' Preference for Visual Complexity on Small Screens of Mobile Computers Using Neural Networks

    ERIC Educational Resources Information Center

    Wang, Lan-Ting; Lee, Kun-Chou

    2014-01-01

    The vision plays an important role in educational technologies because it can produce and communicate quite important functions in teaching and learning. In this paper, learners' preference for the visual complexity on small screens of mobile computers is studied by neural networks. The visual complexity in this study is divided into five…

  3. Validation of Unsupervised Computer-Based Screening for Reading Disability in Greek Elementary Grades 3 and 4

    ERIC Educational Resources Information Center

    Protopapas, Athanassios; Skaloumbakas, Christos; Bali, Persefoni

    2008-01-01

    After reviewing past efforts related to computer-based reading disability (RD) assessment, we present a fully automated screening battery that evaluates critical skills relevant for RD diagnosis designed for unsupervised application in the Greek educational system. Psychometric validation in 301 children, 8-10 years old (grades 3 and 4; including…

  4. Static & Dynamic Response of 2D Solids

    1996-07-15

    NIKE2D is an implicit finite-element code for analyzing the finite deformation, static and dynamic response of two-dimensional, axisymmetric, plane strain, and plane stress solids. The code is fully vectorized and available on several computing platforms. A number of material models are incorporated to simulate a wide range of material behavior including elasto-plasticity, anisotropy, creep, thermal effects, and rate dependence. Slideline algorithms model gaps and sliding along material interfaces, including interface friction, penetration, and single-surface contact. Interactive graphics and rezoning are included for analyses with large mesh distortions. In addition to quasi-Newton and arc-length procedures, adaptive algorithms can be defined to solve the implicit equations using the solution language ISLAND. Each of these capabilities and more make NIKE2D a robust analysis tool.

  5. Toxicology screen

    MedlinePlus

    Barbiturates - screen; Benzodiazepines - screen; Amphetamines - screen; Analgesics - screen; Antidepressants - screen; Narcotics - screen; Phenothiazines - screen; Drug abuse screen; Blood alcohol test

  6. Predicting Structures of Ru-Centered Dyes: A Computational Screening Tool.

    PubMed

    Fredin, Lisa A; Allison, Thomas C

    2016-04-01

    Dye-sensitized solar cells (DSCs) represent a means for harvesting solar energy to produce electrical power. Though a number of light harvesting dyes are in use, the search continues for more efficient and effective compounds to make commercially viable DSCs a reality. Computational methods have been increasingly applied to understand the dyes currently in use and to aid in the search for improved light harvesting compounds. Semiempirical quantum chemistry methods have a well-deserved reputation for giving good quality results in a very short amount of computer time. The most recent semiempirical models such as PM6 and PM7 are parametrized for a wide variety of molecule types, including organometallic complexes similar to DSC chromophores. In this article, the performance of PM6 is tested against a set of 20 molecules whose geometries were optimized using a density functional theory (DFT) method. It is found that PM6 gives geometries that are in good agreement with the optimized DFT structures. In order to reduce the differences between geometries optimized using PM6 and geometries optimized using DFT, the PM6 basis set parameters have been optimized for a subset of the molecules. It is found that it is sufficient to optimize the basis set for Ru alone to improve the agreement between the PM6 results and the DFT results. When this optimized Ru basis set is used, the mean unsigned error in Ru-ligand bond lengths is reduced from 0.043 to 0.017 Å in the set of 20 test molecules. Though the magnitude of these differences is small, the effect on the calculated UV/vis spectra is significant. These results clearly demonstrate the value of using PM6 to screen DSC chromophores as well as the value of optimizing PM6 basis set parameters for a specific set of molecules. PMID:26982657

  7. Realistic and efficient 2D crack simulation

    NASA Astrophysics Data System (ADS)

    Yadegar, Jacob; Liu, Xiaoqing; Singh, Abhishek

    2010-04-01

    Although numerical algorithms for 2D crack simulation have been studied in Modeling and Simulation (M&S) and computer graphics for decades, realism and computational efficiency are still major challenges. In this paper, we introduce a high-fidelity, scalable, adaptive, and efficient runtime 2D crack/fracture simulation system by applying the mathematically elegant Peano-Cesaro triangular meshing/remeshing technique to model the generation of shards/fragments. The recursive fractal sweep associated with the Peano-Cesaro triangulation provides efficient local multi-resolution refinement to any level-of-detail. The generated binary decomposition tree also provides an efficient neighbor retrieval mechanism used for mesh element splitting and merging, with minimal memory requirements essential for realistic 2D fragment formation. Upon load impact/contact/penetration, a number of factors including impact angle, impact energy, and material properties are all taken into account to produce the criteria of crack initialization, propagation, and termination, leading to realistic fractal-like rubble/fragment formation. The aforementioned parameters are used as variables of probabilistic models of crack/shard formation, making the proposed solution highly adaptive by allowing machine learning mechanisms to learn the optimal values for the variables/parameters based on prior benchmark data generated by off-line physics-based simulation solutions that produce accurate fractures/shards, though at a highly non-real-time pace. Crack/fracture simulation has been conducted on various load impacts with different initial locations at various impulse scales. The simulation results demonstrate that the proposed system has the capability to realistically and efficiently simulate 2D crack phenomena (such as window shattering and shards generation) with diverse potential in military and civil M&S applications such as training and mission planning.

  8. Stacking up 2D materials

    NASA Astrophysics Data System (ADS)

    Mayor, Louise

    2016-05-01

    Graphene might be the most famous example, but there are other 2D materials and compounds too. Louise Mayor explains how these atomically thin sheets can be layered together to create flexible “van der Waals heterostructures”, which could lead to a range of novel applications.

  9. Compound image compression for real-time computer screen image transmission.

    PubMed

    Lin, Tony; Hao, Pengwei

    2005-08-01

    We present a compound image compression algorithm for real-time applications of computer screen image transmission. It is called shape primitive extraction and coding (SPEC). Real-time image transmission requires that the compression algorithm should not only achieve high compression ratio, but also have low complexity and provide excellent visual quality. SPEC first segments a compound image into text/graphics pixels and pictorial pixels, and then compresses the text/graphics pixels with a new lossless coding algorithm and the pictorial pixels with the standard lossy JPEG, respectively. The segmentation first classifies image blocks into picture and text/graphics blocks by thresholding the number of colors of each block, then extracts shape primitives of text/graphics from picture blocks. Dynamic color palette that tracks recent text/graphics colors is used to separate small shape primitives of text/graphics from pictorial pixels. Shape primitives are also extracted from text/graphics blocks. All shape primitives from both block types are losslessly compressed by using a combined shape-based and palette-based coding algorithm. Then, the losslessly coded bitstream is fed into a LZW coder. Experimental results show that the SPEC has very low complexity and provides visually lossless quality while keeping competitive compression ratios. PMID:16121449
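
    A much-simplified version of SPEC's first classification step, labelling each block as text/graphics or picture by thresholding its number of distinct colors, might look like the sketch below. The block size and color-count threshold are illustrative guesses; the real pipeline additionally extracts shape primitives, tracks a dynamic palette, and applies lossless and JPEG coding.

```python
# Simplified version of SPEC's first stage: classify 16x16 blocks of a screen
# image as "text/graphics" or "picture" by counting distinct colors per block.
# Block size and color-count threshold are illustrative choices.
import numpy as np

def classify_blocks(img, block=16, max_colors=32):
    """img: (H, W, 3) uint8 array. Returns a 2D array of block labels."""
    h, w = img.shape[:2]
    labels = np.empty((h // block, w // block), dtype=object)
    for by in range(h // block):
        for bx in range(w // block):
            tile = img[by*block:(by+1)*block, bx*block:(bx+1)*block]
            # Pack RGB into one integer per pixel, then count unique values
            packed = (tile[..., 0].astype(np.uint32) << 16) | \
                     (tile[..., 1].astype(np.uint32) << 8) | tile[..., 2]
            n_colors = np.unique(packed).size
            labels[by, bx] = "text/graphics" if n_colors <= max_colors else "picture"
    return labels

# Example on a synthetic screen image: flat-colored (text-like) left half,
# noisy (photo-like) right half.
rng = np.random.default_rng(0)
img = np.zeros((64, 128, 3), dtype=np.uint8)
img[:, :64] = (255, 255, 255)
img[:, 64:] = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
print(classify_blocks(img))
```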

  10. High-performance computational analysis and peptide screening from databases of cyclotides from poaceae.

    PubMed

    Porto, William F; Miranda, Vivian J; Pinto, Michelle F S; Dohms, Stephan M; Franco, Octavio L

    2016-01-01

    Cyclotides are a family of head-to-tail cyclized peptides containing three conserved disulfide bonds, in a structural scaffold also known as a cyclic cysteine knot. Due to the high degree of cysteine conservation, novel members from this peptide family can be identified in protein databases through a search through regular expression (REGEX). In this work, six novel cyclotide-like precursors from the Poaceae were identified from NCBI's non-redundant protein database by the use of REGEX. Two out of six sequences (named Zea mays L and M) showed an Asp residue in the C-terminal, which indicated that they could be cyclic. Gene expression in maize tissues was investigated, showing that the previously described cyclotide-like Z. mays J is expressed in the roots. According to molecular dynamics, the structure of Z. mays J seems to be stable, despite the putative absence of cyclization. As regards cyclotide evolution, it was hypothesized that this is an outcome from convergent evolution and/or horizontal gene transfer. The results showed that peptide screening from databases should be performed periodically in order to include novel sequences, which are deposited as the databases grow. Indeed, the advances in computational and experimental methods will together help to answer key questions and reach new horizons in defense-related peptide identification. PMID:26572696
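
    The database-mining step can be illustrated with a small regular-expression scan over FASTA sequences, as sketched below. The cysteine-spacing pattern shown is a generic cyclotide-like motif written for illustration; it is not the exact REGEX used in the study, and the input file name is hypothetical.

```python
# Sketch of regular-expression mining for cyclotide-like cysteine spacing in
# protein sequences. The motif is a generic illustration, not the study's REGEX.
import re

CYCLOTIDE_LIKE = re.compile(r"C.{3}C.{3,6}C.{3,7}C.{1,3}C.{4,6}C")

def read_fasta(path):
    """Yield (header, sequence) pairs from a FASTA file."""
    header, seq = None, []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line.startswith(">"):
                if header is not None:
                    yield header, "".join(seq)
                header, seq = line[1:], []
            else:
                seq.append(line)
        if header is not None:
            yield header, "".join(seq)

def screen(path):
    """Print every sequence containing the cysteine-spacing motif."""
    for header, seq in read_fasta(path):
        m = CYCLOTIDE_LIKE.search(seq)
        if m:
            print(f"{header}\t{m.start()}-{m.end()}\t{m.group()}")

# screen("nr_poaceae.fasta")   # hypothetical FASTA extracted from NCBI nr
```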

  11. The Effect of All-Capital vs. Regular Mixed Print, as Presented on a Computer Screen, on Reading Rate and Accuracy.

    ERIC Educational Resources Information Center

    Henney, Maribeth

    Two related studies were conducted to determine whether students read all-capital text and mixed text displayed on a computer screen with the same speed and accuracy. Seventy-seven college students read M. A. Tinker's "Basic Reading Rate Test" displayed on a PLATO computer screen. One treatment consisted of paragraphs in all-capital type followed…

  12. Large-Scale Computational Screening Identifies First in Class Multitarget Inhibitor of EGFR Kinase and BRD4

    PubMed Central

    Allen, Bryce K.; Mehta, Saurabh; Ember, Stewart W. J.; Schonbrunn, Ernst; Ayad, Nagi; Schürer, Stephan C.

    2015-01-01

    Inhibition of cancer-promoting kinases is an established therapeutic strategy for the treatment of many cancers, although resistance to kinase inhibitors is common. One way to overcome resistance is to target orthogonal cancer-promoting pathways. Bromo and Extra-Terminal (BET) domain proteins, which belong to the family of epigenetic readers, have recently emerged as promising therapeutic targets in multiple cancers. The development of multitarget drugs that inhibit kinase and BET proteins therefore may be a promising strategy to overcome tumor resistance and prolong therapeutic efficacy in the clinic. We developed a general computational screening approach to identify novel dual kinase/bromodomain inhibitors from millions of commercially available small molecules. Our method integrated machine learning using big datasets of kinase inhibitors and structure-based drug design. Here we describe the computational methodology, including validation and characterization of our models and their application and integration into a scalable virtual screening pipeline. We screened over 6 million commercially available compounds and selected 24 for testing in BRD4 and EGFR biochemical assays. We identified several novel BRD4 inhibitors, among them a first in class dual EGFR-BRD4 inhibitor. Our studies suggest that this computational screening approach may be broadly applicable for identifying dual kinase/BET inhibitors with potential for treating various cancers. PMID:26596901

  13. Smoking cessation interventions within the context of Low-Dose Computed Tomography lung cancer screening: A systematic review.

    PubMed

    Piñeiro, Bárbara; Simmons, Vani N; Palmer, Amanda M; Correa, John B; Brandon, Thomas H

    2016-08-01

    The integration of smoking cessation interventions (SCIs) within the context of lung cancer screening programs is strongly recommended by screening guidelines, and is a requirement for Medicare coverage of screening in the US. In Europe, there are no lung cancer screening guidelines, however, research trials are ongoing, and prominent professional societies have begun to recommend lung cancer screening. Little is known about the types and efficacy of SCIs among patients receiving low-dose computed tomography (LDCT) screening. This review addresses this gap. Based on a systematic search, we identified six empirical studies published prior to July 1, 2015, that met inclusion criteria for our review: English language, SCI for LDCT patients, and reported smoking-related outcomes. Three randomized studies and three single-arm studies were identified. Two randomized controlled trials (RCTs) evaluated self-help SCIs, whereas one pilot RCT evaluated the timing (before or after the LDCT scan) of a combined (counseling and pharmacotherapy) SCI. Among the single-arm trials, two observational studies evaluated the efficacy of combined SCI, and one retrospectively assessed the efficacy of clinician-delivered smoking assessment, advice, and assistance. Given the limited research to date, and particularly the lack of studies reporting results from RCTs, assumptions that SCIs would be effective among this population should be made with caution. Findings from this review suggest that participation in a lung screening trial promotes smoking cessation and may represent a teachable moment to quit smoking. Findings also suggest that providers can take advantage of this potentially teachable moment, and that SCIs have been successfully implemented in screening settings. Continued systematic and methodologically sound research in this area will help improve the knowledge base and implementation of interventions for this population of smokers at risk for chronic disease. PMID:27393513

  14. Computer aided screening and evaluation of herbal therapeutics against MRSA infections.

    PubMed

    Skariyachan, Sinosh; Krishnan, Rao Shruti; Siddapa, Snehapriya Bangalore; Salian, Chithra; Bora, Prerana; Sebastian, Denoj

    2011-01-01

    Methicillin-resistant Staphylococcus aureus (MRSA), a pathogenic bacterium that causes life-threatening outbreaks such as community-onset and nosocomial infections, has emerged as a 'superbug'. The organism has developed resistance to all classes of antibiotics, including the best known, vancomycin (VRSA). Hence, there is a need to develop new therapeutic agents. This study mainly evaluates the potential use of botanicals against MRSA infections. Computer-aided design is an initial platform to screen novel inhibitors, and the data find applications in drug development. The drug-likeness and efficiency of various herbal compounds were screened by ADMET and docking studies. The virulence factors in most of the MRSA-associated infections are Penicillin Binding Protein 2A (PBP2A) and Panton-Valentine Leukocidin (PVL). Hence, native structures of these proteins (PDB: 1VQQ and 1T5R) were used as the drug targets. The docking studies revealed that the active component of Aloe vera, β-sitosterol ((3S, 8S, 9S, 10R, 13R, 14S, 17R)-17-[(2R, 5R)-5-ethyl-6-methylheptan-2-yl]-10, 13-dimethyl-2, 3, 4, 7, 8, 9, 11, 12, 14, 15, 16, 17-dodecahydro-1H-cyclopenta[a]phenanthren-3-ol), showed the best binding energies of -7.40 kcal/mol and -6.34 kcal/mol for PBP2A and PVL toxin, respectively. Similarly, Meliantriol ((1S)-1-[(2R, 3R, 5R)-5-hydroxy-3-[(3S, 5R, 9R, 10R, 13S, 14S, 17S)-3-hydroxy-4, 4, 10, 13, 14-pentamethyl-2, 3, 5, 6, 9, 11, 12, 15, 16, 17-decahydro-1H-cyclopenta[a]phenanthren-17-yl]oxolan-2-yl]-2-methylpropane-1, 2-diol), the active compound in Azadirachta indica (Neem), showed binding energies of -6.02 kcal/mol for PBP2A and -8.94 kcal/mol for PVL toxin. Similar studies were conducted with selected herbal compounds based on pharmacokinetic properties. In vitro testing of the in silico leads indicated that herbal extracts of Aloe vera, Neem, guava (Psidium guajava), pomegranate (Punica granatum), and tea (Camellia sinensis) can be used as therapeutics against MRSA infections. PMID:22125390

  15. MOSS2D V1

    2001-01-31

    This software reduces the data from the two-dimensional kSA MOS program (k-Space Associates, Ann Arbor, MI). Initial MOS data are recorded without headers in 38 columns, with one row of data per acquisition per laser beam tracked. The final MOSS2D data file is reduced, graphed, and saved in a tab-delimited column format with headers that can be plotted in any graphing software.
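
    A minimal sketch of the reduction step described, reading headerless 38-column data and rewriting it as tab-delimited text with a header row, is given below. The column names and file names are placeholders, since the actual kSA MOS column definitions are not listed here, and the graphing step is omitted.

```python
# Minimal sketch of the kind of reduction MOSS2D performs: read raw, headerless
# 38-column whitespace-separated output and rewrite it as tab-delimited text
# with a header row. Column and file names are placeholders.
import numpy as np

N_COLS = 38
headers = [f"col{i:02d}" for i in range(N_COLS)]   # real acquisition names unknown here

def reduce_mos(raw_path, out_path):
    data = np.loadtxt(raw_path)                    # whitespace-separated, no header
    if data.ndim == 1:                             # single acquisition row
        data = data[None, :]
    assert data.shape[1] == N_COLS, "unexpected column count"
    with open(out_path, "w") as fh:
        fh.write("\t".join(headers) + "\n")
        for row in data:
            fh.write("\t".join(f"{v:.6g}" for v in row) + "\n")

# reduce_mos("raw_mos.dat", "reduced_mos.tsv")     # hypothetical file names
```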

  16. The interplay of attention economics and computer-aided detection marks in screening mammography

    NASA Astrophysics Data System (ADS)

    Schwartz, Tayler M.; Sridharan, Radhika; Wei, Wei; Lukyanchenko, Olga; Geiser, William; Whitman, Gary J.; Haygood, Tamara Miner

    2016-03-01

    Introduction: According to attention economists, overabundant information leads to decreased attention for individual pieces of information. Computer-aided detection (CAD) alerts radiologists to findings potentially associated with breast cancer but is notorious for creating an abundance of false-positive marks. We suspected that increased CAD marks do not lengthen mammogram interpretation time, as radiologists will selectively disregard these marks when present in larger numbers. We explore the relevance of attention economics in mammography by examining how the number of CAD marks affects interpretation time. Methods: We performed a retrospective review of bilateral digital screening mammograms obtained between January 1, 2011 and February 28, 2014, using only weekend interpretations to decrease distractions and the likelihood of trainee participation. We stratified data according to reader and used ANOVA to assess the relationship between the number of CAD marks and interpretation time. Results: Ten radiologists, with a median experience after residency of 12.5 years (range, 6 to 24 years), interpreted 1849 mammograms. When accounting for the number of images, Breast Imaging Reporting and Data System category, and breast density, increasing numbers of CAD marks were correlated with longer interpretation time only for the three radiologists with the fewest years of experience (median, 7 years). Conclusion: For the seven most experienced readers, increasing CAD marks did not lengthen interpretation time. We surmise that as CAD marks increase, the attention given to individual marks decreases. Experienced radiologists may rapidly dismiss larger numbers of CAD marks as false-positive, having learned that devoting extra attention to such marks does not improve clinical detection.

  17. A Combination of Screening and Computational Approaches for the Identification of Novel Compounds That Decrease Mast Cell Degranulation

    PubMed Central

    McShane, Marisa P.; Friedrichson, Tim; Giner, Angelika; Meyenhofer, Felix; Barsacchi, Rico; Bickle, Marc

    2015-01-01

    High-content screening of compound libraries poses various challenges in the early steps in drug discovery such as gaining insights into the mode of action of the selected compounds. Here, we addressed these challenges by integrating two biological screens through bioinformatics and computational analysis. We screened a small-molecule library enriched in amphiphilic compounds in a degranulation assay in rat basophilic leukemia 2H3 (RBL-2H3) cells. The same library was rescreened in a high-content image-based endocytosis assay in HeLa cells. This assay was previously applied to a genome-wide RNAi screen that produced quantitative multiparametric phenotypic profiles for genes that directly or indirectly affect endocytosis. By correlating the endocytic profiles of the compounds with the genome-wide siRNA profiles, we identified candidate pathways that may be inhibited by the compounds. Among these, we focused on the Akt pathway and validated its inhibition in HeLa and RBL-2H3 cells. We further showed that the compounds inhibited the translocation of the Akt-PH domain to the plasma membrane. The approach performed here can be used to integrate chemical and functional genomics screens for investigating the mechanism of action of compounds. PMID:25838434

  18. Excitons in van der Waals heterostructures: The important role of dielectric screening

    NASA Astrophysics Data System (ADS)

    Latini, S.; Olsen, T.; Thygesen, K. S.

    2015-12-01

    The existence of strongly bound excitons is one of the hallmarks of the newly discovered atomically thin semiconductors. While it is understood that the large binding energy is mainly due to the weak dielectric screening in two dimensions, a systematic investigation of the role of screening on two-dimensional (2D) excitons is still lacking. Here we provide a critical assessment of a widely used 2D hydrogenic exciton model, which assumes a dielectric function of the form ε(q) = 1 + 2παq, and we develop a quasi-2D model with a much broader applicability. Within the quasi-2D picture, electrons and holes are described as in-plane point charges with a finite extension in the perpendicular direction, and their interaction is screened by a dielectric function with a nonlinear q dependence which is computed ab initio. The screened interaction is used in a generalized Mott-Wannier model to calculate exciton binding energies in both isolated and supported 2D materials. For isolated 2D materials, the quasi-2D treatment yields results almost identical to those of the strict 2D model, and both are in good agreement with ab initio many-body calculations. On the other hand, for more complex structures such as supported layers or layers embedded in a van der Waals heterostructure, the size of the exciton in reciprocal space extends well beyond the linear regime of the dielectric function, and a quasi-2D description has to replace the 2D one. Our methodology has the merit of providing a seamless connection between the strict 2D limit of isolated monolayer materials and the more bulk-like screening characteristics of supported 2D materials or van der Waals heterostructures.
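
    For reference, the strict-2D model quoted above can be written together with the statically screened electron-hole interaction it implies (a Keldysh-type form); in the quasi-2D approach of the paper this linear ε(q) is replaced by a nonlinear dielectric function computed ab initio.

```latex
% Strict-2D screening model quoted in the abstract, and the statically screened
% electron-hole interaction it implies (Keldysh-type form); \alpha is the 2D
% polarizability of the monolayer.
\varepsilon(q) = 1 + 2\pi\alpha q ,
\qquad
W(q) = \frac{2\pi e^{2}}{q\,\varepsilon(q)} = \frac{2\pi e^{2}}{q\,(1 + 2\pi\alpha q)}
```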

  19. Low-Dose Chest Computed Tomography for Lung Cancer Screening Among Hodgkin Lymphoma Survivors: A Cost-Effectiveness Analysis

    SciTech Connect

    Wattson, Daniel A.; Hunink, M.G. Myriam; DiPiro, Pamela J.; Das, Prajnan; Hodgson, David C.; Mauch, Peter M.; Ng, Andrea K.

    2014-10-01

    Purpose: Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Methods and Materials: Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Results: Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. Conclusions: HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening

  20. Nanoimprint lithography: 2D or not 2D? A review

    NASA Astrophysics Data System (ADS)

    Schift, Helmut

    2015-11-01

    Nanoimprint lithography (NIL) is more than a planar high-end technology for the patterning of wafer-like substrates. It is essentially a 3D process, because it replicates various stamp topographies by 3D displacement of material and takes advantage of the bending of stamps while the mold cavities are filled. But at the same time, it keeps all assets of a 2D technique being able to pattern thin masking layers like in photon- and electron-based traditional lithography. This review reports about 20 years of development of replication techniques at Paul Scherrer Institut, with a focus on 3D aspects of molding, which enable NIL to stay 2D, but at the same time enable 3D applications which are "more than Moore." As an example, the manufacturing of a demonstrator for backlighting applications based on thermally activated selective topography equilibration will be presented. This technique allows generating almost arbitrary sloped, convex and concave profiles in the same polymer film with dimensions in micro- and nanometer scale.

  1. Computer-Aided Virtual Screening and Designing of Cell-Penetrating Peptides.

    PubMed

    Gautam, Ankur; Chaudhary, Kumardeep; Kumar, Rahul; Raghava, Gajendra Pal Singh

    2015-01-01

    Cell-penetrating peptides (CPPs) have proven their potential as versatile drug delivery vehicles. Last decade has witnessed an unprecedented growth in CPP-based research, demonstrating the potential of CPPs as therapeutic candidates. In the past, many in silico algorithms have been developed for the prediction and screening of CPPs, which expedites the CPP-based research. In silico screening/prediction of CPPs followed by experimental validation seems to be a reliable, less time-consuming, and cost-effective approach. This chapter describes the prediction, screening, and designing of novel efficient CPPs using "CellPPD," an in silico tool. PMID:26202262

  2. Using Computational Modeling to Assess the Impact of Clinical Decision Support on Cancer Screening within Community Health Centers

    PubMed Central

    Carney, Timothy Jay; Morgan, Geoffrey P.; Jones, Josette; McDaniel, Anna M.; Weaver, Michael; Weiner, Bryan; Haggstrom, David A.

    2014-01-01

    Our conceptual model demonstrates our goal to investigate the impact of clinical decision support (CDS) utilization on cancer screening improvement strategies in the community health care (CHC) setting. We employed a dual modeling technique using both statistical and computational modeling to evaluate impact. Our statistical model used the Spearman's Rho test to evaluate the strength of the relationship between our proximal outcome measures (CDS utilization) and our distal outcome measure (provider self-reported cancer screening improvement). Our computational model relied on network evolution theory and made use of a tool called Construct-TM to model the use of CDS as measured by the rate of organizational learning. We used previously collected survey data from community health centers in the Cancer Health Disparities Collaborative (HDCC). Our intent is to demonstrate the added value gained by using a computational modeling tool in conjunction with a statistical analysis when evaluating the impact of a health information technology, in the form of CDS, on health care quality process outcomes such as facility-level screening improvement. Significant simulated disparities in organizational learning over time were observed between community health centers beginning the simulation with high and low clinical decision support capability. PMID:24953241
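
    The statistical half of this dual-modeling approach reduces to a rank correlation between a CDS-utilization measure and a screening-improvement measure. A minimal sketch with random placeholder data is shown below; the Construct-TM simulation side is not reproduced here.

```python
# Minimal sketch of the statistical model described: Spearman's rank correlation
# between a clinical-decision-support utilization score and self-reported
# screening improvement. Values are random placeholders, not study data.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n_centers = 60
cds_utilization = rng.uniform(0, 1, n_centers)              # proximal outcome measure
screening_improvement = 0.5 * cds_utilization + rng.normal(0, 0.2, n_centers)

rho, p_value = spearmanr(cds_utilization, screening_improvement)
print(f"Spearman's rho = {rho:.2f}, p = {p_value:.3g}")
```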

  3. Predicting antitrichomonal activity: a computational screening using atom-based bilinear indices and experimental proofs.

    PubMed

    Marrero-Ponce, Yovani; Meneses-Marcel, Alfredo; Castillo-Garit, Juan A; Machado-Tugores, Yanetsy; Escario, José Antonio; Barrio, Alicia Gómez; Pereira, David Montero; Nogal-Ruiz, Juan José; Arán, Vicente J; Martínez-Fernández, Antonio R; Torrens, Francisco; Rotondo, Richard; Ibarra-Velarde, Froylán; Alvarado, Ysaias J

    2006-10-01

    The classification functions were then applied to find new lead antitrichomonal agents, and six compounds were selected as possible active compounds by computational screening. The designed compounds were synthesized and tested for in vitro activity against T. vaginalis. Of the six compounds that were designed and synthesized, three molecules (chemicals VA5-5a, VA5-5c, and VA5-12b) showed high to moderate cytocidal activity at a concentration of 10 microg/ml; two other compounds (VA5-8pre and VA5-8) showed high cytocidal and cytostatic activity at concentrations of 100 microg/ml and 10 microg/ml, respectively; and the remaining chemical (compound VA5-5e) was inactive at the assayed concentrations. Nonetheless, these compounds possess structural features not seen in known trichomonacidal compounds and thus can serve as excellent leads for further optimization of antitrichomonal activity. The LDA-based QSAR models presented here can be considered a computer-assisted system that could significantly reduce the number of compounds synthesized and tested and increase the chance of finding new chemical entities with antitrichomonal activity. PMID:16875830

  4. Efficient framework for deformable 2D-3D registration

    NASA Astrophysics Data System (ADS)

    Fluck, Oliver; Aharon, Shmuel; Khamene, Ali

    2008-03-01

    Using 2D-3D registration, it is possible to extract the body transformation between the coordinate systems of X-ray and volumetric CT images. Our initial motivation is improving the accuracy of external beam radiation therapy, an effective method for treating cancer, in which CT data play a central role in radiation treatment planning. A rigid body transformation is used to compute the correct patient setup. The drawback of such approaches is that the rigidity assumption on the imaged object is not valid for most patient cases, mainly due to respiratory motion. In the present work, we address this limitation by proposing a flexible framework for deformable 2D-3D registration consisting of a learning phase incorporating 4D CT data sets, hardware-accelerated free-form DRR generation, 2D motion computation, and 2D-3D back projection.
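
    For orientation only, here is a minimal parallel-beam sketch of how a digitally reconstructed radiograph (DRR) can be produced from a CT volume by line-integrating attenuation along one axis. The actual framework uses hardware-accelerated free-form DRR generation in a perspective geometry; the HU-to-attenuation conversion and the uniform phantom below are simplifying assumptions.

        import numpy as np

        def toy_drr(ct_hu, axis=0, mu_water=0.02):
            """Parallel-beam DRR sketch: Beer-Lambert intensity after summing attenuation along `axis`."""
            mu = np.clip(mu_water * (1.0 + ct_hu / 1000.0), 0.0, None)  # HU -> linear attenuation (simplified)
            line_integrals = mu.sum(axis=axis)                          # ray sums through the volume
            return np.exp(-line_integrals)                              # transmitted-intensity image

        drr = toy_drr(np.zeros((64, 64, 64)))   # uniform water-equivalent phantom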

  5. Computer-aided detection of breast masses: Four-view strategy for screening mammography

    SciTech Connect

    Wei Jun; Chan Heangping; Zhou Chuan; Wu Yita; Sahiner, Berkman; Hadjiiski, Lubomir M.; Roubidoux, Marilyn A.; Helvie, Mark A.

    2011-04-15

    Purpose: To improve the performance of a computer-aided detection (CAD) system for mass detection by using four-view information in screening mammography. Methods: The authors developed a four-view CAD system that emulates radiologists' reading by using the craniocaudal and mediolateral oblique views of the ipsilateral breast to reduce false positives (FPs) and the corresponding views of the contralateral breast to detect asymmetry. The CAD system consists of four major components: (1) Initial detection of breast masses on individual views, (2) information fusion of the ipsilateral views of the breast (referred to as two-view analysis), (3) information fusion of the corresponding views of the contralateral breast (referred to as bilateral analysis), and (4) fusion of the four-view information with a decision tree. The authors collected two data sets for training and testing of the CAD system: A mass set containing 389 patients with 389 biopsy-proven masses and a normal set containing 200 normal subjects. All cases had four-view mammograms. The true locations of the masses on the mammograms were identified by an experienced MQSA radiologist. The authors randomly divided the mass set into two independent sets for cross validation training and testing. The overall test performance was assessed by averaging the free response receiver operating characteristic (FROC) curves of the two test subsets. The FP rates during the FROC analysis were estimated by using the normal set only. The jackknife free-response ROC (JAFROC) method was used to estimate the statistical significance of the difference between the test FROC curves obtained with the single-view and the four-view CAD systems. Results: Using the single-view CAD system, the breast-based test sensitivities were 58% and 77% at the FP rates of 0.5 and 1.0 per image, respectively. With the four-view CAD system, the breast-based test sensitivities were improved to 76% and 87% at the corresponding FP rates, respectively
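
    As a minimal sketch of how one FROC operating point (sensitivity at a given number of false positives per image) can be computed from detection scores, consider the snippet below; the data structures and scores are hypothetical and do not reproduce the authors' candidate generation or fusion steps.

        import numpy as np

        def froc_point(hit_scores, normal_image_scores, threshold):
            """hit_scores: best candidate score overlapping each true mass (0 if missed).
            normal_image_scores: one array of candidate scores per normal image."""
            sensitivity = np.mean(np.asarray(hit_scores) >= threshold)
            fps_per_image = np.mean([np.sum(np.asarray(s) >= threshold) for s in normal_image_scores])
            return fps_per_image, sensitivity

        hits = [0.9, 0.4, 0.8, 0.0, 0.7]                                  # five hypothetical masses
        normals = [np.array([0.3, 0.6]), np.array([0.2]), np.array([])]   # three hypothetical normal images
        for t in (0.3, 0.5, 0.7):                                         # sweep thresholds to trace the curve
            print(t, froc_point(hits, normals, t))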

  6. Generating a 2D Representation of a Complex Data Structure

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    A computer program, designed to assist in the development and debugging of other software, generates a two-dimensional (2D) representation of a possibly complex n-dimensional (where n is an integer >2) data structure or abstract rank-n object in that other software. The nature of the 2D representation is such that it can be displayed on a non-graphical output device and distributed by non-graphical means.
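
    The report does not disclose the program's output format; as a hedged sketch of the general idea, the snippet below flattens an n-dimensional array into a 2D, text-friendly table of index paths and values (the function and column names are invented for illustration).

        import numpy as np
        from itertools import product

        def to_2d_table(arr):
            """Flatten an n-dimensional array into (index tuple, value) rows for plain-text display."""
            arr = np.asarray(arr)
            rows = [("index", "value")]
            for idx in product(*(range(s) for s in arr.shape)):
                rows.append((str(idx), str(arr[idx])))
            return rows

        for row in to_2d_table(np.arange(8).reshape(2, 2, 2)):   # a rank-3 example
            print("{:<12} {}".format(*row))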

  7. An automated tuberculosis screening strategy combining X-ray-based computer-aided detection and clinical information

    PubMed Central

    Melendez, Jaime; Sánchez, Clara I.; Philipsen, Rick H. H. M.; Maduskar, Pragnya; Dawson, Rodney; Theron, Grant; Dheda, Keertan; van Ginneken, Bram

    2016-01-01

    Lack of human resources and radiological interpretation expertise impair tuberculosis (TB) screening programmes in TB-endemic countries. Computer-aided detection (CAD) constitutes a viable alternative for chest radiograph (CXR) reading. However, no automated techniques that exploit the additional clinical information typically available during screening exist. To address this issue and optimally exploit this information, a machine learning-based combination framework is introduced. We have evaluated this framework on a database containing 392 patient records from suspected TB subjects prospectively recruited in Cape Town, South Africa. Each record comprised a CAD score, automatically computed from a CXR, and 12 clinical features. Comparisons with strategies relying on either CAD scores or clinical information alone were performed. Our results indicate that the combination framework outperforms the individual strategies in terms of the area under the receiver operating characteristic curve (0.84 versus 0.78 and 0.72), specificity at 95% sensitivity (49% versus 24% and 31%) and negative predictive value (98% versus 95% and 96%). Thus, it is believed that combining CAD and clinical information to estimate the risk of active disease is a promising tool for TB screening. PMID:27126741
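
    The paper's exact combination classifier is not specified in this summary; as a hedged sketch, the snippet below trains a logistic-regression combiner on a CAD score plus clinical features and reports the same three metrics (AUC, specificity at 95% sensitivity, negative predictive value). All data and the way the 95% sensitivity threshold is located are illustrative assumptions.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score, roc_curve

        rng = np.random.default_rng(0)
        n = 392
        X = np.column_stack([rng.normal(size=n), rng.normal(size=(n, 12))])  # CAD score + 12 clinical features
        y = rng.integers(0, 2, size=n)                                       # hypothetical TB reference labels

        clf = LogisticRegression(max_iter=1000).fit(X, y)
        scores = clf.predict_proba(X)[:, 1]

        auc = roc_auc_score(y, scores)
        fpr, tpr, thr = roc_curve(y, scores)
        i = np.argmax(tpr >= 0.95)                       # first threshold reaching 95% sensitivity
        spec_at_95 = 1.0 - fpr[i]
        pred_neg = scores < thr[i]
        npv = np.mean(y[pred_neg] == 0) if pred_neg.any() else float("nan")
        print(auc, spec_at_95, npv)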

  8. Screening for pulmonary tuberculosis in a Tanzanian prison and computer-aided interpretation of chest X-rays

    PubMed Central

    Mangu, C.; van den Hombergh, J.; van Deutekom, H.; van Ginneken, B.; Clowes, P.; Mhimbira, F.; Mfinanga, S.; Rachow, A.; Hoelscher, M.

    2015-01-01

    Setting: Tanzania is a high-burden country for tuberculosis (TB), and prisoners are a high-risk group that should be screened actively, as recommended by the World Health Organization. Screening algorithms, starting with chest X-rays (CXRs), can detect asymptomatic cases, but depend on experienced readers, who are scarce in the penitentiary setting. Recent studies with patients seeking health care for TB-related symptoms showed good diagnostic performance of the computer software CAD4TB. Objective: To assess the potential of computer-assisted screening using CAD4TB in a predominantly asymptomatic prison population. Design: Cross-sectional study. Results: CAD4TB and seven health care professionals reading CXRs in local tuberculosis wards evaluated a set of 511 CXRs from the Ukonga prison in Dar es Salaam. Performance was compared using a radiological reference. Two readers performed significantly better than CAD4TB, three were comparable, and two performed significantly worse (area under the curve 0.75 in receiver operating characteristics analysis). On a superset of 1321 CXRs, CAD4TB successfully interpreted >99%, with a predictably short time to detection, while 160 (12.2%) reports were delayed by over 24 h with conventional CXR reading. Conclusion: CAD4TB reliably evaluates CXRs from a mostly asymptomatic prison population, with a diagnostic performance inferior to that of expert readers but comparable to local readers. PMID:26767179

  9. An automated tuberculosis screening strategy combining X-ray-based computer-aided detection and clinical information.

    PubMed

    Melendez, Jaime; Sánchez, Clara I; Philipsen, Rick H H M; Maduskar, Pragnya; Dawson, Rodney; Theron, Grant; Dheda, Keertan; van Ginneken, Bram

    2016-01-01

    Lack of human resources and radiological interpretation expertise impair tuberculosis (TB) screening programmes in TB-endemic countries. Computer-aided detection (CAD) constitutes a viable alternative for chest radiograph (CXR) reading. However, no automated techniques that exploit the additional clinical information typically available during screening exist. To address this issue and optimally exploit this information, a machine learning-based combination framework is introduced. We have evaluated this framework on a database containing 392 patient records from suspected TB subjects prospectively recruited in Cape Town, South Africa. Each record comprised a CAD score, automatically computed from a CXR, and 12 clinical features. Comparisons with strategies relying on either CAD scores or clinical information alone were performed. Our results indicate that the combination framework outperforms the individual strategies in terms of the area under the receiver operating characteristic curve (0.84 versus 0.78 and 0.72), specificity at 95% sensitivity (49% versus 24% and 31%) and negative predictive value (98% versus 95% and 96%). Thus, it is believed that combining CAD and clinical information to estimate the risk of active disease is a promising tool for TB screening. PMID:27126741

  10. PERSONAL COMPUTER MONITORS: A SCREENING EVALUATION OF VOLATILE ORGANIC EMISSIONS FROM EXISTING PRINTED CIRCUIT BOARD LAMINATES AND POTENTIAL POLLUTION PREVENTION ALTERNATIVES

    EPA Science Inventory

    The report gives results of a screening evaluation of volatile organic emissions from printed circuit board laminates and potential pollution prevention alternatives. In the evaluation, printed circuit board laminates, without circuitry, commonly found in personal computer (PC) m...

  11. Circulating microRNA signature as liquid-biopsy to monitor lung cancer in low-dose computed tomography screening

    PubMed Central

    Marchiano, Alfonso; Pelosi, Giuseppe; Galeone, Carlotta; Verri, Carla; Suatoni, Paola; Sverzellati, Nicola

    2015-01-01

    Liquid biopsies can detect biomarkers carrying information on the development and progression of cancer. We demonstrated that a plasma-based 24-microRNA signature classifier (MSC) was capable of increasing the specificity of low dose computed tomography (LDCT) in a lung cancer screening trial. In the present study, we tested the prognostic performance of MSC and its ability to monitor disease status recurrence in LDCT screening-detected lung cancers. Between 2000 and 2010, 3411 heavy smokers enrolled in two screening programmes underwent annual or biennial LDCT. During the first five years of screening, 84 lung cancer patients were classified according to one of the three MSC levels of risk: high, intermediate or low. Kaplan-Meier survival analysis was performed according to MSC and clinico-pathological information. Follow-up MSC analysis was performed on longitudinal plasma samples (n = 100) collected from 31 patients before and after surgical resection. Five-year survival was 88.9% for low risk, 79.5% for intermediate risk and 40.1% for high risk MSC (p = 0.001). The prognostic power of MSC persisted after adjusting for tumor stage (p = 0.02) and when the analysis was restricted to LDCT-detected cases after exclusion of interval cancers (p < 0.001). The MSC risk level decreased after surgery in 76% of the 25 high-intermediate subjects who remained disease free, whereas in relapsing patients an increase of the MSC risk level was observed at the time of detection of a second primary tumor or metastatic progression. These results encourage exploiting the MSC test for monitoring lung cancer in LDCT screening. PMID:26451608
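
    A minimal sketch of the survival analyses described (Kaplan-Meier curves per MSC risk level and a stage-adjusted Cox model), using the lifelines library; the DataFrame columns, values, and the small penalizer are hypothetical placeholders rather than study data.

        import pandas as pd
        from lifelines import KaplanMeierFitter, CoxPHFitter

        # Hypothetical per-patient records: follow-up (months), death event, MSC risk, tumor stage.
        df = pd.DataFrame({
            "time":          [60, 48, 24, 60, 12, 36, 60, 30],
            "event":         [0,  1,  1,  0,  1,  0,  0,  1],
            "msc_high":      [0,  0,  1,  0,  1,  1,  0,  1],   # 1 = high-risk MSC
            "stage_ii_plus": [1,  1,  1,  0,  0,  1,  0,  1],
        })

        kmf = KaplanMeierFitter()
        for level, grp in df.groupby("msc_high"):
            kmf.fit(grp["time"], event_observed=grp["event"], label=f"MSC high={level}")
            print(kmf.survival_function_.tail(1))            # end-of-follow-up survival per group

        cph = CoxPHFitter(penalizer=0.1)                     # light regularization for the tiny toy sample
        cph.fit(df, duration_col="time", event_col="event")  # MSC effect adjusted for stage
        print(cph.summary[["coef", "p"]])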

  12. Circulating microRNA signature as liquid-biopsy to monitor lung cancer in low-dose computed tomography screening.

    PubMed

    Sestini, Stefano; Boeri, Mattia; Marchiano, Alfonso; Pelosi, Giuseppe; Galeone, Carlotta; Verri, Carla; Suatoni, Paola; Sverzellati, Nicola; La Vecchia, Carlo; Sozzi, Gabriella; Pastorino, Ugo

    2015-10-20

    Liquid biopsies can detect biomarkers carrying information on the development and progression of cancer. We demonstrated that a plasma-based 24-microRNA signature classifier (MSC) was capable of increasing the specificity of low dose computed tomography (LDCT) in a lung cancer screening trial. In the present study, we tested the prognostic performance of MSC and its ability to monitor disease status recurrence in LDCT screening-detected lung cancers. Between 2000 and 2010, 3411 heavy smokers enrolled in two screening programmes underwent annual or biennial LDCT. During the first five years of screening, 84 lung cancer patients were classified according to one of the three MSC levels of risk: high, intermediate or low. Kaplan-Meier survival analysis was performed according to MSC and clinico-pathological information. Follow-up MSC analysis was performed on longitudinal plasma samples (n = 100) collected from 31 patients before and after surgical resection. Five-year survival was 88.9% for low risk, 79.5% for intermediate risk and 40.1% for high risk MSC (p = 0.001). The prognostic power of MSC persisted after adjusting for tumor stage (p = 0.02) and when the analysis was restricted to LDCT-detected cases after exclusion of interval cancers (p < 0.001). The MSC risk level decreased after surgery in 76% of the 25 high-intermediate subjects who remained disease free, whereas in relapsing patients an increase of the MSC risk level was observed at the time of detection of a second primary tumor or metastatic progression. These results encourage exploiting the MSC test for monitoring lung cancer in LDCT screening. PMID:26451608

  13. Assessment of an Interactive Computer-Based Patient Prenatal Genetic Screening and Testing Education Tool

    ERIC Educational Resources Information Center

    Griffith, Jennifer M.; Sorenson, James R.; Bowling, J. Michael; Jennings-Grant, Tracey

    2005-01-01

    The Enhancing Patient Prenatal Education study tested the feasibility and educational impact of an interactive program for patient prenatal genetic screening and testing education. Patients at two private practices and one public health clinic participated (N = 207). The program collected knowledge and measures of anxiety before and after use of…

  14. Beneficial effects through aggressive coronary screening for type 2 diabetes patients with advanced vascular complications.

    PubMed

    Tsujimoto, Tetsuro; Sugiyama, Takehiro; Yamamoto-Honda, Ritsuko; Kishimoto, Miyako; Noto, Hiroshi; Morooka, Miyako; Kubota, Kazuo; Kamimura, Munehiro; Hara, Hisao; Kajio, Hiroshi; Kakei, Masafumi; Noda, Mitsuhiko

    2016-08-01

    Glycemic control alone does not reduce cardiovascular events in patients with type 2 diabetes (T2D), and routine screening of all T2D patients for asymptomatic coronary artery disease (CAD) is not effective for preventing acute cardiac events. We examined the effectiveness of an aggressive screening protocol for asymptomatic CAD in T2D patients with advanced vascular complications. We designed a 3-year cohort study investigating the effectiveness of aggressive coronary screening for T2D patients with advanced vascular complications and no known coronary events using propensity score adjusted analysis at a national center in Japan. Eligibility criteria included T2D without known coronary events and with any 1 of the following 4 complications: advanced diabetic retinopathy, advanced chronic kidney disease, peripheral artery disease, or cerebrovascular disease. In the aggressive screening group (n = 122), all patients received stress single photon emission computed tomography and those exhibiting myocardial perfusion abnormalities underwent coronary angiography. In the conventional screening group (n = 108), patients were examined for CAD at the discretion of their medical providers. The primary endpoint was a composite outcome of cardiovascular death and nonfatal cardiovascular events. Asymptomatic CAD with ≥70% stenosis was detected in 39.3% of patients completing aggressive screening. The proportions achieving revascularization and receiving intensive medical therapy within 90 days after the screening were significantly higher in the aggressive screening group than in the conventional screening group [19.7% vs 0% (P < 0.001) and 48.4% vs 9.3% (P < 0.001), respectively]. The cumulative rate of the primary composite outcome was significantly lower in the aggressive screening group according to a propensity score adjusted Cox proportional hazards model (hazard ratio, 0.35; 95% confidence interval, 0.12-0.96; P = 0.04). Aggressive coronary screening for T2D patients

  15. Designing Second Generation Anti-Alzheimer Compounds as Inhibitors of Human Acetylcholinesterase: Computational Screening of Synthetic Molecules and Dietary Phytochemicals

    PubMed Central

    Amat-ur-Rasool, Hafsa; Ahmed, Mehboob

    2015-01-01

    Alzheimer's disease (AD), a major cause of memory loss, is a progressive neurodegenerative disorder. The disease leads to irreversible loss of neurons, resulting in a reduced level of the neurotransmitter acetylcholine (ACh). The reduction in ACh level impairs brain functioning. One aspect of AD therapy is to maintain the ACh level within a safe limit by blocking acetylcholinesterase (AChE), the enzyme naturally responsible for its degradation. This research presents an in silico screening and design of hAChE inhibitors as potential anti-Alzheimer drugs. Molecular docking results for database-retrieved ligands (synthetic chemicals and dietary phytochemicals) and self-drawn ligands were compared with Food and Drug Administration (FDA) approved drugs against AD as controls. Furthermore, computational ADME studies were performed on the hits to assess their safety. Human AChE was found to be the most appropriate target site compared with the commonly used Torpedo AChE. Among the tested dietary phytochemicals, berberastine, berberine, yohimbine, sanguinarine, elemol, and naringenin are worth mentioning as potential anti-Alzheimer drugs. The synthetic leads were mostly dual-binding-site inhibitors with two binding subunits linked by a carbon chain, i.e., second-generation AD drugs. Fifteen new heterodimers were designed that were computationally more efficient inhibitors than previously reported compounds. Using computational methods, compounds present in online chemical databases can be screened to design more efficient and safer drugs against the cognitive symptoms of AD. PMID:26325402

  16. The integration of digital camera derived images with a computer based diabetes register for use in retinal screening.

    PubMed

    Taylor, D J; Jacob, J S; Tooke, J E

    2000-07-01

    The Exeter district provides a retinal screening service based on a mobile non-mydriatic camera operated by a dedicated retinal screener visiting general practices on a 2-yearly cycle. Digital attachments to eye cameras can now provide a cost-effective alternative to the use of film in population-based eye screening programmes. Whilst the manufacturers of digital cameras provide a database for the storage of pictures, the images do not as yet interface readily with the rest of the patient's computer-held data or allow for a sophisticated grading, reporting and administration system. The system described is a development of the Exeter diabetes register (EXSYST) which can import digitally derived pictures from either the Ris-Lite(TM) or Imagenet(TM) camera systems or from scanned Polaroids. Pictures can be reported by the screener, checked by a consultant ophthalmologist via the hospital network, and a report, consisting of colour pictures, a map of relevant pathology and referral recommendations, produced. This concise report can be hard-copied inexpensively on a high-resolution ink-jet printer and returned to the patient's general practitioner. Eye images remain available within the hospital diabetes centre computer network to facilitate shared care. This integrated system would form an ideal platform for the addition of computer-based pathology recognition and fully paperless transmission when suitable links to GP surgeries become available. PMID:10837903

  17. Microwave Imaging with Infrared 2-D Lock-in Amplifier

    NASA Astrophysics Data System (ADS)

    Chiyo, Noritaka; Arai, Mizuki; Tanaka, Yasuhiro; Nishikata, Atsuhiro; Maeno, Takashi

    We have developed a 3-D electromagnetic field measurement system using a 2-D lock-in amplifier. This system uses an amplitude-modulated electromagnetic wave source to heat a resistive screen. A very small change in temperature on a screen illuminated with the modulated electromagnetic wave is measured using an infrared thermographic camera. In this paper, we attempted to apply our system to microwave imaging. By placing conductor patches in front of the resistive screen and illuminating them with microwaves, the shape of each conductor was clearly observed in the temperature-difference image of the screen. In this way, the conductor pattern inside a non-contact IC card could be visualized. Moreover, we could observe temperature-difference images reflecting the shape of a konnyaku (a gelatinous food made from devil's-tongue starch) or a dried fishbone, both non-conducting materials resembling the human body. These results proved that our method is applicable to microwave see-through imaging.

  18. Excitonic effects in 2D semiconductors: Path Integral Monte Carlo approach

    NASA Astrophysics Data System (ADS)

    Velizhanin, Kirill; Saxena, Avadh

    One of the most striking features of novel 2D semiconductors (e.g., transition metal dichalcogenide monolayers or phosphorene) is a strong Coulomb interaction between charge carriers resulting in large excitonic effects. In particular, this leads to the formation of multi-carrier bound states (e.g., excitons, trions and biexcitons), which could remain stable at near-room temperatures and contribute significantly to optical properties of such materials. In my talk, I will report on our recent progress in using the Path Integral Monte Carlo methodology to numerically study properties of multi-carrier bound states in 2D semiconductors. Incorporating the effect of the dielectric confinement (via Keldysh potential), we have investigated and tabulated the dependence of single exciton, trion and biexciton binding energies on the strength of dielectric screening, including the limiting cases of very strong and very weak screening. The implications of the obtained results and the possible limitations of the used model will be discussed. The results of this work are potentially useful in the analysis of experimental data and benchmarking of theoretical and computational models.

  19. Parallel map analysis on 2-D grids

    SciTech Connect

    Berry, M.; Comiskey, J.; Minser, K.

    1993-12-31

    In landscape ecology, computer modeling is used to assess habitat fragmentation and its ecological implications. Specifically, maps (2-D grids) of habitat clusters must be analyzed to determine number, sizes and geometry of clusters. Models prior to this study relied upon sequential Fortran-77 programs which limited the sizes of maps and densities of clusters which could be analyzed. In this paper, we present more efficient computer models which can exploit recursion or parallelism. Significant improvements over the original Fortran-77 programs have been achieved using both recursive and nonrecursive C implementations on a variety of workstations such as the Sun Sparc 2, IBM RS/6000-350, and HP 9000-750. Parallel implementations on a 4096-processor MasPar MP-1 and a 32-processor CM-5 are also studied. Preliminary experiments suggest that speed improvements for the parallel model on the MasPar MP-1 (written in MPL) and on the CM-5 (written in C using CMMD) can be as much as 39 and 34 times faster, respectively, than the most efficient sequential C program on a Sun Sparc 2 for a 512 map. An important goal in this research effort is to produce a scalable map analysis algorithm for the identification and characterization of clusters for relatively large maps on massively-parallel computers.
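
    At its core, the cluster analysis described here is connected-component labeling on a 2D grid. The iterative flood-fill sketch below (4-neighbour connectivity, plain Python) illustrates the nonrecursive approach; it is not the authors' Fortran, C, or MPL code.

        from collections import deque

        def label_clusters(grid):
            """Label 4-connected habitat clusters in a 2D grid of 0/1 values; return (labels, sizes)."""
            rows, cols = len(grid), len(grid[0])
            labels = [[0] * cols for _ in range(rows)]
            sizes, current = {}, 0
            for r in range(rows):
                for c in range(cols):
                    if grid[r][c] and not labels[r][c]:
                        current += 1
                        labels[r][c], sizes[current] = current, 0
                        queue = deque([(r, c)])
                        while queue:                      # iterative flood fill avoids deep recursion
                            i, j = queue.popleft()
                            sizes[current] += 1
                            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                                ni, nj = i + di, j + dj
                                if 0 <= ni < rows and 0 <= nj < cols and grid[ni][nj] and not labels[ni][nj]:
                                    labels[ni][nj] = current
                                    queue.append((ni, nj))
            return labels, sizes

        labels, sizes = label_clusters([[1, 1, 0], [0, 1, 0], [0, 0, 1]])
        print(sizes)   # {1: 3, 2: 1}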

  20. Inhibitory effects of phytochemicals on metabolic capabilities of CYP2D6*1 and CYP2D6*10 using cell-based models in vitro

    PubMed Central

    Qu, Qiang; Qu, Jian; Han, Lu; Zhan, Min; Wu, Lan-xiang; Zhang, Yi-wen; Zhang, Wei; Zhou, Hong-hao

    2014-01-01

    Aim: Herbal products have been widely used, and the safety of herb-drug interactions has raised serious concerns. This study aimed to investigate the effects of phytochemicals on the catalytic activities of human CYP2D6*1 and CYP2D6*10 in vitro. Methods: HepG2 cells were stably transfected with CYP2D6*1 and CYP2D6*10 expression vectors. The metabolic kinetics of the enzymes were studied using HPLC and fluorimetry. Results: HepG2-CYP2D6*1 and HepG2-CYP2D6*10 cell lines were successfully constructed. Among the 63 phytochemicals screened, 6 compounds, including coptisine sulfate, bilobalide, schizandrin B, luteolin, schizandrin A and puerarin, at 100 μmol/L inhibited CYP2D6*1- and CYP2D6*10-mediated O-demethylation of a coumarin compound AMMC by more than 50%. Furthermore, the inhibition by these compounds was dose-dependent. Eadie-Hofstee plots demonstrated that these compounds competitively inhibited CYP2D6*1 and CYP2D6*10. However, their Ki values for CYP2D6*1 and CYP2D6*10 were very close, suggesting that genotype-dependent herb-drug inhibition was similar between the two variants. Conclusion: Six phytochemicals inhibit CYP2D6*1- and CYP2D6*10-mediated catalytic activities in a dose-dependent manner in vitro. Thus herbal products containing these phytochemicals may inhibit the in vivo metabolism of co-administered drugs whose primary route of elimination is CYP2D6. PMID:24786236

  1. Recent trends and future prospects in computational GPCR drug discovery: from virtual screening to polypharmacology.

    PubMed

    Carrieri, Antonio; Pérez-Nueno, Violeta I; Lentini, Giovanni; Ritchie, David W

    2013-01-01

    Extending virtual screening approaches to deal with multi-target drug design and polypharmacology is an increasingly important aspect of drug design. In light of this, the concept of accessible chemical space and its exploration should be reviewed. The great advantage of re-using drugs with safe pharmacological profiles and favourable pharmacokinetic properties highlights drug repositioning as a valid alternative to rational drug design, massive drug development efforts, and high-throughput screening, especially when supported by in silico techniques. Here, we discuss some of the advantages of multi-target approaches, and we review some significant examples of their application in the last decade to that well-known class of pharmaceutical targets, the G-protein coupled receptors. PMID:23651484

  2. Stuck on Screens: Patterns of Computer and Gaming Station Use in Youth Seen in a Psychiatric Clinic

    PubMed Central

    Baer, Susan; Bogusz, Elliot; Green, David A.

    2011-01-01

    Objective: Computer and gaming-station use has become entrenched in the culture of our youth. Parents of children with psychiatric disorders report concerns about overuse, but research in this area is limited. The goal of this study is to evaluate computer/gaming-station use in adolescents in a psychiatric clinic population and to examine the relationship between use and functional impairment. Method: 102 adolescents, ages 11–17, from out-patient psychiatric clinics participated. Amount of computer/gaming-station use, type of use (gaming or non-gaming), and presence of addictive features were ascertained along with emotional/functional impairment. Multivariate linear regression was used to examine correlations between patterns of use and impairment. Results: Mean screen time was 6.7±4.2 hrs/day. Presence of addictive features was positively correlated with emotional/functional impairment. Time spent on computer/gaming-station use was not correlated overall with impairment after controlling for addictive features, but non-gaming time was positively correlated with risky behavior in boys. Conclusions: Youth with psychiatric disorders are spending much of their leisure time on the computer/gaming-station and a substantial subset show addictive features of use which is associated with impairment. Further research to develop measures and to evaluate risk is needed to identify the impact of this problem. PMID:21541096

  3. Evaluation of the measurement uncertainty in screening immunoassays in blood establishments: computation of diagnostic accuracy models.

    PubMed

    Pereira, Paulo; Westgard, James O; Encarnação, Pedro; Seghatchian, Jerard

    2015-02-01

    The European Union regulation for blood establishments does not require the evaluation of measurement uncertainty in virology screening tests, which is required by the ISO 15189 guideline following GUM principles. GUM modular approaches have been discussed by medical laboratory researchers, but no consensus has been achieved regarding practical application. Meanwhile, the application of empirical approaches fulfilling GUM principles has gained support. Blood establishments' screening tests accredited under ISO 15189 need an appropriate model to be selected, even though GUM models are intended solely for quantitative examination procedures. Alternative (to GUM) models focused on probability have been proposed for medical laboratories' diagnostic tests. This article reviews, discusses and proposes models for diagnostic accuracy in blood establishments' screening tests. The output of these models is an alternative to the VIM's measurement uncertainty concept. Example applications are provided for an anti-HCV test, where calculations were performed using a commercial spreadsheet. The results show that these models satisfy ISO 15189 principles and that estimates of clinical sensitivity, clinical specificity, binary results agreement and area under the ROC curve are alternatives to the measurement uncertainty concept. PMID:25617905
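
    As an illustrative sketch of such a diagnostic-accuracy model, the snippet below derives clinical sensitivity, clinical specificity, binary results agreement, and an area under the ROC curve from screening results compared against a confirmatory reference; the counts and signal-to-cutoff values are made up, and a spreadsheet would perform the same arithmetic.

        import numpy as np
        from sklearn.metrics import roc_auc_score

        # Hypothetical anti-HCV screening outcome counts versus a confirmatory reference.
        tp, fp, fn, tn = 48, 6, 2, 944

        sensitivity = tp / (tp + fn)                    # clinical sensitivity
        specificity = tn / (tn + fp)                    # clinical specificity
        agreement = (tp + tn) / (tp + fp + fn + tn)     # binary results agreement

        # AUC from hypothetical continuous signal-to-cutoff values and reference labels.
        s_co = np.array([5.2, 3.1, 0.4, 0.2, 1.8, 0.3])
        truth = np.array([1, 1, 0, 0, 1, 0])
        auc = roc_auc_score(truth, s_co)
        print(sensitivity, specificity, agreement, auc)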

  4. Screen time and children

    MedlinePlus

    "Screen time" is a term used for activities done in front of a screen, such as watching TV, working on a computer, or playing video games. Screen time is sedentary activity, meaning you are being physically ...

  5. Screening for Substance Use Disorder among Incarcerated Men with the Alcohol, Smoking, Substance Involvement Screening Test (ASSIST): A Comparative Analysis of Computer-administered and Interviewer-administered Modalities

    PubMed Central

    Wolff, Nancy; Shi, Jing

    2015-01-01

    Substance use disorders are overrepresented in incarcerated male populations. Cost-effective screening for alcohol and substance use problems among incarcerated populations is a necessary first step toward intervention. The Alcohol, Smoking, and Substance Involvement Screening Test (ASSIST) holds promise because it has strong psychometric properties, requires minimal training, is easy to score, and is available in the public domain, but because of complicated skip patterns it cannot be self-administered. This study tests the feasibility, reliability, and validity of using computer-administered self-interviewing (CASI) versus interviewer-administered interviewing (IAI) to screen for substance use problems among incarcerated men using the ASSIST. A 2 × 2 factorial design was used to randomly assign 396 incarcerated men to screening modality. Findings indicate that computer screening was feasible. Compared to IAI, CASI produced equally reliable screening information on substance use and symptom severity, with test-retest intraclass correlations for ASSIST total and substance-specific scores ranging from 0.7 to 0.9, and ASSIST substance-specific scores and a substance abuse disorder diagnosis based on the Structured Clinical Interview (SCID) were significantly correlated for IAI and CASI. These findings indicate that data on substance use and symptom severity using the ASSIST can be reliably and validly obtained from CASI technology, increasing the efficiency with which incarcerated populations can be screened for substance use problems and those at risk identified for treatment. PMID:25659203
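
    For reference, a minimal sketch of a single-measure, consistency-type intraclass correlation (Shrout-Fleiss ICC(3,1)) of the kind that could be used for the test-retest comparison of ASSIST totals; this is only one of several ICC forms, and the scores below are placeholders.

        import numpy as np

        def icc_3_1(scores):
            """ICC(3,1): two-way mixed effects, consistency, single measurement.
            scores: (n_subjects, k_sessions) array, e.g. ASSIST totals at test and retest."""
            x = np.asarray(scores, dtype=float)
            n, k = x.shape
            grand = x.mean()
            ss_subjects = k * ((x.mean(axis=1) - grand) ** 2).sum()
            ss_sessions = n * ((x.mean(axis=0) - grand) ** 2).sum()
            ss_error = ((x - grand) ** 2).sum() - ss_subjects - ss_sessions
            ms_subjects = ss_subjects / (n - 1)
            ms_error = ss_error / ((n - 1) * (k - 1))
            return (ms_subjects - ms_error) / (ms_subjects + (k - 1) * ms_error)

        print(icc_3_1([[12, 14], [30, 28], [5, 6], [22, 25], [9, 9]]))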

  6. Screening for Substance Use Disorder Among Incarcerated Men with the Alcohol, Smoking, Substance Involvement Screening Test (ASSIST): A Comparative Analysis of Computer-Administered and Interviewer-Administered Modalities.

    PubMed

    Wolff, Nancy; Shi, Jing

    2015-06-01

    Substance use disorders are overrepresented in incarcerated male populations. Cost-effective screening for alcohol and substance use problems among incarcerated populations is a necessary first step toward intervention. The Alcohol, Smoking, and Substance Involvement Screening Test (ASSIST) holds promise because it has strong psychometric properties, requires minimal training, is easy to score, and is available in the public domain, but because of complicated skip patterns it cannot be self-administered. This study tests the feasibility, reliability, and validity of using computer-administered self-interviewing (CASI) versus interviewer-administered interviewing (IAI) to screen for substance use problems among incarcerated men using the ASSIST. A 2×2 factorial design was used to randomly assign 396 incarcerated men to screening modality. Findings indicate that computer screening was feasible. Compared to IAI, CASI produced equally reliable screening information on substance use and symptom severity, with test-retest intraclass correlations for ASSIST total and substance-specific scores ranging from 0.7 to 0.9, and ASSIST substance-specific scores and a substance abuse disorder diagnosis based on the Structured Clinical Interview (SCID) were significantly correlated for IAI and CASI. These findings indicate that data on substance use and symptom severity using the ASSIST can be reliably and validly obtained from CASI technology, increasing the efficiency with which incarcerated populations can be screened for substance use problems and those at risk identified for treatment. PMID:25659203

  7. Ultrafast 2D NMR: an emerging tool in analytical spectroscopy.

    PubMed

    Giraudeau, Patrick; Frydman, Lucio

    2014-01-01

    Two-dimensional nuclear magnetic resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan nature. A decade ago, a so-called ultrafast (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or heteronuclear correlation, in a single scan. During the intervening years, the performance of this subsecond 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool experiencing an expanded scope of applications. This review summarizes the principles and main developments that have contributed to the success of this approach and focuses on applications that have been recently demonstrated in various areas of analytical chemistry--from the real-time monitoring of chemical and biochemical processes, to extensions in hyphenated techniques and in quantitative applications. PMID:25014342

  8. Ultrafast 2D NMR: An Emerging Tool in Analytical Spectroscopy

    NASA Astrophysics Data System (ADS)

    Giraudeau, Patrick; Frydman, Lucio

    2014-06-01

    Two-dimensional nuclear magnetic resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan nature. A decade ago, a so-called ultrafast (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or heteronuclear correlation, in a single scan. During the intervening years, the performance of this subsecond 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool experiencing an expanded scope of applications. This review summarizes the principles and main developments that have contributed to the success of this approach and focuses on applications that have been recently demonstrated in various areas of analytical chemistry—from the real-time monitoring of chemical and biochemical processes, to extensions in hyphenated techniques and in quantitative applications.

  9. Tools for building a comprehensive modeling system for virtual screening under real biological conditions: The Computational Titration algorithm.

    PubMed

    Kellogg, Glen E; Fornabaio, Micaela; Chen, Deliang L; Abraham, Donald J; Spyrakis, Francesca; Cozzini, Pietro; Mozzarelli, Andrea

    2006-05-01

    Computational tools utilizing a unique empirical modeling system based on the hydrophobic effect and the measurement of logP(o/w) (the partition coefficient for solvent transfer between 1-octanol and water) are described. The associated force field, Hydropathic INTeractions (HINT), contains much rich information about non-covalent interactions in the biological environment because of its basis in an experiment that measures interactions in solution. HINT is shown to be the core of an evolving virtual screening system that is capable of taking into account a number of factors often ignored such as entropy, effects of solvent molecules at the active site, and the ionization states of acidic and basic residues and ligand functional groups. The outline of a comprehensive modeling system for virtual screening that incorporates these features is described. In addition, a detailed description of the Computational Titration algorithm is provided. As an example, three complexes of dihydrofolate reductase (DHFR) are analyzed with our system and these results are compared with the experimental free energies of binding. PMID:16236534

  10. Three-dimensional (3D) microarchitecture correlations with 2D projection image gray-level variations assessed by trabecular bone score using high-resolution computed tomographic acquisitions: effects of resolution and noise.

    PubMed

    Winzenrieth, Renaud; Michelet, Franck; Hans, Didier

    2013-01-01

    The aim of the present study is to determine the level of correlation between the 3-dimensional (3D) characteristics of trabecular bone microarchitecture, as evaluated using microcomputed tomography (μCT) reconstruction, and trabecular bone score (TBS), as evaluated using 2D projection images directly derived from 3D μCT reconstruction (TBSμCT). Moreover, we have evaluated the effects of image degradation (resolution and noise) and X-ray energy of projection on these correlations. Thirty human cadaveric vertebrae were acquired on a microscanner at an isotropic resolution of 93 μm. The 3D microarchitecture parameters were obtained using MicroView (GE Healthcare, Wauwatosa, MI). The 2D projections of these 3D models were generated using the Beer-Lambert law at different X-ray energies. Degradation of image resolution was simulated (from 93 to 1488 μm). Relationships between 3D microarchitecture parameters and TBSμCT at different resolutions were evaluated using linear regression analysis. Significant correlations were observed between TBSμCT and 3D microarchitecture parameters, regardless of the resolution. Correlations were detected that were strongly to intermediately positive for connectivity density (0.711 ≤ r² ≤ 0.752) and trabecular number (0.584 ≤ r² ≤ 0.648) and negative for trabecular space (-0.407 ≤ r² ≤ -0.491), up to a pixel size of 1023 μm. In addition, TBSμCT values were strongly correlated between each other (0.77 ≤ r² ≤ 0.96). Study results show that the correlations between TBSμCT at 93 μm and 3D microarchitecture parameters are weakly impacted by the degradation of image resolution and the presence of noise. PMID:22749406

  11. Computational Toxicology as Implemented by the U.S. EPA: Providing High Throughput Decision Support Tools for Screening and Assessing Chemical Exposure, Hazard and Risk

    EPA Science Inventory

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environ...

  12. NKG2D ligands as therapeutic targets

    PubMed Central

    Spear, Paul; Wu, Ming-Ru; Sentman, Marie-Louise; Sentman, Charles L.

    2013-01-01

    The Natural Killer Group 2D (NKG2D) receptor plays an important role in protecting the host from infections and cancer. By recognizing ligands induced on infected or tumor cells, NKG2D modulates lymphocyte activation and promotes immunity to eliminate ligand-expressing cells. Because these ligands are not widely expressed on healthy adult tissue, NKG2D ligands may present a useful target for immunotherapeutic approaches in cancer. Novel therapies targeting NKG2D ligands for the treatment of cancer have shown preclinical success and are poised to enter into clinical trials. In this review, the NKG2D receptor and its ligands are discussed in the context of cancer, infection, and autoimmunity. In addition, therapies targeting NKG2D ligands in cancer are also reviewed. PMID:23833565

  13. Relative speeds of Kodak computed radiography phosphors and screen-film systems.

    PubMed

    Huda, W; Rill, L N; Bruner, A P

    1997-10-01

    Relative mAs values required to generate a constant plate readout signal for the Kodak Ektascan general purpose (GP-25) and high resolution (HR) photostimulable phosphors were measured as a function of x-ray beam quality and for a range of representative x-ray examinations. The signal intensity was determined from the exposure index (EI) generated during the read out of uniformly exposed phosphor imaging plates. These data were compared to the corresponding relative mAs values required to produce a constant film density of Lanex screen-film combinations with nominal speeds of 40, 400, and 600. The relative detection performance of the photostimulable phosphors generally decreased with increasing kVp and beam filtration. The relative response of GP-25 phosphors was independent of examination type, and modified by approximately 10% when scattered radiation was present. The HR phosphor was more efficient than a Lanex Single Fine extremity screen used with an EM-1 film. These relative response data will be useful for selecting the x-ray technique factors which minimize patient dose in x-ray examinations performed with photostimulable phosphors. PMID:9350716

  14. Resonances of piezoelectric plate with embedded 2D electron system

    NASA Astrophysics Data System (ADS)

    Suslov, A. V.

    2009-02-01

    A thin GaAs/AlGaAs plate was studied by resonant ultrasound spectroscopy (RUS) in the temperature range 0.3-10 K and in magnetic fields of up to 18 T. The resonance frequencies and linewidths were measured. Quantum oscillations of both quantities were observed and were associated with the quantum Hall effect occurring in the 2D electron system. For the analysis, the sample was treated as a dielectric piezoelectric plate covered on one side by a film with a field-dependent conductivity. Screening of the strain-driven electric field changed due to the variation of the electron relaxation time in the vicinity of the metal-dielectric transitions caused by the magnetic field in the 2D system. The dielectric film does not affect the properties of GaAs, and thus the resonance frequencies are defined only by the elastic, piezoelectric and dielectric constants of GaAs. A metallic 2D sheet effectively screens the parallel electric field, so the ultrasound wave velocities and resonance frequencies decrease when the sheet conductivity increases. Oscillations of the resonance linewidth reflect the influence of the 2D system on the ultrasound attenuation, which is proportional to the linewidth. A metallic film, like a dielectric one, does not affect this attenuation, but at some finite nonzero value of the conductivity the linewidth reaches a maximum. In a high magnetic field, each oscillation of the conductivity produces one oscillation of a resonance frequency and two linewidth peaks. The observed phenomena can be described by relaxation-type equations, and resonant ultrasound spectroscopy opens another opportunity for contactless studies of 2D electron systems.

  15. Computational challenges and human factors influencing the design and use of clinical research participant eligibility pre-screening tools

    PubMed Central

    2012-01-01

    Background Clinical trials are the primary mechanism for advancing clinical care and evidence-based practice, yet challenges with the recruitment of participants for such trials are widely recognized as a major barrier to these types of studies. Data warehouses (DW) store large amounts of heterogeneous clinical data that can be used to enhance recruitment practices, but multiple challenges exist when using a data warehouse for such activities, due to the manner of collection, management, integration, analysis, and dissemination of the data. A critical step in leveraging the DW for recruitment purposes is being able to match trial eligibility criteria to discrete and semi-structured data types in the data warehouse, though trial eligibility criteria tend to be written without concern for their computability. We present the multi-modal evaluation of a web-based tool that can be used for pre-screening patients for clinical trial eligibility and assess the ability of this tool to be practically used for clinical research pre-screening and recruitment. Methods The study used a validation study, usability testing, and a heuristic evaluation to evaluate and characterize the operational characteristics of the software as well as human factors affecting its use. Results Clinical trials from the Division of Cardiology and the Department of Family Medicine were used for this multi-modal evaluation, which included a validation study, usability study, and a heuristic evaluation. From the results of the validation study, the software demonstrated a positive predictive value (PPV) of 54.12% and 0.7%, respectively, and a negative predictive value (NPV) of 73.3% and 87.5%, respectively, for two types of clinical trials. Heuristic principles concerning error prevention and documentation were characterized as the major usability issues during the heuristic evaluation. Conclusions This software is intended to provide an initial list of eligible patients to a clinical study

  16. Sparse radar imaging using 2D compressed sensing

    NASA Astrophysics Data System (ADS)

    Hou, Qingkai; Liu, Yang; Chen, Zengping; Su, Shaoying

    2014-10-01

    Radar imaging is an ill-posed linear inverse problem, and compressed sensing (CS) has been shown to have tremendous potential in this field. This paper surveys the theory of radar imaging and concludes that ISAR imaging can be formulated mathematically as a 2D sparse decomposition problem. Based on CS, we propose a novel measurement strategy for ISAR imaging radar that uses random sub-sampling in both the range and azimuth dimensions, which reduces the amount of sampled data tremendously. To handle the 2D reconstruction problem, the usual solution is to convert it into a 1D problem via the Kronecker product, which sharply increases the dictionary size and computational cost. In this paper, we instead introduce the 2D-SL0 algorithm for image reconstruction. It is shown that 2D-SL0 achieves results equivalent to those of 1D reconstruction methods, but with significantly reduced computational complexity and memory usage. Moreover, we present simulation results that demonstrate the effectiveness and feasibility of our method.
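
    The point about the Kronecker product can be made concrete in a few lines of numpy: the separable 2D measurement Y = A X B^T is mathematically identical to applying the much larger Kronecker matrix (B kron A) to the column-stacked vector vec(X), which is why 2D solvers such as 2D-SL0 avoid ever forming that matrix. Matrix sizes and the sparse scene below are arbitrary illustrations.

        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.normal(size=(8, 16))    # sub-sampled measurement matrix, range dimension
        B = rng.normal(size=(6, 12))    # sub-sampled measurement matrix, azimuth dimension
        X = np.zeros((16, 12))          # sparse 2D scene (e.g., a few dominant scatterers)
        X[3, 4], X[10, 7] = 2.0, -1.5

        Y_2d = A @ X @ B.T                                                         # separable 2D model
        Y_kron = (np.kron(B, A) @ X.flatten(order="F")).reshape((8, 6), order="F")
        print(np.allclose(Y_2d, Y_kron))   # True: identical measurements
        print(np.kron(B, A).shape)         # (48, 192): the memory cost of the 1D (Kronecker) route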

  17. Traditional and Computer-Based Screening and Diagnosis of Reading Disabilities in Greek

    ERIC Educational Resources Information Center

    Protopapas, Athanassios; Skaloumbakas, Christos

    2007-01-01

    In this study, we examined the characteristics of reading disability (RD) in the seventh grade of the Greek educational system and the corresponding diagnostic practice. We presented a clinically administered assessment battery, composed of typically employed tasks, and a fully automated, computer-based assessment battery that evaluates some of…

  18. Identification of Potent Chemotypes Targeting Leishmania major Using a High-Throughput, Low-Stringency, Computationally Enhanced, Small Molecule Screen

    PubMed Central

    Sharlow, Elizabeth R.; Close, David; Shun, Tongying; Leimgruber, Stephanie; Reed, Robyn; Mustata, Gabriela; Wipf, Peter; Johnson, Jacob; O'Neil, Michael; Grögl, Max; Magill, Alan J.; Lazo, John S.

    2009-01-01

    Patients with clinical manifestations of leishmaniasis, including cutaneous leishmaniasis, have limited treatment options, and existing therapies frequently have significant untoward liabilities. Rapid expansion in the diversity of available cutaneous leishmanicidal chemotypes is the initial step in finding alternative efficacious treatments. To this end, we combined a low-stringency Leishmania major promastigote growth inhibition assay with a structural computational filtering algorithm. After a rigorous assay validation process, we interrogated ∼200,000 unique compounds for L. major promastigote growth inhibition. Using iterative computational filtering of the compounds exhibiting >50% inhibition, we identified 553 structural clusters and 640 compound singletons. Secondary confirmation assays yielded 93 compounds with EC50s ≤ 1 µM, with none of the identified chemotypes being structurally similar to known leishmanicidals and most having favorable in silico predicted bioavailability characteristics. The leishmanicidal activity of a representative subset of 15 chemotypes was confirmed in two independent assay formats, and L. major parasite specificity was demonstrated by assaying against a panel of human cell lines. Thirteen chemotypes inhibited the growth of a L. major axenic amastigote-like population. Murine in vivo efficacy studies using one of the new chemotypes document inhibition of footpad lesion development. These results authenticate that low stringency, large-scale compound screening combined with computational structure filtering can rapidly expand the chemotypes targeting in vitro and in vivo Leishmania growth and viability. PMID:19888337

  19. Application of computer-extracted breast tissue texture features in predicting false-positive recalls from screening mammography

    NASA Astrophysics Data System (ADS)

    Ray, Shonket; Choi, Jae Y.; Keller, Brad M.; Chen, Jinbo; Conant, Emily F.; Kontos, Despina

    2014-03-01

    Mammographic texture features have been shown to have value in breast cancer risk assessment. Previous models have also been developed that use computer-extracted mammographic features of breast tissue complexity to predict the risk of false-positive (FP) recall from breast cancer screening with digital mammography. This work details a novel locally adaptive parenchymal texture analysis algorithm that identifies and extracts mammographic features of local parenchymal tissue complexity potentially relevant for false-positive biopsy prediction. This algorithm has two important aspects: (1) the adaptive nature of automatically determining an optimal number of regions of interest (ROIs) in the image and each ROI's corresponding size based on the parenchymal tissue distribution over the whole breast region and (2) characterizing both the local and global mammographic appearances of the parenchymal tissue that could provide more discriminative information for FP biopsy risk prediction. Preliminary results show that this locally adaptive texture analysis algorithm, in conjunction with logistic regression, can predict the likelihood of false-positive biopsy with an ROC performance of AUC = 0.92 (p < 0.001; 95% confidence interval [0.77, 0.94]). Significant texture feature predictors (p < 0.05) included contrast, sum variance and difference average. Sensitivity for false-positives was 51% at the 100% cancer detection operating point. Although preliminary, clinical implications of using prediction models incorporating these texture features may include the future development of better tools and guidelines regarding personalized breast cancer screening recommendations. Further studies are warranted to prospectively validate our findings in larger screening populations and evaluate their clinical utility.
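
    A rough sketch of the downstream modeling step (computing a gray-level co-occurrence contrast feature per region and feeding it to logistic regression) using scikit-image and scikit-learn; the ROI handling, the omission of the sum-variance and difference-average features, and all data are stand-ins for the authors' locally adaptive pipeline.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)

        def roi_contrast(roi):
            """GLCM contrast of one 8-bit ROI (single offset and angle, for brevity)."""
            glcm = graycomatrix(roi, distances=[1], angles=[0], levels=256, symmetric=True, normed=True)
            return graycoprops(glcm, "contrast")[0, 0]

        # Hypothetical dataset: one texture feature per case, plus false-positive recall labels.
        rois = [rng.integers(0, 256, size=(64, 64), dtype=np.uint8) for _ in range(40)]
        X = np.array([[roi_contrast(r)] for r in rois])
        y = rng.integers(0, 2, size=40)

        clf = LogisticRegression().fit(X, y)
        print(roc_auc_score(y, clf.predict_proba(X)[:, 1]))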

  20. Perspectives for spintronics in 2D materials

    NASA Astrophysics Data System (ADS)

    Han, Wei

    2016-03-01

    The past decade has been especially creative for spintronics since the (re)discovery of various two-dimensional (2D) materials. Due to their unusual physical characteristics, 2D materials have provided new platforms for probing the interaction of spin with other electronic degrees of freedom, as well as for novel spintronics applications. This review briefly presents the most important recent and ongoing research on spintronics in 2D materials.

  1. Radiative heat transfer in 2D Dirac materials

    DOE PAGESBeta

    Rodriguez-López, Pablo; Tse, Wang -Kong; Dalvit, Diego A. R.

    2015-05-12

    We compute the radiative heat transfer between two sheets of 2D Dirac materials, including topological Chern insulators and graphene, within the framework of the local approximation for the optical response of these materials. In this approximation, which neglects spatial dispersion, we derive both numerically and analytically the short-distance asymptotics of the near-field heat transfer in these systems, and show that it scales as the inverse of the distance between the two sheets. Finally, we discuss the limitations on the validity of this scaling law imposed by spatial dispersion in 2D Dirac materials.

  2. STEALTH - a Lagrange explicit finite-difference code for solid, structural, and thermohydraulic analysis. Volume 8A: STEALTH/WHAMSE - a 2-D fluid-structure interaction code. Computer code manual

    SciTech Connect

    Gross, M.B.

    1984-10-01

    STEALTH is a family of computer codes that can be used to calculate a variety of physical processes in which the dynamic behavior of a continuum is involved. The version of STEALTH described in this volume is designed for calculations of fluid-structure interaction. This version of the program consists of a hydrodynamic version of STEALTH which has been coupled to a finite-element code, WHAMSE. STEALTH computes the transient response of the fluid continuum, while WHAMSE computes the transient response of shell and beam structures under external fluid loadings. The coupling between STEALTH and WHAMSE is performed during each cycle or step of a calculation. Separate calculations of fluid response and structural response are avoided, thereby giving a more accurate model of the dynamic coupling between fluid and structure. This volume provides the theoretical background, the finite-difference equations, the finite-element equations, a discussion of several sample problems, a listing of the input decks for the sample problems, a programmer's manual and a description of the input records for the STEALTH/WHAMSE computer program.

  3. The physics of computed radiography: Measurements of pulse height spectra of photostimulable phosphor screens using prompt luminescence

    SciTech Connect

    Watt, Kristina N.; Yan, Kuo; DeCrescenzo, Giovanni; Rowlands, J. A.

    2005-12-15

    Computed radiography (CR) is a digital technology that employs reusable photostimulable phosphor (PSP) imaging plates (IP) to acquire radiographic images. In CR, the x-ray attenuation pattern of the imaged object is temporarily stored as a latent charge image within the PSP. The latent image is optically read out as photostimulated luminescence (PSL) when the phosphor is subsequently stimulated using a scanning laser. The multiple stages necessary to create a CR image make it difficult to investigate either experimentally or theoretically. In order to examine the performance of the CR system at a fundamental level, separate measurements of the processes involved are desirable. Here, pulse height spectroscopy is used to study the prompt violet light emission, or prompt luminescence (PL), from commercial PSP screens. Since the mechanism by which light escapes from the phosphor is identical for PL and PSL, observations and conclusions based on the pulse height spectra (PHS) of PL are relevant to understanding the behavior of the PSL light emission that outputs the radiographic image in CR. The PL PHS of screens of different thickness and optical properties were measured and compared with the PHS of conventional phosphors. A new method for calibrating the PHS in terms of the absolute number of optical photons per x-ray is introduced and compared to previously established methods.

  4. Gold silver alloy nanoparticles (GSAN): an imaging probe for breast cancer screening with dual-energy mammography or computed tomography.

    PubMed

    Naha, Pratap C; Lau, Kristen C; Hsu, Jessica C; Hajfathalian, Maryam; Mian, Shaameen; Chhour, Peter; Uppuluri, Lahari; McDonald, Elizabeth S; Maidment, Andrew D A; Cormode, David P

    2016-07-14

    Earlier detection of breast cancer reduces mortality from this disease. As a result, the development of better screening techniques is a topic of intense interest. Contrast-enhanced dual-energy mammography (DEM) is a novel technique that has improved sensitivity for cancer detection. However, the development of contrast agents for this technique is in its infancy. We herein report gold-silver alloy nanoparticles (GSAN) that have potent DEM contrast properties and improved biocompatibility. GSAN formulations containing a range of gold : silver ratios and capped with m-PEG were synthesized and characterized using various analytical methods. DEM and computed tomography (CT) phantom imaging showed that GSAN produced robust contrast that was comparable to silver alone. Cell viability, reactive oxygen species generation and DNA damage results revealed that the formulations with 30% or higher gold content are cytocompatible with Hep G2 and J774A.1 cells. In vivo imaging was performed in mice with and without breast tumors. The results showed that GSAN produced strong DEM and CT contrast and accumulated in tumors. Furthermore, both in vivo imaging and ex vivo analysis indicated the excretion of GSAN via both urine and feces. In summary, GSAN produce strong DEM and CT contrast and have potential for both blood pool imaging and breast cancer screening. PMID:27412458

  5. Interobserver variations on interpretation of multislice CT lung cancer screening studies, and the implications for computer-aided diagnosis

    NASA Astrophysics Data System (ADS)

    Novak, Carol L.; Qian, JianZhong; Fan, Li; Ko, Jane P.; Rubinowitz, Ami N.; McGuinness, Georgeann; Naidich, David

    2002-04-01

    With low dose multi-slice CT for screening of lung cancer, physicians are now finding and examining increasingly smaller nodules. However as the size of detectable nodules becomes smaller, there may be greater differences among physicians as to what is detected and what constitutes a nodule. In this study, 10 CT screening studies of smokers were individually evaluated by three thoracic radiologists. After consensus to determine a gold standard, the number of nodules detected by individual radiologists ranged from 1.4 to 2.1 detections per patient. Each radiologist detected nodules missed by the other two. Although a total of 26 true nodules were detected by one or more radiologists, only 8 (31%) were detected by all three radiologists. The number of true nodules detected by an integrated automatic detection algorithm was 3.2 per patient after radiologist validation. Including these nodules in the gold standard set reduced the sensitivity of nodule detection by each radiologist to less than half. The sensitivity of nodule detection by the computer was better at 64%, proving especially efficacious for detecting smaller and more central nodules. Use of the automatic detection module would allow individual radiologists to increase the number of detected nodules by 114% to 207%.

  6. The effects of on-screen, point of care computer reminders on processes and outcomes of care

    PubMed Central

    Shojania, Kaveh G; Jennings, Alison; Mayhew, Alain; Ramsay, Craig R; Eccles, Martin P; Grimshaw, Jeremy

    2014-01-01

    Background The opportunity to improve care by delivering decision support to clinicians at the point of care represents one of the main incentives for implementing sophisticated clinical information systems. Previous reviews of computer reminder and decision support systems have reported mixed effects, possibly because they did not distinguish point of care computer reminders from e-mail alerts, computer-generated paper reminders, and other modes of delivering ‘computer reminders’. Objectives To evaluate the effects on processes and outcomes of care attributable to on-screen computer reminders delivered to clinicians at the point of care. Search methods We searched the Cochrane EPOC Group Trials register, MEDLINE, EMBASE and CINAHL and CENTRAL to July 2008, and scanned bibliographies from key articles. Selection criteria Studies of a reminder delivered via a computer system routinely used by clinicians, with a randomised or quasi-randomised design and reporting at least one outcome involving a clinical endpoint or adherence to a recommended process of care. Data collection and analysis Two authors independently screened studies for eligibility and abstracted data. For each study, we calculated the median improvement in adherence to target processes of care and also identified the outcome with the largest such improvement. We then calculated the median absolute improvement in process adherence across all studies using both the median outcome from each study and the best outcome. Main results Twenty-eight studies (reporting a total of thirty-two comparisons) were included. Computer reminders achieved a median improvement in process adherence of 4.2% (interquartile range (IQR): 0.8% to 18.8%) across all reported process outcomes, 3.3% (IQR: 0.5% to 10.6%) for medication ordering, 3.8% (IQR: 0.5% to 6.6%) for vaccinations, and 3.8% (IQR: 0.4% to 16.3%) for test ordering. In a sensitivity analysis using the best outcome from each study, the median improvement was 5
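
    A minimal sketch of the aggregation step described above (an illustration with invented numbers, not the review's analysis code): each study contributes the median, and separately the best, absolute improvement across its reported process-of-care outcomes, and those per-study values are then summarized as a median with an interquartile range.

```python
# Minimal sketch with invented numbers: per-study median and best-outcome
# improvements summarized as a median and interquartile range, mirroring the
# aggregation described in the review. Not the review's analysis code.
import numpy as np

# Each inner list: absolute improvements (percentage points) for every
# process-of-care outcome reported by one study (hypothetical values).
study_outcomes = [
    [1.0, 4.2, 7.5],
    [0.3, 2.8],
    [18.8, 5.0, 0.8, 12.0],
    [6.6, 0.4],
]

per_study_median = [np.median(o) for o in study_outcomes]  # "median outcome" per study
per_study_best = [max(o) for o in study_outcomes]          # "best outcome" sensitivity analysis

for label, values in [("median outcome", per_study_median),
                      ("best outcome", per_study_best)]:
    q25, q50, q75 = np.percentile(values, [25, 50, 75])
    print(f"{label}: median {q50:.1f}% (IQR {q25:.1f}% to {q75:.1f}%)")
```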

  7. Atypical cytostatic mechanism of N-1-sulfonylcytosine derivatives determined by in vitro screening and computational analysis.

    PubMed

    Supek, Fran; Kralj, Marijeta; Marjanović, Marko; Suman, Lidija; Smuc, Tomislav; Krizmanić, Irena; Zinić, Biserka

    2008-04-01

    We have previously shown that N-1-sulfonylpyrimidine derivatives have strong antiproliferative activity on human tumor cell lines, whereby 1-(p-toluenesulfonyl)cytosine showed good selectivity with regard to normal cells and was easily synthesized on a large scale. In the present work we have used an interdisciplinary approach to elucidate the compounds' mechanistic class. An augmented number of cell lines (11) has allowed a computational search for compounds with similar activity profiles and/or mechanistic class by integrating our data with the comprehensive DTP-NCI database. We applied supervised machine learning methodology (Random Forest classifier), which offers information complementary to unsupervised algorithms commonly used for analysis of cytostatic activity profiles, such as self-organizing maps. The computational results taken together with cell cycle perturbation and apoptosis analysis of the cell lines point to an unusual mechanism of cytostatic action, possibly a combination of nucleic acid antimetabolite activity and a novel molecular mechanism. PMID:17898928

  8. Computational redesign of bacterial biotin carboxylase inhibitors using structure-based virtual screening of combinatorial libraries.

    PubMed

    Brylinski, Michal; Waldrop, Grover L

    2014-01-01

    As the spread of antibiotic resistant bacteria steadily increases, there is an urgent need for new antibacterial agents. Because fatty acid synthesis is only used for membrane biogenesis in bacteria, the enzymes in this pathway are attractive targets for antibacterial agent development. Acetyl-CoA carboxylase catalyzes the committed and regulated step in fatty acid synthesis. In bacteria, the enzyme is composed of three distinct protein components: biotin carboxylase, biotin carboxyl carrier protein, and carboxyltransferase. Fragment-based screening revealed that amino-oxazole inhibits biotin carboxylase activity and also exhibits antibacterial activity against Gram-negative organisms. In this report, we redesigned previously identified lead inhibitors to expand the spectrum of bacteria sensitive to the amino-oxazole derivatives by including Gram-positive species. Using 9,411 small organic building blocks, we constructed a diverse combinatorial library of 1.2×10⁸ amino-oxazole derivatives. A subset of 9×10⁶ of these compounds were subjected to structure-based virtual screening against seven biotin carboxylase isoforms using similarity-based docking by eSimDock. Potentially broad-spectrum antibiotic candidates were selected based on the consensus ranking by several scoring functions including non-linear statistical models implemented in eSimDock and traditional molecular mechanics force fields. The analysis of binding poses of the top-ranked compounds docked to biotin carboxylase isoforms suggests that: (1) binding of the amino-oxazole anchor is stabilized by a network of hydrogen bonds to residues 201, 202 and 204; (2) halogenated aromatic moieties attached to the amino-oxazole scaffold enhance interactions with a hydrophobic pocket formed by residues 157, 169, 171 and 203; and (3) larger substituents reach deeper into the binding pocket to form additional hydrogen bonds with the side chains of residues 209 and 233. These structural insights into drug

  9. Computational screening of disease-associated mutations in OCA2 gene.

    PubMed

    Kamaraj, Balu; Purohit, Rituraj

    2014-01-01

    Oculocutaneous albinism type 2 (OCA2), caused by mutations of OCA2 gene, is an autosomal recessive disorder characterized by reduced biosynthesis of melanin pigment in the skin, hair, and eyes. The OCA2 gene encodes instructions for making a protein called the P protein. This protein plays a crucial role in melanosome biogenesis, and controls the eumelanin content in melanocytes in part via the processing and trafficking of tyrosinase which is the rate-limiting enzyme in melanin synthesis. In this study we analyzed the pathogenic effect of 95 non-synonymous single nucleotide polymorphisms reported in OCA2 gene using computational methods. We found R305W mutation as most deleterious and disease associated using SIFT, PolyPhen, PANTHER, PhD-SNP, Pmut, and MutPred tools. To understand the atomic arrangement in 3D space, the native and mutant (R305W) structures were modeled. Molecular dynamics simulation was conducted to observe the structural significance of computationally prioritized disease-associated mutation (R305W). Root-mean-square deviation, root-mean-square fluctuation, radius of gyration, solvent accessibility surface area, hydrogen bond (NH bond), trace of covariance matrix, eigenvector projection analysis, and density analysis results showed prominent loss of stability and rise in mutant flexibility values in 3D space. This study presents a well designed computational methodology to examine the albinism-associated SNPs. PMID:23824587
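
    Among the trajectory analyses listed above, the root-mean-square deviation is the most standard measure of structural drift. The sketch below is an illustration, not the authors' workflow (which would operate on MD trajectory frames of the native and R305W models rather than synthetic coordinates): it computes RMSD after optimal superposition using the Kabsch algorithm.

```python
# Minimal sketch with synthetic coordinates: RMSD after optimal superposition
# (Kabsch algorithm), one of the stability measures listed above. A real
# workflow would compare MD trajectory frames of the native and mutant models.
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two (N, 3) coordinate arrays after optimal superposition."""
    P = P - P.mean(axis=0)                   # centre both structures
    Q = Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)        # SVD of the 3x3 covariance matrix
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # optimal rotation of P onto Q
    return np.sqrt(((P @ R.T - Q) ** 2).sum() / len(P))

# Synthetic check: a rigidly rotated and translated copy gives RMSD ~ 0.
rng = np.random.default_rng(0)
native = rng.random((100, 3)) * 10.0         # e.g. 100 C-alpha positions (arbitrary)
theta = 0.3
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0,            0.0,           1.0]])
moved = native @ rot.T + 2.0
print(round(kabsch_rmsd(native, moved), 6))  # ~ 0.0
```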

  10. A Study for Visual Realism of Designed Pictures on Computer Screens by Investigation and Brain-Wave Analyses.

    PubMed

    Wang, Lan-Ting; Lee, Kun-Chou

    2016-08-01

    In this article, the visual realism of designed pictures on computer screens is studied by investigation and brain-wave analyses. Practical electroencephalogram (EEG) measurements are always time-varying and fluctuating, so conventional statistical techniques are not adequate for their analysis. This study proposes a new scheme based on "fingerprinting" to analyze the EEG. Fingerprinting is a technique of probabilistic pattern recognition used in electrical engineering, much like the identification of human fingerprints in a criminal investigation. The goal of this study was to assess whether subjective preference for pictures could be manifested physiologically by EEG fingerprinting analyses. The most important advantage of the fingerprinting technique is that it does not require accurate measurement; instead, it uses probabilistic classification. Participants' preference for pictures can be assessed using fingerprinting analyses of physiological EEG measurements. PMID:27324166

  11. [Phantom Study on Dose Reduction Using Iterative Reconstruction in Low-dose Computed Tomography for Lung Cancer Screening].

    PubMed

    Minehiro, Kaori; Takata, Tadanori; Hayashi, Hiroyuki; Sakuda, Keita; Nunome, Haruka; Kawashima, Hiroko; Sanada, Shigeru

    2015-12-01

    We investigated the dose reduction capability of an iterative reconstruction technology for low-dose computed tomography (CT) lung cancer screening. The Sinogram Affirmed Iterative Reconstruction (SAFIRE) provided in a multi-slice CT system, Somatom Definition Flash (Siemens Healthcare), was used. An anthropomorphic chest phantom (N-1, Kyoto Kagaku) was scanned at a volume CT dose index (CTDIvol) of 0.50-11.86 mGy at 120 kV. For noise (standard deviation) and contrast-to-noise ratio (CNR) measurements, the CTP486 and CTP515 modules of the Catphan phantom (The Phantom Laboratory) were scanned. Radiological technologists participated in the perceptual comparison. SAFIRE reduced the SD values by approximately 50% compared with filtered back projection (FBP). The dose reduction rate with SAFIRE estimated from the perceptual comparison was approximately 23%, whereas a 75% dose reduction would be expected from the 50% reduction in SD. PMID:26685831
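
    The expected 75% figure follows from the usual quantum-noise assumption (a standard CT physics relation, not stated in the abstract itself) that image noise scales with the inverse square root of dose:

    $$ \sigma \propto \frac{1}{\sqrt{D}} \quad\Rightarrow\quad \frac{\sigma_{\mathrm{SAFIRE}}}{\sigma_{\mathrm{FBP}}} = \frac{1}{2} \;\Rightarrow\; D_{\mathrm{matched}} \approx \frac{D_{\mathrm{FBP}}}{4}, $$

    i.e. halving the noise at fixed reconstruction would nominally allow the dose to be cut to one quarter (a 75% reduction), which is why the 23% figure obtained from perceptual image quality is the more conservative estimate.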

  12. The Emerging Roles of Coronary Computed Tomographic Angiography: Acute Chest Pain Evaluation and Screening for Asymptomatic Individuals

    PubMed Central

    Chien, Ning; Wang, Tzung-Dau; Chang, Yeun-Chung; Lin, Po-Chih; Tseng, Yao-Hui; Lee, Yee-Fan; Ko, Wei-Chun; Lee, Bai-Chin; Lee, Wen-Jeng

    2016-01-01

    Coronary computed tomographic angiography (CCTA) has been widely available since 2004. Since then, the diagnostic accuracy of CCTA for detecting coronary arterial stenosis has been extensively validated against invasive coronary angiography. In this paper, we review the updated evidence on the role of CCTA in two scenarios: acute chest pain evaluation and screening of asymptomatic adults. Several large-scale studies have been conducted to evaluate the diagnostic value of CCTA in patients with acute chest pain, where CCTA could play a role in delivering more efficient care. For risk stratification of asymptomatic patients using CCTA, the latest studies have revealed incremental benefits. Future studies evaluating the totality of plaque characteristics may be useful for determining the role of noncalcified plaque in risk stratification of asymptomatic individuals. PMID:27122947

  13. Discovery of small molecule inhibitors of MyD88-dependent signaling pathways using a computational screen

    PubMed Central

    Olson, Mark A.; Lee, Michael S.; Kissner, Teri L.; Alam, Shahabuddin; Waugh, David S.; Saikh, Kamal U.

    2015-01-01

    In this study, we used high-throughput computational screening to discover drug-like inhibitors of the host MyD88 protein-protein signaling interaction implicated in the potentially lethal immune response associated with Staphylococcal enterotoxins. We built a protein-protein dimeric docking model of the Toll-interleukin receptor (TIR)-domain of MyD88 and identified a binding site for docking small molecules. Computational screening of 5 million drug-like compounds led to testing of 30 small molecules; one of these molecules inhibits the TIR-TIR domain interaction and attenuates pro-inflammatory cytokine production in human primary cell cultures. Compounds chemically similar to this hit from the PubChem database were observed to be more potent with improved drug-like properties. Most of these 2nd generation compounds inhibit Staphylococcal enterotoxin B (SEB)-induced TNF-α, IFN-γ, IL-6, and IL-1β production at 2–10 μM in human primary cells. Biochemical analysis and a cell-based reporter assay revealed that the most promising compound, T6167923, disrupts MyD88 homodimeric formation, which is critical for its signaling function. Furthermore, we observed that administration of a single dose of T6167923 completely protects mice from lethal SEB-induced toxic shock. In summary, our in silico approach has identified anti-inflammatory inhibitors against in vitro and in vivo toxin exposure with promise to treat other MyD88-related pro-inflammatory diseases. PMID:26381092

  14. Deriving Heterospecific Self-Assembling Protein–Protein Interactions Using a Computational Interactome Screen

    PubMed Central

    Crooks, Richard O.; Baxter, Daniel; Panek, Anna S.; Lubben, Anneke T.; Mason, Jody M.

    2016-01-01

    Interactions between naturally occurring proteins are highly specific, with protein-network imbalances associated with numerous diseases. For designed protein–protein interactions (PPIs), required specificity can be notoriously difficult to engineer. To accelerate this process, we have derived peptides that form heterospecific PPIs when combined. This is achieved using software that generates large virtual libraries of peptide sequences and searches within the resulting interactome for preferentially interacting peptides. To demonstrate feasibility, we have (i) generated 1536 peptide sequences based on the parallel dimeric coiled-coil motif and varied residues known to be important for stability and specificity, (ii) screened the 1,180,416 member interactome for predicted Tm values and (iii) used predicted Tm cutoff points to isolate eight peptides that form four heterospecific PPIs when combined. This required that all 32 hypothetical off-target interactions within the eight-peptide interactome be disfavoured and that the four desired interactions pair correctly. Lastly, we have verified the approach by characterising all 36 pairs within the interactome. In analysing the output, we hypothesised that several sequences are capable of adopting antiparallel orientations. We subsequently improved the software by removing sequences where doing so led to fully complementary electrostatic pairings. Our approach can be used to derive increasingly large and therefore complex sets of heterospecific PPIs with a wide range of potential downstream applications from disease modulation to the design of biomaterials and peptides in synthetic biology. PMID:26655848
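
    The screening step described above reduces to enumerating every pairwise combination of library sequences, scoring each pair with a stability predictor, and keeping the pairs that clear a predicted-Tm cutoff. A minimal sketch of that enumeration follows; the predict_tm function is a hypothetical placeholder, not the authors' predictor, and the toy library stands in for the 1536 designed sequences.

```python
# Minimal sketch with a toy library: enumerate the pairwise interactome and
# keep pairs above a predicted-Tm cutoff. predict_tm is a hypothetical
# placeholder, not the authors' coiled-coil stability predictor.
from itertools import combinations_with_replacement

def predict_tm(seq_a: str, seq_b: str) -> float:
    """Placeholder scoring function standing in for a real Tm predictor."""
    return float(len(set(seq_a) & set(seq_b)))  # dummy score

library = ["EIAALEK" * 4, "KIAALKE" * 4, "EIQALEK" * 4, "KIQALKE" * 4]
TM_CUTOFF = 4.0

# For 1536 sequences this enumeration gives 1536 * 1537 / 2 = 1,180,416 pairs,
# the interactome size quoted above (each peptide may also pair with itself).
interactome = {(a, b): predict_tm(a, b)
               for a, b in combinations_with_replacement(library, 2)}
candidate_pairs = {pair: tm for pair, tm in interactome.items() if tm >= TM_CUTOFF}
print(len(interactome), len(candidate_pairs))
```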

  15. Discovery of earth-abundant nitride semiconductors by computational screening and high-pressure synthesis

    PubMed Central

    Hinuma, Yoyo; Hatakeyama, Taisuke; Kumagai, Yu; Burton, Lee A.; Sato, Hikaru; Muraba, Yoshinori; Iimura, Soshi; Hiramatsu, Hidenori; Tanaka, Isao; Hosono, Hideo; Oba, Fumiyasu

    2016-01-01

    Nitride semiconductors are attractive because they can be environmentally benign, comprised of abundant elements and possess favourable electronic properties. However, those currently commercialized are mostly limited to gallium nitride and its alloys, despite the rich composition space of nitrides. Here we report the screening of ternary zinc nitride semiconductors using first-principles calculations of electronic structure, stability and dopability. This approach identifies as-yet-unreported CaZn2N2 that has earth-abundant components, smaller carrier effective masses than gallium nitride and a tunable direct bandgap suited for light emission and harvesting. High-pressure synthesis realizes this phase, verifying the predicted crystal structure and band-edge red photoluminescence. In total, we propose 21 promising systems, including Ca2ZnN2, Ba2ZnN2 and Zn2PN3, which have not been reported as semiconductors previously. Given the variety in bandgaps of the identified compounds, the present study expands the potential suitability of nitride semiconductors for a broader range of electronic, optoelectronic and photovoltaic applications. PMID:27325228

  16. Discovery of earth-abundant nitride semiconductors by computational screening and high-pressure synthesis

    NASA Astrophysics Data System (ADS)

    Hinuma, Yoyo; Hatakeyama, Taisuke; Kumagai, Yu; Burton, Lee A.; Sato, Hikaru; Muraba, Yoshinori; Iimura, Soshi; Hiramatsu, Hidenori; Tanaka, Isao; Hosono, Hideo; Oba, Fumiyasu

    2016-06-01

    Nitride semiconductors are attractive because they can be environmentally benign, comprised of abundant elements and possess favourable electronic properties. However, those currently commercialized are mostly limited to gallium nitride and its alloys, despite the rich composition space of nitrides. Here we report the screening of ternary zinc nitride semiconductors using first-principles calculations of electronic structure, stability and dopability. This approach identifies as-yet-unreported CaZn2N2 that has earth-abundant components, smaller carrier effective masses than gallium nitride and a tunable direct bandgap suited for light emission and harvesting. High-pressure synthesis realizes this phase, verifying the predicted crystal structure and band-edge red photoluminescence. In total, we propose 21 promising systems, including Ca2ZnN2, Ba2ZnN2 and Zn2PN3, which have not been reported as semiconductors previously. Given the variety in bandgaps of the identified compounds, the present study expands the potential suitability of nitride semiconductors for a broader range of electronic, optoelectronic and photovoltaic applications.

  17. Deriving Heterospecific Self-Assembling Protein-Protein Interactions Using a Computational Interactome Screen.

    PubMed

    Crooks, Richard O; Baxter, Daniel; Panek, Anna S; Lubben, Anneke T; Mason, Jody M

    2016-01-29

    Interactions between naturally occurring proteins are highly specific, with protein-network imbalances associated with numerous diseases. For designed protein-protein interactions (PPIs), required specificity can be notoriously difficult to engineer. To accelerate this process, we have derived peptides that form heterospecific PPIs when combined. This is achieved using software that generates large virtual libraries of peptide sequences and searches within the resulting interactome for preferentially interacting peptides. To demonstrate feasibility, we have (i) generated 1536 peptide sequences based on the parallel dimeric coiled-coil motif and varied residues known to be important for stability and specificity, (ii) screened the 1,180,416 member interactome for predicted Tm values and (iii) used predicted Tm cutoff points to isolate eight peptides that form four heterospecific PPIs when combined. This required that all 32 hypothetical off-target interactions within the eight-peptide interactome be disfavoured and that the four desired interactions pair correctly. Lastly, we have verified the approach by characterising all 36 pairs within the interactome. In analysing the output, we hypothesised that several sequences are capable of adopting antiparallel orientations. We subsequently improved the software by removing sequences where doing so led to fully complementary electrostatic pairings. Our approach can be used to derive increasingly large and therefore complex sets of heterospecific PPIs with a wide range of potential downstream applications from disease modulation to the design of biomaterials and peptides in synthetic biology. PMID:26655848

  18. Computational screening for active compounds targeting protein sequences: methodology and experimental validation.

    PubMed

    Wang, Fei; Liu, Dongxiang; Wang, Heyao; Luo, Cheng; Zheng, Mingyue; Liu, Hong; Zhu, Weiliang; Luo, Xiaomin; Zhang, Jian; Jiang, Hualiang

    2011-11-28

    Because the three-dimensional (3D) structures of most protein targets have not yet been determined, and many targets do not even have a known ligand, a truly general method to predict ligand-protein interactions in the absence of three-dimensional information would be of great potential value in drug discovery. Using the support vector machine (SVM) approach, we constructed a model for predicting ligand-protein interactions based only on the primary sequence of proteins and the structural features of small molecules. The model, trained using 15,000 ligand-protein interactions between 626 proteins and over 10,000 active compounds, was successfully used in discovering nine novel active compounds for four pharmacologically important targets (i.e., GPR40, SIRT1, p38, and GSK-3β). To our knowledge, this is the first example of a successful sequence-based virtual screening campaign, demonstrating that our approach has the potential to discover, with a single model, active ligands for any protein. PMID:21955088
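
    A minimal sketch of the idea described above, under the assumption (not stated in the abstract) of a generic feature encoding: each ligand-protein pair is represented by concatenating a protein-sequence feature vector with a small-molecule descriptor vector, and an SVM is trained to separate interacting pairs from decoys. The encodings and data below are random placeholders, not the published descriptors.

```python
# Minimal sketch with random placeholder features: an SVM trained on
# concatenated protein-sequence and small-molecule descriptors to classify
# ligand-protein pairs. The encodings are assumptions, not the published ones.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_pairs = 200
protein_feats = rng.random((n_pairs, 50))    # e.g. amino-acid composition (hypothetical)
ligand_feats = rng.random((n_pairs, 100))    # e.g. structural fingerprints (hypothetical)
X = np.hstack([protein_feats, ligand_feats]) # one row per ligand-protein pair
y = np.array([0, 1] * (n_pairs // 2))        # 1 = interacting, 0 = decoy (toy labels)

model = SVC(kernel="rbf", probability=True).fit(X, y)
screen_scores = model.predict_proba(X)[:, 1] # ranking scores for candidate pairs
```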

  19. Discovery of earth-abundant nitride semiconductors by computational screening and high-pressure synthesis.

    PubMed

    Hinuma, Yoyo; Hatakeyama, Taisuke; Kumagai, Yu; Burton, Lee A; Sato, Hikaru; Muraba, Yoshinori; Iimura, Soshi; Hiramatsu, Hidenori; Tanaka, Isao; Hosono, Hideo; Oba, Fumiyasu

    2016-01-01

    Nitride semiconductors are attractive because they can be environmentally benign, comprised of abundant elements and possess favourable electronic properties. However, those currently commercialized are mostly limited to gallium nitride and its alloys, despite the rich composition space of nitrides. Here we report the screening of ternary zinc nitride semiconductors using first-principles calculations of electronic structure, stability and dopability. This approach identifies as-yet-unreported CaZn2N2 that has earth-abundant components, smaller carrier effective masses than gallium nitride and a tunable direct bandgap suited for light emission and harvesting. High-pressure synthesis realizes this phase, verifying the predicted crystal structure and band-edge red photoluminescence. In total, we propose 21 promising systems, including Ca2ZnN2, Ba2ZnN2 and Zn2PN3, which have not been reported as semiconductors previously. Given the variety in bandgaps of the identified compounds, the present study expands the potential suitability of nitride semiconductors for a broader range of electronic, optoelectronic and photovoltaic applications. PMID:27325228

  20. Simulation of Yeast Cooperation in 2D.

    PubMed

    Wang, M; Huang, Y; Wu, Z

    2016-03-01

    The evolution of cooperation has been an active research area in evolutionary biology for decades. An important type of cooperation develops from group selection, when individuals form spatial groups to protect themselves from foreign invasion. In this paper, we study the evolution of cooperation in a mixed population of cooperating and cheating yeast strains in 2D, with the interactions among the yeast cells restricted to their small neighborhoods. We conduct a computer simulation based on a game theoretic model and show that cooperation is increased when the interactions are spatially restricted, whether the game is of a prisoner's dilemma, snowdrift, or mutual benefit type. We study the evolution of homogeneous groups of cooperators or cheaters and describe the conditions under which they can persist or expand in an opponent population. We show that under certain spatial restrictions, cooperator groups are able to persist and expand as group sizes become large, while cheater groups fail to expand and to keep themselves from collapse. PMID:26988702
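
    A minimal sketch of the kind of spatially restricted game dynamics described above (a generic illustration, not the authors' yeast model or parameters): players on a 2D lattice play a prisoner's dilemma only with their eight nearest neighbours and then imitate the most successful strategy in that neighbourhood.

```python
# Minimal sketch, not the authors' yeast model: a spatial prisoner's dilemma
# on a 2D lattice where each cell plays only its eight nearest neighbours
# (periodic boundaries) and then imitates the most successful strategy seen.
import numpy as np

R, S, T, P = 3.0, 0.0, 5.0, 1.0            # cooperate/defect payoffs (assumed values)
rng = np.random.default_rng(0)
grid = rng.integers(0, 2, size=(50, 50))   # 1 = cooperator, 0 = cheater

OFFSETS = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]

def step(grid):
    n = grid.shape[0]
    payoff = np.zeros(grid.shape, dtype=float)
    for di, dj in OFFSETS:                 # accumulate payoff against each neighbour
        nb = np.roll(np.roll(grid, di, axis=0), dj, axis=1)
        payoff += np.where(grid == 1, np.where(nb == 1, R, S),
                           np.where(nb == 1, T, P))
    new = grid.copy()
    for i in range(n):                     # imitate the best-scoring neighbour (or self)
        for j in range(n):
            best = (i, j)
            for di, dj in OFFSETS:
                cand = ((i + di) % n, (j + dj) % n)
                if payoff[cand] > payoff[best]:
                    best = cand
            new[i, j] = grid[best]
    return new

for _ in range(100):
    grid = step(grid)
print("final cooperator fraction:", grid.mean())
```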

  1. To Screen or not to Screen: Low Dose Computed Tomography in Comparison to Chest Radiography or Usual Care in Reducing Morbidity and Mortality from Lung Cancer

    PubMed Central

    Kamdar, Jay; Moats, Austin; Nguyen, Brenda

    2016-01-01

    Lung cancer has the highest mortality rate of all cancers. This paper seeks to address the question: Can the mortality of lung cancer be decreased by screening with low-dose computerized tomography (LDCT) in higher-risk patients compared to chest X-rays (CXR) or regular patient care? Currently, CXR screening is recommended for certain high-risk patients. Several recent trials have examined the effectiveness of LDCT versus chest radiography or usual care as a control. These trials include the National Lung Screening Trial (NLST), Detection And screening of early lung cancer with Novel imaging TEchnology (DANTE), Lung Screening Study (LSS), Depiscan, Italian Lung (ITALUNG), and the Dutch-Belgian Randomized Lung Cancer Screening Trial (Dutch acronym: NELSON study). NLST, the largest trial (n=53,454), demonstrated a decrease in mortality from lung cancer in the LDCT group (RRR=20%, P=0.004). LSS demonstrated a greater sensitivity in detecting both early-stage and any-stage lung cancer in comparison to traditional CXR. Although the DANTE trial yielded data consistent with the findings of LSS, it also showed that a greater proportion of patients underwent unnecessary surgical procedures as a result of LDCT screening. The Depiscan trial yielded a high nodule detection rate at the cost of a high false-positive rate compared to CXR screening. The ITALUNG and NELSON trials demonstrated the early detection capabilities of LDCT for lung cancers compared to usual care without surveillance imaging. False-positive findings with unnecessary workup, intervention, and radiation exposure remain significant concerns for routine LDCT screening. However, current data suggest that LDCT may provide a highly sensitive and specific means for detecting lung cancers and reducing mortality. PMID:27375974

  2. To Screen or not to Screen: Low Dose Computed Tomography in Comparison to Chest Radiography or Usual Care in Reducing Morbidity and Mortality from Lung Cancer.

    PubMed

    Dajac, Joshua; Kamdar, Jay; Moats, Austin; Nguyen, Brenda

    2016-01-01

    Lung cancer has the highest mortality rate of all cancers. This paper seeks to address the question: Can the mortality of lung cancer be decreased by screening with low-dose computerized tomography (LDCT) in higher-risk patients compared to chest X-rays (CXR) or regular patient care? Currently, CXR screening is recommended for certain high-risk patients. Several recent trials have examined the effectiveness of LDCT versus chest radiography or usual care as a control. These trials include the National Lung Screening Trial (NLST), Detection And screening of early lung cancer with Novel imaging TEchnology (DANTE), Lung Screening Study (LSS), Depiscan, Italian Lung (ITALUNG), and the Dutch-Belgian Randomized Lung Cancer Screening Trial (Dutch acronym: NELSON study). NLST, the largest trial (n=53,454), demonstrated a decrease in mortality from lung cancer in the LDCT group (RRR=20%, P=0.004). LSS demonstrated a greater sensitivity in detecting both early-stage and any-stage lung cancer in comparison to traditional CXR. Although the DANTE trial yielded data consistent with the findings of LSS, it also showed that a greater proportion of patients underwent unnecessary surgical procedures as a result of LDCT screening. The Depiscan trial yielded a high nodule detection rate at the cost of a high false-positive rate compared to CXR screening. The ITALUNG and NELSON trials demonstrated the early detection capabilities of LDCT for lung cancers compared to usual care without surveillance imaging. False-positive findings with unnecessary workup, intervention, and radiation exposure remain significant concerns for routine LDCT screening. However, current data suggest that LDCT may provide a highly sensitive and specific means for detecting lung cancers and reducing mortality. PMID:27375974

  3. 2D molybdenum disulphide (2D-MoS2) modified electrodes explored towards the oxygen reduction reaction.

    PubMed

    Rowley-Neale, Samuel J; Fearn, Jamie M; Brownson, Dale A C; Smith, Graham C; Ji, Xiaobo; Banks, Craig E

    2016-08-21

    Two-dimensional molybdenum disulphide nanosheets (2D-MoS2) have proven to be an effective electrocatalyst, with particular attention being focused on their use towards increasing the efficiency of the reactions associated with hydrogen fuel cells. Whilst the majority of research has focused on the Hydrogen Evolution Reaction (HER), herein we explore the use of 2D-MoS2 as a potential electrocatalyst for the much less researched Oxygen Reduction Reaction (ORR). We stray from literature conventions and perform experiments in 0.1 M H2SO4 acidic electrolyte for the first time, evaluating the electrochemical performance of the ORR with 2D-MoS2 electrically wired/immobilised upon several carbon based electrodes (namely; Boron Doped Diamond (BDD), Edge Plane Pyrolytic Graphite (EPPG), Glassy Carbon (GC) and Screen-Printed Electrodes (SPE)) whilst exploring a range of 2D-MoS2 coverages/masses. Consequently, the findings of this study are highly applicable to real world fuel cell applications. We show that significant improvements in ORR activity can be achieved through the careful selection of the underlying/supporting carbon materials that electrically wire the 2D-MoS2 and utilisation of an optimal mass of 2D-MoS2. The ORR onset is observed to be reduced to ca. +0.10 V for EPPG, GC and SPEs at 2D-MoS2 (1524 ng cm⁻² modification), which is far closer to Pt at +0.46 V compared to bare/unmodified EPPG, GC and SPE counterparts. This report is the first to demonstrate such beneficial electrochemical responses in acidic conditions using a 2D-MoS2 based electrocatalyst material on a carbon-based substrate (SPEs in this case). Investigation of the beneficial reaction mechanism reveals the ORR to occur via a 4 electron process in specific conditions; elsewhere a 2 electron process is observed. This work offers valuable insights for those wishing to design, fabricate and/or electrochemically test 2D-nanosheet materials towards the ORR. PMID:27448174

  4. Computer based screening of compound databases: 1. Preselection of benzamidine-based thrombin inhibitors.

    PubMed

    Fox, T; Haaksma, E E

    2000-07-01

    We present a computational protocol which uses the known three-dimensional structure of a target enzyme to identify possible ligands from databases of compounds with low molecular weight. This is accomplished by first mapping the essential interactions in the binding site with the program GRID. The resulting regions of favorable interaction between target and ligand are translated into a database query, and with UNITY a flexible 3D database search is performed. The feasibility of this approach is calibrated with thrombin as the target. Our results show that the resulting hit lists are enriched with thrombin inhibitors compared to the total database. PMID:10896314

  5. 3D computational and experimental radiation transport assessments of Pu-Be sources and graded moderators for parcel screening

    NASA Astrophysics Data System (ADS)

    Ghita, Gabriel; Sjoden, Glenn; Baciak, James; Huang, Nancy

    2006-05-01

    The Florida Institute for Nuclear Detection and Security (FINDS) is currently working on the design and evaluation of a prototype neutron detector array that may be used for parcel screening systems and homeland security applications. In order to maximize neutron detector response over a wide spectrum of energies, moderator materials of different compositions and amounts are required, and can be optimized through 3-D discrete ordinates and Monte Carlo model simulations verified through measurement. Pu-Be sources can be used as didactic source materials to augment the design, optimization, and construction of detector arrays with proper characterization via transport analysis. To perform the assessments of the Pu-Be Source Capsule, 3-D radiation transport computations are used, including Monte Carlo (MCNP5) and deterministic (PENTRAN) methodologies. In establishing source geometry, we based our model on available source schematic data. Because both the MCNP5 and PENTRAN codes begin with source neutrons, exothermic (α,n) reactions are modeled using the SCALE5 code from ORNL to define the energy spectrum and the decay of the source. We combined our computational results with experimental data to fully validate our computational schemes, tools and models. Results from our computational models will then be used with experiment to generate a mosaic of the radiation spectrum. Finally, we discuss follow-up studies that highlight response optimization efforts in designing, building, and testing an array of detectors with varying moderators/thicknesses tagged to specific responses predicted using 3-D radiation transport models to augment special nuclear materials detection.

  6. 2D bifurcations and Newtonian properties of memristive Chua's circuits

    NASA Astrophysics Data System (ADS)

    Marszalek, W.; Podhaisky, H.

    2016-01-01

    Two interesting properties of Chua's circuits are presented. First, two-parameter bifurcation diagrams of Chua's oscillatory circuits with memristors are presented. Obtaining the various 2D bifurcation images requires a substantial numerical effort, possibly with parallel computations. The numerical algorithm is described first, and its numerical code for creating 2D bifurcation images is available for free download. Several color 2D images and the corresponding 1D greyscale bifurcation diagrams are included. Secondly, Chua's circuits are linked to Newton's law φ'' = F(t, φ, φ')/m, with φ the flux, constant m > 0, and a force term F(t, φ, φ') containing memory terms. Finally, the jounce scalar equations for Chua's circuits are also discussed.
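
    A minimal sketch of the kind of two-parameter sweep such diagrams require (a generic illustration of the workflow and its parallelisation, not the authors' code or circuit equations): every grid point in parameter space is reduced independently to one scalar diagnostic, so the work distributes naturally over a pool of processes. The count_attractor_periods function is a hypothetical placeholder.

```python
# Minimal sketch of the sweep structure only (an assumption about the general
# workflow, not the authors' code or circuit equations): each (alpha, beta)
# grid point is reduced independently to one scalar diagnostic, so the sweep
# parallelises trivially. count_attractor_periods is a hypothetical placeholder.
import numpy as np
from multiprocessing import Pool

def count_attractor_periods(params):
    """Placeholder: integrate the circuit ODEs at (alpha, beta) and count
    distinct local maxima of the steady-state waveform."""
    alpha, beta = params
    return (alpha * beta) % 7.0            # dummy value standing in for the real diagnostic

if __name__ == "__main__":
    alphas = np.linspace(8.0, 12.0, 200)
    betas = np.linspace(10.0, 20.0, 200)
    grid = [(a, b) for a in alphas for b in betas]
    with Pool() as pool:                   # one independent computation per grid point
        values = pool.map(count_attractor_periods, grid)
    image = np.array(values).reshape(len(alphas), len(betas))  # the 2D bifurcation image
```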

  7. Formulation pre-screening of inhalation powders using computational atom-atom systematic search method.

    PubMed

    Ramachandran, Vasuki; Murnane, Darragh; Hammond, Robert B; Pickering, Jonathan; Roberts, Kevin J; Soufian, Majeed; Forbes, Ben; Jaffari, Sara; Martin, Gary P; Collins, Elizabeth; Pencheva, Klimentina

    2015-01-01

    The synthonic modeling approach provides a molecule-centered understanding of the surface properties of crystals. It has been applied extensively to understand crystallization processes. This study aimed to investigate the functional relevance of synthonic modeling to the formulation of inhalation powders by assessing cohesivity of three active pharmaceutical ingredients (APIs, fluticasone propionate (FP), budesonide (Bud), and salbutamol base (SB)) and the commonly used excipient, α-lactose monohydrate (LMH). It is found that FP (-11.5 kcal/mol) has a higher cohesive strength than Bud (-9.9 kcal/mol) or SB (-7.8 kcal/mol). The prediction correlated directly to cohesive strength measurements using laser diffraction, where the airflow pressure required for complete dispersion (CPP) was 3.5, 2.0, and 1.0 bar for FP, Bud, and SB, respectively. The highest cohesive strength was predicted for LMH (-15.9 kcal/mol), which did not correlate with the CPP value of 2.0 bar (i.e., ranking lower than FP). High FP-LMH adhesive forces (-11.7 kcal/mol) were predicted. However, aerosolization studies revealed that the FP-LMH blends consisted of agglomerated FP particles with a large median diameter (∼4-5 μm) that were not disrupted by LMH. Modeling of the crystal and surface chemistry of LMH identified high electrostatic and H-bond components of its cohesive energy due to the presence of water and hydroxyl groups in lactose, unlike the APIs. A direct comparison of the predicted and measured cohesive balance of LMH with APIs will require a more in-depth understanding of highly hydrogen-bonded systems with respect to the synthonic engineering modeling tool, as well as the influence of agglomerate structure on surface-surface contact geometry. Overall, this research has demonstrated the possible application and relevance of synthonic engineering tools for rapid pre-screening in drug formulation and design. PMID:25380027

  8. Organ Dose and Attributable Cancer Risk in Lung Cancer Screening with Low-Dose Computed Tomography

    PubMed Central

    Saltybaeva, Natalia; Martini, Katharina; Frauenfelder, Thomas; Alkadhi, Hatem

    2016-01-01

    Purpose Lung cancer screening with CT has been recently recommended for decreasing lung cancer mortality. The radiation dose of CT, however, must be kept as low as reasonably achievable for reducing potential stochastic risks from ionizing radiation. The purpose of this study was to calculate individual patients’ lung doses and to estimate cancer risks in low-dose CT (LDCT) in comparison with a standard dose CT (SDCT) protocol. Materials and Methods This study included 47 adult patients (mean age 63.0 ± 5.7 years) undergoing chest CT on a third-generation dual-source scanner. 23/47 patients (49%) had a non-enhanced chest SDCT, 24 patients (51%) underwent LDCT at 100 kVp with spectral shaping at a dose equivalent to a chest x-ray. 3D-dose distributions were obtained from Monte Carlo simulations for each patient, taking into account their body size and individual CT protocol. Based on the dose distributions, patient-specific lung doses were calculated and relative cancer risk was estimated according to BEIR VII recommendations. Results As compared to SDCT, the LDCT protocol allowed for significant organ dose and cancer risk reductions (p<0.001). On average, lung dose was reduced from 7.7 mGy to 0.3 mGy when using LDCT, which was associated with lowering of the cancer risk from 8.6 to 0.35 per 100’000 cases. A strong linear correlation between lung dose and patient effective diameter was found for both protocols (R2 = 0.72 and R2 = 0.75 for SDCT and LDCT, respectively). Conclusion Use of a LDCT protocol for chest CT with a dose equivalent to a chest x-ray allows for significant lung dose and cancer risk reduction from ionizing radiation. PMID:27203720

  9. Annotated Bibliography of EDGE2D Use

    SciTech Connect

    J.D. Strachan and G. Corrigan

    2005-06-24

    This annotated bibliography is intended to help EDGE2D users, and particularly new users, find existing published literature that has used EDGE2D. Our idea is that a person can find existing studies which may relate to his intended use, as well as gain ideas about other possible applications by scanning the attached tables.

  10. Mean flow and anisotropic cascades in decaying 2D turbulence

    NASA Astrophysics Data System (ADS)

    Liu, Chien-Chia; Cerbus, Rory; Gioia, Gustavo; Chakraborty, Pinaki

    2015-11-01

    Many large-scale atmospheric and oceanic flows are decaying 2D turbulent flows embedded in a non-uniform mean flow. Despite its importance for large-scale weather systems, the effect of a non-uniform mean flow on decaying 2D turbulence remains unknown. In the absence of a mean flow it is well known that decaying 2D turbulent flows exhibit the enstrophy cascade. More generally, for any 2D turbulent flow, all computational, experimental and field data amassed to date indicate that the spectra of longitudinal and transverse velocity fluctuations correspond to the same cascade, signifying isotropy of the cascades. Here we report experiments on decaying 2D turbulence in soap films with a non-uniform mean flow. We find that the flow transitions from the usual isotropic enstrophy cascade to a series of unusual and, to our knowledge, never before observed or predicted, anisotropic cascades in which the longitudinal and transverse spectra are mutually independent. We discuss implications of our results for decaying geophysical turbulence.

  11. Computational screen and experimental validation of anti-influenza effects of quercetin and chlorogenic acid from traditional Chinese medicine

    PubMed Central

    Liu, Zekun; Zhao, Junpeng; Li, Weichen; Shen, Li; Huang, Shengbo; Tang, Jingjing; Duan, Jie; Fang, Fang; Huang, Yuelong; Chang, Haiyan; Chen, Ze; Zhang, Ran

    2016-01-01

    The influenza A virus is a great threat to human health, and its various subtypes make it difficult to develop drugs. With the development of state-of-the-art computational chemistry, computational molecular docking can serve as a virtual screen for potential lead compounds. In this study, we performed molecular docking of influenza A H1N1 (A/PR/8/34) with small molecules such as quercetin and chlorogenic acid, which were derived from traditional Chinese medicine. The results showed that these small molecules have strong binding affinities for neuraminidase from H1N1 (A/PR/8/34). Further analysis showed that the structural features of the molecules might be helpful for further drug design and development. Experiments in vitro and in vivo validated the anti-influenza effects of quercetin and chlorogenic acid, which showed protection comparable to zanamivir. Taken together, we propose that chlorogenic acid and quercetin could be employed as effective lead compounds against influenza A H1N1. PMID:26754609

  12. Computational screen and experimental validation of anti-influenza effects of quercetin and chlorogenic acid from traditional Chinese medicine.

    PubMed

    Liu, Zekun; Zhao, Junpeng; Li, Weichen; Shen, Li; Huang, Shengbo; Tang, Jingjing; Duan, Jie; Fang, Fang; Huang, Yuelong; Chang, Haiyan; Chen, Ze; Zhang, Ran

    2016-01-01

    The influenza A virus is a great threat to human health, and its various subtypes make it difficult to develop drugs. With the development of state-of-the-art computational chemistry, computational molecular docking can serve as a virtual screen for potential lead compounds. In this study, we performed molecular docking of influenza A H1N1 (A/PR/8/34) with small molecules such as quercetin and chlorogenic acid, which were derived from traditional Chinese medicine. The results showed that these small molecules have strong binding affinities for neuraminidase from H1N1 (A/PR/8/34). Further analysis showed that the structural features of the molecules might be helpful for further drug design and development. Experiments in vitro and in vivo validated the anti-influenza effects of quercetin and chlorogenic acid, which showed protection comparable to zanamivir. Taken together, we propose that chlorogenic acid and quercetin could be employed as effective lead compounds against influenza A H1N1. PMID:26754609

  13. AnisWave2D: User's Guide to the 2D Anisotropic Finite-Difference Code

    SciTech Connect

    Toomey, Aoife

    2005-01-06

    This document describes a parallel finite-difference code for modeling wave propagation in 2D, fully anisotropic materials. The code utilizes a mesh refinement scheme to improve computational efficiency. Mesh refinement allows the grid spacing to be tailored to the velocity model, so that fine grid spacing can be used in low velocity zones where the seismic wavelength is short, and coarse grid spacing can be used in zones with higher material velocities. Over-sampling of the seismic wavefield in high velocity zones is therefore avoided. The code has been implemented to run in parallel over multiple processors and allows large-scale models and models with large velocity contrasts to be simulated with ease.

  14. Theory for spiralling ions for 2D FT-ICR and comparison with precessing magnetization vectors in 2D NMR.

    PubMed

    Sehgal, Akansha Ashvani; Pelupessy, Philippe; Rolando, Christian; Bodenhausen, Geoffrey

    2016-04-01

    Two-dimensional (2D) Fourier transform ion cyclotron resonance (FT-ICR) offers an approach to mass spectrometry (MS) that pursues objectives similar to those of MS/MS experiments. While the latter must focus on one ion species at a time, 2D FT-ICR can examine all possible correlations due to ion fragmentation in a single experiment: correlations between precursors, charged fragments and neutral fragments. We revisited the original 2D FT-ICR experiment, which has hitherto fallen short of stimulating significant analytical applications, probably because it is technically demanding. These shortcomings can now be overcome by improved FT-ICR instrumentation and computer hardware and software. We seek to achieve a better understanding of the intricacies of the behavior of ions during a basic two-dimensional ICR sequence comprising three simple monochromatic pulses. Through simulations based on Lorentzian equations, we have mapped the ion trajectories for different pulse durations and phases. PMID:26974979

  15. A novel sliding window algorithm for 2D discrete Fourier transform

    NASA Astrophysics Data System (ADS)

    Dong, Zhifang; Wu, Jiasong; Gui, Jiyong

    2015-12-01

    The discrete Fourier transform (DFT) is one of the most widely used tools for signal processing. In this paper, a novel sliding window algorithm is presented for fast computation of the 2D DFT when the sliding window shifts by more than one point. The proposed algorithm computes the DFT of the current window from that of the previous window. For fast computation, we take advantage of the recursive process of the 2D sliding DFT (SDFT) and a butterfly-based algorithm, so it can be applied directly to 2D signal processing. Theoretical analysis shows that the computational complexity is equal to that of the 2D SDFT when one sample enters the current window. Moreover, the numbers of additions and multiplications of the proposed algorithm are lower than those of the 2D vector-radix FFT when the sliding window shifts by multiple points.
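
    For reference, the sketch below shows the classical one-dimensional sliding DFT recurrence on which such algorithms build (a textbook illustration, not the paper's 2D multi-point algorithm): when the window advances by one sample, each spectral bin is updated in O(1) from the previous window's bin.

```python
# Minimal sketch (not the paper's algorithm): the classical 1D sliding DFT,
# which updates the spectrum of the current window from that of the previous
# window when the window shifts by one sample. The paper extends this idea
# to 2D windows and to shifts of more than one point.
import numpy as np

def sliding_dft(x, N):
    """Yield the length-N DFT of every window of x, updated recursively."""
    X = np.fft.fft(x[:N])                    # initial window computed directly
    yield X.copy()
    twiddle = np.exp(2j * np.pi * np.arange(N) / N)
    for n in range(len(x) - N):
        X = (X - x[n] + x[n + N]) * twiddle  # O(N) update per one-sample shift
        yield X.copy()

# Sanity check against a direct FFT of the same window.
x = np.random.default_rng(0).random(64)
spectra = list(sliding_dft(x, 16))
assert np.allclose(spectra[5], np.fft.fft(x[5:21]))
```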

  16. 2D materials for nanophotonic devices

    NASA Astrophysics Data System (ADS)

    Xu, Renjing; Yang, Jiong; Zhang, Shuang; Pei, Jiajie; Lu, Yuerui

    2015-12-01

    Two-dimensional (2D) materials have become very important building blocks for electronic, photonic, and phononic devices. The 2D material family has four key members, including the metallic graphene, transition metal dichalcogenide (TMD) layered semiconductors, semiconducting black phosphorus, and the insulating h-BN. Owing to the strong quantum confinement and defect-free surfaces, these atomically thin layers have offered us perfect platforms to investigate the interactions among photons, electrons and phonons. The unique interactions in these 2D materials are very important for both scientific research and application engineering. In this talk, I would like to briefly summarize and highlight the key findings, opportunities and challenges in this field. Next, I will introduce and highlight our recent achievements. We demonstrated atomically thin micro-lenses and gratings using 2D MoS2, which are the thinnest optical components demonstrated to date. These devices are based on our discovery that the elastic light-matter interactions in high-index 2D materials are very strong. I will also introduce a new two-dimensional material, phosphorene. Phosphorene has a strongly anisotropic optical response, which creates 1D excitons in a 2D system. The strong confinement in phosphorene also enables ultra-high trion (charged exciton) binding energies, which have been successfully measured in our experiments. Finally, I will briefly talk about the potential applications of 2D materials in energy harvesting.

  17. Inertial solvation in femtosecond 2D spectra

    NASA Astrophysics Data System (ADS)

    Hybl, John; Albrecht Ferro, Allison; Farrow, Darcie; Jonas, David

    2001-03-01

    We have used 2D Fourier transform spectroscopy to investigate polar solvation. 2D spectroscopy can reveal molecular lineshapes beneath ensemble averaged spectra and freeze molecular motions to give an undistorted picture of the microscopic dynamics of polar solvation. The transition from "inhomogeneous" to "homogeneous" 2D spectra is governed by both vibrational relaxation and solvent motion. Therefore, the time dependence of the 2D spectrum directly reflects the total response of the solvent-solute system. IR144, a cyanine dye with a dipole moment change upon electronic excitation, was used to probe inertial solvation in methanol and propylene carbonate. Since the static Stokes' shift of IR144 in each of these solvents is similar, differences in the 2D spectra result from solvation dynamics. Initial results indicate that the larger propylene carbonate responds more slowly than methanol, but appear to be inconsistent with rotational estimates of the inertial response. To disentangle intra-molecular vibrations from solvent motion, the 2D spectra of IR144 will be compared to the time-dependent 2D spectra of the structurally related nonpolar cyanine dye HDITCP.

  18. Internal Photoemission Spectroscopy of 2-D Materials

    NASA Astrophysics Data System (ADS)

    Nguyen, Nhan; Li, Mingda; Vishwanath, Suresh; Yan, Rusen; Xiao, Shudong; Xing, Huili; Cheng, Guangjun; Hight Walker, Angela; Zhang, Qin

    Recent research has shown the great benefits of using 2-D materials in the tunnel field-effect transistor (TFET), which is considered a promising candidate for the beyond-CMOS technology. The on-state current of TFET can be enhanced by engineering the band alignment of different 2D-2D or 2D-3D heterostructures. Here we present the internal photoemission spectroscopy (IPE) approach to determine the band alignments of various 2-D materials, in particular SnSe2 and WSe2, which have been proposed for new TFET designs. The metal-oxide-2-D semiconductor test structures are fabricated and characterized by IPE, where the band offsets from the 2-D semiconductor to the oxide conduction band minimum are determined by the threshold of the cube root of IPE yields as a function of photon energy. In particular, we find that SnSe2 has a larger electron affinity than most semiconductors and can be combined with other semiconductors to form near broken-gap heterojunctions with low barrier heights which can produce a higher on-state current. The details of data analysis of IPE and the results from Raman spectroscopy and spectroscopic ellipsometry measurements will also be presented and discussed.
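
    The threshold extraction mentioned above is, in the standard IPE analysis (an assumption about common practice, not a quote from this abstract), a linear extrapolation: near threshold the yield follows Y ∝ (hν − Φ)³, so the cube root of the yield plotted against photon energy is linear and its zero intercept gives the barrier height Φ. A minimal sketch with synthetic data:

```python
# Minimal sketch with synthetic data (an assumption about the standard IPE
# analysis, not the authors' code): fit Y^(1/3) vs photon energy over the
# linear near-threshold region and read off the x-intercept as the barrier.
import numpy as np

def ipe_threshold(hv, yield_counts, fit_window):
    """Linear fit of Y^(1/3) vs hv over fit_window=(lo, hi); returns the barrier in eV."""
    lo, hi = fit_window
    mask = (hv >= lo) & (hv <= hi)
    slope, intercept = np.polyfit(hv[mask], np.cbrt(yield_counts[mask]), 1)
    return -intercept / slope              # x-intercept of the linear region

# Synthetic example with a 3.1 eV barrier.
hv = np.linspace(2.5, 4.5, 60)             # photon energies in eV
y = np.clip(hv - 3.1, 0, None) ** 3        # idealised near-threshold yield
print(round(ipe_threshold(hv, y, (3.3, 4.5)), 2))   # ~ 3.1
```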

  19. Computational screening of iodine uptake in zeolitic imidazolate frameworks in a water-containing system.

    PubMed

    Yuan, Yue; Dong, Xiuqin; Chen, Yifei; Zhang, Minhua

    2016-08-17

    Iodine capture is of great environmental significance due to the high toxicity and volatility of I2. Here we conduct a systematic computational investigation of iodine adsorption in zeolitic imidazolate frameworks (ZIFs) by adopting grand canonical Monte Carlo (GCMC) simulation and the density functional theory (DFT) method. The results confirm the structural factors vital for iodine adsorption at 298 K and moderate pressures, including metal sites, organic linkers, symmetry, and topology types. Uptake is enhanced by active metal sites, the simple imidazolate linker, and single asymmetric linkers with polar functional groups. The symmetry effect is stronger than the surface properties, and low steric hindrance is more beneficial to iodine adsorption than polar functional groups. Specific topology types such as mer, which bring large surface areas and large-diameter cages, result in high iodine capacities. Iodine molecules tend to locate in cages with large diameters and aggregate along the sides of the cages. In contrast, water prefers small-diameter cages. In hydrophilic materials, water has a negative impact on iodine uptake because its adsorption sites are similar to those of iodine. The selectivity of iodine over water increases with increasing water content owing to the large-diameter cages of ZIFs. This work shows that ZIFs can serve as efficient and economical adsorbents with high diversity for iodine in a water-containing system. Furthermore, it provides comprehensive insights into the key structural factors for iodine uptake and separation in silver-free porous solids. PMID:27499079

  20. Gold silver alloy nanoparticles (GSAN): an imaging probe for breast cancer screening with dual-energy mammography or computed tomography

    NASA Astrophysics Data System (ADS)

    Naha, Pratap C.; Lau, Kristen C.; Hsu, Jessica C.; Hajfathalian, Maryam; Mian, Shaameen; Chhour, Peter; Uppuluri, Lahari; McDonald, Elizabeth S.; Maidment, Andrew D. A.; Cormode, David P.

    2016-07-01

    Earlier detection of breast cancer reduces mortality from this disease. As a result, the development of better screening techniques is a topic of intense interest. Contrast-enhanced dual-energy mammography (DEM) is a novel technique with improved sensitivity for cancer detection, but the development of contrast agents for this technique is in its infancy. We herein report gold-silver alloy nanoparticles (GSAN) that have potent DEM contrast properties and improved biocompatibility. GSAN formulations containing a range of gold:silver ratios and capped with m-PEG were synthesized and characterized using various analytical methods. DEM and computed tomography (CT) phantom imaging showed that GSAN produced robust contrast comparable to that of silver alone. Cell viability, reactive oxygen species generation and DNA damage results revealed that formulations with 30% or higher gold content are cytocompatible with Hep G2 and J774A.1 cells. In vivo imaging was performed in mice with and without breast tumors; GSAN produced strong DEM and CT contrast and accumulated in tumors. Furthermore, both in vivo imaging and ex vivo analysis indicated excretion of GSAN via both urine and feces. In summary, GSAN produce strong DEM and CT contrast and have potential both for blood pool imaging and for breast cancer screening.
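
    For context, contrast-enhanced DEM typically forms its image by a weighted log-subtraction of the low- and high-energy exposures so that soft-tissue contrast cancels and the high-Z agent (here silver/gold) remains. The sketch below shows that generic operation; the weighting factor and pixel values are illustrative assumptions, not parameters from this study.

```python
import numpy as np

def dual_energy_subtract(low_kv_img, high_kv_img, w=0.8, eps=1e-6):
    """Weighted log-subtraction: suppresses tissue contrast so that a high-Z
    contrast agent stands out. The weight w is normally set by calibration
    (the value here is an assumption)."""
    return np.log(high_kv_img + eps) - w * np.log(low_kv_img + eps)

# Illustrative 2x2 "images" (detector counts); values are made up.
low  = np.array([[1000., 900.], [950., 400.]])
high = np.array([[1200., 1150.], [1180., 700.]])
print(dual_energy_subtract(low, high))
```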

  1. The physics of 2D microfluidic droplet ensembles

    NASA Astrophysics Data System (ADS)

    Beatus, Tsevi; Bar-Ziv, Roy H.; Tlusty, Tsvi

    2012-07-01

    We review non-equilibrium many-body phenomena in ensembles of 2D microfluidic droplets. The system consists of continuous two-phase flow with disc-shaped droplets driven in a channel at low Reynolds numbers of 10^-4 to 10^-3. The basic physics is that of an effective potential flow, governed by the 2D Laplace equation, with multiple static and dynamic boundaries set by the droplets and the walls. The motion of the droplets induces dipolar flow fields, which mediate 1/r^2 hydrodynamic interactions between the droplets. Summation of these long-range 2D forces over droplet ensembles converges, in contrast to the divergence of the hydrodynamic forces in 3D. In analogy to electrostatics, the strong effect of boundaries on the equations of motion is calculated by means of image dipoles. We first consider the dynamics of droplets flowing in a 1D crystal, which exhibits unique phonon-like excitations and a variety of nonlinear instabilities, all stemming from the hydrodynamic interactions. Narrowing the channel results in hydrodynamic screening of the dipolar interactions, which changes salient features of the phonon spectra. Shifting from a 1D ordered crystal to a 2D disordered ensemble, the hydrodynamic interactions induce collective density waves and shocks, which are superposed on single-droplet randomized motion and dynamic clustering. These collective modes originate from density-velocity coupling, whose outcome is a 1D Burgers equation. The rich observational phenomenology and the tractable theory render 2D droplet ensembles a suitable table-top system for studying non-equilibrium many-body physics with long-range interactions.
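
    As a minimal numerical illustration of the far-field picture above, the sketch below sums the 2D dipolar (1/r^2) velocity perturbations that the other droplets induce at a given droplet; the dipole strengths, positions, and overall prefactor are illustrative, and the image dipoles and wall screening discussed in the review are omitted.

```python
import numpy as np

def dipole_velocity(r_vec, p_vec):
    """2D dipolar velocity perturbation at displacement r_vec from a droplet
    with dipole strength p_vec; it decays as 1/r^2 (sign and prefactor depend
    on the chosen convention and are set to a simple form here)."""
    r2 = np.dot(r_vec, r_vec)
    r_hat = r_vec / np.sqrt(r2)
    return (2.0 * np.dot(p_vec, r_hat) * r_hat - p_vec) / (2.0 * np.pi * r2)

# Illustrative ensemble: droplet positions and identical dipoles along x.
positions = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 4.0], [5.0, 5.0]])
p = np.array([1.0, 0.0])

# Velocity perturbation felt by droplet 0 from all the others.
u0 = sum(dipole_velocity(positions[0] - positions[j], p)
         for j in range(1, len(positions)))
print(u0)
```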

  2. Features of Undiagnosed Breast Cancers at Screening Breast MR Imaging and Potential Utility of Computer-Aided Evaluation

    PubMed Central

    Seo, Mirinae; Bae, Min Sun; Koo, Hye Ryoung; Kim, Won Hwa; Lee, Su Hyun; Chu, Ajung

    2016-01-01

    Objective To retrospectively evaluate the features of undiagnosed breast cancers on prior screening breast magnetic resonance (MR) images in patients who were subsequently diagnosed with breast cancer, as well as the potential utility of MR computer-aided evaluation (CAE). Materials and Methods Between March 2004 and May 2013, of the 72 consecutive pairs of prior negative MR images and subsequent MR images with diagnosed cancers (median interval, 32.8 months; range, 5.4-104.6 months), 36 (50%) had visible findings (mean size, 1.0 cm; range, 0.3-5.2 cm). The visible findings were divided into actionable and underthreshold groups by a blinded review by 5 radiologists. MR imaging features, reasons for missed cancer, and MR-CAE features according to actionability were evaluated. Results Of the 36 visible findings on prior MR images, 33.3% (12 of 36) of the lesions were determined to be actionable and 66.7% (24 of 36) were underthreshold; 85.7% (6 of 7) of masses and 31.6% (6 of 19) of non-mass enhancements were classified as actionable lesions. Mimicking physiologic enhancement (27.8%, 10 of 36) and small lesion size (27.8%, 10 of 36) were the most common reasons for missed cancer. Actionable findings tended to show more washout or plateau kinetic patterns on MR-CAE than underthreshold findings: 100% of actionable findings and 46.7% of underthreshold findings showed washout or plateau (p = 0.008). Conclusion MR-CAE has the potential to reduce the number of undiagnosed breast cancers on screening breast MR images, the majority of which are missed because of mimicking physiologic enhancement or small lesion size. PMID:26798217
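
    The washout/plateau/persistent labels produced by CAE software are typically assigned from the percentage change in lesion signal between the early post-contrast peak and the delayed phase. The sketch below uses a common ±10% cut-off as an assumption; the thresholds used by the CAE system in this study may differ.

```python
def kinetic_curve_type(early_signal, delayed_signal, tol=0.10):
    """Classify the delayed-phase kinetic pattern relative to the early
    post-contrast peak. tol = 10% is a commonly used cut-off (assumption)."""
    change = (delayed_signal - early_signal) / early_signal
    if change > tol:
        return "persistent"
    if change < -tol:
        return "washout"
    return "plateau"

print(kinetic_curve_type(320, 250))  # -> 'washout'
print(kinetic_curve_type(320, 330))  # -> 'plateau'
```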

  3. Relevant incidental findings at abdominal multi-detector contrast-enhanced computed tomography: A collateral screening?

    PubMed Central

    Sconfienza, Luca Maria; Mauri, Giovanni; Muzzupappa, Claudia; Poloni, Alessandro; Bandirali, Michele; Esseridou, Anastassia; Tritella, Stefania; Secchi, Francesco; Di Leo, Giovanni; Sardanelli, Francesco

    2015-01-01

    AIM: To investigate the prevalence of relevant incidental findings (RIFs) detected during routine abdominal contrast-enhanced computed tomography (CeCT). METHODS: We retrospectively evaluated the reports of a consecutive series of abdominal CeCT studies performed between January and May 2013. For each report, the patient's age and sex, admission as inpatient or outpatient, clinical suspicion as indicated by the requesting physician, availability of a previous abdominal examination, and name of the reporting radiologist were recorded. Based on the clinical suspicion, the presence and features of any RIF (defined as a finding needing additional workup) were noted. RESULTS: One thousand forty abdominal CeCT examinations were performed in 949 patients (528 males, mean age 66 ± 14 years). No significant difference was found between inpatients and outpatients in age and sex distribution (P > 0.472). RIFs were found in 195/1040 (18.8%) CeCT examinations [inpatients = 108/470 (23.0%); outpatients = 87/570 (15.2%); P = 0.002]. RIFs were found in 30/440 (6.8%) CeCT examinations with a previous exam and in 165/600 (27.5%) without a previous exam (P < 0.001). The distribution of radiologists between inpatients and outpatients was significantly different (P < 0.001). RIF prevalence increased with age, except for a peak in the 40-49 year group. The most frequently involved organs were the kidneys, gallbladder, and lungs. CONCLUSION: A RIF is detected in 1 of 5 patients undergoing abdominal CeCT. The risk of overdiagnosis should be taken into account. PMID:26516432

  4. Interfacing electrochromic spectacles to computer IO ports.

    PubMed

    White, D N; Bissland, L

    1995-04-01

    Many important properties of molecules depend on their precise three-dimensional (3D) structure. It is therefore useful to be able to view a molecule in 3D on a 2D computer screen when manipulating it. An inexpensive method for viewing in 3D using liquid crystal glasses and a PC is presented. The methodology used is easily extended to other computers and workstations. PMID:7619789

  5. 2D molybdenum disulphide (2D-MoS2) modified electrodes explored towards the oxygen reduction reaction

    NASA Astrophysics Data System (ADS)

    Rowley-Neale, Samuel J.; Fearn, Jamie M.; Brownson, Dale A. C.; Smith, Graham C.; Ji, Xiaobo; Banks, Craig E.

    2016-08-01

    Two-dimensional molybdenum disulphide nanosheets (2D-MoS2) have proven to be an effective electrocatalyst, with particular attention being focused on their use towards increasing the efficiency of the reactions associated with hydrogen fuel cells. Whilst the majority of research has focused on the Hydrogen Evolution Reaction (HER), herein we explore the use of 2D-MoS2 as a potential electrocatalyst for the much less researched Oxygen Reduction Reaction (ORR). We stray from literature conventions and perform experiments in 0.1 M H2SO4 acidic electrolyte for the first time, evaluating the electrochemical performance of the ORR with 2D-MoS2 electrically wired/immobilised upon several carbon based electrodes (namely Boron Doped Diamond (BDD), Edge Plane Pyrolytic Graphite (EPPG), Glassy Carbon (GC) and Screen-Printed Electrodes (SPE)) whilst exploring a range of 2D-MoS2 coverages/masses. Consequently, the findings of this study are highly applicable to real world fuel cell applications. We show that significant improvements in ORR activity can be achieved through careful selection of the underlying/supporting carbon materials that electrically wire the 2D-MoS2 and utilisation of an optimal mass of 2D-MoS2. The ORR onset is observed to be reduced to ca. +0.10 V for EPPG, GC and SPEs modified with 1524 ng cm^-2 of 2D-MoS2, which is far closer to Pt at +0.46 V than the bare/unmodified EPPG, GC and SPE counterparts. This report is the first to demonstrate such beneficial electrochemical responses in acidic conditions using a 2D-MoS2 based electrocatalyst material on a carbon-based substrate (SPEs in this case). Investigation of the reaction mechanism reveals the ORR to occur via a 4-electron process under specific conditions; elsewhere a 2-electron process is observed. This work offers valuable insights for those wishing to design, fabricate and/or electrochemically test 2D-nanosheet materials towards the ORR.

  6. Effect of screen-based computer simulation on knowledge and skill in nursing students' learning of preoperative and postoperative care management: a randomized controlled study.

    PubMed

    Durmaz, Aylin; Dicle, Aklime; Cakan, Emre; Cakir, Şen

    2012-04-01

    Screen-based computer simulations are considered a method of skill teaching in health education. This study examined the effect of screen-based computer simulation on knowledge, skill, and the clinical decision-making process in teaching preoperative and postoperative care management to second-year students in an undergraduate school of nursing. It is a randomized controlled study. The study sample was composed of 82 students, who received education in screen-based computer simulation (n = 41) or skill laboratories (n = 41). Three instruments were used: a preoperative and postoperative care management cognitive level assessment test, skill control lists for preoperative and postoperative care management, and the Clinical Decision Making in Nursing Scale. There was no significant difference between the students' posteducation knowledge levels (P = .421), practical deep breathing and coughing exercise education skills (P = .867), or clinical decision-making scale total and subscale scores (P = .065). However, a significant difference was found in the students' skill scores for admitting the patient to the surgical clinic after surgery (P = .04). Education provided in the screen-based computer simulation laboratory was equivalent to that provided in the skill laboratory. PMID:22228217

  7. Brittle damage models in DYNA2D

    SciTech Connect

    Faux, D.R.

    1997-09-01

    DYNA2D is an explicit Lagrangian finite element code used to model dynamic events where stress wave interactions influence the overall response of the system. DYNA2D is often used to model penetration problems involving ductile-to-ductile impacts; however, with the advent of the use of ceramics in the armor-anti-armor community and the need to model damage to laser optics components, good brittle damage models are now needed in DYNA2D. This report will detail the implementation of four brittle damage models in DYNA2D, three scalar damage models and one tensor damage model. These new brittle damage models are then used to predict experimental results from three distinctly different glass damage problems.

  8. Matrix models of 2d gravity

    SciTech Connect

    Ginsparg, P.

    1991-01-01

    These are introductory lectures for a general audience that give an overview of the subject of matrix models and their application to random surfaces, 2d gravity, and string theory. They are intentionally 1.5 years out of date.

  9. Matrix models of 2d gravity

    SciTech Connect

    Ginsparg, P.

    1991-12-31

    These are introductory lectures for a general audience that give an overview of the subject of matrix models and their application to random surfaces, 2d gravity, and string theory. They are intentionally 1.5 years out of date.

  10. Lorenz-Mie theory for 2D scattering and resonance calculations

    NASA Astrophysics Data System (ADS)

    Gagnon, Denis; Dubé, Louis J.

    2015-10-01

    This PhD tutorial is concerned with a description of the two-dimensional generalized Lorenz-Mie theory (2D-GLMT), a well-established numerical method used to compute the interaction of light with arrays of cylindrical scatterers. This theory is based on the method of separation of variables and the application of an addition theorem for cylindrical functions. The purpose of this tutorial is to assemble the practical tools necessary to implement the 2D-GLMT method for the computation of scattering by passive scatterers or of resonances in optically active media. The first part contains a derivation of the vector and scalar Helmholtz equations for 2D geometries, starting from Maxwell’s equations. Optically active media are included in 2D-GLMT using a recent stationary formulation of the Maxwell-Bloch equations called steady-state ab initio laser theory (SALT), which introduces new classes of solutions useful for resonance computations. Following these preliminaries, a detailed description of 2D-GLMT is presented. The emphasis is placed on the derivation of beam-shape coefficients for scattering computations, as well as the computation of resonant modes using a combination of 2D-GLMT and SALT. The final section contains several numerical examples illustrating the full potential of 2D-GLMT for scattering and resonance computations. These examples, drawn from the literature, include the design of integrated polarization filters and the computation of optical modes of photonic crystal cavities and random lasers.
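
    The elementary building block of such 2D-GLMT calculations is the single-cylinder scattering coefficient obtained by matching cylindrical-wave expansions at the cylinder surface; the addition theorem then couples these coefficients between scatterers. The sketch below evaluates the TM-polarized (field parallel to the cylinder axis) coefficient for a homogeneous dielectric cylinder at normal incidence; sign and normalization conventions vary between texts, and the beam-shape coefficients and multi-scatterer coupling of the full method are not shown.

```python
import numpy as np
from scipy.special import jv, jvp, hankel1, h1vp

def tm_scattering_coefficient(n, x, m):
    """Scattering coefficient of order n for a dielectric cylinder,
    TM polarization (E parallel to the axis), normal incidence.
    x = k*a (size parameter), m = relative refractive index.
    Obtained from continuity of Ez and its radial derivative at r = a."""
    num = m * jvp(n, m * x) * jv(n, x) - jv(n, m * x) * jvp(n, x)
    den = jv(n, m * x) * h1vp(n, x) - m * jvp(n, m * x) * hankel1(n, x)
    return num / den

# Example: sum of |b_n|^2 for a small dielectric cylinder (illustrative values).
x, m = 2.0, 1.5
orders = np.arange(-15, 16)
b = np.array([tm_scattering_coefficient(n, x, m) for n in orders])
print("sum of |b_n|^2 =", np.sum(np.abs(b) ** 2))
```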

  11. An inverse design method for 2D airfoil

    NASA Astrophysics Data System (ADS)

    Liang, Zhi-Yong; Cui, Peng; Zhang, Gen-Bao

    2010-03-01

    Computational methods for the aerodynamic design of aircraft are applied more widely than before, and airfoil design is a central problem. Most related papers discuss the forward problem, but the inverse method is more useful in practical design. In this paper, the inverse design of a 2D airfoil was investigated using a finite element method based on the variational principle. The simulations showed that the method is suitable for this design task.

  12. Real-time 2-D temperature imaging using ultrasound.

    PubMed

    Liu, Dalong; Ebbini, Emad S

    2010-01-01

    We have previously introduced methods for noninvasive estimation of temperature change using diagnostic ultrasound. The basic principle was validated both in vitro and in vivo by several groups worldwide. Some limitations remain, however, that have prevented these methods from being adopted in the monitoring and guidance of minimally invasive thermal therapies, e.g., RF ablation and high-intensity focused ultrasound (HIFU). In this letter, we present first results from a real-time system for 2-D imaging of temperature change using pulse-echo ultrasound. The front end of the system is a commercially available scanner equipped with a research interface, which allows control of the imaging sequence and access to the RF data in real time. A high-frame-rate 2-D RF acquisition mode, M2D, is used to capture the transients of tissue motion/deformation in response to pulsed HIFU. The M2D RF data are streamed to the back end of the system, where a 2-D temperature imaging algorithm based on speckle tracking is implemented on a graphics processing unit. The real-time images of temperature change are computed on the same spatial and temporal grid as the M2D RF data, i.e., with no decimation. Verification of the algorithm was performed by monitoring localized HIFU-induced heating of a tissue-mimicking elastography phantom. These results clearly demonstrate the repeatability and sensitivity of the algorithm. Furthermore, we present in vitro results demonstrating the possible use of this algorithm for imaging changes in tissue parameters due to HIFU-induced lesions. These results clearly demonstrate the value of real-time data streaming and processing in the monitoring and guidance of minimally invasive thermotherapy. PMID:19884075
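
    The speckle-tracking step at the heart of this approach estimates small apparent time shifts of the RF echoes between frames, which are then mapped to temperature change through a calibration factor. A minimal 1D illustration of correlation-based shift estimation with synthetic signals is given below; windowing, sub-sample interpolation, and the GPU implementation used in the paper are omitted.

```python
import numpy as np

def echo_shift_samples(ref, cur):
    """Estimate the delay (in samples) of `cur` relative to `ref`
    from the peak of their cross-correlation."""
    xcorr = np.correlate(cur, ref, mode="full")
    return np.argmax(xcorr) - (len(ref) - 1)

# Synthetic RF-like echo and a copy delayed by 3 samples.
t = np.arange(256)
ref = np.exp(-((t - 100) / 10.0) ** 2) * np.sin(0.8 * t)
cur = np.roll(ref, 3)
print(echo_shift_samples(ref, cur))  # -> 3
```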

  13. Chemical Approaches to 2D Materials.

    PubMed

    Samorì, Paolo; Palermo, Vincenzo; Feng, Xinliang

    2016-08-01

    Chemistry plays an ever-increasing role in the production, functionalization, processing and applications of graphene and other 2D materials. This special issue highlights a selection of enlightening chemical approaches to 2D materials, which nicely reflect the breadth of the field and convey the excitement of the individuals involved in it, who are trying to translate graphene and related materials from the laboratory into a real, high-impact technology. PMID:27478083

  14. Extended 2D generalized dilaton gravity theories

    NASA Astrophysics Data System (ADS)

    de Mello, R. O.

    2008-09-01

    We show that an anomaly-free description of matter in (1+1) dimensions requires a deformation of the 2D relativity principle, which introduces a non-trivial centre in the 2D Poincaré algebra. Then we work out the reduced phase space of the anomaly-free 2D relativistic particle, in order to show that it lives in a noncommutative 2D Minkowski space. Moreover, we build a Gaussian wave packet to show that a Planck length is well defined in two dimensions. In order to provide a gravitational interpretation for this noncommutativity, we propose to extend the usual 2D generalized dilaton gravity models by a specific Maxwell component, which gauges the extra symmetry associated with the centre of the 2D Poincaré algebra. In addition, we show that this extension is a high energy correction to the unextended dilaton theories that can affect the topology of spacetime. Further, we couple a test particle to the general extended dilaton models to show that they predict a noncommutativity in curved spacetime, which is locally described by a Moyal star product in the low energy limit. We also conjecture a probable generalization of this result, which provides strong evidence that the noncommutativity is described by a certain star product which is not of the Moyal type at high energies. Finally, we prove that the extended dilaton theories can be formulated as Poisson sigma models based on a nonlinear deformation of the extended Poincaré algebra.
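
    For reference, the Moyal star product mentioned for the low-energy limit has the standard form below, with θ^{μν} the constant noncommutativity parameter; the paper's conjectured high-energy star product is, by its own statement, not of this type.

```latex
(f \star g)(x)
  = f(x)\,\exp\!\Big(\tfrac{i}{2}\,\theta^{\mu\nu}\,
    \overleftarrow{\partial}_{\mu}\,\overrightarrow{\partial}_{\nu}\Big)\,g(x)
  = f g + \tfrac{i}{2}\,\theta^{\mu\nu}\,\partial_{\mu}f\,\partial_{\nu}g
    + \mathcal{O}(\theta^{2})
```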

  15. 2D/3D Image Registration using Regression Learning

    PubMed Central

    Chou, Chen-Rui; Frederick, Brandon; Mageras, Gig; Chang, Sha; Pizer, Stephen

    2013-01-01

    In computer vision and image analysis, image registration between 2D projections and a 3D image that achieves high accuracy and near real-time computation is challenging. In this paper, we propose a novel method that can rapidly detect an object’s 3D rigid motion or deformation from a 2D projection image or a small set thereof. The method is called CLARET (Correction via Limited-Angle Residues in External Beam Therapy) and consists of two stages: registration preceded by shape space and regression learning. In the registration stage, linear operators are used to iteratively estimate the motion/deformation parameters based on the current intensity residue between the target projection(s) and the digitally reconstructed radiograph(s) (DRRs) of the estimated 3D image. The method determines the linear operators via a two-step learning process. First, it builds a low-order parametric model of the image region’s motion/deformation shape space from its prior 3D images. Second, using learning-time samples produced from the 3D images, it formulates the relationships between the model parameters and the co-varying 2D projection intensity residues by multi-scale linear regressions. The calculated multi-scale regression matrices yield the coarse-to-fine linear operators used in estimating the model parameters from the 2D projection intensity residues in the registration. The method’s application to Image-guided Radiation Therapy (IGRT) requires only a few seconds and yields good results in localizing a tumor under rigid motion in the head and neck and under respiratory deformation in the lung, using one treatment-time imaging 2D projection or a small set thereof. PMID:24058278
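
    The learning stage described above amounts to fitting linear operators that map projection intensity residues to motion/deformation parameters from training samples generated off the prior 3D image. The sketch below shows a single-scale least-squares version of that idea on synthetic data; the multi-scale, coarse-to-fine refinement and the DRR generation of CLARET are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: residue vectors (flattened projections) and the
# motion parameters that produced them (all values are made up).
n_samples, n_pixels, n_params = 200, 64, 3
true_map = rng.normal(size=(n_pixels, n_params))
params_train = rng.normal(size=(n_samples, n_params))
residues_train = params_train @ true_map.T + 0.01 * rng.normal(size=(n_samples, n_pixels))

# Learn the linear operator mapping residues -> parameters (least squares).
operator, *_ = np.linalg.lstsq(residues_train, params_train, rcond=None)

# At "registration time", estimate parameters from a new residue vector.
new_params = rng.normal(size=(1, n_params))
new_residue = new_params @ true_map.T
print(np.round(new_residue @ operator, 3), new_params.round(3))
```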

  16. Evaluating Computer Screen Time and Its Possible Link to Psychopathology in the Context of Age: A Cross-Sectional Study of Parents and Children

    PubMed Central

    Ross, Sharon; Silman, Zmira; Maoz, Hagai; Bloch, Yuval

    2015-01-01

    Background Several studies have suggested that high levels of computer use are linked to psychopathology. However, there is ambiguity about what should be considered normal or over-use of computers. Furthermore, the nature of the link between computer usage and psychopathology is controversial. The current study utilized the context of age to address these questions. Our hypothesis was that the context of age will be paramount for differentiating normal from excessive use, and that this context will allow a better understanding of the link to psychopathology. Methods In a cross-sectional study, 185 parents and children aged 3–18 years were recruited in clinical and community settings. They were asked to fill out questionnaires regarding demographics, functional and academic variables, computer use as well as psychiatric screening questionnaires. Using a regression model, we identified 3 groups of normal-use, over-use and under-use and examined known factors as putative differentiators between the over-users and the other groups. Results After modeling computer screen time according to age, factors linked to over-use were: decreased socialization (OR 3.24, Confidence interval [CI] 1.23–8.55, p = 0.018), difficulty to disengage from the computer (OR 1.56, CI 1.07–2.28, p = 0.022) and age, though borderline-significant (OR 1.1 each year, CI 0.99–1.22, p = 0.058). While psychopathology was not linked to over-use, post-hoc analysis revealed that the link between increased computer screen time and psychopathology was age-dependent and solidified as age progressed (p = 0.007). Unlike computer usage, the use of small-screens and smartphones was not associated with psychopathology. Conclusions The results suggest that computer screen time follows an age-based course. We conclude that differentiating normal from over-use as well as defining over-use as a possible marker for psychiatric difficulties must be performed within the context of age. If verified by

  17. Configuration of automatic exposure control on mammography units for computed radiography to match patient dose of screen film systems

    NASA Astrophysics Data System (ADS)

    Yang, Chang-Ying Joseph; Huang, Weidong

    2009-02-01

    Computed radiography (CR) is considered a drop-in addition to or replacement for traditional screen-film (SF) systems in digital mammography. Unlike other technologies, CR has the advantage of being compatible with existing mammography units. One of the challenges, however, is to properly configure the automatic exposure control (AEC) on existing mammography units for CR use. Unlike in analogue systems, the capture and display of digital CR images are decoupled, so the function of the AEC changes from ensuring proper and consistent optical density of the captured image on film to balancing image quality with the patient dose needed for CR. One preference when acquiring CR images under AEC is to use the same patient dose as SF systems. The challenge is whether the existing AEC design and calibration processes, most of them proprietary to the X-ray system manufacturers and tailored specifically to SF response properties, can be adapted for CR cassettes to compensate for their response and attenuation differences. This paper describes methods for configuring the AEC of three different mammography unit models to match the patient dose used for CR with that used for a KODAK MIN-R 2000 SF System. Based on phantom test results, these methods provide a dose level under AEC for the CR systems that matches the dose of the SF systems. These methods can be used in clinical environments that require the acquisition of CR images under AEC at the same dose levels as those used for SF systems.

  18. [Early lung cancer detection in an occupational asbestos exposed population: clinical impact of low-dose computed tomography screening].

    PubMed

    Pira, E; Coggiola, M; Bosio, D

    2010-01-01

    Lung cancer is the primary cause of cancer mortality in developed countries. Early detection and surgical resection are essential for the treatment of lung cancer. The introduction of low-dose spiral computed tomography (LDCT) is considered one of the most promising clinical research developments in the early diagnosis of lung cancer. Our study aimed to evaluate spiral CT in a cohort of subjects with past occupational exposure to asbestos at high risk of developing lung cancer. 149 subjects were enrolled between 2007 and 2009 (the criteria for enrollment were date of birth between 1930 and 1961, no previous cancer and general good health, latency from the beginning of exposure > 10 years, exposure duration > 1 year, and the ability to undergo surgery). A helical low-dose CT (LDCT) of the chest was performed yearly, and an evaluation protocol derived from IEO with a morphological analysis of nodules was adopted. 13 nodules were diagnosed at the first CT, 7 at the second and 3 at the third, but no invasive procedures were undertaken and no lung cancers were detected. Our early follow-up data cannot yet evaluate the effect of LDCT screening on mortality, but they do not confirm some of the initial results in the literature, such as an increase in overdiagnosis (false positives) due to the high prevalence of benign lesions. PMID:21438306

  19. Beneficial Effects of Combining Computed Tomography Enteroclysis/Enterography with Capsule Endoscopy for Screening Tumor Lesions in the Small Intestine

    PubMed Central

    Shibata, Hiroaki; Hashimoto, Shinichi; Shimizu, Kensaku; Kawasato, Ryo; Shirasawa, Tomohiro; Yokota, Takayuki; Onoda, Hideko; Okamoto, Takeshi; Matsunaga, Naofumi; Sakaida, Isao

    2015-01-01

    Aim. To compare the efficacy of using computed tomography enteroclysis/enterography (CTE), capsule endoscopy (CE), and CTE with CE for diagnosing tumor lesions in the small intestine. Materials and Methods. We included 98 patients who underwent CE during the observation period and were subjected to CTE at our hospital from April 2008 to May 2014. Results. CTE had a significantly higher sensitivity than CE (84.6% versus 46.2%, P = 0.039), but there were no significant differences in specificity, positive or negative predictive values, or diagnostic accuracy rates. The sensitivity of CTE/CE was 100%, again significantly higher than that of CE (P = 0.002). The difference in specificity between CTE/CE and CE was not significant, but there were significant differences in positive predictive values (100% for CTE/CE versus 66.7% for CE, P = 0.012), negative predictive values (100% versus 92.1%, P = 0.008), and diagnostic accuracy rate (100% versus 89.8%, P = 0.001). The diagnostic accuracy rate was also significantly higher in CTE/CE versus CTE (100% versus 95.9%, P = 0.043). Conclusion. Our findings suggested that a combination of CTE and CE was useful for screening tumor lesions in the small intestine. This trial is registered with number UMIN000016154. PMID:25792979
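
    The sensitivities, predictive values, and accuracy rates compared above all derive from the 2 x 2 table of test result versus reference diagnosis. The helper below computes the standard metrics from such a table; the counts in the example are hypothetical and are not the study's raw data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard test-performance metrics from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts for one modality.
print(diagnostic_metrics(tp=11, fp=0, fn=2, tn=85))
```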

  20. A comparative study of adult patient doses in film screen and computed radiography in some Sudanese hospitals.

    PubMed

    Elshiekh, E; Suliman, I I; Habbani, F

    2015-07-01

    A study was performed to compare adult patient doses in film-screen (FS) and computed radiography (CR) diagnostic X-ray examinations in some hospitals in Sudan over a period of 1 y, during which CR systems were being introduced to replace FS systems. Radiation doses were estimated for 354 patients in five hospitals (two FS units and three CR units). Entrance surface air kerma (ESAK) was estimated from incident air kerma using patient exposure parameters and tube output. Dose calculations were performed using CALDOSE X 3.5 Monte Carlo-based software. For FS, third-quartile ESAK values for skull PA, skull LAT, chest PA, pelvis AP, lumbar spine AP and lumbar spine LAT were 1.5, 1.3, 0.3, 1.9, 2.8 and 5.9 mGy, respectively, while for CR, third-quartile ESAK values for the same examinations were 2.7, 1.7, 0.18, 1.7, 3.2 and 10.8 mGy, respectively. Comparable ESAK values were observed for the FS and CR units. The results are important for future dose optimisation and for setting national diagnostic reference levels. PMID:25889604

  1. Low-dose computed tomography screening for lung cancer in populations highly exposed to tobacco: A systematic methodological appraisal of published randomised controlled trials.

    PubMed

    Coureau, Gaëlle; Salmi, L Rachid; Etard, Cécile; Sancho-Garnier, Hélène; Sauvaget, Catherine; Mathoulin-Pélissier, Simone

    2016-07-01

    Low-dose computed tomography (LDCT) screening recommendations for lung cancer are contradictory. The French National Authority for Health commissioned experts to carry out a systematic review of the effectiveness, acceptability and safety of lung cancer screening with LDCT in subjects highly exposed to tobacco. We used the MEDLINE and Embase databases (2003-2014) and identified 83 publications representing ten randomised controlled trials. Control arms and methodology varied considerably, precluding a full comparison and questioning the reproducibility of the findings. Of the five trials reporting mortality results, only the National Lung Screening Trial found a significant decrease in disease-specific and all-cause mortality with LDCT screening compared to chest X-ray screening. None of the studies provided all the information needed to document the risk-benefit balance. The lack of statistical power and the methodological heterogeneity of the European trials call into question the possibility of obtaining valid results separately or by pooling. We conclude, given the lack of strong scientific evidence, that LDCT screening should not be recommended in subjects highly exposed to tobacco. PMID:27211572

  2. ELLIPT2D: A Flexible Finite Element Code Written in Python

    SciTech Connect

    Pletzer, A.; Mollis, J.C.

    2001-03-22

    The use of the Python scripting language for scientific applications, and in particular to solve partial differential equations, is explored. It is shown that Python's rich data structures and object-oriented features can be exploited to write programs that are not only significantly more concise than their counterparts written in Fortran, C or C++, but are also numerically efficient. To illustrate this, a two-dimensional finite element code (ELLIPT2D) has been written. ELLIPT2D provides a flexible and easy-to-use framework for solving a large class of second-order elliptic problems. The program allows for structured or unstructured meshes. All functions defining the elliptic operator are user supplied, as are the boundary conditions, which can be of Dirichlet, Neumann or Robbins type. ELLIPT2D makes extensive use of dictionaries (hash tables) as a way to represent sparse matrices. Other key features of the Python language that have been widely used include operator overloading, error handling, array slicing, and the Tkinter module for building graphical user interfaces. As an example of the utility of ELLIPT2D, a nonlinear solution of the Grad-Shafranov equation is computed using a Newton iterative scheme. A second application focuses on a solution of the toroidal Laplace equation coupled to a magnetohydrodynamic stability code, a problem arising in the context of magnetic fusion research.
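
    The dictionary-of-keys sparse-matrix idea highlighted in the abstract is easy to illustrate: a global stiffness-like matrix is stored as a Python dict mapping (row, column) index pairs to accumulated values during element assembly. The toy sketch below conveys the representation only; it is not the ELLIPT2D API, and the element matrix is a placeholder.

```python
from collections import defaultdict

def assemble(elements, element_matrix):
    """Accumulate local element matrices into a global sparse matrix
    stored as a dictionary {(row, col): value}."""
    K = defaultdict(float)
    for nodes in elements:                        # e.g. triangle node indices
        for a, i in enumerate(nodes):
            for b, j in enumerate(nodes):
                K[(i, j)] += element_matrix[a][b]
    return K

elements = [(0, 1, 2), (1, 3, 2)]                 # two triangles sharing an edge
local = [[2.0, -1.0, -1.0], [-1.0, 2.0, -1.0], [-1.0, -1.0, 2.0]]  # placeholder
K = assemble(elements, local)
print(K[(1, 2)], K[(0, 0)])
```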

  3. Targeting multiple types of tumors using NKG2D-coated iron oxide nanoparticles

    PubMed Central

    Wu, Ming-Ru; Cook, W. James; Zhang, Tong; Sentman, Charles L.

    2015-01-01

    Iron oxide nanoparticles (IONPs) hold great potential for cancer therapy. Actively targeting IONPs to tumor cells can further increase therapeutic efficacy and decrease off-target side effects. To target tumor cells, a natural killer (NK) cell activating receptor, NKG2D, was utilized to develop pan-tumor targeting IONPs. NKG2D ligands are expressed on many tumor types but are not found on most normal tissues under steady-state conditions. The data showed that mouse and human fragment crystallizable (Fc) -fusion NKG2D (Fc-NKG2D) coated IONPs (NKG2D/NPs) can target multiple NKG2D ligand positive tumor types in vitro in a dose dependent manner by magnetic cell sorting. The tumor targeting effect was robust even at a very low tumor cell to normal cell ratio, and targeting efficiency correlated with NKG2D ligand expression level on tumor cells. Furthermore, the magnetic separation platform utilized to test NKG2D/NP specificity has the potential to be developed into high throughput screening strategies to identify ideal fusion proteins or antibodies for targeting IONPs. In conclusion, NKG2D/NPs can be used to target multiple tumor types, and the magnetic separation platform can facilitate the proof-of-concept phase of tumor targeting IONP development. PMID:25371538

  4. Targeting multiple types of tumors using NKG2D-coated iron oxide nanoparticles

    NASA Astrophysics Data System (ADS)

    Wu, Ming-Ru; Cook, W. James; Zhang, Tong; Sentman, Charles L.

    2014-11-01

    Iron oxide nanoparticles (IONPs) hold great potential for cancer therapy. Actively targeting IONPs to tumor cells can further increase therapeutic efficacy and decrease off-target side effects. To target tumor cells, a natural killer (NK) cell activating receptor, NKG2D, was utilized to develop pan-tumor targeting IONPs. NKG2D ligands are expressed on many tumor types but are not found on most normal tissues under steady-state conditions. The data showed that mouse and human fragment crystallizable (Fc)-fusion NKG2D (Fc-NKG2D) coated IONPs (NKG2D/NPs) can target multiple NKG2D ligand positive tumor types in vitro in a dose dependent manner by magnetic cell sorting. The tumor targeting effect was robust even at a very low tumor cell to normal cell ratio, and targeting efficiency correlated with NKG2D ligand expression level on tumor cells. Furthermore, the magnetic separation platform utilized to test NKG2D/NP specificity has the potential to be developed into high throughput screening strategies to identify ideal fusion proteins or antibodies for targeting IONPs. In conclusion, NKG2D/NPs can be used to target multiple tumor types, and the magnetic separation platform can facilitate the proof-of-concept phase of tumor targeting IONP development.

  5. A scanning-mode 2D shear wave imaging (s2D-SWI) system for ultrasound elastography.

    PubMed

    Qiu, Weibao; Wang, Congzhi; Li, Yongchuan; Zhou, Juan; Yang, Ge; Xiao, Yang; Feng, Ge; Jin, Qiaofeng; Mu, Peitian; Qian, Ming; Zheng, Hairong

    2015-09-01

    Ultrasound elastography is widely used for the non-invasive measurement of tissue elasticity properties. Shear wave imaging (SWI) is a quantitative method for assessing tissue stiffness. SWI has been demonstrated to be less operator dependent than quasi-static elastography and, in contrast with acoustic radiation force impulse (ARFI) imaging, can acquire quantitative elasticity information. However, traditional SWI implementations cannot acquire two-dimensional (2D) quantitative images of the tissue elasticity distribution. This study proposes and evaluates a scanning-mode 2D SWI (s2D-SWI) system. The hardware and image processing algorithms are presented in detail. Programmable devices are used to support flexible control of the system and the image processing algorithms. An analytic-signal-based cross-correlation method and a Radon-transformation-based shear wave speed determination method are proposed, both of which can be implemented using parallel computation. Imaging of tissue-mimicking phantoms and in vitro and in vivo imaging tests are conducted to demonstrate the performance of the proposed system. The s2D-SWI system represents a new choice for the quantitative mapping of tissue elasticity, and has great potential for implementation in commercial ultrasound scanners. PMID:26025508

  6. Optical modulators with 2D layered materials

    NASA Astrophysics Data System (ADS)

    Sun, Zhipei; Martinez, Amos; Wang, Feng

    2016-04-01

    Light modulation is an essential operation in photonics and optoelectronics. With existing and emerging technologies increasingly demanding compact, efficient, fast and broadband optical modulators, high-performance light modulation solutions are becoming indispensable. The recent realization that 2D layered materials could modulate light with superior performance has prompted intense research and significant advances, paving the way for realistic applications. In this Review, we cover the state of the art of optical modulators based on 2D materials, including graphene, transition metal dichalcogenides and black phosphorus. We discuss recent advances employing hybrid structures, such as 2D heterostructures, plasmonic structures, and silicon and fibre integrated structures. We also take a look at the future perspectives and discuss the potential of yet relatively unexplored mechanisms, such as magneto-optic and acousto-optic modulation.

  7. Large Area Synthesis of 2D Materials

    NASA Astrophysics Data System (ADS)

    Vogel, Eric

    Transition metal dichalcogenides (TMDs) have generated significant interest for numerous applications including sensors, flexible electronics, heterostructures and optoelectronics due to their interesting, thickness-dependent properties. Despite recent progress, the synthesis of high-quality and highly uniform TMDs on a large scale is still a challenge. In this talk, synthesis routes for WSe2 and MoS2 that achieve monolayer thickness uniformity across large area substrates with electrical properties equivalent to geological crystals will be described. Controlled doping of 2D semiconductors is also critically required. However, methods established for conventional semiconductors, such as ion implantation, are not easily applicable to 2D materials because of their atomically thin structure. Redox-active molecular dopants will be demonstrated which provide large changes in carrier density and workfunction through the choice of dopant, treatment time, and the solution concentration. Finally, several applications of these large-area, uniform 2D materials will be described including heterostructures, biosensors and strain sensors.

  8. 2D-Crystal-Based Functional Inks.

    PubMed

    Bonaccorso, Francesco; Bartolotta, Antonino; Coleman, Jonathan N; Backes, Claudia

    2016-08-01

    The possibility to produce and process graphene, related 2D crystals, and heterostructures in the liquid phase makes them promising materials for an ever-growing class of applications as composite materials, sensors, in flexible optoelectronics, and energy storage and conversion. In particular, the ability to formulate functional inks with on-demand rheological and morphological properties, i.e., lateral size and thickness of the dispersed 2D crystals, is a step forward toward the development of industrial-scale, reliable, inexpensive printing/coating processes, a boost for the full exploitation of such nanomaterials. Here, the exfoliation strategies of graphite and other layered crystals are reviewed, along with the advances in the sorting of lateral size and thickness of the exfoliated sheets together with the formulation of functional inks and the current development of printing/coating processes of interest for the realization of 2D-crystal-based devices. PMID:27273554

  9. The 2D lingual appliance system.

    PubMed

    Cacciafesta, Vittorio

    2013-09-01

    The two-dimensional (2D) lingual bracket system represents a valuable treatment option for adult patients seeking a completely invisible orthodontic appliance. The ease of direct or simplified indirect bonding of 2D lingual brackets in combination with low friction mechanics makes it possible to achieve a good functional and aesthetic occlusion, even in the presence of a severe malocclusion. The use of a self-ligating bracket significantly reduces chair-side time for the orthodontist, and the low-profile bracket design greatly improves patient comfort. PMID:24005953

  10. Inkjet printing of 2D layered materials.

    PubMed

    Li, Jiantong; Lemme, Max C; Östling, Mikael

    2014-11-10

    Inkjet printing of 2D layered materials, such as graphene and MoS2, has attracted great interests for emerging electronics. However, incompatible rheology, low concentration, severe aggregation and toxicity of solvents constitute critical challenges which hamper the manufacturing efficiency and product quality. Here, we introduce a simple and general technology concept (distillation-assisted solvent exchange) to efficiently overcome these challenges. By implementing the concept, we have demonstrated excellent jetting performance, ideal printing patterns and a variety of promising applications for inkjet printing of 2D layered materials. PMID:25169938

  11. Measurement of 2D birefringence distribution

    NASA Astrophysics Data System (ADS)

    Noguchi, Masato; Ishikawa, Tsuyoshi; Ohno, Masahiro; Tachihara, Satoru

    1992-10-01

    A new method for measuring 2-D birefringence distributions has been developed. Obtaining a birefringence distribution across an optical element has not been easy with conventional ellipsometry because it lacks a means of scanning. Noting an analogy between the rotating analyzer method in ellipsometry and the phase-shifting method in recently developed digital interferometry, we have applied the phase-shifting algorithm to ellipsometry and developed a new method that makes the measurement of 2-D birefringence distributions practical. The system contains few moving parts, assuring reliability, and measures a large area of a sample at one time, making the measuring time very short.
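
    The phase-shifting algorithm borrowed from digital interferometry recovers a phase map pixel by pixel from a few intensity frames recorded at known phase steps. The classic four-step formula is sketched below on synthetic frames; whether the authors used exactly four steps, and how the retrieved phase maps to retardance and azimuth, is not specified here, so treat this as illustrative.

```python
import numpy as np

def four_step_phase(i0, i1, i2, i3):
    """Four-step phase-shifting retrieval: frames taken at phase offsets
    0, 90, 180 and 270 degrees. Returns the wrapped phase per pixel."""
    return np.arctan2(i3 - i1, i0 - i2)

# Synthetic 2D phase map and the corresponding intensity frames.
x, y = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
phi = 2.0 * np.pi * (x ** 2 + 0.5 * y)
frames = [1.0 + 0.8 * np.cos(phi + k * np.pi / 2) for k in range(4)]
phi_rec = four_step_phase(*frames)

# The retrieved (wrapped) phase agrees with the true phase modulo 2*pi.
print(np.allclose(np.angle(np.exp(1j * (phi - phi_rec))), 0, atol=1e-6))
```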

  12. The Privileged Chemical Space Predictor (PCSP): A computer program that identifies privileged chemical space from screens of modularly assembled chemical libraries

    PubMed Central

    Seedhouse, Steven J.; Labuda, Lucas P.; Disney, Matthew D.

    2010-01-01

    Modularly assembled combinatorial libraries are often used to identify ligands that bind to and modulate the function of a protein or a nucleic acid. Much of the data from screening these compounds, however, is not efficiently utilized to define structure-activity relationships (SAR). If SAR data are accurately constructed, they can enable the design of more potent binders. Herein, we describe a computer program called Privileged Chemical Space Predictor (PCSP) that statistically determines SAR from high-throughput screening (HTS) data and then identifies features in small molecules that predispose them to binding a target. Features are scored for statistical significance and can be utilized to design improved second-generation compounds or more target-focused libraries. The program’s utility is demonstrated through analysis of a modularly assembled peptoid library that was screened for binding to and inhibiting a group I intron RNA from the fungal pathogen Candida albicans. PMID:20097562
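
    The statistical core of identifying "privileged" modules is testing whether a given building block occurs among screening hits more often than its overall frequency in the library would predict. A hypergeometric/Fisher's exact sketch of that test is shown below with hypothetical counts; the actual scoring implemented in PCSP may differ.

```python
from scipy.stats import fisher_exact

def module_enrichment(hits_with, hits_without, nonhits_with, nonhits_without):
    """2x2 test: does a building block appear in screening hits more often
    than expected from its overall frequency in the library?"""
    table = [[hits_with, hits_without], [nonhits_with, nonhits_without]]
    odds_ratio, p_value = fisher_exact(table, alternative="greater")
    return odds_ratio, p_value

# Hypothetical counts for one peptoid submonomer.
print(module_enrichment(hits_with=8, hits_without=4,
                        nonhits_with=40, nonhits_without=948))
```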

  13. COYOTE: A computer program for 2-D reactive flow simulations

    SciTech Connect

    Cloutman, L.D.

    1990-04-01

    We describe the numerical algorithm used in the COYOTE two-dimensional, transient, Eulerian hydrodynamics program for reactive flows. The program has a variety of options that provide capabilities for a wide range of applications, and it is designed to be robust and relatively easy to use while maintaining adequate accuracy and efficiency to solve realistic problems. It is based on the ICE method, and it includes a general species and chemical reaction network for simulating reactive flows. It also includes swirl, turbulence transport models, and a nonuniform mesh capability. We describe several applications of the program. 33 refs., 4 figs.

  14. TOPAZ2D heat transfer code users manual and thermal property data base

    NASA Astrophysics Data System (ADS)

    Shapiro, A. B.; Edwards, A. L.

    1990-05-01

    TOPAZ2D is a two-dimensional implicit finite element computer code for heat transfer analysis. This user's manual provides information on the structure of a TOPAZ2D input file. Also included is a material thermal property data base. This manual is supplemented by the TOPAZ2D Theoretical Manual and the TOPAZ2D Verification Manual. TOPAZ2D has been implemented on the CRAY, SUN, and VAX computers. TOPAZ2D can be used to solve for the steady-state or transient temperature field on two-dimensional planar or axisymmetric geometries. Material properties may be temperature dependent and either isotropic or orthotropic. A variety of time and temperature dependent boundary conditions can be specified, including temperature, flux, convection, and radiation. Time or temperature dependent internal heat generation can be defined locally by element or globally by material. TOPAZ2D can solve problems of diffuse and specular band radiation in an enclosure coupled with conduction in the material surrounding the enclosure. Additional features include thermally controlled reactive chemical mixtures, thermal contact resistance across an interface, bulk fluid flow, phase change, and energy balances. Thermal stresses can be calculated using the solid mechanics code NIKE2D, which reads the temperature state data calculated by TOPAZ2D. A three-dimensional version of the code, TOPAZ3D, is available.
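
    TOPAZ2D itself is an implicit finite element code; as a much simpler stand-in for the class of problems it solves, the sketch below advances transient 2D heat conduction with an explicit finite-difference update on a uniform grid. Grid size, diffusivity, time step, and boundary values are all illustrative assumptions.

```python
import numpy as np

# Explicit finite-difference sketch of transient 2D heat conduction
# (a stand-in for the class of problems TOPAZ2D solves implicitly with FEM).
nx = ny = 32
alpha, dx, dt = 1.0e-4, 1.0e-2, 0.2           # diffusivity, grid spacing, time step
assert alpha * dt / dx**2 <= 0.25             # explicit stability condition

T = np.zeros((ny, nx))
T[:, 0] = 100.0                               # fixed-temperature boundary (left edge)

for _ in range(500):
    lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
           np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
    T[1:-1, 1:-1] += alpha * dt * lap[1:-1, 1:-1]
    T[:, 0] = 100.0                           # re-impose Dirichlet boundaries
    T[:, -1] = 0.0
    T[0, :] = T[1, :]                         # crude zero-flux top/bottom edges
    T[-1, :] = T[-2, :]

print(T[ny // 2, :8].round(2))
```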

  15. Search for β2 Adrenergic Receptor Ligands by Virtual Screening via Grid Computing and Investigation of Binding Modes by Docking and Molecular Dynamics Simulations

    PubMed Central

    Bai, Qifeng; Shao, Yonghua; Pan, Dabo; Zhang, Yang; Liu, Huanxiang; Yao, Xiaojun

    2014-01-01

    We designed a program called MolGridCal that can be used to screen small-molecule databases in grid computing on the basis of the JPPF grid environment. Based on the MolGridCal program, we proposed an integrated strategy for virtual screening and binding mode investigation by combining molecular docking, molecular dynamics (MD) simulations and free energy calculations. To test the effectiveness of MolGridCal, we screened potential ligands for the β2 adrenergic receptor (β2AR) from a database containing 50,000 small molecules. MolGridCal can not only send tasks to the grid server automatically, but can also distribute tasks using the screensaver function. As for the results of virtual screening, the known agonist BI-167107 of β2AR is ranked among the top 2% of the screened candidates, indicating that MolGridCal can give reasonable results. To further study the binding mode and refine the results of MolGridCal, more accurate docking and scoring methods are used to estimate the binding affinity for the top three molecules (agonist BI-167107, neutral antagonist alprenolol and inverse agonist ICI 118,551). The results indicate that agonist BI-167107 has the best binding affinity. MD simulation and free energy calculation are employed to investigate the dynamic interaction mechanism between the ligands and β2AR. The results show that the agonist BI-167107 also has the lowest binding free energy. This study provides a new way to perform virtual screening effectively by integrating molecular docking based on grid computing, MD simulations and free energy calculations. The source codes of MolGridCal are freely available at http://molgridcal.codeplex.com. PMID:25229694

  16. Concurrent Validity of a Computer-Based Cognitive Screening Tool for Use in Adults with HIV Disease

    PubMed Central

    Dew, Mary Amanda; Aizenstein, Howard J.; Lopez, Oscar L.; Morrow, Lisa; Saxton, Judith

    2011-01-01

    As the incidence of HIV-associated dementia has decreased, the survival of HIV-infected individuals with milder forms of cognitive impairment has increased. Detecting this milder impairment in its earliest stages has great clinical and research importance. We report here the results of an initial evaluation of the Computer Assessment of Mild Cognitive Impairment (CAMCI®), a computerized screening tool designed to assess abnormal cognitive decline with reduced respondent and test administrator burden. Fifty-nine volunteers (29 HIV infected; age=50.9 years; education=14.9 years; 36/59 males) completed the CAMCI® and a battery of neuropsychological tests. The CAMCI was repeated 12 and 24 weeks later. The results from the CAMCI were compared to Global and Domain Impairment scores derived from the full neuropsychological test battery. The CAMCI detected mild impairment (compared with normal and borderline test performance) with a sensitivity of 0.72, specificity of 0.97, positive predictive rate of 0.93, and a negative predictive rate of 0.89. Median stability over 12 and 24 weeks of follow-up was 0.32 and 0.46, respectively. These rates did not differ as a function of serostatus. A discriminant function analysis correctly classified 90% of the subjects with respect to their overall Global Impairment Rating from six of the CAMCI scores. This preliminary study demonstrates that the CAMCI is sensitive to mild forms of cognitive impairment, and is stable over 24 weeks of follow-up. A larger trial to obtain risk-group appropriate normative data will be necessary to make the instrument useful in both clinical practice and research (e.g., clinical trials). PMID:21545295

  17. MPEG-4-based 2D facial animation for mobile devices

    NASA Astrophysics Data System (ADS)

    Riegel, Thomas B.

    2005-03-01

    The enormous spread of mobile computing devices (e.g. PDA, cellular phone, palmtop, etc.) emphasizes scalable applications, since users like to run their favorite programs on whichever terminal they are operating at the moment. Therefore, applications that can be adapted to the hardware realities without losing much of their functionality are of interest. A good example of this is "Facial Animation," which offers an interesting way to achieve such "scalability." By employing MPEG-4, which provides its own profile for facial animation, a solution for low-power terminals including mobile phones is demonstrated. From the generic 3D MPEG-4 face a specific 2D head model is derived, which consists primarily of a portrait image superposed by a suited warping mesh and adapted 2D animation rules. Thus the animation process of MPEG-4 need not be changed, and standard-compliant facial animation parameters can be used to displace the vertices of the mesh and warp the underlying image accordingly.

  18. FPCAS2D user's guide, version 1.0

    NASA Astrophysics Data System (ADS)

    Bakhle, Milind A.

    1994-12-01

    The FPCAS2D computer code has been developed for aeroelastic stability analysis of bladed disks such as those in fans, compressors, turbines, propellers, or propfans. The aerodynamic analysis used in this code is based on the unsteady two-dimensional full potential equation which is solved for a cascade of blades. The structural analysis is based on a two degree-of-freedom rigid typical section model for each blade. Detailed explanations of the aerodynamic analysis, the numerical algorithms, and the aeroelastic analysis are not given in this report. This guide can be used to assist in the preparation of the input data required by the FPCAS2D code. A complete description of the input data is provided in this report. In addition, four test cases, including inputs and outputs, are provided.

  19. 2-D Magnetohydrodynamic Modeling of A Pulsed Plasma Thruster

    NASA Technical Reports Server (NTRS)

    Thio, Y. C. Francis; Cassibry, J. T.; Wu, S. T.; Rodgers, Stephen L. (Technical Monitor)

    2002-01-01

    Experiments are being performed on the NASA Marshall Space Flight Center (MSFC) MK-1 pulsed plasma thruster. Data produced from the experiments provide an opportunity to further understand the plasma dynamics in these thrusters via detailed computational modeling. The detailed and accurate understanding of the plasma dynamics in these devices holds the key towards extending their capabilities in a number of applications, including their applications as high power (greater than 1 MW) thrusters, and their use for producing high-velocity, uniform plasma jets for experimental purposes. For this study, the 2-D MHD modeling code, MACH2, is used to provide detailed interpretation of the experimental data. At the same time, a 0-D physics model of the plasma initial phase is developed to guide our 2-D modeling studies.

  20. 2D FEM Heat Transfer & E&M Field Code

    1992-04-02

    TOPAZ and TOPAZ2D are two-dimensional implicit finite element computer codes for heat transfer analysis. TOPAZ2D can also be used to solve electrostatic and magnetostatic problems. The programs solve for the steady-state or transient temperature or electrostatic and magnetostatic potential field on two-dimensional planar or axisymmetric geometries. Material properties may be temperature or potential-dependent and either isotropic or orthotropic. A variety of time and temperature-dependent boundary conditions can be specified including temperature, flux, convection, and radiation. By implementing the user subroutine feature, users can model chemical reaction kinetics and allow for any type of functional representation of boundary conditions and internal heat generation. The programs can solve problems of diffuse and specular band radiation in an enclosure coupled with conduction in the material surrounding the enclosure. Additional features include thermal contact resistance across an interface, bulk fluids, phase change, and energy balances.

  1. 2D FEM Heat Transfer & E&M Field Code

    SciTech Connect

    1992-04-02

    TOPAZ and TOPAZ2D are two-dimensional implicit finite element computer codes for heat transfer analysis. TOPAZ2D can also be used to solve electrostatic and magnetostatic problems. The programs solve for the steady-state or transient temperature or electrostatic and magnetostatic potential field on two-dimensional planar or axisymmetric geometries. Material properties may be temperature or potential-dependent and either isotropic or orthotropic. A variety of time and temperature-dependent boundary conditions can be specified including temperature, flux, convection, and radiation. By implementing the user subroutine feature, users can model chemical reaction kinetics and allow for any type of functional representation of boundary conditions and internal heat generation. The programs can solve problems of diffuse and specular band radiation in an enclosure coupled with conduction in the material surrounding the enclosure. Additional features include thermal contact resistance across an interface, bulk fluids, phase change, and energy balances.

  2. Grid Cell Responses in 1D Environments Assessed as Slices through a 2D Lattice.

    PubMed

    Yoon, KiJung; Lewallen, Sam; Kinkhabwala, Amina A; Tank, David W; Fiete, Ila R

    2016-03-01

    Grid cells, defined by their striking periodic spatial responses in open 2D arenas, appear to respond differently on 1D tracks: the multiple response fields are not periodically arranged, peak amplitudes vary across fields, and the mean spacing between fields is larger than in 2D environments. We ask whether such 1D responses are consistent with the system's 2D dynamics. Combining analytical and numerical methods, we show that the 1D responses of grid cells with stable 1D fields are consistent with a linear slice through a 2D triangular lattice. Further, the 1D responses of comodular cells are well described by parallel slices, and the offsets in the starting points of the 1D slices can predict the measured 2D relative spatial phase between the cells. From these results, we conclude that the 2D dynamics of these cells is preserved in 1D, suggesting a common computation during both types of navigation behavior. PMID:26898777
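
    A toy numerical illustration of the "slice through a lattice" idea (not the authors' analysis): build a grid-cell-like rate map from three plane waves 60 degrees apart and sample it along a straight 1D track. The spacing, phase, and slice angle below are arbitrary choices.

```python
import numpy as np

# Toy triangular-lattice rate map sampled along a 1D track. All parameter
# values are arbitrary illustrative choices, not fits to recorded grid cells.
spacing = 0.5                                   # grid period [m]
k = 4 * np.pi / (np.sqrt(3) * spacing)          # wave number of each component
angles = np.deg2rad([0, 60, 120])
kvecs = k * np.c_[np.cos(angles), np.sin(angles)]

def rate(xy, phase=np.zeros(2)):
    """Triangular-lattice firing-rate map normalized to [0, 1]."""
    s = sum(np.cos(kv @ (xy - phase).T) for kv in kvecs)
    return (s + 1.5) / 4.5

t = np.linspace(0.0, 3.0, 600)                  # positions along a 3 m track
theta = np.deg2rad(17.0)                        # arbitrary slice direction
track = np.c_[t * np.cos(theta), t * np.sin(theta)]
slice_1d = rate(track)                          # the 1D "slice" response
print("peak rate along the 1D track:", round(float(slice_1d.max()), 2))
```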

  3. Parallel stitching of 2D materials

    DOE PAGES Beta

    Ling, Xi; Wu, Lijun; Lin, Yuxuan; Ma, Qiong; Wang, Ziqiang; Song, Yi; Yu, Lili; Huang, Shengxi; Fang, Wenjing; Zhang, Xu; et al

    2016-01-27

    Diverse parallel stitched 2D heterostructures, including metal–semiconductor, semiconductor–semiconductor, and insulator–semiconductor, are synthesized directly through selective “sowing” of aromatic molecules as the seeds in the chemical vapor deposition (CVD) method. The methodology enables the large-scale fabrication of lateral heterostructures, which offers tremendous potential for its application in integrated circuits.

  4. Parallel Stitching of 2D Materials.

    PubMed

    Ling, Xi; Lin, Yuxuan; Ma, Qiong; Wang, Ziqiang; Song, Yi; Yu, Lili; Huang, Shengxi; Fang, Wenjing; Zhang, Xu; Hsu, Allen L; Bie, Yaqing; Lee, Yi-Hsien; Zhu, Yimei; Wu, Lijun; Li, Ju; Jarillo-Herrero, Pablo; Dresselhaus, Mildred; Palacios, Tomás; Kong, Jing

    2016-03-01

    Diverse parallel stitched 2D heterostructures, including metal-semiconductor, semiconductor-semiconductor, and insulator-semiconductor, are synthesized directly through selective "sowing" of aromatic molecules as the seeds in the chemical vapor deposition (CVD) method. The methodology enables the large-scale fabrication of lateral heterostructures, which offers tremendous potential for its application in integrated circuits. PMID:26813882

  5. Baby universes in 2d quantum gravity

    NASA Astrophysics Data System (ADS)

    Ambjørn, Jan; Jain, Sanjay; Thorleifsson, Gudmar

    1993-06-01

    We investigate the fractal structure of 2d quantum gravity, both for pure gravity and for gravity coupled to multiple Gaussian fields and for gravity coupled to Ising spins. The roughness of the surfaces is described in terms of baby universes, and using numerical simulations we measure their distribution, which is related to the string susceptibility exponent γ_string.

  6. LOCA hydroloads calculations with multidimensional nonlinear fluid/structure interaction. Volume 2: STEALTH 2D/WHAMSE 2D single-phase fluid and elastic structure studies. Final report. [PWR]

    SciTech Connect

    Chang, F.H.; Santee, G.E. Jr.; Mortensen, G.A.; Brockett, G.F.; Gross, M.B.; Silling, S.A.; Belytschko, T.

    1981-03-01

    This report, the second in a series of reports for RP-1065, describes the second step in the stepwise approach for developing the three-dimensional, nonlinear, fluid/structure interaction methodology to assess the hydroloads on a large PWR during the subcooled portions of a hypothetical LOCA. The second step in the methodology considers enhancements and special modifications to the 2D STEALTH-HYDRO computer program and the 2D WHAMSE computer program. The 2D STEALTH-HYDRO enhancements consist of a fluid-fluid coupling control-volume model and an orifice control-volume model. The enhancements to 2D WHAMSE include elimination of the implicit integration routines, material models, and structural elements not required for the hydroloads application. In addition the logic for coupling the 2D STEALTH-HYDRO computer program to the 2D WHAMSE computer program is discussed.

  7. CYP2D6 phenotype-genotype relationships in African-Americans and Caucasians in Los Angeles.

    PubMed

    Leathart, J B; London, S J; Steward, A; Adams, J D; Idle, J R; Daly, A K

    1998-12-01

    CYP2D6 genotyping (CYP2D6*3, CYP2D6*4, CYP2D6*5, CYP2D6*13, CYP2D6*16 alleles and gene duplications) was previously performed on 1053 Caucasian and African-American lung cancer cases and control individuals and no significant difference in allele frequencies between cases and control individuals detected. We have carried out additional genotyping (CYP2D6*6, CYP2D6*7, CYP2D6*8, CYP2D6*9, CYP2D6*10, CYP2D6*17 alleles) and debrisoquine phenotyping on subgroups from this study to assess phenotype-genotype relationships. African-Americans showed significant differences from Caucasians with respect to frequency of defective CYP2D6 alleles, particularly CYP2D6*4 and CYP2D6*5. The CYP2D6*17 allele occurred at a frequency of 0.26 among 87 African-Americans and appeared to explain higher average metabolic ratios among African-Americans compared with Caucasians. CYP2D6*6, CYP2D6*8, CYP2D6*9 and CYP2D6*10 were rare in both ethnic groups but explained approximately 40% of higher than expected metabolic ratios among extensive metabolizers. Among individuals phenotyped with debrisoquine, 32 out of 359 were in the poor metabolizer range with 24 of these (75%) also showing two defective CYP2D6 alleles. Additional single strand conformational polymorphism analysis screening of samples showing large phenotype-genotype discrepancies resulted in the detection of three novel polymorphisms. If subjects taking potentially interfering drugs were excluded, this additional screening enabled the positive identification of 88% of phenotypic poor metabolizers by genotyping. This sensitivity was comparable with that of phenotyping, which identified 90% of those with two defective alleles as poor metabolizers. PMID:9918137

  8. Using computational modeling to assess the impact of clinical decision support on cancer screening improvement strategies within the community health centers.

    PubMed

    Carney, Timothy Jay; Morgan, Geoffrey P; Jones, Josette; McDaniel, Anna M; Weaver, Michael; Weiner, Bryan; Haggstrom, David A

    2014-10-01

    Our conceptual model demonstrates our goal to investigate the impact of clinical decision support (CDS) utilization on cancer screening improvement strategies in the community health center (CHC) setting. We employed a dual modeling technique using both statistical and computational modeling to evaluate impact. Our statistical model used the Spearman's Rho test to evaluate the strength of relationship between our proximal outcome measures (CDS utilization) and our distal outcome measure (provider self-reported cancer screening improvement). Our computational model relied on network evolution theory and made use of a tool called Construct-TM to model the use of CDS measured by the rate of organizational learning. We used previously collected survey data from the community health centers' Cancer Health Disparities Collaborative (HDCC). Our intent is to demonstrate the added value gained by using a computational modeling tool in conjunction with a statistical analysis when evaluating the impact of a health information technology, in the form of CDS, on health care quality process outcomes such as facility-level screening improvement. Significant simulated disparities in organizational learning over time were observed between community health centers beginning the simulation with high and low clinical decision support capability. PMID:24953241
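
    A minimal sketch of the statistical half of this dual-modeling approach: Spearman's rho between a proximal measure (CDS utilization) and a distal measure (self-reported screening improvement). The scores below are fabricated for illustration and are not HDCC survey data.

```python
from scipy.stats import spearmanr

# Spearman's rank correlation between hypothetical survey scores; the values
# are made up and stand in for CDS utilization and screening-improvement items.
cds_utilization = [3, 1, 4, 2, 5, 2, 4, 3, 5, 1]
screening_improvement = [2, 1, 4, 3, 5, 1, 3, 3, 4, 2]

rho, p_value = spearmanr(cds_utilization, screening_improvement)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```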

  9. A comparative analysis of 2D and 3D CAD for calcifications in digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Acciavatti, Raymond J.; Ray, Shonket; Keller, Brad M.; Maidment, Andrew D. A.; Conant, Emily F.

    2015-03-01

    Many medical centers offer digital breast tomosynthesis (DBT) and 2D digital mammography acquired under the same compression (i.e., "Combo" examination) for screening. This paper compares a conventional 2D CAD algorithm (Hologic® ImageChecker® CAD v9.4) for calcification detection against a prototype 3D algorithm (Hologic® ImageChecker® 3D Calc CAD v1.0). Due to the newness of DBT, the development of this 3D CAD algorithm is ongoing, and it is currently not FDA-approved in the United States. For this study, DBT screening cases with suspicious calcifications were identified retrospectively at the University of Pennsylvania. An expert radiologist (E.F.C.) reviewed images with both 2D and DBT CAD marks, and compared the marks to biopsy results. Control cases with one-year negative follow-up were also studied; these cases either possess clearly benign calcifications or lacked calcifications. To allow the user to alter the sensitivity for cancer detection, an operating point is assigned to each CAD mark. As expected from conventional 2D CAD, increasing the operating point in 3D CAD increases sensitivity and reduces specificity. Additionally, we showed that some cancers are occult to 2D CAD at all operating points. By contrast, 3D CAD allows for detection of some cancers that are missed on 2D CAD. We also demonstrated that some non-cancerous CAD marks in 3D are not present at analogous locations in the 2D image. Hence, there are additional marks when using both 2D and 3D CAD in combination, leading to lower specificity than with conventional 2D CAD alone.

  10. Application of 2D Non-Graphene Materials and 2D Oxide Nanostructures for Biosensing Technology

    PubMed Central

    Shavanova, Kateryna; Bakakina, Yulia; Burkova, Inna; Shtepliuk, Ivan; Viter, Roman; Ubelis, Arnolds; Beni, Valerio; Starodub, Nickolaj; Yakimova, Rositsa; Khranovskyy, Volodymyr

    2016-01-01

    The discovery of graphene and its unique properties has inspired researchers to try to invent other two-dimensional (2D) materials. After considerable research effort, a distinct “beyond graphene” domain has been established, comprising the library of non-graphene 2D materials. It is significant that some 2D non-graphene materials possess solid advantages over their predecessor, such as having a direct band gap, and therefore are highly promising for a number of applications. These applications are not limited to nano- and opto-electronics, but have a strong potential in biosensing technologies, as one example. However, since most of the 2D non-graphene materials have been newly discovered, most of the research efforts are concentrated on material synthesis and the investigation of the properties of the material. Applications of 2D non-graphene materials are still at the embryonic stage, and the integration of 2D non-graphene materials into devices is scarcely reported. However, in recent years, numerous reports have blossomed about 2D material-based biosensors, evidencing the growing potential of 2D non-graphene materials for biosensing applications. This review highlights the recent progress in research on the potential of using 2D non-graphene materials and similar oxide nanostructures for different types of biosensors (optical and electrochemical). A wide range of biological targets, such as glucose, dopamine, cortisol, DNA, IgG, bisphenol, ascorbic acid, cytochrome and estradiol, has been reported to be successfully detected by biosensors with transducers made of 2D non-graphene materials. PMID:26861346

  11. Application of 2D Non-Graphene Materials and 2D Oxide Nanostructures for Biosensing Technology.

    PubMed

    Shavanova, Kateryna; Bakakina, Yulia; Burkova, Inna; Shtepliuk, Ivan; Viter, Roman; Ubelis, Arnolds; Beni, Valerio; Starodub, Nickolaj; Yakimova, Rositsa; Khranovskyy, Volodymyr

    2016-01-01

    The discovery of graphene and its unique properties has inspired researchers to try to invent other two-dimensional (2D) materials. After considerable research effort, a distinct "beyond graphene" domain has been established, comprising the library of non-graphene 2D materials. It is significant that some 2D non-graphene materials possess solid advantages over their predecessor, such as having a direct band gap, and therefore are highly promising for a number of applications. These applications are not limited to nano- and opto-electronics, but have a strong potential in biosensing technologies, as one example. However, since most of the 2D non-graphene materials have been newly discovered, most of the research efforts are concentrated on material synthesis and the investigation of the properties of the material. Applications of 2D non-graphene materials are still at the embryonic stage, and the integration of 2D non-graphene materials into devices is scarcely reported. However, in recent years, numerous reports have blossomed about 2D material-based biosensors, evidencing the growing potential of 2D non-graphene materials for biosensing applications. This review highlights the recent progress in research on the potential of using 2D non-graphene materials and similar oxide nanostructures for different types of biosensors (optical and electrochemical). A wide range of biological targets, such as glucose, dopamine, cortisol, DNA, IgG, bisphenol, ascorbic acid, cytochrome and estradiol, has been reported to be successfully detected by biosensors with transducers made of 2D non-graphene materials. PMID:26861346

  12. Contributions of the Computer-Administered Neuropsychological Screen for Mild Cognitive Impairment (CANS-MCI) for the diagnosis of MCI in Brazil.

    PubMed

    Memória, Cláudia M; Yassuda, Mônica S; Nakano, Eduardo Y; Forlenza, Orestes V

    2014-05-01

    Background: The Computer-Administered Neuropsychological Screen for Mild Cognitive Impairment (CANS-MCI) is a computer-based cognitive screening instrument that involves automated administration and scoring and immediate analyses of test sessions. The objective of this study was to translate and culturally adapt the Brazilian Portuguese version of the CANS-MCI (CANS-MCI-BR) and to evaluate its reliability and validity for the diagnostic screening of MCI and dementia due to Alzheimer's disease. Methods: The test was administered to 97 older adults (mean age 73.41 ± 5.27 years) with at least four years of formal education (mean education 12.23 ± 4.48 years). Participants were classified into three diagnostic groups according to global cognitive status (normal controls, n = 41; MCI, n = 35; AD, n = 21) based on clinical data and formal neuropsychological assessments. Results: The results indicated high internal consistency (Cronbach's α = 0.77) in the total sample. Three-month test-retest reliability correlations were significant and robust (0.875; p < 0.001). A moderate level of concurrent validity was attained relative to the screening test for MCI (MoCA test, r = 0.76, p < 0.001). Confirmatory factor analysis supported the three-factor model of the original test, i.e., memory, language/spatial fluency, and executive function/mental control. Goodness of fit indicators were strong (Bentler Comparative Fit Index = 0.96, Root Mean Square Error of Approximation = 0.09). Receiver operating characteristic curve analyses suggested high sensitivity and specificity (81% and 73% respectively) to screen for possible MCI cases. Conclusions: The CANS-MCI-BR maintains adequate psychometric characteristics that render it suitable for identifying elderly adults with probable cognitive impairment for whom a more extensive evaluation with formal neuropsychological tests may be required. PMID:24806666
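
    For reference, internal consistency of the kind reported above (Cronbach's α) can be computed from an items-by-participants score matrix as sketched below; the score matrix is fabricated and is not CANS-MCI data.

```python
import numpy as np

# Cronbach's alpha: alpha = k/(k-1) * (1 - sum(item variances)/variance of totals).
# Rows are participants, columns are test items; all numbers are illustrative.
scores = np.array([[3, 4, 3, 5],
                   [2, 2, 3, 3],
                   [4, 5, 4, 5],
                   [1, 2, 2, 2],
                   [3, 3, 4, 4]], dtype=float)

k = scores.shape[1]
item_var = scores.var(axis=0, ddof=1).sum()
total_var = scores.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1.0 - item_var / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```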

  13. Computational Analysis and In silico Predictive Modeling for Inhibitors of PhoP Regulon in S. typhi on High-Throughput Screening Bioassay Dataset.

    PubMed

    Kaur, Harleen; Ahmad, Mohd; Scaria, Vinod

    2016-03-01

    Multidrug-resistant Salmonella enterica serotype typhi is emerging in pandemic proportions throughout the world, and there is therefore a need to speed up the discovery of novel molecules that have different modes of action, are less affected by resistance formation, and could be used as drugs for the treatment of salmonellosis, particularly typhoid fever. The PhoP regulon is well studied and has now been shown to be a critical regulator of a number of genes required for the intracellular survival of S. enterica and for the pathophysiology of diseases such as typhoid. The evident roles of the two-component PhoP/PhoQ-regulated products in Salmonella virulence have motivated attempts to target them therapeutically. Although the discovery of biologically active compounds for the treatment of typhoid relies on hit-finding procedures, using high-throughput screening technology alone is very expensive as well as time consuming when performed on large scales. With recent advances in combinatorial chemistry and contemporary compound-synthesis techniques, more and more compounds are available, giving ample growth of diverse compound libraries, but the time and effort required to screen these unfocused, massive, and diverse libraries have been only slightly reduced in past years. Hence, there is a demand to improve hit quality and the success rate of high-throughput screening, which requires compound libraries focused and biased toward the particular target. Therefore, we still need an advantageous and expedient method to prioritize the molecules that will be used in biological screens, which saves time and is also inexpensive. In this context, in silico methods such as machine learning are widely applicable techniques used to build computational models for high-throughput virtual screens that prioritize molecules for further study. Furthermore, in the computational analysis, we extended our study to identify the common enriched
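
    The machine-learning idea described here, training a model on HTS outcomes and using it to rank untested molecules, can be sketched as below. The random "fingerprints" and labels are synthetic stand-ins; this is not the authors' pipeline, feature set, or dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Sketch of ML-based virtual screening: fit a classifier to active/inactive HTS
# labels, then rank new molecules by predicted probability of activity.
# Features are random stand-ins for molecular fingerprints.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(2000, 128)).astype(float)                  # fake fingerprints
y = (X[:, :8].sum(axis=1) + rng.normal(0, 1, 2000) > 5).astype(int)     # fake activity labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

scores = model.predict_proba(X_test)[:, 1]             # probability of being active
print("held-out ROC AUC:", round(roc_auc_score(y_test, scores), 3))
top10 = np.argsort(-scores)[:10]                        # molecules to prioritize for assay
print("indices of top-ranked candidates:", top10)
```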

  14. Design Application Translates 2-D Graphics to 3-D Surfaces

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Fabric Images Inc., specializing in the printing and manufacturing of fabric tension architecture for the retail, museum, and exhibit/tradeshow communities, designed software to translate 2-D graphics for 3-D surfaces prior to print production. Fabric Images' fabric-flattening design process models a 3-D surface based on computer-aided design (CAD) specifications. The surface geometry of the model is used to form a 2-D template, similar to a flattening process developed by NASA's Glenn Research Center. This template or pattern is then applied in the development of a 2-D graphic layout. Benefits of this process include 11.5 percent time savings per project, less material wasted, and the ability to improve upon graphic techniques and offer new design services. Partners include Exhibitgroup/Giltspur (end-user client: TAC Air, a division of Truman Arnold Companies Inc.), Jack Morton Worldwide (end-user client: Nickelodeon), as well as 3D Exhibits Inc., and MG Design Associates Corp.

  15. Rapid Computer Aided Ligand Design and Screening of Precious Metal Extractants from TRUEX Raffinate with Experimental Validation

    SciTech Connect

    Clark, Aurora Sue; Wall, Nathalie; Benny, Paul

    2015-11-16

    through the design of a software program that uses state-of-the-art computational combinatorial chemistry, and is developed and validated with experimental data acquisition; the resulting tool allows for rapid design and screening of new ligands for the extraction of precious metals from SNF. This document describes the software that has been produced, ligands that have been designed, and fundamental new understandings of the extraction process of Rh(III) as a function of solution phase conditions (pH, nature of acid, etc.).

  16. Stochastic Inversion of 2D Magnetotelluric Data

    2010-07-01

    The algorithm is developed to invert 2D magnetotelluric (MT) data based on sharp boundary parametrization using a Bayesian framework. Within the algorithm, we consider the locations and the resistivity of regions formed by the interfaces as unknowns. We use a parallel, adaptive finite-element algorithm to forward simulate frequency-domain MT responses of 2D conductivity structure. Those unknown parameters are spatially correlated and are described by a geostatistical model. The joint posterior probability distribution function is explored by Markov Chain Monte Carlo (MCMC) sampling methods. The developed stochastic model is effective for estimating the interface locations and resistivity. Most importantly, it provides detailed uncertainty information on each unknown parameter. Hardware requirements: PC, supercomputer, multi-platform, workstation. Software requirements: C and Fortran. Operating systems/version: Linux/Unix or Windows
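
    A toy illustration of the MCMC machinery (not the actual inversion): a Metropolis-Hastings sampler for a single log-resistivity parameter with a trivial forward model and synthetic data. The real algorithm forward-models 2D MT responses with an adaptive finite-element solver and samples many spatially correlated parameters.

```python
import numpy as np

# Toy Metropolis-Hastings sampler for one unknown (a log10 resistivity value)
# against synthetic noisy observations. Proposal width, noise level, and prior
# bounds are arbitrary illustrative choices.
rng = np.random.default_rng(1)
true_log_rho = 2.0
data = true_log_rho + rng.normal(0.0, 0.1, size=20)    # synthetic observations

def log_posterior(m):
    if not 0.0 <= m <= 4.0:                            # flat prior on [0, 4]
        return -np.inf
    return -0.5 * np.sum((data - m) ** 2) / 0.1**2     # Gaussian likelihood

samples, m = [], 1.0
for _ in range(5000):
    proposal = m + rng.normal(0.0, 0.05)
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(m):
        m = proposal                                   # accept the move
    samples.append(m)

post = np.array(samples[1000:])                        # discard burn-in
print(f"posterior mean = {post.mean():.2f}, std = {post.std():.3f}")
```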

  17. Stochastic Inversion of 2D Magnetotelluric Data

    SciTech Connect

    Chen, Jinsong

    2010-07-01

    The algorithm is developed to invert 2D magnetotelluric (MT) data based on sharp boundary parametrization using a Bayesian framework. Within the algorithm, we consider the locations and the resistivity of regions formed by the interfaces as unknowns. We use a parallel, adaptive finite-element algorithm to forward simulate frequency-domain MT responses of 2D conductivity structure. Those unknown parameters are spatially correlated and are described by a geostatistical model. The joint posterior probability distribution function is explored by Markov Chain Monte Carlo (MCMC) sampling methods. The developed stochastic model is effective for estimating the interface locations and resistivity. Most importantly, it provides detailed uncertainty information on each unknown parameter. Hardware requirements: PC, supercomputer, multi-platform, workstation. Software requirements: C and Fortran. Operating systems/version: Linux/Unix or Windows

  18. Explicit 2-D Hydrodynamic FEM Program

    1996-08-07

    DYNA2D* is a vectorized, explicit, two-dimensional, axisymmetric and plane strain finite element program for analyzing the large deformation dynamic and hydrodynamic response of inelastic solids. DYNA2D* contains 13 material models and 9 equations of state (EOS) to cover a wide range of material behavior. The material models implemented in all machine versions are: elastic, orthotropic elastic, kinematic/isotropic elastic plasticity, thermoelastoplastic, soil and crushable foam, linear viscoelastic, rubber, high explosive burn, isotropic elastic-plastic, temperature-dependent elastic-plastic. The isotropic and temperature-dependent elastic-plastic models determine only the deviatoric stresses. Pressure is determined by one of 9 equations of state including linear polynomial, JWL high explosive, Sack Tuesday high explosive, Gruneisen, ratio of polynomials, linear polynomial with energy deposition, ignition and growth of reaction in HE, tabulated compaction, and tabulated.

  19. Schottky diodes from 2D germanane

    NASA Astrophysics Data System (ADS)

    Sahoo, Nanda Gopal; Esteves, Richard J.; Punetha, Vinay Deep; Pestov, Dmitry; Arachchige, Indika U.; McLeskey, James T.

    2016-07-01

    We report on the fabrication and characterization of a Schottky diode made using 2D germanane (hydrogenated germanene). When compared to germanium, the 2D structure has higher electron mobility, an optimal band-gap, and exceptional stability making germanane an outstanding candidate for a variety of opto-electronic devices. One-atom-thick sheets of hydrogenated puckered germanium atoms have been synthesized from a CaGe2 framework via intercalation and characterized by XRD, Raman, and FTIR techniques. The material was then used to fabricate Schottky diodes by suspending the germanane in benzonitrile and drop-casting it onto interdigitated metal electrodes. The devices demonstrate significant rectifying behavior and the outstanding potential of this material.

  20. Numerical Simulation of Supersonic Compression Corners and Hypersonic Inlet Flows Using the RPLUS2D Code

    NASA Technical Reports Server (NTRS)

    Kapoor, Kamlesh; Anderson, Bernhard H.; Shaw, Robert J.

    1994-01-01

    A two-dimensional computational code, RPLUS2D, which was developed for the reactive propulsive flows of ramjets and scramjets, was validated for two-dimensional shock-wave/turbulent-boundary-layer interactions. The problem of compression corners at supersonic speeds was solved using the RPLUS2D code. To validate the RPLUS2D code for hypersonic speeds, it was applied to a realistic hypersonic inlet geometry. Both the Baldwin-Lomax and the Chien two-equation turbulence models were used. Computational results showed that the RPLUS2D code compared very well with experimentally obtained data for supersonic compression corner flows, except in the case of large separated flows resulting from the interactions between the shock wave and turbulent boundary layer. The computational results compared well with the experimental results in a hypersonic NASA P8 inlet case, with the Chien two-equation turbulence model performing better than the Baldwin-Lomax model.

  1. VECTUM. Irregular 2D Velocity Vector Field Plotting Package

    SciTech Connect

    McClurg, F.R.; Mousseau, V.A.

    1992-05-04

    VECTUM is an NCAR Graphics-based package for generating plots of an irregular 2D velocity vector field. The program reads an ASCII database of x, y, u, v data pairs and produces a plot in Computer Graphics Metafile (CGM) format. The program also uses an ASCII parameter file for controlling annotation details such as the plot title, arrowhead style, scale of vectors, windowing, etc. Simple geometry (i.e. lines, arcs, splines) can be defined to be included with the velocity vectors. NCAR Graphics drivers can be used to convert the CGM file to PostScript, HPGL, HDF, and other output formats.
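
    The same task can be sketched in Python with matplotlib instead of NCAR Graphics/CGM; VECTUM reads its x, y, u, v values from an ASCII file, whereas the example below generates random vector locations so it is self-contained.

```python
import numpy as np
import matplotlib.pyplot as plt

# Plot an irregular 2D velocity vector field as arrows. The field below is
# synthetic; in VECTUM the x, y, u, v values come from an ASCII data file.
rng = np.random.default_rng(0)
x, y = rng.uniform(0.0, 10.0, size=(2, 300))     # irregularly scattered locations
u, v = np.cos(y), np.sin(x)                      # a smooth example velocity field

fig, ax = plt.subplots(figsize=(6, 6))
ax.quiver(x, y, u, v, angles="xy", width=0.003)
ax.set_title("Irregular 2D velocity vector field")
ax.set_xlabel("x")
ax.set_ylabel("y")
ax.set_aspect("equal")
fig.savefig("vectors.png", dpi=150)              # hypothetical output file name
```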

  2. A parallel splitting wavelet method for 2D conservation laws

    NASA Astrophysics Data System (ADS)

    Schmidt, Alex A.; Kozakevicius, Alice J.; Jakobsson, Stefan

    2016-06-01

    The current work presents a parallel formulation using the MPI protocol for an adaptive high order finite difference scheme to solve 2D conservation laws. Adaptivity is achieved at each time iteration by the application of an interpolating wavelet transform in each space dimension. High order approximations for the numerical fluxes are computed by ENO and WENO schemes. Since time evolution is made by a TVD Runge-Kutta space splitting scheme, the problem is naturally suitable for parallelization. Numerical simulations and speedup results are presented for Euler equations in gas dynamics problems.

  3. Real-time SPECT and 2D ultrasound image registration.

    PubMed

    Bucki, Marek; Chassat, Fabrice; Galdames, Francisco; Asahi, Takeshi; Pizarro, Daniel; Lobo, Gabriel

    2007-01-01

    In this paper we present a technique for fully automatic, real-time 3D SPECT (Single Photon Emission Computed Tomography) and 2D ultrasound image registration. We use this technique in the context of kidney lesion diagnosis. Our registration algorithm allows a physician to perform an ultrasound exam after a SPECT image has been acquired and see in real time the registration of both modalities. An automatic segmentation algorithm has been implemented in order to display in 3D the positions of the acquired US images with respect to the organs. PMID:18044572

  4. Layer Engineering of 2D Semiconductor Junctions.

    PubMed

    He, Yongmin; Sobhani, Ali; Lei, Sidong; Zhang, Zhuhua; Gong, Yongji; Jin, Zehua; Zhou, Wu; Yang, Yingchao; Zhang, Yuan; Wang, Xifan; Yakobson, Boris; Vajtai, Robert; Halas, Naomi J; Li, Bo; Xie, Erqing; Ajayan, Pulickel

    2016-07-01

    A new concept for junction fabrication by connecting multiple regions with varying layer thicknesses, based on the thickness dependence, is demonstrated. This type of junction is only possible in super-thin-layered 2D materials, and exhibits similar characteristics as p-n junctions. Rectification and photovoltaic effects are observed in chemically homogeneous MoSe2 junctions between domains of different thicknesses. PMID:27136275

  5. 2dF mechanical engineering

    NASA Astrophysics Data System (ADS)

    Smith, Greg; Lankshear, Allan

    1998-07-01

    2dF is a multi-object instrument mounted at prime focus at the AAT capable of spectroscopic analysis of 400 objects in a single 2 degree field. It also prepares a second 2 degree 400 object field while the first field is being observed. At its heart is a high precision robotic positioner that places individual fiber end magnetic buttons on one of two field plates. The button gripper is carried on orthogonal gantries powered by linear synchronous motors and contains a TV camera which precisely locates backlit buttons to allow placement in user defined locations to 10 µm accuracy. Fiducial points on both plates can also be observed by the camera to allow repeated checks on positioning accuracy. Field plates rotate to follow apparent sky rotation. The spectrographs both analyze light from the 200 observing fibers each and back-illuminate the 400 fibers being re-positioned during the observing run. The 2dF fiber position and spectrograph system is a large and complex instrument located at the prime focus of the Anglo Australian Telescope. The mechanical design has departed somewhat from the earlier concepts of Gray et al, but still reflects the audacity of those first ideas. The positioner is capable of positioning 400 fibers on a field plate while another 400 fibers on another plate are observing at the focus of the telescope and feeding the twin spectrographs. When first proposed it must have seemed like ingenuity unfettered by caution. Yet now it works, and works wonderfully well. 2dF is a system which functions as the result of the combined and coordinated efforts of the astronomers, the mechanical designers and tradespeople, the electronic designers, the programmers, the support staff at the telescope, and the manufacturing subcontractors. The mechanical design of the 2dF positioner and spectrographs was carried out by the mechanical engineering staff of the AAO and the majority of the manufacture was carried out in the AAO workshops.

  6. Compact 2-D graphical representation of DNA

    NASA Astrophysics Data System (ADS)

    Randić, Milan; Vračko, Marjan; Zupan, Jure; Novič, Marjana

    2003-05-01

    We present a novel 2-D graphical representation for DNA sequences which has an important advantage over the existing graphical representations of DNA in being very compact. It is based on: (1) use of binary labels for the four nucleic acid bases, and (2) use of the 'worm' curve as template on which binary codes are placed. The approach is illustrated on DNA sequences of the first exon of human β-globin and gorilla β-globin.
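
    The flavor of such an encoding can be sketched as follows; the 2-bit labels and the simple zig-zag placement below are arbitrary illustrative choices, not the paper's actual label assignment or 'worm' curve template.

```python
# Illustrative binary-label DNA encoding placed along a zig-zag path.
# The 2-bit codes are hypothetical; the paper defines its own labels and curve.
CODES = {"A": (0, 0), "G": (0, 1), "C": (1, 0), "T": (1, 1)}

def encode(seq, width=8):
    """Lay the binary code of each base onto a boustrophedon (zig-zag) grid."""
    bits = [b for base in seq.upper() for b in CODES[base]]
    grid = []
    for start in range(0, len(bits), width):
        row = bits[start:start + width]
        if (start // width) % 2 == 1:
            row = row[::-1]                      # reverse every other row
        grid.append(row)
    return grid

for row in encode("ATGGTGCACCTG"):               # arbitrary example sequence
    print(" ".join(str(b) for b in row))
```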

  7. 2D materials: Graphene and others

    NASA Astrophysics Data System (ADS)

    Bansal, Suneev Anil; Singh, Amrinder Pal; Kumar, Suresh

    2016-05-01

    The present report reviews recent advancements in new atomically thin 2D materials. Materials covered in this review are Graphene, Silicene, Germanene, Boron Nitride (BN) and Transition metal chalcogenides (TMC). These materials show extraordinary mechanical, electronic and optical properties which make them suitable candidates for future applications. Apart from these unique properties, the tunability of the highly desirable properties of these materials is also an important area of emphasis.

  8. TACO (2D AND 3D). Taco

    SciTech Connect

    Mason, W.E.

    1983-03-01

    A set of finite element codes for the solution of nonlinear, two-dimensional (TACO2D) and three-dimensional (TACO3D) heat transfer problems. Performs linear and nonlinear analyses of both transient and steady state heat transfer problems. Has the capability to handle time or temperature dependent material properties. Materials may be either isotropic or orthotropic. A variety of time and temperature dependent boundary conditions and loadings are available including temperature, flux, convection, radiation, and internal heat generation.

  9. Tomosynthesis imaging with 2D scanning trajectories

    NASA Astrophysics Data System (ADS)

    Khare, Kedar; Claus, Bernhard E. H.; Eberhard, Jeffrey W.

    2011-03-01

    Tomosynthesis imaging in chest radiography provides volumetric information with the potential for improved diagnostic value when compared to the standard AP or LAT projections. In this paper we explore the image quality benefits of 2D scanning trajectories when coupled with advanced image reconstruction approaches. It is intuitively clear that 2D trajectories provide projection data that is more complete in terms of Radon space filling, when compared with conventional tomosynthesis using a linearly scanned source. Incorporating this additional information for obtaining improved image quality is, however, not a straightforward problem. The typical tomosynthesis reconstruction algorithms are based on direct inversion methods e.g. Filtered Backprojection (FBP) or iterative algorithms that are variants of the Algebraic Reconstruction Technique (ART). The FBP approach is fast and provides high frequency details in the image but at the same time introduces streaking artifacts degrading the image quality. The iterative methods can reduce the image artifacts by using image priors but suffer from a slow convergence rate, thereby producing images lacking high frequency details. In this paper we propose using a fast converging optimal gradient iterative scheme that has advantages of both the FBP and iterative methods in that it produces images with high frequency details while reducing the image artifacts. We show that using favorable 2D scanning trajectories along with the proposed reconstruction method has the advantage of providing improved depth information for structures such as the spine and potentially producing images with more isotropic resolution.
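
    A minimal sketch of a fast-converging "optimal gradient" (Nesterov-accelerated) iteration for a linear least-squares reconstruction problem is given below; the tomosynthesis system matrix, priors, and regularization of the actual method are not reproduced, and A and b are random stand-ins.

```python
import numpy as np

# Nesterov-accelerated gradient descent for min_x ||A x - b||^2, illustrating
# the "optimal gradient iterative scheme" idea with random stand-in data.
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 100))
x_true = rng.normal(size=100)
b = A @ x_true + 0.01 * rng.normal(size=200)

L = np.linalg.norm(A, 2) ** 2                    # Lipschitz constant of the gradient
x = z = np.zeros(100)
t = 1.0
for _ in range(200):
    grad = A.T @ (A @ z - b)
    x_new = z - grad / L                         # gradient step from the momentum point
    t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    z = x_new + (t - 1.0) / t_new * (x_new - x)  # Nesterov momentum update
    x, t = x_new, t_new

err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print("relative reconstruction error:", round(err, 4))
```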

  10. Engineering light outcoupling in 2D materials.

    PubMed

    Lien, Der-Hsien; Kang, Jeong Seuk; Amani, Matin; Chen, Kevin; Tosun, Mahmut; Wang, Hsin-Ping; Roy, Tania; Eggleston, Michael S; Wu, Ming C; Dubey, Madan; Lee, Si-Chen; He, Jr-Hau; Javey, Ali

    2015-02-11

    When light is incident on 2D transition metal dichalcogenides (TMDCs), it engages in multiple reflections within underlying substrates, producing interferences that lead to enhancement or attenuation of the incoming and outgoing strength of light. Here, we report a simple method to engineer the light outcoupling in semiconducting TMDCs by modulating their dielectric surroundings. We show that by modulating the thicknesses of underlying substrates and capping layers, the interference caused by substrate can significantly enhance the light absorption and emission of WSe2, resulting in a ∼11 times increase in Raman signal and a ∼30 times increase in the photoluminescence (PL) intensity of WSe2. On the basis of the interference model, we also propose a strategy to control the photonic and optoelectronic properties of thin-layer WSe2. This work demonstrates the utilization of outcoupling engineering in 2D materials and offers a new route toward the realization of novel optoelectronic devices, such as 2D LEDs and solar cells. PMID:25602462

  11. Fully automated 2D-3D registration and verification.

    PubMed

    Varnavas, Andreas; Carrell, Tom; Penney, Graeme

    2015-12-01

    Clinical application of 2D-3D registration technology often requires a significant amount of human interaction during initialisation and result verification. This is one of the main barriers to more widespread clinical use of this technology. We propose novel techniques for automated initial pose estimation of the 3D data and verification of the registration result, and show how these techniques can be combined to enable fully automated 2D-3D registration, particularly in the case of a vertebra based system. The initialisation method is based on preoperative computation of 2D templates over a wide range of 3D poses. These templates are used to apply the Generalised Hough Transform to the intraoperative 2D image and the sought 3D pose is selected with the combined use of the generated accumulator arrays and a Gradient Difference Similarity Measure. On the verification side, two algorithms are proposed: one using normalised features based on the similarity value and the other based on the pose agreement between multiple vertebra based registrations. The proposed methods are employed here for CT to fluoroscopy registration and are trained and tested with data from 31 clinical procedures with 417 low dose, i.e. low quality, high noise interventional fluoroscopy images. When similarity value based verification is used, the fully automated system achieves a 95.73% correct registration rate, whereas a no registration result is produced for the remaining 4.27% of cases (i.e. incorrect registration rate is 0%). The system also automatically detects input images outside its operating range. PMID:26387052

  12. Computational study of the transition state for H2 addition to Vaska-type complexes (trans-Ir(L)2(CO)X). Substituent effects on the energy barrier and the origin of the small H2/D2 kinetic isotope effect

    SciTech Connect

    Abu-Hasanayn, F.; Goldman, A.S.; Krogh-Jespersen, K. )

    1993-06-03

    Ab initio molecular orbital methods have been used to study transition state properties for the concerted addition reaction of H2 to Vaska-type complexes, trans-Ir(L)2(CO)X, 1 (L = PH3 and X = F, Cl, Br, I, CN, or H; L = NH3 and X = Cl). Stationary points on the reaction path retaining the trans-L2 arrangement were located at the Hartree-Fock level using relativistic effective core potentials and valence basis sets of double-ζ quality. The identities of the stationary points were confirmed by normal mode analysis. Activation energy barriers were calculated with electron correlation effects included via Møller-Plesset perturbation theory carried fully through fourth order, MP4(SDTQ). The more reactive complexes feature structurally earlier transition states and larger reaction exothermicities, in accord with the Hammond postulate. The experimentally observed increase in reactivity of Ir(PPh3)2(CO)X complexes toward H2 addition upon going from X = F to X = I is reproduced well by the calculations and is interpreted to be a consequence of diminished halide-to-Ir π-donation by the heavier halogens. Computed activation barriers (L = PH3) range from 6.1 kcal/mol (X = H) to 21.4 kcal/mol (X = F). Replacing PH3 by NH3 when X = Cl increases the barrier from 14.1 to 19.9 kcal/mol. Using conventional transition state theory, the kinetic isotope effects for H2/D2 addition are computed to lie between 1.1 and 1.7 with larger values corresponding to earlier transition states. Judging from the computational data presented here, tunneling appears to be unimportant for H2 addition to these iridium complexes. 51 refs., 4 tabs.

  13. 2D superconductivity by ionic gating

    NASA Astrophysics Data System (ADS)

    Iwasa, Yoshi

    2D superconductivity is attracting renewed interest due to the discoveries of new, highly crystalline 2D superconductors in the past decade. Superconductivity at oxide interfaces, triggered by LaAlO3/SrTiO3, has become one of the promising routes for the creation of new 2D superconductors. The MBE-grown metallic monolayers, including FeSe, also offer a new platform of 2D superconductors. In the last two years, a variety of monolayer/bilayer superconductors fabricated by CVD or mechanical exfoliation have appeared. Among these, electric field induced superconductivity by the electric double layer transistor (EDLT) is a unique platform of 2D superconductivity, because of its ability of high density charge accumulation, and also because of its versatility in terms of materials, stemming from oxides to organics and layered chalcogenides. In this presentation, the following issues of electric field induced superconductivity will be addressed: (1) Tunable carrier density, (2) Weak pinning, (3) Absence of inversion symmetry. (1) Since the sheet carrier density is quasi-continuously tunable from 0 to the order of 10^14 cm^-2, one is able to establish an electronic phase diagram of superconductivity, which will be compared with that of bulk superconductors. (2) The thickness of the superconductivity can be estimated as 2 - 10 nm, depending on the material, and is much smaller than the in-plane coherence length. Such a thin layer with low normal-state resistance results in extremely weak pinning, beyond the dirty boson model of amorphous metallic films. (3) Due to the electric field, inversion symmetry is inherently broken in EDLT. This feature appears as the enhancement of the Pauli limit of the upper critical field for in-plane magnetic fields. In a transition metal dichalcogenide with substantial spin-orbit interaction, we were able to confirm the stabilization of the Cooper pair due to spin-valley locking. This work has been supported by Grant-in-Aid for Specially

  14. A survey and task-based quality assessment of static 2D colormaps

    NASA Astrophysics Data System (ADS)

    Bernard, Jürgen; Steiger, Martin; Mittelstädt, Sebastian; Thum, Simon; Keim, Daniel; Kohlhammer, Jörn

    2015-01-01

    Color is one of the most important visual variables since it can be combined with any other visual mapping to encode information without using additional space on the display. Encoding one or two dimensions with color is widely explored and discussed in the field. Also, mapping multi-dimensional data to color is applied in a vast number of applications, either to indicate similar elements, or to discriminate between different elements or (multi-dimensional) structures on the screen. A variety of 2D colormaps exists in the literature, covering a large variance with respect to different perceptual aspects. Many of the colormaps have a different perspective on the underlying data structure as a consequence of the various analysis tasks that exist for multivariate data. Thus, a large design space for 2D colormaps exists, which makes the development and use of 2D colormaps cumbersome. According to our literature research, 2D colormaps have not been the subject of in-depth quality assessment. Therefore, we present a survey of static 2D colormaps as applied in information visualization and related fields. In addition, we map seven devised quality assessment measures for 2D colormaps to seven relevant tasks for multivariate data analysis. Finally, we present the quality assessment results of the 2D colormaps with respect to the seven analysis tasks, and contribute guidelines about which colormaps to select or create for each analysis task.
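
    One of the simplest 2D colormap constructions discussed in such surveys is bilinear blending of four corner colors; a sketch with arbitrarily chosen corner colors follows.

```python
import numpy as np

# A simple static 2D colormap: bilinear interpolation between four corner RGB
# colors. The corner colors are arbitrary choices for illustration.
corners = {
    (0, 0): np.array([0.10, 0.10, 0.60]),   # bottom-left: dark blue
    (1, 0): np.array([0.90, 0.85, 0.10]),   # bottom-right: yellow
    (0, 1): np.array([0.10, 0.70, 0.30]),   # top-left: green
    (1, 1): np.array([0.85, 0.15, 0.15]),   # top-right: red
}

def colormap_2d(u, v):
    """Map two values in [0, 1] to an RGB triple by bilinear corner blending."""
    return ((1 - u) * (1 - v) * corners[(0, 0)] + u * (1 - v) * corners[(1, 0)]
            + (1 - u) * v * corners[(0, 1)] + u * v * corners[(1, 1)])

# Build a small lookup image covering the full (u, v) domain.
uu, vv = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
image = np.array([colormap_2d(u, v) for u, v in zip(uu.ravel(), vv.ravel())])
print("colormap lookup image shape:", image.reshape(64, 64, 3).shape)
```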

  15. GBL-2D Version 1.0: a 2D geometry boolean library.

    SciTech Connect

    McBride, Cory L. (Elemental Technologies, American Fort, UT); Schmidt, Rodney Cannon; Yarberry, Victor R.; Meyers, Ray J.

    2006-11-01

    This report describes version 1.0 of GBL-2D, a geometric Boolean library for 2D objects. The library is written in C++ and consists of a set of classes and routines. The classes primarily represent geometric data and relationships. Classes are provided for 2D points, lines, arcs, edge uses, loops, surfaces and mask sets. The routines contain algorithms for geometric Boolean operations and utility functions. Routines are provided that incorporate the Boolean operations: Union(OR), XOR, Intersection and Difference. A variety of additional analytical geometry routines and routines for importing and exporting the data in various file formats are also provided. The GBL-2D library was originally developed as a geometric modeling engine for use with a separate software tool, called SummitView [1], that manipulates the 2D mask sets created by designers of Micro-Electro-Mechanical Systems (MEMS). However, many other practical applications for this type of software can be envisioned because the need to perform 2D Boolean operations can arise in many contexts.
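
    The four Boolean operations the library provides can be illustrated with the Python shapely package (shapely is not GBL-2D; it is used here only to demonstrate the operations on two overlapping squares).

```python
from shapely.geometry import Polygon

# Union (OR), intersection (AND), difference, and symmetric difference (XOR)
# of two overlapping 2x2 squares. GBL-2D itself is a C++ library; shapely only
# illustrates the operations it provides.
a = Polygon([(0, 0), (2, 0), (2, 2), (0, 2)])
b = Polygon([(1, 1), (3, 1), (3, 3), (1, 3)])

print("union area:       ", a.union(b).area)                  # 7.0
print("intersection area:", a.intersection(b).area)           # 1.0
print("difference area:  ", a.difference(b).area)             # 3.0
print("xor area:         ", a.symmetric_difference(b).area)   # 6.0
```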

  16. Predicting non-square 2D dice probabilities

    NASA Astrophysics Data System (ADS)

    Pender, G. A. T.; Uhrin, M.

    2014-07-01

    The prediction of the final state probabilities of a general cuboid randomly thrown onto a surface is a problem that naturally arises in the minds of men and women familiar with regular cubic dice and the basic concepts of probability. Indeed, it was considered by Newton in 1664 (Newton 1967 The Mathematical Papers of Isaac Newton vol I (Cambridge: Cambridge University Press) pp 60-1). In this paper we make progress on the 2D problem (which can be realized in 3D by considering a long cuboid, or alternatively a rectangular cross-sectioned dreidel). For the two-dimensional case we suggest that the ratio of the probabilities of landing on each of the two sides is given by \frac{\sqrt{k^{2}+l^{2}}-k}{\sqrt{k^{2}+l^{2}}-l}\,\frac{\arctan(l/k)}{\arctan(k/l)}, where k and l are the lengths of the two sides. We test this theory both experimentally and computationally, and find good agreement between our theory, experimental and computational results. Our theory is known, from its derivation, to be an approximation for particularly bouncy or ‘grippy’ surfaces where the die rolls through many revolutions before settling. On real surfaces we would expect (and we observe) that the true probability ratio for a 2D die is somewhat closer to unity than predicted by our theory. This problem may also have wider relevance in the testing of physics engines.
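
    The predicted ratio is straightforward to evaluate numerically; for example, with side lengths k = 2 and l = 1 (arbitrary illustrative values):

```python
import numpy as np

# Evaluate the probability ratio predicted by the formula above.
def predicted_ratio(k, l):
    r = np.sqrt(k * k + l * l)
    return (r - k) / (r - l) * np.arctan(l / k) / np.arctan(k / l)

print("predicted ratio for k = 2, l = 1:", round(predicted_ratio(2.0, 1.0), 3))
```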

  17. Multiscale simulation of 2D elastic wave propagation

    NASA Astrophysics Data System (ADS)

    Zhang, Wensheng; Zheng, Hui

    2016-06-01

    In this paper, we develop the multiscale method for simulation of elastic wave propagation. Based on the first-order velocity-stress hyperbolic form of the 2D elastic wave equation, the particle velocities are solved first on a coarse grid by the finite volume method. Then the stress tensor is solved by using the multiscale basis functions which can represent the fine-scale variation of the wavefield on the coarse grid. The basis functions are computed by solving a local problem with the finite element method. The theoretical formulae and description of the multiscale method for the elastic wave equation are given in more detail. The numerical computations for an inhomogeneous model with random scatter are completed. The results show the effectiveness of the multiscale method.

  18. 2D quantum gravity at three loops: A counterterm investigation

    NASA Astrophysics Data System (ADS)

    Leduc, Lætitia; Bilal, Adel

    2016-02-01

    We analyze the divergences of the three-loop partition function at fixed area in 2D quantum gravity. Considering the Liouville action in the Kähler formalism, we extract the coefficient of the leading divergence ∼ AΛ²(ln AΛ²)². This coefficient is non-vanishing. We discuss the counterterms one can and must add and compute their precise contribution to the partition function. This allows us to conclude that every local and non-local divergence in the partition function can be balanced by local counterterms, with the only exception of the maximally non-local divergence (ln AΛ²)³. Yet, this latter is computed and does cancel between the different three-loop diagrams. Thus, requiring locality of the counterterms is enough to renormalize the partition function. Finally, the structure of the new counterterms strongly suggests that they can be understood as a renormalization of the measure action.

  19. Interparticle Attraction in 2D Complex Plasmas

    NASA Astrophysics Data System (ADS)

    Kompaneets, Roman; Morfill, Gregor E.; Ivlev, Alexei V.

    2016-03-01

    Complex (dusty) plasmas allow experimental studies of various physical processes occurring in classical liquids and solids by directly observing individual microparticles. A major problem is that the interaction between microparticles is generally not molecularlike. In this Letter, we propose how to achieve a molecularlike interaction potential in laboratory 2D complex plasmas. We argue that this principal aim can be achieved by using relatively small microparticles and properly adjusting discharge parameters. If experimentally confirmed, this will make it possible to employ complex plasmas as a model system with an interaction potential resembling that of conventional liquids.

  20. Periodically sheared 2D Yukawa systems

    SciTech Connect

    Kovács, Anikó Zsuzsa; Hartmann, Peter; Donkó, Zoltán

    2015-10-15

    We present non-equilibrium molecular dynamics simulation studies on the dynamic (complex) shear viscosity of a 2D Yukawa system. We have identified a non-monotonic frequency dependence of the viscosity at high frequencies and shear rates, an energy absorption maximum (local resonance) at the Einstein frequency of the system at medium shear rates, an enhanced collective wave activity, when the excitation is near the plateau frequency of the longitudinal wave dispersion, and the emergence of significant configurational anisotropy at small frequencies and high shear rates.

  1. ENERGY LANDSCAPE OF 2D FLUID FOAMS

    SciTech Connect

    Y. JIANG; ET AL

    2000-04-01

    The equilibrium states of 2D non-coarsening fluid foams, which consist of bubbles with fixed areas, correspond to local minima of the total perimeter. (1) The authors find an approximate value of the global minimum, and determine directly from an image how far a foam is from its ground state. (2) For (small) area disorder, small bubbles tend to sort inwards and large bubbles outwards. (3) Topological charges of the same sign repel while charges of opposite sign attract. (4) They discuss boundary conditions and the uniqueness of the pattern for fixed topology.

  2. A scalable 2-D parallel sparse solver

    SciTech Connect

    Kothari, S.C.; Mitra, S.

    1995-12-01

    Scalability beyond a small number of processors, typically 32 or fewer, is known to be a problem for existing parallel general sparse (PGS) direct solvers. This paper presents a PGS direct solver for general sparse linear systems on distributed memory machines. The algorithm is based on the well-known sequential sparse algorithm Y12M. To achieve efficient parallelization, a 2-D scattered decomposition of the sparse matrix is used. The proposed algorithm is more scalable than existing parallel sparse direct solvers. Its scalability is evaluated on a 256 processor nCUBE2s machine using Boeing/Harwell benchmark matrices.
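
    The data distribution behind the scalability claim, a 2D scattered (cyclic) decomposition, maps matrix entry (i, j) to process (i mod Pr, j mod Pc) on a Pr x Pc process grid; a minimal sketch of that mapping follows (the grid dimensions are arbitrary, and this is the generic idea rather than the solver's exact implementation).

```python
# 2D scattered (cyclic) decomposition: entry (i, j) is owned by process
# (i mod Pr, j mod Pc) on a Pr x Pc process grid. Grid size is illustrative.
def owner(i, j, pr=4, pc=8):
    """Return the process-grid coordinates owning matrix entry (i, j)."""
    return (i % pr, j % pc)

# Example: owners of a few nonzeros of a sparse matrix on a 4 x 8 grid.
for i, j in [(0, 0), (5, 17), (100, 3), (255, 255)]:
    print(f"entry ({i:3d}, {j:3d}) -> process {owner(i, j)}")
```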

  3. 2D stepping drive for hyperspectral systems

    NASA Astrophysics Data System (ADS)

    Endrödy, Csaba; Mehner, Hannes; Grewe, Adrian; Sinzinger, Stefan; Hoffmann, Martin

    2015-07-01

    We present the design, fabrication and characterization of a compact 2D stepping microdrive for pinhole array positioning. The miniaturized solution enables a highly integrated compact hyperspectral imaging system. Based on the geometry of the pinhole array, an inch-worm drive with electrostatic actuators was designed resulting in a compact (1 cm2) positioning system featuring a step size of about 15 µm in a 170 µm displacement range. The high payload (20 mg) as required for the pinhole array and the compact system design exceed the known electrostatic inch-worm-based microdrives.

  4. Screen time and children

    MedlinePlus

    ... screen, such as watching TV, working on a computer, or playing video games. Screen time is sedentary activity, meaning you are ... child eat while watching TV or using the computer. DO NOT leave the ... as family board games, puzzles, or going for a walk. Keep a ...

  5. TOPAZ2D heat transfer code users manual and thermal property data base

    SciTech Connect

    Shapiro, A.B.; Edwards, A.L.

    1990-05-01

    TOPAZ2D is a two dimensional implicit finite element computer code for heat transfer analysis. This user's manual provides information on the structure of a TOPAZ2D input file. Also included is a material thermal property data base. This manual is supplemented with the TOPAZ2D Theoretical Manual and the TOPAZ2D Verification Manual. TOPAZ2D has been implemented on the CRAY, SUN, and VAX computers. TOPAZ2D can be used to solve for the steady state or transient temperature field on two dimensional planar or axisymmetric geometries. Material properties may be temperature dependent and either isotropic or orthotropic. A variety of time and temperature dependent boundary conditions can be specified including temperature, flux, convection, and radiation. Time or temperature dependent internal heat generation can be defined locally by element or globally by material. TOPAZ2D can solve problems of diffuse and specular band radiation in an enclosure coupled with conduction in material surrounding the enclosure. Additional features include thermally controlled reactive chemical mixtures, thermal contact resistance across an interface, bulk fluid flow, phase change, and energy balances. Thermal stresses can be calculated using the solid mechanics code NIKE2D which reads the temperature state data calculated by TOPAZ2D. A three dimensional version of the code, TOPAZ3D is available. The material thermal property data base, Chapter 4, included in this manual was originally published in 1969 by Art Edwards for use with his TRUMP finite difference heat transfer code. The format of the data has been altered to be compatible with TOPAZ2D. Bob Bailey is responsible for adding the high explosive thermal property data.

  6. WFR-2D: an analytical model for PWAS-generated 2D ultrasonic guided wave propagation

    NASA Astrophysics Data System (ADS)

    Shen, Yanfeng; Giurgiutiu, Victor

    2014-03-01

    This paper presents WaveFormRevealer 2-D (WFR-2D), an analytical predictive tool for the simulation of 2-D ultrasonic guided wave propagation and interaction with damage. The design of structural health monitoring (SHM) systems and self-aware smart structures requires the exploration of a wide range of parameters to achieve best detection and quantification of certain types of damage. Such need for parameter exploration on sensor dimension, location, guided wave characteristics (mode type, frequency, wavelength, etc.) can be best satisfied with analytical models which are fast and efficient. The analytical model was constructed based on the exact 2-D Lamb wave solution using Bessel and Hankel functions. Damage effects were inserted in the model by considering the damage as a secondary wave source with complex-valued directivity scattering coefficients containing both amplitude and phase information from wave-damage interaction. The analytical procedure was coded with MATLAB, and a predictive simulation tool called WaveFormRevealer 2-D was developed. The wave-damage interaction coefficients (WDICs) were extracted from harmonic analysis of local finite element model (FEM) with artificial non-reflective boundaries (NRB). The WFR-2D analytical simulation results were compared and verified with full scale multiphysics finite element models and experiments with scanning laser vibrometer. First, Lamb wave propagation in a pristine aluminum plate was simulated with WFR-2D, compared with finite element results, and verified by experiments. Then, an inhomogeneity was machined into the plate to represent damage. Analytical modeling was carried out, and verified by finite element simulation and experiments. This paper finishes with conclusions and suggestions for future work.
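
    The Hankel-function building block of such an analytical 2-D solution can be evaluated directly with SciPy; an outgoing cylindrical wave from a point source has the form H0^(1)(kr), and the wavenumber and radii below are arbitrary illustrative values.

```python
import numpy as np
from scipy.special import hankel1

# Outgoing cylindrical wave H0^(1)(k r), the kind of Bessel/Hankel term the
# analytical 2-D solution is assembled from. Parameter values are illustrative.
k = 800.0                                  # wavenumber [1/m]
r = np.linspace(0.01, 0.5, 500)            # radial distance from the source [m]
field = hankel1(0, k * r)                  # complex-valued wave field

amplitude = np.abs(field)                  # decays roughly as 1/sqrt(r) for large kr
phase = np.unwrap(np.angle(field))
print("amplitude ratio r = 1 cm vs r = 50 cm:", round(float(amplitude[0] / amplitude[-1]), 2))
```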

  7. Microwave Assisted 2D Materials Exfoliation

    NASA Astrophysics Data System (ADS)

    Wang, Yanbin

    Two-dimensional materials have emerged as extremely important materials with applications ranging from energy and environmental science to electronics and biology. Here we report our discovery of a universal, ultrafast, green, solvo-thermal technology for producing excellent-quality, few-layered nanosheets in liquid phase from well-known 2D materials such as hexagonal boron nitride (h-BN), graphite, and MoS2. We start by mixing the uniform bulk-layered material with a common organic solvent that matches its surface energy to reduce the van der Waals attractive interactions between the layers; next, the solutions are heated in a commercial microwave oven to overcome the energy barrier between bulk and few-layer states. We discovered that the minutes-long rapid exfoliation process is highly temperature dependent, which requires precise thermal management to obtain high-quality inks. We hypothesize a possible mechanism of this proposed solvo-thermal process; our theory supports the basis of this novel technique for the exfoliation of high-quality, layered 2D materials, which relies on an as yet unknown role of the solvent.

  8. Photocurrent spectroscopy of 2D materials

    NASA Astrophysics Data System (ADS)

    Cobden, David

    Confocal photocurrent measurements provide a powerful means of studying many aspects of the optoelectronic and electrical properties of a 2D device or material. At a diffraction-limited point they can provide a detailed absorption spectrum, and they can probe local symmetry, ultrafast relaxation rates and processes, electron-electron interaction strengths, and transport coefficients. We illustrate this with several examples, one being the photo-Nernst effect. In gapless 2D materials, such as graphene, in a perpendicular magnetic field a photocurrent antisymmetric in the field is generated near the free edges, with opposite sign at opposite edges. Its origin is the transverse thermoelectric current associated with the laser-induced electron temperature gradient. This effect provides an unambiguous demonstration of the Shockley-Ramo nature of long-range photocurrent generation in gapless materials. It also provides a means of investigating quasiparticle properties. For example, in the case of graphene on hBN, it can be used to probe the Lifshitz transition that occurs due to the minibands formed by the Moiré superlattice. We also observe and discuss photocurrent generated in other semimetallic (WTe2) and semiconducting (WSe2) monolayers. Work supported by DoE BES and NSF EFRI grants.

  9. Multienzyme Inkjet Printed 2D Arrays.

    PubMed

    Gdor, Efrat; Shemesh, Shay; Magdassi, Shlomo; Mandler, Daniel

    2015-08-19

    The use of printing to produce 2D arrays is well established, and should be relatively facile to adapt for the purpose of printing biomaterials; however, very few studies have been published using enzyme solutions as inks. Among the printing technologies, inkjet printing is highly suitable for printing biomaterials and specifically enzymes, as it offers many advantages. Formulation of the inkjet inks is relatively simple and can be adjusted to a variety of biomaterials, while providing a nonharmful environment for the enzymes. Here we demonstrate the applicability of inkjet printing for patterning multiple enzymes in a predefined array in a very straightforward, noncontact method. Specifically, various arrays of the enzymes glucose oxidase (GOx), invertase (INV) and horseradish peroxidase (HP) were printed on aminated glass surfaces, followed by immobilization using glutardialdehyde after printing. Scanning electrochemical microscopy (SECM) was used for imaging the printed patterns and to ascertain the enzyme activity. The successful formation of 2D arrays consisting of enzymes was explored as a means of developing the first surface-confined enzyme-based logic gates. Principally, XOR and AND gates, each consisting of two enzymes as the Boolean operators, were assembled, and their operation was studied by SECM. PMID:26214072

  10. High-Throughput Computational Screening of Electrical and Phonon Properties of Two-Dimensional Transition Metal Dichalcogenides

    NASA Astrophysics Data System (ADS)

    Williamson, Izaak; Hernandez, Andres Correa; Wong-Ng, Winnie; Li, Lan

    2016-08-01

    Two-dimensional transition metal dichalcogenides (2D-TMDs) are of broadening research interest due to their novel physical, electrical, and thermoelectric properties. Having the chemical formula MX2, where M is a transition metal and X is a chalcogen, there are many possible combinations to consider for materials-by-design exploration. By identifying novel compositions and utilizing the lower dimensionality, which allows for improved thermoelectric performance (e.g., increased Seebeck coefficients without sacrificing electron concentration), MX2 materials are promising candidates for thermoelectric applications. However, to develop these materials into wide-scale use, it is crucial to comprehensively understand the compositional effects. This work investigates the structure, electronic, and phonon properties of 18 different MX2 material compositions as a benchmark to explore the impact of various elements. There is significant correlation between properties of the constituent transition metals (atomic mass and radius) and the structure/properties of the corresponding 2D-TMDs. As the mass of M increases, the n-type power factor and phonon frequency gap increase. Similarly, increases in the radius of M lead to increased layer thickness and Seebeck coefficient S. Our results identify key factors to optimize MX2 compositions for desired performance.

  11. Computational toxicology as implemented by the U.S. EPA: providing high throughput decision support tools for screening and assessing chemical exposure, hazard and risk.

    PubMed

    Kavlock, Robert; Dix, David

    2010-02-01

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models. The models and underlying data are being made publicly

  12. Stereoscopic Vascular Models of the Head and Neck: A Computed Tomography Angiography Visualization

    ERIC Educational Resources Information Center

    Cui, Dongmei; Lynch, James C.; Smith, Andrew D.; Wilson, Timothy D.; Lehman, Michael N.

    2016-01-01

    Computer-assisted 3D models are used in some medical and allied health science schools; however, they are often limited to online use and 2D flat screen-based imaging. Few schools take advantage of 3D stereoscopic learning tools in anatomy education and clinically relevant anatomical variations when teaching anatomy. A new approach to teaching…

  13. Haloperidol plasma concentration in Japanese psychiatric subjects with gene duplication of CYP2D6

    PubMed Central

    Ohnuma, Tohru; Shibata, Nobuto; Matsubara, Yoichiro; Arai, Heii

    2003-01-01

    Aims The cytochrome P-450 2D6 (CYP2D6) gene duplication/multiduplication producing an increase in enzyme activity, and the common Japanese mutation CYP2D6*10A, producing a decrease in enzyme activity, were screened in a large number of Japanese psychiatric subjects (n = 111) in order to investigate whether these mutated alleles affected the plasma concentration of haloperidol. Methods A polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP) method was performed to identify the CYP2D6*10A and CYP2D6*2 genotypes in subjects who had been taking haloperidol. For the screening of the duplicated active CYP2D6 gene, allele-specific long PCR was performed. The plasma concentration of haloperidol was measured by enzyme immunoassay and expressed as a 'plasma concentration–dose ratio' to normalize individual differences. Results The plasma concentration–dose ratio showed large interindividual differences of approximately 18-fold. PCR-RFLP methods revealed that 29 (26.1%), 10 (9.0%), 39 (35.1%), 0 (0%), seven (6.3%) and 26 (23.4%) cases possessed the CYP2D6 genotypes *1/*1, *1/*2, *1/*10A, *2/*2, *2/*10A and *10 A/*10A, respectively. Six cases (5.4%) had duplicated CYP2D6 genes. There were no significant differences in plasma concentration–dose ratio between the groups classified by CYP2D6*10A and *2 genotypes (Kruskal–Wallis test; P = 0.37), even in those cases whose daily doses were lower than 20 mg (n = 90, P = 0.91). Subjects having duplicated genes (n = 6) did not show significant differences in plasma concentration–dose ratio by comparison with subjects who had no duplicated genes (Mann–Whitney U-test; P = 0.80). Conclusions Gene duplication and the common Japanese mutation CYP2D6*10A on the CYP2D6 gene are not likely to be the main modulatory factors of the plasma concentration of haloperidol in Japanese psychiatric subjects. PMID:12919180

  14. Applications of 2D to 3D conversion for educational purposes

    NASA Astrophysics Data System (ADS)

    Koido, Yoshihisa; Morikawa, Hiroyuki; Shiraishi, Saki; Takeuchi, Soya; Maruyama, Wataru; Nakagori, Toshio; Hirakata, Masataka; Shinkai, Hirohisa; Kawai, Takashi

    2013-03-01

    There are three main approaches to creating stereoscopic S3D content: stereo filming using two cameras, stereo rendering of 3D computer graphics, and 2D to S3D conversion by adding binocular information to 2D material images. Although manual "off-line" conversion can control the amount of parallax flexibly, 2D material images are converted according to monocular information in most cases, and the flexibility of 2D to S3D conversion has not been exploited. If depth is expressed flexibly, the comprehension and interest derived from converted S3D content are anticipated to differ from those derived from 2D. Therefore, in this study we created new S3D content for education by applying 2D to S3D conversion. For surgical education, we created S3D surgical operation content under the supervision of a surgeon, using a partial 2D to S3D conversion technique which was expected to concentrate viewers' attention on significant areas. And for art education, we converted Ukiyo-e prints, traditional Japanese artworks made from woodcuts. The conversion of this content, which has little depth information, into S3D is expected to produce different cognitive processes from those evoked by 2D content, e.g., the excitation of interest and the understanding of spatial information. In addition, the effects of the representation of these contents were investigated.

  15. 2-D or not 2-D, that is the question: A Northern California test

    SciTech Connect

    Mayeda, K; Malagnini, L; Phillips, W S; Walter, W R; Dreger, D

    2005-06-06

    Reliable estimates of the seismic source spectrum are necessary for accurate magnitude, yield, and energy estimation. In particular, how seismic radiated energy scales with increasing earthquake size has been the focus of recent debate within the community and has direct implications on earthquake source physics studies as well as hazard mitigation. The 1-D coda methodology of Mayeda et al. has provided the lowest variance estimate of the source spectrum when compared against traditional approaches that use direct S-waves, thus making it ideal for networks that have sparse station distribution. The 1-D coda methodology has been mostly confined to regions of approximately uniform complexity. For larger, more geophysically complicated regions, 2-D path corrections may be required. The complicated tectonics of the northern California region coupled with high quality broadband seismic data provides for an ideal "apples-to-apples" test of 1-D and 2-D path assumptions on direct waves and their coda. Using the same station and event distribution, we compared 1-D and 2-D path corrections and observed the following results: (1) 1-D coda results reduced the amplitude variance relative to direct S-waves by roughly a factor of 8 (800%); (2) Applying a 2-D correction to the coda resulted in up to 40% variance reduction from the 1-D coda results; (3) 2-D direct S-wave results, though better than 1-D direct waves, were significantly worse than the 1-D coda. We found that coda-based moment-rate source spectra derived from the 2-D approach were essentially identical to those from the 1-D approach for frequencies less than ~0.7 Hz; however, for the high frequencies (0.7 ≤ f ≤ 8.0 Hz), the 2-D approach resulted in inter-station scatter that was generally 10-30% smaller. For complex regions where data are plentiful, a 2-D approach can significantly improve upon the simple 1-D assumption. In regions where only 1-D coda correction is available it is still preferable over 2

  16. Numerical Evaluation of 2D Ground States

    NASA Astrophysics Data System (ADS)

    Kolkovska, Natalia

    2016-02-01

    A ground state is defined as the positive radial solution of the multidimensional nonlinear problem Δu - u + f(u) = 0, with the function f being either f(u) = a|u|^(p-1)u or f(u) = a|u|^p u + b|u|^(2p) u. The numerical evaluation of ground states is based on the shooting method applied to an equivalent dynamical system. A combination of the fourth-order Runge-Kutta method and a Hermite extrapolation formula is applied to solving the resulting initial value problem. The efficiency of this procedure is demonstrated in the 1D case, where the maximal difference between the exact and numerical solution is ≈ 10^(-11) for a discretization step of 0.00025. As a major application, we evaluate numerically the critical energy constant. This constant is defined as a functional of the ground state and is used in the study of the 2D Boussinesq equations.
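
    To make the shooting idea concrete, here is a minimal 1-D sketch under assumed model choices (the equation u'' - u + u^3 = 0, whose exact ground state is sqrt(2)/cosh(x)); the paper's actual equation, Runge-Kutta/Hermite machinery, and tolerances differ.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative 1-D model problem (not the paper's exact setup):
    #   u'' - u + u**3 = 0,  u'(0) = 0,  u -> 0 as x -> infinity,
    # whose ground state is u(x) = sqrt(2) / cosh(x).
    def rhs(x, y):
        u, v = y
        return [v, u - u**3]

    def overshoots(s, x_max=20.0):
        """True if the trajectory started at u(0)=s crosses zero (amplitude too large)."""
        sol = solve_ivp(rhs, [0.0, x_max], [s, 0.0], max_step=0.01)
        return np.min(sol.y[0]) < 0.0

    # Bisection on the shooting amplitude s; [1, 2] brackets the exact value sqrt(2)
    lo, hi = 1.0, 2.0
    for _ in range(40):
        mid = 0.5 * (lo + hi)
        if overshoots(mid):
            hi = mid
        else:
            lo = mid

    print("shooting estimate:", 0.5 * (lo + hi), " exact:", np.sqrt(2.0))
    ```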

  17. Canard configured aircraft with 2-D nozzle

    NASA Technical Reports Server (NTRS)

    Child, R. D.; Henderson, W. P.

    1978-01-01

    A closely-coupled canard fighter with vectorable two-dimensional nozzle was designed for enhanced transonic maneuvering. The HiMAT maneuver goal of a sustained 8g turn at a free-stream Mach number of 0.9 and 30,000 feet was the primary design consideration. The aerodynamic design process was initiated with a linear theory optimization minimizing the zero percent suction drag including jet effects and refined with three-dimensional nonlinear potential flow techniques. Allowances were made for mutual interference and viscous effects. The design process to arrive at the resultant configuration is described, and the design of a powered 2-D nozzle model to be tested in the LRC 16-foot Propulsion Wind Tunnel is shown.

  18. 2D Electrostatic Actuation of Microshutter Arrays

    NASA Technical Reports Server (NTRS)

    Burns, Devin E.; Oh, Lance H.; Li, Mary J.; Jones, Justin S.; Kelly, Daniel P.; Zheng, Yun; Kutyrev, Alexander S.; Moseley, Samuel H.

    2015-01-01

    An electrostatically actuated microshutter array consisting of rotational microshutters (shutters that rotate about a torsion bar) was designed and fabricated through the use of models and experiments. Design iterations focused on minimizing the torsional stiffness of the microshutters, while maintaining their structural integrity. Mechanical and electromechanical test systems were constructed to measure the static and dynamic behavior of the microshutters. The torsional stiffness was reduced by a factor of four over initial designs without sacrificing durability. Analysis of the resonant behavior of the microshutter arrays demonstrates that the first resonant mode is a torsional mode occurring around 3000 Hz. At low vacuum pressures, this resonant mode can be used to significantly reduce the drive voltage necessary for actuation, requiring as little as 25 V. 2D electrostatic latching and addressing was demonstrated using both a resonant and a pulsed addressing scheme.

  19. 2D Electrostatic Actuation of Microshutter Arrays

    NASA Technical Reports Server (NTRS)

    Burns, Devin E.; Oh, Lance H.; Li, Mary J.; Kelly, Daniel P.; Kutyrev, Alexander S.; Moseley, Samuel H.

    2015-01-01

    Electrostatically actuated microshutter arrays consisting of rotational microshutters (shutters that rotate about a torsion bar) were designed and fabricated through the use of models and experiments. Design iterations focused on minimizing the torsional stiffness of the microshutters, while maintaining their structural integrity. Mechanical and electromechanical test systems were constructed to measure the static and dynamic behavior of the microshutters. The torsional stiffness was reduced by a factor of four over initial designs without sacrificing durability. Analysis of the resonant behavior of the microshutters demonstrates that the first resonant mode is a torsional mode occurring around 3000 Hz. At low vacuum pressures, this resonant mode can be used to significantly reduce the drive voltage necessary for actuation requiring as little as 25V. 2D electrostatic latching and addressing was demonstrated using both a resonant and pulsed addressing scheme.

  20. Graphene suspensions for 2D printing

    NASA Astrophysics Data System (ADS)

    Soots, R. A.; Yakimchuk, E. A.; Nebogatikova, N. A.; Kotin, I. A.; Antonova, I. V.

    2016-04-01

    It is shown that, by processing a graphite suspension in ethanol or water by ultrasound and centrifuging, it is possible to obtain particles with thicknesses within 1-6 nm and, in the most interesting cases, 1-1.5 nm. Analogous treatment of a graphite suspension in an organic solvent eventually yields thicker particles (up to 6-10 nm thick) even upon long-term treatment. Using the proposed ink based on graphene and aqueous ethanol with ethylcellulose and terpineol additives for 2D printing, thin (~5 nm thick) films with a sheet resistance upon annealing of ~30 MΩ/□ were obtained. With the ink based on an aqueous graphene suspension, the sheet resistance was ~5-12 kΩ/□ for 6- to 15-nm-thick layers with a carrier mobility of ~30-50 cm2/(V s).

  1. Metrology for graphene and 2D materials

    NASA Astrophysics Data System (ADS)

    Pollard, Andrew J.

    2016-09-01

    The application of graphene, a one atom-thick honeycomb lattice of carbon atoms with superlative properties, such as electrical conductivity, thermal conductivity and strength, has already shown that it can be used to benefit metrology itself as a new quantum standard for resistance. However, there are many application areas where graphene and other 2D materials, such as molybdenum disulphide (MoS2) and hexagonal boron nitride (h-BN), may be disruptive, areas such as flexible electronics, nanocomposites, sensing and energy storage. Applying metrology to the area of graphene is now critical to enable the new, emerging global graphene commercial world and bridge the gap between academia and industry. Measurement capabilities and expertise in a wide range of scientific areas are required to address this challenge. The combined and complementary approach of varied characterisation methods for structural, chemical, electrical and other properties, will allow the real-world issues of commercialising graphene and other 2D materials to be addressed. Here, examples of metrology challenges that have been overcome through a multi-technique or new approach are discussed. Firstly, the structural characterisation of defects in both graphene and MoS2 via Raman spectroscopy is described, and how nanoscale mapping of vacancy defects in graphene is also possible using tip-enhanced Raman spectroscopy (TERS). Furthermore, the chemical characterisation and removal of polymer residue on chemical vapour deposition (CVD) grown graphene via secondary ion mass spectrometry (SIMS) is detailed, as well as the chemical characterisation of iron films used to grow large domain single-layer h-BN through CVD growth, revealing how contamination of the substrate itself plays a role in the resulting h-BN layer. In addition, the role of international standardisation in this area is described, outlining the current work ongoing in both the International Organization for Standardization (ISO) and the

  2. The mouse ruby-eye 2(d) (ru2(d) /Hps5(ru2-d) ) allele inhibits eumelanin but not pheomelanin synthesis.

    PubMed

    Hirobe, Tomohisa; Ito, Shosuke; Wakamatsu, Kazumasa

    2013-09-01

    The novel mutation named ru2(d) /Hps5(ru2-d) , characterized by light-colored coats and ruby-eyes, prohibits differentiation of melanocytes by inhibiting tyrosinase (Tyr) activity, expression of Tyr, Tyr-related protein 1 (Tyrp1), Tyrp2, and Kit. However, it is not known whether the ru2(d) allele affects pheomelanin synthesis in recessive yellow (e/Mc1r(e) ) or in pheomelanic stage in agouti (A) mice. In this study, effects of the ru2(d) allele on pheomelanin synthesis were investigated by chemical analysis of melanin present in dorsal hairs of 5-week-old mice from F2 generation between C57BL/10JHir (B10)-co-isogenic ruby-eye 2(d) and B10-congenic recessive yellow or agouti. Eumelanin content was decreased in ruby-eye 2(d) and ruby-eye 2(d) agouti mice, whereas pheomelanin content in ruby-eye 2(d) recessive yellow and ruby-eye 2(d) agouti mice did not differ from the corresponding Ru2(d) /- mice, suggesting that the ru2(d) allele inhibits eumelanin but not pheomelanin synthesis. PMID:23672590

  3. 2D to 3D conversion implemented in different hardware

    NASA Astrophysics Data System (ADS)

    Ramos-Diaz, Eduardo; Gonzalez-Huitron, Victor; Ponomaryov, Volodymyr I.; Hernandez-Fragoso, Araceli

    2015-02-01

    Conversion of available 2D data for release as 3D content is a hot topic for providers and for the success of 3D applications in general. It relies entirely on virtual view synthesis of a second view given the original 2D video. Disparity map (DM) estimation is a central task in 3D generation but remains a very difficult problem for rendering novel images precisely. There exist different approaches to DM reconstruction, among them manual and semiautomatic methods that can produce high-quality DMs, but they are highly time consuming and computationally expensive. In this paper, several hardware implementations of designed frameworks for automatic 3D color video generation based on 2D real video sequences are proposed. The novel framework includes simultaneous processing of stereo pairs using the following blocks: CIE L*a*b* color space conversion, stereo matching via a pyramidal scheme, color segmentation by k-means on the a*b* color plane and adaptive post-filtering, DM estimation using stereo matching between left and right images (or neighboring frames in a video), adaptive post-filtering, and finally, anaglyph 3D scene generation. The novel technique has been implemented on a DSP TMS320DM648, in Matlab's Simulink module on a PC with Windows 7, and using a graphics card (NVIDIA Quadro K2000), demonstrating that the proposed approach can be applied in real-time processing mode. The time values needed, mean Structural Similarity Index Measure (SSIM) and Bad Matching Pixels (B) values for different hardware implementations (GPU, Single CPU, and DSP) are presented in this paper.
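
    The last two stages of such a pipeline, synthesizing a second view from a disparity map and combining the pair into an anaglyph, can be sketched as below. This is a simplified NumPy illustration, not the authors' DSP/GPU implementation; the image, disparity values, and channel convention (red = left eye) are placeholder assumptions.

    ```python
    import numpy as np

    def synthesize_right_view(left, disparity):
        """Shift each pixel of the left view horizontally by its disparity (simplified DIBR)."""
        h, w, _ = left.shape
        right = np.zeros_like(left)
        cols = np.arange(w)
        for y in range(h):
            x_new = np.clip(cols - disparity[y].astype(int), 0, w - 1)
            right[y, x_new] = left[y, cols]
        return right

    def anaglyph(left, right):
        """Red channel from the left eye, green/blue channels from the right eye."""
        out = right.copy()
        out[..., 0] = left[..., 0]
        return out

    # Toy data: a random image and a constant-depth disparity map (illustrative only)
    left = np.random.randint(0, 255, (120, 160, 3), dtype=np.uint8)
    disp = np.full((120, 160), 4.0)
    print(anaglyph(left, synthesize_right_view(left, disp)).shape)
    ```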

  4. Progress in 2D photonic crystal Fano resonance photonics

    NASA Astrophysics Data System (ADS)

    Zhou, Weidong; Zhao, Deyin; Shuai, Yi-Chen; Yang, Hongjun; Chuwongin, Santhad; Chadha, Arvinder; Seo, Jung-Hun; Wang, Ken X.; Liu, Victor; Ma, Zhenqiang; Fan, Shanhui

    2014-01-01

    In contrast to a conventional symmetric Lorentzian resonance, Fano resonance is predominantly used to describe asymmetric-shaped resonances, which arise from the constructive and destructive interference of discrete resonance states with broadband continuum states. This phenomenon and the underlying mechanisms, being common and ubiquitous in many realms of physical sciences, can be found in a wide variety of nanophotonic structures and quantum systems, such as quantum dots, photonic crystals, plasmonics, and metamaterials. The asymmetric and steep dispersion of the Fano resonance profile promises applications for a wide range of photonic devices, such as optical filters, switches, sensors, broadband reflectors, lasers, detectors, slow-light and non-linear devices, etc. With advances in nanotechnology, impressive progress has been made in the emerging field of nanophotonic structures. One of the most attractive nanophotonic structures for integrated photonics is the two-dimensional photonic crystal slab (2D PCS), which can be integrated into a wide range of photonic devices. The objective of this manuscript is to provide an in depth review of the progress made in the general area of Fano resonance photonics, focusing on the photonic devices based on 2D PCS structures. General discussions are provided on the origins and characteristics of Fano resonances in 2D PCSs. A nanomembrane transfer printing fabrication technique is also reviewed, which is critical for the heterogeneous integrated Fano resonance photonics. The majority of the remaining sections review progress made on various photonic devices and structures, such as high quality factor filters, membrane reflectors, membrane lasers, detectors and sensors, as well as structures and phenomena related to Fano resonance slow light effect, nonlinearity, and optical forces in coupled PCSs. It is expected that further advances in the field will lead to more significant advances towards 3D integrated photonics, flat

  5. 2-D Model for Normal and Sickle Cell Blood Microcirculation

    NASA Astrophysics Data System (ADS)

    Tekleab, Yonatan; Harris, Wesley

    2011-11-01

    Sickle cell disease (SCD) is a genetic disorder that alters the red blood cell (RBC) structure and function such that hemoglobin (Hb) cannot effectively bind and release oxygen. Previous computational models have been designed to study the microcirculation for insight into blood disorders such as SCD. Our novel 2-D computational model represents a fast, time-efficient method developed to analyze flow dynamics, O2 diffusion, and cell deformation in the microcirculation. The model uses a finite difference, Crank-Nicolson scheme to compute the flow and O2 concentration, and the level set computational method to advect the RBC membrane on a staggered grid. Several sets of initial and boundary conditions were tested. Simulation data indicate a few parameters to be significant in the perturbation of the blood flow and O2 concentration profiles. Specifically, the Hill coefficient, arterial O2 partial pressure, O2 partial pressure at 50% Hb saturation, and cell membrane stiffness are significant factors. Results were found to be consistent with those of Le Floch [2010] and Secomb [2006].
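
    As a reminder of what a Crank-Nicolson update looks like, here is a minimal 1-D diffusion step with fixed boundary values; it only illustrates the scheme, not the paper's coupled 2-D flow/O2/membrane model, and all grid and diffusivity numbers are placeholders.

    ```python
    import numpy as np
    from scipy.linalg import solve_banded

    def crank_nicolson_step(c, D, dx, dt):
        """One Crank-Nicolson step for c_t = D * c_xx with fixed (Dirichlet) end values."""
        n = len(c)
        r = D * dt / (2 * dx**2)
        # Banded form of (I - r*L); boundary rows are identity to pin the end values
        ab = np.zeros((3, n))
        ab[0, 1:] = -r
        ab[1, :] = 1 + 2 * r
        ab[2, :-1] = -r
        ab[1, 0] = ab[1, -1] = 1.0
        ab[0, 1] = 0.0
        ab[2, -2] = 0.0
        # Right-hand side (I + r*L) c, boundaries kept at their current values
        rhs = c.copy()
        rhs[1:-1] = c[1:-1] + r * (c[2:] - 2 * c[1:-1] + c[:-2])
        return solve_banded((1, 1), ab, rhs)

    c = np.exp(-((np.linspace(0, 1, 101) - 0.5) ** 2) / 0.01)   # initial concentration bump
    for _ in range(100):
        c = crank_nicolson_step(c, D=1e-3, dx=0.01, dt=0.01)
    print(c.max())
    ```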

  6. Random forest learning of ultrasonic statistical physics and object spaces for lesion detection in 2D sonomammography

    NASA Astrophysics Data System (ADS)

    Sheet, Debdoot; Karamalis, Athanasios; Kraft, Silvan; Noël, Peter B.; Vag, Tibor; Sadhu, Anup; Katouzian, Amin; Navab, Nassir; Chatterjee, Jyotirmoy; Ray, Ajoy K.

    2013-03-01

    Breast cancer is the most common form of cancer in women. Early diagnosis can significantly improve life expectancy and allow different treatment options. Clinicians favor 2D ultrasonography for breast tissue abnormality screening due to its high sensitivity and specificity compared to competing technologies. However, inter- and intra-observer variability in visual assessment and reporting of lesions often handicaps its performance. Existing Computer Assisted Diagnosis (CAD) systems, though able to detect solid lesions, are often restricted in performance. These restrictions include an inability to (1) detect lesions of multiple sizes and shapes, and (2) differentiate hypo-echoic lesions from their posterior acoustic shadowing. In this work we present a completely automatic system for detection and segmentation of breast lesions in 2D ultrasound images. We employ random forests for learning a tissue-specific primal to discriminate breast lesions from surrounding normal tissues. This enables it to detect lesions of multiple shapes and sizes, as well as discriminate hypo-echoic lesions from associated posterior acoustic shadowing. The primal comprises (i) multiscale estimated ultrasonic statistical physics and (ii) scale-space characteristics. The random forest learns the lesion vs. background primal from a database of 2D ultrasound images with labeled lesions. For segmentation, the posterior probabilities of lesion pixels estimated by the learnt random forest are hard thresholded to provide a random walks segmentation stage with starting seeds. Our method achieves detection with 99.19% accuracy and segmentation with mean contour-to-contour error < 3 pixels on a set of 40 images with 49 lesions.
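
    The classify-then-seed idea reads roughly as follows in scikit-learn; the per-pixel features and labels below are random stand-ins for the paper's ultrasonic statistical-physics and scale-space primal, and the 0.9 seed threshold is an assumed value.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical per-pixel features and labels (stand-ins for the paper's primal)
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(5000, 8))                              # 8 features per pixel
    y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] > 1).astype(int)   # 1 = lesion

    forest = RandomForestClassifier(n_estimators=100, random_state=0)
    forest.fit(X_train, y_train)

    # Posterior lesion probability for each pixel of a new image, then a hard threshold
    # to obtain seed pixels for a subsequent random-walker segmentation stage.
    X_image = rng.normal(size=(64 * 64, 8))
    posterior = forest.predict_proba(X_image)[:, 1].reshape(64, 64)
    seeds = posterior > 0.9
    print(seeds.sum(), "candidate lesion seed pixels")
    ```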

  7. E-2D Advanced Hawkeye: primary flight display

    NASA Astrophysics Data System (ADS)

    Paolillo, Paul W.; Saxena, Ragini; Garruba, Jonathan; Tripathi, Sanjay; Blanchard, Randy

    2006-05-01

    This paper is a response to the challenge of providing a large area avionics display for the E-2D AHE aircraft. The resulting display design provides a pilot with high-resolution visual information content covering an image area of almost three square feet (active area of the Samsung display = 33.792 cm x 27.0336 cm = 13.304" x 10.643" = 141.596 square inches = 0.983 sq. ft x 3 = 2.95 sq. ft). The avionics display application, design and performance being described is the Primary Flight Display for the E-2D Advanced Hawkeye aircraft. This cockpit display has a screen diagonal size of 17 inches. Three displays, with minimum bezel width, just fit within the available instrument panel area. The significant design constraints of supporting an upgrade installation have been addressed. These constraints include a display image size that is larger than the mounting opening in the instrument panel. This, therefore, requires that the Electromagnetic Interference (EMI) window, LCD panel and backlight all fit within the limited available bezel depth. High brightness and a wide dimming range are supported with a dual-mode Cold Cathode Fluorescent Tube (CCFT) and LED backlight. Packaging constraints dictated the use of multiple U-shaped fluorescent lamps in a direct-view backlight design for a maximum display brightness of 300 foot-Lamberts. The low intensity backlight levels are provided by remote LEDs coupled through a fiber optic mesh. This architecture generates luminous uniformity within a minimum backlight depth. Cross-cockpit viewing is supported with ultra-wide field-of-view performance, including the contrast and color stability that an advanced LCD cell design supports. Display system design tradeoffs gave priority to high optical efficiency for minimum power and weight.

  8. Boosting classification performance in computer aided diagnosis of breast masses in raw full-field digital mammography using processed and screen film images

    NASA Astrophysics Data System (ADS)

    Kooi, Thijs; Karssemeijer, Nico

    2014-03-01

    The introduction of Full-Field Digital Mammography (FFDM) in breast screening has brought with it several advantages in terms of processing facilities and image quality, and Computer Aided Detection (CAD) systems are now sprouting that make use of this modality. A major drawback, however, is that FFDM data is still relatively scarce and therefore CAD systems' performance is inhibited by a lack of training examples. In this paper, we explore the incorporation of the more ubiquitous Screen Film Mammograms (SFM) and FFDM processed by the manufacturer in training a system for the detection of tumour masses. We compute a small set of additional quantitative features in the raw data that make explicit use of the log-linearity of the energy imparted on the detector in raw FFDM. We explore four different fusion methods: a weighted average, a majority vote, a convex combination of classifier outputs based on the training error, and an additional classifier that combines the output of the three individual label estimates. Results are evaluated based on the Partial Area Under the Curve (PAUC) around a clinically relevant operating point. All fusion methods perform significantly better than any of the individual classifiers, but we find no significant difference between the fusion techniques.
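
    The first three fusion rules are simple enough to sketch directly; the posteriors, weights, and training errors below are invented placeholders, not values from the study, and the learned-combiner variant (the fourth rule) is omitted.

    ```python
    import numpy as np

    # Three classifier outputs (posterior probability of "mass") for the same candidates;
    # random stand-ins for raw-FFDM, processed-FFDM and SFM-trained classifiers.
    rng = np.random.default_rng(1)
    p_raw, p_proc, p_sfm = rng.uniform(size=(3, 10))

    # 1. Weighted average of posteriors (weights assumed)
    p_avg = 0.5 * p_raw + 0.3 * p_proc + 0.2 * p_sfm

    # 2. Majority vote on thresholded labels
    votes = (np.stack([p_raw, p_proc, p_sfm]) > 0.5).sum(axis=0)
    label_vote = (votes >= 2).astype(int)

    # 3. Convex combination with weights derived from (hypothetical) training errors
    err = np.array([0.10, 0.15, 0.20])
    w = (1 / err) / (1 / err).sum()
    p_convex = w[0] * p_raw + w[1] * p_proc + w[2] * p_sfm

    print(p_avg.round(2), label_vote, p_convex.round(2), sep="\n")
    ```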

  9. A Hierarchical Control Strategy For 2-D Object Recognition

    NASA Astrophysics Data System (ADS)

    Cullen, Mark F.; Kuszmaul, Christopher L.; Ramsey, Timothy S.

    1988-02-01

    A control strategy for 2-D object recognition has been implemented on a hardware configuration which includes a Symbolics Lisp Machine (TM) as a front-end processor to a 16,384-processor Connection Machine (TM). The goal of this ongoing research program is to develop an image analysis system as an aid to human image interpretation experts. Our efforts have concentrated on 2-D object recognition in aerial imagery; specifically, the detection and identification of aircraft near the Danbury, CT airport. Image processing functions to label and extract image features are implemented on the Connection Machine for robust computation. A model matching function was also designed and implemented on the CM for object recognition. In this paper we report on the integration of these algorithms on the CM, with a hierarchical control strategy to focus and guide the object recognition task to particular objects and regions of interest in imagery. It will be shown that these techniques may be used to manipulate imagery on the order of 2k x 2k pixels in near-real-time.

  10. Facial biometrics based on 2D vector geometry

    NASA Astrophysics Data System (ADS)

    Malek, Obaidul; Venetsanopoulos, Anastasios; Androutsos, Dimitrios

    2014-05-01

    The main challenge of facial biometrics is its robustness and ability to adapt to changes in position, orientation, facial expression, and illumination effects. This research addresses the predominant deficiencies in this regard and systematically investigates a facial authentication system in the Euclidean domain. In the proposed method, Euclidean geometry in 2D vector space is constructed for feature extraction and the authentication method. In particular, each assigned point of the candidates' biometric features is considered to be a 2D geometrical coordinate in the Euclidean vector space. Algebraic shapes of the extracted candidate features are also computed and compared. The proposed authentication method is tested on images from the public "Put Face Database". The performance of the proposed method is evaluated based on Correct Recognition (CRR), False Acceptance (FAR), and False Rejection (FRR) rates. The theoretical foundation of the proposed method, along with the experimental results, is also presented in this paper. The experimental results demonstrate the effectiveness of the proposed method.
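
    One simple way to compare 2-D feature points in the Euclidean domain is via pairwise distances between landmarks; the landmark coordinates, the pairwise-distance shape code, and the decision threshold below are all illustrative assumptions, not the authors' exact features or matcher.

    ```python
    import numpy as np

    # Hypothetical 2-D landmark coordinates (eye corners, nose tip, mouth corners, ...)
    # from an enrolled template and a probe image.
    template = np.array([[30.0, 40.0], [70.0, 41.0], [50.0, 60.0], [38.0, 80.0], [63.0, 79.0]])
    probe    = np.array([[31.0, 39.5], [69.0, 42.0], [50.5, 61.0], [37.0, 81.0], [64.0, 78.0]])

    def pairwise_shape_vector(pts):
        """All pairwise Euclidean distances: a simple translation/rotation-invariant shape code."""
        diff = pts[:, None, :] - pts[None, :, :]
        d = np.sqrt((diff ** 2).sum(-1))
        iu = np.triu_indices(len(pts), k=1)
        return d[iu]

    score = np.linalg.norm(pairwise_shape_vector(template) - pairwise_shape_vector(probe))
    print("match" if score < 5.0 else "no match", round(score, 3))
    ```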

  11. A new inversion method for (T2, D) 2D NMR logging and fluid typing

    NASA Astrophysics Data System (ADS)

    Tan, Maojin; Zou, Youlong; Zhou, Cancan

    2013-02-01

    One-dimensional nuclear magnetic resonance (1D NMR) logging technology has some significant limitations in fluid typing. However, not only can two-dimensional nuclear magnetic resonance (2D NMR) provide some accurate porosity parameters, but it can also identify fluids more accurately than 1D NMR. In this paper, based on the relaxation mechanism of (T2, D) 2D NMR in a gradient magnetic field, a hybrid inversion method that combines least-squares-based QR decomposition (LSQR) and truncated singular value decomposition (TSVD) is examined in the 2D NMR inversion of various fluid models. The forward modeling and inversion tests are performed in detail with different acquisition parameters, such as magnetic field gradients (G) and echo spacing (TE) groups. The simulated results are discussed and described in detail, the influence of the above-mentioned observation parameters on the inversion accuracy is investigated and analyzed, and the observation parameters in multi-TE activation are optimized. Furthermore, the hybrid inversion can be applied to quantitatively determine the fluid saturation. To study the effects of noise level on the hybrid method and inversion results, the numerical simulation experiments are performed using different signal-to-noise-ratios (SNRs), and the effect of different SNRs on fluid typing using three fluid models are discussed and analyzed in detail.
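
    The hybrid LSQR/TSVD idea can be illustrated on a toy 1-D relaxation inversion; the exponential kernel, grids, damping factor, and truncation level below are placeholder choices rather than the paper's, and the real problem is a 2-D inversion over (T2, D).

    ```python
    import numpy as np
    from scipy.sparse.linalg import lsqr

    # Toy 1-D analogue: data d = K x with an exponential relaxation kernel.
    rng = np.random.default_rng(2)
    t = np.linspace(1e-3, 1.0, 200)          # echo times
    T2 = np.logspace(-3, 0, 50)              # relaxation-time grid
    K = np.exp(-t[:, None] / T2[None, :])    # kernel matrix
    x_true = np.exp(-0.5 * ((np.log10(T2) + 1.5) / 0.2) ** 2)
    d = K @ x_true + 0.01 * rng.normal(size=t.size)

    # LSQR with damping (Tikhonov-like regularization)
    x_lsqr = lsqr(K, d, damp=0.1)[0]

    # Truncated SVD: keep only the leading singular values
    U, s, Vt = np.linalg.svd(K, full_matrices=False)
    k = 10
    x_tsvd = Vt[:k].T @ ((U[:, :k].T @ d) / s[:k])

    print(np.linalg.norm(x_lsqr - x_true), np.linalg.norm(x_tsvd - x_true))
    ```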

  12. Computer aided screening of potent inhibitor compounds against inhibitor resistant TEM β-lactamase mutants from traditional Chinese medicine

    PubMed Central

    Zhu, Qifeng; Yin, Yanxia; Liu, Hanjie; Tian, Jinhong

    2014-01-01

    Inhibitor-resistant TEM (IRT) type β-lactamase mutations are well known. Therefore, it is of interest to identify new and improved leads against IRT from traditional Chinese medicine. Hence, we screened more than 10,000 compounds from Chinese medicine (the tcm@taiwan database) against molecular models of IRT mutants using docking techniques. This exercise identified the compounds caffeic acid, curcumin, salvianolic acid E, ferulic acid and p-coumaric acid as having high binding scores with the mutants. This was further validated in vitro, where salvianolic acid E combined with cefoperazone and sulbactam effectively inhibits the R244S mutant. PMID:25670878

  13. Learning from graphically integrated 2D and 3D representations improves retention of neuroanatomy

    NASA Astrophysics Data System (ADS)

    Naaz, Farah

    Visualizations in the form of computer-based learning environments are highly encouraged in science education, especially for teaching spatial material. Some spatial material, such as sectional neuroanatomy, is very challenging to learn. It involves learning the two dimensional (2D) representations that are sampled from the three dimensional (3D) object. In this study, a computer-based learning environment was used to explore the hypothesis that learning sectional neuroanatomy from a graphically integrated 2D and 3D representation will lead to better learning outcomes than learning from a sequential presentation. The integrated representation explicitly demonstrates the 2D-3D transformation and should lead to effective learning. This study was conducted using a computer graphical model of the human brain. There were two learning groups: Whole then Sections, and Integrated 2D3D. Both groups learned whole anatomy (3D neuroanatomy) before learning sectional anatomy (2D neuroanatomy). The Whole then Sections group then learned sectional anatomy using 2D representations only. The Integrated 2D3D group learned sectional anatomy from a graphically integrated 3D and 2D model. A set of tests for generalization of knowledge to interpreting biomedical images was conducted immediately after learning was completed. The order of presentation of the tests of generalization of knowledge was counterbalanced across participants to explore a secondary hypothesis of the study: preparation for future learning. If the computer-based instruction programs used in this study are effective tools for teaching anatomy, the participants should continue learning neuroanatomy with exposure to new representations. A test of long-term retention of sectional anatomy was conducted 4-8 weeks after learning was completed. The Integrated 2D3D group was better than the Whole then Sections

  14. Comparison of 3-D finite element model of ashlar masonry with 2-D numerical models of ashlar masonry

    NASA Astrophysics Data System (ADS)

    Beran, Pavel

    2016-06-01

    The 3-D state of stress in heterogeneous ashlar masonry can also be computed by several suitably chosen 2-D numerical models of ashlar masonry. The results obtained from the 2-D numerical models correspond well to the results obtained from the 3-D numerical model. The character of the thermal stress is the same. When using 2-D models the computational time is reduced more than a hundredfold, and therefore this method could be used for the computation of thermal stresses during long time periods with on the order of 10,000 steps.

  15. 2D/3D Visual Tracker for Rover Mast

    NASA Technical Reports Server (NTRS)

    Bajracharya, Max; Madison, Richard W.; Nesnas, Issa A.; Bandari, Esfandiar; Kunz, Clayton; Deans, Matt; Bualat, Maria

    2006-01-01

    A visual-tracker computer program controls an articulated mast on a Mars rover to keep a designated feature (a target) in view while the rover drives toward the target, avoiding obstacles. Several prior visual-tracker programs have been tested on rover platforms; most require very small and well-estimated motion between consecutive image frames, a requirement that is not realistic for a rover on rough terrain. The present visual-tracker program is designed to handle large image motions that lead to significant changes in feature geometry and photometry between frames. When a point is selected in one of the images acquired from stereoscopic cameras on the mast, a stereo triangulation algorithm computes a three-dimensional (3D) location for the target. As the rover moves, its body-mounted cameras feed images to a visual-odometry algorithm, which tracks two-dimensional (2D) corner features and computes their old and new 3D locations. The algorithm rejects points whose 3D motions are inconsistent with a rigid-world constraint, and then computes the apparent change in the rover pose (i.e., translation and rotation). The mast pan and tilt angles needed to keep the target centered in the field of view of the cameras (thereby minimizing the area over which the 2D-tracking algorithm must operate) are computed from the estimated change in the rover pose, the 3D position of the target feature, and a model of the kinematics of the mast. If the motion between consecutive frames is still large (i.e., 3D tracking was unsuccessful), an adaptive view-based matching technique is applied to the new image. This technique uses correlation-based template matching, in which a feature template is scaled by the ratio between the depth in the original template and the depth of pixels in the new image. This is repeated over the entire search window, and the best correlation results indicate the appropriate match. The program could be a core for building application programs for systems
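
    The triangulation and pointing step can be sketched for a rectified stereo pair as below; the focal length, baseline, principal point, pixel coordinates, and angle sign conventions are assumed illustrative values, not the rover's calibration or the program's actual mast kinematics model.

    ```python
    import numpy as np

    # Minimal stereo triangulation for a rectified camera pair (parameters assumed)
    f_px, baseline = 600.0, 0.12            # focal length [px], baseline [m]
    cx, cy = 320.0, 240.0                   # principal point [px]

    def triangulate(x_left, y_left, x_right):
        """3-D point in the left-camera frame from a matched pixel pair."""
        disparity = x_left - x_right
        Z = f_px * baseline / disparity
        X = (x_left - cx) * Z / f_px
        Y = (y_left - cy) * Z / f_px
        return np.array([X, Y, Z])

    target = triangulate(350.0, 250.0, 338.0)
    pan  = np.degrees(np.arctan2(target[0], target[2]))   # pan angle toward the target
    tilt = np.degrees(np.arctan2(-target[1], target[2]))  # tilt angle (sign convention assumed)
    print(target, round(pan, 2), round(tilt, 2))
    ```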

  16. 2D and 3D heterogeneous photonic integrated circuits

    NASA Astrophysics Data System (ADS)

    Yoo, S. J. Ben

    2014-03-01

    Exponential increases in the amount of data that need to be sensed, communicated, and processed are continuing to drive the complexity of our computing, networking, and sensing systems. High degrees of integration are essential in scalable, practical, and cost-effective microsystems. In electronics, high-density 2D integration has naturally evolved towards 3D integration by stacking of memory and processor chips with through-silicon vias. In photonics, too, we anticipate high degrees of 3D integration of photonic components to become a prevailing method in realizing future microsystems for information and communication technologies. However, compared to electronics, photonic 3D integration faces a number of challenges. This paper will review two methods of 3D photonic integration --- fs laser inscription and layer stacking --- and discuss applications and future prospects.

  17. Advecting Procedural Textures for 2D Flow Animation

    NASA Technical Reports Server (NTRS)

    Kao, David; Pang, Alex; Moran, Pat (Technical Monitor)

    2001-01-01

    This paper proposes the use of specially generated 3D procedural textures for visualizing steady-state 2D flow fields. We use the flow field to advect and animate the texture over time. However, using standard texture advection techniques and arbitrary textures will introduce some undesirable effects such as: (a) expanding texture from a critical source point, (b) a streaking pattern from the boundary of the flow field, (c) crowding of advected textures near an attracting spiral or sink, and (d) absent or lacking textures in some regions of the flow. This paper proposes a number of strategies to solve these problems. We demonstrate how the technique works using both synthetic data and computational fluid dynamics data.
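
    A bare-bones version of standard texture advection (the baseline the paper improves upon, not its proposed procedural-texture strategy) traces each pixel backward along the flow and resamples; the rotating flow, checkerboard texture, and time step below are synthetic stand-ins.

    ```python
    import numpy as np

    # Backward advection of texture coordinates by a steady 2-D flow field (one step per call)
    n = 128
    y, x = np.mgrid[0:n, 0:n] / n
    u, v = -(y - 0.5), (x - 0.5)                              # solid-body rotation about the center
    texture = (np.sin(40 * x) * np.sin(40 * y) > 0).astype(float)

    def advect(tex, u, v, dt=0.02):
        # Trace each pixel backward along the flow and sample the texture there (nearest neighbour)
        xs = np.clip(((x - dt * u) * n).astype(int), 0, n - 1)
        ys = np.clip(((y - dt * v) * n).astype(int), 0, n - 1)
        return tex[ys, xs]

    frame = texture
    for _ in range(25):
        frame = advect(frame, u, v)
    print(frame.mean())
    ```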

  18. 2D Regimes of Non-Fourier Convection

    NASA Astrophysics Data System (ADS)

    Papanicolaou, N. C.

    2010-11-01

    In this work, we investigate the 2D flow in a rectangular cavity subject to both vertical and horizontal temperature gradients. The linearized model is studied and the effect of thermal relaxation, as described by the Maxwell-Cattaneo law of heat conduction is examined. To this end, a spectral numerical model is created based on a Galerkin expansion. The basis is the Cartesian product of systems of beam functions and trigonometric functions. The natural modes of the system are derived for both the Fourier and non-Fourier models. The results are compared to earlier works for the plain Fourier law. Our computations show that for the same set of parameters, the Maxwell-Cattaneo law yields modes which are quantitatively different from the Fourier. It is found that the real parts of the eigenvalues increase with the Straughan number Sg, which quantifies the non-Fourier effects. This confirms the destabilizing effect of the MC-law on the convective flow.
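
    For context (a standard textbook form, not an equation reproduced from the paper), the Maxwell-Cattaneo law replaces Fourier's law of heat conduction with a relaxational constitutive relation for the heat flux q, with τ the thermal relaxation time:

    ```latex
    % Fourier law versus Maxwell-Cattaneo law (standard forms)
    \mathbf{q} = -k\,\nabla T
    \qquad \text{vs.} \qquad
    \tau\,\frac{\partial \mathbf{q}}{\partial t} + \mathbf{q} = -k\,\nabla T
    ```

    Eliminating q from the energy balance turns the parabolic heat equation into a damped hyperbolic (telegraph-type) equation, which is the qualitative source of the differences from the Fourier modes reported above.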

  19. The Anatomy of High-Performance 2D Similarity Calculations

    PubMed Central

    Haque, Imran S.; Pande, Vijay S.

    2011-01-01

    Similarity measures based on the comparison of dense bit-vectors of two-dimensional chemical features are a dominant method in chemical informatics. For large-scale problems, including compound selection and machine learning, computing the intersection between two dense bit-vectors is the overwhelming bottleneck. We describe efficient implementations of this primitive, as well as example applications, using features of modern CPUs that allow 20-40x performance increases relative to typical code. Specifically, we describe fast methods for population count on modern x86 processors and cache-efficient matrix traversal and leader clustering algorithms that alleviate memory bandwidth bottlenecks in similarity matrix construction and clustering. The speed of our 2D comparison primitives is within a small factor of that obtained on GPUs, and does not require specialized hardware. PMID:21854053
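
    The primitive in question is a population-count-based Tanimoto comparison of fingerprint bit-vectors; the short NumPy sketch below shows the arithmetic only, with the paper's x86 popcount/SIMD and cache-blocking optimizations abstracted away and random fingerprints as placeholders.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n_bits = 1024
    fp_a = rng.integers(0, 2, n_bits, dtype=np.uint8)
    fp_b = rng.integers(0, 2, n_bits, dtype=np.uint8)

    def tanimoto(a, b):
        """Tanimoto similarity |a AND b| / |a OR b| from population counts."""
        both = np.count_nonzero(a & b)
        either = np.count_nonzero(a) + np.count_nonzero(b) - both
        return both / either

    print(round(tanimoto(fp_a, fp_b), 3))
    ```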

  20. Radiofrequency Spectroscopy and Thermodynamics of Fermi Gases in the 2D to Quasi-2D Dimensional Crossover

    NASA Astrophysics Data System (ADS)

    Cheng, Chingyun; Kangara, Jayampathi; Arakelyan, Ilya; Thomas, John

    2016-05-01

    We tune the dimensionality of a strongly interacting degenerate 6Li Fermi gas from 2D to quasi-2D by adjusting the radial confinement of pancake-shaped clouds to control the radial chemical potential. In the 2D regime with weak radial confinement, the measured pair binding energies are in agreement with 2D-BCS mean field theory, which predicts dimer pairing energies in the many-body regime. In the quasi-2D regime obtained with increased radial confinement, the measured pairing energy deviates significantly from 2D-BCS theory. In contrast to the pairing energy, the measured radii of the cloud profiles are not fit by 2D-BCS theory in either the 2D or quasi-2D regimes, but are fit in both regimes by a beyond-mean-field polaron model of the free energy. Supported by DOE, ARO, NSF, and AFOSR.

  1. Competing coexisting phases in 2D water

    PubMed Central

    Zanotti, Jean-Marc; Judeinstein, Patrick; Dalla-Bernardina, Simona; Creff, Gaëlle; Brubach, Jean-Blaise; Roy, Pascale; Bonetti, Marco; Ollivier, Jacques; Sakellariou, Dimitrios; Bellissent-Funel, Marie-Claire

    2016-01-01

    The properties of bulk water come from a delicate balance of interactions on length scales encompassing several orders of magnitudes: i) the Hydrogen Bond (HBond) at the molecular scale and ii) the extension of this HBond network up to the macroscopic level. Here, we address the physics of water when the three dimensional extension of the HBond network is frustrated, so that the water molecules are forced to organize in only two dimensions. We account for the large scale fluctuating HBond network by an analytical mean-field percolation model. This approach provides a coherent interpretation of the different events experimentally (calorimetry, neutron, NMR, near and far infra-red spectroscopies) detected in interfacial water at 160, 220 and 250 K. Starting from an amorphous state of water at low temperature, these transitions are respectively interpreted as the onset of creation of transient low density patches of 4-HBonded molecules at 160 K, the percolation of these domains at 220 K and finally the total invasion of the surface by them at 250 K. The source of this surprising behaviour in 2D is the frustration of the natural bulk tetrahedral local geometry and the underlying very significant increase in entropy of the interfacial water molecules. PMID:27185018

  2. 2D Radiative Processes Near Cloud Edges

    NASA Technical Reports Server (NTRS)

    Varnai, T.

    2012-01-01

    Because of the importance and complexity of dynamical, microphysical, and radiative processes taking place near cloud edges, the transition zone between clouds and cloud free air has been the subject of intense research both in the ASR program and in the wider community. One challenge in this research is that the one-dimensional (1D) radiative models widely used in both remote sensing and dynamical simulations become less accurate near cloud edges: The large horizontal gradients in particle concentrations imply that accurate radiative calculations need to consider multi-dimensional radiative interactions among areas that have widely different optical properties. This study examines the way the importance of multidimensional shortwave radiative interactions changes as we approach cloud edges. For this, the study relies on radiative simulations performed for a multiyear dataset of clouds observed over the NSA, SGP, and TWP sites. This dataset is based on Microbase cloud profiles as well as wind measurements and ARM cloud classification products. The study analyzes the way the difference between 1D and 2D simulation results increases near cloud edges. It considers both monochromatic radiances and broadband radiative heating, and it also examines the influence of factors such as cloud type and height, and solar elevation. The results provide insights into the workings of radiative processes and may help better interpret radiance measurements and better estimate the radiative impacts of this critical region.

  3. Phase Engineering of 2D Tin Sulfides.

    PubMed

    Mutlu, Zafer; Wu, Ryan J; Wickramaratne, Darshana; Shahrezaei, Sina; Liu, Chueh; Temiz, Selcuk; Patalano, Andrew; Ozkan, Mihrimah; Lake, Roger K; Mkhoyan, K A; Ozkan, Cengiz S

    2016-06-01

    Tin sulfides can exist in a variety of phases and polytypes due to the different oxidation states of Sn. A subset of these phases and polytypes take the form of layered 2D structures that give rise to a wide host of electronic and optical properties. Hence, achieving control over the phase, polytype, and thickness of tin sulfides is necessary to utilize this wide range of properties exhibited by the compound. This study reports on phase-selective growth of both hexagonal tin (IV) sulfide SnS2 and orthorhombic tin (II) sulfide SnS crystals with diameters of over tens of microns on SiO2 substrates through atmospheric pressure vapor-phase method in a conventional horizontal quartz tube furnace with SnO2 and S powders as the source materials. Detailed characterization of each phase of tin sulfide crystals is performed using various microscopy and spectroscopy methods, and the results are corroborated by ab initio density functional theory calculations. PMID:27099950

  4. Ion Transport in 2-D Graphene Nanochannels

    NASA Astrophysics Data System (ADS)

    Xie, Quan; Foo, Elbert; Duan, Chuanhua

    2015-11-01

    Graphene membranes have recently attracted wide attention due to their great potential in water desalination and selective molecular sieving. Further developments of these membranes, including enhancing their mass transport rate and/or molecular selectivity, rely on an understanding of the fundamental transport mechanisms through graphene membranes, which has not been studied experimentally before due to fabrication and measurement difficulties. Herein we report the fabrication of the basic constituent of graphene membranes, i.e. 2-D single graphene nanochannels (GNCs), and the study of ion transport in these channels. A modified bonding technique was developed to form GNCs with well-defined geometry and uniform channel height. Ion transport in such GNCs was studied using DC conductance measurement. Our preliminary results showed that ion transport in GNCs is still governed by surface charge at low concentrations (10^-6 M to 10^-4 M). However, GNCs exhibit much higher ionic conductances than silica nanochannels with the same geometries in the surface-charge-governed regime. This conductance enhancement can be attributed to the pre-accumulation of charges on graphene surfaces. The work is supported by the Faculty Startup Fund (Boston University, USA).

  5. 2D Turbulence with Complicated Boundaries

    NASA Astrophysics Data System (ADS)

    Roullet, G.; McWilliams, J. C.

    2014-12-01

    We examine the consequences of lateral viscous boundary layers on the 2D turbulence that arises in domains with complicated boundaries (headlands, bays etc). The study is carried out numerically with LES. The numerics are carefully designed to ensure all global conservation laws, proper boundary conditions and a minimal range of dissipation scales. The turbulence dramatically differs from the classical bi-periodic case. Boundary layer separations lead to creation of many small vortices and act as a continuing energy source exciting the inverse cascade of energy throughout the domain. The detachments are very intermittent in time. In free decay, the final state depends on the effective numerical resolution: laminar with a single dominant vortex for low Re and turbulent with many vortices for large enough Re. After very long time, the turbulent end-state exhibits a striking tendency for the emergence of shielded vortices which then interact almost elastically. In the forced case, the boundary layers allow the turbulence to reach a statistical steady state without any artificial hypo-viscosity or other large-scale dissipation. Implications are discussed for the oceanic mesoscale and submesoscale turbulence.

  6. Competing coexisting phases in 2D water

    NASA Astrophysics Data System (ADS)

    Zanotti, Jean-Marc; Judeinstein, Patrick; Dalla-Bernardina, Simona; Creff, Gaëlle; Brubach, Jean-Blaise; Roy, Pascale; Bonetti, Marco; Ollivier, Jacques; Sakellariou, Dimitrios; Bellissent-Funel, Marie-Claire

    2016-05-01

    The properties of bulk water come from a delicate balance of interactions on length scales encompassing several orders of magnitudes: i) the Hydrogen Bond (HBond) at the molecular scale and ii) the extension of this HBond network up to the macroscopic level. Here, we address the physics of water when the three dimensional extension of the HBond network is frustrated, so that the water molecules are forced to organize in only two dimensions. We account for the large scale fluctuating HBond network by an analytical mean-field percolation model. This approach provides a coherent interpretation of the different events experimentally (calorimetry, neutron, NMR, near and far infra-red spectroscopies) detected in interfacial water at 160, 220 and 250 K. Starting from an amorphous state of water at low temperature, these transitions are respectively interpreted as the onset of creation of transient low density patches of 4-HBonded molecules at 160 K, the percolation of these domains at 220 K and finally the total invasion of the surface by them at 250 K. The source of this surprising behaviour in 2D is the frustration of the natural bulk tetrahedral local geometry and the underlying very significant increase in entropy of the interfacial water molecules.

  7. Competing coexisting phases in 2D water.

    PubMed

    Zanotti, Jean-Marc; Judeinstein, Patrick; Dalla-Bernardina, Simona; Creff, Gaëlle; Brubach, Jean-Blaise; Roy, Pascale; Bonetti, Marco; Ollivier, Jacques; Sakellariou, Dimitrios; Bellissent-Funel, Marie-Claire

    2016-01-01

    The properties of bulk water come from a delicate balance of interactions on length scales encompassing several orders of magnitude: i) the Hydrogen Bond (HBond) at the molecular scale and ii) the extension of this HBond network up to the macroscopic level. Here, we address the physics of water when the three-dimensional extension of the HBond network is frustrated, so that the water molecules are forced to organize in only two dimensions. We account for the large scale fluctuating HBond network by an analytical mean-field percolation model. This approach provides a coherent interpretation of the different events detected experimentally (calorimetry, neutron, NMR, near- and far-infrared spectroscopies) in interfacial water at 160, 220 and 250 K. Starting from an amorphous state of water at low temperature, these transitions are respectively interpreted as the onset of creation of transient low density patches of 4-HBonded molecules at 160 K, the percolation of these domains at 220 K and finally the total invasion of the surface by them at 250 K. The source of this surprising behaviour in 2D is the frustration of the natural bulk tetrahedral local geometry and the underlying very significant increase in entropy of the interfacial water molecules. PMID:27185018

  8. 2-D wavelet with position controlled resolution

    NASA Astrophysics Data System (ADS)

    Walczak, Andrzej; Puzio, Leszek

    2005-09-01

    Wavelet transformation localizes all irregularities in the scene. It is most effective when the intensities in the scene have no sharp details, a situation often encountered in medical imaging. To identify a shape, one has to extract it from the scene as a typical irregularity. When the scene does not contain sharp changes, common differential filters are not an efficient tool for shape extraction. A new 2-D wavelet for this task is proposed. The described wavelet transform is axially symmetric, with a scale that varies with the distance from the centre of wavelet symmetry. The analytical form of the wavelet is presented, together with its application to the extraction of details in the scene. The most important feature of the wavelet transform is that it gives a multi-scale transformation, and under zooming the wavelet selectivity varies proportionally to the zoom step. As a result, the extracted shape does not change during the zoom operation. Moreover, the wavelet selectivity can be fitted to the local intensity gradient to obtain the best extraction of the irregularities.

  9. Constrained inversion of 2D magnetotelluric data with anisotropic conductivities

    NASA Astrophysics Data System (ADS)

    Chen, X.; Weckmann, U.

    2011-12-01

    Within the framework of the German-South African geo-scientific research initiative Inkaba yeAfrica, a series of magnetotelluric (MT) field experiments were conducted along the Agulhas-Karoo Transect in South Africa. This transect crosses several continental collision zones between the Cape Fold Belt, the Namaqua Natal Mobile Belt and the Kaapvaal Craton. Along the Cape Fold Belt (CFB) profile we can identify areas (>10 km) where MT sites exhibit phases over 90°. This phenomenon usually occurs in the presence of electrical anisotropy. Due to the dense site spacing we are able to observe this behaviour consistently at several sites. The anisotropy of electrical conductivity is essentially a scale effect: even if the conductivity is isotropic on the micro scale, it will become anisotropic on a larger scale if, in the averaging volume, a preferred orientation (e.g., layering or lamination) exists. Therefore, it is necessary to understand the electrical anisotropy in more detail; furthermore, electrical anisotropy offers new degrees of freedom, which should allow a better interpretation of the data. In the 2D MT case with electrical anisotropy, computing the impedance tensor requires two independent electric field solutions computed for two different source polarisations. Based on the forward problem formulation and its numerical approximation, we derive partial differential equations for the sensitivities of the magnetotelluric fields with respect to the elements of the conductivity tensor within the medium. For illustration, a sensitivity study for a simple synthetic model is shown. We present an algorithm for the inversion of 2D magnetotelluric data with anisotropic conductivities, which is an extension of the well-known NLCG minimization algorithm to anisotropic models. To constrain the structure complexity, a penalty function consisting of data misfit, standard model roughness and the quadratic variation of the conductivity tensor elements is minimized. To demonstrate the
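
    As a concrete (and deliberately simplified) illustration of the penalized objective described above, the sketch below minimizes a data misfit plus a roughness term plus a quadratic-variation term with a conjugate-gradient routine. The forward operator F, the stand-in variation operator Q and all dimensions are hypothetical; a real anisotropic MT inversion would replace F with the 2D forward solver and use the paper's NLCG specifics.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)
    n = 30                                   # model parameters (e.g. log-conductivities)
    F = rng.standard_normal((20, n))         # hypothetical linear forward operator
    d_obs = F @ np.linspace(1.0, 2.0, n) + 0.01 * rng.standard_normal(20)

    R = np.diff(np.eye(n), axis=0)           # first-difference model roughness
    Q = np.diff(np.eye(n), 2, axis=0)        # stand-in "quadratic variation" operator
    alpha, beta = 1.0, 0.1                   # regularization weights

    def phi(m):                              # penalty: misfit + roughness + variation
        r = F @ m - d_obs
        return r @ r + alpha * np.sum((R @ m) ** 2) + beta * np.sum((Q @ m) ** 2)

    def grad(m):
        return 2 * (F.T @ (F @ m - d_obs) + alpha * R.T @ (R @ m) + beta * Q.T @ (Q @ m))

    res = minimize(phi, np.zeros(n), jac=grad, method="CG")   # NLCG-style minimization
    print("converged:", res.success, " objective:", phi(res.x))
    ```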

  10. Ab initio modeling of 2D layered organohalide lead perovskites.

    PubMed

    Fraccarollo, Alberto; Cantatore, Valentina; Boschetto, Gabriele; Marchese, Leonardo; Cossi, Maurizio

    2016-04-28

    A number of 2D layered perovskites A2PbI4 and BPbI4, with A and B mono- and divalent ammonium and imidazolium cations, have been modeled with different theoretical methods. The periodic structures have been optimized (both in monoclinic and in triclinic systems, corresponding to eclipsed and staggered arrangements of the inorganic layers) at the DFT level, with hybrid functionals, Gaussian-type orbitals and dispersion energy corrections. With the same methods, the various contributions to the solid stabilization energy have been discussed, separating electrostatic and dispersion energies, organic-organic intralayer interactions and H-bonding effects, when applicable. Then the electronic band gaps have been computed with plane waves, at the DFT level with scalar and full relativistic potentials, and including the correlation energy through the GW approximation. Spin orbit coupling and GW effects have been combined in an additive scheme, validated by comparing the computed gap with well known experimental and theoretical results for a model system. Finally, various contributions to the computed band gaps have been discussed on some of the studied systems, by varying some geometrical parameters and by substituting one cation in another's place. PMID:27131557

  11. Ab initio modeling of 2D layered organohalide lead perovskites

    NASA Astrophysics Data System (ADS)

    Fraccarollo, Alberto; Cantatore, Valentina; Boschetto, Gabriele; Marchese, Leonardo; Cossi, Maurizio

    2016-04-01

    A number of 2D layered perovskites A2PbI4 and BPbI4, with A and B mono- and divalent ammonium and imidazolium cations, have been modeled with different theoretical methods. The periodic structures have been optimized (both in monoclinic and in triclinic systems, corresponding to eclipsed and staggered arrangements of the inorganic layers) at the DFT level, with hybrid functionals, Gaussian-type orbitals and dispersion energy corrections. With the same methods, the various contributions to the solid stabilization energy have been discussed, separating electrostatic and dispersion energies, organic-organic intralayer interactions and H-bonding effects, when applicable. Then the electronic band gaps have been computed with plane waves, at the DFT level with scalar and full relativistic potentials, and including the correlation energy through the GW approximation. Spin orbit coupling and GW effects have been combined in an additive scheme, validated by comparing the computed gap with well known experimental and theoretical results for a model system. Finally, various contributions to the computed band gaps have been discussed on some of the studied systems, by varying some geometrical parameters and by substituting one cation in another's place.

  12. 2D Wavefront Sensor Analysis and Control

    1996-02-19

    This software is designed for data acquisition and analysis of two dimensional wavefront sensors. The software includes data acquisition and control functions for an EPIX frame grabber to acquire data from a computer and all the appropriate analysis functions necessary to produce and display intensity and phase information. This software is written in Visual Basic for windows.

  13. ELRIS2D: A MATLAB Package for the 2D Inversion of DC Resistivity/IP Data

    NASA Astrophysics Data System (ADS)

    Akca, Irfan

    2016-04-01

    ELRIS2D is an open source code written in MATLAB for the two-dimensional inversion of direct current resistivity (DCR) and time domain induced polarization (IP) data. The user interface of the program is designed for functionality and ease of use. All available settings of the program can be reached from the main window. The subsurface is discretized using a hybrid mesh generated by the combination of structured and unstructured meshes, which reduces the computational cost of the whole inversion procedure. The inversion routine is based on the smoothness constrained least squares method. In order to verify the program, responses of two test models and field data sets were inverted. The models inverted from the synthetic data sets are consistent with the original test models in both DC resistivity and IP cases. A field data set acquired in an archaeological site is also used for the verification of outcomes of the program in comparison with the excavation results.
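
    For readers unfamiliar with the method, the sketch below shows one Gauss-Newton step of a generic smoothness-constrained least-squares inversion of the kind the abstract refers to. It uses a dense linear toy problem and a first-difference roughness matrix; the real ELRIS2D code works with the DCR/IP forward responses and a hybrid mesh, so everything here is an illustrative assumption.

    ```python
    import numpy as np

    def smoothness_constrained_step(J, dd, m, lam):
        """One update dm = (J^T J + lam C^T C)^-1 (J^T dd - lam C^T C m)."""
        C = np.diff(np.eye(m.size), axis=0)          # roughness (first differences)
        A = J.T @ J + lam * C.T @ C
        b = J.T @ dd - lam * C.T @ (C @ m)
        return np.linalg.solve(A, b)

    rng = np.random.default_rng(1)
    J = rng.standard_normal((40, 25))                # toy Jacobian of a linear forward model
    m_true = np.sin(np.linspace(0, np.pi, 25))
    d_obs = J @ m_true

    m = np.zeros(25)
    for _ in range(5):                               # iterate the constrained update
        m = m + smoothness_constrained_step(J, d_obs - J @ m, m, lam=0.1)
    print("rms model error:", np.sqrt(np.mean((m - m_true) ** 2)))
    ```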

  14. Electron dynamics and valley relaxation in 2D semiconductors

    NASA Astrophysics Data System (ADS)

    Gundogdu, Kenan

    2015-03-01

    Single layer transition metal dichalcogenides are 2D semiconducting systems with a unique electronic band structure. Two-valley energy bands along with strong spin-orbit coupling lead to valley-dependent carrier spin polarization, which is the basis for recently proposed valleytronic applications. Since the duration of the valley polarization provides the time window in which valley-specific processes take place, it is an essential parameter for developing valleytronic devices. These systems also exhibit unusually strong many-body effects, such as strong exciton and trion binding, due to reduced dielectric screening of Coulomb interactions. But little is known about the impact of strong many-particle correlations on spin and valley polarization dynamics. Here we report direct measurements of ultrafast valley-specific relaxation dynamics in single-layer MoS2 and WS2. We found that excitonic many-body interactions significantly contribute to the relaxation process. Biexciton formation reveals the hole valley spin relaxation time. Our results also suggest that initial fast intervalley electron scattering and electron spin relaxation lead to loss of electron valley polarization, which then facilitates hole valley relaxation via the excitonic spin exchange interaction.

  15. MAZE96. Generates 2D Input for DYNA NIKE & TOPAZ

    SciTech Connect

    Sanford, L.; Hallquist, J.O.

    1992-02-24

    MAZE is an interactive program that serves as an input and two-dimensional mesh generator for DYNA2D, NIKE2D, TOPAZ2D, and CHEMICAL TOPAZ2D. MAZE also generates a basic template for ISLAND input. MAZE has been applied to the generation of input data to study the response of two-dimensional solids and structures undergoing finite deformations under a wide variety of large deformation transient dynamic and static problems and heat transfer analyses.

  16. On 2D graphical representation of DNA sequence of nondegeneracy

    NASA Astrophysics Data System (ADS)

    Zhang, Yusen; Liao, Bo; Ding, Kequan

    2005-08-01

    Some two-dimensional (2D) graphical representations of DNA sequences have been given by Gates, Nandy, Leong and Mogenthaler, Randić, and Liao et al., which give visual characterizations of DNA sequences. In this Letter, we introduce a nondegenerate 2D graphical representation of DNA sequences, which is different from Randić's novel 2D representation and Liao's 2D representation. We also present the nondegenerate forms corresponding to the representations of Gates, Nandy, Leong and Mogenthaler.

  17. Generates 2D Input for DYNA NIKE & TOPAZ

    1996-07-15

    MAZE is an interactive program that serves as an input and two-dimensional mesh generator for DYNA2D, NIKE2D, TOPAZ2D, and CHEMICAL TOPAZ2D. MAZE also generates a basic template for ISLAND input. MAZE has been applied to the generation of input data to study the response of two-dimensional solids and structures undergoing finite deformations under a wide variety of large deformation transient dynamic and static problems and heat transfer analyses.

  18. 2d PDE Linear Symmetric Matrix Solver

    1983-10-01

    ICCG2 (Incomplete Cholesky factorized Conjugate Gradient algorithm for 2d symmetric problems) was developed to solve a linear symmetric matrix system arising from a 9-point discretization of two-dimensional elliptic and parabolic partial differential equations found in plasma physics applications, such as resistive MHD, spatial diffusive transport, and phase space transport (Fokker-Planck equation) problems. These problems share the common feature of being stiff and requiring implicit solution techniques. When these parabolic or elliptic PDEs are discretized with finite-difference or finite-element methods, the resulting matrix system is frequently of block-tridiagonal form. To use ICCG2, the discretization of the two-dimensional partial differential equation and its boundary conditions must result in a block-tridiagonal supermatrix composed of elementary tridiagonal matrices. The incomplete Cholesky conjugate gradient algorithm is used to solve the linear symmetric matrix equation. Loops are arranged to vectorize on the Cray1 with the CFT compiler, wherever possible. Recursive loops, which cannot be vectorized, are written for optimum scalar speed. For matrices lacking symmetry, ILUCG2 should be used. Similar methods in three dimensions are available in ICCG3 and ILUCG3. A general source containing extensions and macros, which must be processed by a pre-compiler to obtain the standard FORTRAN source, is provided along with the standard FORTRAN source because it is believed to be more readable. The pre-compiler is not included, but pre-compilation may be performed by a text editor as described in the UCRL-88746 Preprint.
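
    The core of the method is an ordinary preconditioned conjugate-gradient loop in which the preconditioner solve stands in for the incomplete Cholesky factorization. The sketch below is not the ICCG2 FORTRAN source; it is a minimal, generic PCG in Python with a Jacobi (diagonal) preconditioner and a 1-D Laplacian standing in for the block-tridiagonal 9-point system.

    ```python
    import numpy as np

    def pcg(A, b, M_solve, tol=1e-10, maxiter=500):
        """Preconditioned conjugate gradients; M_solve(r) applies the preconditioner."""
        x = np.zeros_like(b)
        r = b - A @ x
        z = M_solve(r)
        p = z.copy()
        rz = r @ z
        for _ in range(maxiter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol * np.linalg.norm(b):
                break
            z = M_solve(r)
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    n = 100
    A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # SPD stand-in matrix
    b = np.ones(n)
    x = pcg(A, b, M_solve=lambda r: r / np.diag(A))        # Jacobi preconditioner
    print("residual norm:", np.linalg.norm(A @ x - b))
    ```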

  19. 2d PDE Linear Asymmetric Matrix Solver

    1983-10-01

    ILUCG2 (Incomplete LU factorized Conjugate Gradient algorithm for 2d problems) was developed to solve a linear asymmetric matrix system arising from a 9-point discretization of two-dimensional elliptic and parabolic partial differential equations found in plasma physics applications, such as plasma diffusion, equilibria, and phase space transport (Fokker-Planck equation) problems. These equations share the common feature of being stiff and requiring implicit solution techniques. When these parabolic or elliptic PDEs are discretized with finite-difference or finite-element methods, the resulting matrix system is frequently of block-tridiagonal form. To use ILUCG2, the discretization of the two-dimensional partial differential equation and its boundary conditions must result in a block-tridiagonal supermatrix composed of elementary tridiagonal matrices. A generalization of the incomplete Cholesky conjugate gradient algorithm is used to solve the matrix equation. Loops are arranged to vectorize on the Cray1 with the CFT compiler, wherever possible. Recursive loops, which cannot be vectorized, are written for optimum scalar speed. For problems having a symmetric matrix ICCG2 should be used since it runs up to four times faster and uses approximately 30% less storage. Similar methods in three dimensions are available in ICCG3 and ILUCG3. A general source, containing extensions and macros, which must be processed by a pre-compiler to obtain the standard FORTRAN source, is provided along with the standard FORTRAN source because it is believed to be more readable. The pre-compiler is not included, but pre-compilation may be performed by a text editor as described in the UCRL-88746 Preprint.
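
    The same strategy for the nonsymmetric case, incomplete-LU preconditioning of a Krylov solver, can be reproduced concisely with SciPy's sparse tools; the sketch below is only an analogy to what the abstract describes (the test matrix is an arbitrary nonsymmetric tridiagonal stand-in, not a 9-point plasma-physics discretization).

    ```python
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 200
    # Nonsymmetric tridiagonal stand-in (e.g. diffusion plus advection).
    A = sp.diags([-1.0, 2.5, -1.2], [-1, 0, 1], shape=(n, n), format="csc")
    b = np.ones(n)

    ilu = spla.spilu(A, drop_tol=1e-4)                    # incomplete LU factors
    M = spla.LinearOperator((n, n), matvec=ilu.solve)     # preconditioner M^-1
    x, info = spla.bicgstab(A, b, M=M)                    # Krylov solve

    print("converged" if info == 0 else f"info={info}",
          "; residual norm:", np.linalg.norm(A @ x - b))
    ```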

  20. Ultrasonic 2D matrix PVDF transducer

    NASA Astrophysics Data System (ADS)

    Ptchelintsev, A.; Maev, R. Gr.

    2000-05-01

    During the past decade a substantial amount of work has been done in the area of ultrasonic imaging technology using 2D arrays. The main problems arising for two-dimensional matrix transducers at megahertz frequencies are the small size and huge count of the elements, high electrical impedance, low sensitivity, poor SNR and slower data acquisition rate. The major technological difficulty remains the high density of the interconnect. To solve these problems numerous approaches have been suggested. In the present work, a 24×24-element (24 transmit + 24 receive) matrix and a switching board were developed. The transducer consists of two 52 μm PVDF layers, each representing a linear array of 24 elements, placed one on top of the other. The electrodes in these two layers are perpendicular and form a grid of 0.5×0.5 mm pitch. The layers are bonded together, with the ground electrode being monolithic and located between the layers. The matrix is backed from the rear surface with an epoxy composition. During emission, a linear element from the emitting layer generates a longitudinal wave pulse propagating inside the test object. Reflected pulses are picked up by the receiving layer. During one transmit-receive cycle one transmit element and one receive element are selected by the corresponding multiplexers. These crossed elements emulate a small element formed by their intersection. The present design offers the following advantages: it minimizes the number of active channels and the density of the interconnect; reduces the electrical impedance of the element, improving electrical matching; enables the transmit-receive mode; provides bandwidth and good time resolution due to the efficient backing; and significantly reduces the electronics complexity. The matrix cannot be used for beam steering and focusing. Owing to this impossibility of focusing, the penetration depth is also limited by diffraction phenomena.

  1. Measurement of astrophysical S factors and electron screening potentials for d(d, n)³He reaction in ZrD₂, TiD₂, D₂O, and CD₂ targets in the ultralow energy region using plasma accelerators

    SciTech Connect

    Bystritsky, V. M.; Bystritskii, Vit. M.; Dudkin, G. N.; Filipowicz, M.; Gazi, S.; Huran, J.; Kobzev, A. P.; Mesyats, G. A.; Nechaev, B. A.; Padalko, V. N.; Parzhitskii, S. S.; Pen'kov, F. M.; Philippov, A. V.; Kaminskii, V. L.; Tuleushev, Yu. Zh.; Wozniak, J.

    2012-01-15

    The paper is devoted to studying the influence of the electron screening effect on the rate of the d(d, n)³He reaction in the ultralow deuteron collision energy range in deuterated polyethylene (CD₂), frozen heavy water (D₂O) and deuterated metals (ZrD₂ and TiD₂). The ZrD₂ and TiD₂ targets were fabricated via magnetron sputtering of titanium and zirconium in a deuterium gas environment. The experiments were carried out using a high-current pulsed plasma accelerator with formation of an inverse Z pinch (HCEIRAS, Russia) and a pulsed Hall plasma accelerator (NPI at TPU, Russia). The detection of 2.5 MeV neutrons from the dd reaction was done with plastic scintillation spectrometers. As a result of the experiments, the energy dependences of the astrophysical S factor for the dd reaction in the deuteron collision energy range of 2-7 keV and the values of the electron screening potential U_e of the interacting deuterons have been measured for the targets indicated above: U_e(CD₂) ≤ 40 eV; U_e(D₂O) ≤ 26 eV; U_e(ZrD₂) = 157 ± 43 eV; U_e(TiD₂) = 125 ± 34 eV. The value of the astrophysical S factor at zero deuteron collision energy is found in the experiments with the D₂O target: S_b(0) = 58.6 ± 3.6 keV b. The paper compares our results with other available published experimental and calculated data.
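
    For context, the quantities reported above are related by the standard textbook definitions sketched below (stated here as background, not quoted from the paper): the bare cross section is factored into the astrophysical S factor and the Gamow penetration factor, and screening by target electrons enhances the measured cross section at ultralow energies by an amount controlled by U_e.

    ```latex
    % Standard definitions (background sketch): Sommerfeld parameter eta, bare
    % cross section, and electron-screening enhancement factor.
    \begin{align}
      \sigma_{\mathrm{bare}}(E) &= \frac{S(E)}{E}\,\exp\bigl(-2\pi\eta(E)\bigr),
      \qquad \eta(E) = \frac{Z_1 Z_2 e^{2}}{\hbar v},\\[2pt]
      f_{\mathrm{scr}}(E) &= \frac{\sigma_{\mathrm{scr}}(E)}{\sigma_{\mathrm{bare}}(E)}
      \approx \exp\!\left(\pi\,\eta(E)\,\frac{U_e}{E}\right).
    \end{align}
    ```

    In this picture, the enhancement measured at 2-7 keV collision energies is what constrains U_e for each target material.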

  2. A Planar Quantum Transistor Based on 2D-2D Tunneling in Double Quantum Well Heterostructures

    SciTech Connect

    Baca, W.E.; Blount, M.A.; Hafich, M.J.; Lyo, S.K.; Moon, J.S.; Reno, J.L.; Simmons, J.A.; Wendt, J.R.

    1998-12-14

    We report on our work on the double electron layer tunneling transistor (DELTT), based on the gate control of two-dimensional -- two-dimensional (2D-2D) tunneling in a double quantum well heterostructure. While previous quantum transistors have typically required tiny laterally-defined features, by contrast the DELTT is entirely planar and can be reliably fabricated in large numbers. We use a novel epoxy-bond-and-stop-etch (EBASE) flip-chip process, whereby submicron gating on opposite sides of semiconductor epitaxial layers as thin as 0.24 microns can be achieved. Because both electron layers in the DELTT are 2D, the resonant tunneling features are unusually sharp, and can be easily modulated with one or more surface gates. We demonstrate DELTTs with peak-to-valley ratios in the source-drain I-V curve of order 20:1 below 1 K. Both the height and position of the resonant current peak can be controlled by gate voltage over a wide range. DELTTs with larger subband energy offsets (approximately 21 meV) exhibit characteristics that are nearly as good at 77 K, in good agreement with our theoretical calculations. Using these devices, we also demonstrate bistable memories operating at 77 K. Finally, we briefly discuss the prospects for room temperature operation, increases in gain, and high-speed operation.

  3. Computational screening for new inhibitors of M. tuberculosis mycolyltransferases antigen 85 group of proteins as potential drug targets.

    PubMed

    Gahoi, Shachi; Mandal, Rahul Shubhra; Ivanisenko, Nikita; Shrivastava, Priyanka; Jain, Sriyans; Singh, Ashish Kumar; Raghunandanan, Muthukurrusi Varieth; Kanchan, Swarna; Taneja, Bhupesh; Mandal, Chhabinath; Ivanisenko, Vladimir A; Kumar, Anil; Kumar, Rita; Open Source Drug Discovery Consortium; Ramachandran, Srinivasan

    2013-01-01

    The group of antigen 85 proteins of Mycobacterium tuberculosis is responsible for converting trehalose monomycolate to trehalose dimycolate, which contributes to cell wall stability. Here, we have used a serial enrichment approach to identify new potential inhibitors by searching libraries of compounds using both 2D atom pair descriptors and binary fingerprints, followed by molecular docking. Three different docking software packages, AutoDock, GOLD, and LigandFit, were used for the docking calculations. In addition, we applied the criteria of selecting compounds with binding efficiency close to the starting known inhibitor and showing potential to form hydrogen bonds with the active site amino acid residues. The starting inhibitor was ethyl-3-phenoxybenzyl-butylphosphonate, which had an IC(50) value of 2.0 μM in a mycolyltransferase inhibition assay. Our search of more than 34 million compounds from public libraries yielded 49 compounds. Subsequently, the selection was restricted to compounds conforming to the Lipinski rule of five and exhibiting hydrogen bonding to any of the amino acid residues in the active site pocket of all three proteins, antigen 85A, 85B, and 85C. Finally, we selected those ligands that were ranked at the top of the table together with other known decoys in all the docking results. The compound NIH415032 from the Tuberculosis Antimicrobial Acquisition and Coordinating Facility was further examined using molecular dynamics simulations for 10 ns. These results showed that the binding is stable, although some of the hydrogen bond atom pairs varied through the course of the simulation. NIH415032 has antitubercular properties with an IC(90) at 20 μg/ml (53.023 μM). These results will be helpful to medicinal chemists for developing new antitubercular molecules for testing. PMID:22804492
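
    One of the filtering steps mentioned above, the Lipinski rule of five, is easy to reproduce; the sketch below is a hedged illustration using RDKit with hypothetical SMILES inputs, and is not the authors' screening pipeline.

    ```python
    from rdkit import Chem
    from rdkit.Chem import Descriptors, Lipinski

    def passes_rule_of_five(smiles):
        """Lipinski rule of five: MW <= 500, logP <= 5, <= 5 H-bond donors, <= 10 acceptors."""
        mol = Chem.MolFromSmiles(smiles)
        if mol is None:
            return False
        return (Descriptors.MolWt(mol) <= 500
                and Descriptors.MolLogP(mol) <= 5
                and Lipinski.NumHDonors(mol) <= 5
                and Lipinski.NumHAcceptors(mol) <= 10)

    # Illustrative inputs only (not compounds from the study).
    candidates = {
        "aspirin": "CC(=O)Oc1ccccc1C(=O)O",
        "long-chain lipid-like ester": "CCCCCCCCCCCCCCCCCCCCCCCC(=O)OCC(O)CO",
    }
    for name, smi in candidates.items():
        print(name, "->", "keep" if passes_rule_of_five(smi) else "reject")
    ```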

  4. A computational perspective of molecular interactions through virtual screening, pharmacokinetic and dynamic prediction on ribosome toxin A chain and inhibitors of Ricinus communis

    PubMed Central

    Kumar, R. Barani; Suresh, M. Xavier

    2012-01-01

    Background: Ricin is considered to be one of the deadliest toxins and has gained favor as a bioweapon with a serious social and biological impact, owing to its widespread nature and abundant availability. The hazardous effects of this toxin in human beings are seen in almost all parts of the organ system. The severe consequences of the toxin necessitate the development of potential inhibitors that can effectively block its interaction with the host system. Materials and Methods: In order to identify potential inhibitors that can effectively block ricin, we employed various computational approaches. In this work, we computationally screened and analyzed 66 analogs and further tested their ADME/T profiles. From the kinetic and toxicity studies we selected six analogs that possessed appropriate pharmacokinetic and dynamic properties. We also performed computational docking of these analogs with the target. Results: On the basis of the dock scores and hydrogen bond interactions, we identified analog 64 as the best interacting molecule. Molecule 64 appears to have stable interactions with the residues Tyr80, Arg180, and Val81. The pharmacophore, which describes the key functional features of a molecule, was also studied and presented. Conclusion: The pharmacophore features of the drugs provided suggest the key functional groups that can aid in the design and synthesis of more potential inhibitors. PMID:22224054

  5. Calculating tissue shear modulus and pressure by 2D Log-Elastographic methods

    PubMed Central

    McLaughlin, Joyce R; Zhang, Ning; Manduca, Armando

    2010-01-01

    Shear modulus imaging, often called elastography, enables detection and characterization of tissue abnormalities. In this paper the data are two displacement components obtained from successive MR or ultrasound data sets acquired while the tissue is excited mechanically. A 2D plane strain elastic model is assumed to govern the 2D displacement, u. The shear modulus, μ, is unknown, and whether or not the first Lamé parameter, λ, is known, the pressure p = λ∇ · u that appears in the plane strain model cannot be measured, is unreliably computed from measured data, and can be shown to be an order-one quantity in units of kPa. So here we present a 2D Log-Elastographic inverse algorithm that: (1) simultaneously reconstructs the shear modulus, μ, and p, which together satisfy a first-order partial differential equation system, with the goal of imaging μ; (2) controls potential exponential growth in the numerical error; and (3) reliably reconstructs the quantity p in the inverse algorithm as compared to the same quantity computed with a forward algorithm. This work generalizes the Log-Elastographic algorithm in [20], which uses one displacement component, is derived assuming the component satisfies the wave equation, and is tested on synthetic data computed with the wave equation model. The 2D Log-Elastographic algorithm is tested on 2D synthetic data and 2D in-vivo data from Mayo Clinic. We also exhibit examples to show that the 2D Log-Elastographic algorithm improves the quality of the recovered images as compared to the Log-Elastographic and Direct Inversion algorithms. PMID:21822349
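
    For orientation, a standard time-harmonic plane-strain form of the model referred to above can be written as follows; this is a textbook statement given as background, not an equation quoted from the paper.

    ```latex
    % Time-harmonic plane-strain elasticity with displacement u, shear modulus mu,
    % density rho, angular frequency omega, and pressure p = lambda * div(u):
    \begin{equation}
      \nabla\cdot\bigl[\mu\,(\nabla\mathbf{u} + \nabla\mathbf{u}^{T})\bigr]
      + \nabla p = -\rho\,\omega^{2}\mathbf{u},
      \qquad p = \lambda\,\nabla\cdot\mathbf{u}.
    \end{equation}
    ```

    Once the two components of u are measured, this becomes a first-order system in the unknowns μ and p, which is what the 2D Log-Elastographic algorithm reconstructs simultaneously.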

  6. A search for mosquito larvicidal compounds by blocking the sterol carrying protein, AeSCP-2, through computational screening and docking strategies

    PubMed Central

    Kumar, R. Barani; Shanmugapriya, B.; Thiyagesan, K.; Kumar, S. Raj; Xavier, Suresh M.

    2010-01-01

    Background: Sterol is a vital compound for most insects and mosquitoes to complete their life cycle. Unfortunately, mosquitoes cannot synthesize sterol themselves and depend on mammals for it. Mosquitoes take up sterol from decaying plants during their larval stage in the form of phytosterol, which is then converted to cholesterol for further growth and reproduction. This conversion occurs with the help of sterol carrier protein 2 (SCP2). Methods: Mosquito populations are controlled by plant-based inhibitors, which inhibit sterol carrier protein (SCPI, sterol carrier protein inhibitor) activity. In this article, we explain the methods of inhibiting Aedes aegypti SCP2 by in silico methods, including natural inhibitor selection and filtration by virtual screening and interaction studies. Results: In this study, protein-ligand interactions were evaluated with various phytochemicals; as a result of virtual screening, Alpha-mangostin and Panthenol were found to be good analogs and were docked with the mosquito cholesterol carrier protein AeSCP-2. Conclusion: Computational selection of SCPIs is a highly reliable and novel method for discovering new and more effective compounds to control mosquitoes. PMID:21808576

  7. Correlated Electron Phenomena in 2D Materials

    NASA Astrophysics Data System (ADS)

    Lambert, Joseph G.

    In this thesis, I present experimental results on coherent electron phenomena in layered two-dimensional materials: single layer graphene and van der Waals coupled 2D TiSe2. Graphene is a two-dimensional single-atom thick sheet of carbon atoms first derived from bulk graphite by the mechanical exfoliation technique in 2004. Low-energy charge carriers in graphene behave like massless Dirac fermions, and their density can be easily tuned between electron-rich and hole-rich quasiparticles with electrostatic gating techniques. The sharp interfaces between regions of different carrier densities form barriers with selective transmission, making them behave as partially reflecting mirrors. When two of these interfaces are set at a separation distance within the phase coherence length of the carriers, they form an electronic version of a Fabry-Perot cavity. I present measurements and analysis of multiple Fabry-Perot modes in graphene with parallel electrodes spaced a few hundred nanometers apart. Transition metal dichalcogenide (TMD) TiSe2 is part of the family of materials that coined the term "materials beyond graphene". It contains van der Waals coupled trilayer stacks of Se-Ti-Se. Many TMD materials exhibit a host of interesting correlated electronic phases. In particular, TiSe2 exhibits chiral charge density waves (CDW) below TCDW ˜ 200 K. Upon doping with copper, the CDW state gets suppressed with Cu concentration, and CuxTiSe2 becomes superconducting with critical temperature of T c = 4.15 K. There is still much debate over the mechanisms governing the coexistence of the two correlated electronic phases---CDW and superconductivity. I will present some of the first conductance spectroscopy measurements of proximity coupled superconductor-CDW systems. Measurements reveal a proximity-induced critical current at the Nb-TiSe2 interfaces, suggesting pair correlations in the pure TiSe2. The results indicate that superconducting order is present concurrently with CDW in

  8. Development and use of touch-screen audio computer-assisted self-interviewing in a study of American Indians.

    PubMed

    Edwards, Sandra L; Slattery, Martha L; Murtaugh, Maureen A; Edwards, Roger L; Bryner, James; Pearson, Mindy; Rogers, Amy; Edwards, Alison M; Tom-Orme, Lillian

    2007-06-01

    This article describes the development and usability of an audio computer-assisted self-interviewing (ACASI) questionnaire created to collect dietary, physical activity, medical history, and other lifestyle data in a population of American Indians. Study participants were part of a cohort of American Indians living in the southwestern United States. Data were collected between March 2004 and July 2005. Information for evaluating questionnaire usability and acceptability was collected from three different sources: baseline study data, auxiliary background data, and a short questionnaire administered to a subset of study participants. For the subset of participants, 39.6% reported not having used a computer in the past year. The ACASI questionnaires were well accepted: 96.0% of the subset of participants reported finding them enjoyable to use, 97.2% reported that they were easy to use, and 82.6% preferred them for future questionnaires. A lower educational level and infrequent computer use in the past year were predictors of having usability trouble. These results indicate that the ACASI questionnaire is both an acceptable and a preferable mode of data collection in this population. PMID:17379618

  9. Cellophane as a half-wave plate and its use for converting a laptop computer screen into a three-dimensional display

    NASA Astrophysics Data System (ADS)

    Iizuka, Keigo

    2003-08-01

    It was experimentally verified that an ordinary 25-μm-thick cellophane sheet possesses the properties of a broadband half-wave plate. Moreover, cellophane displayed superior performance for rotating the direction of polarization of white light compared with a commercially available half-wave plate specified for a single wavelength. The retardance of the cellophane was measured to be 170°. The availability of an extra-large, low-cost half-wave plate opens up applications for large-size displays. As an example of its application, the ordinary screen of a laptop personal computer was converted into a three-dimensional display by the cellophane half-wave plate. It may be added that the price per square centimeter of the cellophane half-wave plate is about 1/3500 that of a commercially available half-wave plate.
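
    The polarization rotation that makes this conversion possible follows from standard Jones calculus; the short sketch below (a background illustration, not the paper's analysis) shows how close the measured 170° retardance comes to the ideal 180° half-wave behaviour for light at 45° to the fast axis.

    ```python
    import numpy as np

    def waveplate(delta, theta):
        """Jones matrix of a retarder with retardance delta and fast axis at angle theta."""
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])
        return R @ np.diag([1.0, np.exp(1j * delta)]) @ R.T

    E_in = np.array([1.0, 0.0])                       # horizontal linear polarization
    for delta_deg in (180.0, 170.0):
        E_out = waveplate(np.deg2rad(delta_deg), np.deg2rad(45.0)) @ E_in
        frac_vertical = abs(E_out[1]) ** 2 / np.sum(np.abs(E_out) ** 2)
        print(f"retardance {delta_deg:5.1f} deg -> fraction in vertical polarization: {frac_vertical:.3f}")
    # An ideal half-wave plate gives 1.000; the 170 deg cellophane gives about 0.992.
    ```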

  10. Experimental study of heavy-ion computed tomography using a scintillation screen and an electron-multiplying charged coupled device camera for human head imaging

    NASA Astrophysics Data System (ADS)

    Muraishi, Hiroshi; Hara, Hidetake; Abe, Shinji; Yokose, Mamoru; Watanabe, Takara; Takeda, Tohoru; Koba, Yusuke; Fukuda, Shigekazu

    2016-03-01

    We have developed a heavy-ion computed tomography (IonCT) system using a scintillation screen and an electron-multiplying charge-coupled device (EMCCD) camera that can measure a large object such as a human head. In this study, the objective was to investigate the possibility of applying this system to heavy-ion treatment planning from the point of view of the spatial resolution of the reconstructed image. Experiments were carried out on a rotation phantom using 12C ions accelerated up to 430 MeV/u by the Heavy-Ion Medical Accelerator in Chiba (HIMAC) at the National Institute of Radiological Sciences (NIRS). We demonstrated that a reconstructed image of an object with a water equivalent thickness (WET) of approximately 18 cm was successfully achieved with a spatial resolution of 1 mm, which would make this IonCT system worth applying to heavy-ion treatment planning for head and neck cancers.

  11. FRANC2D: A two-dimensional crack propagation simulator. Version 2.7: User's guide

    NASA Technical Reports Server (NTRS)

    Wawrzynek, Paul; Ingraffea, Anthony

    1994-01-01

    FRANC 2D (FRacture ANalysis Code, 2 Dimensions) is a menu-driven, interactive finite element computer code that performs fracture mechanics analyses of 2-D structures. The code has an automatic mesh generator for triangular and quadrilateral elements. FRANC2D calculates the stress intensity factor using linear elastic fracture mechanics and evaluates crack extension using several methods that may be selected by the user. The code features a mesh refinement and adaptive mesh generation capability that is automatically developed according to the predicted crack extension direction and length. The code also has unique features that permit the analysis of layered structures with load transfer through simulated mechanical fasteners or bonded joints. The code was written for UNIX workstations with X-windows graphics and may be executed on the following computers: DEC DecStation 3000 and 5000 series, IBM RS/6000 series, Hewlett-Packard 9000/700 series, SUN Sparc stations, and most Silicon Graphics models.

  12. Computer-delivered screening and brief intervention (e-SBI) for postpartum drug use: A randomized trial

    PubMed Central

    Ondersma, Steven J.; Svikis, Dace; Thacker, Leroy; Beatty, Jessica R.; Lockhart, Nancy

    2013-01-01

    Electronic screening and brief intervention (e-SBI) approaches for substance use have shown early promise. This trial was designed to replicate previous findings from a single 20-minute e-SBI for drug use among postpartum women. A total of 143 postpartum, primarily low-income African-American women meeting criteria for drug use, were randomly assigned to either a tailored e-SBI or a time-matched control condition. Blinded follow-up evaluation 3- and 6-months following childbirth revealed strong effects for confirmed illicit drug use abstinence at the 3-month observation (OR = 3.3, p = .01), as did hair analysis at 6 months (OR = 4.8, p = .018). Additional primary outcomes suggested small to moderate effect sizes in favor of the e-SBI, but did not reach significance. This result replicates previous findings but fails to show durable effects. Assessment reactivity, e-SBI design, and possible extension of e-SBI via tailored messaging all merit careful consideration. PMID:24051077

  13. CYP2D7 Sequence Variation Interferes with TaqMan CYP2D6*15 and *35 Genotyping

    PubMed Central

    Riffel, Amanda K.; Dehghani, Mehdi; Hartshorne, Toinette; Floyd, Kristen C.; Leeder, J. Steven; Rosenblatt, Kevin P.; Gaedigk, Andrea

    2016-01-01

    TaqMan™ genotyping assays are widely used to genotype CYP2D6, which encodes a major drug metabolizing enzyme. Assay design for CYP2D6 can be challenging owing to the presence of two pseudogenes, CYP2D7 and CYP2D8, structural and copy number variation and numerous single nucleotide polymorphisms (SNPs) some of which reflect the wild-type sequence of the CYP2D7 pseudogene. The aim of this study was to identify the mechanism causing false-positive CYP2D6*15 calls and remediate those by redesigning and validating alternative TaqMan genotype assays. Among 13,866 DNA samples genotyped by the CompanionDx® lab on the OpenArray platform, 70 samples were identified as heterozygotes for 137Tins, the key SNP of CYP2D6*15. However, only 15 samples were confirmed when tested with the Luminex xTAG CYP2D6 Kit and sequencing of CYP2D6-specific long range (XL)-PCR products. Genotype and gene resequencing of CYP2D6 and CYP2D7-specific XL-PCR products revealed a CC>GT dinucleotide SNP in exon 1 of CYP2D7 that reverts the sequence to CYP2D6 and allows a TaqMan assay PCR primer to bind. Because CYP2D7 also carries a Tins, a false-positive mutation signal is generated. This CYP2D7 SNP was also responsible for generating false-positive signals for rs769258 (CYP2D6*35) which is also located in exon 1. Although alternative CYP2D6*15 and *35 assays resolved the issue, we discovered a novel CYP2D6*15 subvariant in one sample that carries additional SNPs preventing detection with the alternate assay. The frequency of CYP2D6*15 was 0.1% in this ethnically diverse U.S. population sample. In addition, we also discovered linkage between the CYP2D7 CC>GT dinucleotide SNP and the 77G>A (rs28371696) SNP of CYP2D6*43. The frequency of this tentatively functional allele was 0.2%. Taken together, these findings emphasize that regardless of how careful genotyping assays are designed and evaluated before being commercially marketed, rare or unknown SNPs underneath primer and/or probe regions can impact

  14. CYP2D7 Sequence Variation Interferes with TaqMan CYP2D6 (*) 15 and (*) 35 Genotyping.

    PubMed

    Riffel, Amanda K; Dehghani, Mehdi; Hartshorne, Toinette; Floyd, Kristen C; Leeder, J Steven; Rosenblatt, Kevin P; Gaedigk, Andrea

    2015-01-01

    TaqMan™ genotyping assays are widely used to genotype CYP2D6, which encodes a major drug metabolizing enzyme. Assay design for CYP2D6 can be challenging owing to the presence of two pseudogenes, CYP2D7 and CYP2D8, structural and copy number variation and numerous single nucleotide polymorphisms (SNPs) some of which reflect the wild-type sequence of the CYP2D7 pseudogene. The aim of this study was to identify the mechanism causing false-positive CYP2D6 (*) 15 calls and remediate those by redesigning and validating alternative TaqMan genotype assays. Among 13,866 DNA samples genotyped by the CompanionDx® lab on the OpenArray platform, 70 samples were identified as heterozygotes for 137Tins, the key SNP of CYP2D6 (*) 15. However, only 15 samples were confirmed when tested with the Luminex xTAG CYP2D6 Kit and sequencing of CYP2D6-specific long range (XL)-PCR products. Genotype and gene resequencing of CYP2D6 and CYP2D7-specific XL-PCR products revealed a CC>GT dinucleotide SNP in exon 1 of CYP2D7 that reverts the sequence to CYP2D6 and allows a TaqMan assay PCR primer to bind. Because CYP2D7 also carries a Tins, a false-positive mutation signal is generated. This CYP2D7 SNP was also responsible for generating false-positive signals for rs769258 (CYP2D6 (*) 35) which is also located in exon 1. Although alternative CYP2D6 (*) 15 and (*) 35 assays resolved the issue, we discovered a novel CYP2D6 (*) 15 subvariant in one sample that carries additional SNPs preventing detection with the alternate assay. The frequency of CYP2D6 (*) 15 was 0.1% in this ethnically diverse U.S. population sample. In addition, we also discovered linkage between the CYP2D7 CC>GT dinucleotide SNP and the 77G>A (rs28371696) SNP of CYP2D6 (*) 43. The frequency of this tentatively functional allele was 0.2%. Taken together, these findings emphasize that regardless of how careful genotyping assays are designed and evaluated before being commercially marketed, rare or unknown SNPs underneath primer

  15. Lung Cancer Screening Update.

    PubMed

    Ruchalski, Kathleen L; Brown, Kathleen

    2016-07-01

    Since the release of the US Preventive Services Task Force and Centers for Medicare and Medicaid Services recommendations for lung cancer screening, low-dose chest computed tomography screening has moved from the research arena to clinical practice. Lung cancer screening programs must reach beyond image acquisition and interpretation and engage in a multidisciplinary effort of clinical shared decision-making, standardization of imaging and nodule management, smoking cessation, and patient follow-up. Standardization of radiologic reports and nodule management will systematize patient care, provide quality assurance, further reduce harm, and contain health care costs. Although the National Lung Screening Trial results and eligibility criteria of a heavy smoking history are the foundation for the standard guidelines for low-dose chest computed tomography screening in the United States, currently only 27% of patients diagnosed with lung cancer would meet US lung cancer screening recommendations. Current and future efforts must be directed to better delineate those patients who would most benefit from screening and to ensure that the benefits of screening reach all socioeconomic strata and racial and ethnic minorities. Further optimization of lung cancer screening program design and patient eligibility will assure that lung cancer screening benefits will outweigh the potential risks to our patients. PMID:27306387

  16. Retrospective analysis of 2D patient-specific IMRT verifications

    SciTech Connect

    Childress, Nathan L.; White, R. Allen; Bloch, Charles; Salehpour, Mohammad; Dong, Lei; Rosen, Isaac I.

    2005-04-01

    We performed 858 two-dimensional (2D) patient-specific intensity modulated radiotherapy verifications over a period of 18 months. Multifield, composite treatment plans were measured in phantom using calibrated Kodak EDR2 film and compared with the calculated dose extracted from two treatment planning systems. This research summarizes our findings using the normalized agreement test (NAT) index and the percent of pixels failing the gamma index as metrics to represent the agreement between measured and computed dose distributions. An in-house dose comparison software package was used to register and compare all verifications. We found it was important to use an automatic positioning algorithm to achieve maximum registration accuracy, and that our automatic algorithm agreed well with anticipated results from known phantom geometries. We also measured absolute dose for each case using an ion chamber. Because the computed distributions agreed with ion chamber measurements better than the EDR2 film doses, we normalized EDR2 data to the computed distributions. The distributions of both the NAT indices and the percentage of pixels failing the gamma index were found to be exponential distributions. We continue to use both the NAT index and percent of pixels failing gamma with 5%/3 mm criteria to evaluate future verifications, as these two metrics were found to be complementary. Our data showed that using 2%/2 mm or 3%/3 mm criteria produces results similar to those using 5%/3 mm criteria. Normalized comparisons that have a NAT index greater than 45 and/or more than 20% of the pixels failing gamma for 5%/3 mm criteria represent outliers from our clinical data set and require further analysis. Because our QA verification results were exponentially distributed, rather than a tight grouping of similar results, we continue to perform patient-specific QA in order to identify and correct outliers in our verifications. The data from this work could be useful as a reference for
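
    For readers unfamiliar with the gamma metric used above, the sketch below is a brute-force textbook implementation of the 2D gamma index (here with a 5%/3 mm criterion) on synthetic dose planes; it is illustrative only and is not the in-house comparison software described in the abstract.

    ```python
    import numpy as np

    def gamma_index(dose_ref, dose_eval, pixel_mm, dD=0.05, dta_mm=3.0):
        """gamma(r) = min over r' of sqrt(|r - r'|^2/dta^2 + (D_eval(r') - D_ref(r))^2/dD^2)."""
        ny, nx = dose_ref.shape
        y, x = np.mgrid[0:ny, 0:nx]
        dD_abs = dD * dose_ref.max()
        gamma = np.empty((ny, nx))
        for j in range(ny):
            for i in range(nx):
                dist2 = ((x - i) ** 2 + (y - j) ** 2) * pixel_mm ** 2
                diff2 = (dose_eval - dose_ref[j, i]) ** 2
                gamma[j, i] = np.sqrt(np.min(dist2 / dta_mm ** 2 + diff2 / dD_abs ** 2))
        return gamma

    ref = np.fromfunction(lambda j, i: np.exp(-((i - 32) ** 2 + (j - 32) ** 2) / 200.0), (64, 64))
    ev = np.roll(ref, 1, axis=1) * 1.02               # shifted and slightly rescaled plane
    g = gamma_index(ref, ev, pixel_mm=1.0)
    print("percent of pixels failing gamma (> 1):", 100.0 * np.mean(g > 1.0))
    ```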

  17. Mesophases in nearly 2D room-temperature ionic liquids.

    PubMed

    Manini, N; Cesaratto, M; Del Pópolo, M G; Ballone, P

    2009-11-26

    Computer simulations of (i) a [C(12)mim][Tf(2)N] film of nanometric thickness squeezed at kbar pressure by a piecewise parabolic confining potential reveal a mesoscopic in-plane density and composition modulation reminiscent of mesophases seen in 3D samples of the same room-temperature ionic liquid (RTIL). Near 2D confinement, enforced by a high normal load, as well as relatively long aliphatic chains are strictly required for the mesophase formation, as confirmed by computations for two related systems made of (ii) the same [C(12)mim][Tf(2)N] adsorbed at a neutral solid surface and (iii) a shorter-chain RTIL ([C(4)mim][Tf(2)N]) trapped in the potential well of part i. No in-plane modulation is seen for ii and iii. In case ii, the optimal arrangement of charge and neutral tails is achieved by layering parallel to the surface, while, in case iii, weaker dispersion and packing interactions are unable to bring aliphatic tails together into mesoscopic islands, against overwhelming entropy and Coulomb forces. The onset of in-plane mesophases could greatly affect the properties of long-chain RTILs used as lubricants. PMID:19886615

  18. 2D Quantum Mechanical Study of Nanoscale MOSFETs

    NASA Technical Reports Server (NTRS)

    Svizhenko, Alexei; Anantram, M. P.; Govindan, T. R.; Biegel, B.; Kwak, Dochan (Technical Monitor)

    2000-01-01

    With the onset of quantum confinement in the inversion layer in nanoscale MOSFETs, behavior of the resonant level inevitably determines all device characteristics. While most classical device simulators take quantization into account in some simplified manner, the important details of electrostatics are missing. Our work addresses this shortcoming and provides: (a) a framework to quantitatively explore device physics issues such as the source-drain and gate leakage currents, DIBL, and threshold voltage shift due to quantization, and (b) a means of benchmarking quantum corrections to semiclassical models (such as density-gradient and quantum-corrected MEDICI). We have developed physical approximations and computer code capable of realistically simulating 2-D nanoscale transistors, using the non-equilibrium Green's function (NEGF) method. This is the most accurate full quantum model yet applied to 2-D device simulation. Open boundary conditions and oxide tunneling are treated on an equal footing. Electrons in the ellipsoids of the conduction band are treated within the anisotropic effective mass approximation. We present the results of our simulations of MIT 25, 50 and 90 nm "well-tempered" MOSFETs and compare them to those of classical and quantum corrected models. The important feature of the quantum model is a smaller slope of the Id-Vg curve and consequently a higher threshold voltage. Surprisingly, the self-consistent potential profile shows a lower injection barrier in the channel in the quantum case. These results are qualitatively consistent with 1D Schroedinger-Poisson calculations. The effect of gate length on gate-oxide leakage and subthreshold current has been studied. The shorter gate length device has an order of magnitude smaller current at zero gate bias than the longer gate length device without a significant trade-off in on-current. This should be a device design consideration.
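
    The transport quantity at the heart of the NEGF method can be illustrated with a much smaller toy than the 2-D simulator described above. The sketch below (an assumption-laden 1-D tight-binding chain, not the authors' anisotropic effective-mass code) builds the retarded Green's function of a short device between two semi-infinite leads and evaluates the Landauer transmission.

    ```python
    import numpy as np

    t = 1.0                                       # hopping energy of chain and leads
    N = 8                                         # device sites
    H = -t * (np.eye(N, k=1) + np.eye(N, k=-1))   # perfect-chain device Hamiltonian

    def lead_self_energy(E):
        # Retarded surface Green's function of a semi-infinite 1D chain (valid inside
        # the band |E| < 2t), coupled to the device edge site with hopping t.
        g_surface = (E - 1j * np.sqrt(4 * t**2 - E**2 + 0j)) / (2 * t**2)
        return t**2 * g_surface

    for E in (-1.5, 0.0, 1.5):
        sigL = np.zeros((N, N), complex); sigL[0, 0] = lead_self_energy(E)
        sigR = np.zeros((N, N), complex); sigR[-1, -1] = lead_self_energy(E)
        G = np.linalg.inv(E * np.eye(N) - H - sigL - sigR)       # retarded Green's function
        GamL = 1j * (sigL - sigL.conj().T)
        GamR = 1j * (sigR - sigR.conj().T)
        T = np.trace(GamL @ G @ GamR @ G.conj().T).real          # Landauer transmission
        print(f"E = {E:+.2f}  T = {T:.3f}")   # ~1 inside the band for a clean chain
    ```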

  19. 2D Quantum Transport Modeling in Nanoscale MOSFETs

    NASA Technical Reports Server (NTRS)

    Svizhenko, Alexei; Anantram, M. P.; Govindan, T. R.; Biegel, Bryan

    2001-01-01

    With the onset of quantum confinement in the inversion layer in nanoscale MOSFETs, behavior of the resonant level inevitably determines all device characteristics. While most classical device simulators take quantization into account in some simplified manner, the important details of electrostatics are missing. Our work addresses this shortcoming and provides: (a) a framework to quantitatively explore device physics issues such as the source-drain and gate leakage currents, DIBL, and threshold voltage shift due to quantization, and (b) a means of benchmarking quantum corrections to semiclassical models (such as density-gradient and quantum-corrected MEDICI). We have developed physical approximations and computer code capable of realistically simulating 2-D nanoscale transistors, using the non-equilibrium Green's function (NEGF) method. This is the most accurate full quantum model yet applied to 2-D device simulation. Open boundary conditions, oxide tunneling and phase-breaking scattering are treated on an equal footing. Electrons in the ellipsoids of the conduction band are treated within the anisotropic effective mass approximation. Quantum simulations are focused on MIT 25, 50 and 90 nm "well-tempered" MOSFETs and compared to classical and quantum corrected models. The important feature of the quantum model is a smaller slope of the Id-Vg curve and consequently a higher threshold voltage. These results are quantitatively consistent with 1D Schroedinger-Poisson calculations. The effect of gate length on gate-oxide leakage and sub-threshold current has been studied. The shorter gate length device has an order of magnitude smaller current at zero gate bias than the longer gate length device without a significant trade-off in on-current. This should be a device design consideration.

  20. Computer simulated screening of dentin bonding primer monomers through analysis of their chemical functions and their spatial 3D alignment.

    PubMed

    Vaidyanathan, J; Vaidyanathan, T K; Ravichandran, S

    2009-02-01

    Binding interactions between dentin bonding primer monomers and dentinal collagen were studied by an analysis of their chemical functions and their spatial 3D alignment. A trial set of 12 monomers used as primers in dentin adhesives was characterized to assess them for binding to a complementary target. The HipHop utility in the Catalyst software from Accelrys was used for the study. Ten hypotheses were generated by HipHop procedures involving (a) conformational generation using a poling technique to promote conformational variation, (b) extraction of functions to remodel ligands as function-based structures, and (c) identification of common patterns of functional alignment displayed by low energy conformations. The hypotheses, designated as pharmacophores, were also scored and ranked. Analysis of pharmacophore models through mapping of ligands revealed important differences between ligands. Top-ranked poses from direct docking simulations using a type 1 collagen target were mapped in a rigid manner to the highest ranked pharmacophore model. The visual match observed in mapping and the associated fit values suggest a strong correspondence between direct and indirect docking simulations. The results elegantly demonstrate that an indirect approach used to identify pharmacophore models from adhesive ligands without a target may be a simple and viable approach to assess their intermolecular interactions with an intended target. Inexpensive indirect/direct virtual screening of hydrophilic monomer candidates may be a practical way to assess their initial promise for dentin primer use well before additional experimental evaluation of their priming/bonding efficacy. This is also of value in the search/design of new compounds for priming dentin. PMID:18546179

  1. Parallel algorithms for 2-D cylindrical transport equations of Eigenvalue problem

    SciTech Connect

    Wei, J.; Yang, S.

    2013-07-01

    In this paper, for the eigenvalue problem of the neutron transport equations in 2-D cylindrical geometry on unstructured grids, a discrete scheme based on the Sn discrete-ordinates method and discontinuous finite elements is built, and parallel computation of the scheme is realized on MPI systems. Numerical experiments indicate that the designed parallel algorithm can reach nearly perfect speedup and has good practicality and scalability. (authors)
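
    The abstract gives no implementation detail, but the generic MPI pattern behind such a parallel eigenvalue solve is a local operator application per rank plus global reductions. The sketch below (a hypothetical mpi4py power iteration on a block-diagonal stand-in operator, emphatically not the authors' Sn transport code) shows that pattern.

    ```python
    import numpy as np
    from mpi4py import MPI

    # Run with e.g.:  mpiexec -n 4 python power_iteration.py
    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    n_local = 50                                    # unknowns owned by this rank
    rng = np.random.default_rng(rank)
    A_local = np.diag(1.0 + rng.random(n_local))    # local block of a block-diagonal operator

    x = np.ones(n_local)
    k = 1.0
    for _ in range(50):                             # power iteration for the dominant eigenvalue
        y = A_local @ x
        y_norm2 = comm.allreduce(float(y @ y), op=MPI.SUM)
        x_norm2 = comm.allreduce(float(x @ x), op=MPI.SUM)
        k = np.sqrt(y_norm2) / np.sqrt(x_norm2)     # Rayleigh-quotient-style estimate
        x = y / np.sqrt(y_norm2)                    # globally normalized iterate

    if rank == 0:
        print("estimated dominant eigenvalue:", k)
    ```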

  2. Evaluation of 2D shallow-water model for spillway flow with a complex geometry

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Although the two-dimensional (2D) shallow water model is formulated on the basis of several assumptions, such as a hydrostatic pressure distribution and negligible vertical velocity, as a simple alternative to the complex 3D model it has been used to compute water flows in which these assumptions may be ...
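
    For reference, the conservative form of the 2D shallow-water (Saint-Venant) equations that such a model solves is sketched below; this is the standard textbook statement, not text from the truncated abstract. Here h is the water depth, (u, v) the depth-averaged velocities, g gravity, z_b the bed elevation and S_fx, S_fy friction terms.

    ```latex
    \begin{align}
      \frac{\partial h}{\partial t} + \frac{\partial (hu)}{\partial x}
        + \frac{\partial (hv)}{\partial y} &= 0,\\
      \frac{\partial (hu)}{\partial t}
        + \frac{\partial}{\partial x}\Bigl(hu^{2} + \tfrac{1}{2}gh^{2}\Bigr)
        + \frac{\partial (huv)}{\partial y}
        &= -gh\,\frac{\partial z_b}{\partial x} - S_{fx},\\
      \frac{\partial (hv)}{\partial t}
        + \frac{\partial (huv)}{\partial x}
        + \frac{\partial}{\partial y}\Bigl(hv^{2} + \tfrac{1}{2}gh^{2}\Bigr)
        &= -gh\,\frac{\partial z_b}{\partial y} - S_{fy}.
    \end{align}
    ```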

  3. Automatic intensity-based 3D-to-2D registration of CT volume and dual-energy digital radiography for the detection of cardiac calcification

    NASA Astrophysics Data System (ADS)

    Chen, Xiang; Gilkeson, Robert; Fei, Baowei

    2007-03-01

    We are investigating three-dimensional (3D) to two-dimensional (2D) registration methods for computed tomography (CT) and dual-energy digital radiography (DR) for the detection of coronary artery calcification. CT is an established tool for the diagnosis of coronary artery diseases (CADs). Dual-energy digital radiography could be a cost-effective alternative for screening coronary artery calcification. In order to utilize CT as the "gold standard" to evaluate the ability of DR images for the detection and localization of calcium, we developed an automatic intensity-based 3D-to-2D registration method for 3D CT volumes and 2D DR images. To generate digital rendering radiographs (DRR) from the CT volumes, we developed three projection methods, i.e. Gaussian-weighted projection, threshold-based projection, and average-based projection. We tested normalized cross correlation (NCC) and normalized mutual information (NMI) as similarity measurements. We used the Downhill Simplex method as the search strategy. Simulated projection images from CT were fused with the corresponding DR images to evaluate the localization of cardiac calcification. The registration method was evaluated by digital phantoms, physical phantoms, and clinical data sets. The results from the digital phantoms show that the success rate is 100% with mean errors of less than 0.8 mm and 0.2 degree for both NCC and NMI. The registration accuracy of the physical phantoms is 0.34 +/- 0.27 mm. Color overlay and 3D visualization of the clinical data show that the two images are registered well. This is consistent with the improvement of the NMI values from 0.20 +/- 0.03 to 0.25 +/- 0.03 after registration. The automatic 3D-to-2D registration method is accurate and robust and may provide a useful tool to evaluate the dual-energy DR images for the detection of coronary artery calcification.
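
    The normalized mutual information similarity used above is simple to compute from a joint intensity histogram. The sketch below is an illustration only (the paper's registration uses DRR generation and a Downhill Simplex search over full rigid transforms); here a brute-force search over integer translations of a synthetic image pair stands in for that optimization.

    ```python
    import numpy as np

    def nmi(a, b, bins=32):
        """Normalized mutual information (H(A) + H(B)) / H(A,B) from a joint histogram."""
        hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
        pab = hist / hist.sum()
        pa, pb = pab.sum(axis=1), pab.sum(axis=0)
        ha = -np.sum(pa[pa > 0] * np.log(pa[pa > 0]))
        hb = -np.sum(pb[pb > 0] * np.log(pb[pb > 0]))
        hab = -np.sum(pab[pab > 0] * np.log(pab[pab > 0]))
        return (ha + hb) / hab

    rng = np.random.default_rng(0)
    fixed = rng.random((64, 64))
    moving = np.roll(fixed, (3, -2), axis=(0, 1))      # known misalignment to recover

    shifts = [(dy, dx) for dy in range(-5, 6) for dx in range(-5, 6)]
    best = max(shifts, key=lambda s: nmi(fixed, np.roll(moving, s, axis=(0, 1))))
    print("best shift:", best, " NMI:", nmi(fixed, np.roll(moving, best, axis=(0, 1))))
    ```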

  4. Automatic Intensity-based 3D-to-2D Registration of CT Volume and Dual-energy Digital Radiography for the Detection of Cardiac Calcification

    PubMed Central

    Chen, Xiang; Gilkeson, Robert; Fei, Baowei

    2013-01-01

    We are investigating three-dimensional (3D) to two-dimensional (2D) registration methods for computed tomography (CT) and dual-energy digital radiography (DR) for the detection of coronary artery calcification. CT is an established tool for the diagnosis of coronary artery diseases (CADs). Dual-energy digital radiography could be a cost-effective alternative for screening coronary artery calcification. In order to utilize CT as the “gold standard” to evaluate the ability of DR images for the detection and localization of calcium, we developed an automatic intensity-based 3D-to-2D registration method for 3D CT volumes and 2D DR images. To generate digital rendering radiographs (DRR) from the CT volumes, we developed three projection methods, i.e. Gaussian-weighted projection, threshold-based projection, and average-based projection. We tested normalized cross correlation (NCC) and normalized mutual information (NMI) as similarity measurements. We used the Downhill Simplex method as the search strategy. Simulated projection images from CT were fused with the corresponding DR images to evaluate the localization of cardiac calcification. The registration method was evaluated by digital phantoms, physical phantoms, and clinical data sets. The results from the digital phantoms show that the success rate is 100% with mean errors of less than 0.8 mm and 0.2 degree for both NCC and NMI. The registration accuracy of the physical phantoms is 0.34 ± 0.27 mm. Color overlay and 3D visualization of the clinical data show that the two images are registered well. This is consistent with the improvement of the NMI values from 0.20 ± 0.03 to 0.25 ± 0.03 after registration. The automatic 3D-to-2D registration method is accurate and robust and may provide a useful tool to evaluate the dual-energy DR images for the detection of coronary artery calcification. PMID:24386527

  5. Nano-scale electronic and optoelectronic devices based on 2D crystals

    NASA Astrophysics Data System (ADS)

    Zhu, Wenjuan

    In the last few years, research interest in two-dimensional (2D) crystals and their applications has grown rapidly. The properties of these 2D crystals are diverse, ranging from semi-metals such as graphene and semiconductors such as MoS2 to insulators such as boron nitride. These 2D crystals have many unique properties compared to their bulk counterparts due to their reduced dimensionality and symmetry. A key difference lies in their band structures, which lead to distinct electronic and photonic properties. The 2D nature of these materials also plays an important role in defining their exceptional mechanical strength, surface sensitivity, thermal conductivity, tunable band gap, and interaction with light. These unique properties of 2D crystals open up a broad territory of applications in computing, communication, energy, and medicine. In this talk, I will present our work on understanding the electrical properties of graphene and MoS2, in particular current transport and band-gap engineering in graphene, the interface between gate dielectrics and graphene, and gap states in MoS2. I will also present our work on nano-scale electronic devices (RF and logic devices) and photonic devices (plasmonic devices and photo-detectors) based on these 2D crystals.

  6. Programmable variable stiffness 2D surface design

    NASA Astrophysics Data System (ADS)

    Trabia, Sarah; Hwang, Taeseon; Yim, Woosoon

    2014-03-01

    Variable stiffness features can contribute to many engineering applications, ranging from robotic joints to shock and vibration mitigation. In addition, variable stiffness can be used in tactile feedback to provide a sense of touch to the user. A key component of the proposed device is the Biased Magnetorheological Elastomer (B-MRE), in which iron particles within the elastomer compound develop a dipole interaction energy. A novel feature of this device is that a permanent magnet introduces a field-induced shear modulus bias, providing an offset about which a current input to the electromagnetic control coil changes the compliance or modulus of the base elastomer in both directions (softer or harder). The B-MRE units can lead to the design of a variable stiffness surface. In this preliminary work, both computational and experimental results for the B-MRE are presented along with a preliminary design of the programmable variable stiffness surface.
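
    As a rough illustration of the bias-plus-coil idea, the toy model below treats the effective shear modulus as a zero-field baseline plus a permanent-magnet offset, shifted up or down by the coil current. All constants and the linear dependence are assumptions made purely for illustration and are not taken from the paper.

    def effective_shear_modulus(coil_current_A,
                                g_base_kPa=50.0,            # zero-field elastomer modulus (assumed)
                                g_bias_kPa=20.0,            # offset from the permanent magnet (assumed)
                                sensitivity_kPa_per_A=10.0  # field-induced change per ampere (assumed)
                                ):
        """Illustrative effective modulus: positive coil current stiffens, negative softens."""
        return g_base_kPa + g_bias_kPa + sensitivity_kPa_per_A * coil_current_A

    # Sweeping the coil current from -2 A to +2 A moves the modulus from 50 kPa
    # (softer than the biased state) to 90 kPa (harder).
    for current in (-2.0, 0.0, 2.0):
        print(current, effective_shear_modulus(current))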

  7. A real-time multi-scale 2D Gaussian filter based on FPGA

    NASA Astrophysics Data System (ADS)

    Luo, Haibo; Gai, Xingqin; Chang, Zheng; Hui, Bin

    2014-11-01

    The multi-scale 2-D Gaussian filter has been widely used in feature extraction (e.g. SIFT, edge detection), image segmentation, image enhancement, image noise removal, multi-scale shape description, etc. However, its computational complexity remains an issue for real-time image processing systems. To address this problem, we propose an FPGA-based framework for multi-scale 2-D Gaussian filtering in this paper. First, a full-hardware architecture based on a parallel pipeline was designed to achieve a high throughput rate. Second, in order to save multipliers, the 2-D convolution is separated into two 1-D convolutions. Third, a dedicated first-in, first-out memory named the CAFIFO (Column Addressing FIFO) was designed to avoid the error propagation induced by sparks on the clock. Finally, a shared memory framework was designed to reduce memory costs. As a demonstration, we realized a three-scale 2-D Gaussian filter on a single ALTERA Cyclone III FPGA chip. Experimental results show that the proposed framework can compute multi-scale 2-D Gaussian filtering within one pixel clock period, making it suitable for real-time image processing. Moreover, the main principle can be extended to other convolution-based operators, such as the Gabor filter, the Sobel operator, and so on.
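
    The sketch below illustrates, in software, the separability trick noted above: a 2-D Gaussian convolution is replaced by a row-wise 1-D convolution followed by a column-wise one. The paper's design is a hardware pipeline on an FPGA; this NumPy version only mirrors the arithmetic, and the kernel radius and sigma values are assumptions.

    import numpy as np
    from scipy import ndimage

    def gaussian_kernel_1d(sigma, radius):
        """Normalized 1-D Gaussian kernel of length 2*radius + 1."""
        x = np.arange(-radius, radius + 1, dtype=float)
        k = np.exp(-x**2 / (2.0 * sigma**2))
        return k / k.sum()

    def separable_gaussian(image, sigma, radius=None):
        """Apply a 2-D Gaussian as two 1-D convolutions (rows, then columns)."""
        if radius is None:
            radius = int(3 * sigma)
        k = gaussian_kernel_1d(sigma, radius)
        tmp = ndimage.convolve1d(image, k, axis=1, mode="nearest")  # along rows
        return ndimage.convolve1d(tmp, k, axis=0, mode="nearest")   # along columns

    def multi_scale(image, sigmas=(1.0, 2.0, 4.0)):
        """One filtered output per scale, e.g. the three scales of the demonstration."""
        return [separable_gaussian(image, s) for s in sigmas]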

  8. Differential CYP 2D6 Metabolism Alters Primaquine Pharmacokinetics

    PubMed Central

    Potter, Brittney M. J.; Xie, Lisa H.; Vuong, Chau; Zhang, Jing; Zhang, Ping; Duan, Dehui; Luong, Thu-Lan T.; Bandara Herath, H. M. T.; Dhammika Nanayakkara, N. P.; Tekwani, Babu L.; Walker, Larry A.; Nolan, Christina K.; Sciotti, Richard J.; Zottig, Victor E.; Smith, Philip L.; Paris, Robert M.; Read, Lisa T.; Li, Qigui; Pybus, Brandon S.; Sousa, Jason C.; Reichard, Gregory A.

    2015-01-01

    Primaquine (PQ) metabolism by the cytochrome P450 (CYP) 2D family of enzymes is required for antimalarial activity in both humans (2D6) and mice (2D). Human CYP 2D6 is highly polymorphic, and decreased CYP 2D6 enzyme activity has been linked to decreased PQ antimalarial activity. Despite the importance of CYP 2D metabolism in PQ efficacy, the exact role that these enzymes play in PQ metabolism and pharmacokinetics has not been extensively studied in vivo. In this study, a series of PQ pharmacokinetic experiments were conducted in mice with differential CYP 2D metabolism characteristics, including wild-type (WT), CYP 2D knockout (KO), and humanized CYP 2D6 (KO/knock-in [KO/KI]) mice. Plasma and liver pharmacokinetic profiles from a single PQ dose (20 mg/kg of body weight) differed significantly among the strains for PQ and carboxy-PQ. Additionally, due to the suspected role of phenolic metabolites in PQ efficacy, these were probed using reference standards. Levels of phenolic metabolites were highest in mice capable of metabolizing CYP 2D6 substrates (WT and KO/KI 2D6 mice). PQ phenolic metabolites were present in different quantities in the two strains, illustrating species-specific differences in PQ metabolism between the human and mouse enzymes. Taken together, the data further our understanding of PQ pharmacokinetics in the context of differential CYP 2D metabolism and have important implications for PQ administration in humans with different levels of CYP 2D6 enzyme activity. PMID:25645856

  9. PLAN2D - A PROGRAM FOR ELASTO-PLASTIC ANALYSIS OF PLANAR FRAMES

    NASA Technical Reports Server (NTRS)

    Lawrence, C.

    1994-01-01

    PLAN2D is a FORTRAN computer program for the plastic analysis of planar rigid frame structures. Given a structure and loading pattern as input, PLAN2D calculates the ultimate load that the structure can sustain before collapse. Element moments and plastic hinge rotations are calculated for the ultimate load. The locations of the hinges required for a collapse mechanism to form are also determined. The program proceeds through an iterative series of linear elastic analyses. After each iteration, the resulting elastic moments in each member are compared to the reserve plastic moment capacity of that member. The member or members whose moments are closest to their reserve capacity determine the minimum load factor and the site where the next hinge is to be inserted. Next, hinges are inserted and the structural stiffness matrix is reformulated. This cycle is repeated until the structure becomes unstable. At this point the ultimate collapse load is calculated by accumulating the minimum load factors from the previous iterations and multiplying their sum by the original input loads. PLAN2D is based on the program STAN, originally written by Dr. E.L. Wilson at U.C. Berkeley. PLAN2D has several limitations: 1) although PLAN2D will detect unloading of hinges, it does not have the capability to remove them; 2) PLAN2D does not allow the user to input different positive and negative moment capacities; and 3) PLAN2D does not consider the interaction between axial load and plastic moment capacity: axial yielding and buckling are ignored, as is the reduction in moment capacity due to axial load. PLAN2D is written in FORTRAN and is machine independent. It has been tested on an IBM PC and a DEC MicroVAX. The program was developed in 1988.
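
    The schematic sketch below follows the iterative procedure described in the abstract (repeated linear elastic analyses, minimum load factor, hinge insertion, accumulation of factors). It is not the original FORTRAN; the elastic solver and hinge-insertion routine are passed in as hypothetical callables.

    def collapse_load_factor(members, plastic_capacity, run_elastic_analysis, insert_hinge,
                             max_iters=100):
        """Accumulate minimum load factors until a collapse mechanism forms.

        run_elastic_analysis() -> (moments_per_member, stable_flag) for the current
        hinge configuration under the original input loads; insert_hinge(member)
        reformulates the structural stiffness matrix. Both are hypothetical stand-ins.
        """
        accumulated = {m: 0.0 for m in members}   # moments built up over the iterations
        total_factor = 0.0
        for _ in range(max_iters):
            moments, stable = run_elastic_analysis()
            if not stable:                        # stiffness matrix singular: collapse
                break
            # Load factor that brings some member to its remaining plastic capacity.
            factors = {m: (plastic_capacity[m] - abs(accumulated[m])) / abs(moments[m])
                       for m in members if abs(moments[m]) > 1e-9}
            critical, lam = min(factors.items(), key=lambda kv: kv[1])
            for m in members:                     # scale this increment and accumulate
                accumulated[m] += lam * moments[m]
            total_factor += lam
            insert_hinge(critical)
        return total_factor                       # ultimate load = factor * input loads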

  10. Computer-aided detection of colonic polyps with level set-based adaptive convolution in volumetric mucosa to advance CT colonography toward a screening modality

    PubMed Central

    Zhu, Hongbin; Duan, Chaijie; Pickhardt, Perry; Wang, Su; Liang, Zhengrong

    2009-01-01

    As a promising second reader for computed tomographic colonography (CTC) screening, the computer-aided detection (CAD) of colonic polyps has earned fast-growing research interest. In this paper, we present a CAD scheme to automatically detect colonic polyps in CTC images. First, a thick colon wall representation, i.e., a volumetric mucosa (VM) that is generally several voxels wide, was segmented from the CTC images by a partial-volume image segmentation algorithm. Based on the VM, we employed a level set-based adaptive convolution method to calculate the first- and second-order spatial derivatives more accurately as the starting point for the geometric analysis. Furthermore, to emphasize the correspondence among different layers in the VM, we introduced a middle-layer-enhanced integration along the image gradient direction inside the VM to improve the extraction of geometric information such as the principal curvatures. Initial polyp candidates (IPCs) were then determined by thresholding the geometric measurements. Several features were extracted for each IPC and fed into a support vector machine to reduce false positives (FPs). The final detections were displayed in a commercial system to provide second opinions for radiologists. The CAD scheme was applied to 26 patient CTC studies with 32 polyps confirmed by both optical and virtual colonoscopy. Compared to our previous work, all of the polyps were detected successfully with fewer FPs. At 100% per-polyp sensitivity, the new method yielded 3.5 FPs/dataset. PMID:20428331
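
    A minimal sketch of the two-stage idea described above follows: geometric measurements are thresholded to produce initial candidates, and a support vector machine trained on labeled candidates then rejects false positives. The feature layout, thresholds, and kernel settings are assumptions, not the paper's implementation.

    import numpy as np
    from sklearn.svm import SVC

    def initial_candidates(kappa1, kappa2, k1_thresh=0.1, k2_thresh=0.05):
        """Voxels whose principal curvatures exceed polyp-like thresholds become IPCs."""
        return np.argwhere((kappa1 > k1_thresh) & (kappa2 > k2_thresh))

    def reduce_false_positives(train_features, train_labels, candidate_features):
        """Train an SVM on labeled IPC feature vectors (1 = polyp, 0 = FP) and classify new IPCs."""
        clf = SVC(kernel="rbf", C=1.0, gamma="scale")
        clf.fit(train_features, train_labels)
        return clf.predict(candidate_features)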

  11. 2D to 3D to 2D Dimensionality Crossovers in Thin BSCCO Films

    NASA Astrophysics Data System (ADS)

    Williams, Gary A.

    2003-03-01

    With increasing temperature the superfluid fraction in very thin BSCCO films undergoes a series of dimensionality crossovers. At low temperatures the strong anisotropy causes the thermal excitations to be 2D pancake-antipancake pairs in uncoupled layers. At higher temperatures where the c-axis correlation length becomes larger than a layer there is a crossover to 3D vortex loops. These are initially elliptical, but as the 3D Tc is approached they become more circular as the anisotropy scales away, as modeled by Shenoy and Chattopadhyay [1]. Close to Tc when the correlation length becomes comparable to the film thickness there is a further crossover to a 2D Kosterlitz-Thouless transition, with a drop of the superfluid fraction to zero at T_KT which can be of the order of 1 K below T_c. Good agreement with this model is found for experiments on thin BSCCO 2212 films [2]. 1. S. R. Shenoy and B. Chattopadhyay, Phys. Rev. B 51, 9129 (1995). 2. K. Osborn et al., cond-mat/0204417.

  12. Mechanical characterization of 2D, 2D stitched, and 3D braided/RTM materials

    NASA Technical Reports Server (NTRS)

    Deaton, Jerry W.; Kullerd, Susan M.; Portanova, Marc A.

    1993-01-01

    Braided composite materials have potential for application in aircraft structures. Fuselage frames, floor beams, wing spars, and stiffeners are examples where braided composites could find application if cost effective processing and damage tolerance requirements are met. Another important consideration for braided composites relates to their mechanical properties and how they compare to the properties of composites produced by other textile composite processes being proposed for these applications. Unfortunately, mechanical property data for braided composites do not appear extensively in the literature. Data are presented in this paper on the mechanical characterization of 2D triaxial braid, 2D triaxial braid plus stitching, and 3D (through-the-thickness) braid composite materials. The braided preforms all had the same graphite tow size and the same nominal braid architectures, (+/- 30 deg/0 deg), and were resin transfer molded (RTM) using the same mold for each of two different resin systems. Static data are presented for notched and unnotched tension, notched and unnotched compression, and compression after impact strengths at room temperature. In addition, some static results, after environmental conditioning, are included. Baseline tension and compression fatigue results are also presented, but only for the 3D braided composite material with one of the resin systems.

  13. Towards 2D Bayesian Tomography of Receiver Functions

    NASA Astrophysics Data System (ADS)

    Ray, A.; Bodin, T.; Key, K.

    2014-12-01

    Receiver function analysis is a powerful tool widely used to isolate and interpret receiver-side structure effects in teleseismic records. The idea is to deconvolve the vertical component from the horizontal components to produce a time series, thus eliminating the influence of the source and distant path effects. The receiver function is usually migrated and directly interpreted by visual inspection. However, deconvolution is a numerically unstable procedure that needs to be stabilized, and the solution depends on the choice of regularization parameters (e.g. the water level and the width of a low-pass filter). Since the solution is blurred by multiple reflections from the subsurface that produce apparent discontinuities, qualitative interpretation of receiver functions is subjective. Alternatively, waveforms can be directly inverted for a 1D S-wave velocity model beneath the receiver. An inversion procedure is more quantitative, as a forward model will take into account all possible reflections and conversions. If cast in a Bayesian framework, an inversion also enables one to assess model uncertainties and quantify parameter trade-offs. However, seismologists have preferred migration techniques as they are easier to implement, computationally cheaper, and allow construction of 2D or 3D sections. Inversions have thus far been limited to the 1D case. In this work we present a method for inversion of converted waveforms measured at a number of aligned stations. The unknown model is a 2D vertical cross section parameterized with a variable number of discontinuities, although the forward model used to compute synthetics under individual stations is 1D. Body waves are inverted jointly with surface wave dispersion measurements to reduce the range of possible solutions. The problem is solved with a fully nonlinear Bayesian inversion scheme in which the posterior velocity distribution is sampled with a Markov chain Monte Carlo algorithm. Our approach uses the 'trans
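
    The sketch below shows the core Bayesian sampling step in its simplest fixed-dimension form: a Metropolis random walk over a velocity model, accepting proposals according to a Gaussian waveform misfit. The scheme described above is trans-dimensional (the number of discontinuities varies), which would add birth/death moves omitted here; the likelihood, step size, and `forward_model` interface are assumptions.

    import numpy as np

    def log_likelihood(model, data, forward_model, sigma=0.05):
        """Gaussian misfit between observed waveforms and synthetics from the model."""
        residual = data - forward_model(model)
        return -0.5 * np.sum((residual / sigma) ** 2)

    def metropolis(data, forward_model, initial_model, n_steps=10000, step=0.02, seed=0):
        """Random-walk Metropolis sampling of the posterior over model parameters."""
        rng = np.random.default_rng(seed)
        model = np.asarray(initial_model, dtype=float)
        logl = log_likelihood(model, data, forward_model)
        samples = []
        for _ in range(n_steps):
            proposal = model + step * rng.standard_normal(model.shape)
            logl_new = log_likelihood(proposal, data, forward_model)
            if np.log(rng.random()) < logl_new - logl:   # Metropolis accept/reject
                model, logl = proposal, logl_new
            samples.append(model.copy())
        return np.array(samples)                         # posterior draws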

  14. Strongly Metallic Electron and Hole 2D Transport in an Ambipolar Si-Vacuum Field Effect Transistor

    NASA Astrophysics Data System (ADS)

    Hu, Binhui; Yazdanpanah, M. M.; Kane, B. E.; Hwang, E. H.; Das Sarma, S.

    2015-07-01

    We report experiment and theory on an ambipolar gate-controlled Si(111)-vacuum field effect transistor in which we study low-temperature 2D electron and hole transport in the same device, simply by changing the external gate voltage to tune the system from a 2D electron system at positive gate voltage to a 2D hole system at negative gate voltage. The electron (hole) conductivity manifests strong (moderate) metallic temperature dependence, with the conductivity decreasing by a factor of 8 (2) between 0.3 K and 4.2 K and with the peak electron mobility (˜18 m2/V s) being roughly 20 times larger than the peak hole mobility (in the same sample). Our theory explains the data well using random phase approximation