Science.gov

Sample records for 2d computer screen

  1. Computational Screening of 2D Materials for Photocatalysis.

    PubMed

    Singh, Arunima K; Mathew, Kiran; Zhuang, Houlong L; Hennig, Richard G

    2015-03-19

    Two-dimensional (2D) materials exhibit a range of extraordinary electronic, optical, and mechanical properties different from those of their bulk counterparts, with potential applications emerging in energy storage and conversion technologies. In this Perspective, we summarize the recent developments in the field of solar water splitting using 2D materials and review a computational screening approach to rapidly and efficiently discover more 2D materials that possess properties suitable for solar water splitting. Computational tools based on density-functional theory can predict the intrinsic properties of potential photocatalysts, such as their electronic properties, optical absorbance, and solubility in aqueous solutions. Computational tools also enable the exploration of possible routes to enhance the photocatalytic activity of 2D materials by use of mechanical strain, bias potential, doping, and pH. We discuss future research directions and needed method developments for the computational design and optimization of 2D materials for photocatalysis.
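
    As a minimal illustration of the kind of band-edge criterion applied in such screening (not code from the cited work): a candidate photocatalyst's conduction band minimum (CBM) must lie above the hydrogen evolution level and its valence band maximum (VBM) below the oxygen evolution level. The reference values of -4.44 eV and -5.67 eV versus vacuum at pH 0, and the 0.059 eV per pH unit shift, are standard literature numbers assumed here for illustration.

        # Hedged sketch: band-edge criterion for water-splitting photocatalysts.
        # Reference levels vs. vacuum at pH 0 (standard literature values, assumed here):
        #   H+/H2 reduction level:  -4.44 eV;  O2/H2O oxidation level: -5.67 eV
        # Both levels shift by about +0.059 eV per pH unit on the absolute scale.

        def is_water_splitting_candidate(cbm_eV, vbm_eV, pH=0.0):
            """True if the band edges straddle both redox levels at the given pH."""
            e_h2 = -4.44 + 0.059 * pH   # H+/H2 level at this pH
            e_o2 = -5.67 + 0.059 * pH   # O2/H2O level at this pH
            return cbm_eV > e_h2 and vbm_eV < e_o2

        # Hypothetical 2D material with CBM = -3.8 eV and VBM = -6.2 eV, screened at pH 7
        print(is_water_splitting_candidate(-3.8, -6.2, pH=7))   # True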

  2. MAGNUM-2D computer code: user's guide

    SciTech Connect

    England, R.L.; Kline, N.W.; Ekblad, K.J.; Baca, R.G.

    1985-01-01

    Information relevant to the general use of the MAGNUM-2D computer code is presented. This computer code was developed for the purpose of modeling (i.e., simulating) the thermal and hydraulic conditions in the vicinity of a waste package emplaced in a deep geologic repository. The MAGNUM-2D computer code computes (1) the temperature field surrounding the waste package as a function of the heat generation rate of the nuclear waste and thermal properties of the basalt and (2) the hydraulic head distribution and associated groundwater flow fields as a function of the temperature gradients and hydraulic properties of the basalt. MAGNUM-2D is a two-dimensional numerical model for transient or steady-state analysis of coupled heat transfer and groundwater flow in a fractured porous medium. The governing equations consist of a set of coupled, quasi-linear partial differential equations that are solved using a Galerkin finite-element technique. A Newton-Raphson algorithm is embedded in the Galerkin functional to formulate the problem in terms of the incremental changes in the dependent variables. Both triangular and quadrilateral finite elements are used to represent the continuum portions of the spatial domain. Line elements may be used to represent discrete conduits. 18 refs., 4 figs., 1 tab.
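
    For readers unfamiliar with the Newton-Raphson treatment mentioned above, the generic incremental form of such a scheme (a standard construction, not the code's specific equations) is

        % Newton-Raphson step for a nonlinear Galerkin residual R(u) = 0
        \mathbf{J}(\mathbf{u}^{k})\,\Delta\mathbf{u}^{k} = -\mathbf{R}(\mathbf{u}^{k}),
        \qquad
        \mathbf{u}^{k+1} = \mathbf{u}^{k} + \Delta\mathbf{u}^{k},
        \qquad
        J_{ij} = \frac{\partial R_i}{\partial u_j},

    where u collects the nodal temperatures and hydraulic heads and R is the assembled Galerkin residual, so each iteration solves a linear system for the incremental changes in the dependent variables.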

  3. 2D NMR-spectroscopic screening reveals polyketides in ladybugs

    PubMed Central

    Deyrup, Stephen T.; Eckman, Laura E.; McCarthy, Patrick H.; Smedley, Scott R.; Meinwald, Jerrold; Schroeder, Frank C.

    2011-01-01

    Small molecules of biological origin continue to yield the most promising leads for drug design, but systematic approaches for exploring nature’s cache of structural diversity are lacking. Here, we demonstrate the use of 2D NMR spectroscopy to screen a library of biorationally selected insect metabolite samples for partial structures indicating the presence of new chemical entities. This NMR-spectroscopic survey enabled detection of novel compounds in complex metabolite mixtures without prior fractionation or isolation. Our screen led to discovery and subsequent isolation of two families of tricyclic pyrones in Delphastus catalinae, a tiny ladybird beetle that is employed commercially as a biological pest control agent. The D. catalinae pyrones are based on 23-carbon polyketide chains forming 1,11-dioxo-2,6,10-trioxaanthracene and 4,8-dioxo-1,9,13-trioxaanthracene derivatives, representing ring systems not previously found in nature. This study highlights the utility of 2D NMR-spectroscopic screening for exploring nature’s structure space and suggests that insect metabolomes remain vastly underexplored. PMID:21646540

  4. 2D NMR-spectroscopic screening reveals polyketides in ladybugs.

    PubMed

    Deyrup, Stephen T; Eckman, Laura E; McCarthy, Patrick H; Smedley, Scott R; Meinwald, Jerrold; Schroeder, Frank C

    2011-06-14

    Small molecules of biological origin continue to yield the most promising leads for drug design, but systematic approaches for exploring nature's cache of structural diversity are lacking. Here, we demonstrate the use of 2D NMR spectroscopy to screen a library of biorationally selected insect metabolite samples for partial structures indicating the presence of new chemical entities. This NMR-spectroscopic survey enabled detection of novel compounds in complex metabolite mixtures without prior fractionation or isolation. Our screen led to discovery and subsequent isolation of two families of tricyclic pyrones in Delphastus catalinae, a tiny ladybird beetle that is employed commercially as a biological pest control agent. The D. catalinae pyrones are based on 23-carbon polyketide chains forming 1,11-dioxo-2,6,10-trioxaanthracene and 4,8-dioxo-1,9,13-trioxaanthracene derivatives, representing ring systems not previously found in nature. This study highlights the utility of 2D NMR-spectroscopic screening for exploring nature's structure space and suggests that insect metabolomes remain vastly underexplored. PMID:21646540

  5. Practical Algorithm For Computing The 2-D Arithmetic Fourier Transform

    NASA Astrophysics Data System (ADS)

    Reed, Irving S.; Choi, Y. Y.; Yu, Xiaoli

    1989-05-01

    Recently, Tufts and Sadasiv [10] presented a method for computing the coefficients of a Fourier series of a periodic function using the Möbius inversion of series. They called this method of analysis the Arithmetic Fourier Transform (AFT). The advantage of the AFT over the FFT is that this method of Fourier analysis needs only addition operations except for multiplications by scale factors at one stage of the computation. The disadvantage of the AFT as they expressed it originally is that it could be used effectively only to compute finite Fourier coefficients of a real even function. To remedy this the AFT developed in [10] is extended in [11] to compute the Fourier coefficients of both the even and odd components of a periodic function. In this paper, the improved AFT [11] is extended to a two-dimensional (2-D) Arithmetic Fourier Transform for calculating the Fourier Transform of two-dimensional discrete signals. This new algorithm is based on both the number-theoretic method of Möbius inversion of double series and the complex conjugate property of Fourier coefficients. The advantage of this algorithm over the conventional 2-D FFT is that the corner-turning problem needed in a conventional 2-D Discrete Fourier Transform (DFT) can be avoided. Therefore, this new 2-D algorithm is readily suitable for VLSI implementation as a parallel architecture. Comparing the operations of the 2-D AFT of an M×M 2-D data array with the conventional 2-D FFT, the number of multiplications is significantly reduced from (2 log₂M)M² to (9/4)M². Hence, this new algorithm is faster than the FFT algorithm. Finally, two simulation results of this new 2-D AFT algorithm for 2-D artificial and real images are given in this paper.
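
    As a concrete illustration of the Möbius-inversion principle on which the AFT rests (a 1D sketch only, not the authors' 2-D algorithm): for an even, zero-mean signal f(t) = sum_k a_k cos(2*pi*k*t) of period 1, the equally spaced averages B(n) = (1/n) sum_{m=0..n-1} f(m/n) satisfy B(n) = sum_j a_{jn}, so the coefficients follow by Möbius inversion, a_n = sum_j mu(j) B(jn), using only additions apart from the 1/n scale factors.

        # Hedged 1D sketch of the Arithmetic Fourier Transform idea (Moebius inversion).
        # Assumes an even, zero-mean signal with at most N cosine harmonics.
        import math

        def mobius(n):
            """Moebius function mu(n) by trial factorization (fine for small n)."""
            mu, d = 1, 2
            while d * d <= n:
                if n % d == 0:
                    n //= d
                    if n % d == 0:
                        return 0          # squared prime factor
                    mu = -mu
                d += 1
            return -mu if n > 1 else mu

        def aft_cosine_coefficients(f, N):
            """Return a_1..a_N for f(t) = sum_k a_k cos(2*pi*k*t), period 1, zero mean."""
            B = {n: sum(f(m / n) for m in range(n)) / n for n in range(1, N + 1)}
            return [sum(mobius(j) * B[j * n] for j in range(1, N // n + 1))
                    for n in range(1, N + 1)]

        f = lambda t: 0.5 * math.cos(2 * math.pi * t) + 0.25 * math.cos(6 * math.pi * t)
        print(aft_cosine_coefficients(f, 4))   # approximately [0.5, 0.0, 0.25, 0.0]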

  6. Screening and transport in 2D semiconductor systems at low temperatures

    PubMed Central

    Das Sarma, S.; Hwang, E. H.

    2015-01-01

    Low temperature carrier transport properties in 2D semiconductor systems can be theoretically well-understood within RPA-Boltzmann theory as being limited by scattering from screened Coulomb disorder arising from random quenched charged impurities in the environment. In this work, we derive a number of analytical formulas, supported by realistic numerical calculations, for the relevant density, mobility, and temperature range where 2D transport should manifest strong intrinsic (i.e., arising purely from electronic effects) metallic temperature dependence in different semiconductor materials arising entirely from the 2D screening properties, thus providing an explanation for why the strong temperature dependence of the 2D resistivity can only be observed in high-quality and low-disorder 2D samples and also why some high-quality 2D materials manifest much weaker metallicity than other materials. We also discuss effects of interaction and disorder on the 2D screening properties in this context as well as compare 2D and 3D screening functions to comment on why such a strong intrinsic temperature dependence arising from screening cannot occur in 3D metallic carrier transport. Experimentally verifiable predictions are made about the quantitative magnitude of the maximum possible low-temperature metallicity in 2D systems and the scaling behavior of the temperature scale controlling the quantum to classical crossover. PMID:26572738
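
    The screening physics invoked here can be anchored by a standard textbook result (quoted for orientation; not an equation lifted from the paper): because the 2D density of states is constant, the long-wavelength polarizability, and hence the Thomas-Fermi screening wavevector, carries an exponentially strong temperature dependence,

        % Long-wavelength 2D polarizability at temperature T (single parabolic band)
        \Pi(q \to 0, T) = \frac{m}{\pi\hbar^{2}}\left(1 - e^{-T_F/T}\right),
        \qquad
        q_{\mathrm{TF}}(T) = \frac{2\pi e^{2}}{\kappa}\,\Pi(q \to 0, T),

    where kappa is the background dielectric constant, so screening of charged-impurity scattering weakens rapidly as T approaches the Fermi temperature T_F, whereas the corresponding corrections in 3D metallic transport are only of order (T/T_F)^2.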

  7. Computing 2D constrained delaunay triangulation using the GPU.

    PubMed

    Qi, Meng; Cao, Thanh-Tung; Tan, Tiow-Seng

    2013-05-01

    We propose the first graphics processing unit (GPU) solution to compute the 2D constrained Delaunay triangulation (CDT) of a planar straight line graph (PSLG) consisting of points and edges. There are many existing CPU algorithms to solve the CDT problem in computational geometry, yet there has been no prior approach to solve this problem efficiently using the parallel computing power of the GPU. For the special case of the CDT problem where the PSLG consists of just points, which is simply the normal Delaunay triangulation (DT) problem, a hybrid approach using the GPU together with the CPU to partially speed up the computation has already been presented in the literature. Our work, on the other hand, accelerates the entire computation on the GPU. Our implementation using the CUDA programming model on NVIDIA GPUs is numerically robust, and runs up to an order of magnitude faster than the best sequential implementations on the CPU. This result is reflected in our experiment with both randomly generated PSLGs and real-world GIS data having millions of points and edges.

  8. Preconditioning 2D Integer Data for Fast Convex Hull Computations.

    PubMed

    Cadenas, José Oswaldo; Megson, Graham M; Luengo Hendriks, Cris L

    2016-01-01

    In order to accelerate computing the convex hull on a set of n points, a heuristic procedure is often applied to reduce the number of points to a set of s points, s ≤ n, which also contains the same hull. We present an algorithm to precondition 2D data with integer coordinates bounded by a box of size p × q before building a 2D convex hull, with three distinct advantages. First, we prove that under the condition min(p, q) ≤ n the algorithm executes in time within O(n); second, no explicit sorting of data is required; and third, the reduced set of s points forms a simple polygonal chain and thus can be directly pipelined into an O(n) time convex hull algorithm. This paper empirically evaluates and quantifies the speed up gained by preconditioning a set of points by a method based on the proposed algorithm before using common convex hull algorithms to build the final hull. A speedup factor of at least four is consistently found from experiments on various datasets when the condition min(p, q) ≤ n holds; the smaller the ratio min(p, q)/n is in the dataset, the greater the speedup factor achieved. PMID:26938221
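
    A minimal sketch of the column-extrema idea behind such preconditioning (a simplified illustration, not the authors' exact algorithm): for integer points in a p × q box, bucket the points by x coordinate and keep only the minimum and maximum y in each occupied column; a point lying strictly between two others in its column cannot be a hull vertex, and walking the buckets in index order yields the reduced set already ordered by x without an explicit sort.

        # Hedged sketch: precondition integer 2D points before a convex hull computation.
        # Keeps only the lowest and highest point in each x column (O(n + p), no sort).

        def precondition_columns(points, p):
            """points: iterable of (x, y) with 0 <= x < p; returns reduced list ordered by x."""
            lo = [None] * p
            hi = [None] * p
            for x, y in points:
                if lo[x] is None or y < lo[x]:
                    lo[x] = y
                if hi[x] is None or y > hi[x]:
                    hi[x] = y
            reduced = []
            for x in range(p):                      # bucket order stands in for sorting
                if lo[x] is not None:
                    reduced.append((x, lo[x]))
                    if hi[x] != lo[x]:
                        reduced.append((x, hi[x]))
            return reduced

        def cross(o, a, b):
            return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

        def monotone_chain_hull(pts):
            """Andrew's monotone chain; pts must already be sorted by (x, y),
            which the preconditioned output is."""
            if len(pts) <= 2:
                return list(pts)
            lower, upper = [], []
            for q in pts:
                while len(lower) >= 2 and cross(lower[-2], lower[-1], q) <= 0:
                    lower.pop()
                lower.append(q)
            for q in reversed(pts):
                while len(upper) >= 2 and cross(upper[-2], upper[-1], q) <= 0:
                    upper.pop()
                upper.append(q)
            return lower[:-1] + upper[:-1]

        pts = [(0, 0), (1, 2), (1, 1), (2, 3), (2, 0), (3, 1), (4, 4), (4, 0)]
        print(monotone_chain_hull(precondition_columns(pts, 5)))
        # [(0, 0), (4, 0), (4, 4), (2, 3), (1, 2)]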

  9. Preconditioning 2D Integer Data for Fast Convex Hull Computations

    PubMed Central

    2016-01-01

    In order to accelerate computing the convex hull on a set of n points, a heuristic procedure is often applied to reduce the number of points to a set of s points, s ≤ n, which also contains the same hull. We present an algorithm to precondition 2D data with integer coordinates bounded by a box of size p × q before building a 2D convex hull, with three distinct advantages. First, we prove that under the condition min(p, q) ≤ n the algorithm executes in time within O(n); second, no explicit sorting of data is required; and third, the reduced set of s points forms a simple polygonal chain and thus can be directly pipelined into an O(n) time convex hull algorithm. This paper empirically evaluates and quantifies the speed up gained by preconditioning a set of points by a method based on the proposed algorithm before using common convex hull algorithms to build the final hull. A speedup factor of at least four is consistently found from experiments on various datasets when the condition min(p, q) ≤ n holds; the smaller the ratio min(p, q)/n is in the dataset, the greater the speedup factor achieved. PMID:26938221

  10. Sparse and incomplete factorial matrices to screen membrane protein 2D crystallization.

    PubMed

    Lasala, R; Coudray, N; Abdine, A; Zhang, Z; Lopez-Redondo, M; Kirshenbaum, R; Alexopoulos, J; Zolnai, Z; Stokes, D L; Ubarretxena-Belandia, I

    2015-02-01

    Electron crystallography is well suited for studying the structure of membrane proteins in their native lipid bilayer environment. This technique relies on electron cryomicroscopy of two-dimensional (2D) crystals, grown generally by reconstitution of purified membrane proteins into proteoliposomes under conditions favoring the formation of well-ordered lattices. Growing these crystals presents one of the major hurdles in the application of this technique. To identify conditions favoring crystallization a wide range of factors that can lead to a vast matrix of possible reagent combinations must be screened. However, in 2D crystallization these factors have traditionally been surveyed in a relatively limited fashion. To address this problem we carried out a detailed analysis of published 2D crystallization conditions for 12 β-barrel and 138 α-helical membrane proteins. From this analysis we identified the most successful conditions and applied them in the design of new sparse and incomplete factorial matrices to screen membrane protein 2D crystallization. Using these matrices we have run 19 crystallization screens for 16 different membrane proteins totaling over 1300 individual crystallization conditions. Six membrane proteins have yielded diffracting 2D crystals suitable for structure determination, indicating that these new matrices show promise to accelerate the success rate of membrane protein 2D crystallization.

  11. Sparse and incomplete factorial matrices to screen membrane protein 2D crystallization

    PubMed Central

    Lasala, R.; Coudray, N.; Abdine, A.; Zhang, Z.; Lopez-Redondo, M.; Kirshenbaum, R.; Alexopoulos, J.; Zolnai, Z.; Stokes, D.L.; Ubarretxena-Belandia, I.

    2014-01-01

    Electron crystallography is well suited for studying the structure of membrane proteins in their native lipid bilayer environment. This technique relies on electron cryomicroscopy of two-dimensional (2D) crystals, grown generally by reconstitution of purified membrane proteins into proteoliposomes under conditions favoring the formation of well-ordered lattices. Growing these crystals presents one of the major hurdles in the application of this technique. To identify conditions favoring crystallization a wide range of factors that can lead to a vast matrix of possible reagent combinations must be screened. However, in 2D crystallization these factors have traditionally been surveyed in a relatively limited fashion. To address this problem we carried out a detailed analysis of published 2D crystallization conditions for 12 β-barrel and 138 α-helical membrane proteins. From this analysis we identified the most successful conditions and applied them in the design of new sparse and incomplete factorial matrices to screen membrane protein 2D crystallization. Using these matrices we have run 19 crystallization screens for 16 different membrane proteins totaling over 1,300 individual crystallization conditions. Six membrane proteins have yielded diffracting 2D crystals suitable for structure determination, indicating that these new matrices show promise to accelerate the success rate of membrane protein 2D crystallization. PMID:25478971

  12. Sparse and incomplete factorial matrices to screen membrane protein 2D crystallization.

    PubMed

    Lasala, R; Coudray, N; Abdine, A; Zhang, Z; Lopez-Redondo, M; Kirshenbaum, R; Alexopoulos, J; Zolnai, Z; Stokes, D L; Ubarretxena-Belandia, I

    2015-02-01

    Electron crystallography is well suited for studying the structure of membrane proteins in their native lipid bilayer environment. This technique relies on electron cryomicroscopy of two-dimensional (2D) crystals, grown generally by reconstitution of purified membrane proteins into proteoliposomes under conditions favoring the formation of well-ordered lattices. Growing these crystals presents one of the major hurdles in the application of this technique. To identify conditions favoring crystallization a wide range of factors that can lead to a vast matrix of possible reagent combinations must be screened. However, in 2D crystallization these factors have traditionally been surveyed in a relatively limited fashion. To address this problem we carried out a detailed analysis of published 2D crystallization conditions for 12 β-barrel and 138 α-helical membrane proteins. From this analysis we identified the most successful conditions and applied them in the design of new sparse and incomplete factorial matrices to screen membrane protein 2D crystallization. Using these matrices we have run 19 crystallization screens for 16 different membrane proteins totaling over 1300 individual crystallization conditions. Six membrane proteins have yielded diffracting 2D crystals suitable for structure determination, indicating that these new matrices show promise to accelerate the success rate of membrane protein 2D crystallization. PMID:25478971

  13. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates

    PubMed Central

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include: • The implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms. • The modular approach along with the idea of lookup tables implemented help avoid the issue of indeterminate results which may occur when attempting to directly evaluate the transform. • The concept also helps prevent unnecessary computation of already known transforms thereby saving memory and processing time. PMID:26150988
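
    The combination of "two significantly simpler transforms" mentioned above rests on a standard identity (stated here in one common convention, which may differ from the toolbox's): expanding the function in an angular Fourier series reduces the 2D Fourier transform to a set of Hankel transforms of matching order,

        % 2D Fourier transform in polar coordinates: Fourier series + Hankel transforms
        f(r,\theta) = \sum_{n=-\infty}^{\infty} f_n(r)\, e^{in\theta}
        \;\;\Longrightarrow\;\;
        F(q,\psi) = 2\pi \sum_{n=-\infty}^{\infty} (-i)^n\, e^{in\psi}
                    \int_0^{\infty} f_n(r)\, J_n(qr)\, r \, dr ,

    where each radial integral is an nth-order Hankel transform and J_n is the Bessel function of the first kind.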

  14. Building a symbolic computer algebra toolbox to compute 2D Fourier transforms in polar coordinates.

    PubMed

    Dovlo, Edem; Baddour, Natalie

    2015-01-01

    The development of a symbolic computer algebra toolbox for the computation of two dimensional (2D) Fourier transforms in polar coordinates is presented. Multidimensional Fourier transforms are widely used in image processing, tomographic reconstructions and in fact any application that requires a multidimensional convolution. By examining a function in the frequency domain, additional information and insights may be obtained. The advantages of our method include: • The implementation of the 2D Fourier transform in polar coordinates within the toolbox via the combination of two significantly simpler transforms. • The modular approach along with the idea of lookup tables implemented help avoid the issue of indeterminate results which may occur when attempting to directly evaluate the transform. • The concept also helps prevent unnecessary computation of already known transforms thereby saving memory and processing time.

  15. Implementation of 2D computational models for NDE on GPU

    NASA Astrophysics Data System (ADS)

    Bardel, Charles; Lei, Naiguang; Udpa, Lalita

    2012-05-01

    This paper presents an attempt to implement a simulation model for electromagnetic NDE on a GPU. A sample electromagnetic NDE problem is examined and the solution is computed on both CPU and GPU. Different matrix storage formats and matrix-vector computational strategies are investigated. An analysis of the storage requirements for the matrix on the GPU is tabulated and a full timing breakdown of the process is presented and discussed.
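
    As a small illustration of the storage-format question being benchmarked (a CPU sketch with SciPy; GPU libraries expose analogous compressed formats such as CSR and ELLPACK), the sparse matrix-vector product at the heart of such finite-element solvers reads directly from the three CSR arrays:

        # Hedged illustration: CSR storage and the sparse matrix-vector product
        # that a GPU implementation of such an NDE model would repeatedly evaluate.
        import numpy as np
        from scipy.sparse import csr_matrix

        A = csr_matrix(np.array([[4.0, 0.0, 1.0],
                                 [0.0, 3.0, 0.0],
                                 [1.0, 0.0, 2.0]]))   # small FEM-style system matrix
        x = np.array([1.0, 2.0, 3.0])

        print(A @ x)                          # matrix-vector product, the timed kernel
        print(A.data, A.indices, A.indptr)    # the three arrays a CSR kernel traverses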

  16. Analysis and comparison of 2D fingerprints: insights into database screening performance using eight fingerprint methods.

    PubMed

    Duan, Jianxin; Dixon, Steven L; Lowrie, Jeffrey F; Sherman, Woody

    2010-09-01

    Virtual screening is a widely used strategy in modern drug discovery and 2D fingerprint similarity is an important tool that has been successfully applied to retrieve active compounds from large datasets. However, it is not always straightforward to select an appropriate fingerprint method and associated settings for a given problem. Here, we applied eight different fingerprint methods, as implemented in the new cheminformatics package Canvas, on a well-validated dataset covering five targets. The fingerprint methods include Linear, Dendritic, Radial, MACCS, MOLPRINT2D, Pairwise, Triplet, and Torsion. We find that most fingerprints have similar retrieval rates on average; however, each has special characteristics that distinguish its performance on different query molecules and ligand sets. For example, some fingerprints exhibit a significant ligand size dependency whereas others are more robust with respect to variations in the query or active compounds. In cases where little information is known about the active ligands, MOLPRINT2D fingerprints produce the highest average retrieval of actives. When multiple queries are available, we find that a fingerprint averaged over all query molecules is generally superior to fingerprints derived from single queries. Finally, a complementarity metric is proposed to determine which fingerprint methods can be combined to improve screening results.
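
    As a hedged, loosely analogous illustration using the open-source RDKit rather than Canvas (RDKit's path-based and Morgan fingerprints are close relatives of the Linear and Radial types named above), fingerprint-based retrieval reduces to computing Tanimoto similarities between bit vectors:

        # Hedged illustration with RDKit as a stand-in for the Canvas fingerprints.
        from rdkit import Chem, DataStructs
        from rdkit.Chem import AllChem

        query  = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")   # aspirin, as the query
        active = Chem.MolFromSmiles("OC(=O)c1ccccc1O")          # salicylic acid, a "hit"

        # Linear (path-based) and radial (Morgan/ECFP-like) fingerprints
        fp_lin_q, fp_lin_a = Chem.RDKFingerprint(query), Chem.RDKFingerprint(active)
        fp_rad_q = AllChem.GetMorganFingerprintAsBitVect(query, 2, nBits=2048)
        fp_rad_a = AllChem.GetMorganFingerprintAsBitVect(active, 2, nBits=2048)

        print("linear Tanimoto:", DataStructs.TanimotoSimilarity(fp_lin_q, fp_lin_a))
        print("radial Tanimoto:", DataStructs.TanimotoSimilarity(fp_rad_q, fp_rad_a))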

  17. Cytochrome P450-2D6 Screening Among Elderly Using Antidepressants (CYSCE)

    ClinicalTrials.gov

    2016-10-24

    Depression; Depressive Disorder; Poor Metabolizer Due to Cytochrome P450 CYP2D6 Variant; Intermediate Metabolizer Due to Cytochrome P450 CYP2D6 Variant; Ultrarapid Metabolizer Due to Cytochrome P450 CYP2D6 Variant

  18. GEO2D - Two-Dimensional Computer Model of a Ground Source Heat Pump System

    DOE Data Explorer

    James Menart

    2013-06-07

    This file contains a zipped file that contains many files required to run GEO2D. GEO2D is a computer code for simulating ground source heat pump (GSHP) systems in two-dimensions. GEO2D performs a detailed finite difference simulation of the heat transfer occurring within the working fluid, the tube wall, the grout, and the ground. Both horizontal and vertical wells can be simulated with this program, but it should be noted that the vertical well is modeled as a single tube. This program also models the heat pump in conjunction with the heat transfer occurring. GEO2D simulates the heat pump and ground loop as a system. Many results are produced by GEO2D as a function of time and position, such as heat transfer rates, temperatures and heat pump performance. On top of this, GEO2D provides an economic comparison between the simulated geothermal system and a comparable air heat pump system, or comparable gas, oil or propane heating systems with a vapor compression air conditioner. The version of GEO2D in the attached file has been coupled to the DOE heating and cooling load software called ENERGYPLUS. This is a great convenience for the user because heating and cooling loads are an input to GEO2D. GEO2D is a user friendly program that uses a graphical user interface for inputs and outputs. These make entering data simple and they produce many plotted results that are easy to understand. In order to run GEO2D, access to MATLAB is required. If this program is not available on your computer you can download the program MCRInstaller.exe, the 64 bit version, from the MATLAB website or from this geothermal repository. This is a free download which will enable you to run GEO2D.
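
    As a rough illustration of the kind of finite-difference ground heat-transfer update such a tool performs internally (a toy explicit conduction step on a uniform grid, not GEO2D's actual scheme, geometry, or property values):

        # Hedged toy example: explicit finite-difference step for 2D ground conduction,
        # dT/dt = alpha * (T_xx + T_yy), with a fixed-temperature borehole cell.
        import numpy as np

        nx, ny = 50, 50
        dx = 0.1                        # m (square cells)
        alpha = 1.0e-6                  # m^2/s, generic soil diffusivity (assumed)
        dt = 0.2 * dx**2 / alpha        # satisfies the explicit limit dt <= dx^2/(4*alpha)

        T = np.full((nx, ny), 12.0)     # initial ground temperature, deg C (assumed)
        T_borehole = 4.0                # borehole wall held cold by the heat pump (assumed)

        for step in range(1000):
            T[nx // 2, ny // 2] = T_borehole                 # fix the borehole cell
            lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
                   np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
            T = T + alpha * dt * lap
            T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = 12.0   # undisturbed far field

        print("ground temperature one cell from the borehole:",
              round(float(T[nx // 2 + 1, ny // 2]), 2), "deg C")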

  19. 8. DETAIL OF COMPUTER SCREEN AND CONTROL BOARDS: LEFT SCREEN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. DETAIL OF COMPUTER SCREEN AND CONTROL BOARDS: LEFT SCREEN TRACKS RESIDUAL CHLORINE; INDICATES AMOUNT OF SUNLIGHT WHICH ENABLES OPERATOR TO ESTIMATE NEEDED CHLORINE; CENTER SCREEN SHOWS TURNOUT STRUCTURES; RIGHT SCREEN SHOWS INDICATORS OF ALUMINUM SULFATE TANK FARM. - F. E. Weymouth Filtration Plant, 700 North Moreno Avenue, La Verne, Los Angeles County, CA

  20. CAST2D: A finite element computer code for casting process modeling

    SciTech Connect

    Shapiro, A.B.; Hallquist, J.O.

    1991-10-01

    CAST2D is a coupled thermal-stress finite element computer code for casting process modeling. This code can be used to predict the final shape and stress state of cast parts. CAST2D couples the heat transfer code TOPAZ2D and solid mechanics code NIKE2D. CAST2D has the following features in addition to all the features contained in the TOPAZ2D and NIKE2D codes: (1) a general purpose thermal-mechanical interface algorithm (i.e., slide line) that calculates the thermal contact resistance across the part-mold interface as a function of interface pressure and gap opening; (2) a new phase change algorithm, the delta function method, that is a robust method for materials undergoing isothermal phase change; (3) a constitutive model that transitions between fluid behavior and solid behavior, and accounts for material volume change on phase change; and (4) a modified plot file data base that allows plotting of thermal variables (e.g., temperature, heat flux) on the deformed geometry. Although the code is specialized for casting modeling, it can be used for other thermal stress problems (e.g., metal forming).

  1. Topological evolutionary computing in the optimal design of 2D and 3D structures

    NASA Astrophysics Data System (ADS)

    Burczynski, T.; Poteralski, A.; Szczepanik, M.

    2007-10-01

    An application of evolutionary algorithms and the finite-element method to the topology optimization of 2D structures (plane stress, bending plates, and shells) and 3D structures is described. The basis of the topological evolutionary optimization is the direct control of the material density distribution (or thickness for 2D structures) by the evolutionary algorithm. The structures are optimized for stress, mass, and compliance criteria. The numerical examples demonstrate that this method is an effective technique for solving problems in computer-aided optimal design.

  2. Investigations on the sensitivity of the computer code TURBO-2D

    NASA Astrophysics Data System (ADS)

    Amon, B.

    1994-12-01

    The two-dimensional computer model TURBO-2D for the calculation of two-phase flow was used to simulate the cold injection of fuel into a model chamber. The sensitivity of the computed results to the input parameters was investigated. In addition, calculations using experimental injection pressure data and the corresponding averaged injection parameters were performed and compared.

  3. The potential for CYP2D6 inhibition screening using a novel scintillation proximity assay-based approach.

    PubMed

    Delaporte, E; Slaughter, D E; Egan, M A; Gatto, G J; Santos, A; Shelley, J; Price, E; Howells, L; Dean, D C; Rodrigues, A D

    2001-08-01

    High throughput inhibition screens for human cytochrome P450s (CYPs) are being used in preclinical drug metabolism to support drug discovery programs. The versatility of scintillation proximity assay (SPA) technology has enabled the development of a homogeneous high throughput assay for cytochrome P450 2D6 (CYP2D6) inhibition screen using [O-methyl-(14)C]dextromethorphan as substrate. The basis of the assay was the trapping of the O-demethylation product, [(14)C]HCHO, on SPA beads. Enzyme kinetics parameters V(max) and apparent K(m), determined using pooled human liver microsomes and microsomes from baculovirus cells coexpressing human CYP2D6 and NADPH-cytochrome P450 reductase, were 245 pmol [(14)C]HCHO/min/mg protein and 11 microM, and 27 pmol [(14)C]HCHO/min/pmol and 1.6 microM, respectively. In incubations containing either pooled microsomes or recombinant CYP2D6, [(14)C]dextromethorphan O-demethylase activity was inhibited in the presence of quinidine (IC(50) = 1.0 microM and 20 nM, respectively). By comparison, inhibitors selective for other CYP isoforms were relatively weak (IC(50) > 25 microM). In agreement, a selective CYP2D6 inhibitory monoclonal antibody caused greater than 90% inhibition of [(14)C]dextromethorphan O-demethylase activity in human liver microsomes, whereas CYP2C9/19- and CYP3A4/5-selective antibodies elicited a minimal inhibitory effect. SPA-based [(14)C]dextromethorphan O-demethylase activity was also shown to correlate (r(2) = 0.6) with dextromethorphan O-demethylase measured by high-performance liquid chromatography in a bank of human liver microsomes (N = 15 different organ donors). In a series of known CYP2D6 inhibitors/substrates, the SPA-based assay resolved potent inhibitors (IC(50) < 2 microM) from weak inhibitors (IC(50) ≥ 20 microM). It is concluded that the SPA-based assay described herein is suitable for CYP2D6 inhibition screening using either native human liver microsomes or cDNA-expressed CYP2D6. PMID:11689122
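
    For reference, the kinetic quantities reported above follow the standard Michaelis-Menten and four-parameter-logistic relations (generic definitions, not equations specific to this assay):

        % Michaelis-Menten rate and the logistic form commonly fitted to obtain IC50
        v = \frac{V_{\max}\,[S]}{K_m + [S]},
        \qquad
        \frac{v_i}{v_0} = \frac{1}{1 + \left([I]/\mathrm{IC}_{50}\right)^{h}},

    where [S] is the dextromethorphan concentration, [I] the inhibitor concentration, v_i/v_0 the fractional remaining activity, and h the Hill slope.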

  4. Fast acceleration of 2D wave propagation simulations using modern computational accelerators.

    PubMed

    Wang, Wei; Xu, Lifan; Cavazos, John; Huang, Howie H; Kay, Matthew

    2014-01-01

    Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as an Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case-study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than 150x speedup over the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation ran on GPUs at least 200x faster than the sequential implementation and 30x faster than a parallelized OpenMP implementation. An implementation of OpenMP on the Intel MIC coprocessor provided speedups of 120x with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other computational models of

  5. GPU computing with OpenCL to model 2D elastic wave propagation: exploring memory usage

    NASA Astrophysics Data System (ADS)

    Iturrarán-Viveros, Ursula; Molero-Armenta, Miguel

    2015-01-01

    Graphics processing units (GPUs) have become increasingly powerful in recent years. Programs exploring the advantages of this architecture could achieve large performance gains and this is the aim of new initiatives in high performance computing. The objective of this work is to develop an efficient tool to model 2D elastic wave propagation on parallel computing devices. To this end, we implement the elastodynamic finite integration technique, using the industry open standard open computing language (OpenCL) for cross-platform, parallel programming of modern processors, and an open-source toolkit called [Py]OpenCL. The code written with [Py]OpenCL can run on a wide variety of platforms; it can be used on AMD or NVIDIA GPUs as well as classical multicore CPUs, adapting to the underlying architecture. Our main contribution is its implementation with local and global memory and the performance analysis using five different computing devices (including Kepler, one of the fastest and most efficient high performance computing technologies) with various operating systems.

  6. Breast density measurement: 3D cone beam computed tomography (CBCT) images versus 2D digital mammograms

    NASA Astrophysics Data System (ADS)

    Han, Tao; Lai, Chao-Jen; Chen, Lingyun; Liu, Xinming; Shen, Youtao; Zhong, Yuncheng; Ge, Shuaiping; Yi, Ying; Wang, Tianpeng; Yang, Wei T.; Shaw, Chris C.

    2009-02-01

    Breast density has been recognized as one of the major risk factors for breast cancer. However, breast density is currently estimated using mammograms which are intrinsically 2D in nature and cannot accurately represent the real breast anatomy. In this study, a novel technique for measuring breast density based on the segmentation of 3D cone beam CT (CBCT) images was developed and the results were compared to those obtained from 2D digital mammograms. 16 mastectomy breast specimens were imaged with a bench top flat-panel based CBCT system. The reconstructed 3D CT images were corrected for the cupping artifacts and then filtered to reduce the noise level, followed by using threshold-based segmentation to separate the dense tissue from the adipose tissue. For each breast specimen, volumes of the dense tissue structures and the entire breast were computed and used to calculate the volumetric breast density. BI-RADS categories were derived from the measured breast densities and compared with those estimated from conventional digital mammograms. The results show that in 10 of 16 cases the BI-RADS categories derived from the CBCT images were lower than those derived from the mammograms by one category. Thus, breasts considered as dense in mammographic examinations may not be considered as dense with the CBCT images. This result indicates that the relation between breast cancer risk and true (volumetric) breast density needs to be further investigated.
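
    A minimal sketch of the volumetric-density step described above (threshold-based segmentation of a masked volume; the threshold value and units below are placeholders, and the cited work additionally corrects cupping artifacts and filters noise before thresholding):

        # Hedged sketch: volumetric breast density from a segmented CT volume.
        import numpy as np

        def volumetric_breast_density(volume, breast_mask, dense_threshold=0.0):
            """Fraction of breast voxels above the fibroglandular threshold (assumed)."""
            breast_voxels = volume[breast_mask]
            return float((breast_voxels > dense_threshold).sum()) / breast_voxels.size

        # Tiny synthetic example: an 8x8x8 "breast" with a dense core
        vol = np.full((8, 8, 8), -100.0)        # fat-like voxel values (arbitrary units)
        vol[2:6, 2:6, 2:6] = 50.0               # dense fibroglandular core
        mask = np.ones_like(vol, dtype=bool)    # pretend the whole cube is breast tissue
        print(volumetric_breast_density(vol, mask))   # 64/512 = 0.125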

  7. Computer program BL2D for solving two-dimensional and axisymmetric boundary layers

    NASA Technical Reports Server (NTRS)

    Iyer, Venkit

    1995-01-01

    This report presents the formulation, validation, and user's manual for the computer program BL2D. The program is a fourth-order-accurate solution scheme for solving two-dimensional or axisymmetric boundary layers in speed regimes that range from low subsonic to hypersonic Mach numbers. A basic implementation of the transition zone and turbulence modeling is also included. The code is a result of many improvements made to the program VGBLP, which is described in NASA TM-83207 (February 1982), and can effectively supersede it. The code BL2D is designed to be modular, user-friendly, and portable to any machine with a standard fortran77 compiler. The report contains the new formulation adopted and the details of its implementation. Five validation cases are presented. A detailed user's manual with the input format description and instructions for running the code is included. Adequate information is presented in the report to enable the user to modify or customize the code for specific applications.

  8. Computer program BL2D for solving two-dimensional and axisymmetric boundary layers

    NASA Astrophysics Data System (ADS)

    Iyer, Venkit

    1995-05-01

    This report presents the formulation, validation, and user's manual for the computer program BL2D. The program is a fourth-order-accurate solution scheme for solving two-dimensional or axisymmetric boundary layers in speed regimes that range from low subsonic to hypersonic Mach numbers. A basic implementation of the transition zone and turbulence modeling is also included. The code is a result of many improvements made to the program VGBLP, which is described in NASA TM-83207 (February 1982), and can effectively supersede it. The code BL2D is designed to be modular, user-friendly, and portable to any machine with a standard fortran77 compiler. The report contains the new formulation adopted and the details of its implementation. Five validation cases are presented. A detailed user's manual with the input format description and instructions for running the code is included. Adequate information is presented in the report to enable the user to modify or customize the code for specific applications.

  9. Computational study of acoustic solitary waves in 2D complex plasma

    NASA Astrophysics Data System (ADS)

    Garee, M. J.; Sheridan, T. E.

    2008-03-01

    A one-dimensional, nonlinear model has been developed for dust-acoustic (DA) waves in a two-dimensional complex plasma. In our model, identical charged dust particles reside on a periodic triangular lattice with lattice constant a. These particles are constrained to move in one dimension, and interact with each other via a screened Coulomb force with Debye length λD. The model is used to compute the dependence of the DA wave speed on the screening parameter κ=a/λD. Computed wave speeds show excellent agreement with theoretical predictions, thereby verifying the model. Total energy is also conserved, as it should be. Localized velocity perturbations are found to evolve into compressive solitary waves and to propagate through the lattice with speeds greater than the DA wave speed. Rarefactive solitary waves are not observed. We intend to characterize overtaking collisions of solitary waves in this system to determine if the phase shift predicted by Korteweg-de Vries (KdV) theory occurs, and to compare computed solitary wave widths, amplitudes and speeds to the scalings predicted for KdV solitons.
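
    The screened Coulomb (Yukawa) pair interaction assumed in this lattice model has the standard form (quoted for orientation):

        % Screened Coulomb (Yukawa) interaction energy between dust grains of charge Q
        U(r) = \frac{Q^{2}}{4\pi\varepsilon_{0}\, r}\, e^{-r/\lambda_{D}},
        \qquad
        \kappa = \frac{a}{\lambda_{D}},

    so neighbors separated by the lattice constant a interact with a strength and range set by the screening parameter kappa, and, in suitably scaled units, the computed dust-acoustic wave speed depends only on kappa.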

  10. Adiabatic and Hamiltonian computing on a 2D lattice with simple two-qubit interactions

    NASA Astrophysics Data System (ADS)

    Lloyd, Seth; Terhal, Barbara M.

    2016-02-01

    We show how to perform universal Hamiltonian and adiabatic computing using a time-independent Hamiltonian on a 2D grid describing a system of hopping particles which string together and interact to perform the computation. In this construction, the movement of one particle is controlled by the presence or absence of other particles, an effective quantum field effect transistor that allows the construction of controlled-NOT and controlled-rotation gates. The construction translates into a model for universal quantum computation with time-independent two-qubit ZZ and XX+YY interactions on an (almost) planar grid. The effective Hamiltonian is arrived at by a single use of first-order perturbation theory avoiding the use of perturbation gadgets. The dynamics and spectral properties of the effective Hamiltonian can be fully determined as it corresponds to a particular realization of a mapping between a quantum circuit and a Hamiltonian called the space-time circuit-to-Hamiltonian construction. Because of the simple interactions required, and because no higher-order perturbation gadgets are employed, our construction is potentially realizable using superconducting or other solid-state qubits.

  11. Solution-Adaptive Program for Computing 2D/Axi Viscous Flow

    NASA Technical Reports Server (NTRS)

    Wood, William A.

    2003-01-01

    A computer program solves the Navier-Stokes equations governing the flow of a viscous, compressible fluid in an axisymmetric or two-dimensional (2D) setting. To obtain solutions more accurate than those generated by prior such programs that utilize regular and/or fixed computational meshes, this program utilizes unstructured (that is, irregular triangular) computational meshes that are automatically adapted to solutions. The adaptation can refine the mesh in regions of high gradient change or can be driven by a novel residual minimization technique. Starting from an initial mesh and a corresponding data structure, the adaptation of the mesh is controlled by use of a minimization functional. Other improvements over prior such programs include the following: (1) Boundary conditions are imposed weakly; that is, following initial specification of solution values at boundary nodes, these values are relaxed in time by means of the same formulations as those used for interior nodes. (2) Eigenvalues are limited in order to suppress expansion shocks. (3) An upwind fluctuation-splitting distribution scheme applied to inviscid flux requires fewer operations and produces less artificial dissipation than does a finite-volume scheme, leading to greater accuracy of solutions.

  12. Diverse Geological Applications For Basil: A 2d Finite-deformation Computational Algorithm

    NASA Astrophysics Data System (ADS)

    Houseman, Gregory A.; Barr, Terence D.; Evans, Lynn

    Geological processes are often characterised by large finite-deformation continuum strains, on the order of 100% or greater. Microstructural processes cause deformation that may be represented by a viscous constitutive mechanism, with viscosity that may depend on temperature, pressure, or strain-rate. We have developed an effective computational algorithm for the evaluation of 2D deformation fields produced by Newtonian or non-Newtonian viscous flow. With the implementation of this algorithm as a computer program, Basil, we have applied it to a range of diverse applications in Earth Sciences. Viscous flow fields in 2D may be defined for the thin-sheet case or, using a velocity-pressure formulation, for the plane-strain case. Flow fields are represented using 2D triangular elements with quadratic interpolation for velocity components and linear for pressure. The main matrix equation is solved by an efficient and compact conjugate gradient algorithm with iteration for non-Newtonian viscosity. Regular grids may be used, or grids based on a random distribution of points. Definition of the problem requires that velocities, tractions, or some combination of the two, are specified on all external boundary nodes. Compliant boundaries may also be defined, based on the idea that traction is opposed to and proportional to boundary displacement rate. Internal boundary segments, allowing fault-like displacements within a viscous medium have also been developed, and we find that the computed displacement field around the fault tip is accurately represented for Newtonian and non-Newtonian viscosities, in spite of the stress singularity at the fault tip. Basil has been applied by us and colleagues to problems that include: thin sheet calculations of continental collision, Rayleigh-Taylor instability of the continental mantle lithosphere, deformation fields around fault terminations at the outcrop scale, stress and deformation fields in and around porphyroblasts, and

  13. Computational Study and Analysis of Structural Imperfections in 1D and 2D Photonic Crystals

    SciTech Connect

    Maskaly, Karlene Rosera

    2005-06-01

    increasing RMS roughness. Again, the homogenization approximation is able to predict these results. The problem of surface scratches on 1D photonic crystals is also addressed. Although the reflectivity decreases are lower in this study, up to a 15% change in reflectivity is observed in certain scratched photonic crystal structures. However, this reflectivity change can be significantly decreased by adding a low index protective coating to the surface of the photonic crystal. Again, application of homogenization theory to these structures confirms its predictive power for this type of imperfection as well. Additionally, the problem of a circular pores in 2D photonic crystals is investigated, showing that almost a 50% change in reflectivity can occur for some structures. Furthermore, this study reveals trends that are consistent with the 1D simulations: parameter changes that increase the absolute reflectivity of the photonic crystal will also increase its tolerance to structural imperfections. Finally, experimental reflectance spectra from roughened 1D photonic crystals are compared to the results predicted computationally in this thesis. Both the computed and experimental spectra correlate favorably, validating the findings presented herein.

  14. Craniosynostosis: prenatal diagnosis by 2D/3D ultrasound, magnetic resonance imaging and computed tomography.

    PubMed

    Helfer, Talita Micheletti; Peixoto, Alberto Borges; Tonni, Gabriele; Araujo Júnior, Edward

    2016-09-01

    Craniosynostosis is defined as the process of premature fusion of one or more of the cranial sutures. It is a common condition that occurs in about 1 in 2,000 live births. Craniosynostosis may be classified as primary or secondary. It is also classified as nonsyndromic or syndromic. According to suture commitment, craniosynostosis may affect a single suture or multiple sutures. There is a wide range of syndromes involving craniosynostosis and the most common are Apert, Pfeiffer, Crouzon, Saethre-Chotzen and Muenke syndromes. The underlying etiology of nonsyndromic craniosynostosis is unknown. Mutations in the fibroblast growth factor (FGF) signalling pathway play a crucial role in the etiology of craniosynostosis syndromes. Prenatal ultrasound's detection rate of craniosynostosis is low. Nowadays, different methods can be applied for prenatal diagnosis of craniosynostosis, such as two-dimensional (2D) and three-dimensional (3D) ultrasound, magnetic resonance imaging (MRI), computed tomography (CT) scan and, finally, molecular diagnosis. The presence of craniosynostosis may affect the birthing process. Fetuses with craniosynostosis also have higher rates of perinatal complications. In order to avoid the risks of untreated craniosynostosis, children are usually treated surgically soon after postnatal diagnosis. PMID:27622416

  15. An algorithm for computing the 2D structure of fast rotating stars

    NASA Astrophysics Data System (ADS)

    Rieutord, Michel; Espinosa Lara, Francisco; Putigny, Bertrand

    2016-08-01

    Stars may be understood as self-gravitating masses of a compressible fluid whose radiative cooling is compensated by nuclear reactions or gravitational contraction. The understanding of their time evolution requires the use of detailed models that account for a complex microphysics including that of opacities, equation of state and nuclear reactions. The present stellar models are essentially one-dimensional, namely spherically symmetric. However, the interpretation of recent data like the surface abundances of elements or the distribution of internal rotation have reached the limits of validity of one-dimensional models because of their very simplified representation of large-scale fluid flows. In this article, we describe the ESTER code, which is the first code able to compute in a consistent way a two-dimensional model of a fast rotating star including its large-scale flows. Compared to classical 1D stellar evolution codes, many numerical innovations have been introduced to deal with this complex problem. First, the spectral discretization based on spherical harmonics and Chebyshev polynomials is used to represent the 2D axisymmetric fields. A nonlinear mapping maps the spheroidal star and allows a smooth spectral representation of the fields. The properties of Picard and Newton iterations for solving the nonlinear partial differential equations of the problem are discussed. It turns out that the Picard scheme is efficient for the computation of simple polytropic stars, but the Newton algorithm is unsurpassed when stellar models include complex microphysics. Finally, we discuss the numerical efficiency of our solver of Newton iterations. This linear solver combines the iterative Conjugate Gradient Squared algorithm with an LU-factorization serving as a preconditioner of the Jacobian matrix.
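
    The contrast drawn above between Picard and Newton iterations can be seen on even a one-variable problem (a deliberately simplified sketch, unrelated to the ESTER equations): Picard repeatedly applies a fixed-point map and converges linearly, while Newton uses the Jacobian and converges quadratically near the solution.

        # Hedged toy comparison of Picard (fixed-point) and Newton iterations
        # on the scalar equation x = cos(x), i.e. f(x) = x - cos(x) = 0.
        import math

        def picard(x, iters):
            for _ in range(iters):
                x = math.cos(x)                   # fixed-point map g(x) = cos(x)
            return x

        def newton(x, iters):
            for _ in range(iters):
                f, df = x - math.cos(x), 1.0 + math.sin(x)
                x = x - f / df                    # Newton step with the exact Jacobian
            return x

        root = 0.7390851332151607                 # reference solution of x = cos(x)
        for n in (2, 4, 8):
            print(n, "iterations:",
                  "Picard error", abs(picard(1.0, n) - root),
                  "| Newton error", abs(newton(1.0, n) - root))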

  16. Icarus: A 2-D Direct Simulation Monte Carlo (DSMC) Code for Multi-Processor Computers

    SciTech Connect

    BARTEL, TIMOTHY J.; PLIMPTON, STEVEN J.; GALLIS, MICHAIL A.

    2001-10-01

    Icarus is a 2D Direct Simulation Monte Carlo (DSMC) code which has been optimized for the parallel computing environment. The code is based on the DSMC method of Bird[11.1] and models from free-molecular to continuum flowfields in either cartesian (x, y) or axisymmetric (z, r) coordinates. Computational particles, representing a given number of molecules or atoms, are tracked as they have collisions with other particles or surfaces. Multiple species, internal energy modes (rotation and vibration), chemistry, and ion transport are modeled. A new trace species methodology for collisions and chemistry is used to obtain statistics for small species concentrations. Gas phase chemistry is modeled using steric factors derived from Arrhenius reaction rates or in a manner similar to continuum modeling. Surface chemistry is modeled with surface reaction probabilities; an optional site density, energy dependent, coverage model is included. Electrons are modeled by either a local charge neutrality assumption or as discrete simulational particles. Ion chemistry is modeled with electron impact chemistry rates and charge exchange reactions. Coulomb collision cross-sections are used instead of Variable Hard Sphere values for ion-ion interactions. The electro-static fields can either be: externally input, a Langmuir-Tonks model or from a Green's Function (Boundary Element) based Poisson Solver. Icarus has been used for subsonic to hypersonic, chemically reacting, and plasma flows. The Icarus software package includes the grid generation, parallel processor decomposition, post-processing, and restart software. The commercial graphics package, Tecplot, is used for graphics display. All of the software packages are written in standard Fortran.

  17. Computation of neutron fluxes in clusters of fuel pins arranged in hexagonal assemblies (2D and 3D)

    SciTech Connect

    Prabha, H.; Marleau, G.

    2012-07-01

    For computations of fluxes, we have used Carvik's method of collision probabilities. This method requires tracking algorithms. An algorithm to compute tracks (in 2D and 3D) has been developed for seven hexagonal geometries with cluster of fuel pins. This has been implemented in the NXT module of the code DRAGON. The flux distribution in cluster of pins has been computed by using this code. For testing the results, they are compared when possible with the EXCELT module of the code DRAGON. Tracks are plotted in the NXT module by using MATLAB, these plots are also presented here. Results are presented with increasing number of lines to show the convergence of these results. We have numerically computed volumes, surface areas and the percentage errors in these computations. These results show that 2D results converge faster than 3D results. The accuracy on the computation of fluxes up to second decimal is achieved with fewer lines. (authors)

  18. Atom pair 2D-fingerprints perceive 3D-molecular shape and pharmacophores for very fast virtual screening of ZINC and GDB-17.

    PubMed

    Awale, Mahendra; Reymond, Jean-Louis

    2014-07-28

    Three-dimensional (3D) molecular shape and pharmacophores are important determinants of the biological activity of organic molecules; however, a precise computation of 3D-shape is generally too slow for virtual screening of very large databases. A reinvestigation of the concept of atom pairs initially reported by Carhart et al. and extended by Schneider et al. showed that a simple atom pair fingerprint (APfp) counting atom pairs at increasing topological distances in 2D-structures without atom property assignment correlates with various representations of molecular shape extracted from the 3D-structures. A related 55-dimensional atom pair fingerprint extended with atom properties (Xfp) provided an efficient pharmacophore fingerprint with good performance for ligand-based virtual screening such as the recovery of active compounds from decoys in DUD, and overlap with the ROCS 3D-pharmacophore scoring function. The APfp and Xfp data were organized for web-based extremely fast nearest-neighbor searching in ZINC (13.5 M compounds) and GDB-17 (50 M random subset) freely accessible at www.gdb.unibe.ch .
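
    A hedged sketch of the atom-pair counting idea behind an APfp-style fingerprint (illustrative only; the published fingerprint's exact dimensionality and normalization may differ): count heavy-atom pairs at each topological (bond-path) distance and compare the resulting count vectors with a city-block distance.

        # Hedged sketch of a shape-like atom-pair count fingerprint using RDKit.
        import numpy as np
        from rdkit import Chem

        def atom_pair_counts(smiles, max_dist=32):
            mol = Chem.MolFromSmiles(smiles)        # heavy atoms only by default
            dmat = Chem.GetDistanceMatrix(mol)      # topological (bond-count) distances
            fp = np.zeros(max_dist, dtype=int)
            n = mol.GetNumAtoms()
            for i in range(n):
                for j in range(i + 1, n):
                    d = int(dmat[i, j])
                    if d < max_dist:
                        fp[d] += 1
            return fp

        fp_a = atom_pair_counts("CC(=O)Oc1ccccc1C(=O)O")    # aspirin
        fp_b = atom_pair_counts("OC(=O)c1ccccc1O")           # salicylic acid
        print("city-block distance:", int(np.abs(fp_a - fp_b).sum()))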

  19. MULTI2D - a computer code for two-dimensional radiation hydrodynamics

    NASA Astrophysics Data System (ADS)

    Ramis, R.; Meyer-ter-Vehn, J.; Ramírez, J.

    2009-06-01

    Simulation of radiation hydrodynamics in two spatial dimensions is developed, having in mind, in particular, target design for indirectly driven inertial fusion energy (IFE) and the interpretation of related experiments. Intense radiation pulses by laser or particle beams heat high-Z target configurations of different geometries and lead to a regime which is optically thick in some regions and optically thin in others. A diffusion description is inadequate in this situation. A new numerical code has been developed which describes hydrodynamics in two spatial dimensions (cylindrical R-Z geometry) and radiation transport along rays in three dimensions with the 4π solid angle discretized in direction. Matter moves on a non-structured mesh composed of trilateral and quadrilateral elements. Radiation flux of a given direction enters on two (one) sides of a triangle and leaves on the opposite side(s) in proportion to the viewing angles depending on the geometry. This scheme allows propagation of sharply edged beams without ray tracing, though at the price of some lateral diffusion. The algorithm treats correctly both the optically thin and optically thick regimes. A symmetric semi-implicit (SSI) method is used to guarantee numerical stability.
    Program summary - Program title: MULTI2D. Catalogue identifier: AECV_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AECV_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 151 098. No. of bytes in distributed program, including test data, etc.: 889 622. Distribution format: tar.gz. Programming language: C. Computer: PC (32 bits architecture). Operating system: Linux/Unix. RAM: 2 Mbytes. Word size: 32 bits. Classification: 19.7. External routines: X-window standard library (libX11.so) and corresponding header files (X11/*.h) are required.

  20. PC2D simulation and optimization of the selective emitter solar cells fabricated by screen printing phosphoric paste method

    NASA Astrophysics Data System (ADS)

    Jia, Xiaojie; Ai, Bin; Deng, Youjun; Xu, Xinxiang; Peng, Hua; Shen, Hui

    2015-08-01

    On the basis of a PC2D simulation closely fitted to the measured current density vs. voltage (J-V) curve of the best selective emitter (SE) solar cell fabricated by the CSG Company using the screen printing phosphoric paste method, we systematically investigated the effect of the parameters of the gridline, base, selective emitter, back surface field (BSF) layer and surface recombination rate on the performance of the SE solar cell. Among these parameters, we identified the base minority carrier lifetime, the front and back surface recombination rates and the ratio of the sheet resistances of the heavily and lightly doped regions as the four factors with the largest effect on efficiency. If all the parameters have ideal values, the SE solar cell fabricated on a p-type monocrystalline silicon wafer can even reach an efficiency of 20.45%. In addition, the simulation also shows that combining fine, dense gridlines with an increased bus bar number, while keeping the metallized area ratio low, offers further ways to improve the efficiency.

  1. Technical Note: Guidelines for the digital computation of 2D and 3D enamel thickness in hominoid teeth.

    PubMed

    Benazzi, Stefano; Panetta, Daniele; Fornai, Cinzia; Toussaint, Michel; Gruppioni, Giorgio; Hublin, Jean-Jacques

    2014-02-01

    The study of enamel thickness has received considerable attention in regard to the taxonomic, phylogenetic and dietary assessment of human and non-human primates. Recent developments based on two-dimensional (2D) and three-dimensional (3D) digital techniques have facilitated accurate analyses while preserving the original object from invasive procedures. Various digital protocols have been proposed, but several rely on manual handling of the virtual models and have technical shortcomings that prevent other scholars from confidently reproducing the entire digital protocol. There is a compelling need for standard, reproducible, and well-tailored protocols for the digital analysis of 2D and 3D dental enamel thickness. In this contribution we provide essential guidelines for the digital computation of 2D and 3D enamel thickness in hominoid molars, premolars, canines and incisors. We modify previous techniques suggested for 2D analysis and we develop a new approach for 3D analysis that can also be applied to premolars and anterior teeth. For each tooth class, the cervical line should be considered the fundamental morphological feature both to isolate the crown from the root (for 3D analysis) and to define the direction of the cross-sections (for 2D analysis).
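
    A commonly used 2D index in this literature is average enamel thickness (enamel cap cross-sectional area divided by the length of the enamel-dentine junction); the sketch below computes it from digitized outlines under that assumption and is not the authors' protocol. Coordinates in the toy example are invented.

```python
import numpy as np

def polygon_area(xy):
    """Shoelace formula for the area of a closed 2D polygon (n x 2 array)."""
    x, y = xy[:, 0], xy[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

def polyline_length(xy):
    """Total length of a digitized curve such as the enamel-dentine junction."""
    return np.sum(np.linalg.norm(np.diff(xy, axis=0), axis=1))

def average_enamel_thickness(enamel_outline, edj_curve):
    """2D AET: enamel cap cross-sectional area divided by EDJ length."""
    return polygon_area(enamel_outline) / polyline_length(edj_curve)

# toy example: a 1 mm-thick enamel band over a 10 mm EDJ segment
enamel = np.array([[0, 0], [10, 0], [10, 1], [0, 1]], float)
edj = np.array([[0, 0], [10, 0]], float)
print(average_enamel_thickness(enamel, edj))  # 1.0
```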

  2. Efficient Computational Screening of Organic Polymer Photovoltaics.

    PubMed

    Kanal, Ilana Y; Owens, Steven G; Bechtel, Jonathon S; Hutchison, Geoffrey R

    2013-05-16

    There has been increasing interest in rational, computationally driven design methods for materials, including organic photovoltaics (OPVs). Our approach focuses on a screening "pipeline", using a genetic algorithm for first stage screening and multiple filtering stages for further refinement. An important step forward is to expand our diversity of candidate compounds, including both synthetic and property-based measures of diversity. For example, top monomer pairs from our screening are all donor-donor (D-D) combinations, in contrast with the typical donor-acceptor (D-A) motif used in organic photovoltaics. We also find a strong "sequence effect", in which the average HOMO-LUMO gap of tetramers changes by ∼0.2 eV as a function of monomer sequence (e.g., ABBA versus BAAB); this has rarely been explored in conjugated polymers. Beyond such optoelectronic optimization, we discuss other properties needed for high-efficiency organic solar cells, and applications of screening methods to other areas, including non-fullerene n-type materials, tandem cells, and improving charge and exciton transport. PMID:26282968
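
    A minimal sketch of the first-stage genetic-algorithm screening over tetramer monomer sequences follows; the fitness function is a stand-in for the quantum-chemical HOMO-LUMO gap evaluation used in the actual pipeline, and all names and parameters are illustrative.

```python
import random

MONOMERS = "ABCD"                      # toy monomer alphabet
TARGET_GAP = 1.4                       # eV, illustrative target

def fitness(tetramer):
    """Stand-in score: the real pipeline would call a quantum-chemistry code
    to get the tetramer's HOMO-LUMO gap; here we fake a sequence-dependent gap."""
    gap = 1.0 + 0.2 * tetramer.count("A") + 0.1 * (tetramer[0] == tetramer[-1])
    return -abs(gap - TARGET_GAP)      # closer to the target gap = fitter

def evolve(pop_size=30, generations=40, mutation_rate=0.1):
    pop = ["".join(random.choice(MONOMERS) for _ in range(4)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randint(1, 3)
            child = list(a[:cut] + b[cut:])          # one-point crossover
            for i in range(4):
                if random.random() < mutation_rate:  # point mutation
                    child[i] = random.choice(MONOMERS)
            children.append("".join(child))
        pop = survivors + children
    return max(pop, key=fitness)

print(evolve())  # best tetramer sequence found
```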

  3. A BENCHMARKING ANALYSIS FOR FIVE RADIONUCLIDE VADOSE ZONE MODELS (CHAIN, MULTIMED_DP, FECTUZ, HYDRUS, AND CHAIN 2D) IN SOIL SCREENING LEVEL CALCULATIONS

    EPA Science Inventory

    Five radionuclide vadose zone models with different degrees of complexity (CHAIN, MULTIMED_DP, FECTUZ, HYDRUS, and CHAIN 2D) were selected for use in soil screening level (SSL) calculations. A benchmarking analysis between the models was conducted for a radionuclide (99Tc) rele...

  4. Results of two-dimensional time-evolved phase screen computer simulations

    NASA Astrophysics Data System (ADS)

    Gamble, Kevin J.; Weeks, Arthur R.; Myler, Harley R.; Rabadi, Wissam A.

    1995-06-01

    This paper presents a 2D computer simulation of the observed intensity and phase behind a time-evolved phase screen. Both spatial and temporal statistics of the observed intensity are compared to theoretical predictions. In particular, the intensity statistics as a function of detector position within the propagated laser beam are investigated. The computer simulation program was written in the C programming language running on a SUN SPARC-5 workstation.
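
    A common way to build such a simulation (assumed here, not taken from the paper's C code) is to synthesize the phase screen by filtering complex white noise with a Kolmogorov-like power spectrum and then propagate a plane wave through it with an FFT; the sketch below uses illustrative parameters in pixel units and omits the exact spectral constants.

```python
import numpy as np

def kolmogorov_phase_screen(n=256, r0_pixels=20.0, seed=0):
    """Random phase screen: filter complex white noise with a Kolmogorov-like
    -11/3 power-law spectrum and inverse-FFT back to the spatial domain."""
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n)
    kx, ky = np.meshgrid(fx, fx)
    k = np.hypot(kx, ky)
    k[0, 0] = np.inf                      # suppress the undefined piston term
    amplitude = (k ** (-11.0 / 6.0)) / r0_pixels ** (5.0 / 6.0)
    noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return np.fft.ifft2(noise * amplitude).real

def intensity_after_screen(screen, wavelength_px=8.0, distance_px=2000.0):
    """Fresnel (angular-spectrum) propagation of a unit plane wave through the screen."""
    n = screen.shape[0]
    field = np.exp(1j * screen)
    fx = np.fft.fftfreq(n)
    kx, ky = np.meshgrid(fx, fx)
    transfer = np.exp(-1j * np.pi * wavelength_px * distance_px * (kx**2 + ky**2))
    out = np.fft.ifft2(np.fft.fft2(field) * transfer)
    return np.abs(out) ** 2

screen = kolmogorov_phase_screen()
intensity = intensity_after_screen(screen)
print(intensity.mean(), intensity.var())  # scintillation grows with distance
```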

  5. Orbit computation of the TELECOM-2D satellite with a Genetic Algorithm

    NASA Astrophysics Data System (ADS)

    Deleflie, Florent; Coulot, David; Vienne, Alain; Decosta, Romain; Richard, Pascal; Lasri, Mohammed Amjad

    2014-07-01

    In order to test a preliminary orbit determination method, we fit an orbit of the geostationary satellite TELECOM-2D, as if no a priori information on its trajectory were available. The method couples a genetic algorithm to an analytical propagator of the trajectory, applied over a couple of days to a set of altazimuthal data acquired by the tracking network made up of the two TAROT telescopes. The adjusted orbit is then compared to a numerical reference. The method is described, and the results are analyzed, as a step towards an operational method of preliminary orbit determination for uncatalogued objects.

  6. Auto-masked 2D/3D image registration and its validation with clinical cone-beam computed tomography

    NASA Astrophysics Data System (ADS)

    Steininger, P.; Neuner, M.; Weichenberger, H.; Sharp, G. C.; Winey, B.; Kametriser, G.; Sedlmayer, F.; Deutschmann, H.

    2012-07-01

    Image-guided alignment procedures in radiotherapy aim at minimizing discrepancies between the planned and the real patient setup. For that purpose, we developed a 2D/3D approach which rigidly registers a computed tomography (CT) with two x-rays by maximizing the agreement in pixel intensity between the x-rays and the corresponding reconstructed radiographs from the CT. Moreover, the algorithm selects regions of interest (masks) in the x-rays based on 3D segmentations from the pre-planning stage. For validation, orthogonal x-ray pairs from different viewing directions of 80 pelvic cone-beam CT (CBCT) raw data sets were used. The 2D/3D results were compared to corresponding standard 3D/3D CBCT-to-CT alignments. Outcome over 8400 2D/3D experiments showed that parametric errors in root mean square were <0.18° (rotations) and <0.73 mm (translations), respectively, using rank correlation as intensity metric. This corresponds to a mean target registration error, related to the voxels of the lesser pelvis, of <2 mm in 94.1% of the cases. From the results we conclude that 2D/3D registration based on sequentially acquired orthogonal x-rays of the pelvis is a viable alternative to CBCT-based approaches if rigid alignment on bony anatomy is sufficient, no volumetric intra-interventional data set is required and the expected error range fits the individual treatment prescription.
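
    The intensity metric named above, rank correlation, can be sketched as a Spearman-type correlation restricted to the auto-generated mask; the toy implementation below (ties ignored) is an assumption-laden stand-in, not the authors' code.

```python
import numpy as np

def rank_correlation(xray, drr, mask):
    """Spearman-type rank correlation between an x-ray and a reconstructed
    radiograph (DRR), evaluated only inside the auto-generated mask."""
    a = xray[mask].ravel()
    b = drr[mask].ravel()
    ra = np.argsort(np.argsort(a)).astype(float)   # ranks (ties ignored for brevity)
    rb = np.argsort(np.argsort(b)).astype(float)
    ra -= ra.mean()
    rb -= rb.mean()
    return float(np.sum(ra * rb) / np.sqrt(np.sum(ra**2) * np.sum(rb**2)))

# toy 2D images: the DRR is a monotone remapping of the x-ray, so rank correlation ≈ 1
rng = np.random.default_rng(1)
xray = rng.random((64, 64))
drr = xray ** 2.5
mask = np.ones_like(xray, dtype=bool)
print(rank_correlation(xray, drr, mask))
```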

  7. Computers and 2D Geometric Learning of Turkish Fourth and Fifth Graders

    ERIC Educational Resources Information Center

    Olkun, Sinan; Altun, Arif; Smith, Glenn

    2005-01-01

    This research investigated the possible impacts of computers on Turkish fourth-grade students' geometry scores and further geometric learning. The study used a pretest-intervention-posttest experimental design. Results showed that students who did not have computers at home initially had lower geometry scores. However, these differences were…

  8. Coupling 2-D cylindrical and 3-D x-y-z transport computations

    SciTech Connect

    Abu-Shumays, I.K.; Yehnert, C.E.; Pitcairn, T.N.

    1998-06-30

    This paper describes a new two-dimensional (2-D) cylindrical geometry to three-dimensional (3-D) rectangular x-y-z splice option for multi-dimensional discrete ordinates solutions to the neutron (photon) transport equation. Of particular interest are the simple transformations developed and applied in order to carry out the required spatial and angular interpolations. The spatial interpolations are linear and equivalent to those applied elsewhere. The angular interpolations are based on a high order spherical harmonics representation of the angular flux. Advantages of the current angular interpolations over previous work are discussed. An application to an intricate streaming problem is provided to demonstrate the advantages of the new method for efficient and accurate prediction of particle behavior in complex geometries.

  9. Distributed Computing Architecture for Image-Based Wavefront Sensing and 2-D FFTs

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey S.; Dean, Bruce H.; Haghani, Shadan

    2006-01-01

    Image-based wavefront sensing (WFS) provides significant advantages over interferometric wavefront sensors, such as optical design simplicity and stability. However, the image-based approach is computationally intensive, and therefore specialized high-performance computing architectures are required in applications utilizing the image-based approach. The development and testing of these high-performance computing architectures are essential to such missions as the James Webb Space Telescope (JWST), Terrestrial Planet Finder-Coronagraph (TPF-C and CorSpec), and Spherical Primary Optical Telescope (SPOT). The development of these specialized computing architectures requires numerous two-dimensional Fourier transforms, which necessitate an all-to-all communication when applied on a distributed computational architecture. Several solutions for distributed computing are presented, with an emphasis on a 64-node cluster of DSPs, multiple DSP FPGAs, and an application of low-diameter graph theory. Timing results and performance analysis will be presented. The solutions offered could be applied to other all-to-all communication and scientifically computationally complex problems.
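
    The all-to-all communication arises because a 2D FFT decomposes into 1D FFTs along rows, a global transpose, and 1D FFTs along the other axis; the single-process numpy sketch below shows the decomposition that a cluster would distribute, with the transpose standing in for the exchange.

```python
import numpy as np

def fft2_by_rows_then_columns(image):
    """2D FFT as 1D FFTs over rows, a transpose (the all-to-all exchange on a
    distributed machine), then 1D FFTs over the other axis."""
    stage1 = np.fft.fft(image, axis=1)      # each node transforms its rows
    exchanged = stage1.T.copy()             # transpose = all-to-all communication
    stage2 = np.fft.fft(exchanged, axis=1)  # each node transforms its new rows
    return stage2.T                         # back to the original orientation

rng = np.random.default_rng(0)
img = rng.random((128, 128))
assert np.allclose(fft2_by_rows_then_columns(img), np.fft.fft2(img))
```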

  10. Novel low-cost 2D/3D switchable autostereoscopic system for notebook computers and other portable devices

    NASA Astrophysics Data System (ADS)

    Eichenlaub, Jesse B.

    1995-03-01

    Mounting a lenticular lens in front of a flat panel display is a well known, inexpensive, and easy way to create an autostereoscopic system. Such a lens produces half resolution 3D images because half the pixels on the LCD are seen by the left eye and half by the right eye. This may be acceptable for graphics, but it makes full resolution text, as displayed by common software, nearly unreadable. Very fine alignment tolerances normally preclude the possibility of removing and replacing the lens in order to switch between 2D and 3D applications. Lenticular lens based displays are therefore limited to use as dedicated 3D devices. DTI has devised a technique which removes this limitation, allowing switching between full resolution 2D and half resolution 3D imaging modes. A second element, in the form of a concave lenticular lens array whose shape is exactly the negative of the first lens, is mounted on a hinge so that it can be swung down over the first lens array. When so positioned the two lenses cancel optically, allowing the user to see full resolution 2D for text or numerical applications. The two lenses, having complementary shapes, naturally tend to nestle together and snap into perfect alignment when pressed together--thus obviating any need for user operated alignment mechanisms. This system represents an ideal solution for laptop and notebook computer applications. It was devised to meet the stringent requirements of a laptop computer manufacturer including very compact size, very low cost, little impact on existing manufacturing or assembly procedures, and compatibility with existing full resolution 2D text-oriented software as well as 3D graphics. Similar requirements apply to high-end electronic calculators, several models of which now use LCDs for the display of graphics.

  11. Geometric Neural Computing for 2D Contour and 3D Surface Reconstruction

    NASA Astrophysics Data System (ADS)

    Rivera-Rovelo, Jorge; Bayro-Corrochano, Eduardo; Dillmann, Ruediger

    In this work we present an algorithm to approximate the surface of 2D or 3D objects combining concepts from geometric algebra and artificial neural networks. Our approach is based on the self-organized neural network called Growing Neural Gas (GNG), incorporating versors of the geometric algebra in its neural units; such versors are the transformations that will be determined during the training stage and then applied to a point to approximate the surface of the object. We also incorporate the information given by the generalized gradient vector flow to select automatically the input patterns, and also in the learning stage in order to improve the performance of the net. Several examples using medical images are presented, as well as images of automatic visual inspection. We compared the results obtained using snakes against the GSOM incorporating the gradient information and using versors. Such results confirm that our approach is very promising. As a second application, a kind of morphing or registration procedure is shown; namely the algorithm can be used when transforming one model at time t1 into another at time t2. We include also examples applying the same procedure, now extended to models based on spheres.

  12. Lattice Boltzmann methods for some 2-D nonlinear diffusion equations:Computational results

    SciTech Connect

    Elton, B.H.; Rodrigue, G.H. . Dept. of Applied Science Lawrence Livermore National Lab., CA ); Levermore, C.D. . Dept. of Mathematics)

    1990-01-01

    In this paper we examine two lattice Boltzmann methods (that are a derivative of lattice gas methods) for computing solutions to two two-dimensional nonlinear diffusion equations of the form ∂u/∂t = ν [∂/∂x (D(u) ∂u/∂x) + ∂/∂y (D(u) ∂u/∂y)], where u = u(x, t), x ∈ R², ν is a constant, and D(u) is a nonlinear term that arises from a Chapman-Enskog asymptotic expansion. In particular, we provide computational evidence supporting recent results showing that the methods are second order convergent (in the L1-norm), conservative, conditionally monotone finite difference methods. Solutions computed via the lattice Boltzmann methods are compared with those computed by other explicit, second order, conservative, monotone finite difference methods. Results are reported for both the L1- and L∞-norms.
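
    As a loose illustration (not the schemes analyzed in the paper), a modern BGK-type D2Q5 lattice Boltzmann solver for the same class of nonlinear diffusion equations can be written in a few lines, with the local relaxation time tied to D(u); the lattice constants are the standard D2Q5 choices and all parameters below are illustrative.

```python
import numpy as np

# D2Q5 lattice: rest particle plus the four axis directions
VEL = np.array([[0, 0], [1, 0], [-1, 0], [0, 1], [0, -1]])
W = np.array([1/3, 1/6, 1/6, 1/6, 1/6])
CS2 = 1.0 / 3.0                       # lattice speed of sound squared

def lbm_nonlinear_diffusion(u0, diffusivity, steps):
    """Explicit D2Q5 lattice Boltzmann solver for du/dt = div(D(u) grad u),
    with the relaxation time set locally from D(u) (a sketch, lattice units)."""
    f = W[:, None, None] * u0[None, :, :]          # initialize at equilibrium
    for _ in range(steps):
        u = f.sum(axis=0)
        tau = diffusivity(u) / CS2 + 0.5           # local relaxation time
        feq = W[:, None, None] * u[None, :, :]
        f += (feq - f) / tau[None, :, :]           # BGK collision
        for i, (cx, cy) in enumerate(VEL):         # streaming, periodic walls
            f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
    return f.sum(axis=0)

# mildly nonlinear diffusivity D(u) = 0.05 u + 0.02 on a small periodic grid
u0 = np.zeros((64, 64))
u0[28:36, 28:36] = 1.0
u_final = lbm_nonlinear_diffusion(u0, lambda u: 0.05 * u + 0.02, steps=200)
print(u0.sum(), u_final.sum())        # total mass is conserved
```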

  13. The PR2D (Place, Route in 2-Dimensions) automatic layout computer program handbook

    NASA Technical Reports Server (NTRS)

    Edge, T. M.

    1978-01-01

    Place, Route in 2-Dimensions is a standard cell automatic layout computer program for generating large scale integrated/metal oxide semiconductor arrays. The program was utilized successfully for a number of years in both government and private sectors but until now was undocumented. The compilation, loading, and execution of the program on a Sigma V CP-V operating system is described.

  14. Identification of the wave speed and the second viscosity of cavitation flows with 2D RANS computations - Part I

    NASA Astrophysics Data System (ADS)

    Decaix, J.; Alligné, S.; Nicolet, C.; Avellan, F.; Münch, C.

    2015-12-01

    1D hydro-electric models are useful to predict dynamic behaviour of hydro-power plants. Regarding vortex rope and cavitation surge in Francis turbines, the 1D models require some inputs that can be provided by numerical simulations. In this paper, a 2D cavitating Venturi is considered. URANS computations are performed to investigate the dynamic behaviour of the cavitation sheet depending on the frequency variation of the outlet pressure. The results are used to calibrate and to assess the reliability of the 1D models.

  15. Emergent Power-Law Phase in the 2D Heisenberg Windmill Antiferromagnet: A Computational Experiment

    NASA Astrophysics Data System (ADS)

    Jeevanesan, Bhilahari; Chandra, Premala; Coleman, Piers; Orth, Peter P.

    2015-10-01

    In an extensive computational experiment, we test Polyakov's conjecture that under certain circumstances an isotropic Heisenberg model can develop algebraic spin correlations. We demonstrate the emergence of a multispin U(1) order parameter in a Heisenberg antiferromagnet on interpenetrating honeycomb and triangular lattices. The correlations of this relative phase angle are observed to decay algebraically at intermediate temperatures in an extended critical phase. Using finite-size scaling we show that both phase transitions are of the Berezinskii-Kosterlitz-Thouless type, and at lower temperatures we find long-range Z6 order.

  17. Manifest: A computer program for 2-D flow modeling in Stirling machines

    NASA Technical Reports Server (NTRS)

    Gedeon, David

    1989-01-01

    A computer program named Manifest is discussed. Manifest is a program one might want to use to model the fluid dynamics in the manifolds commonly found between the heat exchangers and regenerators of Stirling machines; but not just in the manifolds - in the regenerators as well. And in all sorts of other places too, such as: in heaters or coolers, or perhaps even in cylinder spaces. There are probably non-Stirling uses for Manifest also. In broad strokes, Manifest will: (1) model oscillating internal compressible laminar fluid flow in a wide range of two-dimensional regions, either filled with porous materials or empty; (2) present a graphics-based user-friendly interface, allowing easy selection and modification of region shape and boundary condition specification; (3) run on a personal computer, or optionally (in the case of its number-crunching module) on a supercomputer; and (4) allow interactive examination of the solution output so the user can view vector plots of flow velocity, contour plots of pressure and temperature at various locations and tabulate energy-related integrals of interest.

  18. Computing Aerodynamic Performance of a 2D Iced Airfoil: Blocking Topology and Grid Generation

    NASA Technical Reports Server (NTRS)

    Chi, X.; Zhu, B.; Shih, T. I.-P.; Slater, J. W.; Addy, H. E.; Choo, Yung K.; Lee, Chi-Ming (Technical Monitor)

    2002-01-01

    The ice accreted on airfoils can have enormously complicated shapes with multiple protruded horns and feathers. In this paper, several blocking topologies are proposed and evaluated on their ability to produce high-quality structured multi-block grid systems. A transition layer grid is introduced to ensure that jaggedness on the ice-surface geometry does not propagate into the domain. This is important for grid-generation methods based on hyperbolic PDEs (Partial Differential Equations) and algebraic transfinite interpolation. A 'thick' wrap-around grid is introduced to ensure that grid lines clustered next to solid walls do not propagate as streaks of tightly packed grid lines into the interior of the domain along block boundaries. For ice shapes that are not too complicated, a method is presented for generating high-quality single-block grids. To demonstrate the usefulness of the methods developed, grids and CFD solutions were generated for two iced airfoils: the NLF0414 airfoil with and without the 623-ice shape and the B575/767 airfoil with and without the 145m-ice shape. To validate the computations, the computed lift coefficients as a function of angle of attack were compared with available experimental data. The ice shapes and the blocking topologies were prepared by NASA Glenn's SmaggIce software. The grid systems were generated by using a four-boundary method based on Hermite interpolation with controls on clustering, orthogonality next to walls, and C continuity across block boundaries. The flow was modeled by the ensemble-averaged compressible Navier-Stokes equations, closed by the shear-stress transport turbulence model in which the integration is to the wall. All solutions were generated by using the NPARC WIND code.

  19. Presentation of the MERC work-flow for the computation of a 2D radial reflector in a PWR

    SciTech Connect

    Clerc, T.; Hebert, A.; Leroyer, H.; Argaud, J. P.; Poncot, A.; Bouriquet, B.

    2013-07-01

    This paper presents a work-flow for computing an equivalent 2D radial reflector in a pressurized water reactor (PWR) core, in agreement with a reference power distribution computed with the method of characteristics (MOC) of the lattice code APOLLO2. The Multi-modelling Equivalent Reflector Computation (MERC) work-flow is a coherent association of the lattice code APOLLO2 and the core code COCAGNE, structured around the ADAO (Assimilation de Donnees et Aide a l'Optimisation) module of the SALOME platform and based on data assimilation theory. This study leads to the computation of equivalent few-group reflectors, which can be spatially heterogeneous and which, as a first validation step, have been compared to those obtained with the similar OPTEX methodology developed with the core code DONJON. Subsequently, the MERC work-flow is used to compute the most accurate reflector consistent with all the R and D choices made at Electricite de France (EDF) for the core modelling, in terms of number of energy groups and simplified transport solvers. We observe significant reductions in the power discrepancy distribution over the core when using equivalent reflectors obtained with the MERC work-flow. (authors)

  20. A computationally efficient hybrid 2D/3D thin film dislocation model

    NASA Astrophysics Data System (ADS)

    Sarrafan, Siavash

    Substantial research has been devoted to attempting to understand how dislocation structures evolve and how they affect device properties. However, current dislocation simulation methods are only able to model highly idealized systems accurately. The three-dimensional discrete dislocation dynamics models, in particular, are too computationally intensive for modelling high dislocation densities and their resultant deformations that are observed in some real applications. In this thesis, we propose a novel method to exploit the quasi-two-dimensional nature of three-dimensional dislocation loops in a thin film to model their behaviors. For most film configurations, simulation performance can be greatly enhanced by implementing a hybrid two-dimensional/three-dimensional model without losing significant fidelity. In this technique, misfit stress fields are modeled by superposing multiple two-dimensional models. Threads are modeled with a more traditional three-dimensional implementation as they move through the misfit stress field. Using this innovative technique, much higher strains and/or dislocation densities could be studied.

  1. Computational Amide I 2D IR Spectroscopy as a Probe of Protein Structure and Dynamics

    NASA Astrophysics Data System (ADS)

    Reppert, Mike; Tokmakoff, Andrei

    2016-05-01

    Two-dimensional infrared spectroscopy of amide I vibrations is increasingly being used to study the structure and dynamics of proteins and peptides. Amide I, a primarily carbonyl stretching vibration of the protein backbone, provides information on secondary structures as a result of vibrational couplings and on hydrogen-bonding contacts when isotope labeling is used to isolate specific sites. In parallel with experiments, computational models of amide I spectra that use atomistic structures from molecular dynamics simulations have evolved to calculate experimental spectra. Mixed quantum-classical models use spectroscopic maps to translate the structural information into a quantum-mechanical Hamiltonian for the spectroscopically observed vibrations. This allows one to model the spectroscopy of large proteins, disordered states, and protein conformational dynamics. With improvements in amide I models, quantitative modeling of time-dependent structural ensembles and of direct feedback between experiments and simulations is possible. We review the advances in developing these models, their theoretical basis, and current and future applications.
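
    At the core of these mixed quantum-classical models is a one-exciton Hamiltonian assembled from mapped site frequencies and inter-site couplings; the sketch below diagonalizes such a Hamiltonian for a toy fragment, assuming (for brevity only) parallel unit transition dipoles on every site. All numbers are illustrative, not taken from any published map.

```python
import numpy as np

def amide_i_stick_spectrum(site_freqs, couplings):
    """Diagonalize a one-exciton amide I Hamiltonian (cm^-1).
    site_freqs: per-residue mapped frequencies; couplings: symmetric matrix of
    inter-site couplings with zero diagonal. Returns (frequencies, intensities)
    under the parallel unit-dipole assumption stated above."""
    h = np.diag(site_freqs) + couplings
    evals, evecs = np.linalg.eigh(h)
    intensities = np.abs(evecs.sum(axis=0)) ** 2   # |sum_i c_i|^2 per mode
    return evals, intensities

# illustrative 4-residue fragment: ~1645 cm^-1 sites, nearest-neighbour coupling
freqs = np.array([1642.0, 1648.0, 1645.0, 1650.0])
J = np.zeros((4, 4))
for i in range(3):
    J[i, i + 1] = J[i + 1, i] = -6.0
print(amide_i_stick_spectrum(freqs, J))
```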

  2. Computation of self-field hysteresis losses in conductors with helicoidal structure using a 2D finite element method

    NASA Astrophysics Data System (ADS)

    Stenvall, A.; Siahrang, M.; Grilli, F.; Sirois, F.

    2013-04-01

    It is well known that twisting current-carrying conductors helps to reduce their coupling losses. However, the impact of twisting on self-field hysteresis losses has not been as extensively investigated as that on the reduction of coupling losses. This is mostly because the reduction of coupling losses has been an important issue to tackle in the past, and it is not possible to consider twisting within the classical two-dimensional (2D) approaches for the computation of self-field hysteresis losses. Recently, numerical codes considering the effect of twisting in continuous symmetries have appeared. For general three-dimensional (3D) simulations, one issue is that no robust, widely accepted and easy to obtain model for expressing the relationship between the current density and the electric field is available. On the other hand, we can consider that in these helicoidal structures currents flow only along the helicoidal trajectories. This approach allows one to use the scalar power-law for superconductor resistivity and makes the eddy current approach to a solution of a hysteresis loss problem feasible. In this paper we use the finite element method to solve the eddy current model in helicoidal structures in 2D domains utilizing the helicoidal symmetry. The developed tool uses the full 3D geometry but allows discretization which takes advantage of the helicoidal symmetry to reduce the computational domain to a 2D one. We utilize in this tool the non-linear power law for modelling the resistivity in the superconducting regions and study how the self-field losses are influenced by the twisting of a 10-filament wire. Additionally, in the case of high aspect ratio tapes, we compare the results computed with the new tool and a one-dimensional program based on the integral equation method and developed for simulating single layer power cables made of ReBCO coated conductors. Finally, we discuss modelling issues and present open questions related to helicoidal structures

  3. Eye-screen distance monitoring for computer use.

    PubMed

    Eastwood-Sutherland, Caillin; Gale, Timothy J

    2011-01-01

    The extended period many people now spend looking at computer screens is thought to affect eyesight over the long term. In this paper we are concerned with the development and initial evaluation of a wireless camera-based tracking system providing quantitative assessment of computer screen interaction. The system utilizes a stereo camera system and wireless XBee-based infrared markers and enables unobtrusive monitoring. Preliminary results indicate that the system is an excellent method of monitoring eye-screen distance. This type of system will enable future studies of eye-screen distance for computer users. PMID:22254767
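
    For a calibrated, rectified stereo pair the marker-to-camera distance follows from triangulation, Z = f·B/d; the sketch below uses the textbook relation with invented numbers, not the authors' calibration.

```python
def stereo_distance(disparity_px, focal_length_px, baseline_mm):
    """Depth from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("marker must be detected in both cameras")
    return focal_length_px * baseline_mm / disparity_px

# illustrative numbers: 800 px focal length, 60 mm camera baseline,
# infrared marker found 75 px apart between the two images
print(stereo_distance(75.0, 800.0, 60.0), "mm from the cameras")  # 640 mm
```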

  4. A computational model that recovers the 3D shape of an object from a single 2D retinal representation.

    PubMed

    Li, Yunfeng; Pizlo, Zygmunt; Steinman, Robert M

    2009-05-01

    Human beings perceive 3D shapes veridically, but the underlying mechanisms remain unknown. The problem of producing veridical shape percepts is computationally difficult because the 3D shapes have to be recovered from 2D retinal images. This paper describes a new model, based on a regularization approach, that does this very well. It uses a new simplicity principle composed of four shape constraints: viz., symmetry, planarity, maximum compactness and minimum surface. Maximum compactness and minimum surface have never been used before. The model was tested with random symmetrical polyhedra. It recovered their 3D shapes from a single randomly-chosen 2D image. Neither learning, nor depth perception, was required. The effectiveness of the maximum compactness and the minimum surface constraints was measured by how well the aspect ratio of the 3D shapes was recovered. These constraints were effective; they recovered the aspect ratio of the 3D shapes very well. Aspect ratios recovered by the model were compared to aspect ratios adjusted by four human observers. They also adjusted aspect ratios very well. In those rare cases in which the human observers showed large errors in adjusted aspect ratios, their errors were very similar to the errors made by the model. PMID:18621410

  5. Adaptive vessel tracking: automated computation of vessel trajectories for improved efficiency in 2D coronary MR angiography.

    PubMed

    Saranathan, M; Ho, V B; Hood, M N; Foo, T K; Hardy, C J

    2001-10-01

    A new method was investigated for improving the efficiency of ECG-gated coronary magnetic resonance angiography (CMRA) by accurate, automated tracking of the vessel motion over the cardiac cycle. Vessel tracking was implemented on a spiral gradient-echo pulse sequence with sub-millimeter in-plane spatial resolution as well as high image signal-to-noise ratio. Breath-hold 2D CMRA was performed in 18 healthy adult subjects (mean age 46 +/- 14 years). Imaging efficiency, defined as the percentage of the slices where more than 30 mm of the vessel is visualized, was computed in multi-slice spiral scans with and without vessel tracking. There was a significant improvement in the efficiency of the vessel tracking sequence compared to the multi-slice sequence (56% vs. 32%, P < 0.001). The imaging efficiency increased further when the true motion of the coronary arteries (determined using a cross-correlation algorithm) was used for vessel tracking as opposed to a linear model for motion (71% vs. 57%, P < 0.05). The motion of the coronary arteries was generally found to be linear during the systolic phase and nonlinear during the diastolic phase. The use of subject-tailored, automated tracking of vessel positions resulted in improved efficiency of coronary artery illustration on breath-held 2D CMRA.
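
    The "true motion" referred to above was obtained with a cross-correlation algorithm; a minimal stand-in is to take the frame-to-frame shift of a 1D vessel-position profile as the lag of maximum cross-correlation, as sketched below on synthetic data.

```python
import numpy as np

def displacement_by_xcorr(profile_a, profile_b):
    """Shift (in samples) that best aligns profile_b to profile_a, found at the
    lag of maximum cross-correlation after mean removal."""
    a = profile_a - profile_a.mean()
    b = profile_b - profile_b.mean()
    corr = np.correlate(a, b, mode="full")
    return np.argmax(corr) - (len(b) - 1)

# toy example: a Gaussian "vessel" profile shifted by 7 samples between phases
x = np.arange(200)
phase1 = np.exp(-(x - 90) ** 2 / 50.0)
phase2 = np.exp(-(x - 97) ** 2 / 50.0)
print(displacement_by_xcorr(phase2, phase1))  # 7
```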

  6. Touch-screen technology for the dynamic display of 2D spatial information without vision: promise and progress.

    PubMed

    Klatzky, Roberta L; Giudice, Nicholas A; Bennett, Christopher R; Loomis, Jack M

    2014-01-01

    Many developers wish to capitalize on touch-screen technology for developing aids for the blind, particularly by incorporating vibrotactile stimulation to convey patterns on their surfaces, which otherwise are featureless. Our belief is that they will need to take into account basic research on haptic perception in designing these graphics interfaces. We point out constraints and limitations in haptic processing that affect the use of these devices. We also suggest ways to use sound to augment basic information from touch, and we include evaluation data from users of a touch-screen device with vibrotactile and auditory feedback that we have been developing, called a vibro-audio interface.

  7. Systematic E2 screening reveals a UBE2D-RNF138-CtIP axis promoting DNA repair

    PubMed Central

    Sczaniecka-Clift, Matylda; Coates, Julia; Jhujh, Satpal; Demir, Mukerrem; Cornwell, Matthew; Beli, Petra; Jackson, Stephen P

    2016-01-01

    Ubiquitylation is crucial for proper cellular responses to DNA double-strand breaks (DSBs). If unrepaired, these highly cytotoxic lesions cause genome instability, tumourigenesis, neurodegeneration or premature ageing. Here, we conduct a comprehensive, multilayered screen to systematically profile all human ubiquitin E2-enzymes for impacts on cellular DSB responses. Applying a widely applicable approach, we use an exemplary E2 family, UBE2Ds, to identify ubiquitylation-cascade components downstream of E2s. Thus, we uncover the nuclear E3-ligase RNF138 as a key homologous recombination (HR)-promoting factor that functions with UBE2Ds in cells. Mechanistically, UBE2Ds and RNF138 accumulate at DNA-damage sites and act at early resection stages by promoting CtIP ubiquitylation and accrual. This work supplies insights into regulation of DSB repair by HR. Moreover, it provides a rich information resource on E2s that can be exploited by follow-on studies. PMID:26502057

  8. Screening for lung cancer using low dose computed tomography.

    PubMed

    Tammemagi, Martin C; Lam, Stephen

    2014-01-01

    Screening for lung cancer with low dose computed tomography can reduce mortality from the disease by 20% in high risk smokers. This review covers the state of the art knowledge on several aspects of implementing a screening program. The most important are to identify people who are at high enough risk to warrant screening and the appropriate management of lung nodules found at screening. An accurate risk prediction model is more efficient than age and pack years of smoking alone at identifying those who will develop lung cancer and die from the disease. Algorithms are available for assessing people who screen positive to determine who needs additional imaging or invasive investigations. Concerns about low dose computed tomography screening include false positive results, overdiagnosis, radiation exposure, and costs. Further work is needed to define the frequency and duration of screening and to refine risk prediction models so that they can be used to assess the risk of lung cancer in special populations. Another important area is the use of computer vision software tools to facilitate high throughput interpretation of low dose computed tomography images so that costs can be reduced and the consistency of scan interpretation can be improved. Sufficient data are available to support the implementation of screening programs at the population level in stages that can be expanded when found to perform well to improve the outcome of patients with lung cancer. PMID:24865600

  9. ASIC-based architecture for the real-time computation of 2D convolution with large kernel size

    NASA Astrophysics Data System (ADS)

    Shao, Rui; Zhong, Sheng; Yan, Luxin

    2015-12-01

    Bidimensional convolution is a low-level processing algorithm of interest in many areas, but its high computational cost constrains the size of the kernels, especially in real-time embedded systems. This paper presents a hardware architecture for the ASIC-based implementation of 2-D convolution with medium-large kernels. To improve the efficiency of on-chip storage resources and to reduce off-chip bandwidth, a data-reuse cache is constructed: multi-block SPRAM cross-caches image data, and an on-chip ping-pong scheme takes full advantage of data reuse in the convolution calculation, resulting in a new ASIC data scheduling scheme and overall architecture. Experimental results show that the structure achieves real-time convolution with kernel sizes up to 40 × 32, improves the utilization of on-chip memory bandwidth and on-chip memory resources, maximizes data throughput, and reduces the need for off-chip memory bandwidth.
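
    As a software reference model for the operation the ASIC implements (not the hardware design itself), the sketch below performs direct 2D convolution with zero padding; a real implementation would stream rows through line buffers rather than use random access, and the kernel here is a small stand-in.

```python
import numpy as np

def conv2d(image, kernel):
    """Direct 2D convolution with zero padding (reference model, odd-sized kernel)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    flipped = kernel[::-1, ::-1]              # flip for true convolution
    padded = np.pad(image, ((ph, ph), (pw, pw)))
    out = np.zeros_like(image, dtype=float)
    for r in range(kh):                       # accumulate one kernel tap at a time
        for c in range(kw):
            out += flipped[r, c] * padded[r:r + image.shape[0], c:c + image.shape[1]]
    return out

img = np.arange(36, dtype=float).reshape(6, 6)
box = np.ones((3, 3)) / 9.0                   # small smoothing kernel as a stand-in
print(conv2d(img, box))
```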

  10. Digit Ratios (2D:4D) Determined by Computer-Assisted Analysis are More Reliable than Those Using Physical Measurements, Photocopies, and Printed Scans

    PubMed Central

    ALLAWAY, HEATHER C.; BLOSKI, TERRI G.; PIERSON, ROGER A.; LUJAN, MARLA E.

    2010-01-01

    Prenatal androgens influence the second to fourth digit ratio (2D:4D) of hands with men having lower ratios than women. Numerous methods are used to assess 2D:4D including, physical measurements with calipers, and measurements made from photocopies, scanned images, digital photographs, radiographs, and scaled tubes. Although each method appears relatively reliable, agreement upon a gold standard is necessary to better explore the putative effects of prenatal androgens. Our objective was to assess the level of intra and interobserver reliability when evaluating 2D:4D using four techniques: (1) physical measurements, (2) photocopies, (3) printed scanned images, and (4) computer-assisted image analysis. Physical measurements, photocopies, and printed scanned images were measured with Vernier calipers. Scanned images were also measured with computer-based calipers. Measurements were made in 30 men and 30 women at two different time points, by three experienced observers. Intraclass correlation coefficients were used to assess the level of reliability. Intraobserver reliability was best for computer-assisted (0.957), followed by photocopies (0.939), physical measurements (0.925), and printed scans (0.842; P = 0.015). Interobserver reliability was also greatest for computer-assisted (0.892), followed by photocopies (0.858), physical measurements (0.795), and printed scans (0.761; P = 0.001). Mean 2D:4D from physical measurements were higher than all other techniques (P < 0.0001). Digit ratios determined from computer-assisted, physical measurements, and printed scans were more reliable in men than women (P = 0.009, P = 0.017, and P = 0.012, respectively). In summary, 2D:4D determined from computer-assisted analysis yielded the most accurate and consistent measurements among observers. Investigations of 2D:4D should use computer-assisted measurements over alternate methods whenever possible. PMID:19263413

  11. Contributions of the European trials (European randomized screening group) in computed tomography lung cancer screening.

    PubMed

    Heuvelmans, Marjolein A; Vliegenthart, Rozemarijn; Oudkerk, Matthijs

    2015-03-01

    Lung cancer is the leading cause of cancer-related death worldwide. In 2011, the largest lung cancer screening trial worldwide, the US National Lung Screening Trial, published a 20% decrease in lung cancer-specific mortality in the computed tomography (CT)-screened group, compared with the group screened by chest x-ray. On the basis of this trial, different US guidelines recently have recommended CT lung cancer screening. However, several questions regarding the implementation of lung cancer screening need to be answered. In Europe, several lung cancer screening trials are ongoing. It is planned to pool the results of the lung cancer screening trials in European randomized lung cancer CT screening (EUCT). By pooling of the data, EUCT hopes to be able to provide additional information for the discussion of some important issues regarding the implementation of lung cancer screening by low-dose CT, including: the determination of the optimal screen population, the comparison between a volume-based and diameter-based nodule management protocol, and the determination of optimal screen intervals.

  12. SCREENING CHEMICALS FOR ESTROGEN RECEPTOR BIOACTIVITY USING A COMPUTATIONAL MODEL

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) is considering the use of high-throughput and computational methods for regulatory applications in the Endocrine Disruptor Screening Program (EDSP). To use these new tools for regulatory decision making, computational methods must be a...

  13. Variables of Computer Screen Display and How They Affect Learning.

    ERIC Educational Resources Information Center

    Hathaway, Michael D.

    1984-01-01

    Reviews research findings on variables in computer screen displays and their effects on learning for use by purchasers or designers of computer systems for instructional purposes. Variables discussed include fatigue, density of displayed text, scrolling, upper-case versus upper- and lower-case lettering, letter size, and graphics. Ten references…

  14. GRID2D/3D: A computer program for generating grid systems in complex-shaped two- and three-dimensional spatial domains. Part 1: Theory and method

    NASA Technical Reports Server (NTRS)

    Shih, T. I.-P.; Bailey, R. T.; Nguyen, H. L.; Roelke, R. J.

    1990-01-01

    An efficient computer program, called GRID2D/3D was developed to generate single and composite grid systems within geometrically complex two- and three-dimensional (2- and 3-D) spatial domains that can deform with time. GRID2D/3D generates single grid systems by using algebraic grid generation methods based on transfinite interpolation in which the distribution of grid points within the spatial domain is controlled by stretching functions. All single grid systems generated by GRID2D/3D can have grid lines that are continuous and differentiable everywhere up to the second-order. Also, grid lines can intersect boundaries of the spatial domain orthogonally. GRID2D/3D generates composite grid systems by patching together two or more single grid systems. The patching can be discontinuous or continuous. For continuous composite grid systems, the grid lines are continuous and differentiable everywhere up to the second-order except at interfaces where different single grid systems meet. At interfaces where different single grid systems meet, the grid lines are only differentiable up to the first-order. For 2-D spatial domains, the boundary curves are described by using either cubic or tension spline interpolation. For 3-D spatial domains, the boundary surfaces are described by using either linear Coon's interpolation, bi-hyperbolic spline interpolation, or a new technique referred to as 3-D bi-directional Hermite interpolation. Since grid systems generated by algebraic methods can have grid lines that overlap one another, GRID2D/3D contains a graphics package for evaluating the grid systems generated. With the graphics package, the user can generate grid systems in an interactive manner with the grid generation part of GRID2D/3D. GRID2D/3D is written in FORTRAN 77 and can be run on any IBM PC, XT, or AT compatible computer. In order to use GRID2D/3D on workstations or mainframe computers, some minor modifications must be made in the graphics part of the program; no
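
    The algebraic grid generation mentioned above rests on transfinite (Coons) interpolation of interior points from four boundary curves; the sketch below is a minimal 2D version with linear blending functions, written in Python rather than the program's FORTRAN 77 and without GRID2D/3D's stretching or orthogonality controls.

```python
import numpy as np

def transfinite_grid(bottom, top, left, right):
    """Bilinear transfinite (Coons) interpolation of interior grid points from
    four boundary curves. bottom/top are (n, 2) arrays, left/right are (m, 2)
    arrays of x-y points; the four corner points must match."""
    n, m = len(bottom), len(left)
    u = np.linspace(0.0, 1.0, n)[:, None, None]   # parameter along bottom/top
    v = np.linspace(0.0, 1.0, m)[None, :, None]   # parameter along left/right
    grid = ((1 - v) * bottom[:, None, :] + v * top[:, None, :]
            + (1 - u) * left[None, :, :] + u * right[None, :, :]
            - (1 - u) * (1 - v) * bottom[0] - u * (1 - v) * bottom[-1]
            - (1 - u) * v * top[0] - u * v * top[-1])
    return grid                                    # shape (n, m, 2)

# toy domain: straight bottom, left and right edges under a curved top edge
s = np.linspace(0.0, 1.0, 21)
bottom = np.stack([s, np.zeros_like(s)], axis=1)
top = np.stack([s, 1.0 + 0.2 * np.sin(np.pi * s)], axis=1)
t = np.linspace(0.0, 1.0, 11)
left = np.stack([np.zeros_like(t), t], axis=1)
right = np.stack([np.ones_like(t), t], axis=1)
grid = transfinite_grid(bottom, top, left, right)
print(grid.shape)  # (21, 11, 2)
```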

  15. Computational design of soft materials for the capture of Cs-137 in contaminated environments: From 2D covalent cucurbituril networks to 3D supramolecular materials

    NASA Astrophysics Data System (ADS)

    Pichierri, Fabio

    2016-08-01

    Using computational quantum chemistry methods we design novel 2D and 3D soft materials made of cucurbituril macrocycles covalently connected with each other via rigid linkers. Such covalent cucurbituril networks might be useful for the capture of radioactive Cs-137 (present as Cs+) in the contaminated environment.

  16. Verification and benchmarking of MAGNUM-2D: a finite element computer code for flow and heat transfer in fractured porous media

    SciTech Connect

    Eyler, L.L.; Budden, M.J.

    1985-03-01

    The objective of this work is to assess prediction capabilities and features of the MAGNUM-2D computer code in relation to its intended use in the Basalt Waste Isolation Project (BWIP). This objective is accomplished through a code verification and benchmarking task. Results are documented which support correctness of prediction capabilities in areas of intended model application. 10 references, 43 figures, 11 tables.

  17. The Effects of Computer Usage on Computer Screen Reading Rate.

    ERIC Educational Resources Information Center

    Clausing, Carolyn S.; Schmitt, Dorren Rafael

    This study investigated the differences in the reading rate of eighth grade students on a cloze reading exercise involving the reading of text from a computer monitor. Several different modes of presentation were used in order to determine the effect of prior experience with computers on the students' reading rate. Subjects were 240 eighth grade…

  18. Upgrade of PARC2D to include real gas effects. [computer program for flowfield surrounding aeroassist flight experiment

    NASA Technical Reports Server (NTRS)

    Saladino, Anthony; Praharaj, Sarat C.; Collins, Frank G.; Seaford, C. Mark

    1990-01-01

    This paper presents a description of the changes and additions to the perfect gas PARC2D code to include chemical equilibrium effects, resulting in a code called PARCEQ2D. The work developed out of a need to have the capability of more accurately representing the flowfield surrounding the aeroassist flight experiment (AFE) vehicle. Use is made of the partition function of statistical mechanics in the evaluation of the thermochemical properties. This approach will allow the PARC code to be extended to thermal nonequilibrium when this task is undertaken in the future. The transport properties follow from formulae from the kinetic theory of gases. Results are presented for a two-dimensional AFE that compare perfect gas and real gas solutions at flight conditions, showing vast differences between the two cases.
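
    The partition-function route to thermochemical properties mentioned above can be illustrated in a few lines: q(T) is a Boltzmann-weighted sum over energy levels and the mean energy follows from the same weights. The two-level system and the temperature below are invented purely for illustration and have nothing to do with the AFE flowfield data.

```python
import numpy as np

K_B = 1.380649e-23   # Boltzmann constant, J/K

def partition_function(levels_j, degeneracies, t_kelvin):
    """q(T) = sum_i g_i exp(-E_i / kT) over a set of energy levels (in joules)."""
    return np.sum(degeneracies * np.exp(-levels_j / (K_B * t_kelvin)))

def mean_energy(levels_j, degeneracies, t_kelvin):
    """Mean energy per particle, <E> = sum_i g_i E_i exp(-E_i / kT) / q(T)."""
    weights = degeneracies * np.exp(-levels_j / (K_B * t_kelvin))
    return np.sum(levels_j * weights) / partition_function(levels_j, degeneracies, t_kelvin)

# two-level toy species (0 and 0.5 eV) at a post-shock temperature of 6000 K
levels = np.array([0.0, 0.5 * 1.602e-19])
degeneracies = np.array([1.0, 3.0])
print(mean_energy(levels, degeneracies, 6000.0) / 1.602e-19, "eV per particle")
```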

  19. Reduced-dimensional quantum computations for the rotational-vibrational dynamics of F(-)-CH4 and F(-)-CH2D2.

    PubMed

    Fábri, Csaba; Császár, Attila G; Czakó, Gábor

    2013-08-15

    Variational rotational-vibrational quantum chemical computations are performed for the F(-)-CH4 and F(-)-CH2D2 anion complexes using several reduced-dimensional models in a curvilinear polyspherical coordinate system and utilizing an accurate ab initio potential energy surface (PES). The implementation of the models is made practical by using the general rovibrational code GENIUSH, which constructs the complicated form of the exact rovibrational kinetic energy operator in reduced and full dimensions in any user-specified coordinates and body-fixed frames. A one-dimensional CF stretch, 1D(RCF), a two-dimensional intermolecular bend, 2D(θ,φ), and a three-dimensional intermolecular, 3D(RCF,θ,φ), rigid methane model provide vibrational energies for the low-frequency, large-amplitude modes in good agreement with full-dimensional MCTDH results for F(-)-CH4. The 2D(θ,φ) and 3D(RCF,θ,φ) four-well computations, describing equally the four possible CH-F(-) bonds, show that the ground-state tunneling splitting is less than 0.01 cm(-1). For the hydrogen-bonded CH stretching fundamental a local-mode model is found to have almost spectroscopic accuracy, whereas a harmonic frequency analysis performs poorly. The 2D(θ,φ) and 3D(RCF,θ,φ) rotational-vibrational computations on the Td-symmetric four-well PES reveal that in most cases F(-)-CH4 behaves as a semirigid C3v symmetric top. For the degenerate intermolecular bending vibrational states substantial splittings of the rigid rotor levels are observed. For F(-)-CH2D2 the rotational levels guide the assignment of the vibrational states to either F(-)-H or F(-)-D connectivity. PMID:23402210

  20. Logistical Consideration in Computer-Based Screening of Astronaut Applicants

    NASA Technical Reports Server (NTRS)

    Galarza, Laura

    2000-01-01

    This presentation reviews the logistical, ergonomic, and psychometric issues and data related to the development and operational use of a computer-based system for the psychological screening of astronaut applicants. The Behavioral Health and Performance Group (BHPG) at the Johnson Space Center upgraded its astronaut psychological screening and selection procedures for the 1999 astronaut applicants and subsequent astronaut selection cycles. The questionnaires, tests, and inventories were upgraded from a paper-and-pencil system to a computer-based system. Members of the BHPG and a computer programmer designed and developed needed interfaces (screens, buttons, etc.) and programs for the astronaut psychological assessment system. This intranet-based system included the user-friendly computer-based administration of tests, test scoring, generation of reports, the integration of test administration and test output to a single system, and a complete database for past, present, and future selection data. Upon completion of the system development phase, four beta and usability tests were conducted with the newly developed system. The first three tests included 1 to 3 participants each. The final system test was conducted with 23 participants tested simultaneously. Usability and ergonomic data were collected from the system (beta) test participants and from 1999 astronaut applicants who volunteered the information in exchange for anonymity. Beta and usability test data were analyzed to examine operational, ergonomic, programming, test administration and scoring issues related to computer-based testing. Results showed a preference for computer-based testing over paper-and-pencil procedures. The data also reflected specific ergonomic, usability, psychometric, and logistical concerns that should be taken into account in future selection cycles. Conclusion. Psychological, psychometric, human and logistical factors must be examined and considered carefully when developing and

  1. Validity of computational hemodynamics in human arteries based on 3D time-of-flight MR angiography and 2D electrocardiogram gated phase contrast images

    NASA Astrophysics Data System (ADS)

    Yu, Huidan (Whitney); Chen, Xi; Chen, Rou; Wang, Zhiqiang; Lin, Chen; Kralik, Stephen; Zhao, Ye

    2015-11-01

    In this work, we demonstrate the validity of 4-D patient-specific computational hemodynamics (PSCH) based on 3-D time-of-flight (TOF) MR angiography (MRA) and 2-D electrocardiogram (ECG) gated phase contrast (PC) images. The mesoscale lattice Boltzmann method (LBM) is employed to segment morphological arterial geometry from TOF MRA, to extract velocity profiles from ECG PC images, and to simulate fluid dynamics on a unified GPU accelerated computational platform. Two healthy volunteers are recruited to participate in the study. For each volunteer, a 3-D high resolution TOF MRA image and 10 2-D ECG gated PC images are acquired to provide the morphological geometry and the time-varying flow velocity profiles for necessary inputs of the PSCH. Validation results will be presented through comparisons of LBM vs. 4D Flow Software for flow rates and LBM simulation vs. MRA measurement for blood flow velocity maps. Indiana University Health (IUH) Values Fund.

  2. A review of automated image understanding within 3D baggage computed tomography security screening.

    PubMed

    Mouton, Andre; Breckon, Toby P

    2015-01-01

    Baggage inspection is the principal safeguard against the transportation of prohibited and potentially dangerous materials at airport security checkpoints. Although traditionally performed by 2D X-ray based scanning, increasingly stringent security regulations have led to a growing demand for more advanced imaging technologies. The role of X-ray Computed Tomography is thus rapidly expanding beyond the traditional materials-based detection of explosives. The development of computer vision and image processing techniques for the automated understanding of 3D baggage-CT imagery is however, complicated by poor image resolutions, image clutter and high levels of noise and artefacts. We discuss the recent and most pertinent advancements and identify topics for future research within the challenging domain of automated image understanding for baggage security screening CT. PMID:26409422

  4. Screening for lung cancer: time for large-scale screening by chest computed tomography.

    PubMed

    Shlomi, Dekel; Ben-Avi, Ronny; Balmor, Gingy Ronen; Onn, Amir; Peled, Nir

    2014-07-01

    Lung cancer is the leading cause of cancer death worldwide. Age and smoking are the primary risk factors for lung cancer. Treatment based on surgical removal in the early stages of the disease results in better survival. Screening programmes for early detection that used chest radiography and sputum cytology failed to attain reduction of lung cancer mortality. Screening by low-dose computed tomography (CT) demonstrated high rates of early-stage lung cancer detection in a high-risk population. Nevertheless, no mortality advantage was manifested in small randomised control trials. A large randomised control trial in the U.S.A., the National Lung Screening Trial (NLST), showed a significant relative reduction of 20% in lung cancer mortality and 6.7% reduction in total mortality, yet no reduction was evidenced in the late-stage prevalence. Screening for lung cancer by low-dose CT reveals a high level of false-positive lesions, which necessitates further noninvasive and invasive evaluations. Based primarily on the NLST eligible criteria, new guidelines have recently been developed by major relevant organisations. The overall recommendation coming out of this collective work calls for lung cancer screening by low-dose CT to be performed in medical centres manned by specialised multidisciplinary teams, as well as for a mandatory, pre-screening, comprehensive discussion with the patient about the risks and advantages involved in the process. Lung cancer screening is on the threshold of a new era, with ever more questions still left open to challenge future studies.

  5. Decision trees and integrated features for computer aided mammographic screening

    SciTech Connect

    Kegelmeyer, W.P. Jr.; Groshong, B.; Allmen, M.; Woods, K.

    1997-02-01

    Breast cancer is a serious problem, which in the United States causes 43,000 deaths a year, eventually striking 1 in 9 women. Early detection is the only effective countermeasure, and mass mammography screening is the only reliable means for early detection. Mass screening has many shortcomings which could be addressed by a computer-aided mammographic screening system. Accordingly, we have applied the pattern recognition methods developed in earlier investigations of spiculated lesions in mammograms to the detection of microcalcifications and circumscribed masses, generating new, more rigorous and uniform methods for the detection of both those signs. We have also improved the pattern recognition methods themselves, through the development of a new approach to combinations of multiple classifiers.

  6. Affinity-Based Screening of Tetravalent Peptides Identifies Subtype-Selective Neutralizers of Shiga Toxin 2d, a Highly Virulent Subtype, by Targeting a Unique Amino Acid Involved in Its Receptor Recognition.

    PubMed

    Mitsui, Takaaki; Watanabe-Takahashi, Miho; Shimizu, Eiko; Zhang, Baihao; Funamoto, Satoru; Yamasaki, Shinji; Nishikawa, Kiyotaka

    2016-09-01

    Shiga toxin (Stx), a major virulence factor of enterohemorrhagic Escherichia coli (EHEC), can be classified into two subgroups, Stx1 and Stx2, each consisting of various closely related subtypes. Stx2 subtypes Stx2a and Stx2d are highly virulent and linked with serious human disorders, such as acute encephalopathy and hemolytic-uremic syndrome. Through affinity-based screening of a tetravalent peptide library, we previously developed peptide neutralizers of Stx2a in which the structure was optimized to bind to the B-subunit pentamer. In this study, we identified Stx2d-selective neutralizers by targeting Asn16 of the B subunit, an amino acid unique to Stx2d that plays an essential role in receptor binding. We synthesized a series of tetravalent peptides on a cellulose membrane in which the core structure was exactly the same as that of peptides in the tetravalent library. A total of nine candidate motifs were selected to synthesize tetravalent forms of the peptides by screening two series of the tetravalent peptides. Five of the tetravalent peptides effectively inhibited the cytotoxicity of Stx2a and Stx2d, and notably, two of the peptides selectively inhibited Stx2d. These two tetravalent peptides bound to the Stx2d B subunit with high affinity dependent on Asn16. The mechanism of binding to the Stx2d B subunit differed from that of binding to Stx2a in that the peptides covered a relatively wide region of the receptor-binding surface. Thus, this highly optimized screening technique enables the development of subtype-selective neutralizers, which may lead to more sophisticated treatments of infections by Stx-producing EHEC. PMID:27382021

  7. Screening Chemicals for Estrogen Receptor Bioactivity Using a Computational Model.

    PubMed

    Browne, Patience; Judson, Richard S; Casey, Warren M; Kleinstreuer, Nicole C; Thomas, Russell S

    2015-07-21

    The U.S. Environmental Protection Agency (EPA) is considering high-throughput and computational methods to evaluate the endocrine bioactivity of environmental chemicals. Here we describe a multistep, performance-based validation of new methods and demonstrate that these new tools are sufficiently robust to be used in the Endocrine Disruptor Screening Program (EDSP). Results from 18 estrogen receptor (ER) ToxCast high-throughput screening assays were integrated into a computational model that can discriminate bioactivity from assay-specific interference and cytotoxicity. Model scores range from 0 (no activity) to 1 (bioactivity of 17β-estradiol). ToxCast ER model performance was evaluated for reference chemicals, as well as results of EDSP Tier 1 screening assays in current practice. The ToxCast ER model accuracy was 86% to 93% when compared to reference chemicals and predicted results of EDSP Tier 1 guideline and other uterotrophic studies with 84% to 100% accuracy. The performance of high-throughput assays and ToxCast ER model predictions demonstrates that these methods correctly identify active and inactive reference chemicals, provide a measure of relative ER bioactivity, and rapidly identify chemicals with potential endocrine bioactivities for additional screening and testing. EPA is accepting ToxCast ER model data for 1812 chemicals as alternatives for EDSP Tier 1 ER binding, ER transactivation, and uterotrophic assays. PMID:26066997

  8. Computer Simulation Of Radiographic Screen-Film Images

    NASA Astrophysics Data System (ADS)

    Metter, Richard V.; Dillon, Peter L.; Huff, Kenneth E.; Rabbani, Majid

    1986-06-01

    A method is described for computer simulation of radiographic screen-film images. This method is based on a previously published model of the screen-film imaging process. The x-ray transmittance of a test object is sampled at a pitch of 50 μm by scanning a high-resolution, low-noise direct-exposure radiograph. This transmittance is then used, along with the x-ray exposure incident upon the object, to determine the expected number of quanta per pixel incident upon the screen. The random nature of x-ray arrival and absorption, x-ray quantum to light photon conversion, and photon absorption by the film is simulated by appropriate random number generation. Standard FFT techniques are used for computing the effects of scattering. Finally, the computed film density for each pixel is produced on a high-resolution, low-noise output film by a scanning printer. The simulation allows independent specification of x-ray exposure, x-ray quantum absorption, light conversion statistics, light scattering, and film characteristics (sensitometry and granularity). Each of these parameters is independently measured for radiographic systems of interest. The simulator is tested by comparing actual radiographic images with simulated images resulting from the independently measured parameters. Images are also shown illustrating the effects of changes in these parameters on image quality. Finally, comparison is made with a "perfect" imaging system where information content is only limited by the finite number of x-rays.
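
    As a rough illustration of the quantum-noise and scattering steps described above (not the authors' code; the grid size, exposure level, scatter width, and film curve below are made-up parameters), a Python sketch of the simulation chain might look like:

        # Illustrative sketch of screen-film noise/blur simulation (assumed parameters).
        import numpy as np

        rng = np.random.default_rng(0)

        def simulate_screen_film(transmittance, quanta_per_pixel=500.0, absorption=0.6,
                                 scatter_sigma_px=2.0, gamma=2.5, fog=0.2):
            """transmittance: 2D array in [0, 1] sampled at the scanning pitch."""
            # Random x-ray arrival and absorption per pixel (Poisson statistics).
            mean_absorbed = quanta_per_pixel * transmittance * absorption
            absorbed = rng.poisson(mean_absorbed).astype(float)

            # Light scattering in the screen modeled as a Gaussian blur applied via FFT.
            ny, nx = absorbed.shape
            fy = np.fft.fftfreq(ny)[:, None]
            fx = np.fft.fftfreq(nx)[None, :]
            mtf = np.exp(-2.0 * (np.pi * scatter_sigma_px) ** 2 * (fx ** 2 + fy ** 2))
            light = np.real(np.fft.ifft2(np.fft.fft2(absorbed) * mtf))

            # Simplistic film characteristic curve: density versus log exposure.
            return fog + gamma * np.log10(1.0 + np.clip(light, 0.0, None))

        density = simulate_screen_film(rng.uniform(0.2, 0.9, size=(64, 64)))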

  9. Magnification error of digital x rays on the computer screen.

    PubMed

    Ranjitkar, S; Prakash, D; Prakash, R

    2014-12-01

    Pre-operative templating of x-rays for total hip and knee replacements is important for planning surgery. This is usually done using acetate templates of the prosthesis on hard copies of the x-ray. With the change in practice to using digital x-rays on computer screens instead of hard copies, it is important to assess whether acetate templates can be used for digital x-rays on the computer screen. This is a retrospective x-ray study of 19 hip replacements and 30 knee replacements to assess their magnification using the Picture Archiving and Communication System (PACS) software, and thereby the accuracy of templating with acetate templates over a computer screen. In total hip replacement, the outer cup diameter was also measured using the digital measurement scale. The mean magnification was 0.59 for the acetabular cup and the femoral stem in total hip replacement and 0.48 for the femoral and tibial implants in total knee replacement. The mean difference in cup diameter compared with the real size was an excess of 10.21 mm. The study showed over-magnified hip and knee x-rays, suggesting that acetate templates and measurement scales on PACS are not reliable.
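
    The magnification check described in the study reduces to comparing an on-screen measurement against the known implant dimension; a minimal sketch of that arithmetic (with hypothetical numbers, not the study data) is:

        # Hypothetical magnification check for templating (illustrative values only).
        def magnification_error(measured_mm, true_mm):
            """Return the scale factor and absolute excess of an on-screen measurement."""
            scale = measured_mm / true_mm
            return scale, measured_mm - true_mm

        # e.g. a 50 mm acetabular cup measured as 60 mm on the displayed image
        scale, excess = magnification_error(measured_mm=60.0, true_mm=50.0)
        print(f"displayed/true = {scale:.2f}, excess = {excess:.1f} mm")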

  10. Analysis of 2D Torus and Hub Topologies of 100Mb/s Ethernet for the Whitney Commodity Computing Testbed

    NASA Technical Reports Server (NTRS)

    Pedretti, Kevin T.; Fineberg, Samuel A.; Kutler, Paul (Technical Monitor)

    1997-01-01

    A variety of different network technologies and topologies are currently being evaluated as part of the Whitney Project. This paper reports on the implementation and performance of a Fast Ethernet network configured in a 4x4 2D torus topology in a testbed cluster of 'commodity' Pentium Pro PCs. Several benchmarks were used for performance evaluation: an MPI point to point message passing benchmark, an MPI collective communication benchmark, and the NAS Parallel Benchmarks version 2.2 (NPB2). Our results show that for point to point communication on an unloaded network, the hub and 1 hop routes on the torus have about the same bandwidth and latency. However, the bandwidth decreases and the latency increases on the torus for each additional route hop. Collective communication benchmarks show that the torus provides roughly four times more aggregate bandwidth and eight times faster MPI barrier synchronizations than a hub based network for 16 processor systems. Finally, the SOAPBOX benchmarks, which simulate real-world CFD applications, generally demonstrated substantially better performance on the torus than on the hub. In the few cases the hub was faster, the difference was negligible. In total, our experimental results lead to the conclusion that for Fast Ethernet networks, the torus topology has better performance and scales better than a hub based network.
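
    The aggregate-bandwidth advantage of the torus can be reasoned about with simple hop-count arithmetic; the short sketch below computes the average minimal hop count on a 4x4 2D torus (an illustrative calculation, not the paper's measurements):

        # Back-of-the-envelope hop counts for a 4x4 2D torus (illustrative only).
        import itertools

        N = 4  # nodes per dimension

        def torus_hops(a, b):
            """Minimal hop count between nodes a=(x, y) and b on an N x N torus."""
            return sum(min(abs(ai - bi), N - abs(ai - bi)) for ai, bi in zip(a, b))

        nodes = list(itertools.product(range(N), repeat=2))
        pairs = [(a, b) for a in nodes for b in nodes if a != b]
        avg_hops = sum(torus_hops(a, b) for a, b in pairs) / len(pairs)
        print(f"average hops on a {N}x{N} torus: {avg_hops:.2f}")  # about 2.13 for N = 4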

  11. GeoBuilder: a geometric algorithm visualization and debugging system for 2D and 3D geometric computing.

    PubMed

    Wei, Jyh-Da; Tsai, Ming-Hung; Lee, Gen-Cher; Huang, Jeng-Hung; Lee, Der-Tsai

    2009-01-01

    Algorithm visualization is a unique research topic that integrates engineering skills such as computer graphics, system programming, database management, and computer networks to help algorithm researchers test their ideas, demonstrate new findings, and teach algorithm design in the classroom. Within the broad applications of algorithm visualization, there remain performance issues that deserve further research, e.g., system portability, collaboration capability, and animation effects in 3D environments. Using modern Java programming technologies, we develop an algorithm visualization and debugging system, dubbed GeoBuilder, for geometric computing. The GeoBuilder system features Java's promising portability, support for collaboration in algorithm development, and automatic camera positioning for tracking 3D geometric objects. In this paper, we describe the design of the GeoBuilder system and demonstrate its applications. PMID:19147888

  12. Fault-tolerant quantum computation and communication on a distributed 2D array of small local systems

    SciTech Connect

    Fujii, K.; Yamamoto, T.; Imoto, N.; Koashi, M.

    2014-12-04

    We propose a scheme for distributed quantum computation with small local systems connected via noisy quantum channels. We show that the proposed scheme tolerates error probabilities of ∼30% in quantum channels and ∼0.1% in local operations, both of which are substantially improved compared to previous work.

  13. Coupled 2-dimensional cascade theory for noise and unsteady aerodynamics of blade row interaction in turbofans. Volume 2: Documentation for computer code CUP2D

    NASA Technical Reports Server (NTRS)

    Hanson, Donald B.

    1994-01-01

    A two dimensional linear aeroacoustic theory for rotor/stator interaction with unsteady coupling was derived and explored in Volume 1 of this report. Computer program CUP2D has been written in FORTRAN embodying the theoretical equations. This volume (Volume 2) describes the structure of the code, installation and running, preparation of the input file, and interpretation of the output. A sample case is provided with printouts of the input and output. The source code is included with comments linking it closely to the theoretical equations in Volume 1.

  14. FACET: a radiation view factor computer code for axisymmetric, 2D planar, and 3D geometries with shadowing

    SciTech Connect

    Shapiro, A.B.

    1983-08-01

    The computer code FACET calculates the radiation geometric view factor (alternatively called shape factor, angle factor, or configuration factor) between surfaces for axisymmetric, two-dimensional planar and three-dimensional geometries with interposed third surface obstructions. FACET was developed to calculate view factors for input to finite-element heat-transfer analysis codes. The first section of this report is a brief review of previous radiation-view-factor computer codes. The second section presents the defining integral equation for the geometric view factor between two surfaces and the assumptions made in its derivation. Also in this section are the numerical algorithms used to integrate this equation for the various geometries. The third section presents the algorithms used to detect self-shadowing and third-surface shadowing between the two surfaces for which a view factor is being calculated. The fourth section provides a user's input guide followed by several example problems.
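
    For context, the defining integral the report refers to is the standard diffuse view factor between two finite surfaces; in common radiative-heat-transfer (LaTeX) notation, quoted from textbook convention rather than transcribed from the FACET report, it reads:

        F_{1 \to 2} = \frac{1}{A_1} \int_{A_1} \int_{A_2} \frac{\cos\theta_1 \, \cos\theta_2}{\pi r^2} \, dA_2 \, dA_1

    where r is the distance between the differential areas dA_1 and dA_2 and θ_1, θ_2 are the angles between r and the respective surface normals; reciprocity gives A_1 F_{1→2} = A_2 F_{2→1}.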

  15. Computer-assisted lesion detection system for stomach screening using stomach shape and appearance models

    NASA Astrophysics Data System (ADS)

    Midoh, Y.; Nakamura, M.; Takashima, M.; Nakamae, K.; Fujioka, H.

    2007-03-01

    In Japan, stomach cancer is one of the three most common causes of death from cancer. As periodic stomach X-ray health checks have become more widespread, the burden on physicians performing mass screening to detect early signs of disease has been increasing. For the purpose of automatic diagnosis, we are developing a computer-assisted lesion detection system for stomach screening. The proposed system has two databases. One is the stomach shape database, which consists of computer-graphics 3D stomach models based on biomechanics simulation and their projected 2D images. The other is the normal appearance database, which is constructed by learning patterns from a training set of normal patients. The stomach contour is extracted from an X-ray image including a barium-filled region in the following steps. First, the approximate stomach region is obtained by nonrigid registration based on mutual information; the nonrigid transformation includes translations, rotations, scaling, the air-barium interface, and weights of eigenvectors determined by principal components analysis of the stomach shape database. Second, the accurate stomach contour is extracted from the image gradient using dynamic programming. Then, stomach lesions are detected by testing whether, along the extracted contour, the Mahalanobis distance from the mean of the normal appearance database exceeds a suitable threshold. We applied our system to 75 X-ray images of barium-filled stomachs to show its validity.
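
    The final detection step described above, flagging contour points whose appearance lies too far from the normal-appearance model, can be illustrated with a short Mahalanobis-distance check; the feature vectors and threshold below are placeholders, not the authors' trained model:

        # Illustrative Mahalanobis-distance screening against a "normal appearance" model.
        import numpy as np

        def mahalanobis(x, mean, cov):
            d = x - mean
            return float(np.sqrt(d @ np.linalg.solve(cov, d)))

        # Placeholder normal-appearance statistics learned from training profiles.
        rng = np.random.default_rng(1)
        normal_profiles = rng.normal(size=(200, 8))   # e.g. gray-level profiles along the contour
        mean = normal_profiles.mean(axis=0)
        cov = np.cov(normal_profiles, rowvar=False)

        candidate = mean + 4.0                        # a contour point's feature vector
        threshold = 3.0                               # assumed decision threshold
        is_lesion = mahalanobis(candidate, mean, cov) > threshold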

  16. 2D Computational Fluid Dynamic Modeling of Human Ventricle System Based on Fluid-Solid Interaction and Pulsatile Flow.

    PubMed

    Masoumi, Nafiseh; Framanzad, F; Zamanian, Behnam; Seddighi, A S; Moosavi, M H; Najarian, S; Bastani, Dariush

    2013-01-01

    Many diseases are related to cerebrospinal fluid (CSF) hydrodynamics. Therefore, understanding the hydrodynamics of CSF flow and intracranial pressure is helpful for obtaining deeper knowledge of pathological processes and providing better treatments. Furthermore, engineering a reliable computational method is a promising approach for fabricating in vitro models, which is essential for developing generic medicines. A Fluid-Solid Interaction (FSI) model was constructed to simulate CSF flow. An important problem in modeling CSF flow is the diastolic back flow. In this article, using both rigid and flexible conditions for the ventricular system allowed us to evaluate the effect of the surrounding brain tissue. Our model assumed an elastic wall for the ventricles and a pulsatile CSF input as its boundary conditions. The results were compared with experimental data. The flexible model gave better results because it could reproduce the diastolic back flow reported in clinical research studies. Previous rigid models ignored the interaction of the brain parenchyma with CSF and so did not reproduce the back flow during diastole. In this computational fluid dynamic (CFD) analysis, the CSF pressure and flow velocity in different areas were concordant with the experimental data. PMID:25337330

  17. VFLOW2D - A Vortex-Based Code for Computing Flow Over Elastically Supported Tubes and Tube Arrays

    SciTech Connect

    WOLFE,WALTER P.; STRICKLAND,JAMES H.; HOMICZ,GREGORY F.; GOSSLER,ALBERT A.

    2000-10-11

    A numerical flow model is developed to simulate two-dimensional fluid flow past immersed, elastically supported tube arrays. This work is motivated by the objective of predicting forces and motion associated with both deep-water drilling and production risers in the oil industry. This work has other engineering applications including simulation of flow past tubular heat exchangers or submarine-towed sensor arrays and the flow about parachute ribbons. In the present work, a vortex method is used for solving the unsteady flow field. This method demonstrates inherent advantages over more conventional grid-based computational fluid dynamics. The vortex method is non-iterative, does not require artificial viscosity for stability, displays minimal numerical diffusion, can easily treat moving boundaries, and allows a greatly reduced computational domain since vorticity occupies only a small fraction of the fluid volume. A gridless approach is used in the flow sufficiently distant from surfaces. A Lagrangian remap scheme is used near surfaces to calculate diffusion and convection of vorticity. A fast multipole technique is utilized for efficient calculation of velocity from the vorticity field. The ability of the method to correctly predict lift and drag forces on simple stationary geometries over a broad range of Reynolds numbers is presented.
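
    At the core of a vortex method is the evaluation of the velocity induced by the vorticity field; the direct O(N²) Biot-Savart sum below is a minimal 2D illustration (the production code uses a fast multipole summation and a Lagrangian remap near surfaces, neither of which is reproduced here, and the core-smoothing parameter is an assumption):

        # Direct Biot-Savart velocity from 2D point vortices (illustrative; no FMM).
        import numpy as np

        def induced_velocity(targets, vortex_xy, gamma, core=1e-6):
            """Velocity at 'targets' induced by point vortices of circulation 'gamma'."""
            dx = targets[:, None, 0] - vortex_xy[None, :, 0]
            dy = targets[:, None, 1] - vortex_xy[None, :, 1]
            r2 = dx**2 + dy**2 + core**2              # small core avoids the singularity
            u = np.sum(-gamma * dy / (2.0 * np.pi * r2), axis=1)
            v = np.sum( gamma * dx / (2.0 * np.pi * r2), axis=1)
            return np.stack([u, v], axis=1)

        pts = np.array([[0.0, 0.0], [1.0, 0.0]])
        vortices = np.array([[0.5, 0.5]])
        vel = induced_velocity(pts, vortices, gamma=np.array([1.0]))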

  18. Metric-Resolution 2D River Modeling at the Macroscale: Computational Methods and Applications in a Braided River

    NASA Astrophysics Data System (ADS)

    Schubert, Jochen; Monsen, Wade; Sanders, Brett

    2015-11-01

    Metric resolution digital terrain models (DTMs) of rivers now make it possible for multi-dimensional fluid mechanics models to be applied to characterize flow at fine scales that are relevant to studies of river morphology and ecological habitat, or microscales. These developments are important for managing rivers because of the potential to better understand system dynamics, anthropogenic impacts, and the consequences of proposed interventions. However, the data volumes and computational demands of microscale river modeling have largely constrained applications to small multiples of the channel width, or the mesoscale. This report presents computational methods to extend a microscale river model beyond the mesoscale to the macroscale, defined as large multiples of the channel width. A method of automated unstructured grid generation is presented that automatically clusters fine resolution cells in areas of curvature (e.g., channel banks), and places relatively coarse cells in areas lacking topographic variability. This overcomes the need to manually generate breaklines to constrain the grid, which is painstaking at the mesoscale and virtually impossible at the macroscale. The method is applied to a braided river with an extremely complex channel network configuration and shown to yield an efficient fine resolution model. The sensitivity of model output to grid design and resistance parameters is also examined as it relates to analysis of hydrology, hydraulic geometry and river habitats and the findings reiterate the importance of model calibration and validation.

  20. Computational Studies of Condensed Matter Systems: Manganese Vanadium Oxide and 2D attractive Hubbard model with spin-dependent disorder

    NASA Astrophysics Data System (ADS)

    Nanguneri, Ravindra

    -dependent disorder. Further, the finite temperature phase diagram for the 2D attractive fermion Hubbard model with spin-dependent disorder is also considered within BdG mean field theory. Three types of disorder are studied. In the first, only one species is coupled to a random site energy; in the second, the two species both move in random site energy landscapes which are of the same amplitude, but different realizations; and finally, in the third, the disorder is in the hopping rather than the site energy. For all three cases we find that, unlike the case of spin-symmetric randomness, where the energy gap and average order parameter do not vanish as the disorder strength increases, a critical disorder strength exists separating distinct phases. In fact, the energy gap and the average order parameter vanish at distinct transitions, V_c^gap and V_c^op, allowing for a gapless superconducting (gSC) phase. The gSC phase becomes smaller with increasing temperature, until it vanishes at a temperature T*.

  1. Local finite element enrichment strategies for 2D contact computations and a corresponding post-processing scheme

    NASA Astrophysics Data System (ADS)

    Sauer, Roger A.

    2013-08-01

    Recently an enriched contact finite element formulation has been developed that substantially increases the accuracy of contact computations while keeping the additional numerical effort to a minimum, as reported by Sauer (Int J Numer Meth Eng 87:593-616, 2011). Two enrichment strategies were proposed, one based on local p-refinement using Lagrange interpolation and one based on Hermite interpolation that produces C1-smoothness on the contact surface. Both classes, which were initially considered for the frictionless Signorini problem, are extended here to friction and contact between deformable bodies. For this, a symmetric contact formulation is used that allows the unbiased treatment of both contact partners. This paper also proposes a post-processing scheme for contact quantities such as the contact pressure. The scheme, which provides a more accurate representation than the raw data, is based on an averaging procedure inspired by mortar formulations. The properties of the enrichment strategies and the corresponding post-processing scheme are illustrated by several numerical examples considering sliding and peeling contact in the presence of large deformations.

  2. Documentation of computer program VS2D to solve the equations of fluid flow in variably saturated porous media

    USGS Publications Warehouse

    Lappala, E.G.; Healy, R.W.; Weeks, E.P.

    1987-01-01

    This report documents FORTRAN computer code for solving problems involving variably saturated single-phase flow in porous media. The flow equation is written with total hydraulic potential as the dependent variable, which allows straightforward treatment of both saturated and unsaturated conditions. The spatial derivatives in the flow equation are approximated by central differences, and time derivatives are approximated either by a fully implicit backward or by a centered-difference scheme. Nonlinear conductance and storage terms may be linearized using either an explicit method or an implicit Newton-Raphson method. Relative hydraulic conductivity is evaluated at cell boundaries by using either full upstream weighting, the arithmetic mean, or the geometric mean of values from adjacent cells. Nonlinear boundary conditions treated by the code include infiltration, evaporation, and seepage faces. Extraction by plant roots that is caused by atmospheric demand is included as a nonlinear sink term. These nonlinear boundary and sink terms are linearized implicitly. The code has been verified for several one-dimensional linear problems for which analytical solutions exist and against two nonlinear problems that have been simulated with other numerical models. A complete listing of data-entry requirements and data entry and results for three example problems are provided. (USGS)
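
    For orientation, the flow equation the report describes (with total hydraulic potential as the dependent variable) is the variably saturated, Richards-type equation; a standard textbook form in LaTeX notation, not transcribed from the report, is:

        \frac{\partial \theta(h)}{\partial t} = \nabla \cdot \left[ K(h) \, \nabla H \right] - q_s, \qquad H = h + z

    where θ is the volumetric moisture content, h the pressure head, K(h) the unsaturated hydraulic conductivity, H the total hydraulic potential, z the elevation head, and q_s a sink term such as root extraction.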

  3. Computer simulation of topological evolution in 2-d grain growth using a continuum diffuse-interface field model

    SciTech Connect

    Fan, D.; Geng, C.; Chen, L.Q.

    1997-03-01

    The local kinetics and topological phenomena during normal grain growth were studied in two dimensions by computer simulations employing a continuum diffuse-interface field model. The relationships between topological class and individual grain growth kinetics were examined and compared with results obtained previously from analytical theories, experiments, and Monte Carlo simulations. It was shown that both the grain-size and grain-shape (side) distributions are time-invariant, and the linear relationship between the mean radii of individual grains and topological class n was reproduced. The moments of the shape distribution were determined, and the differences among the data from soap froth, the Potts model, and the present simulation were discussed. In the limit where the grain size goes to zero, the average number of grain edges per grain is shown to be between 4 and 5, implying the direct vanishing of 4- and 5-sided grains, which seems to be consistent with recent experimental observations on thin films. Based on the simulation results, the conditions for the applicability of the familiar Mullins-von Neumann law and Hillert's equation were discussed.
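
    The Mullins-von Neumann law mentioned above ties the growth rate of a 2D grain solely to its number of sides n; in its usual form (a standard result quoted here for context, in LaTeX notation):

        \frac{dA_n}{dt} = \frac{\pi M \gamma}{3} \, (n - 6)

    where A_n is the area of an n-sided grain, M the boundary mobility, and γ the boundary energy, so grains with fewer than six sides shrink while those with more than six grow.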

  4. 2-D computer modeling of oil generation and migration in a Transect of the Eastern Venezuela Basin

    SciTech Connect

    Gallango, O. ); Parnaud, F. )

    1993-02-01

    The aim of the study was a two-dimensional computer simulation of the basin evolution based on available geological, geophysical, geochemical, geothermal, and hydrodynamic data, with the main purpose of determining the hydrocarbon generation and migration history. The modeling was done in two geological sections (platform and pre-thrusting) located along the Chacopata-Uverito Transect in the Eastern Venezuelan Basin. In the platform section, a hypothetical source rock equivalent to the Gyayuta Group was considered in order to simulate the migration of hydrocarbons. The thermal history reconstruction of this hypothetical source rock confirms that it does not reach the oil window before the middle Miocene and that the maturity in this sector is due to the sedimentation of the Freites, La Pica, and Mesa-Las Piedras formations. Oil expulsion and migration from this hypothetical source rock began after middle Miocene time. The expulsion of the hydrocarbons took place mainly along the Oligocene-Miocene reservoir and has not, at the present time, reached zones located beyond the Oritupano field, which implies that the oil accumulated in the southern part of the basin was generated by a source rock located to the north, in the present deformation zone. A north-to-south water migration pattern has been observed in this section since 17 m.y. ago. In the pre-thrusting section, hydrocarbon expulsion started during the early Tertiary and took place mainly toward the lower Cretaceous (El Cantil and Barranquim formations). At the end of the passive margin stage, the main migration occurred across the Merecure reservoir, through which the hydrocarbons migrated toward the Onado sector before the thrusting.

  5. A new 2D segmentation method based on dynamic programming applied to computer aided detection in mammography.

    PubMed

    Timp, Sheila; Karssemeijer, Nico

    2004-05-01

    Mass segmentation plays a crucial role in computer-aided diagnosis (CAD) systems for classification of suspicious regions as normal, benign, or malignant. In this article we present a robust and automated segmentation technique--based on dynamic programming--to segment mass lesions from surrounding tissue. In addition, we propose an efficient algorithm to guarantee resulting contours to be closed. The segmentation method based on dynamic programming was quantitatively compared with two other automated segmentation methods (region growing and the discrete contour model) on a dataset of 1210 masses. For each mass an overlap criterion was calculated to determine the similarity with manual segmentation. The mean overlap percentage for dynamic programming was 0.69, for the other two methods 0.60 and 0.59, respectively. The difference in overlap percentage was statistically significant. To study the influence of the segmentation method on the performance of a CAD system two additional experiments were carried out. The first experiment studied the detection performance of the CAD system for the different segmentation methods. Free-response receiver operating characteristics analysis showed that the detection performance was nearly identical for the three segmentation methods. In the second experiment the ability of the classifier to discriminate between malignant and benign lesions was studied. For region based evaluation the area Az under the receiver operating characteristics curve was 0.74 for dynamic programming, 0.72 for the discrete contour model, and 0.67 for region growing. The difference in Az values obtained by the dynamic programming method and region growing was statistically significant. The differences between other methods were not significant.
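
    The essence of dynamic-programming segmentation is a minimum-cost path search through a cost image (for mass segmentation, typically a function of gradient strength in a polar representation around the lesion); the bare-bones column-to-column search below is an illustrative sketch, not the authors' implementation:

        # Minimal column-wise dynamic programming for an optimal contour path (illustrative).
        import numpy as np

        def dp_min_cost_path(cost):
            """Left-to-right minimum-cost path through 'cost', moving at most one row per column."""
            rows, cols = cost.shape
            acc = cost.copy()
            back = np.zeros((rows, cols), dtype=int)
            for j in range(1, cols):
                for i in range(rows):
                    lo, hi = max(0, i - 1), min(rows, i + 2)
                    k = lo + int(np.argmin(acc[lo:hi, j - 1]))
                    acc[i, j] = cost[i, j] + acc[k, j - 1]
                    back[i, j] = k
            path = [int(np.argmin(acc[:, -1]))]
            for j in range(cols - 1, 0, -1):
                path.append(back[path[-1], j])
            return path[::-1]  # row index (e.g. radius) for each column (e.g. angle)

        contour = dp_min_cost_path(np.random.rand(40, 72))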

  6. Electroencephalography (EEG)-based brain-computer interface (BCI): a 2-D virtual wheelchair control based on event-related desynchronization/synchronization and state control.

    PubMed

    Huang, Dandan; Qian, Kai; Fei, Ding-Yu; Jia, Wenchuan; Chen, Xuedong; Bai, Ou

    2012-05-01

    This study aims to propose an effective and practical paradigm for a brain-computer interface (BCI)-based 2-D virtual wheelchair control. The paradigm was based on the multi-class discrimination of spatiotemporally distinguishable phenomena of event-related desynchronization/synchronization (ERD/ERS) in electroencephalogram signals associated with motor execution/imagery of right/left hand movement. Compared with the traditional method using ERD only, where bilateral ERDs appear during left/right hand mental tasks, the 2-D control exhibited high accuracy within a short time, as incorporating ERS into the paradigm hypothetically enhanced the spatiotemporal feature contrast of ERS versus ERD. We also expected users to experience ease of control by including a noncontrol state. In this study, the control command was sent discretely whereas the virtual wheelchair was moving continuously. We tested five healthy subjects in a single visit with two sessions, i.e., motor execution and motor imagery. Each session included a 20 min calibration and two sets of games that were less than 30 min. Average target hit rate was as high as 98.4% with motor imagery. Every subject achieved a 100% hit rate in the second set of wheelchair control games. The average time to hit a target 10 m away was about 59 s, with 39 s for the best set. The superior control performance in subjects without intensive BCI training suggested a practical wheelchair control paradigm for BCI users. PMID:22498703

  8. Microplate based biosensing with a computer screen aided technique.

    PubMed

    Filippini, Daniel; Andersson, Tony P M; Svensson, Samuel P S; Lundström, Ingemar

    2003-10-30

    Melanophores, dark pigment cells from the frog Xenopus laevis, have the ability to change light absorbance upon stimulation by different biological agents. Hormone exposure (e.g. melatonin or alpha-melanocyte stimulating hormone) has been used here as a reversible stimulus to test a new compact microplate reading platform. As an application, the detection of the asthma drug formoterol in blood plasma samples is demonstrated. The present system utilizes a computer screen as a (programmable) large area light source, and a standard web camera as recording media enabling even kinetic microplate reading with a versatile and broadly available platform, which suffices to evaluate numerous bioassays. Especially in the context of point of care testing or self testing applications these possibilities become advantageous compared with highly dedicated comparatively expensive commercial systems. PMID:14558996

  9. CUDA programs for the GPU computing of the Swendsen-Wang multi-cluster spin flip algorithm: 2D and 3D Ising, Potts, and XY models

    NASA Astrophysics Data System (ADS)

    Komura, Yukihiro; Okabe, Yutaka

    2014-03-01

    We present sample CUDA programs for the GPU computing of the Swendsen-Wang multi-cluster spin flip algorithm. We deal with the classical spin models; the Ising model, the q-state Potts model, and the classical XY model. As for the lattice, both the 2D (square) lattice and the 3D (simple cubic) lattice are treated. We already reported the idea of the GPU implementation for 2D models (Komura and Okabe, 2012). We here explain the details of sample programs, and discuss the performance of the present GPU implementation for the 3D Ising and XY models. We also show the calculated results of the moment ratio for these models, and discuss phase transitions. Catalogue identifier: AERM_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AERM_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 5632 No. of bytes in distributed program, including test data, etc.: 14688 Distribution format: tar.gz Programming language: C, CUDA. Computer: System with an NVIDIA CUDA enabled GPU. Operating system: System with an NVIDIA CUDA enabled GPU. Classification: 23. External routines: NVIDIA CUDA Toolkit 3.0 or newer Nature of problem: Monte Carlo simulation of classical spin systems. Ising, q-state Potts model, and the classical XY model are treated for both two-dimensional and three-dimensional lattices. Solution method: GPU-based Swendsen-Wang multi-cluster spin flip Monte Carlo method. The CUDA implementation for the cluster-labeling is based on the work by Hawick et al. [1] and that by Kalentev et al. [2]. Restrictions: The system size is limited depending on the memory of a GPU. Running time: For the parameters used in the sample programs, it takes about a minute for each program. Of course, it depends on the system size, the number of Monte Carlo steps, etc. References: [1] K
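
    To make the algorithm concrete, the following is a compact serial Python sketch of one Swendsen-Wang sweep for the 2D Ising model using union-find cluster labeling; it only illustrates the method that the distributed CUDA programs implement and is not derived from them:

        # One Swendsen-Wang sweep for the 2D Ising model (serial illustration, not the CUDA code).
        import numpy as np

        def sw_sweep(spins, beta, J=1.0, rng=None):
            rng = rng if rng is not None else np.random.default_rng()
            L = spins.shape[0]
            p = 1.0 - np.exp(-2.0 * beta * J)          # bond activation probability
            parent = np.arange(L * L)

            def find(i):
                while parent[i] != i:
                    parent[i] = parent[parent[i]]
                    i = parent[i]
                return i

            def union(a, b):
                ra, rb = find(a), find(b)
                if ra != rb:
                    parent[rb] = ra

            for x in range(L):
                for y in range(L):
                    i = x * L + y
                    for dx, dy in ((1, 0), (0, 1)):    # bonds to right and down (periodic)
                        xn, yn = (x + dx) % L, (y + dy) % L
                        if spins[x, y] == spins[xn, yn] and rng.random() < p:
                            union(i, xn * L + yn)

            # Flip each cluster independently with probability 1/2.
            flip = {r: rng.random() < 0.5 for r in set(find(i) for i in range(L * L))}
            for x in range(L):
                for y in range(L):
                    if flip[find(x * L + y)]:
                        spins[x, y] *= -1
            return spins

        spins = np.random.default_rng(0).choice([-1, 1], size=(16, 16))
        spins = sw_sweep(spins, beta=0.44)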

  10. Comparative inhibitory potential of selected dietary bioactive polyphenols, phytosterols on CYP3A4 and CYP2D6 with fluorometric high-throughput screening.

    PubMed

    Vijayakumar, Thangavel Mahalingam; Kumar, Ramasamy Mohan; Agrawal, Aruna; Dubey, Govind Prasad; Ilango, Kaliappan

    2015-07-01

    Cytochrome P450 (CYP450) inhibition by the bioactive molecules of dietary supplements or herbal products leads to greater potential for toxicity of co-administered drugs. The present study aimed to compare the inhibitory potential of selected common dietary bioactive molecules (gallic acid, ellagic acid, β-sitosterol, stigmasterol, quercetin and rutin) on CYP3A4 and CYP2D6, in order to assess safety through their inhibitory potency and to predict the interaction potential with co-administered drugs. A CYP450-CO complex assay was carried out for all the selected dietary bioactive molecules in isolated rat microsomes. The CYP450 concentration of the rat liver microsomes was found to be 0.474 nmol/mg protein; quercetin in DMSO showed the maximum inhibition of CYP450 (51.02 ± 1.24 %), but less than the positive control (79.02 ± 1.61 %). In the high-throughput fluorometric assay, the IC50 values of quercetin (49.08 ± 1.02-54.36 ± 0.85 μg/ml) and gallic acid (78.46 ± 1.32-83.84 ± 1.06 μg/ml) were lower than those of the other bioactive compounds on CYP3A4 and CYP2D6, respectively, but higher than the positive controls (06.28 ± 1.76-07.74 ± 1.32 μg/ml). Given the in vitro inhibitory potential on CYP3A4 and CYP2D6, unrestricted consumption of foods, herbal products, or dietary supplements containing quercetin and gallic acid should be carefully considered when drugs with a narrow therapeutic index are administered together. PMID:26139922

  11. A 2D-Computer Model of Atrial Tissue Based on Histographs Describes the Electro-Anatomical Impact of Microstructure on Endocardiac Potentials and Electric Near-Fields

    PubMed Central

    Campos, Fernando O.; Wiener, Thomas; Prassl, Anton J.; Ahammer, Helmut; Plank, Gernot; dos Santos, Rodrigo Weber; Sánchez-Quintana, Damián; Hofer, Ernst

    2014-01-01

    In experiments with cardiac tissue, local conduction is described by waveform analysis of the time derivative of the extracellular potential (dΦe/dt) and by the loop morphology of the near-field strength E (the components of the electric field parallel and very close to the tissue surface). The question arises whether the features of these signals can be used to quantify the degree of fibrosis in the heart. A computer model allows us to study the behavior of electric signals at the endocardium with respect to known configurations of microstructure which cannot be detected during the electrophysiological experiments. This work presents a 2D computer model with sub-cellular resolution of atrial micro-conduction in the rabbit heart. It is based on the monodomain equations and digitized histographs from tissue slices obtained post-experimentum. It could be shown that excitation spread in densely coupled regions produces uniform and anisotropic conduction. In contrast, zones with parallel fibers separated by uncoupling interstitial space or connective tissue may show uniform or complex signals depending on the pacing site. These results suggest that the analysis of dΦe/dt and E combined with multi-site pacing could be used to characterize the type and the size of fibrosis. PMID:21096441

  12. Icarus: A 2D direct simulation Monte Carlo (DSMC) code for parallel computers. User's manual - V.3.0

    SciTech Connect

    Bartel, T.; Plimpton, S.; Johannes, J.; Payne, J.

    1996-10-01

    Icarus is a 2D Direct Simulation Monte Carlo (DSMC) code which has been optimized for the parallel computing environment. The code is based on the DSMC method of Bird and models from free-molecular to continuum flowfields in either cartesian (x, y) or axisymmetric (z, r) coordinates. Computational particles, representing a given number of molecules or atoms, are tracked as they have collisions with other particles or surfaces. Multiple species, internal energy modes (rotation and vibration), chemistry, and ion transport are modelled. A new trace species methodology for collisions and chemistry is used to obtain statistics for small species concentrations. Gas phase chemistry is modelled using steric factors derived from Arrhenius reaction rates. Surface chemistry is modelled with surface reaction probabilities. The electron number density is either a fixed external generated field or determined using a local charge neutrality assumption. Ion chemistry is modelled with electron impact chemistry rates and charge exchange reactions. Coulomb collision cross-sections are used instead of Variable Hard Sphere values for ion-ion interactions. The electrostatic fields can either be externally input or internally generated using a Langmuir-Tonks model. The Icarus software package includes the grid generation, parallel processor decomposition, postprocessing, and restart software. The commercial graphics package, Tecplot, is used for graphics display. The majority of the software packages are written in standard Fortran.

  13. Combination of transient 2D-IR experiments and ab initio computations sheds light on the formation of the charge-transfer state in photoexcited carbonyl carotenoids.

    PubMed

    Di Donato, Mariangela; Segado Centellas, Mireia; Lapini, Andrea; Lima, Manuela; Avila, Francisco; Santoro, Fabrizio; Cappelli, Chiara; Righini, Roberto

    2014-08-14

    The excited state dynamics of carbonyl carotenoids is very complex because of the coupling of single- and doubly excited states and the possible involvement of intramolecular charge-transfer (ICT) states. In this contribution we employ ultrafast infrared spectroscopy and theoretical computations to investigate the relaxation dynamics of trans-8'-apo-β-carotenal occurring on the picosecond time scale, after excitation in the S2 state. In a (slightly) polar solvent like chloroform, one-dimensional (T1D-IR) and two-dimensional (T2D-IR) transient infrared spectroscopy reveal spectral components with characteristic frequencies and lifetimes that are not observed in nonpolar solvents (cyclohexane). Combining experimental evidence with an analysis of CASPT2//CASSCF ground and excited state minima and energy profiles, complemented with TDDFT calculations in gas phase and in solvent, we propose a photochemical decay mechanism for this system where only the bright single-excited 1Bu(+) and the dark double-excited 2Ag(-) states are involved. Specifically, the initially populated 1Bu(+) relaxes toward 2Ag(-) in 200 fs. In a nonpolar solvent 2Ag(-) decays to the ground state (GS) in 25 ps. In polar solvents, distortions along twisting modes of the chain promote a repopulation of the 1Bu(+) state which then quickly relaxes to the GS (18 ps in chloroform). The 1Bu(+) state has a high electric dipole and is the main contributor to the charge-transfer state involved in the dynamics in polar solvents. The 2Ag(-) → 1Bu(+) population transfer is evidenced by a cross peak on the T2D-IR map revealing that the motions along the same stretching of the conjugated chain on the 2Ag(-) and 1Bu(+) states are coupled.

  14. 2D and 3D Traveling Salesman Problem

    ERIC Educational Resources Information Center

    Haxhimusa, Yll; Carpenter, Edward; Catrambone, Joseph; Foldes, David; Stefanov, Emil; Arns, Laura; Pizlo, Zygmunt

    2011-01-01

    When a two-dimensional (2D) traveling salesman problem (TSP) is presented on a computer screen, human subjects can produce near-optimal tours in linear time. In this study we tested human performance on a real and virtual floor, as well as in a three-dimensional (3D) virtual space. Human performance on the real floor is as good as that on a…

  15. Screening for Diabetic Retinopathy Using Computer Vision and Physiological Markers

    PubMed Central

    Hann, Christopher E.; Revie, James A.; Hewett, Darren; Chase, J. Geoffrey; Shaw, Geoffrey M.

    2009-01-01

    Background Hyperglycemia and diabetes result in vascular complications, most notably diabetic retinopathy (DR). The prevalence of DR is growing and is a leading cause of blindness and/or visual impairment in developed countries. Current methods of detecting, screening, and monitoring DR are based on subjective human evaluation, which is also slow and time-consuming. As a result, initiation and progress monitoring of DR is clinically hard. Methods Computer vision methods are developed to isolate and detect two of the most common DR dysfunctions—dot hemorrhages (DH) and exudates. The algorithms use specific color channels and segmentation methods to separate these DR manifestations from physiological features in digital fundus images. The algorithms are tested on the first 100 images from a published database. The diagnostic outcome and the resulting positive and negative prediction values (PPV and NPV) are reported. The first 50 images are marked with specialist determined ground truth for each individual exudate and/or DH, which are also compared to algorithm identification. Results Exudate identification had 96.7% sensitivity and 94.9% specificity for diagnosis (PPV = 97%, NPV = 95%). Dot hemorrhage identification had 98.7% sensitivity and 100% specificity (PPV = 100%, NPV = 96%). Greater than 95% of ground truth identified exudates, and DHs were found by the algorithm in the marked first 50 images, with less than 0.5% false positives. Conclusions A direct computer vision approach enabled high-quality identification of exudates and DHs in an independent data set of fundus images. The methods are readily generalizable to other clinical manifestations of DR. The results justify a blinded clinical trial of the system to prove its capability to detect, diagnose, and, over the long term, monitor the state of DR in individuals with diabetes. PMID:20144333
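
    The reported sensitivity, specificity, PPV, and NPV are tied together by ordinary confusion-matrix arithmetic; the small helper below (with hypothetical counts, not the study's tallies) makes the relationships explicit:

        # Confusion-matrix arithmetic behind sensitivity/specificity/PPV/NPV (illustrative).
        def screening_metrics(tp, fp, tn, fn):
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            ppv = tp / (tp + fp)
            npv = tn / (tn + fn)
            return sensitivity, specificity, ppv, npv

        # Hypothetical counts for a 100-image test set (not the study's actual tallies).
        print(screening_metrics(tp=58, fp=2, tn=37, fn=3))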

  16. Short interfering RNA guide strand modifiers from computational screening.

    PubMed

    Onizuka, Kazumitsu; Harrison, Jason G; Ball-Jones, Alexi A; Ibarra-Soza, José M; Zheng, Yuxuan; Ly, Diana; Lam, Walter; Mac, Stephanie; Tantillo, Dean J; Beal, Peter A

    2013-11-13

    Short interfering RNAs (siRNAs) are promising drug candidates for a wide range of targets including those previously considered "undruggable". However, properties associated with the native RNA structure limit drug development, and chemical modifications are necessary. Here we describe the structure-guided discovery of functional modifications for the guide strand 5'-end using computational screening with the high-resolution structure of human Ago2, the key nuclease on the RNA interference pathway. Our results indicate the guide strand 5'-end nucleotide need not engage in Watson-Crick (W/C) H-bonding but must fit the general shape of the 5'-end binding site in MID/PIWI domains of hAgo2 for efficient knockdown. 1,2,3-Triazol-4-yl bases formed from the CuAAC reaction of azides and 1-ethynylribose, which is readily incorporated into RNA via the phosphoramidite, perform well at the guide strand 5'-end. In contrast, purine derivatives with modified Hoogsteen faces or N2 substituents are poor choices for 5'-end modifications. Finally, we identified a 1,2,3-triazol-4-yl base incapable of W/C H-bonding that performs well at guide strand position 12, where base pairing to target was expected to be important. This work expands the repertoire of functional nucleotide analogues for siRNAs. PMID:24152142

  17. Development of a non-denaturing 2D gel electrophoresis protocol for screening in vivo uranium-protein targets in Procambarus clarkii with laser ablation ICP MS followed by protein identification by HPLC-Orbitrap MS.

    PubMed

    Xu, Ming; Frelon, Sandrine; Simon, Olivier; Lobinski, Ryszard; Mounicou, Sandra

    2014-10-01

    Limited knowledge about in vivo non-covalent uranium (U)-protein complexes is largely due to the lack of appropriate analytical methodology. Here, a method for screening and identifying the molecular targets of U was developed. The approach was based on non-denaturing 1D and 2D gel electrophoresis (ND-PAGE and ND-2D-PAGE (using ND-IEF as first dimension previously described)) in conjunction with laser ablation inductively coupled plasma mass spectrometry (LA-ICP MS) for the detection of U-containing proteins. The proteins were then identified by µbore HPLC-Orbitrap MS/MS. The method was applied to the analysis of cytosol of hepatopancreas (HP) of a model U-bioaccumulating organism (Procambarus clarkii). The imaging of uranium in 2D gels revealed the presence of 11 U-containing protein spots. Six protein candidates (i.e. ferritin, glyceraldehyde-3-phosphate dehydrogenase, triosephosphate isomerase, cytosolic manganese superoxide dismutase (Mn-SOD), glutathione S transferase D1 and H3 histone family protein) were then identified by matching with the data base of crustacea Decapoda species (e.g. crayfish). Among them, ferritin was the most important one. This strategy is expected to provide an insight into U toxicology and metabolism. PMID:25059147

  18. VIBA-Lab 3.0: Computer program for simulation and semi-quantitative analysis of PIXE and RBS spectra and 2D elemental maps

    NASA Astrophysics Data System (ADS)

    Orlić, Ivica; Mekterović, Darko; Mekterović, Igor; Ivošević, Tatjana

    2015-11-01

    VIBA-Lab is a computer program originally developed by the author and co-workers at the National University of Singapore (NUS) as an interactive software package for simulation of Particle Induced X-ray Emission and Rutherford Backscattering spectra. The original program has been redeveloped into VIBA-Lab 3.0, in which the user can perform semi-quantitative analysis by comparing simulated and measured spectra, as well as simulate 2D elemental maps for a given 3D sample composition. The latest version has a new and more versatile user interface. It also has the latest data set of fundamental parameters, such as Coster-Kronig transition rates, fluorescence yields, mass absorption coefficients, and ionization cross sections for K and L lines, over a wider energy range than the original program. Our short-term plan is to introduce a routine for quantitative analysis of multiple PIXE and XRF excitations. VIBA-Lab is an excellent teaching tool for students and researchers learning PIXE and RBS techniques. At the same time, the program helps when planning an experiment and optimizing experimental parameters such as incident ions, their energy, detector specifications, filters, geometry, etc. By "running" a virtual experiment the user can test various scenarios until optimal PIXE and BS spectra are obtained, and in this way save a lot of expensive machine time.

  19. Elastic Deformations in 2D van der waals Heterostructures and their Impact on Optoelectronic Properties: Predictions from a Multiscale Computational Approach

    PubMed Central

    Kumar, Hemant; Er, Dequan; Dong, Liang; Li, Junwen; Shenoy, Vivek B.

    2015-01-01

    Recent technological advances in the isolation and transfer of different 2-dimensional (2D) materials have led to renewed interest in stacked van der Waals (vdW) heterostructures. Interlayer interactions and lattice mismatch between two different monolayers cause elastic strains, which significantly affect their electronic properties. Using a multiscale computational method, we demonstrate that significant in-plane strains and out-of-plane displacements are introduced in three different bilayer structures, namely graphene-hBN, MoS2-WS2 and MoSe2-WSe2, due to interlayer interactions, which can cause bandgap changes of up to ~300 meV. Furthermore, the magnitude of the elastic deformations can be controlled by changing the relative rotation angle between the two layers. The magnitude of the out-of-plane displacements in graphene agrees well with that observed in experiments and can explain the experimentally observed bandgap opening in graphene. Upon increasing the relative rotation angle between the two lattices from 0° to 10°, the magnitude of the out-of-plane displacements decreases, while the in-plane strain peaks when the angle is ~6°. For large misorientation angles (>10°), the out-of-plane displacements become negligible. We further predict the deformation fields for MoS2-WS2 and MoSe2-WSe2 heterostructures that have recently been synthesized experimentally and estimate the effect of these deformation fields on near-gap states. PMID:26076932

  20. Image fusion of Ultrasound Computer Tomography volumes with X-ray mammograms using a biomechanical model based 2D/3D registration.

    PubMed

    Hopp, T; Duric, N; Ruiter, N V

    2015-03-01

    Ultrasound Computer Tomography (USCT) is a promising breast imaging modality under development. Comparison to a standard method like mammography is essential for further development. Due to significant differences in image dimensionality and in the compression state of the breast, correlating USCT images and X-ray mammograms is challenging. In this paper we present a 2D/3D registration method to improve the spatial correspondence and allow direct comparison of the images. It is based on biomechanical modeling of the breast and simulation of the mammographic compression. We investigate the effect of including patient-specific material parameters estimated automatically from USCT images. The method was systematically evaluated using numerical phantoms and in-vivo data. The average accuracy of the automated registration was 11.9 mm. Based on the registered images, a method for analyzing the diagnostic value of the USCT images was developed and initially applied to sound speed and attenuation images, using X-ray mammograms as ground truth. Combining sound speed and attenuation allows lesions to be differentiated from surrounding tissue. Overlaying this information on mammograms combines quantitative and morphological information for multimodal diagnosis. PMID:25456144

  1. 2D-DIGE screening of high-productive CHO cells under glucose limitation--basic changes in the proteome equipment and hints for epigenetic effects.

    PubMed

    Wingens, Marc; Gätgens, Jochem; Schmidt, Anica; Albaum, Stefan P; Büntemeyer, Heino; Noll, Thomas; Hoffrogge, Raimund

    2015-05-10

    Chinese hamster ovary (CHO) derivatives are among the most important mammalian cells for industrial recombinant protein production. Many efforts have been made to improve the productivity and stability of CHO cells in bioreactor processes. Here, we followed up one barely understood phenomenon observed with process optimizations: a significantly increased cell-specific productivity in late phases of glucose-limited perfusion cultivations, when glucose (and lactate) reserves are exhausted. Our aim was to elucidate the cellular activities connected to the metabolic shift from the glucose surplus to the glucose limitation phase. With 2D-DIGE, we compared three stages in a perfusion culture of CHO cells: the initial growth phase with high glucose concentration and low lactate production, the second phase with glucose approaching limitation and a high lactate level, and finally the state of glucose limitation with a low lactate concentration but increased cell-specific productivity. With our proteomic approach we were able to demonstrate consequences of glucose limitation for the protein expression machinery that could also play a role in higher recombinant protein production. Most interestingly, we detected epigenetic effects at the level of proteins involved in histone modification (HDAC1/-2, SET, RBBP7, DDX5). Together with shifts in the protein inventory of energy metabolism, the cytoskeleton, and protein expression, a picture emerges of basic changes in the cellular equipment under long-term glucose limitation of CHO cells.

  2. Visibility of microcalcifications in computed and screen-film mammography

    NASA Astrophysics Data System (ADS)

    Cowen, Arnold R.; Launders, Jason H.; Jadav, Mark; Brettle, David S.

    1997-08-01

    Due to the clinically and technically demanding nature of breast x-ray imaging, mammography still remains one of the few essentially film-based radiological imaging techniques in modern medical imaging. There are a range of possible benefits available if a practical and economical direct digital imaging technique can be introduced to routine clinical practice. There has been much debate regarding the minimum specification required for direct digital acquisition. One such direct digital system available is computed radiography (CR), which has a modest specification when compared with modern screen-film mammography (SFM) systems. This paper details two psychophysical studies in which the detection of simulated microcalcifications with CR has been directly compared to that with SFM. The first study found that under scatter-free conditions the minimum detectable size of microcalcification was approximately for both SFM and CR. The second study found that SFM had a 4.6% higher probability of observers being able to correctly identify the shape of diameter test details; there was no significant difference for either larger or smaller test details. From the results of these studies it has been demonstrated that the modest specification of CR, in terms of limiting resolution, does not translate into a dramatic difference in the perception of details at the limit of detectability. When judging the imaging performance of a system it is more important to compare the signal-to-noise ratio transfer spectrum characteristics, rather than simply the modulation transfer function.

  3. Can anti-migratory drugs be screened in vitro? A review of 2D and 3D assays for the quantitative analysis of cell migration.

    PubMed

    Decaestecker, Christine; Debeir, Olivier; Van Ham, Philippe; Kiss, Robert

    2007-03-01

    The aim of the present review is to detail and analyze the pros and cons of in vitro tests available to quantify the anti-migratory effects of anti-cancer drugs for their eventual use in combating the dispersal of tumor cells, a clinical need which currently remains unsatisfied. We therefore briefly sum up why anti-migratory drugs constitute a promising approach in oncology while at the same time emphasizing that migrating cancer cells are resistant to apoptosis. To analyze the pros and cons of the various in vitro tests under review we also briefly sum up the molecular and cellular stages of cancer cell migration, an approach that enables us to argue both that no single in vitro test is sufficient to characterize the anti-migratory potential of a drug and that standardization is needed for the efficient quantitative analysis of cell locomotion in a 3D environment. Before concluding our review we devote the final two parts (i) to the description of new prototypes which, in the near future, could enter the screening process with a view to identifying novel anti-migratory compounds, and (ii) to the anti-migratory compounds currently developed against cancer, with particular emphasis on how these compounds were selected before entering the clinical trial phase.

  4. Assessing the benefits and harms of low-dose computed tomography screening for lung cancer

    PubMed Central

    Pinsky, Paul F

    2015-01-01

    Summary The concept of using low-dose computed tomography (LDCT) for lung cancer screening goes back almost 25 years. In 2011, the National Lung Screening Trial (NLST) reported that LDCT screening significantly reduced mortality from lung cancer in a high risk population. This article evaluates the benefits and harms of LDCT screening, based largely on evidence from randomized trials. Harms include false-positive screens and resultant diagnostic procedures, overdiagnosed cancers, and radiation exposure. Benefits can be expressed as the number needed to be screened to prevent one lung cancer death or as estimated overall reductions in lung cancer mortality assuming LDCT population screening as recommended by guidelines. Indirect metrics of benefit, such as lung cancer survival and stage distribution, as well as measures of harms, will be important to monitor in the future as LDCT screening disseminates in the population. PMID:26617677

  5. GRID2D/3D: A computer program for generating grid systems in complex-shaped two- and three-dimensional spatial domains. Part 2: User's manual and program listing

    NASA Technical Reports Server (NTRS)

    Bailey, R. T.; Shih, T. I.-P.; Nguyen, H. L.; Roelke, R. J.

    1990-01-01

    An efficient computer program, called GRID2D/3D, was developed to generate single and composite grid systems within geometrically complex two- and three-dimensional (2- and 3-D) spatial domains that can deform with time. GRID2D/3D generates single grid systems by using algebraic grid generation methods based on transfinite interpolation, in which the distribution of grid points within the spatial domain is controlled by stretching functions. All single grid systems generated by GRID2D/3D can have grid lines that are continuous and differentiable everywhere up to second order. Also, grid lines can intersect boundaries of the spatial domain orthogonally. GRID2D/3D generates composite grid systems by patching together two or more single grid systems. The patching can be discontinuous or continuous. For continuous composite grid systems, the grid lines are continuous and differentiable everywhere up to second order, except at interfaces where different single grid systems meet, where the grid lines are differentiable only up to first order. For 2-D spatial domains, the boundary curves are described by using either cubic or tension spline interpolation. For 3-D spatial domains, the boundary surfaces are described by using either linear Coons interpolation, bi-hyperbolic spline interpolation, or a new technique referred to as 3-D bi-directional Hermite interpolation. Since grid systems generated by algebraic methods can have grid lines that overlap one another, GRID2D/3D contains a graphics package for evaluating the grid systems generated. With the graphics package, the user can generate grid systems in an interactive manner with the grid generation part of GRID2D/3D. GRID2D/3D is written in FORTRAN 77 and can be run on any IBM PC, XT, or AT compatible computer. In order to use GRID2D/3D on workstations or mainframe computers, some minor modifications must be made in the graphics part of the program; no
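
    The transfinite-interpolation idea underlying algebraic grid generation can be illustrated with a short sketch. The Python fragment below is not GRID2D/3D itself (which is FORTRAN 77); the function name and example boundary curves are invented for illustration. It builds a single 2D grid from four boundary curves by Coons-style blending; stretching functions would simply remap the xi and eta parameters before blending.

```python
import numpy as np

def transfinite_grid(bottom, top, left, right):
    """Single algebraic grid by 2D transfinite (Coons) interpolation.

    bottom, top : (ni, 2) boundary curves traversed along the xi direction
    left, right : (nj, 2) boundary curves traversed along the eta direction
    Returns an (ni, nj, 2) array of grid-point coordinates.
    """
    ni, nj = bottom.shape[0], left.shape[0]
    xi = np.linspace(0.0, 1.0, ni)[:, None, None]    # (ni, 1, 1)
    eta = np.linspace(0.0, 1.0, nj)[None, :, None]   # (1, nj, 1)
    b, t = bottom[:, None, :], top[:, None, :]       # (ni, 1, 2)
    l, r = left[None, :, :], right[None, :, :]       # (1, nj, 2)
    # Blend opposite boundaries, then subtract the doubly counted corner terms.
    return ((1 - eta) * b + eta * t + (1 - xi) * l + xi * r
            - ((1 - xi) * (1 - eta) * bottom[0] + xi * (1 - eta) * bottom[-1]
               + (1 - xi) * eta * top[0] + xi * eta * top[-1]))

# Example: unit square whose top boundary bulges sinusoidally.
s = np.linspace(0.0, 1.0, 21)
bottom = np.stack([s, np.zeros_like(s)], axis=1)
top = np.stack([s, 1.0 + 0.1 * np.sin(np.pi * s)], axis=1)
left = np.stack([np.zeros_like(s), s], axis=1)
right = np.stack([np.ones_like(s), s], axis=1)
grid = transfinite_grid(bottom, top, left, right)    # shape (21, 21, 2)
```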

  6. Automatic multimodal 2D/3D image fusion of ultrasound computer tomography and x-ray mammography for breast cancer diagnosis

    NASA Astrophysics Data System (ADS)

    Hopp, Torsten; Duric, Neb; Ruiter, Nicole V.

    2012-03-01

    Breast cancer is the most common cancer among women. The established screening method to detect breast cancer at an early stage is X-ray mammography. However, X-ray frequently provides limited contrast for tumors located within glandular tissue. A new imaging approach is Ultrasound Computer Tomography (USCT), which generates three-dimensional volumes of the breast. Three different images are available: reflectivity, attenuation and speed of sound. The correlation of USCT volumes with X-ray mammograms is of interest for the evaluation of the new imaging modality as well as for multimodal diagnosis. Yet, both modalities differ in image dimensionality, patient positioning and deformation state of the breast. In earlier work we proposed a methodology based on the Finite Element Method to register speed-of-sound images with the corresponding mammogram. In this work, we enhanced the methodology to register all three image types provided by USCT. Furthermore, the methodology is now completely automated, using image similarity measures to estimate rotations in the datasets. A fusion methodology is proposed which combines the information of the three USCT image types with the X-ray mammogram via semitransparent overlay images. The evaluation was done using 13 datasets from a clinical study. The registration accuracy was measured by the displacement of the center of a lesion marked in both modalities. Using the automated rotation estimation, a mean displacement of 10.4 mm was achieved. Given this clinically relevant registration accuracy, the methodology provides a basis for the evaluation of the new imaging device USCT as well as for multimodal diagnosis.
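
    As a rough illustration of the semitransparent overlay idea (not the authors' implementation; the colour coding and alpha value below are arbitrary choices), a registered USCT slice can be blended onto the mammogram along these lines:

```python
import numpy as np

def fuse_overlay(mammogram, usct_slice, alpha=0.4):
    """Semitransparent overlay of a registered USCT slice on a mammogram.

    Both inputs are 2D arrays already registered to the same pixel grid;
    alpha controls how strongly the colour-coded USCT layer shows through.
    """
    mam = (mammogram - mammogram.min()) / (np.ptp(mammogram) + 1e-12)
    usct = (usct_slice - usct_slice.min()) / (np.ptp(usct_slice) + 1e-12)
    base = np.stack([mam, mam, mam], axis=-1)                          # greyscale mammogram
    tint = np.stack([usct, np.zeros_like(usct), 1.0 - usct], axis=-1)  # colour-coded USCT
    return (1.0 - alpha) * base + alpha * tint                         # RGB fusion image

# Toy example with random images standing in for registered data.
rng = np.random.default_rng(0)
fused = fuse_overlay(rng.random((256, 256)), rng.random((256, 256)))
```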

  7. The Use of Geometric Properties of 2D Arrays across Development

    ERIC Educational Resources Information Center

    Gibson, Brett M.; Leichtman, Michelle D.; Costa, Rachel; Bemis, Rhyannon

    2009-01-01

    Four- to 10-year-old children (n = 50) participated in a 2D search task that included geometry (with- and without lines) and feature conditions. During each of 27 trials, participants watched as a cartoon character hid behind one of three landmarks arranged in a triangle on a computer screen. During feature condition trials, participants could use…

  8. The New Screen Time: Computers, Tablets, and Smartphones Enter the Equation

    ERIC Educational Resources Information Center

    Wiles, Bradford B.; Schachtner, Laura; Pentz, Julie L.

    2016-01-01

    Emerging technologies attract children and push parents' and caregivers' abilities to attend to their families. This article presents recommendations related to the new version of screen time, which includes time with computers, tablets, and smartphones. Recommendations are provided for screen time for very young children and those in middle and…

  9. DockScreen: A database of in silico biomolecular interactions to support computational toxicology

    EPA Science Inventory

    We have developed DockScreen, a database of in silico biomolecular interactions designed to enable rational molecular toxicological insight within a computational toxicology framework. This database is composed of chemical/target (receptor and enzyme) binding scores calculated by...

  10. COMPUTATIONAL TOXICOLOGY - OBJECTIVE 2: DEVELOPING APPROACHES FOR PRIORITIZING CHEMICALS FOR SUBSEQUENT SCREENING AND TESTING

    EPA Science Inventory

    One of the strategic objectives of the Computational Toxicology Program is to develop approaches for prioritizing chemicals for subsequent screening and testing. Approaches currently available for this process require extensive resources. Therefore, less costly and time-extensi...

  11. High-throughput screening, predictive modeling and computational embryology - Abstract

    EPA Science Inventory

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to chemical profiling to address sensitivity and specificity of molecular targets, biological pathways, cellular and developmental processes. EPA’s ToxCast project is testing 960 uniq...

  12. High-throughput screening, predictive modeling and computational embryology

    EPA Science Inventory

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to profile thousands of chemical compounds for biological activity and potential toxicity. EPA’s ToxCast™ project, and the broader Tox21 consortium, in addition to projects worldwide,...

  13. In person versus Computer Screening for Intimate Partner Violence Among Pregnant Patients

    PubMed Central

    Dado, Diane; Schussler, Sara; Hawker, Lynn; Holland, Cynthia L.; Burke, Jessica G.; Cluss, Patricia A.

    2012-01-01

    Objective To compare in person versus computerized screening for intimate partner violence (IPV) in a hospital-based prenatal clinic and explore women’s assessment of the screening methods. Methods We compared patient IPV disclosures on a computerized questionnaire to audio-taped first obstetric visits with an obstetric care provider and performed semi-structured interviews with patient participants who reported experiencing IPV. Results Two-hundred and fifty patient participants and 52 provider participants were in the study. Ninety-one (36%) patients disclosed IPV either via computer or in person. Of those who disclosed IPV, 60 (66%) disclosed via both methods, but 31 (34%) disclosed IPV via only one of the two methods. Twenty-three women returned for interviews. They recommended using both types together. While computerized screening was felt to be non-judgmental and more anonymous, in person screening allowed for tailored questioning and more emotional connection with the provider. Conclusion Computerized screening allowed disclosure without fear of immediate judgment. In person screening allows more flexibility in wording of questions regarding IPV and opportunity for interpersonal rapport. Practice Implications Both computerized or self-completed screening and in person screening are recommended. Providers should address IPV using non-judgmental, descriptive language, include assessments for psychological IPV, and repeat screening in person, even if no patient disclosure occurs via computer. PMID:22770815

  14. Staring 2-D hadamard transform spectral imager

    DOEpatents

    Gentry, Stephen M.; Wehlburg, Christine M.; Wehlburg, Joseph C.; Smith, Mark W.; Smith, Jody L.

    2006-02-07

    A staring imaging system inputs a 2D spatial image containing multi-frequency spectral information. This image is encoded in one dimension with a cyclic Hadamard S-matrix. The resulting image is detected with a spatial 2D detector, and a computer applies a Hadamard transform to recover the encoded image.
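
    The encode/decode cycle with an S-matrix can be sketched as follows. This toy example uses an S-matrix derived from a Sylvester-type Hadamard matrix rather than the cyclic S-matrix of the patent, and encodes one dimension of a small random "scene"; the inverse relation S^-1 = 2(2 S^T - J)/(n + 1) recovers the data exactly.

```python
import numpy as np
from scipy.linalg import hadamard

def s_matrix(m):
    """S-matrix of order m - 1 from a Sylvester Hadamard matrix (m a power of 2)."""
    H = hadamard(m)                                # entries +/-1, first row/column all ones
    return ((1 - H[1:, 1:]) // 2).astype(float)    # entries 0 (closed) / 1 (open)

def encode(scene, S):
    """Encode one dimension of a 2D scene: each measurement is a masked sum."""
    return S @ scene

def decode(measurements, S):
    """Recover the scene using the exact inverse S^-1 = 2 (2 S^T - J) / (n + 1)."""
    n = S.shape[0]
    S_inv = 2.0 * (2.0 * S.T - np.ones((n, n))) / (n + 1)
    return S_inv @ measurements

# Toy 7 x 16 "scene", encoded along its 7-row dimension and recovered exactly.
rng = np.random.default_rng(0)
scene = rng.random((7, 16))
S = s_matrix(8)
assert np.allclose(decode(encode(scene, S), S), scene)
```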

  15. Automatic classification of pulmonary peri-fissural nodules in computed tomography using an ensemble of 2D views and a convolutional neural network out-of-the-box.

    PubMed

    Ciompi, Francesco; de Hoop, Bartjan; van Riel, Sarah J; Chung, Kaman; Scholten, Ernst Th; Oudkerk, Matthijs; de Jong, Pim A; Prokop, Mathias; van Ginneken, Bram

    2015-12-01

    In this paper, we tackle the problem of automatic classification of pulmonary peri-fissural nodules (PFNs). The classification problem is formulated as a machine learning approach, where detected nodule candidates are classified as PFNs or non-PFNs. Supervised learning is used, where a classifier is trained to label the detected nodule. The classification of the nodule in 3D is formulated as an ensemble of classifiers trained to recognize PFNs based on 2D views of the nodule. In order to describe nodule morphology in the 2D views, we use the output of a pre-trained convolutional neural network known as OverFeat. We compare our approach with a recently presented descriptor of pulmonary nodule morphology, namely Bag of Frequencies, and illustrate the advantages offered by the two strategies, achieving a performance of AUC = 0.868, which is close to that of human experts. PMID:26458112
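
    A minimal sketch of the view-extraction and feature-computation steps is given below. It substitutes a torchvision ResNet-18 for OverFeat (which is not packaged with torchvision, so this is an assumption made purely for illustration), assumes a recent torchvision, and takes only the three central orthogonal views of a nodule patch; it illustrates the strategy rather than the published pipeline.

```python
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms.functional as TF

# Pre-trained ResNet-18 used here as a stand-in feature extractor for OverFeat.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()   # expose the 512-d penultimate features
backbone.eval()

def nodule_views(volume):
    """Three orthogonal 2D views through the centre of a 3D CT nodule patch."""
    z, y, x = (s // 2 for s in volume.shape)
    return [volume[z, :, :], volume[:, y, :], volume[:, :, x]]

def view_descriptor(view):
    """Normalize one 2D view and compute a CNN descriptor for it."""
    v = (view - view.min()) / (np.ptp(view) + 1e-6)                       # scale to [0, 1]
    img = torch.tensor(v, dtype=torch.float32).unsqueeze(0).repeat(3, 1, 1)  # grey -> 3 channels
    img = TF.resize(img.unsqueeze(0), [224, 224])                         # network input size
    with torch.no_grad():
        return backbone(img).squeeze(0).numpy()                           # 512-d descriptor

# A per-view classifier ensemble (e.g. logistic regression, predictions averaged)
# would then label the nodule as PFN or non-PFN from these descriptors.
patch = np.random.rand(48, 48, 48).astype(np.float32)
features = [view_descriptor(v) for v in nodule_views(patch)]
```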

  17. Accommodative and convergence response to computer screen and printed text

    NASA Astrophysics Data System (ADS)

    Ferreira, Andreia; Lira, Madalena; Franco, Sandra

    2011-05-01

    The aim of this work was to determine whether differences exist in the accommodative and convergence responses to different computer monitors and to printed text. We also attempted to relate the horizontal heterophoria value and the accommodative response to the symptoms associated with computer use. Two independent experiments were carried out. In the first experiment, the accommodative response was measured in 89 subjects using the Grand Seiko WAM-5500 (Grand Seiko Co., Ltd., Japan). The accommodative response was measured for three computer monitors: a 17-inch cathode ray tube (CRT) and two liquid crystal displays (LCDs), one 17-inch (LCD17) and one 15-inch (LCD15), as well as for a printed text. The displayed text was the same for all subjects and tests. A second experiment measured the habitual horizontal heterophoria of 80 subjects using the von Graefe technique. The measurements were obtained with the same target presented on two computer monitors, a 19-inch cathode ray tube (CRT) and a 19-inch liquid crystal display (LCD), and printed on paper. A short survey on the incidence and prevalence of symptoms was administered in both experiments. In the first experiment, the accommodative response was higher for the CRT and LCDs than for paper, and no significant difference was found between the two LCD monitors. The second experiment showed that the heterophoria values were similar for all stimuli; on average, participants presented a small exophoria. In both experiments, asthenopia was the symptom with the highest incidence. There are different accommodative responses when reading on paper and on computer monitors, and this difference is larger for CRT monitors. On the other hand, there was no difference in convergence values between the computer monitors and paper. The symptoms associated with the use of computers are not related to the increase in accommodation or to the horizontal heterophoria.

  18. Aniso2D

    2005-07-01

    Aniso2d is a two-dimensional seismic forward modeling code. The earth is parameterized as an x-z plane in which the seismic properties can be monoclinic with x-z plane symmetry. The program uses a user-defined time-domain wavelet to produce synthetic seismograms anywhere within the two-dimensional medium.
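
    As a much-simplified illustration of how a time-domain wavelet turns a reflectivity model into a synthetic seismogram (a 1D convolutional analogue, not Aniso2d's anisotropic 2D solver; the Ricker wavelet and reflector positions are arbitrary):

```python
import numpy as np

def ricker(f0, dt, length=0.128):
    """Ricker (Mexican-hat) wavelet with peak frequency f0 (Hz), sampled at dt (s)."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f0 * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def synthetic_trace(reflectivity, f0=30.0, dt=0.002):
    """1D convolutional synthetic seismogram: reflectivity convolved with the wavelet."""
    return np.convolve(reflectivity, ricker(f0, dt), mode="same")

# Two reflectors, at 0.2 s and 0.5 s, on a 1 s trace sampled every 2 ms.
r = np.zeros(500)
r[100], r[250] = 0.8, -0.5
trace = synthetic_trace(r)
```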

  19. Towards 2D nanocomposites

    NASA Astrophysics Data System (ADS)

    Jang, Hyun-Sook; Yu, Changqian; Hayes, Robert; Granick, Steve

    2015-03-01

    Polymer vesicles (``polymersomes'') are an intriguing class of soft materials, commonly used to encapsulate small molecules or particles. Here we reveal they can also effectively incorporate nanoparticles inside their polymer membrane, leading to novel ``2D nanocomposites.'' The embedded nanoparticles alter the capacity of the polymersomes to bend and to stretch upon external stimuli.

  20. Protein engineering by highly parallel screening of computationally designed variants

    PubMed Central

    Sun, Mark G. F.; Seo, Moon-Hyeong; Nim, Satra; Corbi-Verge, Carles; Kim, Philip M.

    2016-01-01

    Current combinatorial selection strategies for protein engineering have been successful at generating binders against a range of targets; however, the combinatorial nature of the libraries and their vast undersampling of sequence space inherently limit these methods due to the difficulty in finely controlling protein properties of the engineered region. Meanwhile, great advances in computational protein design that can address these issues have largely been underutilized. We describe an integrated approach that computationally designs thousands of individual protein binders for high-throughput synthesis and selection to engineer high-affinity binders. We show that a computationally designed library enriches for tight-binding variants by many orders of magnitude as compared to conventional randomization strategies. We thus demonstrate the feasibility of our approach in a proof-of-concept study and successfully obtain low-nanomolar binders using in vitro and in vivo selection systems. PMID:27453948

  1. Protein engineering by highly parallel screening of computationally designed variants.

    PubMed

    Sun, Mark G F; Seo, Moon-Hyeong; Nim, Satra; Corbi-Verge, Carles; Kim, Philip M

    2016-07-01

    Current combinatorial selection strategies for protein engineering have been successful at generating binders against a range of targets; however, the combinatorial nature of the libraries and their vast undersampling of sequence space inherently limit these methods due to the difficulty in finely controlling protein properties of the engineered region. Meanwhile, great advances in computational protein design that can address these issues have largely been underutilized. We describe an integrated approach that computationally designs thousands of individual protein binders for high-throughput synthesis and selection to engineer high-affinity binders. We show that a computationally designed library enriches for tight-binding variants by many orders of magnitude as compared to conventional randomization strategies. We thus demonstrate the feasibility of our approach in a proof-of-concept study and successfully obtain low-nanomolar binders using in vitro and in vivo selection systems. PMID:27453948

  3. School Students and Computer Games with Screen Violence

    ERIC Educational Resources Information Center

    Fedorov, A. V.

    2005-01-01

    In this article, the author states how these days, school students from low-income strata of the population in Russia spend hours sitting in computer rooms and Internet clubs, where, for a relatively small fee, they can play interactive video games. And to determine what games they prefer the author conducted a content analysis of eighty-seven…

  4. China national lung cancer screening guideline with low-dose computed tomography (2015 version)

    PubMed Central

    Zhou, Qing-hua; Fan, Ya-guang; Bu, Hong; Wang, Ying; Wu, Ning; Huang, Yun-chao; Wang, Guiqi; Wang, Xin-yun; Qiao, You-lin

    2015-01-01

    Background Lung cancer is the leading cause of cancer-related death in China. Results from a randomized controlled trial using annual low-dose computed tomography (LDCT) in specific high-risk groups demonstrated a 20% reduction in lung cancer mortality. Methods A China national lung cancer screening guideline was developed by lung cancer early detection and treatment expert group appointed by the National Health and Family Planning Commission, based on results of the National Lung Screening Trial, systematic review of evidence related to LDCT screening, and protocol of lung cancer screening program conducted in rural China. Results Annual lung cancer screening with LDCT is recommended for high risk individuals aged 50–74 years who have at least a 20 pack-year smoking history and who currently smoke or have quit within the past five years. Individualized decision making should be conducted before LDCT screening. LDCT screening also represents an opportunity to educate patients as to the health risks of smoking; thus, education should be integrated into the screening process in order to assist smoking cessation. Conclusions A lung cancer screening guideline is provided for the high-risk population in China. PMID:26557925

  5. Diagnostic Accuracy of Digital Screening Mammography with and without Computer-aided Detection

    PubMed Central

    Lehman, Constance D.; Wellman, Robert D.; Buist, Diana S.M.; Kerlikowske, Karla; Tosteson, Anna N. A.; Miglioretti, Diana L.

    2016-01-01

    Importance After the Food and Drug Administration (FDA) approved computer-aided detection (CAD) for mammography in 1998, and the Centers for Medicare and Medicaid Services (CMS) provided increased payment in 2002, CAD technology disseminated rapidly. Despite sparse evidence that CAD improves the accuracy of mammographic interpretation, and despite costs of over $400 million a year, CAD is currently used for the majority of screening mammograms in the U.S. Objective To measure performance of digital screening mammography with and without computer-aided detection in U.S. community practice. Design, Setting and Participants We compared the accuracy of digital screening mammography interpreted with (N=495,818) vs. without (N=129,807) computer-aided detection from 2003 through 2009 in 323,973 women. Mammograms were interpreted by 271 radiologists from 66 facilities in the Breast Cancer Surveillance Consortium. Linkage with tumor registries identified 3,159 breast cancers in 323,973 women within one year of the screening. Main Outcomes and Measures Mammography performance (sensitivity, specificity, and screen detected and interval cancers per 1,000 women) was modeled using logistic regression with radiologist-specific random effects to account for correlation among examinations interpreted by the same radiologist, adjusting for patient age, race/ethnicity, time since prior mammogram, exam year, and registry. Conditional logistic regression was used to compare performance among 107 radiologists who interpreted mammograms both with and without computer-aided detection. Results Screening performance was not improved with computer-aided detection on any metric assessed. Mammography sensitivity was 85.3% (95% confidence interval [CI]=83.6–86.9) with and 87.3% (95% CI 84.5–89.7) without computer-aided detection. Specificity was 91.6% (95% CI=91.0–92.2) with and 91.4% (95% CI=90.6–92.0) without computer-aided detection. There was no difference in cancer detection rate (4

  6. Lung Cancer Screening with Low-Dose Computed Tomography for Primary Care Providers

    PubMed Central

    Richards, Thomas B.; White, Mary C.; Caraballo, Ralph S.

    2015-01-01

    This review provides an update on lung cancer screening with low-dose computed tomography (LDCT) and its implications for primary care providers. One of the unique features of lung cancer screening is the potential complexity in patient management if an LDCT scan reveals a small pulmonary nodule. Additional tests, consultation with multiple specialists, and follow-up evaluations may be needed to evaluate whether lung cancer is present. Primary care providers should know the resources available in their communities for lung cancer screening with LDCT and smoking cessation, and the key points to be addressed in informed and shared decision-making discussions with patients. PMID:24830610

  7. A Computational model for compressed sensing RNAi cellular screening

    PubMed Central

    2012-01-01

    Background RNA interference (RNAi) has become an increasingly important and effective genetic tool to study the function of target genes by suppressing specific genes of interest. This system approach helps identify signaling pathways and cellular phase types by tracking intensity and/or morphological changes of cells. The traditional RNAi screening scheme, in which one siRNA is designed to knock down one specific mRNA target, needs a large library of siRNAs and turns out to be time-consuming and expensive. Results In this paper, we propose a conceptual model, called compressed sensing RNAi (csRNAi), which employs unique combinations of groups of small interfering RNAs (siRNAs) to knock down a much larger set of genes. This strategy is based on the fact that one gene can be partially bound by several small interfering RNAs (siRNAs) and, conversely, one siRNA can bind to a few genes with distinct binding affinities. This model constructs a multi-to-multi correspondence between siRNAs and their targets, with far fewer siRNAs than mRNA targets compared with the conventional scheme. Mathematically this problem involves an underdetermined system of equations (linear or nonlinear), which is ill-posed in general. However, the recently developed compressed sensing (CS) theory can solve this problem. We present a mathematical model to describe the csRNAi system based on both CS theory and biological concerns. To build this model, we first search for nucleotide motifs in a target gene set. Then we propose a machine learning based method to find effective siRNAs using novel features, such as image features and speech features, to describe an siRNA sequence. Numerical simulations show that we can reduce the siRNA library to one third of that in the conventional scheme. In addition, the features used to describe siRNAs outperform the existing ones substantially. Conclusions This csRNAi system is very promising in saving both time and cost for large-scale RNAi screening experiments which
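
    The underdetermined-recovery step at the heart of such a scheme can be illustrated with a toy LASSO/ISTA solver: far fewer pooled measurements than genes, a sparse vector of true "hit" genes, and an L1-regularized reconstruction. The design matrix, dimensions, and regularization weight below are invented for illustration and do not reflect the authors' csRNAi model.

```python
import numpy as np

def ista(A, y, lam=0.02, n_iter=500):
    """Iterative soft-thresholding for the LASSO problem
    min_x 0.5 * ||A x - y||^2 + lam * ||x||_1 (a standard sparse-recovery solver)."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L                            # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)    # soft threshold
    return x

# Toy problem: 300 genes, 100 pooled measurements, 5 true "hit" genes.
rng = np.random.default_rng(1)
n_genes, n_pools, n_hits = 300, 100, 5
A = rng.standard_normal((n_pools, n_genes)) / np.sqrt(n_pools)   # pooling design matrix
x_true = np.zeros(n_genes)
x_true[rng.choice(n_genes, n_hits, replace=False)] = rng.uniform(1.0, 2.0, n_hits)
y = A @ x_true                                                   # pooled phenotype scores
x_hat = ista(A, y)
print(sorted(np.argsort(-np.abs(x_hat))[:n_hits]))               # recovered hit indices
```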

  8. Mesh2d

    2011-12-31

    Mesh2d is a Fortran90 program designed to generate two-dimensional structured grids of the form [x(i),y(i,j)] where [x,y] are grid coordinates identified by indices (i,j). The x(i) coordinates alone can be used to specify a one-dimensional grid. Because the x-coordinates vary only with the i index, a two-dimensional grid is composed in part of straight vertical lines. However, the nominally horizontal y(i,j0) coordinates along index i are permitted to undulate or otherwise vary. Mesh2d also assigns an integer material type to each grid cell, mtyp(i,j), in a user-specified manner. The complete grid is specified through three separate input files defining the x(i), y(i,j), and mtyp(i,j) variations.
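
    The data layout described above, straight vertical lines x(i), undulating y(i,j), and one material type per cell, can be mimicked in a few lines of Python. This is a sketch only; Mesh2d itself is Fortran90 and is driven by three input files, and the helper names and material rule here are invented.

```python
import numpy as np

def build_structured_grid(x, top_of, nj, material_rule):
    """Grid of the form [x(i), y(i, j)] with an integer material type per cell."""
    ni = len(x)
    y = np.empty((ni, nj))
    for i in range(ni):
        # straight vertical lines at x(i); y varies with both indices
        y[i, :] = np.linspace(0.0, top_of(x[i]), nj)
    mtyp = np.empty((ni - 1, nj - 1), dtype=int)
    for i in range(ni - 1):
        for j in range(nj - 1):
            xc = 0.5 * (x[i] + x[i + 1])                                    # cell-centre x
            yc = 0.25 * (y[i, j] + y[i, j + 1] + y[i + 1, j] + y[i + 1, j + 1])
            mtyp[i, j] = material_rule(xc, yc)                              # user-specified type
    return y, mtyp

x = np.linspace(0.0, 100.0, 11)                     # x(i)
top = lambda xv: 10.0 + 2.0 * np.sin(0.05 * xv)     # undulating upper surface
layer = lambda xc, yc: 1 if yc < 5.0 else 2         # two material layers
y, mtyp = build_structured_grid(x, top, nj=6, material_rule=layer)
```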

  9. Molecular dynamics-based virtual screening: accelerating the drug discovery process by high-performance computing.

    PubMed

    Ge, Hu; Wang, Yu; Li, Chanjuan; Chen, Nanhao; Xie, Yufang; Xu, Mengyan; He, Yingyan; Gu, Xinchun; Wu, Ruibo; Gu, Qiong; Zeng, Liang; Xu, Jun

    2013-10-28

    High-performance computing (HPC) has become a state strategic technology in a number of countries. One hypothesis is that HPC can accelerate biopharmaceutical innovation. Our experimental data demonstrate that HPC can significantly accelerate biopharmaceutical innovation by employing molecular dynamics-based virtual screening (MDVS). Without using HPC, MDVS for a 10K compound library with tens of nanoseconds of MD simulations requires years of computer time. In contrast, a state of the art HPC can be 600 times faster than an eight-core PC server is in screening a typical drug target (which contains about 40K atoms). Also, careful design of the GPU/CPU architecture can reduce the HPC costs. However, the communication cost of parallel computing is a bottleneck that acts as the main limit of further virtual screening improvements for drug innovations.

  10. 2d index and surface operators

    NASA Astrophysics Data System (ADS)

    Gadde, Abhijit; Gukov, Sergei

    2014-03-01

    In this paper we compute the superconformal index of 2d N = (2, 2) supersymmetric gauge theories. The 2d superconformal index, a.k.a. the flavored elliptic genus, is computed by a unitary matrix integral much like the matrix integral that computes the 4d superconformal index. We compute the 2d index explicitly for a number of examples. In the case of abelian gauge theories we see that the index is invariant under the flop transition and under the CY-LG correspondence. The index also provides a powerful check of the Seiberg-type duality for non-abelian gauge theories discovered by Hori and Tong. In the latter half of the paper, we study half-BPS surface operators in N = 2 superconformal gauge theories. They are engineered by coupling the 2d N = (2, 2) supersymmetric gauge theory living on the support of the surface operator to the 4d N = 2 theory, so that different realizations of the same surface operator with a given Levi type are related by a 2d analogue of the Seiberg duality. The index of this coupled system is computed by using the tools developed in the first half of the paper. The superconformal index in the presence of a surface defect is expected to be invariant under generalized S-duality. We demonstrate that this is indeed the case. In doing so, the Seiberg-type duality of the 2d theory plays an important role.

  11. Computational screening of oxetane monomers for novel hydroxy terminated polyethers.

    PubMed

    Sarangapani, Radhakrishnan; Ghule, Vikas D; Sikder, Arun K

    2014-06-01

    Energetic hydroxy-terminated polyether prepolymers are of paramount importance in the search for energetic binders for propellant applications. In the present study, density functional theory (DFT) has been employed to screen various novel energetic oxetane derivatives, which usually constitute the backbone of these energetic polymers. Molecular structures were investigated at the B3LYP/6-31G* level, and isodesmic reactions were designed for calculating the gas-phase heats of formation. The condensed-phase heats of formation for the designed compounds were calculated by the Politzer approach using heats of sublimation. Among the designed oxetane derivatives, T4 and T5 possess condensed-phase heats of formation above 210 kJ mol(-1). The crystal packing density of the designed oxetane derivatives varied from 1.2 to 1.6 g/cm(3). The detonation velocities and pressures were evaluated using the Kamlet-Jacobs equations, utilizing the predicted densities and condensed-phase heats of formation. It was found that most of the designed oxetane derivatives have detonation performance comparable to the monomers of benchmark energetic polymers, viz. NIMMO, AMMO, and BAMO. The strain energy (SE) of the oxetane derivatives was calculated using homodesmotic reactions, while intramolecular group interactions were predicted through disproportionation energies. The concept of chemical hardness is used to analyze the susceptibility of the designed compounds to reactivity and chemical transformations. The heats of formation, density, and predicted performance imply that the designed molecules are expected to be candidates for polymer synthesis and potential energetic binders.
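
    The Kamlet-Jacobs estimates mentioned above can be reproduced in a few lines once the loading density, gas-product parameters, and heat of detonation are known; the RDX-like inputs in the example are approximate literature values used only to sanity-check the formula, not data from this study.

```python
def kamlet_jacobs(rho, N, M, Q):
    """Kamlet-Jacobs estimates of detonation velocity D (km/s) and pressure P (GPa).

    rho : loading density (g/cm^3)
    N   : moles of gaseous detonation products per gram of explosive (mol/g)
    M   : mean molecular weight of those gaseous products (g/mol)
    Q   : heat of detonation (cal/g)
    """
    phi = N * M ** 0.5 * Q ** 0.5
    D = 1.01 * phi ** 0.5 * (1.0 + 1.30 * rho)
    P = 1.558 * rho ** 2 * phi
    return D, P

# Sanity check with approximate RDX-like inputs (literature ballpark values).
D, P = kamlet_jacobs(rho=1.80, N=0.0339, M=27.2, Q=1501.0)
print(f"D = {D:.2f} km/s, P = {P:.1f} GPa")   # roughly 8.8 km/s and 35 GPa
```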

  12. Computational screening of organic materials towards improved photovoltaic properties

    NASA Astrophysics Data System (ADS)

    Dai, Shuo; Olivares-Amaya, Roberto; Amador-Bedolla, Carlos; Aspuru-Guzik, Alan; Borunda, Mario

    2015-03-01

    The world today faces an energy crisis that is an obstruction to the development of the human civilization. One of the most promising solutions is solar energy harvested by economical solar cells. Being the third generation of solar cell materials, organic photovoltaic (OPV) materials is now under active development from both theoretical and experimental points of view. In this study, we constructed a parameter to select the desired molecules based on their optical spectra performance. We applied it to investigate a large collection of potential OPV materials, which were from the CEPDB database set up by the Harvard Clean Energy Project. Time dependent density functional theory (TD-DFT) modeling was used to calculate the absorption spectra of the molecules. Then based on the parameter, we screened out the top performing molecules for their potential OPV usage and suggested experimental efforts toward their synthesis. In addition, from those molecules, we summarized the functional groups that provided molecules certain spectrum capability. It is hoped that useful information could be mined out to provide hints to molecular design of OPV materials.

  13. Patients and Computers as Reminders to Screen for Diabetes in Family Practice

    PubMed Central

    Kenealy, Tim; Arroll, Bruce; Petrie, Keith J

    2005-01-01

    Background In New Zealand, more than 5% of people aged 50 years and older have undiagnosed diabetes; most of them attend family practitioners (FPs) at least once a year. Objectives To test the effectiveness of patients or computers as reminders to screen for diabetes in patients attending FPs. Design A randomized-controlled trial compared screening rates in 4 intervention arms: patient reminders, computer reminders, both reminders, and usual care. The trial lasted 2 months. The patient reminder was a diabetes risk self-assessment sheet filled in by patients and given to the FP during the consultation. The computer reminder was an icon that flashed only for patients considered eligible for screening. Participants One hundred and seven FPs. Measurements The primary outcome was whether each eligible patient, who attended during the trial, was or was not tested for blood glucose. Analysis was by intention to treat and allowed for clustering by FP. Results Patient reminders (odds ratio [OR] 1.72, 95% confidence interval [CI] 1.21, 2.43), computer reminders (OR 2.55, 1.68, 3.88), and both reminders (OR 1.69, 1.11, 2.59) were all effective compared with usual care. Computer reminders were more effective than patient reminders (OR 1.49, 1.07, 2.07). Patients were more likely to be screened if they visited the FP repeatedly, if patients were non-European, if they were “regular” patients of the practice, and if their FP had a higher screening rate prior to the study. Conclusions Patient and computer reminders were effective methods to increase screening for diabetes. However, the effects were not additive. PMID:16191138

  14. Performance of linear and nonlinear texture measures in 2D and 3D for monitoring architectural changes in osteoporosis using computer-generated models of trabecular bone

    NASA Astrophysics Data System (ADS)

    Boehm, Holger F.; Link, Thomas M.; Monetti, Roberto A.; Mueller, Dirk; Rummeny, Ernst J.; Raeth, Christoph W.

    2005-04-01

    Osteoporosis is a metabolic bone disease leading to de-mineralization and increased risk of fracture. The two major factors that determine the biomechanical competence of bone are the degree of mineralization and the micro-architectural integrity. Today, modern imaging modalities (high resolution MRI, micro-CT) are capable of depicting structural details of trabecular bone tissue. From the image data, structural properties obtained by quantitative measures are analysed with respect to the presence of osteoporotic fractures of the spine (in-vivo) or correlated with biomechanical strength as derived from destructive testing (in-vitro). Fairly well established are linear structural measures in 2D that are originally adopted from standard histo-morphometry. Recently, non-linear techniques in 2D and 3D based on the scaling index method (SIM), the standard Hough transform (SHT), and the Minkowski Functionals (MF) have been introduced, which show excellent performance in predicting bone strength and fracture risk. However, little is known about the performance of the various parameters with respect to monitoring structural changes due to progression of osteoporosis or as a result of medical treatment. In this contribution, we generate models of trabecular bone with pre-defined structural properties which are exposed to simulated osteoclastic activity. We apply linear and non-linear texture measures to the models and analyse their performance with respect to detecting architectural changes. This study demonstrates that the texture measures are capable of monitoring structural changes of complex model data. The diagnostic potential varies for the different parameters and is found to depend on the topological composition of the model and the initial "bone density". In our models, non-linear texture measures tend to react more sensitively to small structural changes than linear measures. Best performance is observed for the 3rd and 4th Minkowski Functionals and for the scaling index method.
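
    For the 2D case, the three Minkowski functionals reduce to area, perimeter, and the Euler characteristic, which can be computed directly from a binary image. The sketch below uses thresholded smoothed noise as a stand-in for a trabecular model and is only an illustration of the measures, not the authors' 2D/3D analysis pipeline.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.measure import euler_number, perimeter

def minkowski_2d(binary):
    """2D Minkowski functionals of a binary structure: area, perimeter, Euler number."""
    area = int(binary.sum())
    perim = perimeter(binary)                      # Crofton-type perimeter estimate
    euler = euler_number(binary, connectivity=2)   # connected components minus holes
    return area, perim, euler

# Toy "trabecular" pattern: thresholded, smoothed random noise.
rng = np.random.default_rng(3)
structure = gaussian_filter(rng.random((128, 128)), sigma=3) > 0.5
print(minkowski_2d(structure))
```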

  15. Computer assisted screening, correction, and analysis of historical weather measurements

    NASA Astrophysics Data System (ADS)

    Burnette, Dorian J.; Stahle, David W.

    2013-04-01

    A computer program, Historical Observation Tools (HOB Tools), has been developed to facilitate many of the calculations used by historical climatologists to develop instrumental and documentary temperature and precipitation datasets and makes them readily accessible to other researchers. The primitive methodology used by the early weather observers makes the application of standard techniques difficult. HOB Tools provides a step-by-step framework to visually and statistically assess, adjust, and reconstruct historical temperature and precipitation datasets. These routines include the ability to check for undocumented discontinuities, adjust temperature data for poor thermometer exposures and diurnal averaging, and assess and adjust daily precipitation data for undercount. This paper provides an overview of the Visual Basic.NET program and a demonstration of how it can assist in the development of extended temperature and precipitation datasets using modern and early instrumental measurements from the United States.

  16. 2-D Animation's Not Just for Mickey Mouse.

    ERIC Educational Resources Information Center

    Weinman, Lynda

    1995-01-01

    Discusses characteristics of two-dimensional (2-D) animation; highlights include character animation, painting issues, and motion graphics. Sidebars present Silicon Graphics animations tools and 2-D animation programs for the desktop computer. (DGM)

  17. Automated computational screening of the thiol reactivity of substituted alkenes.

    PubMed

    Smith, Jennifer M; Rowley, Christopher N

    2015-08-01

    Electrophilic olefins can react with the S-H moiety of cysteine side chains. The formation of a covalent adduct through this mechanism can result in the inhibition of an enzyme. The reactivity of an olefin towards cysteine depends on its functional groups. In this study, 325 reactions of thiol-Michael-type additions to olefins were modeled using density functional theory. All combinations of ethenes with hydrogen, methyl ester, amide, and cyano substituents were included. An automated workflow was developed to perform the construction, conformation search, minimization, and calculation of molecular properties for the reactant, carbanion intermediate, and thioether products for a model reaction of the addition of methanethiol to the electrophile. Known cysteine-reactive electrophiles present in the database were predicted to react exergonically with methanethiol through a carbanion with a stability in the 30-40 kcal mol(-1) range. 13 other compounds in our database that are also present in the PubChem database have similar properties. Natural bond orbital parameters were computed and regression analysis was used to determine the relationship between properties of the olefin electronic structure and the product and intermediate stability. The stability of the intermediates is very sensitive to electronic effects on the carbon where the anionic charge is centered. The stability of the products is more sensitive to steric factors.

  18. Computer-Based Screening of Functional Conformers of Proteins

    PubMed Central

    Montiel Molina, Héctor Marlosti; Millán-Pacheco, César; Pastor, Nina; del Rio, Gabriel

    2008-01-01

    A long-standing goal in biology is to establish the link between function, structure, and dynamics of proteins. Considering that protein function at the molecular level is understood by the ability of proteins to bind to other molecules, the limited structural data of proteins in association with other bio-molecules represents a major hurdle to understanding protein function at the structural level. Recent reports show that protein function can be linked to protein structure and dynamics through network centrality analysis, suggesting that the structures of proteins bound to natural ligands may be inferred computationally. In the present work, a new method is described to discriminate protein conformations relevant to the specific recognition of a ligand. The method relies on a scoring system that matches critical residues with central residues in different structures of a given protein. Central residues are the most traversed residues with the same frequency in networks derived from protein structures. We tested our method in a set of 24 different proteins and more than 260,000 structures of these in the absence of a ligand or bound to it. To illustrate the usefulness of our method in the study of the structure/dynamics/function relationship of proteins, we analyzed mutants of the yeast TATA-binding protein with impaired DNA binding. Our results indicate that critical residues for an interaction are preferentially found as central residues of protein structures in complex with a ligand. Thus, our scoring system effectively distinguishes protein conformations relevant to the function of interest. PMID:18463705
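
    A common way to operationalize "central residues" is to build a residue contact network and rank residues by a centrality measure. The sketch below uses C-alpha coordinates, an assumed 8 Å contact cutoff, and betweenness centrality as one proxy for the most-traversed residues; it is illustrative only and is not the scoring system described in the paper.

```python
import numpy as np
import networkx as nx

def central_residues(ca_coords, cutoff=8.0, top_k=10):
    """Rank residues by betweenness centrality in a C-alpha contact network."""
    n = len(ca_coords)
    dist = np.linalg.norm(ca_coords[:, None, :] - ca_coords[None, :, :], axis=-1)
    G = nx.Graph()
    G.add_nodes_from(range(n))
    G.add_edges_from((i, j) for i in range(n) for j in range(i + 1, n)
                     if dist[i, j] < cutoff)
    centrality = nx.betweenness_centrality(G)
    return sorted(centrality, key=centrality.get, reverse=True)[:top_k]

# Toy example with random coordinates standing in for a real structure.
rng = np.random.default_rng(7)
print(central_residues(rng.random((120, 3)) * 40.0))
```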

  19. Early detection of lung cancer: Low-dose computed tomography screening in China

    PubMed Central

    Zhao, Shi-Jun; Wu, Ning

    2015-01-01

    Lung cancer is currently the leading cause of cancer-related death in China and western countries for both men and women. Overall, the five-year survival rate of lung cancer is approximately 15%, whereas the five-year survival for patients with surgically resected early-stage disease is 60–80%. Screening is conceptually a good strategy for reducing the mortality rate of lung cancer. Randomized controlled trials in the 1960s and 1970s found that chest radiographic screening did not result in a reduction in mortality for high-risk individuals. Recently published data from the National Lung Screening Trial (NLST) showed a 20% reduction in lung cancer mortality in subjects who underwent low-dose computed tomography (LDCT) screening compared to those randomized to conventional chest X-ray. The encouraging results of the NLST, however, could not be confirmed by the preliminary results of ongoing European trials. More results from European randomized controlled trials are expected in the next few years. Recently, a number of lung cancer screening studies using LDCT have been initiated in China. This article briefly summarizes the results of the current and previous lung cancer screening trials worldwide, and focuses on the current status of LDCT lung cancer screening in China. PMID:26273391

  20. Computer Decision Support to Improve Autism Screening and Care in Community Pediatric Clinics

    ERIC Educational Resources Information Center

    Bauer, Nerissa S.; Sturm, Lynne A.; Carroll, Aaron E.; Downs, Stephen M.

    2013-01-01

    An autism module was added to an existing computer decision support system (CDSS) to facilitate adherence to recommended guidelines for screening for autism spectrum disorders in primary care pediatric clinics. User satisfaction was assessed by survey and informal feedback at monthly meetings between clinical staff and the software team. To assess…

  1. Designing Multimedia Learning Application with Learning Theories: A Case Study on a Computer Science Subject with 2-D and 3-D Animated Versions

    ERIC Educational Resources Information Center

    Rias, Riaza Mohd; Zaman, Halimah Badioze

    2011-01-01

    Higher learning based instruction may be primarily concerned in most cases with the content of their academic lessons, and not very much with their instructional delivery. However, the effective application of learning theories and technology in higher education has an impact on student performance. With the rapid progress in the computer and…

  2. Patient Perspectives on Low-Dose Computed Tomography for Lung Cancer Screening, New Mexico, 2014

    PubMed Central

    Sussman, Andrew L.; Murrietta, Ambroshia M.; Getrich, Christina M.; Rhyne, Robert; Crowell, Richard E.; Taylor, Kathryn L.; Reifler, Ellen J.; Wescott, Pamela H.; Saeed, Ali I.; Hoffman, Richard M.

    2016-01-01

    Introduction National guidelines call for annual lung cancer screening for high-risk smokers using low-dose computed tomography (LDCT). The objective of our study was to characterize patient knowledge and attitudes about lung cancer screening, smoking cessation, and shared decision making by patient and health care provider. Methods We conducted semistructured qualitative interviews with patients with histories of heavy smoking who received care at a Federally Qualified Health Center (FQHC Clinic) and at a comprehensive cancer center-affiliated chest clinic (Chest Clinic) in Albuquerque, New Mexico. The interviews, conducted from February through September 2014, focused on perceptions about health screening, knowledge and attitudes about LDCT screening, and preferences regarding decision aids. We used a systematic iterative analytic process to identify preliminary and emergent themes and to create a coding structure. Results We reached thematic saturation after 22 interviews (10 at the FQHC Clinic, 12 at the Chest Clinic). Most patients were unaware of LDCT screening for lung cancer but were receptive to the test. Some smokers said they would consider quitting smoking if their screening result were positive. Concerns regarding screening were cost, radiation exposure, and transportation issues. To support decision making, most patients said they preferred one-on-one discussions with a provider. They also valued decision support tools (print materials, videos), but raised concerns about readability and Internet access. Conclusion Implementing lung cancer screening in sociodemographically diverse populations poses significant challenges. The value of tobacco cessation counseling cannot be overemphasized. Effective interventions for shared decision making to undergo lung cancer screening will need the active engagement of health care providers and will require the use of accessible decision aids designed for people with low health literacy. PMID:27536900

  3. Designing specific protein–protein interactions using computation, experimental library screening, or integrated methods

    PubMed Central

    Chen, T Scott; Keating, Amy E

    2012-01-01

    Given the importance of protein–protein interactions for nearly all biological processes, the design of protein affinity reagents for use in research, diagnosis or therapy is an important endeavor. Engineered proteins would ideally have high specificities for their intended targets, but achieving interaction specificity by design can be challenging. There are two major approaches to protein design or redesign. Most commonly, proteins and peptides are engineered using experimental library screening and/or in vitro evolution. An alternative approach involves using protein structure and computational modeling to rationally choose sequences predicted to have desirable properties. Computational design has successfully produced novel proteins with enhanced stability, desired interactions and enzymatic function. Here we review the strengths and limitations of experimental library screening and computational structure-based design, giving examples where these methods have been applied to designing protein interaction specificity. We highlight recent studies that demonstrate strategies for combining computational modeling with library screening. The computational methods provide focused libraries predicted to be enriched in sequences with the properties of interest. Such integrated approaches represent a promising way to increase the efficiency of protein design and to engineer complex functionality such as interaction specificity. PMID:22593041

  4. Cognitive context detection in UAS operators using eye-gaze patterns on computer screens

    NASA Astrophysics Data System (ADS)

    Mannaru, Pujitha; Balasingam, Balakumar; Pattipati, Krishna; Sibley, Ciara; Coyne, Joseph

    2016-05-01

    In this paper, we demonstrate the use of eye-gaze metrics of unmanned aerial systems (UAS) operators as effective indices of their cognitive workload. Our analyses are based on an experiment where twenty participants performed pre-scripted UAS missions of three different difficulty levels by interacting with two custom designed graphical user interfaces (GUIs) that are displayed side by side. First, we compute several eye-gaze metrics, traditional eye movement metrics as well as newly proposed ones, and analyze their effectiveness as cognitive classifiers. Most of the eye-gaze metrics are computed by dividing the computer screen into "cells". Then, we perform several analyses in order to select metrics for effective cognitive context classification related to our specific application; the objectives of these analyses are to (i) identify appropriate ways to divide the screen into cells; (ii) select appropriate metrics for training and classification of cognitive features; and (iii) identify a suitable classification method.
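
    One simple example of a cell-based gaze metric is the fraction of gaze samples falling in each screen cell. The sketch below shows how such a metric can be computed; the grid size, screen resolution, and synthetic gaze trace are arbitrary, and the paper's actual metrics and cell layouts are chosen empirically.

```python
import numpy as np

def cell_dwell_fractions(gaze_x, gaze_y, screen_w, screen_h, n_cols=4, n_rows=3):
    """Fraction of gaze samples falling in each cell of an n_rows x n_cols screen grid."""
    cols = np.clip((np.asarray(gaze_x) / screen_w * n_cols).astype(int), 0, n_cols - 1)
    rows = np.clip((np.asarray(gaze_y) / screen_h * n_rows).astype(int), 0, n_rows - 1)
    counts = np.zeros((n_rows, n_cols))
    np.add.at(counts, (rows, cols), 1)            # accumulate samples per cell
    return counts / max(len(cols), 1)

# Toy gaze trace on a 1920 x 1080 display.
rng = np.random.default_rng(0)
gx, gy = rng.uniform(0, 1920, 2000), rng.uniform(0, 1080, 2000)
print(cell_dwell_fractions(gx, gy, 1920, 1080))
```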

  5. Reviewing risks and benefits of low-dose computed tomography screening for lung cancer.

    PubMed

    Chopra, Ishveen; Chopra, Avijeet; Bias, Thomas K

    2016-01-01

    Lung cancer is the third most common cancer among men and women and is one of the leading causes of cancer-related mortality. Diagnosis at an early stage has been suggested to be crucial for improving survival in individuals at high risk of lung cancer. One potential facilitator of early diagnosis is low-dose computed tomography (LDCT). The United States Preventive Services Task Force guidelines call for annual LDCT screening for individuals at high risk of lung cancer. This recommendation was based on the effectiveness of LDCT in early diagnosis of lung cancer, as indicated by the findings from the National Lung Screening Trial conducted in 2011. Although lung cancer accounts for more than a quarter of all cancer deaths in the United States and LDCT screening shows promising results regarding early lung cancer diagnosis, screening for lung cancer remains controversial. There is uncertainty about risks, cost-effectiveness, adequacy of evidence, and application of screening in a clinical setting. This narrative review provides an overview of the risks and benefits of LDCT screening for lung cancer. Further, this review discusses the potential for implementation of LDCT in a clinical setting. PMID:26680693

  6. Optimisation and assessment of three modern touch screen tablet computers for clinical vision testing.

    PubMed

    Tahir, Humza J; Murray, Ian J; Parry, Neil R A; Aslam, Tariq M

    2014-01-01

    Technological advances have led to the development of powerful yet portable tablet computers whose touch-screen resolutions now permit the presentation of targets small enough to test the limits of normal visual acuity. Such devices have become ubiquitous in daily life and are moving into the clinical space. However, in order to produce clinically valid tests, it is important to identify the limits imposed by the screen characteristics, such as resolution, brightness uniformity, contrast linearity and the effect of viewing angle. Previously we have conducted such tests on the iPad 3. Here we extend our investigations to 2 other devices and outline a protocol for calibrating such screens, using standardised methods to measure the gamma function, warm up time, screen uniformity and the effects of viewing angle and screen reflections. We demonstrate that all three devices manifest typical gamma functions for voltage and luminance with warm up times of approximately 15 minutes. However, there were differences in homogeneity and reflectance among the displays. We suggest practical means to optimise quality of display for vision testing including screen calibration.
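
    The gamma-function step of such a calibration can be illustrated by fitting luminance measurements against normalized drive level. The model form and the synthetic photometer readings below are assumptions for illustration only; the authors' protocol additionally covers warm-up time, uniformity, viewing angle, and screen reflections.

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_gamma(drive, luminance):
    """Fit the display transfer function L = L0 + a * v**gamma (v = normalized drive)."""
    model = lambda v, L0, a, g: L0 + a * np.power(v, g)
    (L0, a, g), _ = curve_fit(model, drive, luminance, p0=[0.1, 100.0, 2.2])
    return L0, a, g

# Synthetic photometer readings for drive levels 0..1 (true gamma = 2.2).
v = np.linspace(0.0, 1.0, 17)
L = 0.3 + 250.0 * v ** 2.2 + np.random.default_rng(0).normal(0.0, 0.5, v.size)
print(fit_gamma(v, L))
```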

  7. Comparing Benefits from Many Possible Computed Tomography Lung Cancer Screening Programs: Extrapolating from the National Lung Screening Trial Using Comparative Modeling

    PubMed Central

    McMahon, Pamela M.; Meza, Rafael; Plevritis, Sylvia K.; Black, William C.; Tammemagi, C. Martin; Erdogan, Ayca; ten Haaf, Kevin; Hazelton, William; Holford, Theodore R.; Jeon, Jihyoun; Clarke, Lauren; Kong, Chung Yin; Choi, Sung Eun; Munshi, Vidit N.; Han, Summer S.; van Rosmalen, Joost; Pinsky, Paul F.; Moolgavkar, Suresh

    2014-01-01

    Background The National Lung Screening Trial (NLST) demonstrated that in current and former smokers aged 55 to 74 years, with at least 30 pack-years of cigarette smoking history and who had quit smoking no more than 15 years ago, 3 annual computed tomography (CT) screens reduced lung cancer-specific mortality by 20% relative to 3 annual chest X-ray screens. We compared the benefits achievable with 576 lung cancer screening programs that varied CT screen number and frequency, ages of screening, and eligibility based on smoking. Methods and Findings We used five independent microsimulation models with lung cancer natural history parameters previously calibrated to the NLST to simulate life histories of the US cohort born in 1950 under all 576 programs. ‘Efficient’ (within model) programs prevented the greatest number of lung cancer deaths, compared to no screening, for a given number of CT screens. Among 120 ‘consensus efficient’ (identified as efficient across models) programs, the average starting age was 55 years, the stopping age was 80 or 85 years, the average minimum pack-years was 27, and the maximum years since quitting was 20. Among consensus efficient programs, 11% to 40% of the cohort was screened, and 153 to 846 lung cancer deaths were averted per 100,000 people. In all models, annual screening based on age and smoking eligibility in NLST was not efficient; continuing screening to age 80 or 85 years was more efficient. Conclusions Consensus results from five models identified a set of efficient screening programs that include annual CT lung cancer screening using criteria like NLST eligibility but extended to older ages. Guidelines for screening should also consider harms of screening and individual patient characteristics. PMID:24979231

  8. Brain-Computer Interfaces for 1-D and 2-D Cursor Control: Designs Using Volitional Control of the EEG Spectrum or Steady-State Visual Evoked Potentials

    NASA Technical Reports Server (NTRS)

    Trejo, Leonard J.; Matthews, Bryan; Rosipal, Roman

    2005-01-01

    We have developed and tested two EEG-based brain-computer interfaces (BCI) for users to control a cursor on a computer display. Our system uses an adaptive algorithm, based on kernel partial least squares classification (KPLS), to associate patterns in multichannel EEG frequency spectra with cursor controls. Our first BCI, Target Practice, is a system for one-dimensional device control, in which participants use biofeedback to learn voluntary control of their EEG spectra. Target Practice uses a KPLS classifier to map power spectra of 30-electrode EEG signals to rightward or leftward position of a moving cursor on a computer display. Three subjects learned to control motion of a cursor on a video display in multiple blocks of 60 trials over periods of up to six weeks. The best subject's average skill in correct selection of the cursor direction grew from 58% to 88% after 13 training sessions. Target Practice also implements online control of two artifact sources: a) removal of ocular artifact by linear subtraction of wavelet-smoothed vertical and horizontal EOG signals, b) control of muscle artifact by inhibition of BCI training during periods of relatively high power in the 40-64 Hz band. The second BCI, Think Pointer, is a system for two-dimensional cursor control. Steady-state visual evoked potentials (SSVEP) are triggered by four flickering checkerboard stimuli located in narrow strips at each edge of the display. The user attends to one of the four beacons to initiate motion in the desired direction. The SSVEP signals are recorded from eight electrodes located over the occipital region. A KPLS classifier is individually calibrated to map multichannel frequency bands of the SSVEP signals to right-left or up-down motion of a cursor on a computer display. The display stops moving when the user attends to a central fixation point. As for Target Practice, Think Pointer also implements wavelet-based online removal of ocular artifact; however, in Think Pointer muscle
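
    A rough sketch of the feature path, band powers from multichannel EEG spectra fed to a partial-least-squares model, is shown below. Ordinary PLS regression from scikit-learn stands in for the kernel PLS (KPLS) classifier, and the EEG data are random, so this only illustrates the shape of the pipeline, not the authors' system.

```python
import numpy as np
from scipy.signal import welch
from sklearn.cross_decomposition import PLSRegression

def band_powers(eeg, fs=256, bands=((8, 12), (12, 30), (30, 64))):
    """Per-channel spectral band powers from multichannel EEG (channels x samples)."""
    f, pxx = welch(eeg, fs=fs, nperseg=fs)
    return np.concatenate([pxx[:, (f >= lo) & (f < hi)].mean(axis=1) for lo, hi in bands])

# Random data standing in for 40 trials of 30-channel, 2-second EEG epochs.
rng = np.random.default_rng(0)
X = np.vstack([band_powers(rng.standard_normal((30, 512))) for _ in range(40)])
y = rng.choice([-1.0, 1.0], size=40)              # left / right cursor target
pls = PLSRegression(n_components=5).fit(X, y)     # ordinary PLS as a KPLS stand-in
pred = np.sign(pls.predict(X).ravel())            # thresholded output -> direction
```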

  9. Improved CUDA programs for GPU computing of Swendsen-Wang multi-cluster spin flip algorithm: 2D and 3D Ising, Potts, and XY models

    NASA Astrophysics Data System (ADS)

    Komura, Yukihiro; Okabe, Yutaka

    2016-03-01

    We present new versions of sample CUDA programs for the GPU computing of the Swendsen-Wang multi-cluster spin flip algorithm. In this update, we add the GPU-based cluster-labeling algorithm that avoids conventional iteration (Komura, 2015) to those programs. For high-precision calculations, we also add a random-number generator from the cuRAND library. Moreover, we fix several bugs and remove the extra usage of shared memory in the kernel functions.
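
    For readers unfamiliar with the underlying algorithm, a minimal serial reference of one Swendsen-Wang update for the 2D Ising model is sketched below in Python. It is not the authors' CUDA code; the lattice size, coupling, and temperature are arbitrary, and the cluster labeling uses a simple union-find rather than the GPU labeling scheme the abstract refers to.

# Hedged sketch: a serial, pure-Python Swendsen-Wang update for the 2D Ising
# model, intended only to illustrate the algorithm the CUDA programs
# parallelize; it is not the authors' GPU code.
import numpy as np

def find(parent, i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]      # path compression
        i = parent[i]
    return i

def union(parent, a, b):
    ra, rb = find(parent, a), find(parent, b)
    if ra != rb:
        parent[rb] = ra

def swendsen_wang_step(spins, beta, J=1.0, rng=np.random.default_rng()):
    L = spins.shape[0]
    p_add = 1.0 - np.exp(-2.0 * beta * J)  # bond-activation probability
    parent = list(range(L * L))
    idx = lambda x, y: x * L + y
    # Activate bonds between aligned nearest neighbours (periodic boundaries).
    for x in range(L):
        for y in range(L):
            for dx, dy in ((1, 0), (0, 1)):
                nx, ny = (x + dx) % L, (y + dy) % L
                if spins[x, y] == spins[nx, ny] and rng.random() < p_add:
                    union(parent, idx(x, y), idx(nx, ny))
    # Flip each cluster with probability 1/2.
    flip = {}
    for x in range(L):
        for y in range(L):
            root = find(parent, idx(x, y))
            if root not in flip:
                flip[root] = rng.random() < 0.5
            if flip[root]:
                spins[x, y] *= -1
    return spins

spins = np.random.default_rng(1).choice([-1, 1], size=(16, 16))
for _ in range(10):
    swendsen_wang_step(spins, beta=0.44)   # near the 2D Ising critical point
print("magnetization per spin:", spins.mean())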

  10. The Lung Reporting and Data System (LU-RADS): a proposal for computed tomography screening.

    PubMed

    Manos, Daria; Seely, Jean M; Taylor, Jana; Borgaonkar, Joy; Roberts, Heidi C; Mayo, John R

    2014-05-01

    Despite the positive outcome of the recent randomized trial of computed tomography (CT) screening for lung cancer, substantial implementation challenges remain, including the clear reporting of relative risk and suggested workup of screen-detected nodules. Based on current literature, we propose a 6-level Lung-Reporting and Data System (LU-RADS) that classifies screening CTs by the nodule with the highest malignancy risk. As the LU-RADS level increases, the risk of malignancy increases. The LU-RADS level is linked directly to suggested follow-up pathways. Compared with current narrative reporting, this structure should improve communication with patients and clinicians, and provide a data collection framework to facilitate screening program evaluation and radiologist training. In overview, category 1 includes CTs with no nodules and returns the subject to routine screening. Category 2 scans harbor minimal risk, including <5 mm, perifissural, or long-term stable nodules that require no further workup before the next routine screening CT. Category 3 scans contain indeterminate nodules and require CT follow up with the interval dependent on nodule size (small [5-9 mm] or large [≥ 10 mm] and possibly transient). Category 4 scans are suspicious and are subdivided into 4A, low risk of malignancy; 4B, likely low-grade adenocarcinoma; and 4C, likely malignant. The 4B and 4C nodules have a high likelihood of neoplasm simply based on screening CT features, even if positron emission tomography, needle biopsy, and/or bronchoscopy are negative. Category 5 nodules demonstrate frankly malignant behavior on screening CT, and category 6 scans contain tissue-proven malignancies.
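
    The category logic summarized above can be captured in a toy lookup, shown below purely as an illustration of the reporting structure; it encodes only the rules stated in the abstract, omits the 4A/4B/4C subdivision, and is not a clinical tool.

# Toy, non-clinical sketch that encodes the LU-RADS categories as summarized
# in the abstract; illustrative only and not to be used for patient care.
def lu_rads_category(nodules):
    """nodules: list of dicts with keys size_mm, perifissural, stable,
    suspicious, likely_malignant, frankly_malignant, tissue_proven."""
    if not nodules:
        return 1                               # no nodules: routine screening
    def rank(n):
        if n.get("tissue_proven"):
            return 6
        if n.get("frankly_malignant"):
            return 5
        if n.get("likely_malignant") or n.get("suspicious"):
            return 4                           # 4A/4B/4C subdivision omitted
        if n["size_mm"] < 5 or n.get("perifissural") or n.get("stable"):
            return 2                           # minimal risk
        return 3                               # indeterminate: CT follow-up
    return max(rank(n) for n in nodules)       # report is driven by the highest-risk nodule

print(lu_rads_category([{"size_mm": 7}]))      # -> 3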

  11. Computed tomography imaging spectrometer (CTIS) with 2D reflective grating for ultraviolet to long-wave infrared detection especially useful for surveying transient events

    NASA Technical Reports Server (NTRS)

    Wilson, Daniel W. (Inventor); Maker, Paul D. (Inventor); Muller, Richard E. (Inventor); Mouroulis, Pantazis Z. (Inventor)

    2003-01-01

    The optical system of this invention is a unique type of imaging spectrometer, i.e. an instrument that can determine the spectra of all points in a two-dimensional scene. The general type of imaging spectrometer under which this invention falls has been termed a computed-tomography imaging spectrometer (CTIS). CTIS's have the ability to perform spectral imaging of scenes containing rapidly moving objects or evolving features, hereafter referred to as transient scenes. This invention, a reflective CTIS with a unique two-dimensional reflective grating, can operate in any wavelength band from the ultraviolet through long-wave infrared. Although this spectrometer is especially useful for rapidly occurring events, it is also useful for investigation of some slow moving phenomena as in the life sciences.

  12. Comparison of 2D and 3D Computational Multiphase Fluid Flow Models of Oxygen Lancing of Pyrometallurgical Furnace Tap-Holes

    NASA Astrophysics Data System (ADS)

    Erwee, M. W.; Reynolds, Q. G.; Zietsman, J. H.

    2016-06-01

    Furnace tap-holes vary in design depending on the type of furnace and process involved, but they share one common trait: The tap-hole must be opened and closed periodically. In general, tap-holes are plugged with refractory clay after tapping, thereby stopping the flow of molten material. Once a furnace is ready to be tapped, drilling and/or lancing with oxygen are typically used to remove tap-hole clay from the tap-hole. Lancing with oxygen is an energy-intensive, mostly manual process, which affects the performance and longevity of the tap-hole refractory material as well as the processes inside the furnace. Computational modeling offers an opportunity to gain insight into the possible effects of oxygen lancing on various aspects of furnace operation.

  13. Computed Tomography Imaging Spectrometer (CTIS) with 2D Reflective Grating for Ultraviolet to Long-Wave Infrared Detection Especially Useful for Surveying Transient Events

    NASA Technical Reports Server (NTRS)

    Wilson, Daniel W. (Inventor); Maker, Paul D. (Inventor); Muller, Richard E. (Inventor); Mouroulis, Pantazis Z. (Inventor)

    2003-01-01

    The optical system of this invention is a unique type of imaging spectrometer, i.e. an instrument that can determine the spectra of all points in a two-dimensional scene. The general type of imaging spectrometer under which this invention falls has been termed a computed-tomography imaging spectrometer (CTIS). CTIS's have the ability to perform spectral imaging of scenes containing rapidly moving objects or evolving features, hereafter referred to as transient scenes. This invention, a reflective CTIS with a unique two-dimensional reflective grating, can operate in any wavelength band from the ultraviolet through long-wave infrared. Although this spectrometer is especially useful for rapidly occurring events, it is also useful for investigation of some slow moving phenomena as in the life sciences.

  14. A computer program for the 2-D magnetostatic problem based on integral equations for the field of the conductors and boundary elements

    SciTech Connect

    Morgan, G.H.

    1992-01-01

    This paper reports on the iterative design of the 2-dimensional cross section of a beam transport magnet having infinitely permeable iron boundaries, a task that requires a fast means of computing the field of the conductors. Solutions in the form of series expansions are used for rectangular iron boundaries, and programs based on the method of images are used to simulate circular iron boundaries. A single procedure or program for dealing with an arbitrary iron boundary would be useful. The present program has been tested with rectangular and circular iron boundaries, and provision has been made for the use of other curves. It uses complex contour integral equations for the field of the constant-current-density conductors and complex line integrals for the field of the piecewise-linear boundary elements.
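
    The kind of fast conductor-field evaluation the abstract refers to can be illustrated with the standard complex-variable result for a 2-D line current, By + iBx = mu0*I / (2*pi*(z - z0)), together with the textbook image construction for an infinitely permeable circular iron boundary (an equal-sign image current at the inverse point). The sketch below is an illustration under those assumptions, not the program's contour-integral formulation; the geometry and currents are arbitrary.

# Hedged sketch: complex-variable field of 2-D current filaments plus the
# classic image construction for an infinitely permeable circular iron yoke.
# Illustrative only; not the program's own integral-equation formulation.
import numpy as np

MU0 = 4e-7 * np.pi

def filament_field(z, z_src, current):
    """Return By + i*Bx at points z due to a line current at z_src."""
    return MU0 * current / (2.0 * np.pi * (z - z_src))

def field_with_circular_yoke(z, sources, R):
    """sources: list of (z_src, current); R: iron-bore radius (mu -> infinity).
    Each filament gets an equal-sign image at the inverse point R**2/conj(z_src)."""
    B = np.zeros_like(z, dtype=complex)
    for z_src, I in sources:
        B += filament_field(z, z_src, I)
        B += filament_field(z, R**2 / np.conj(z_src), I)   # image current
    return B

# Example: a crude dipole made of two opposite-sign filaments inside a yoke.
sources = [(0.03 + 0.0j, 1000.0), (-0.03 + 0.0j, -1000.0)]  # metres, amperes
z_obs = np.array([0.0 + 0.0j, 0.01 + 0.01j])
B = field_with_circular_yoke(z_obs, sources, R=0.05)
print("By, Bx at the origin:", B[0].real, B[0].imag)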

  15. Creative Computing with Landlab: Open-Source Python Software for Building and Exploring 2D Models of Earth-Surface Dynamics

    NASA Astrophysics Data System (ADS)

    Tucker, G. E.; Hobley, D. E.; Gasparini, N. M.; Hutton, E.; Istanbulluoglu, E.; Nudurupati, S.; Adams, J. M.

    2013-12-01

    Computer models help us explore the consequences of scientific hypotheses at a level of precision and quantification that is impossible for our unaided minds. The process of writing and debugging the necessary code is often time-consuming, however, and this cost can inhibit progress. The code-development barrier can be especially problematic when a field is rapidly unearthing new data and new ideas, as is presently the case in surface dynamics. To help meet the need for rapid, flexible model development, we have written a prototype software framework for two-dimensional numerical modeling of planetary surface processes. The Landlab software can be used to develop new models from scratch, to create models from existing components, or a combination of the two. Landlab provides a gridding module that allows you to create and configure a model grid in just a few lines of code. Grids can be regular or unstructured, and can readily be used to implement staggered-grid numerical solutions to equations for various types of geophysical flow. The gridding module provides built-in functions for common numerical operations, such as calculating gradients and integrating fluxes around the perimeter of cells. Landlab is written in Python, a high-level language that enables rapid code development and takes advantage of a wealth of libraries for scientific computing and graphical output. Landlab also provides a framework for assembling new models from combinations of pre-built components. This capability is illustrated with several examples, including flood inundation, long-term landscape evolution, impact cratering, post-wildfire erosion, and ecohydrology. Interoperability with the Community Surface Dynamics Modeling System (CSDMS) Model-Coupling Framework allows models created in Landlab to be combined with other CSDMS models, which helps to bring frontier problems in landscape and seascape dynamics within closer theoretical reach.
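
    As a flavor of the workflow described above, the sketch below builds a small raster grid and runs a linear-diffusion hillslope model using Landlab's gradient and flux-divergence functions. It assumes a recent (2.x) Landlab release; exact function names have shifted between versions, so treat the calls as indicative rather than definitive.

# Minimal sketch of a linear-diffusion hillslope model built on Landlab's
# gridding functions, assuming a recent (2.x) release of the library.
import numpy as np
from landlab import RasterModelGrid

grid = RasterModelGrid((40, 60), xy_spacing=10.0)      # 40 x 60 nodes, 10 m spacing
z = grid.add_zeros("topographic__elevation", at="node")
z += np.random.default_rng(0).random(z.size)           # small initial roughness

D, dt = 0.01, 100.0                                    # diffusivity (m2/yr), timestep (yr)
for _ in range(500):
    grad = grid.calc_grad_at_link(z)                   # gradients on links
    flux = -D * grad                                   # downslope sediment flux
    dzdt = -grid.calc_flux_div_at_node(flux)           # flux divergence at nodes
    z[grid.core_nodes] += dzdt[grid.core_nodes] * dt   # update interior nodes only

print("relief after diffusion:", z.max() - z.min())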

  16. Feasibility of Tablet Computer Screening for Opioid Abuse in the Emergency Department

    PubMed Central

    Weiner, Scott G.; Horton, Laura C.; Green, Traci C.; Butler, Stephen F.

    2015-01-01

    Introduction Tablet computer-based screening may have the potential for detecting patients at risk for opioid abuse in the emergency department (ED). Study objectives were a) to determine if the revised Screener and Opioid Assessment for Patients with Pain (SOAPP®-R), a 24-question previously paper-based screening tool for opioid abuse potential, could be administered on a tablet computer to an ED patient population; b) to demonstrate that >90% of patients can complete the electronic screener without assistance in <5 minutes; and c) to determine patient ease of use with screening on a tablet computer. Methods This was a cross-sectional convenience sample study of patients seen in an urban academic ED. SOAPP®-R was programmed on a tablet computer by study investigators. Inclusion criteria were patients ages ≥18 years who were being considered for discharge with a prescription for an opioid analgesic. Exclusion criteria included inability to understand English or physical disability preventing use of the tablet. Results 93 patients were approached for inclusion and 82 (88%) provided consent. Fifty-two percent (n=43) of subjects were male; 46% (n=38) of subjects were between 18–35 years, and 54% (n=44) were >35 years. One hundred percent of subjects completed the screener. Median time to completion was 148 (interquartile range 117.5–184.3) seconds, and 95% (n=78) completed in <5 minutes. 93% (n=76) rated ease of completion as very easy. Conclusions It is feasible to administer a screening tool to a cohort of ED patients on a tablet computer. The screener administration time is minimal and patient ease of use with this modality is high. PMID:25671003

  17. 2-d Finite Element Code Postprocessor

    1996-07-15

    ORION is an interactive program that serves as a postprocessor for the analysis programs NIKE2D, DYNA2D, TOPAZ2D, and CHEMICAL TOPAZ2D. ORION reads binary plot files generated by the two-dimensional finite element codes currently used by the Methods Development Group at LLNL. Contour and color fringe plots of a large number of quantities may be displayed on meshes consisting of triangular and quadrilateral elements. ORION can compute strain measures, interface pressures along slide lines, reaction forces along constrained boundaries, and momentum. ORION has been applied to study the response of two-dimensional solids and structures undergoing finite deformations under a wide variety of large deformation transient dynamic and static problems and heat transfer analyses.

  18. Computation of two-electron screened Coulomb potential integrals in Hylleraas basis sets

    NASA Astrophysics Data System (ADS)

    Jiao, Li Guang; Ho, Yew Kam

    2015-03-01

    The Gegenbauer expansion and Taylor expansion methods are developed to accurately and efficiently calculate the two-electron screened Coulomb potential integrals in Hylleraas basis sets. The combination of these two methods covers the entire parameter space of the integrals, including arbitrary total angular momenta, two-electron configurations, powers of inter-electronic coordinate, and complex screening parameters. Numerical examples are given and comparisons with other computational methods in some restricted situations are made. The present methods can be easily applied to calculate the bound and resonant states of two-electron atoms or exotic three-body systems embedded in the screening environment by using the Hylleraas or Hylleraas-CI basis functions.
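
    For orientation, the expansion underlying the Gegenbauer approach is the standard partial-wave form of the screened Coulomb (Yukawa) interaction; it is quoted below with a common normalization convention for the modified spherical Bessel functions i_l and k_l (the paper's own convention may differ), and it reduces to the familiar Laplace expansion of 1/r12 as the screening parameter goes to zero.

\[
  \frac{e^{-\mu r_{12}}}{r_{12}}
  = \mu \sum_{l=0}^{\infty} (2l+1)\, i_l(\mu r_<)\, k_l(\mu r_>)\, P_l(\cos\theta_{12}),
  \qquad r_< = \min(r_1, r_2), \quad r_> = \max(r_1, r_2),
\]
\[
  \lim_{\mu \to 0}\ \frac{e^{-\mu r_{12}}}{r_{12}}
  = \sum_{l=0}^{\infty} \frac{r_<^{\,l}}{r_>^{\,l+1}}\, P_l(\cos\theta_{12}).
\]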

  19. Computational protein-ligand docking and virtual drug screening with the AutoDock suite.

    PubMed

    Forli, Stefano; Huey, Ruth; Pique, Michael E; Sanner, Michel F; Goodsell, David S; Olson, Arthur J

    2016-05-01

    Computational docking can be used to predict bound conformations and free energies of binding for small-molecule ligands to macromolecular targets. Docking is widely used for the study of biomolecular interactions and mechanisms, and it is applied to structure-based drug design. The methods are fast enough to allow virtual screening of ligand libraries containing tens of thousands of compounds. This protocol covers the docking and virtual screening methods provided by the AutoDock suite of programs, including a basic docking of a drug molecule with an anticancer target, a virtual screen of this target with a small ligand library, docking with selective receptor flexibility, active site prediction and docking with explicit hydration. The entire protocol will require ∼5 h. PMID:27077332
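
    As a concrete anchor, the sketch below drives AutoDock Vina (one member of the AutoDock family) from Python through its command line. The file names and search box are hypothetical, only long-standing Vina options are used, and this is not a substitute for the full protocol described in the paper.

# Hedged sketch: invoking AutoDock Vina from Python via its command line.
# Receptor/ligand file names and the search-box coordinates are hypothetical.
import subprocess

cmd = [
    "vina",
    "--receptor", "receptor.pdbqt",        # prepared target (hypothetical file)
    "--ligand", "ligand.pdbqt",            # prepared ligand (hypothetical file)
    "--center_x", "10.0", "--center_y", "12.5", "--center_z", "-3.0",
    "--size_x", "20", "--size_y", "20", "--size_z", "20",
    "--exhaustiveness", "8",
    "--out", "docked_poses.pdbqt",
]
subprocess.run(cmd, check=True)            # requires the vina binary on PATH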

  20. Increasing chemical space coverage by combining empirical and computational fragment screens.

    PubMed

    Barelier, Sarah; Eidam, Oliv; Fish, Inbar; Hollander, Johan; Figaroa, Francis; Nachane, Ruta; Irwin, John J; Shoichet, Brian K; Siegal, Gregg

    2014-07-18

    Most libraries for fragment-based drug discovery are restricted to 1,000-10,000 compounds, but over 500,000 fragments are commercially available and potentially accessible by virtual screening. Whether this larger set would increase chemotype coverage, and whether a computational screen can pragmatically prioritize them, is debated. To investigate this question, a 1281-fragment library was screened by nuclear magnetic resonance (NMR) against AmpC β-lactamase, and hits were confirmed by surface plasmon resonance (SPR). Nine hits with novel chemotypes were confirmed biochemically with Ki values from 0.2 to low mM. We also computationally docked 290,000 purchasable fragments with chemotypes unrepresented in the empirical library, finding 10 that had Ki values from 0.03 to low mM. Though less novel than those discovered by NMR, the docking-derived fragments filled chemotype holes from the empirical library. Crystal structures of nine of the fragments in complex with AmpC β-lactamase revealed new binding sites and explained the relatively high affinity of the docking-derived fragments. The existence of chemotype holes is likely a general feature of fragment libraries, as calculation suggests that to represent the fragment substructures of even known biogenic molecules would demand a library of over 32,000 fragments. Combining computational and empirical fragment screens enables the discovery of unexpected chemotypes, here by the NMR screen, while capturing chemotypes missing from the empirical library and tailored to the target, with little extra cost in resources. PMID:24807704

  1. Application of a screening method in assessing occupational safety and health of computer workstations.

    PubMed

    Niskanen, Toivo; Lehtelä, Jouni; Länsikallio, Riina

    2014-01-01

    Employers and workers need concrete guidance to plan and implement changes in the ergonomics of computer workstations. The Näppärä method is a screening tool for identifying problems requiring further assessment and corrective actions. The aim of this study was to assess the work of occupational safety and health (OSH) government inspectors who used Näppärä as part of their OSH enforcement inspections (430 assessments) related to computer work. The modifications in workstation ergonomics involved mainly adjustments to the screen, mouse, keyboard, forearm supports, and chair. One output of the assessment is an index indicating the percentage of items in compliance. The method can be considered both an exposure assessment and an ergonomics intervention, and it can serve as a benchmark for the level of workstation ergonomics. Future research could examine whether Näppärä is a suitable instrument for evaluating the effectiveness of participatory ergonomics interventions.

  2. Developing a computer touch-screen interactive colorectal screening decision aid for a low-literacy African American population: lessons learned.

    PubMed

    Bass, Sarah Bauerle; Gordon, Thomas F; Ruzek, Sheryl Burt; Wolak, Caitlin; Ruggieri, Dominique; Mora, Gabriella; Rovito, Michael J; Britto, Johnson; Parameswaran, Lalitha; Abedin, Zainab; Ward, Stephanie; Paranjape, Anuradha; Lin, Karen; Meyer, Brian; Pitts, Khaliah

    2013-07-01

    African Americans have higher colorectal cancer (CRC) mortality than White Americans and yet have lower rates of CRC screening. Increased screening aids in early detection and higher survival rates. Coupled with low literacy rates, the burden of CRC morbidity and mortality is exacerbated in this population, making it important to develop culturally and literacy appropriate aids to help low-literacy African Americans make informed decisions about CRC screening. This article outlines the development of a low-literacy computer touch-screen colonoscopy decision aid using an innovative marketing method called perceptual mapping and message vector modeling. This method was used to mathematically model key messages for the decision aid, which were then used to modify an existing CRC screening tutorial with different messages. The final tutorial was delivered through computer touch-screen technology to increase access and ease of use for participants. Testing showed users were not only more comfortable with the touch-screen technology but were also significantly more willing to have a colonoscopy compared with a "usual care group." Results confirm the importance of including participants in planning and that the use of these innovative mapping and message design methods can lead to significant CRC screening attitude change.

  3. iScreen: world's first cloud-computing web server for virtual screening and de novo drug design based on TCM database@Taiwan.

    PubMed

    Tsai, Tsung-Ying; Chang, Kai-Wei; Chen, Calvin Yu-Chian

    2011-06-01

    The rapidly advancing research on traditional Chinese medicine (TCM) has greatly intrigued pharmaceutical industries worldwide. To take the initiative in the next generation of drug development, we constructed a cloud-computing TCM intelligent screening system (iScreen) based on TCM Database@Taiwan. iScreen is a compact web server for TCM docking followed by customized de novo drug design. We further implemented a protein preparation tool that both extracts the protein of interest from a raw input file and estimates the size of the ligand binding site. In addition, iScreen is designed with a user-friendly graphic interface for users who have less experience with command-line systems. For customized docking, multiple docking services, including standard, in-water, pH-environment, and flexible docking modes, are implemented. Users can download the first 200 TCM compounds from the best docking results. For TCM de novo drug design, iScreen provides multiple molecular descriptors of interest to the user. iScreen is the world's first web server that employs the world's largest TCM database for virtual screening and de novo drug design. We believe our web server can lead TCM research to a new era of drug development. The TCM docking and screening server is available at http://iScreen.cmu.edu.tw/. PMID:21647737

  4. A brief measure of Smokers' knowledge of lung cancer screening with low-dose computed tomography.

    PubMed

    Lowenstein, Lisa M; Richards, Vincent F; Leal, Viola B; Housten, Ashley J; Bevers, Therese B; Cantor, Scott B; Cinciripini, Paul M; Cofta-Woerpel, Ludmila M; Escoto, Kamisha H; Godoy, Myrna C B; Linder, Suzanne K; Munden, Reginald F; Volk, Robert J

    2016-12-01

    We describe the development and psychometric properties of a new, brief measure of smokers' knowledge of lung cancer screening with low-dose computed tomography (LDCT). Content experts identified key facts smokers should know in making an informed decision about lung cancer screening. Sample questions were drafted and iteratively refined based on feedback from content experts and cognitive testing with ten smokers. The resulting 16-item knowledge measure was completed by 108 heavy smokers in Houston, Texas, recruited from 12/2014 to 09/2015. Item difficulty, item discrimination, internal consistency and test-retest reliability were assessed. Group differences based upon education levels and smoking history were explored. Several items were dropped due to ceiling effects or overlapping constructs, resulting in a 12-item knowledge measure. Additional items with high item uncertainty were retained because of their importance in informed decision making about lung cancer screening. Internal consistency reliability of the final scale was acceptable (KR-20 = 0.66) and test-retest reliability of the overall scale was 0.84 (intraclass correlation). Knowledge scores differed across education levels (F = 3.36, p = 0.04), while no differences were observed between current and former smokers (F = 1.43, p = 0.24) or among participants who met or did not meet the 30-pack-year screening eligibility criterion (F = 0.57, p = 0.45). The new measure provides a brief, valid and reliable indicator of smokers' knowledge of key concepts central to making an informed decision about lung cancer screening with LDCT, and can be part of a broader assessment of the quality of smokers' decision making about lung cancer screening. PMID:27512650
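
    The KR-20 statistic quoted above is straightforward to compute from dichotomous item responses; the sketch below shows the calculation on invented data.

# Illustrative sketch of the KR-20 internal-consistency statistic reported in
# the abstract, computed on a small synthetic set of 0/1 item responses.
import numpy as np

def kr20(responses):
    """responses: (n_respondents, n_items) array of 0/1 answers."""
    k = responses.shape[1]
    p = responses.mean(axis=0)                       # proportion correct per item
    q = 1.0 - p
    total_var = responses.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1.0)) * (1.0 - (p * q).sum() / total_var)

rng = np.random.default_rng(0)
ability = rng.normal(size=(200, 1))
items = rng.normal(size=(1, 12))
data = (rng.random((200, 12)) < 1 / (1 + np.exp(-(ability - items)))).astype(int)
print("KR-20 =", round(kr20(data), 3))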

  5. Can computer-aided diagnosis (CAD) help radiologists find mammographically missed screening cancers?

    NASA Astrophysics Data System (ADS)

    Nishikawa, Robert M.; Giger, Maryellen L.; Schmidt, Robert A.; Papaioannou, John

    2001-06-01

    We present data from a pilot observer study whose goal is to design a study to test the hypothesis that computer-aided diagnosis (CAD) can improve radiologists' performance in reading screening mammograms. In a prospective evaluation of our computer detection schemes, we have analyzed over 12,000 clinical exams. Retrospective review of the negative screening mammograms for all cancer cases found an indication of the cancer in 23 of these negative cases. The computer found 54% of these in our prospective testing. We added to these cases normal exams to create a dataset of 75 cases. Four radiologists experienced in mammography read the cases and gave their BI-RADS assessment and their confidence that the patient should be called back for diagnostic mammography. They did so once reading the films only and a second time reading with the computer aid. Three radiologists had no change in area under the ROC curve (mean Az of 0.73) and one improved from 0.73 to 0.78, but this difference failed to reach statistical significance (p = 0.23). These data are being used to plan a larger, more powerful study.

  6. JAC2D: A two-dimensional finite element computer program for the nonlinear quasi-static response of solids with the conjugate gradient method; Yucca Mountain Site Characterization Project

    SciTech Connect

    Biffle, J.H.; Blanford, M.L.

    1994-05-01

    JAC2D is a two-dimensional finite element program designed to solve quasi-static nonlinear mechanics problems. A set of continuum equations describes the nonlinear mechanics involving large rotation and strain. A nonlinear conjugate gradient method is used to solve the equations. The method is implemented in a two-dimensional setting with various methods for accelerating convergence. Sliding interface logic is also implemented. A four-node Lagrangian uniform strain element is used with hourglass stiffness to control the zero-energy modes. This report documents the elastic and isothermal elastic/plastic material model. Other material models, documented elsewhere, are also available. The program is vectorized for efficient performance on Cray computers. Sample problems described are the bending of a thin beam, the rotation of a unit cube, and the pressurization and thermal loading of a hollow sphere.
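
    To illustrate the solver family, the sketch below implements a generic nonlinear conjugate-gradient iteration (Polak-Ribiere direction update with a backtracking line search) on a small test function. It is only an illustration of the method class, not JAC2D's algorithm or code.

# Generic sketch of a nonlinear conjugate-gradient iteration; illustrative of
# the solver class used by JAC2D, not the program's own implementation.
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=200):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        # Backtracking line search on the Armijo condition.
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        if np.linalg.norm(g_new) < tol:
            return x_new
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # Polak-Ribiere+
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the Rosenbrock function.
f = lambda v: (1 - v[0])**2 + 100 * (v[1] - v[0]**2)**2
grad = lambda v: np.array([-2 * (1 - v[0]) - 400 * v[0] * (v[1] - v[0]**2),
                           200 * (v[1] - v[0]**2)])
print(nonlinear_cg(f, grad, [-1.2, 1.0]))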

  7. Identification of Serine Conformers by Matrix-Isolation IR Spectroscopy Aided by Near-Infrared Laser-Induced Conformational Change, 2D Correlation Analysis, and Quantum Mechanical Anharmonic Computations.

    PubMed

    Najbauer, Eszter E; Bazsó, Gábor; Apóstolo, Rui; Fausto, Rui; Biczysko, Malgorzata; Barone, Vincenzo; Tarczay, György

    2015-08-20

    The conformers of α-serine were investigated by matrix-isolation IR spectroscopy combined with NIR laser irradiation. This method, aided by 2D correlation analysis, enabled unambiguously grouping the spectral lines to individual conformers. On the basis of comparison of at least nine experimentally observed vibrational transitions of each conformer with empirically scaled (SQM) and anharmonic (GVPT2) computed IR spectra, six conformers were identified. In addition, the presence of at least one more conformer in Ar matrix was proved, and a short-lived conformer with a half-life of (3.7 ± 0.5) × 10³ s in N2 matrix was generated by NIR irradiation. The analysis of the NIR laser-induced conversions revealed that the excitation of the stretching overtone of both the side chain and the carboxylic OH groups can effectively promote conformational changes, but remarkably different paths were observed for the two kinds of excitations. PMID:26201050

  8. High divergent 2D grating

    NASA Astrophysics Data System (ADS)

    Wang, Jin; Ma, Jianyong; Zhou, Changhe

    2014-11-01

    A 3×3 highly divergent 2D grating with a period of 3.842 μm at a wavelength of 850 nm under normal incidence is designed and fabricated in this paper. This highly divergent 2D grating is designed using vector diffraction theory. Rigorous coupled-wave analysis (RCWA) in combination with simulated annealing (SA) is adopted to calculate and optimize the 2D grating. The properties of this grating are also investigated by RCWA. The diffraction angles are more than 10 degrees over the whole wavelength band, which is larger than for traditional 2D gratings. In addition, the small period of the grating increases the difficulty of fabrication, so we fabricate the 2D gratings by direct laser writing (DLW) instead of traditional manufacturing methods. The method of ICP etching is then used to obtain the highly divergent 2D grating.
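
    The quoted divergence is consistent with the scalar grating equation sin(theta_m) = m*lambda/period; the few lines below reproduce a first-order angle of roughly 12.8 degrees, which is indeed above 10 degrees. This is only a consistency check, not the RCWA design calculation.

# Quick consistency check with the scalar grating equation; not the RCWA design.
import math

wavelength = 0.850   # micrometres
period = 3.842       # micrometres
for m in (1, 2, 3):
    s = m * wavelength / period
    if s <= 1.0:
        print(f"order {m}: {math.degrees(math.asin(s)):.1f} deg")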

  9. Computer-Based Interview for Screening Blood Donors for Risk of HIV Transmission

    PubMed Central

    Locke, S.E.; Kowaloff, H.; Safran, C.; Slack, W.V.; Cotton, D.; Hoff, R.; Popovsky, M.; McGuff, J.; Page, P.

    1990-01-01

    Concern about the safety of the nation's blood supply continues to grow because of the expanding number of HIV-infected persons in the potential donor pool. Furthermore, the proportion of HIV-infected persons who engage in high-risk activities but who test seronegative may be higher than previously recognized. Despite improvements in HIV testing, it is doubtful that such testing alone will ever be adequate to eliminate transfusion-associated AIDS. Blood donation by recently infected persons must be reduced through improved donor screening, including direct questioning of donors about high-risk behaviors. We have developed a computer-based interview that queries blood donors about factors that increase the risk of HIV transmission via blood donation. The interview was administered to 64 donors during a scheduled rest period after they completed their blood donation. The interview required about nine minutes to complete. Results were analyzed to determine donor reactions to the interview. Subjects enjoyed taking the interview, thought it was a good method for screening donors, and trusted the confidentiality of the interview. Donors believed they would be more honest with the computer interview than with a human interviewer. If automated blood donor screening helps to discourage donation by high-risk persons, the rate of transfusion-associated AIDS will be reduced.

  10. Does patient time spent viewing computer-tailored colorectal cancer screening materials predict patient-reported discussion of screening with providers?

    PubMed

    Sanders, Mechelle; Fiscella, Kevin; Veazie, Peter; Dolan, James G; Jerant, Anthony

    2016-08-01

    The main aim is to examine whether patients' viewing time on information about colorectal cancer (CRC) screening before a primary care physician (PCP) visit is associated with discussion of screening options during the visit. We analyzed data from a multi-center randomized controlled trial of a tailored interactive multimedia computer program (IMCP) to activate patients to undergo CRC screening, deployed in primary care offices immediately before a visit. We employed usage time information stored in the IMCP to examine the association of patient time spent using the program with patient-reported discussion of screening during the visit, adjusting for previous CRC screening recommendation and reading speed. On average, patients spent 33 minutes on the program. In adjusted analyses, 30 minutes spent using the program was associated with a 41% increase in the odds of the patient having a discussion with their PCP (95% CI 1.04 to 1.59). In a separate analysis of the tailoring modules, the modules encouraging adherence to the tailored screening recommendation and discussion with the patient's PCP yielded significant results. Other predictors of screening discussion included better self-reported physical health and increased patient activation. Time spent on the program predicted greater patient-physician discussion of screening during a linked visit. Usage time information gathered automatically by IMCPs offers promise for objectively assessing patient engagement around a topic and predicting likelihood of discussion between patients and their clinician. PMID:27343254

  11. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing.

    PubMed

    Fang, Ye; Ding, Yun; Feinstein, Wei P; Koppelman, David M; Moreno, Juana; Jarrell, Mark; Ramanujam, J; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249.

  12. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing

    PubMed Central

    Fang, Ye; Ding, Yun; Feinstein, Wei P.; Koppelman, David M.; Moreno, Juana; Jarrell, Mark; Ramanujam, J.; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249. PMID:27420300

  14. Traditional and computer-based screening and diagnosis of reading disabilities in Greek.

    PubMed

    Protopapas, Athanassios; Skaloumbakas, Christos

    2007-01-01

    In this study, we examined the characteristics of reading disability (RD) in the seventh grade of the Greek educational system and the corresponding diagnostic practice. We presented a clinically administered assessment battery, composed of typically employed tasks, and a fully automated, computer-based assessment battery that evaluates some of the same constructs. In all, 261 children ages 12 to 14 were tested. The results of the traditional assessment indicated that RD concerns primarily slow reading and secondarily poor reading and spelling accuracy. This pattern was matched in the domains most attended to in expert student evaluation. Automatic (computer-based) screening for RD in the target age range matched expert judgment in validity and reliability in the absence of a full clinical evaluation. It is proposed that the educational needs of the middle and high school population in Greece will be best served by concentrating on reading and spelling performance, particularly fluency, employing widespread computer-based screening to partially make up for the shortage of expert personnel. PMID:17274545

  15. 2D microwave imaging reflectometer electronics

    SciTech Connect

    Spear, A. G.; Domier, C. W.; Hu, X.; Muscatello, C. M.; Ren, X.; Luhmann, N. C.; Tobias, B. J.

    2014-11-15

    A 2D microwave imaging reflectometer system has been developed to visualize electron density fluctuations on the DIII-D tokamak. Simultaneously illuminated at four probe frequencies, large aperture optics image reflections from four density-dependent cutoff surfaces in the plasma over an extended region of the DIII-D plasma. Localized density fluctuations in the vicinity of the plasma cutoff surfaces modulate the plasma reflections, yielding a 2D image of electron density fluctuations. Details are presented of the receiver down-conversion electronics that generate the in-phase (I) and quadrature (Q) reflectometer signals from which 2D density fluctuation data are obtained. Also presented are details on the control system and backplane used to manage the electronics as well as an introduction to the computer-based control program.
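
    The I/Q generation that the down-conversion electronics perform in hardware can be mimicked in software for intuition: mix the intermediate-frequency signal with quadrature local oscillators, low-pass filter, and recover the phase. The sketch below does this on a synthetic signal; the frequencies and filter choices are assumptions, and it is not a model of the actual DIII-D electronics.

# Simplified software illustration of in-phase/quadrature (I/Q) demodulation;
# the synthetic signal, frequencies, and filter are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1.0e6                       # sample rate (Hz)
f_if = 50.0e3                    # intermediate frequency (Hz)
t = np.arange(0, 5e-3, 1 / fs)

# Synthetic IF signal whose phase carries the fluctuation information.
phase = 0.4 * np.sin(2 * np.pi * 2.0e3 * t)
sig = np.cos(2 * np.pi * f_if * t + phase)

# Mix with quadrature local oscillators, then low-pass filter.
b, a = butter(4, 10.0e3 / (fs / 2))
I = filtfilt(b, a, sig * np.cos(2 * np.pi * f_if * t)) * 2
Q = filtfilt(b, a, sig * -np.sin(2 * np.pi * f_if * t)) * 2

recovered = np.unwrap(np.arctan2(Q, I))
print("max phase error (rad):", np.abs(recovered - phase).max())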

  16. Implementing low-dose computed tomography screening for lung cancer in Canada: implications of alternative at-risk populations, screening frequency, and duration

    PubMed Central

    Evans, W.K.; Flanagan, W.M.; Miller, A.B.; Goffin, J.R.; Memon, S.; Fitzgerald, N.; Wolfson, M.C.

    2016-01-01

    Background Low-dose computed tomography (LDCT) screening has been shown to reduce mortality from lung cancer; however, the optimal screening duration and “at risk” population are not known. Methods The Cancer Risk Management Model developed by Statistics Canada for the Canadian Partnership Against Cancer includes a lung screening module based on data from the U.S. National Lung Screening Trial (NLST). The base-case scenario reproduces NLST outcomes with high fidelity. The impact in Canada of annual screening on the number of incident cases and life-years gained, with a wider range of age and smoking history eligibility criteria and varied participation rates, was modelled to show the magnitude of clinical benefit nationally and by province. Life-years gained, costs (discounted and undiscounted), and resource requirements were also estimated. Results In 2014, 1.4 million Canadians were eligible for screening according to NLST criteria. Over 10 years, screening would detect 12,500 more lung cancers than the expected 268,300 and would gain 9200 life-years. The computed tomography imaging requirement of 24,000–30,000 at program initiation would rise to between 87,000 and 113,000 by the 5th year of an annual NLST-like screening program. Costs would increase from approximately $75 million to $128 million at 10 years, and the cumulative cost nationally over 10 years would approach $1 billion, partially offset by a reduction in the costs of managing advanced lung cancer. Conclusions Modelling various ways in which LDCT might be implemented provides decision-makers with estimates of the effect on clinical benefit and on resource needs that clinical trial results are unable to provide. PMID:27330355

  17. Lung cancer screening beyond low-dose computed tomography: the role of novel biomarkers.

    PubMed

    Hasan, Naveed; Kumar, Rohit; Kavuru, Mani S

    2014-10-01

    Lung cancer is the most common and lethal malignancy in the world. The landmark National Lung Screening Trial (NLST) showed a 20% relative reduction in mortality in high-risk individuals screened with low-dose computed tomography. However, the poor specificity and low prevalence of lung cancer in the NLST provide major limitations to its widespread use. Furthermore, a lung nodule on CT scan requires a nuanced and individualized approach towards management. In this regard, advances in high-throughput technologies (molecular diagnostics, multi-gene chips, proteomics, and bronchoscopic techniques) have led to the discovery of lung cancer biomarkers that have shown potential to complement the current screening standards. Early detection of lung cancer can be achieved by analysis of biomarkers from tissue samples within the respiratory tract, such as sputum, saliva, nasal/bronchial airway epithelial cells and exhaled breath condensate, or through peripheral biofluids such as blood, serum and urine. Autofluorescence bronchoscopy has been employed in research settings to identify pre-invasive lesions not identified on CT scan. Although these modalities are not yet commercially available in the clinical setting, they will be available in the near future, and clinicians who care for patients with lung cancer should be aware of them. In this review, we present the up-to-date state of biomarker development, discuss their clinical relevance and predict their future role in lung cancer management.

  18. ATARiS: Computational quantification of gene suppression phenotypes from multisample RNAi screens

    PubMed Central

    Shao, Diane D.; Tsherniak, Aviad; Gopal, Shuba; Weir, Barbara A.; Tamayo, Pablo; Stransky, Nicolas; Schumacher, Steven E.; Zack, Travis I.; Beroukhim, Rameen; Garraway, Levi A.; Margolin, Adam A.; Root, David E.; Hahn, William C.; Mesirov, Jill P.

    2013-01-01

    Genome-scale RNAi libraries enable the systematic interrogation of gene function. However, the interpretation of RNAi screens is complicated by the observation that RNAi reagents designed to suppress the mRNA transcripts of the same gene often produce a spectrum of phenotypic outcomes due to differential on-target gene suppression or perturbation of off-target transcripts. Here we present a computational method, Analytic Technique for Assessment of RNAi by Similarity (ATARiS), that takes advantage of patterns in RNAi data across multiple samples in order to enrich for RNAi reagents whose phenotypic effects relate to suppression of their intended targets. By summarizing only such reagent effects for each gene, ATARiS produces quantitative, gene-level phenotype values, which provide an intuitive measure of the effect of gene suppression in each sample. This method is robust for data sets that contain as few as 10 samples and can be used to analyze screens of any number of targeted genes. We used this analytic approach to interrogate RNAi data derived from screening more than 100 human cancer cell lines and identified HNF1B as a transforming oncogene required for the survival of cancer cells that harbor HNF1B amplifications. ATARiS is publicly available at http://broadinstitute.org/ataris. PMID:23269662

  19. Attitudes and Beliefs of Primary Care Providers in New Mexico About Lung Cancer Screening Using Low-Dose Computed Tomography

    PubMed Central

    Hoffman, Richard M.; Sussman, Andrew L.; Getrich, Christina M.; Rhyne, Robert L.; Crowell, Richard E.; Taylor, Kathryn L.; Reifler, Ellen J.; Wescott, Pamela H.; Murrietta, Ambroshia M.; Saeed, Ali I.

    2015-01-01

    Introduction On the basis of results from the National Lung Screening Trial (NLST), national guidelines now recommend using low-dose computed tomography (LDCT) to screen high-risk smokers for lung cancer. Our study objective was to characterize the knowledge, attitudes, and beliefs of primary care providers about implementing LDCT screening. Methods We conducted semistructured interviews with primary care providers practicing in New Mexico clinics for underserved minority populations. The interviews, conducted from February through September 2014, focused on providers’ tobacco cessation efforts, lung cancer screening practices, perceptions of NLST and screening guidelines, and attitudes about informed decision making for cancer screening. Investigators iteratively reviewed transcripts to create a coding structure. Results We reached thematic saturation after interviewing 10 providers practicing in 6 urban and 4 rural settings; 8 practiced at federally qualified health centers. All 10 providers promoted smoking cessation, some screened with chest x-rays, and none screened with LDCT. Not all were aware of NLST results or current guideline recommendations. Providers viewed study results skeptically, particularly the 95% false-positive rate, the need to screen 320 patients to prevent 1 lung cancer death, and the small proportion of minority participants. Providers were uncertain whether New Mexico had the necessary infrastructure to support high-quality screening, and worried about access barriers and financial burdens for rural, underinsured populations. Providers noted the complexity of discussing benefits and harms of screening and surveillance with their patient population. Conclusion Providers have several concerns about the feasibility and appropriateness of implementing LDCT screening. Effective lung cancer screening programs will need to educate providers and patients to support informed decision making and to ensure that high-quality screening can be

  20. Ultrafast 2D IR microscopy

    PubMed Central

    Baiz, Carlos R.; Schach, Denise; Tokmakoff, Andrei

    2014-01-01

    We describe a microscope for measuring two-dimensional infrared (2D IR) spectra of heterogeneous samples with μm-scale spatial resolution, sub-picosecond time resolution, and the molecular structure information of 2D IR, enabling the measurement of vibrational dynamics through correlations in frequency, time, and space. The setup is based on a fully collinear “one beam” geometry in which all pulses propagate along the same optics. Polarization, chopping, and phase cycling are used to isolate the 2D IR signals of interest. In addition, we demonstrate the use of vibrational lifetime as a contrast agent for imaging microscopic variations in molecular environments. PMID:25089490

  1. Discovery of new [Formula: see text] proteasome inhibitors using a knowledge-based computational screening approach.

    PubMed

    Mehra, Rukmankesh; Chib, Reena; Munagala, Gurunadham; Yempalla, Kushalava Reddy; Khan, Inshad Ali; Singh, Parvinder Pal; Khan, Farrah Gul; Nargotra, Amit

    2015-11-01

    Mycobacterium tuberculosis bacteria cause deadly infections in patients. The rise of multidrug resistance associated with tuberculosis further complicates treatment of the disease. The M. tuberculosis proteasome is necessary for the pathogenesis of the bacterium and has been validated as an anti-tubercular target, making it an attractive enzyme for designing Mtb inhibitors. In this study, a computational screening approach was applied to identify new proteasome inhibitor candidates from a library of 50,000 compounds. This chemical library was procured from the ChemBridge (20,000 compounds) and the ChemDiv (30,000 compounds) databases. After a detailed analysis of the computational screening results, 50 in silico hits were retrieved and tested in vitro, yielding 15 compounds with [Formula: see text] values ranging from 35.32 to 64.15 μM on lysate. A structural analysis of these hits revealed that 14 of these compounds probably have a non-covalent mode of binding to the target and have not previously been reported for anti-tubercular or anti-proteasome activity. The binding interactions of all 14 protein-inhibitor complexes were analyzed using molecular docking studies. Further, molecular dynamics simulations of the protein in complex with the two most promising hits were carried out to identify the key interactions and validate the structural stability.

  2. Computer-Delivered Screening and Brief Intervention for Alcohol Use in Pregnancy: A Pilot Randomized Trial

    PubMed Central

    Ondersma, Steven J.; Beatty, Jessica R.; Svikis, Dace S.; Strickler, Ronald C.; Tzilos, Golfo K.; Chang, Grace; Divine, W.; Taylor, Andrew R.; Sokol, Robert J.

    2015-01-01

    Background Although screening and brief intervention (SBI) for unhealthy alcohol use has demonstrated efficacy in some trials, its implementation has been limited. Technology-delivered approaches are a promising alternative, particularly during pregnancy when the importance of alcohol use is amplified. The present trial evaluated the feasibility and acceptability of an interactive, empathic, video-enhanced, and computer-delivered SBI (e-SBI) plus three separate tailored mailings, and estimated intervention effects. Methods We recruited 48 pregnant women who screened positive for alcohol risk at an urban prenatal care clinic. Participants were randomly assigned to the e-SBI plus mailings or to a control session on infant nutrition, and were reevaluated during their postpartum hospitalization. The primary outcome was 90-day period-prevalence abstinence as measured by timeline follow-back interview. Results Participants rated the intervention as easy to use and helpful (4.7-5.0 on a 5-point scale). Blinded follow-up evaluation at childbirth revealed medium-size intervention effects on 90-day period prevalence abstinence (OR = 3.4); similarly, intervention effects on a combined healthy pregnancy outcome variable (live birth, normal birthweight, and no NICU stay) were also of moderate magnitude in favor of e-SBI participants (OR=3.3). As expected in this intentionally under-powered pilot trial, these effects were non-significant (p = .19 and .09, respectively). Conclusions This pilot trial demonstrated the acceptability and preliminary efficacy of a computer-delivered screening and brief intervention (e-SBI) plus tailored mailings for alcohol use in pregnancy. These findings mirror the promising results of other trials using a similar approach, and should be confirmed in a fully-powered trial. PMID:26010235

  3. Computational screening and design of new materials for energy storage and conversion: batteries and thermoelectrics

    NASA Astrophysics Data System (ADS)

    Kozinsky, Boris

    2015-03-01

    Understanding the atomic-level origins of thermoelectricity is necessary for the design of higher-performing materials, and we demonstrate that ab-initio computation is a valuable tool. By developing and using advanced methods to compute intrinsic contribution to electron lifetimes from electron-phonon coupling, we are able to predict temperature and doping dependence of electronic transport properties in doped semiconductors. We combine these tools to perform rapid screening of new thermoelectric compositions. In energy storage, a promising path to enabling safe high-energy-density batteries is the introduction of inorganic solid electrolytes that can protect the Li-metal anode. We have achieved a detailed understanding of a promising class of garnet compounds by developing a set of efficient atomistic computational techniques to analyze structure ordering and ionic transport mechanisms. These methods allow us to map the transport phase diagram of a broad range of compositions and to predict new phases and phase transitions. The computational techniques are coupled with a novel software platform AiiDA that combines high-throughput automation with data analysis capabilities.

  4. Estimating development cost for a tailored interactive computer program to enhance colorectal cancer screening compliance.

    PubMed

    Lairson, David R; Chang, Yu-Chia; Bettencourt, Judith L; Vernon, Sally W; Greisinger, Anthony

    2006-01-01

    The authors used an actual-work estimate method to estimate the cost of developing a tailored interactive computer education program to improve compliance with colorectal cancer screening guidelines in a large multi-specialty group medical practice. Resource use was prospectively collected from time logs, administrative records, and a design and computing subcontract. Sensitivity analysis was performed to examine the uncertainty of the overhead cost rate and other parameters. The cost of developing the system was $328,866. The development cost was $52.79 per patient when amortized over a 7-year period with a cohort of 1,000 persons. About 20% of the cost was incurred in defining the theoretic framework and supporting literature, constructing the variables and survey, and conducting focus groups. About 41% of the cost was for developing the messages, algorithms, and constructing program elements, and the remaining cost was to create and test the computer education program. About 69% of the cost was attributable to personnel expenses. Development cost is rarely estimated but is important for feasibility studies and ex-ante economic evaluations of alternative interventions. The findings from this study may aid decision makers in planning, assessing, budgeting, and pricing development of tailored interactive computer-based interventions. PMID:16799126
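
    The per-patient figure is consistent with treating the development cost as a 7-year annuity and dividing by the 1,000-person cohort, provided an annual discount rate of about 3% is assumed; the rate is our inference, not a number stated in the abstract.

# Back-of-the-envelope check: amortizing the development cost as a 7-year
# annuity reproduces the reported per-patient figure when an annual discount
# rate of roughly 3% is assumed (the rate is not stated in the abstract).
total_cost = 328_866
years, cohort, rate = 7, 1000, 0.03

annuity_factor = rate / (1 - (1 + rate) ** -years)       # capital recovery factor
annualized = total_cost * annuity_factor
print(f"cost per patient: ${annualized / cohort:.2f}")   # ~ $52.78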

  5. Comparison of different strategies in prenatal screening for Down’s syndrome: cost effectiveness analysis of computer simulation

    PubMed Central

    Gagné, Geneviève; Bujold, Emmanuel; Douillard, Daniel; Forest, Jean-Claude; Reinharz, Daniel; Rousseau, François

    2009-01-01

    Objectives To assess and compare the cost effectiveness of three different strategies for prenatal screening for Down’s syndrome (integrated test, sequential screening, and contingent screenings) and to determine the most useful cut-off values for risk. Design Computer simulations to study integrated, sequential, and contingent screening strategies with various cut-offs leading to 19 potential screening algorithms. Data sources The computer simulation was populated with data from the Serum Urine and Ultrasound Screening Study (SURUSS), real unit costs for healthcare interventions, and a population of 110 948 pregnancies from the province of Québec for the year 2001. Main outcome measures Cost effectiveness ratios, incremental cost effectiveness ratios, and screening options’ outcomes. Results The contingent screening strategy dominated all other screening options: it had the best cost effectiveness ratio ($C26 833 per case of Down’s syndrome) with fewer procedure related euploid miscarriages and unnecessary terminations (respectively, 6 and 16 per 100 000 pregnancies). It also outperformed serum screening at the second trimester. In terms of the incremental cost effectiveness ratio, contingent screening was still dominant: compared with screening based on maternal age alone, the savings were $C30 963 per additional birth with Down’s syndrome averted. Contingent screening was the only screening strategy that offered early reassurance to the majority of women (77.81%) in first trimester and minimised costs by limiting retesting during the second trimester (21.05%). For the contingent and sequential screening strategies, the choice of cut-off value for risk in the first trimester test significantly affected the cost effectiveness ratios (respectively, from $C26 833 to $C37 260 and from $C35 215 to $C45 314 per case of Down’s syndrome), the number of procedure related euploid miscarriages (from 6 to 46 and from 6 to 45 per 100 000

  6. AnisWave 2D

    2004-08-01

    AnisWave2D is a 2D finite-difference code for simulating seismic wave propagation in fully anisotropic materials. The code is implemented to run in parallel over multiple processors and is fully portable. A mesh refinement algorithm has been utilized to allow the grid spacing to be tailored to the velocity model, avoiding the over-sampling of high-velocity materials that usually occurs in fixed-grid schemes.
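    For readers unfamiliar with finite-difference wave codes, the sketch below shows the basic explicit time-stepping idea on a fixed grid for the much simpler 2D scalar (acoustic) wave equation; it is purely illustrative and is not drawn from AnisWave2D, which solves the fully anisotropic elastic equations with mesh refinement.

        import numpy as np

        # Illustrative 2D scalar wave propagation with a second-order leapfrog
        # update; grid size, velocity model, and source are arbitrary choices.
        nx, nz, dx, dt, nt = 200, 200, 5.0, 5e-4, 500
        c = np.full((nz, nx), 2000.0)      # velocity model (m/s)
        c[nz // 2:, :] = 3500.0            # a faster lower half-space

        p_prev = np.zeros((nz, nx))
        p_curr = np.zeros((nz, nx))
        p_curr[nz // 4, nx // 2] = 1.0     # impulsive point source

        for _ in range(nt):
            lap = (np.roll(p_curr, 1, 0) + np.roll(p_curr, -1, 0) +
                   np.roll(p_curr, 1, 1) + np.roll(p_curr, -1, 1) - 4.0 * p_curr) / dx**2
            p_next = 2.0 * p_curr - p_prev + (c * dt) ** 2 * lap
            p_prev, p_curr = p_curr, p_next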

  7. Roton Excitations and the Fluid-Solid Phase Transition in Superfluid 2D Yukawa Bosons

    NASA Astrophysics Data System (ADS)

    Molinelli, S.; Galli, D. E.; Reatto, L.; Motta, M.

    2016-10-01

    We compute several ground-state properties and the dynamical structure factor of a zero-temperature system of Bosons interacting with the 2D screened Coulomb (2D-SC) potential. We resort to the exact shadow path integral ground state (SPIGS) quantum Monte Carlo method to compute the imaginary-time correlation function of the model, and to the genetic algorithm via falsification of theories (GIFT) to retrieve the dynamical structure factor. We provide a detailed comparison of ground-state properties and collective excitations of 2D-SC and ^4He atoms. The roton energy of the 2D-SC system is an increasing function of density, and not a decreasing one as in ^4He. This result is in contrast with the view that the roton is the soft mode of the fluid-solid transition. We uncover a remarkable quasi-universality of backflow and of other properties when expressed in terms of the amount of short-range order as quantified by the height of the first peak of the static structure factor.

  8. Roton Excitations and the Fluid-Solid Phase Transition in Superfluid 2D Yukawa Bosons

    NASA Astrophysics Data System (ADS)

    Molinelli, S.; Galli, D. E.; Reatto, L.; Motta, M.

    2016-05-01

    We compute several ground-state properties and the dynamical structure factor of a zero-temperature system of Bosons interacting with the 2D screened Coulomb (2D-SC) potential. We resort to the exact shadow path integral ground state (SPIGS) quantum Monte Carlo method to compute the imaginary-time correlation function of the model, and to the genetic algorithm via falsification of theories (GIFT) to retrieve the dynamical structure factor. We provide a detailed comparison of ground-state properties and collective excitations of 2D-SC and ^4 He atoms. The roton energy of the 2D-SC system is an increasing function of density, and not a decreasing one as in ^4 He. This result is in contrast with the view that the roton is the soft mode of the fluid-solid transition. We uncover a remarkable quasi-universality of backflow and of other properties when expressed in terms of the amount of short-range order as quantified by the height of the first peak of the static structure factor.

  9. Computer-aided diagnostics of screening mammography using content-based image retrieval

    NASA Astrophysics Data System (ADS)

    Deserno, Thomas M.; Soiron, Michael; de Oliveira, Júlia E. E.; de A. Araújo, Arnaldo

    2012-03-01

    Breast cancer is one of the main causes of death among women in occidental countries. In recent years, screening mammography has been established worldwide for early detection of breast cancer, and computer-aided diagnostics (CAD) is being developed to assist physicians reading mammograms. A promising method for CAD is content-based image retrieval (CBIR). Recently, we have developed a classification scheme for suspicious tissue patterns based on the support vector machine (SVM). In this paper, we continue moving towards automatic CAD of screening mammography. The experiments are based on a total of 10,509 radiographs that have been collected from different sources. Of these, 3,375 images have one chain-code annotation of cancerous regions and 430 have more than one. In different experiments, this data is divided into 12 and 20 classes, distinguishing between four categories of tissue density, three categories of pathology, and, in the 20-class problem, two categories of different types of lesions. Balancing the number of images in each class yields 233 and 45 images remaining in each of the 12 and 20 classes, respectively. Using a two-dimensional principal component analysis, features are extracted from small patches of 128 x 128 pixels and classified by means of an SVM. Overall, the accuracy of the raw classification was 61.6% and 52.1% for the 12- and the 20-class problem, respectively. The confusion matrices are assessed for detailed analysis. Furthermore, an implementation of an SVM-based CBIR system for CADx in screening mammography is presented. In conclusion, with a smarter patch extraction, the CBIR approach might reach precision rates that are helpful for physicians. This, however, needs more comprehensive evaluation on clinical data.
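    The patch-classification pipeline described above (dimensionality reduction on fixed-size patches followed by an SVM) can be sketched as follows; synthetic data stands in for the mammogram patches, and ordinary PCA is used in place of the paper's two-dimensional PCA.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        # Illustrative stand-in for the PCA + SVM patch classifier: random
        # "patches" and labels replace the real 128 x 128 mammogram patches.
        rng = np.random.default_rng(0)
        patches = rng.normal(size=(500, 128 * 128))   # 500 flattened patches
        labels = rng.integers(0, 12, size=500)        # 12-class problem

        clf = make_pipeline(PCA(n_components=50), SVC(kernel="rbf", C=1.0))
        print("CV accuracy:", cross_val_score(clf, patches, labels, cv=5).mean())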

  10. The Role of Screening Sinus Computed Tomography in Pediatric Hematopoietic Stem Cell Transplant Patients

    PubMed Central

    Zamora, Carlos A.; Oppenheimer, Avi G.; Dave, Hema; Symons, Heather; Huisman, Thierry A. G. M.; Izbudak, Izlem

    2015-01-01

    Objective The objective of this study was to evaluate pretransplant sinus computed tomography (CT) as a predictor of post–hematopoietic stem cell transplant sinusitis. Methods We evaluated pretransplant and posttransplant CT findings in 100 children using the Lund-Mackay system and "common-practice" radiology reporting and correlated these with the presence of acute sinusitis. Results Fourteen percent of patients with normal screening CT developed posttransplant sinusitis, compared with 23% with radiographic abnormalities and 22% with clinical sinusitis alone; these differences were not statistically significant. Sensitivity of CT findings for clinical sinusitis ranged between 19% and 56%. Except for mucosal thickening (71% specificity), other findings had high specificity between 92% and 97%, particularly when combined. A Lund-Mackay score change of 10 or greater from baseline was associated with a 2.8-fold increased likelihood of having sinusitis (P < 0.001). Conclusions Screening CT can serve as a baseline, with a Lund-Mackay score change of 10 or greater constituting a significant threshold. The strongest correlation with the presence of acute sinusitis was seen with combined CT findings. PMID:25474147

  11. Temporal analysis of laser beam propagation in the atmosphere using computer-generated long phase screens.

    PubMed

    Dios, Federico; Recolons, Jaume; Rodríguez, Alejandro; Batet, Oscar

    2008-02-01

    Temporal analysis of the irradiance at the detector plane is intended as the first step in the study of the mean fade time in a free optical communication system. In the present work this analysis has been performed for a Gaussian laser beam propagating in the atmospheric turbulence by means of computer simulation. To this end, we have adapted a previously known numerical method to the generation of long phase screens. The screens are displaced in a transverse direction as the wave is propagated, in order to simulate the wind effect. The amplitude of the temporal covariance and its power spectrum have been obtained at the optical axis, at the beam centroid and at a certain distance from these two points. Results have been worked out for weak, moderate and strong turbulence regimes and when possible they have been compared with theoretical models. These results show a significant contribution of beam wander to the temporal behaviour of the irradiance, even in the case of weak turbulence. We have also found that the spectral bandwidth of the covariance is hardly dependent on the Rytov variance.
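    The quantities examined in the study, the temporal covariance of the irradiance and its power spectrum, can be estimated from a sampled irradiance record as sketched below; the random series used here is only a placeholder for the output of the phase-screen propagation simulation.

        import numpy as np

        # Placeholder irradiance record; in the study this comes from repeated
        # propagation through wind-shifted long phase screens.
        fs = 1000.0                                   # sampling rate (Hz), assumed
        irr = 1.0 + 0.1 * np.random.default_rng(1).standard_normal(8192)

        x = irr - irr.mean()
        cov = np.correlate(x, x, mode="full")[len(x) - 1:] / len(x)  # autocovariance vs. lag
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))            # one-sided power spectrum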

  12. Computer aided screening of natural compounds targeting the E6 protein of HPV using molecular docking

    PubMed Central

    Mamgain, Saril; Sharma, Pushpendra; Pathak, Rajesh Kumar; Baunthiyal, Mamta

    2015-01-01

    The cancer profile in the Indian state of Uttarakhand reveals that breast cancer is the most prevalent cancer in females, followed by cervical and ovarian cancer. Literature survey shows that the E6 protein of Human Papilloma Virus-16 (HPV-16) is responsible for causing several forms of cancer in humans. Therefore, it is of interest to screen the HPV-16 E6 target protein with known natural compounds using computer-aided molecular modeling and docking tools. The complete structure of E6 is unknown. Hence, the E6 structure model was constructed using different online servers, followed by molecular docking of Colchicine, Curcumin, Daphnoretin, Ellipticine and Epigallocatechin-3-gallate, five known natural compounds, against the best E6 protein model predicted by the Phyre2 server. The screening exercise shows that Daphnoretin (with a binding free energy of -8.3 kcal/mol), a natural compound derived from Wikstroemia indica, has the strongest predicted binding. Thus, it is of interest to consider the compound for further validation. PMID:26124567

  13. Screening of photosynthetic pigments for herbicidal activity with a new computational molecular approach.

    PubMed

    Krishnaraj, R Navanietha; Chandran, Saravanan; Pal, Parimal; Berchmans, Sheela

    2013-12-01

    There is immense interest among researchers in identifying new herbicides that are effective against weeds without harming the environment. In this work, photosynthetic pigments are used as the ligands to predict their herbicidal activity. The enzyme 5-enolpyruvylshikimate-3-phosphate (EPSP) synthase is a good target for herbicides. Homology modeling of the target enzyme is done using Modeler 9.11 and the model is validated. Docking studies were performed with the AutoDock Vina algorithm to predict the binding of natural pigments such as β-carotene, chlorophyll a, chlorophyll b, phycoerythrin and phycocyanin to the target. β-carotene, phycoerythrin and phycocyanin have higher binding energies, indicating the herbicidal activity of these pigments. This work reports a procedure to screen herbicides with a computational molecular approach. These pigments may serve as potential bioherbicides in the future. PMID:24050696

  14. A computational design approach for virtual screening of peptide interactions across K+ channel families☆

    PubMed Central

    Doupnik, Craig A.; Parra, Katherine C.; Guida, Wayne C.

    2014-01-01

    Ion channels represent a large family of membrane proteins, with many being well-established targets in pharmacotherapy. The ‘druggability’ of heteromeric channels comprised of different subunits remains obscure, due largely to a lack of channel-specific probes necessary to delineate their therapeutic potential in vivo. Our initial studies reported here investigated the family of inwardly rectifying potassium (Kir) channels, given the availability of high-resolution crystal structures for the eukaryotic constitutively active Kir2.2 channel. We describe a ‘limited’ homology modeling approach that can yield chimeric Kir channels having an outer vestibule structure representing nearly any known vertebrate or invertebrate channel. These computationally derived channel structures were tested in silico for ‘docking’ to NMR structures of tertiapin (TPN), a 21 amino acid peptide found in bee venom. TPN is a highly selective and potent blocker of the epithelial rat Kir1.1 channel, but does not block human or zebrafish Kir1.1 channel isoforms. Our Kir1.1 channel-TPN docking experiments recapitulated published in vitro findings for TPN-sensitive and TPN-insensitive channels. Additionally, in silico site-directed mutagenesis identified ‘hot spots’ within the channel outer vestibule that mediate energetically favorable docking scores and correlate with sites previously identified with in vitro thermodynamic mutant-cycle analysis. These ‘proof-of-principle’ results establish a framework for virtual screening of re-engineered peptide toxins for interactions with computationally derived Kir channels that currently lack channel-specific blockers. When coupled with electrophysiological validation, this virtual screening approach may accelerate the drug discovery process, and can be readily applied to other ion channel families where high-resolution structures are available. PMID:25709757

  15. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    NASA Astrophysics Data System (ADS)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    The first stage of production of any oil reservoir involves oil displacement by natural drive mechanisms such as solution gas drive, gas cap drive and gravity drainage. Typically, improved oil recovery (IOR) methods are applied to oil reservoirs that have been depleted naturally. In more recent years, IOR techniques are applied to reservoirs even before their natural energy drive is exhausted by primary depletion. Descriptive screening criteria for IOR methods are used to select the appropriate recovery technique according to the fluid and rock properties. This methodology helps in assessing the most suitable recovery process for field deployment of a candidate reservoir. However, the already published screening guidelines neither provide information about the expected reservoir performance nor suggest a set of project design parameters, which can be used towards the optimization of the process. In this study, artificial neural networks (ANN) are used to build a high-performance neuro-simulation tool for screening different improved oil recovery techniques: miscible injection (CO2 and N2), waterflooding and steam injection processes. The simulation tool consists of proxy models that implement a multilayer cascade feedforward back propagation network algorithm. The tool is intended to narrow the ranges of possible scenarios to be modeled using conventional simulation, reducing the extensive time and energy spent in dynamic reservoir modeling. A commercial reservoir simulator is used to generate the data to train and validate the artificial neural networks. The proxy models are built considering four different well patterns with different well operating conditions as the field design parameters. Different expert systems are developed for each well pattern. The screening networks predict oil production rate and cumulative oil production profiles for a given set of rock and fluid properties, and design parameters. The results of this study show that the networks are
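    As a rough illustration of the proxy-model idea (reservoir and design parameters in, production profiles out), the sketch below trains a small feedforward network on synthetic data; the real tool uses cascade feedforward networks trained on commercial reservoir-simulator runs, which are not reproduced here.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPRegressor

        # Synthetic stand-ins: 6 rock/fluid/design parameters per run, and a
        # 24-point cumulative oil production profile as the target.
        rng = np.random.default_rng(2)
        X = rng.uniform(size=(400, 6))
        Y = np.cumsum(rng.uniform(size=(400, 24)), axis=1)

        X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, random_state=0)
        proxy = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
        proxy.fit(X_tr, Y_tr)
        print("R^2 on held-out runs:", proxy.score(X_te, Y_te))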

  16. Screening for lung cancer using low-dose computed tomography: concerns about the application in low-risk individuals

    PubMed Central

    Li, Wei; Han, Fu-Jun; Liu, Yu-Di

    2015-01-01

    Low-dose computed tomography (LDCT) has been increasingly accepted as an efficient screening method for high-risk individuals to reduce lung cancer mortality. However, there remains a gap in knowledge about the practical implementation of screening on a larger scale, especially for low-risk individuals. The aim of this study is to initiate discussion through an evidence-based analysis and provide valuable suggestions on LDCT screening for lung cancer in clinical practice. Among previously published randomized controlled trials (RCTs), the National Lung Screening Trial (NLST) is the only one demonstrating positive results, in a high-risk population of older, heavy smokers. It is also shown that the potential harms include false-positive findings, radiation exposure, etc., but their magnitude is uncertain. In the meantime, the current risk stratification system is inadequate, making it difficult to define selection criteria. Thus, the efficacy of LDCT in lung cancer screening needs to be confirmed in future trials, and the procedure should not be proposed to individuals without comparable risk to those in the NLST. Furthermore, there is a lack of evidence to support the expansion of LDCT screening to low-risk individuals. Therefore, recommendation of LDCT screening for these patients could be premature in clinical practice, although some of them might be missed based on the current definition of risk factors. Further studies and advances in risk assessment tools are urgently needed to address the concerns about lung cancer screening in order to improve the outcomes of lung cancer. PMID:26207215

  17. Static & Dynamic Response of 2D Solids

    1996-07-15

    NIKE2D is an implicit finite-element code for analyzing the finite deformation, static and dynamic response of two-dimensional, axisymmetric, plane strain, and plane stress solids. The code is fully vectorized and available on several computing platforms. A number of material models are incorporated to simulate a wide range of material behavior including elasto-plasticity, anisotropy, creep, thermal effects, and rate dependence. Slideline algorithms model gaps and sliding along material interfaces, including interface friction, penetration, and single-surface contact. Interactive graphics and rezoning are included for analyses with large mesh distortions. In addition to quasi-Newton and arc-length procedures, adaptive algorithms can be defined to solve the implicit equations using the solution language ISLAND. Each of these capabilities and more make NIKE2D a robust analysis tool.

  18. Static & Dynamic Response of 2D Solids

    SciTech Connect

    Lin, Jerry

    1996-07-15

    NIKE2D is an implicit finite-element code for analyzing the finite deformation, static and dynamic response of two-dimensional, axisymmetric, plane strain, and plane stress solids. The code is fully vectorized and available on several computing platforms. A number of material models are incorporated to simulate a wide range of material behavior including elasto-plasticity, anisotropy, creep, thermal effects, and rate dependence. Slideline algorithms model gaps and sliding along material interfaces, including interface friction, penetration, and single-surface contact. Interactive graphics and rezoning are included for analyses with large mesh distortions. In addition to quasi-Newton and arc-length procedures, adaptive algorithms can be defined to solve the implicit equations using the solution language ISLAND. Each of these capabilities and more make NIKE2D a robust analysis tool.

  19. Realistic and efficient 2D crack simulation

    NASA Astrophysics Data System (ADS)

    Yadegar, Jacob; Liu, Xiaoqing; Singh, Abhishek

    2010-04-01

    Although numerical algorithms for 2D crack simulation have been studied in Modeling and Simulation (M&S) and computer graphics for decades, realism and computational efficiency are still major challenges. In this paper, we introduce a high-fidelity, scalable, adaptive, and efficient runtime 2D crack/fracture simulation system by applying the mathematically elegant Peano-Cesaro triangular meshing/remeshing technique to model the generation of shards/fragments. The recursive fractal sweep associated with the Peano-Cesaro triangulation provides efficient local multi-resolution refinement to any level of detail. The generated binary decomposition tree also provides an efficient neighbor-retrieval mechanism used for mesh element splitting and merging, with the minimal memory requirements essential for realistic 2D fragment formation. Upon load impact/contact/penetration, a number of factors including impact angle, impact energy, and material properties are taken into account to produce the criteria for crack initiation, propagation, and termination, leading to realistic fractal-like rubble/fragment formation. These parameters are used as variables of probabilistic models of crack/shard formation, making the proposed solution highly adaptive by allowing machine learning mechanisms to learn the optimal values for the variables/parameters based on prior benchmark data generated by offline physics-based simulation solutions that produce accurate fractures/shards, though at a decidedly non-real-time pace. Crack/fracture simulation has been conducted for various load impacts with different initial locations at various impulse scales. The simulation results demonstrate that the proposed system has the capability to realistically and efficiently simulate 2D crack phenomena (such as window shattering and shard generation), with diverse potential in military and civil M&S applications such as training and mission planning.

  20. Effect of 3G cell phone exposure with computer controlled 2-D stepper motor on non-thermal activation of the hsp27/p38MAPK stress pathway in rat brain.

    PubMed

    Kesari, Kavindra Kumar; Meena, Ramovatar; Nirala, Jayprakash; Kumar, Jitender; Verma, H N

    2014-03-01

    Cell phone radiation exposure and its biological interaction is currently a subject of debate. The present study aimed to investigate the effect of 3G cell phone exposure, delivered with a computer-controlled 2-D stepper motor, on the brains of 45-day-old male Wistar rats. Animals were exposed for 2 h a day for 60 days using a mobile phone with angular movement from zero to 30°. The variation of the motor is restricted to 90° with respect to the horizontal plane, moving at a pre-determined rate of 2° per minute. Immediately after 60 days of exposure, animals were sacrificed and a number of endpoints (DNA double-strand breaks, micronuclei, caspase 3, apoptosis, DNA fragmentation, expression of stress-responsive genes) were assayed. Results show that the microwave radiation emitted from the 3G mobile phone significantly induced DNA strand breaks in the brain. A significant increase in micronuclei, caspase 3, and apoptosis was also observed in the exposed group (P < 0.05). Western blotting results show that 3G mobile phone exposure causes a transient increase in phosphorylation of hsp27, hsp70, and p38 mitogen-activated protein kinase (p38MAPK), which leads to mitochondrial dysfunction-mediated cytochrome c release and subsequent activation of caspases, involved in the process of radiation-induced apoptotic cell death. The study shows that oxidative stress is the main factor activating a variety of cellular signal transduction pathways, among which hsp27/p38MAPK is the principal stress-response pathway. The results indicate that 3G mobile phone radiation affects brain function and can cause several neurological disorders.

  1. Resource Utilization and Costs during the Initial Years of Lung Cancer Screening with Computed Tomography in Canada

    PubMed Central

    Lam, Stephen; Tammemagi, Martin C.; Evans, William K.; Leighl, Natasha B.; Regier, Dean A.; Bolbocean, Corneliu; Shepherd, Frances A.; Tsao, Ming-Sound; Manos, Daria; Liu, Geoffrey; Atkar-Khattra, Sukhinder; Cromwell, Ian; Johnston, Michael R.; Mayo, John R.; McWilliams, Annette; Couture, Christian; English, John C.; Goffin, John; Hwang, David M.; Puksa, Serge; Roberts, Heidi; Tremblay, Alain; MacEachern, Paul; Burrowes, Paul; Bhatia, Rick; Finley, Richard J.; Goss, Glenwood D.; Nicholas, Garth; Seely, Jean M.; Sekhon, Harmanjatinder S.; Yee, John; Amjadi, Kayvan; Cutz, Jean-Claude; Ionescu, Diana N.; Yasufuku, Kazuhiro; Martel, Simon; Soghrati, Kamyar; Sin, Don D.; Tan, Wan C.; Urbanski, Stefan; Xu, Zhaolin; Peacock, Stuart J.

    2014-01-01

    Background: It is estimated that millions of North Americans would qualify for lung cancer screening and that billions of dollars of national health expenditures would be required to support population-based computed tomography lung cancer screening programs. The decision to implement such programs should be informed by data on resource utilization and costs. Methods: Resource utilization data were collected prospectively from 2059 participants in the Pan-Canadian Early Detection of Lung Cancer Study using low-dose computed tomography (LDCT). Participants who had 2% or greater lung cancer risk over 3 years using a risk prediction tool were recruited from seven major cities across Canada. A cost analysis was conducted from the Canadian public payer’s perspective for resources that were used for the screening and treatment of lung cancer in the initial years of the study. Results: The average per-person cost for screening individuals with LDCT was $453 (95% confidence interval [CI], $400–$505) for the initial 18 months of screening following a baseline scan. The screening costs were highly dependent on the detected lung nodule size, presence of cancer, screening intervention, and the screening center. The mean per-person cost of treating lung cancer with curative surgery was $33,344 (95% CI, $31,553–$34,935) over 2 years. This was lower than the cost of treating advanced-stage lung cancer with chemotherapy, radiotherapy, or supportive care alone ($47,792; 95% CI, $43,254–$52,200; p = 0.061). Conclusion: In the Pan-Canadian study, the average cost to screen individuals with a high risk for developing lung cancer using LDCT and the average initial cost of curative intent treatment were lower than the average per-person cost of treating advanced stage lung cancer which infrequently results in a cure. PMID:25105438

  2. DYNA2D96. Explicit 2-D Hydrodynamic FEM Program

    SciTech Connect

    Whirley, R.G.

    1992-04-01

    DYNA2D is a vectorized, explicit, two-dimensional, axisymmetric and plane strain finite element program for analyzing the large deformation dynamic and hydrodynamic response of inelastic solids. DYNA2D contains 13 material models and 9 equations of state (EOS) to cover a wide range of material behavior. The material models implemented in all machine versions are: elastic, orthotropic elastic, kinematic/isotropic elastic plasticity, thermoelastoplastic, soil and crushable foam, linear viscoelastic, rubber, high explosive burn, isotropic elastic-plastic, temperature-dependent elastic-plastic. The isotropic and temperature-dependent elastic-plastic models determine only the deviatoric stresses. Pressure is determined by one of 9 equations of state including linear polynomial, JWL high explosive, Sack Tuesday high explosive, Gruneisen, ratio of polynomials, linear polynomial with energy deposition, ignition and growth of reaction in HE, tabulated compaction, and tabulated.

  3. 2D Four-Channel Perfect Reconstruction Filter Bank Realized with the 2D Lattice Filter Structure

    NASA Astrophysics Data System (ADS)

    Sezen, S.; Ertüzün, A.

    2006-12-01

    A novel orthogonal 2D lattice structure is incorporated into the design of a nonseparable 2D four-channel perfect reconstruction filter bank. The proposed filter bank is obtained by using the polyphase decomposition technique, which requires the design of an orthogonal 2D lattice filter. Due to the constraint of perfect reconstruction, each stage of this lattice filter bank is simply parameterized by two coefficients. The perfect reconstruction property is satisfied regardless of the actual values of these parameters and of the number of lattice stages. It is also shown that a separable 2D four-channel perfect reconstruction lattice filter bank can be constructed from the 1D lattice filter and that this is a special case of the proposed 2D lattice filter bank under certain conditions. The perfect reconstruction property of the proposed 2D lattice filter approach is verified by computer simulations.

  4. The UK Lung Cancer Screening Trial: a pilot randomised controlled trial of low-dose computed tomography screening for the early detection of lung cancer.

    PubMed Central

    Field, John K; Duffy, Stephen W; Baldwin, David R; Brain, Kate E; Devaraj, Anand; Eisen, Tim; Green, Beverley A; Holemans, John A; Kavanagh, Terry; Kerr, Keith M; Ledson, Martin; Lifford, Kate J; McRonald, Fiona E; Nair, Arjun; Page, Richard D; Parmar, Mahesh Kb; Rintoul, Robert C; Screaton, Nicholas; Wald, Nicholas J; Weller, David; Whynes, David K; Williamson, Paula R; Yadegarfar, Ghasem; Hansell, David M

    2016-01-01

    BACKGROUND Lung cancer kills more people than any other cancer in the UK (5-year survival < 13%). Early diagnosis can save lives. The USA-based National Lung Screening Trial reported a 20% relative reduction in lung cancer mortality and a 6.7% reduction in all-cause mortality in low-dose computed tomography (LDCT)-screened subjects. OBJECTIVES To (1) analyse LDCT lung cancer screening in a high-risk UK population, determine optimum recruitment, screening, reading and care pathway strategies; and (2) assess the psychological consequences and the health-economic implications of screening. DESIGN A pilot randomised controlled trial comparing intervention with usual care. A population-based risk questionnaire identified individuals who were at high risk of developing lung cancer (≥ 5% over 5 years). SETTING Thoracic centres with expertise in lung cancer imaging, respiratory medicine, pathology and surgery: Liverpool Heart & Chest Hospital, Merseyside, and Papworth Hospital, Cambridgeshire. PARTICIPANTS Individuals aged 50-75 years, at high risk of lung cancer, in the primary care trusts adjacent to the centres. INTERVENTIONS A thoracic LDCT scan. Follow-up computed tomography (CT) scans as per protocol. Referral to multidisciplinary team clinics was determined by nodule size criteria. MAIN OUTCOME MEASURES Population-based recruitment based on risk stratification; management of the trial through a web-based database; optimal characteristics of CT scan readers (radiologists vs. radiographers); characterisation of CT-detected nodules utilising volumetric analysis; prevalence of lung cancer at baseline; sociodemographic factors affecting participation; psychosocial measures (cancer distress, anxiety, depression, decision satisfaction); and cost-effectiveness modelling. RESULTS A total of 247,354 individuals were approached to take part in the trial; 30.7% responded positively to the screening invitation. Recruitment of participants resulted in 2028 in the CT arm and 2027 in

  5. Drug search for leishmaniasis: a virtual screening approach by grid computing.

    PubMed

    Ochoa, Rodrigo; Watowich, Stanley J; Flórez, Andrés; Mesa, Carol V; Robledo, Sara M; Muskus, Carlos

    2016-07-01

    The trypanosomatid protozoa Leishmania is endemic in ~100 countries, with infections causing ~2 million new cases of leishmaniasis annually. Disease symptoms can include severe skin and mucosal ulcers, fever, anemia, splenomegaly, and death. Unfortunately, therapeutics approved to treat leishmaniasis are associated with potentially severe side effects, including death. Furthermore, drug-resistant Leishmania parasites have developed in most endemic countries. To address an urgent need for new, safe and inexpensive anti-leishmanial drugs, we utilized the IBM World Community Grid to complete computer-based drug discovery screens (Drug Search for Leishmaniasis) using unique leishmanial proteins and a database of 600,000 drug-like small molecules. Protein structures from different Leishmania species were selected for molecular dynamics (MD) simulations, and a series of conformational "snapshots" were chosen from each MD trajectory to simulate the protein's flexibility. A Relaxed Complex Scheme methodology was used to screen ~2000 MD conformations against the small molecule database, producing >1 billion protein-ligand structures. For each protein target, a binding spectrum was calculated to identify compounds predicted to bind with highest average affinity to all protein conformations. Significantly, four different Leishmania protein targets were predicted to strongly bind small molecules, with the strongest binding interactions predicted to occur for dihydroorotate dehydrogenase (LmDHODH; PDB:3MJY). A number of predicted tight-binding LmDHODH inhibitors were tested in vitro and potent selective inhibitors of Leishmania panamensis were identified. These promising small molecules are suitable for further development using iterative structure-based optimization and in vitro/in vivo validation assays. PMID:27438595
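    The ranking step of a Relaxed Complex Scheme screen, averaging each compound's docking score over many MD snapshots to form a "binding spectrum" and sorting by the mean, can be sketched as follows; the score matrix is a random placeholder rather than real docking output.

        import numpy as np

        # Placeholder docking scores (kcal/mol): rows are compounds, columns are
        # MD snapshots of the target protein.
        rng = np.random.default_rng(3)
        scores = rng.normal(-7.0, 1.5, size=(1000, 200))

        binding_spectrum = scores.mean(axis=1)        # average affinity per compound
        top_hits = np.argsort(binding_spectrum)[:20]  # most negative = tightest predicted binding
        print(top_hits)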

  6. Effect of Screen Reading and Reading from Printed Out Material on Student Success and Permanency in Introduction to Computer Lesson

    ERIC Educational Resources Information Center

    Tuncer, Murat; Bahadir, Ferdi

    2014-01-01

    In this study, the effect of screen reading and reading from printed-out material on student success and permanency in the Introduction to Computer lesson is investigated. The study group consists of 78 freshman students registered in the Post Service department of Erzincan University Refahiye Vocational School. Study groups of research consist…

  7. Effects of Text Display Variables on Reading Tasks: Computer Screen vs. Hard Copy. CDC Technical Report No. 3.

    ERIC Educational Resources Information Center

    Haas, Christina; Hayes, John R.

    Two studies were conducted to compare subjects' performance reading texts displayed on a computer terminal screen and on paper. In the first study, 10 graduate students read a 1,000-word article on knee injuries from "Science 83" magazine and were tested for recall of information on eight items. While subjects in the control condition (reading…

  8. The Study of Learners' Preference for Visual Complexity on Small Screens of Mobile Computers Using Neural Networks

    ERIC Educational Resources Information Center

    Wang, Lan-Ting; Lee, Kun-Chou

    2014-01-01

    The vision plays an important role in educational technologies because it can produce and communicate quite important functions in teaching and learning. In this paper, learners' preference for the visual complexity on small screens of mobile computers is studied by neural networks. The visual complexity in this study is divided into five…

  9. MOSS2D V1

    2001-01-31

    This software reduces the data from the two-dimensional kSA MOS program, k-Space Associates, Ann Arbor, MI. Initial MOS data is recorded without headers in 38 columns, with one row of data per acquisition per laser beam tracked. The final MOSS2D data file is reduced, graphed, and saved in a tab-delimited column format with headers that can be plotted in any graphing software.

  10. Computed tomography screening: the international early lung cancer action program experience.

    PubMed

    Henschke, Claudia I; Boffetta, Paolo; Yankelevitz, David F; Altorki, Nasser

    2015-05-01

    The International Early Lung Cancer Action Program (I-ELCAP) used a novel study design that provided quantitative information about annual CT screening for lung cancer. The results stimulated additional studies of lung cancer screening and ultimately led to the National Lung Screening Trial (NLST) being initiated in 2002, as the initial report in 1999 was sufficiently compelling to reawaken interest in screening for lung cancer. The authors think that the I-ELCAP and NLST "story" provides a strong argument for relevant agencies to consider alternative study designs for the public funding of studies aimed at evaluating the effectiveness of screening and other medical trials.

  11. Predicting Structures of Ru-Centered Dyes: A Computational Screening Tool.

    PubMed

    Fredin, Lisa A; Allison, Thomas C

    2016-04-01

    Dye-sensitized solar cells (DSCs) represent a means for harvesting solar energy to produce electrical power. Though a number of light harvesting dyes are in use, the search continues for more efficient and effective compounds to make commercially viable DSCs a reality. Computational methods have been increasingly applied to understand the dyes currently in use and to aid in the search for improved light harvesting compounds. Semiempirical quantum chemistry methods have a well-deserved reputation for giving good quality results in a very short amount of computer time. The most recent semiempirical models such as PM6 and PM7 are parametrized for a wide variety of molecule types, including organometallic complexes similar to DSC chromophores. In this article, the performance of PM6 is tested against a set of 20 molecules whose geometries were optimized using a density functional theory (DFT) method. It is found that PM6 gives geometries that are in good agreement with the optimized DFT structures. In order to reduce the differences between geometries optimized using PM6 and geometries optimized using DFT, the PM6 basis set parameters have been optimized for a subset of the molecules. It is found that it is sufficient to optimize the basis set for Ru alone to improve the agreement between the PM6 results and the DFT results. When this optimized Ru basis set is used, the mean unsigned error in Ru-ligand bond lengths is reduced from 0.043 to 0.017 Å in the set of 20 test molecules. Though the magnitude of these differences is small, the effect on the calculated UV/vis spectra is significant. These results clearly demonstrate the value of using PM6 to screen DSC chromophores as well as the value of optimizing PM6 basis set parameters for a specific set of molecules.
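    The comparison metric used above, the mean unsigned error (MUE) of Ru-ligand bond lengths between PM6 and DFT geometries, is simply the mean absolute difference over the matched bonds; the bond lengths below are made-up placeholders, not values from the paper.

        import numpy as np

        # Placeholder Ru-ligand bond lengths (angstroms) from matched DFT and PM6
        # optimized geometries.
        dft = np.array([2.056, 2.061, 2.104, 2.110, 1.998, 2.045])
        pm6 = np.array([2.091, 2.020, 2.150, 2.160, 2.030, 2.080])

        print(f"MUE = {np.mean(np.abs(pm6 - dft)):.3f} angstrom")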

  12. Predicting Structures of Ru-Centered Dyes: A Computational Screening Tool.

    PubMed

    Fredin, Lisa A; Allison, Thomas C

    2016-04-01

    Dye-sensitized solar cells (DSCs) represent a means for harvesting solar energy to produce electrical power. Though a number of light harvesting dyes are in use, the search continues for more efficient and effective compounds to make commercially viable DSCs a reality. Computational methods have been increasingly applied to understand the dyes currently in use and to aid in the search for improved light harvesting compounds. Semiempirical quantum chemistry methods have a well-deserved reputation for giving good quality results in a very short amount of computer time. The most recent semiempirical models such as PM6 and PM7 are parametrized for a wide variety of molecule types, including organometallic complexes similar to DSC chromophores. In this article, the performance of PM6 is tested against a set of 20 molecules whose geometries were optimized using a density functional theory (DFT) method. It is found that PM6 gives geometries that are in good agreement with the optimized DFT structures. In order to reduce the differences between geometries optimized using PM6 and geometries optimized using DFT, the PM6 basis set parameters have been optimized for a subset of the molecules. It is found that it is sufficient to optimize the basis set for Ru alone to improve the agreement between the PM6 results and the DFT results. When this optimized Ru basis set is used, the mean unsigned error in Ru-ligand bond lengths is reduced from 0.043 to 0.017 Å in the set of 20 test molecules. Though the magnitude of these differences is small, the effect on the calculated UV/vis spectra is significant. These results clearly demonstrate the value of using PM6 to screen DSC chromophores as well as the value of optimizing PM6 basis set parameters for a specific set of molecules. PMID:26982657

  13. Low-Dose CT Screening for Lung Cancer: Computer-aided Detection of Missed Lung Cancers.

    PubMed

    Liang, Mingzhu; Tang, Wei; Xu, Dong Ming; Jirapatnakul, Artit C; Reeves, Anthony P; Henschke, Claudia I; Yankelevitz, David

    2016-10-01

    Purpose To update information regarding the usefulness of computer-aided detection (CAD) systems with a focus on the most critical category, that of missed cancers at earlier imaging, for cancers that manifest as a solid nodule. Materials and Methods By using a HIPAA-compliant institutional review board-approved protocol where informed consent was obtained, 50 lung cancers that manifested as a solid nodule on computed tomographic (CT) scans in annual rounds of screening (time 1) were retrospectively identified that could, in retrospect, be identified on the previous CT scans (time 0). Four CAD systems were compared, which were referred to as CAD 1, CAD 2, CAD 3, and CAD 4. The total number of accepted CAD-system-detected nodules at time 0 was determined by consensus of two radiologists and the number of CAD-system-detected nodules that were rejected by the radiologists was also documented. Results At time 0 when all the cancers had been missed, CAD system detection rates for the cancers were 56%, 70%, 68%, and 60% (κ = 0.45) for CAD systems 1, 2, 3, and 4, respectively. At time 1, the rates were 74%, 82%, 82%, and 78% (κ = 0.32), respectively. The average diameter of the 50 cancers at time 0 and time 1 was 4.8 mm and 11.4 mm, respectively. The number of CAD-system-detected nodules that were rejected per CT scan for CAD systems 1-4 at time 0 was 7.4, 1.7, 0.6, and 4.5 respectively. Conclusion CAD systems detected up to 70% of lung cancers that were not detected by the radiologist but failed to detect about 20% of the lung cancers when they were identified by the radiologist, which suggests that CAD may be useful in the role of second reader. (©) RSNA, 2016.

  14. A whole genome RNAi screen of Drosophila S2 cell spreading performed using automated computational image analysis.

    PubMed

    D'Ambrosio, Michael V; Vale, Ronald D

    2010-11-01

    Recent technological advances in microscopy have enabled cell-based whole genome screens, but the analysis of the vast amount of image data generated by such screens usually proves to be rate limiting. In this study, we performed a whole genome RNA interference (RNAi) screen to uncover genes that affect spreading of Drosophila melanogaster S2 cells using several computational methods for analyzing the image data in an automated manner. Expected genes in the Scar-Arp2/3 actin nucleation pathway were identified as well as casein kinase I, which had a similar morphological RNAi signature. A distinct nonspreading morphological phenotype was identified for genes involved in membrane secretion or synthesis. In this group, we identified a new secretory peptide and investigated the functions of two poorly characterized endoplasmic reticulum proteins that have roles in secretion. Thus, this genome-wide screen succeeded in identifying known and unexpected proteins that are important for cell spreading, and the computational tools developed in this study should prove useful for other types of automated whole genome screens.

  15. The Effect of All-Capital vs. Regular Mixed Print, as Presented on a Computer Screen, on Reading Rate and Accuracy.

    ERIC Educational Resources Information Center

    Henney, Maribeth

    Two related studies were conducted to determine whether students read all-capital text and mixed text displayed on a computer screen with the same speed and accuracy. Seventy-seven college students read M. A. Tinker's "Basic Reading Rate Test" displayed on a PLATO computer screen. One treatment consisted of paragraphs in all-capital type followed…

  16. Large-Scale Computational Screening Identifies First in Class Multitarget Inhibitor of EGFR Kinase and BRD4

    PubMed Central

    Allen, Bryce K.; Mehta, Saurabh; Ember, Stewart W. J.; Schonbrunn, Ernst; Ayad, Nagi; Schürer, Stephan C.

    2015-01-01

    Inhibition of cancer-promoting kinases is an established therapeutic strategy for the treatment of many cancers, although resistance to kinase inhibitors is common. One way to overcome resistance is to target orthogonal cancer-promoting pathways. Bromo and Extra-Terminal (BET) domain proteins, which belong to the family of epigenetic readers, have recently emerged as promising therapeutic targets in multiple cancers. The development of multitarget drugs that inhibit kinase and BET proteins therefore may be a promising strategy to overcome tumor resistance and prolong therapeutic efficacy in the clinic. We developed a general computational screening approach to identify novel dual kinase/bromodomain inhibitors from millions of commercially available small molecules. Our method integrated machine learning using big datasets of kinase inhibitors and structure-based drug design. Here we describe the computational methodology, including validation and characterization of our models and their application and integration into a scalable virtual screening pipeline. We screened over 6 million commercially available compounds and selected 24 for testing in BRD4 and EGFR biochemical assays. We identified several novel BRD4 inhibitors, among them a first in class dual EGFR-BRD4 inhibitor. Our studies suggest that this computational screening approach may be broadly applicable for identifying dual kinase/BET inhibitors with potential for treating various cancers. PMID:26596901

  17. Smoking cessation interventions within the context of Low-Dose Computed Tomography lung cancer screening: A systematic review.

    PubMed

    Piñeiro, Bárbara; Simmons, Vani N; Palmer, Amanda M; Correa, John B; Brandon, Thomas H

    2016-08-01

    The integration of smoking cessation interventions (SCIs) within the context of lung cancer screening programs is strongly recommended by screening guidelines, and is a requirement for Medicare coverage of screening in the US. In Europe, there are no lung cancer screening guidelines, however, research trials are ongoing, and prominent professional societies have begun to recommend lung cancer screening. Little is known about the types and efficacy of SCIs among patients receiving low-dose computed tomography (LDCT) screening. This review addresses this gap. Based on a systematic search, we identified six empirical studies published prior to July 1, 2015, that met inclusion criteria for our review: English language, SCI for LDCT patients, and reported smoking-related outcomes. Three randomized studies and three single-arm studies were identified. Two randomized controlled trials (RCTs) evaluated self-help SCIs, whereas one pilot RCT evaluated the timing (before or after the LDCT scan) of a combined (counseling and pharmacotherapy) SCI. Among the single-arm trials, two observational studies evaluated the efficacy of combined SCI, and one retrospectively assessed the efficacy of clinician-delivered smoking assessment, advice, and assistance. Given the limited research to date, and particularly the lack of studies reporting results from RCTs, assumptions that SCIs would be effective among this population should be made with caution. Findings from this review suggest that participation in a lung screening trial promotes smoking cessation and may represent a teachable moment to quit smoking. Findings also suggest that providers can take advantage of this potentially teachable moment, and that SCIs have been successfully implemented in screening settings. Continued systematic and methodologically sound research in this area will help improve the knowledge base and implementation of interventions for this population of smokers at risk for chronic disease. PMID:27393513

  18. Generating a 2D Representation of a Complex Data Structure

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    A computer program, designed to assist in the development and debugging of other software, generates a two-dimensional (2D) representation of a possibly complex n-dimensional (where n is an integer >2) data structure or abstract rank-n object in that other software. The nature of the 2D representation is such that it can be displayed on a non-graphical output device and distributed by non-graphical means.
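    One simple way to realize the idea of a text-only 2D view of a rank-n object is to print each trailing 2D slice under an index header, as sketched below; this is an illustrative stand-in, not the NASA program itself.

        import numpy as np

        def render_2d(arr):
            """Render a rank-n array (n > 2) as plain text, one 2D slice at a time."""
            arr = np.asarray(arr)
            lines = []
            for idx in np.ndindex(arr.shape[:-2]):       # iterate over the leading axes
                lines.append(f"slice {idx}:")
                for row in arr[idx]:
                    lines.append("  " + " ".join(f"{float(v):8.3g}" for v in row))
            return "\n".join(lines)

        print(render_2d(np.arange(24).reshape(2, 3, 4)))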

  19. A simultaneous 2D/3D autostereo workstation

    NASA Astrophysics Data System (ADS)

    Chau, Dennis; McGinnis, Bradley; Talandis, Jonas; Leigh, Jason; Peterka, Tom; Knoll, Aaron; Sumer, Aslihan; Papka, Michael; Jellinek, Julius

    2012-03-01

    We present a novel immersive workstation environment that scientists can use for 3D data exploration and as their everyday 2D computer monitor. Our implementation is based on an autostereoscopic dynamic parallax barrier 2D/3D display, interactive input devices, and a software infrastructure that allows client/server software modules to couple the workstation to scientists' visualization applications. This paper describes the hardware construction and calibration, software components, and a demonstration of our system in nanoscale materials science exploration.

  20. No evidence for prolonged latency of saccadic eye movements due to intermittent light of a CRT computer screen.

    PubMed

    Jainta, Stephanie; Jaschinski, Wolfgang; Baccino, Thierry

    2004-01-15

    Despite previous studies it remains unclear, whether saccadic eye movements across computer screens may be adversely affected by the intermittency of light of cathode ray tubes (CRT). We measured the latency of simple saccades to peripheral targets presented on a CRT-screen, operated at refresh rates of 50, 100 and 150 Hz, compared with a special fluorescent lamp display (FLD). Our results suggest that the intermittent light of CRT screens does not prolong the latency of saccades not even relative to a control condition of unmodulated steady light at the FLD. Further, there was no evidence for any individual effect in possibly susceptible subjects, e.g. at high critical flicker frequencies (CFF).

  1. Excitons in van der Waals heterostructures: The important role of dielectric screening

    NASA Astrophysics Data System (ADS)

    Latini, S.; Olsen, T.; Thygesen, K. S.

    2015-12-01

    The existence of strongly bound excitons is one of the hallmarks of the newly discovered atomically thin semiconductors. While it is understood that the large binding energy is mainly due to the weak dielectric screening in two dimensions, a systematic investigation of the role of screening on two-dimensional (2D) excitons is still lacking. Here we provide a critical assessment of a widely used 2D hydrogenic exciton model, which assumes a dielectric function of the form ε(q) = 1 + 2παq, and we develop a quasi-2D model with a much broader applicability. Within the quasi-2D picture, electrons and holes are described as in-plane point charges with a finite extension in the perpendicular direction, and their interaction is screened by a dielectric function with a nonlinear q dependence which is computed ab initio. The screened interaction is used in a generalized Mott-Wannier model to calculate exciton binding energies in both isolated and supported 2D materials. For isolated 2D materials, the quasi-2D treatment yields results almost identical to those of the strict 2D model, and both are in good agreement with ab initio many-body calculations. On the other hand, for more complex structures such as supported layers or layers embedded in a van der Waals heterostructure, the size of the exciton in reciprocal space extends well beyond the linear regime of the dielectric function, and a quasi-2D description has to replace the 2D one. Our methodology has the merit of providing a seamless connection between the strict 2D limit of isolated monolayer materials and the more bulk-like screening characteristics of supported 2D materials or van der Waals heterostructures.
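    For reference, the strictly 2D dielectric function quoted above is the standard Rytova-Keldysh form of in-plane screening, with screening length r0 = 2πα set by the layer polarizability; the corresponding screened interaction (a textbook result restated here, not taken verbatim from the paper) is

        % Strict-2D screening and the resulting Rytova-Keldysh interaction
        \begin{align}
          \varepsilon(q) &= 1 + 2\pi\alpha q \equiv 1 + r_0 q, \\
          W(q) &= \frac{2\pi e^2}{q\,\varepsilon(q)}
                = \frac{2\pi e^2}{q\,(1 + r_0 q)}, \\
          W(r) &= \frac{\pi e^2}{2 r_0}
                  \left[ H_0\!\left(\tfrac{r}{r_0}\right) - Y_0\!\left(\tfrac{r}{r_0}\right) \right],
        \end{align}

    where H_0 is the Struve function and Y_0 the Bessel function of the second kind. In the quasi-2D model of the paper, the linear-in-q dielectric function is replaced by a nonlinear ε(q) computed ab initio.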

  2. The interplay of attention economics and computer-aided detection marks in screening mammography

    NASA Astrophysics Data System (ADS)

    Schwartz, Tayler M.; Sridharan, Radhika; Wei, Wei; Lukyanchenko, Olga; Geiser, William; Whitman, Gary J.; Haygood, Tamara Miner

    2016-03-01

    Introduction: According to attention economists, overabundant information leads to decreased attention for individual pieces of information. Computer-aided detection (CAD) alerts radiologists to findings potentially associated with breast cancer but is notorious for creating an abundance of false-positive marks. We suspected that increased CAD marks do not lengthen mammogram interpretation time, as radiologists will selectively disregard these marks when present in larger numbers. We explore the relevance of attention economics in mammography by examining how the number of CAD marks affects interpretation time. Methods: We performed a retrospective review of bilateral digital screening mammograms obtained between January 1, 2011 and February 28, 2014, using only weekend interpretations to decrease distractions and the likelihood of trainee participation. We stratified data according to reader and used ANOVA to assess the relationship between the number of CAD marks and interpretation time. Results: Ten radiologists, with a median of 12.5 years of experience after residency (range, 6 to 24), interpreted 1849 mammograms. When accounting for the number of images, Breast Imaging Reporting and Data System category, and breast density, an increasing number of CAD marks was correlated with longer interpretation time only for the three radiologists with the fewest years of experience (median, 7 years). Conclusion: For the 7 most experienced readers, increasing CAD marks did not lengthen interpretation time. We surmise that as CAD marks increase, the attention given to individual marks decreases. Experienced radiologists may rapidly dismiss larger numbers of CAD marks as false positives, having learned that devoting extra attention to such marks does not improve clinical detection.

  3. A computer-tailored intervention to promote informed decision making for prostate cancer screening among African-American men

    PubMed Central

    Allen, Jennifer D.; Mohllajee, Anshu P.; Shelton, Rachel C.; Drake, Bettina F.; Mars, Dana R.

    2010-01-01

    African-American men experience a disproportionate burden of prostate cancer (CaP) morbidity and mortality. National screening guidelines advise men to make individualized screening decisions through a process termed “informed decision making” (IDM). In this pilot study, a computer-tailored decision aid designed to promote IDM was evaluated using a pre/post-test design. African-American men aged 40+ (n=108) were recruited from a variety of community settings. At pre-test, 43% of men reported having made a screening decision; at post-test, 47% reported this to be the case (p=0.39). Significant improvements were observed in scores (0–100%) of knowledge (54% vs 72%; p<0.001), decision self-efficacy (87% vs 89%; p<0.01), and decisional conflict (21% vs 13%; p<0.001). Men were also more likely to want an active role in decision making after using the tool (67% vs 75%; p=0.03). These results suggest that use of a computer-tailored decision aid is a promising strategy to promote IDM for CaP screening among African-American men. PMID:19477736

  4. Unparticle example in 2D.

    PubMed

    Georgi, Howard; Kats, Yevgeny

    2008-09-26

    We discuss what can be learned about unparticle physics by studying simple quantum field theories in one space and one time dimension. We argue that the exactly soluble 2D theory of a massless fermion coupled to a massive vector boson, the Sommerfield model, is an interesting analog of a Banks-Zaks model, approaching a free theory at high energies and a scale-invariant theory with nontrivial anomalous dimensions at low energies. We construct a toy standard model coupling to the fermions in the Sommerfield model and study how the transition from unparticle behavior at low energies to free particle behavior at high energies manifests itself in interactions with the toy standard model particles.

  5. High throughput screening for mammography using a human-computer interface with rapid serial visual presentation (RSVP)

    NASA Astrophysics Data System (ADS)

    Hope, Chris; Sterr, Annette; Elangovan, Premkumar; Geades, Nicholas; Windridge, David; Young, Ken; Wells, Kevin

    2013-03-01

    The steady rise of the breast cancer screening population, coupled with the data expansion produced by new digital screening technologies (tomosynthesis/CT), motivates the development of new, more efficient image screening processes. Rapid Serial Visual Presentation (RSVP) is a new fast content-recognition approach which uses electroencephalography to record brain activity elicited by fast bursts of image data. These brain responses are then subjected to machine classification methods to reveal the expert's 'reflex' response, classifying images according to the presence or absence of particular targets. The benefit of this method is that images can be presented at high temporal rates (~10 per second), faster than that required for fully conscious detection, facilitating a high throughput of image (screening) material. In the present paper we present the first application of RSVP to medical image data, and demonstrate how cortically coupled computer vision can be successfully applied to breast cancer screening. Whilst prior RSVP work has utilised multichannel approaches, we also present the first RSVP results demonstrating a discriminatory response on a single electrode, with an ROC area under the curve of 0.62-0.86 using a simple Fisher discriminator for classification. This increases to 0.75-0.94 when multiple electrodes are used in combination.
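    The single-electrode classification step described above amounts to a Fisher linear discriminant on per-epoch features scored with an ROC curve; the sketch below illustrates it with random data standing in for EEG epochs time-locked to the RSVP image bursts (electrode choice, epoch length, and effect size are all made up).

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        # Placeholder single-electrode epochs: 600 epochs x 128 time samples,
        # with a weak shift added to "target present" epochs.
        rng = np.random.default_rng(4)
        X = rng.standard_normal((600, 128))
        y = rng.integers(0, 2, size=600)
        X[y == 1] += 0.3

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
        print("ROC AUC:", round(roc_auc_score(y_te, lda.decision_function(X_te)), 3))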

  6. Low-Dose Chest Computed Tomography for Lung Cancer Screening Among Hodgkin Lymphoma Survivors: A Cost-Effectiveness Analysis

    SciTech Connect

    Wattson, Daniel A.; Hunink, M.G. Myriam; DiPiro, Pamela J.; Das, Prajnan; Hodgson, David C.; Mauch, Peter M.; Ng, Andrea K.

    2014-10-01

    Purpose: Hodgkin lymphoma (HL) survivors face an increased risk of treatment-related lung cancer. Screening with low-dose computed tomography (LDCT) may allow detection of early stage, resectable cancers. We developed a Markov decision-analytic and cost-effectiveness model to estimate the merits of annual LDCT screening among HL survivors. Methods and Materials: Population databases and HL-specific literature informed key model parameters, including lung cancer rates and stage distribution, cause-specific survival estimates, and utilities. Relative risks accounted for radiation therapy (RT) technique, smoking status (>10 pack-years or current smokers vs not), age at HL diagnosis, time from HL treatment, and excess radiation from LDCTs. LDCT assumptions, including expected stage-shift, false-positive rates, and likely additional workup were derived from the National Lung Screening Trial and preliminary results from an internal phase 2 protocol that performed annual LDCTs in 53 HL survivors. We assumed a 3% discount rate and a willingness-to-pay (WTP) threshold of $50,000 per quality-adjusted life year (QALY). Results: Annual LDCT screening was cost effective for all smokers. A male smoker treated with mantle RT at age 25 achieved maximum QALYs by initiating screening 12 years post-HL, with a life expectancy benefit of 2.1 months and an incremental cost of $34,841/QALY. Among nonsmokers, annual screening produced a QALY benefit in some cases, but the incremental cost was not below the WTP threshold for any patient subsets. As age at HL diagnosis increased, earlier initiation of screening improved outcomes. Sensitivity analyses revealed that the model was most sensitive to the lung cancer incidence and mortality rates and expected stage-shift from screening. Conclusions: HL survivors are an important high-risk population that may benefit from screening, especially those treated in the past with large radiation fields including mantle or involved-field RT. Screening

  7. Long-term prognosis of patients with lung cancer detected on low-dose chest computed tomography screening.

    PubMed

    Nawa, Takeshi; Nakagawa, Tohru; Mizoue, Tetsuya; Kusano, Suzushi; Chonan, Tatsuya; Fukai, Shimao; Endo, Katsuyuki

    2012-02-01

    The effectiveness of lung cancer screening using low-dose chest computed tomography (CT) remains elusive. The present study examined the prognosis of patients with lung cancer detected on CT screening in Japanese men and women. Subjects were 210 patients with primary lung cancer identified on CT screening at two medical facilities in Hitachi, Japan, where a total of 61,914 CT screenings were performed among 25,385 screenees between 1998 and 2006. Prognostic status of these patients was sought by examining medical records at local hospitals, supplemented by vital status information from local government. The 5-year survival rate was estimated according to the characteristics of patients and lung nodule. A total of 203 (97%) patients underwent surgery. During a 5.7-year mean follow-up period, 19 patients died from lung cancer and 6 died from other causes. The estimated 5-year survival rate for all patients and for those on stage IA was 90% and 97%, respectively. Besides cancer stage, smoking and nodule appearance were independent predictors of a poor survival; multivariable-adjusted hazard ratio (95% confidence interval) was 4.7 (1.3, 16.5) for current and past smokers versus nonsmokers and 4.6 (1.6, 13.9) for solid nodule versus others. Even patients with a solid shadow had a 5-year survival of 82% if the lesion was 20 mm or less in size. Results suggest that lung cancers detected on CT screening are mostly curable. The impact of CT screening on mortality at the community level needs to be clarified by monitoring lung cancer deaths.

  8. ORION96. 2-d Finite Element Code Postprocessor

    SciTech Connect

    Sanford, L.A.; Hallquist, J.O.

    1992-02-02

    ORION is an interactive program that serves as a postprocessor for the analysis programs NIKE2D, DYNA2D, TOPAZ2D, and CHEMICAL TOPAZ2D. ORION reads binary plot files generated by the two-dimensional finite element codes currently used by the Methods Development Group at LLNL. Contour and color fringe plots of a large number of quantities may be displayed on meshes consisting of triangular and quadrilateral elements. ORION can compute strain measures, interface pressures along slide lines, reaction forces along constrained boundaries, and momentum. ORION has been applied to study the response of two-dimensional solids and structures undergoing finite deformations under a wide variety of large deformation transient dynamic and static problems and heat transfer analyses.

  9. Screening for pulmonary tuberculosis in a Tanzanian prison and computer-aided interpretation of chest X-rays

    PubMed Central

    Mangu, C.; van den Hombergh, J.; van Deutekom, H.; van Ginneken, B.; Clowes, P.; Mhimbira, F.; Mfinanga, S.; Rachow, A.; Hoelscher, M.

    2015-01-01

    Setting: Tanzania is a high-burden country for tuberculosis (TB), and prisoners are a high-risk group that should be screened actively, as recommended by the World Health Organization. Screening algorithms, starting with chest X-rays (CXRs), can detect asymptomatic cases, but depend on experienced readers, who are scarce in the penitentiary setting. Recent studies with patients seeking health care for TB-related symptoms showed good diagnostic performance of the computer software CAD4TB. Objective: To assess the potential of computer-assisted screening using CAD4TB in a predominantly asymptomatic prison population. Design: Cross-sectional study. Results: CAD4TB and seven health care professionals reading CXRs in local tuberculosis wards evaluated a set of 511 CXRs from the Ukonga prison in Dar es Salaam. Performance was compared using a radiological reference. Two readers performed significantly better than CAD4TB, three were comparable, and two performed significantly worse (area under the curve 0.75 in receiver operating characteristics analysis). On a superset of 1321 CXRs, CAD4TB successfully interpreted >99%, with a predictably short time to detection, while 160 (12.2%) reports were delayed by over 24 h with conventional CXR reading. Conclusion: CAD4TB reliably evaluates CXRs from a mostly asymptomatic prison population, with a diagnostic performance inferior to that of expert readers but comparable to local readers. PMID:26767179

  10. An automated tuberculosis screening strategy combining X-ray-based computer-aided detection and clinical information

    PubMed Central

    Melendez, Jaime; Sánchez, Clara I.; Philipsen, Rick H. H. M.; Maduskar, Pragnya; Dawson, Rodney; Theron, Grant; Dheda, Keertan; van Ginneken, Bram

    2016-01-01

    Lack of human resources and radiological interpretation expertise impairs tuberculosis (TB) screening programmes in TB-endemic countries. Computer-aided detection (CAD) constitutes a viable alternative for chest radiograph (CXR) reading. However, no automated techniques that exploit the additional clinical information typically available during screening exist. To address this issue and optimally exploit this information, a machine learning-based combination framework is introduced. We have evaluated this framework on a database containing 392 patient records from suspected TB subjects prospectively recruited in Cape Town, South Africa. Each record comprised a CAD score, automatically computed from a CXR, and 12 clinical features. Comparisons with strategies relying on either CAD scores or clinical information alone were performed. Our results indicate that the combination framework outperforms the individual strategies in terms of the area under the receiver operating characteristic curve (0.84 versus 0.78 and 0.72), specificity at 95% sensitivity (49% versus 24% and 31%) and negative predictive value (98% versus 95% and 96%). Thus, it is believed that combining CAD and clinical information to estimate the risk of active disease is a promising tool for TB screening. PMID:27126741
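
    The sketch below illustrates the combination idea in general terms, assuming a logistic-regression combiner over a CAD score plus 12 clinical features, evaluated by the area under the ROC curve and the specificity at 95% sensitivity. The synthetic data and the choice of combiner are illustrative assumptions, not the paper's exact framework.

```python
# Hedged sketch of combining a radiograph CAD score with clinical features.
# All data are synthetic placeholders; only the evaluation metrics mirror the abstract.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
n = 392                                             # records, as in the study
cad_score = rng.uniform(0, 100, n)                  # CAD score computed from the CXR
clinical = rng.standard_normal((n, 12))             # 12 clinical features
y = (cad_score / 100 + clinical[:, 0] + rng.standard_normal(n) > 1.0).astype(int)

X = np.column_stack([cad_score, clinical])
prob = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]

fpr, tpr, _ = roc_curve(y, prob)
auc = roc_auc_score(y, prob)
spec_at_95_sens = 1 - fpr[np.argmax(tpr >= 0.95)]   # first operating point reaching 95% sensitivity
print(f"AUC={auc:.2f}, specificity at 95% sensitivity={spec_at_95_sens:.2f}")
```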

  11. Inhibitory effects of phytochemicals on metabolic capabilities of CYP2D6*1 and CYP2D6*10 using cell-based models in vitro

    PubMed Central

    Qu, Qiang; Qu, Jian; Han, Lu; Zhan, Min; Wu, Lan-xiang; Zhang, Yi-wen; Zhang, Wei; Zhou, Hong-hao

    2014-01-01

    Aim: Herbal products have been widely used, and the safety of herb-drug interactions has raised considerable concern. This study aimed to investigate the effects of phytochemicals on the catalytic activities of human CYP2D6*1 and CYP2D6*10 in vitro. Methods: HepG2 cells were stably transfected with CYP2D6*1 and CYP2D6*10 expression vectors. The metabolic kinetics of the enzymes were studied using HPLC and fluorimetry. Results: HepG2-CYP2D6*1 and HepG2-CYP2D6*10 cell lines were successfully constructed. Among the 63 phytochemicals screened, 6 compounds, namely coptisine sulfate, bilobalide, schizandrin B, luteolin, schizandrin A and puerarin, at 100 μmol/L inhibited CYP2D6*1- and CYP2D6*10-mediated O-demethylation of a coumarin compound AMMC by more than 50%. Furthermore, the inhibition by these compounds was dose-dependent. Eadie-Hofstee plots demonstrated that these compounds competitively inhibited CYP2D6*1 and CYP2D6*10. However, their Ki values for CYP2D6*1 and CYP2D6*10 were very close, suggesting that genotype-dependent herb-drug inhibition was similar between the two variants. Conclusion: Six phytochemicals inhibit CYP2D6*1- and CYP2D6*10-mediated catalytic activities in a dose-dependent manner in vitro. Thus, herbal products containing these phytochemicals may inhibit the in vivo metabolism of co-administered drugs whose primary route of elimination is CYP2D6. PMID:24786236

  12. PERSONAL COMPUTER MONITORS: A SCREENING EVALUATION OF VOLATILE ORGANIC EMISSIONS FROM EXISTING PRINTED CIRCUIT BOARD LAMINATES AND POTENTIAL POLLUTION PREVENTION ALTERNATIVES

    EPA Science Inventory

    The report gives results of a screening evaluation of volatile organic emissions from printed circuit board laminates and potential pollution prevention alternatives. In the evaluation, printed circuit board laminates, without circuitry, commonly found in personal computer (PC) m...

  13. Ultrafast 2D NMR: an emerging tool in analytical spectroscopy.

    PubMed

    Giraudeau, Patrick; Frydman, Lucio

    2014-01-01

    Two-dimensional nuclear magnetic resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan manner. A decade ago, a so-called ultrafast (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or heteronuclear correlation, in a single scan. During the intervening years, the performance of this subsecond 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool experiencing an expanded scope of applications. This review summarizes the principles and main developments that have contributed to the success of this approach and focuses on applications that have been recently demonstrated in various areas of analytical chemistry, from the real-time monitoring of chemical and biochemical processes to extensions in hyphenated techniques and in quantitative applications. PMID:25014342

  14. Beneficial effects through aggressive coronary screening for type 2 diabetes patients with advanced vascular complications.

    PubMed

    Tsujimoto, Tetsuro; Sugiyama, Takehiro; Yamamoto-Honda, Ritsuko; Kishimoto, Miyako; Noto, Hiroshi; Morooka, Miyako; Kubota, Kazuo; Kamimura, Munehiro; Hara, Hisao; Kajio, Hiroshi; Kakei, Masafumi; Noda, Mitsuhiko

    2016-08-01

    Glycemic control alone does not reduce cardiovascular events in patients with type 2 diabetes (T2D), and routine screening of all T2D patients for asymptomatic coronary artery disease (CAD) is not effective for preventing acute cardiac events. We examined the effectiveness of an aggressive screening protocol for asymptomatic CAD in T2D patients with advanced vascular complications. We designed a 3-year cohort study investigating the effectiveness of the aggressive coronary screening for T2D patients with advanced vascular complications and no known coronary events using propensity score adjusted analysis at a national center in Japan. Eligibility criteria included T2D without known coronary events and with any 1 of the following 4 complications: advanced diabetic retinopathy, advanced chronic kidney disease, peripheral artery disease, or cerebrovascular disease. In the aggressive screening group (n = 122), all patients received stress single photon emission computed tomography and those exhibiting myocardial perfusion abnormalities underwent coronary angiography. In the conventional screening group (n = 108), patients were examined for CAD at the discretion of their medical providers. Primary endpoint was composite outcome of cardiovascular death and nonfatal cardiovascular events. Asymptomatic CAD with ≥70% stenosis was detected in 39.3% of patients completing aggressive screening. The proportions achieving revascularization and receiving intensive medical therapy within 90 days after the screening were significantly higher in the aggressive screening group than in the conventional screening group [19.7% vs 0% (P < 0.001) and 48.4% vs 9.3% (P < 0.001), respectively]. The cumulative rate of primary composite outcome was significantly lower in the aggressive screening group according to a propensity score adjusted Cox proportional hazards model (hazard ratio, 0.35; 95% confidence interval, 0.12-0.96; P = 0.04). Aggressive coronary screening for T2D patients

  15. Circulating microRNA signature as liquid-biopsy to monitor lung cancer in low-dose computed tomography screening.

    PubMed

    Sestini, Stefano; Boeri, Mattia; Marchiano, Alfonso; Pelosi, Giuseppe; Galeone, Carlotta; Verri, Carla; Suatoni, Paola; Sverzellati, Nicola; La Vecchia, Carlo; Sozzi, Gabriella; Pastorino, Ugo

    2015-10-20

    Liquid biopsies can detect biomarkers carrying information on the development and progression of cancer. We demonstrated that a 24 plasma-based microRNA signature classifier (MSC) was capable of increasing the specificity of low dose computed tomography (LDCT) in a lung cancer screening trial. In the present study, we tested the prognostic performance of MSC, and its ability to monitor disease status recurrence in LDCT screening-detected lung cancers. Between 2000 and 2010, 3411 heavy smokers enrolled in two screening programmes, underwent annual or biennial LDCT. During the first five years of screening, 84 lung cancer patients were classified according to one of the three MSC levels of risk: high, intermediate or low. Kaplan-Meier survival analysis was performed according to MSC and clinico-pathological information. Follow-up MSC analysis was performed on longitudinal plasma samples (n = 100) collected from 31 patients before and after surgical resection. Five-year survival was 88.9% for low risk, 79.5% for intermediate risk and 40.1% for high risk MSC (p = 0.001). The prognostic power of MSC persisted after adjusting for tumor stage (p = 0.02) and when the analysis was restricted to LDCT-detected cases after exclusion of interval cancers (p < 0.001). The MSC risk level decreased after surgery in 76% of the 25 high-intermediate subjects who remained disease free, whereas in relapsing patients an increase of the MSC risk level was observed at the time of detection of second primary tumor or metastatic progression. These results encourage exploiting the MSC test for lung cancer monitoring in LDCT screening for lung cancer.

  16. Circulating microRNA signature as liquid-biopsy to monitor lung cancer in low-dose computed tomography screening

    PubMed Central

    Marchiano, Alfonso; Pelosi, Giuseppe; Galeone, Carlotta; Verri, Carla; Suatoni, Paola; Sverzellati, Nicola

    2015-01-01

    Liquid biopsies can detect biomarkers carrying information on the development and progression of cancer. We demonstrated that a 24 plasma-based microRNA signature classifier (MSC) was capable of increasing the specificity of low dose computed tomography (LDCT) in a lung cancer screening trial. In the present study, we tested the prognostic performance of MSC, and its ability to monitor disease status recurrence in LDCT screening-detected lung cancers. Between 2000 and 2010, 3411 heavy smokers enrolled in two screening programmes, underwent annual or biennial LDCT. During the first five years of screening, 84 lung cancer patients were classified according to one of the three MSC levels of risk: high, intermediate or low. Kaplan-Meier survival analysis was performed according to MSC and clinico-pathological information. Follow-up MSC analysis was performed on longitudinal plasma samples (n = 100) collected from 31 patients before and after surgical resection. Five-year survival was 88.9% for low risk, 79.5% for intermediate risk and 40.1% for high risk MSC (p = 0.001). The prognostic power of MSC persisted after adjusting for tumor stage (p = 0.02) and when the analysis was restricted to LDCT-detected cases after exclusion of interval cancers (p < 0.001). The MSC risk level decreased after surgery in 76% of the 25 high-intermediate subjects who remained disease free, whereas in relapsing patients an increase of the MSC risk level was observed at the time of detection of second primary tumor or metastatic progression. These results encourage exploiting the MSC test for lung cancer monitoring in LDCT screening for lung cancer. PMID:26451608

  17. Designing Second Generation Anti-Alzheimer Compounds as Inhibitors of Human Acetylcholinesterase: Computational Screening of Synthetic Molecules and Dietary Phytochemicals

    PubMed Central

    Amat-ur-Rasool, Hafsa; Ahmed, Mehboob

    2015-01-01

    Alzheimer's disease (AD), a major cause of memory loss, is a progressive neurodegenerative disorder. The disease leads to irreversible loss of neurons that results in a reduced level of the neurotransmitter acetylcholine (ACh). The reduction of the ACh level impairs brain functioning. One aspect of AD therapy is to maintain the ACh level up to a safe limit by blocking acetylcholinesterase (AChE), an enzyme that is naturally responsible for its degradation. This research presents in silico screening and design of hAChE inhibitors as potential anti-Alzheimer drugs. Molecular docking results for the database-retrieved ligands (synthetic chemicals and dietary phytochemicals) and self-drawn ligands were compared with Food and Drug Administration (FDA) approved drugs against AD as controls. Furthermore, computational ADME studies were performed on the hits to assess their safety. Human AChE was found to be the most appropriate target site as compared to the commonly used Torpedo AChE. Among the tested dietary phytochemicals, berberastine, berberine, yohimbine, sanguinarine, elemol and naringenin are noteworthy as potential anti-Alzheimer drugs. The synthetic leads were mostly dual-binding-site inhibitors with two binding subunits linked by a carbon chain, i.e., second-generation AD drugs. Fifteen new heterodimers were designed that were computationally more efficient inhibitors than previously reported compounds. Using computational methods, compounds present in online chemical databases can be screened to design more efficient and safer drugs against the cognitive symptoms of AD. PMID:26325402

  18. Designing Second Generation Anti-Alzheimer Compounds as Inhibitors of Human Acetylcholinesterase: Computational Screening of Synthetic Molecules and Dietary Phytochemicals.

    PubMed

    Amat-Ur-Rasool, Hafsa; Ahmed, Mehboob

    2015-01-01

    Alzheimer's disease (AD), a major cause of memory loss, is a progressive neurodegenerative disorder. The disease leads to irreversible loss of neurons that results in a reduced level of the neurotransmitter acetylcholine (ACh). The reduction of the ACh level impairs brain functioning. One aspect of AD therapy is to maintain the ACh level up to a safe limit by blocking acetylcholinesterase (AChE), an enzyme that is naturally responsible for its degradation. This research presents in silico screening and design of hAChE inhibitors as potential anti-Alzheimer drugs. Molecular docking results for the database-retrieved ligands (synthetic chemicals and dietary phytochemicals) and self-drawn ligands were compared with Food and Drug Administration (FDA) approved drugs against AD as controls. Furthermore, computational ADME studies were performed on the hits to assess their safety. Human AChE was found to be the most appropriate target site as compared to the commonly used Torpedo AChE. Among the tested dietary phytochemicals, berberastine, berberine, yohimbine, sanguinarine, elemol and naringenin are noteworthy as potential anti-Alzheimer drugs. The synthetic leads were mostly dual-binding-site inhibitors with two binding subunits linked by a carbon chain, i.e., second-generation AD drugs. Fifteen new heterodimers were designed that were computationally more efficient inhibitors than previously reported compounds. Using computational methods, compounds present in online chemical databases can be screened to design more efficient and safer drugs against the cognitive symptoms of AD. PMID:26325402

  19. Assessment of an Interactive Computer-Based Patient Prenatal Genetic Screening and Testing Education Tool

    ERIC Educational Resources Information Center

    Griffith, Jennifer M.; Sorenson, James R.; Bowling, J. Michael; Jennings-Grant, Tracey

    2005-01-01

    The Enhancing Patient Prenatal Education study tested the feasibility and educational impact of an interactive program for patient prenatal genetic screening and testing education. Patients at two private practices and one public health clinic participated (N = 207). The program collected knowledge and measures of anxiety before and after use of…

  20. Combinatorial screening of polymer precursors for preparation of benzo[a]pyrene imprinted polymer: an ab initio computational approach.

    PubMed

    Khan, Muntazir S; Wate, Prateek S; Krupadam, Reddithota J

    2012-05-01

    A combinatorial screening procedure was used for the selection of polymer precursors in the preparation of a molecularly imprinted polymer (MIP), which is useful in the detection of the air pollution marker molecule benzo[a]pyrene (BAP). Molecular imprinting is a technique for the preparation of polymer materials with specific molecular recognition receptors. The preparation of imprinted polymers requires polymer precursors such as a functional monomer, a cross-linking monomer, a solvent and a polymerization initiator, together with thermal or UV radiation. A virtual library of functional monomers was prepared based on interaction binding scores computed using HyperChem Release 8.0 software. Initially, the minimum-energy conformations of the monomers and BAP were optimized using the semi-empirical PM3 quantum method. The binding energy between the functional monomer and the template (BAP) was then computed using the ab initio Hartree-Fock (HF) method with the 6-31G basis set, refined with Møller-Plesset second-order perturbation theory (MP2). From the computations, methacrylic acid (MAA) and ethylene glycol dimethacrylate (EGDMA) were selected for the preparation of the BAP-imprinted polymer. A larger interaction energy (ΔE) indicates the formation of more high-affinity binding sites in the polymer, which provides a higher binding capacity. The theoretical predictions were complemented by adsorption experiments. The good agreement between the experimental binding results and the theoretical computations provides further evidence of the usefulness of computational screening procedures for selecting appropriate MIP precursors in an experiment-free way.
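
    A minimal sketch of the ranking criterion implied above: candidate functional monomers are ordered by the monomer-template interaction energy ΔE = E(complex) - E(monomer) - E(template). The single-point energies below are placeholder values, not results from the paper; in practice they would come from an ab initio calculation.

```python
# Minimal sketch of the screening criterion: rank candidate functional monomers by
# the monomer-template interaction energy dE = E(complex) - E(monomer) - E(template).
# All energies below (in hartree) are placeholder values, not results from the paper.
E_TEMPLATE = -845.123                                 # assumed energy of benzo[a]pyrene

candidates = {
    # monomer: (E_monomer, E_monomer_template_complex) -- illustrative numbers only
    "methacrylic acid": (-305.512, -1150.662),
    "acrylamide":       (-246.834, -1091.965),
    "4-vinylpyridine":  (-324.208, -1169.340),
}

def interaction_energy(e_monomer, e_complex):
    return e_complex - e_monomer - E_TEMPLATE         # more negative = stronger binding

ranked = sorted(candidates.items(), key=lambda kv: interaction_energy(*kv[1]))
for name, (e_m, e_c) in ranked:
    print(f"{name:18s} dE = {interaction_energy(e_m, e_c):+.3f} hartree")
```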

  1. On the sensitivity of the 2D electromagnetic invisibility cloak

    NASA Astrophysics Data System (ADS)

    Kaproulias, S.; Sigalas, M. M.

    2012-10-01

    A computational study of the sensitivity of the two dimensional (2D) electromagnetic invisibility cloaks is performed with the finite element method. A circular metallic object is covered with the cloak and the effects of absorption, gain and disorder are examined. Also the effect of covering the cloak with a thin dielectric layer is studied.

  2. Sparse radar imaging using 2D compressed sensing

    NASA Astrophysics Data System (ADS)

    Hou, Qingkai; Liu, Yang; Chen, Zengping; Su, Shaoying

    2014-10-01

    Radar imaging is an ill-posed linear inverse problem, and compressed sensing (CS) has been shown to have tremendous potential in this field. This paper surveys the theory of radar imaging and concludes that ISAR imaging can be formulated mathematically as a 2D sparse decomposition problem. Based on CS, we propose a novel measurement strategy for ISAR imaging radar that uses random sub-sampling in both the range and azimuth dimensions, which reduces the amount of sampled data tremendously. To handle the 2D reconstruction problem, the ordinary solution is to convert the 2D problem into a 1D one via the Kronecker product, which sharply increases the dictionary size and computational cost. In this paper, we instead introduce the 2D-SL0 algorithm for image reconstruction. It is shown that 2D-SL0 achieves results equivalent to those of 1D reconstruction methods while significantly reducing computational complexity and memory usage. Moreover, we present the results of simulation experiments that demonstrate the effectiveness and feasibility of our method.
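
    The memory penalty of the Kronecker (1D) formulation mentioned above can be seen in a few lines of code. The sketch below is illustrative only: the grid and sub-sampling sizes are assumptions, and it demonstrates the algebraic equivalence of the 2D and vectorised measurement models rather than the 2D-SL0 reconstruction itself.

```python
# Sketch: the 2D measurement model Y = Phi_r X Phi_a^T versus its 1D Kronecker form.
# Sizes are illustrative assumptions, not taken from the paper.
import numpy as np

N_r, N_a = 64, 64          # range and azimuth grid sizes (assumed)
M_r, M_a = 16, 16          # randomly sub-sampled measurements per dimension (assumed)

rng = np.random.default_rng(0)
Phi_r = rng.standard_normal((M_r, N_r))   # range measurement matrix
Phi_a = rng.standard_normal((M_a, N_a))   # azimuth measurement matrix

X = np.zeros((N_r, N_a))                  # sparse scene with a few scatterers
X[rng.integers(0, N_r, 10), rng.integers(0, N_a, 10)] = 1.0

# 2D form: the operators stay small (16 x 64 each)
Y = Phi_r @ X @ Phi_a.T

# Equivalent 1D form via the Kronecker product: y = (Phi_a kron Phi_r) vec(X)
A = np.kron(Phi_a, Phi_r)                 # (M_r*M_a) x (N_r*N_a) = 256 x 4096
y = A @ X.flatten(order="F")              # column-major vectorisation

print(A.shape)                                       # the dictionary blow-up
print(np.allclose(y, Y.flatten(order="F")))          # True: both forms agree
```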

  3. Stuck on Screens: Patterns of Computer and Gaming Station Use in Youth Seen in a Psychiatric Clinic

    PubMed Central

    Baer, Susan; Bogusz, Elliot; Green, David A.

    2011-01-01

    Objective: Computer and gaming-station use has become entrenched in the culture of our youth. Parents of children with psychiatric disorders report concerns about overuse, but research in this area is limited. The goal of this study is to evaluate computer/gaming-station use in adolescents in a psychiatric clinic population and to examine the relationship between use and functional impairment. Method: 102 adolescents, ages 11–17, from out-patient psychiatric clinics participated. Amount of computer/gaming-station use, type of use (gaming or non-gaming), and presence of addictive features were ascertained along with emotional/functional impairment. Multivariate linear regression was used to examine correlations between patterns of use and impairment. Results: Mean screen time was 6.7±4.2 hrs/day. Presence of addictive features was positively correlated with emotional/functional impairment. Time spent on computer/gaming-station use was not correlated overall with impairment after controlling for addictive features, but non-gaming time was positively correlated with risky behavior in boys. Conclusions: Youth with psychiatric disorders are spending much of their leisure time on the computer/gaming-station and a substantial subset show addictive features of use which is associated with impairment. Further research to develop measures and to evaluate risk is needed to identify the impact of this problem. PMID:21541096

  4. Perspectives for spintronics in 2D materials

    NASA Astrophysics Data System (ADS)

    Han, Wei

    2016-03-01

    The past decade has been especially creative for spintronics since the (re)discovery of various two dimensional (2D) materials. Due to the unusual physical characteristics, 2D materials have provided new platforms to probe the spin interaction with other degrees of freedom for electrons, as well as to be used for novel spintronics applications. This review briefly presents the most important recent and ongoing research for spintronics in 2D materials.

  5. Screening for Substance Use Disorder among Incarcerated Men with the Alcohol, Smoking, Substance Involvement Screening Test (ASSIST): A Comparative Analysis of Computer-administered and Interviewer-administered Modalities

    PubMed Central

    Wolff, Nancy; Shi, Jing

    2015-01-01

    Substance use disorders are overrepresented in incarcerated male populations. Cost-effective screening for alcohol and substance use problems among incarcerated populations is a necessary first step toward intervention. The Alcohol, Smoking, and Substance Involvement Screening Test (ASSIST) holds promise because it has strong psychometric properties, requires minimal training, is easy to score, and is available in the public domain, but, because of complicated skip patterns, cannot be self-administered. This study tests the feasibility, reliability, and validity of using computer-administered self-interviewing (CASI) versus interviewer-administered interviewing (IAI) to screen for substance use problems among incarcerated men using the ASSIST. A 2 × 2 factorial design was used to randomly assign 396 incarcerated men to screening modality. Findings indicate that computer screening was feasible. Compared to IAI, CASI produced equally reliable screening information on substance use and symptom severity, with test-retest intraclass correlations for ASSIST total and substance-specific scores ranging from 0.7 to 0.9, and ASSIST substance-specific scores and a substance abuse disorder diagnosis based on the Structured Clinical Interview (SCID) were significantly correlated for IAI and CASI. These findings indicate that data on substance use and symptom severity using the ASSIST can be reliably and validly obtained with CASI technology, increasing the efficiency with which incarcerated populations can be screened for substance use problems and those at risk identified for treatment. PMID:25659203

  6. Radiative heat transfer in 2D Dirac materials.

    PubMed

    Rodriguez-López, Pablo; Tse, Wang-Kong; Dalvit, Diego A R

    2015-06-01

    We compute the radiative heat transfer between two sheets of 2D Dirac materials, including topological Chern insulators and graphene, within the framework of the local approximation for the optical response of these materials. In this approximation, which neglects spatial dispersion, we derive both numerically and analytically the short-distance asymptotic of the near-field heat transfer in these systems, and show that it scales as the inverse of the distance between the two sheets. Finally, we discuss the limitations to the validity of this scaling law imposed by spatial dispersion in 2D Dirac materials. PMID:25965703

  7. Radiative heat transfer in 2D Dirac materials

    DOE PAGES

    Rodriguez-López, Pablo; Tse, Wang -Kong; Dalvit, Diego A. R.

    2015-05-12

    We compute the radiative heat transfer between two sheets of 2D Dirac materials, including topological Chern insulators and graphene, within the framework of the local approximation for the optical response of these materials. In this approximation, which neglects spatial dispersion, we derive both numerically and analytically the short-distance asymptotic of the near-field heat transfer in these systems, and show that it scales as the inverse of the distance between the two sheets. In conclusion, we discuss the limitations to the validity of this scaling law imposed by spatial dispersion in 2D Dirac materials.

  8. Tools for building a comprehensive modeling system for virtual screening under real biological conditions: The Computational Titration algorithm.

    PubMed

    Kellogg, Glen E; Fornabaio, Micaela; Chen, Deliang L; Abraham, Donald J; Spyrakis, Francesca; Cozzini, Pietro; Mozzarelli, Andrea

    2006-05-01

    Computational tools utilizing a unique empirical modeling system based on the hydrophobic effect and the measurement of logP(o/w) (the partition coefficient for solvent transfer between 1-octanol and water) are described. The associated force field, Hydropathic INTeractions (HINT), contains rich information about non-covalent interactions in the biological environment because it is based on an experiment that measures interactions in solution. HINT is shown to be the core of an evolving virtual screening system that is capable of taking into account a number of factors often ignored, such as entropy, the effects of solvent molecules at the active site, and the ionization states of acidic and basic residues and ligand functional groups. The outline of a comprehensive modeling system for virtual screening that incorporates these features is described. In addition, a detailed description of the Computational Titration algorithm is provided. As an example, three complexes of dihydrofolate reductase (DHFR) are analyzed with our system and these results are compared with the experimental free energies of binding.

  9. Comparison of storage phosphor computed radiography with conventional film-screen radiography in the recognition of pneumoconiosis.

    PubMed

    Laney, A S; Petsonk, E L; Wolfe, A L; Attfield, M D

    2010-07-01

    Traditional film-screen radiography (FSR) has been useful in the recognition and evaluation of interstitial lung diseases, but is becoming increasingly obsolete. To evaluate the applicability of storage phosphor digital computed radiography (CR) images in the recognition of small lung opacities, we compared image quality and the profusion of small opacities between FSR and CR radiographs. We screened 1,388 working coal miners during the course of the study with FSR and CR images obtained on the same day from all participants. Each traditional chest film was independently interpreted by two of eight experienced readers using the International Labour Office (ILO) classification of radiographs of pneumoconiosis, as were CR images displayed on medical-grade computer monitors. The prevalence of small opacities (ILO category 1/0 or greater) did not differ between the two imaging modalities (5.2% for FSR and 4.8% for soft copy CR; p>0.50). Inter-reader agreement was also similar between FSR and CR. Significant differences between image modalities were observed in the shape of small opacities, and in the proportion of miners demonstrating high opacity profusion (category 2/1 and above). Our results indicate that, with appropriate attention to image acquisition and soft copy display, CR digital radiography can be equivalent to FSR in the identification of small interstitial lung opacities. PMID:19926739

  10. Quantitative 2D liquid-state NMR.

    PubMed

    Giraudeau, Patrick

    2014-06-01

    Two-dimensional (2D) liquid-state NMR has a very high potential to simultaneously determine the absolute concentration of small molecules in complex mixtures, thanks to its capacity to separate overlapping resonances. However, it suffers from two main drawbacks that probably explain its relatively late development. First, the 2D NMR signal is strongly molecule-dependent and site-dependent; second, the long duration of 2D NMR experiments prevents its general use for high-throughput quantitative applications and affects its quantitative performance. Fortunately, the last 10 years has witnessed an increasing number of contributions where quantitative approaches based on 2D NMR were developed and applied to solve real analytical issues. This review aims at presenting these recent efforts to reach a high trueness and precision in quantitative measurements by 2D NMR. After highlighting the interest of 2D NMR for quantitative analysis, the different strategies to determine the absolute concentrations from 2D NMR spectra are described and illustrated by recent applications. The last part of the manuscript concerns the recent development of fast quantitative 2D NMR approaches, aiming at reducing the experiment duration while preserving, or even increasing, the analytical performance. We hope that this comprehensive review will help readers to apprehend the current landscape of quantitative 2D NMR, as well as the perspectives that may arise from it.

  11. STEALTH - a Lagrange explicit finite-difference code for solid, structural, and thermohydraulic analysis. Volume 8A: STEALTH/WHAMSE - a 2-D fluid-structure interaction code. Computer code manual

    SciTech Connect

    Gross, M.B.

    1984-10-01

    STEALTH is a family of computer codes that can be used to calculate a variety of physical processes in which the dynamic behavior of a continuum is involved. The version of STEALTH described in this volume is designed for calculations of fluid-structure interaction. This version of the program consists of a hydrodynamic version of STEALTH which has been coupled to a finite-element code, WHAMSE. STEALTH computes the transient response of the fluid continuum, while WHAMSE computes the transient response of shell and beam structures under external fluid loadings. The coupling between STEALTH and WHAMSE is performed during each cycle or step of a calculation. Separate calculations of fluid response and structural response are avoided, thereby giving a more accurate model of the dynamic coupling between fluid and structure. This volume provides the theoretical background, the finite-difference equations, the finite-element equations, a discussion of several sample problems, a listing of the input decks for the sample problems, a programmer's manual and a description of the input records for the STEALTH/WHAMSE computer program.

  12. Computational Toxicology as Implemented by the U.S. EPA: Providing High Throughput Decision Support Tools for Screening and Assessing Chemical Exposure, Hazard and Risk

    EPA Science Inventory

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environ...

  13. Screen time and children

    MedlinePlus

    "Screen time" is a term used for activities done in front of a screen, such as watching TV, working on a computer, or playing video games. Screen time is sedentary activity, meaning you are being physically ...

  14. Computer methodology for transportation agencies to screen technologies for hazardous waste remediation. Technical report

    SciTech Connect

    Grenney, W.J.; Penmetsa, R.K.

    1993-03-01

    The objective of this study was to develop a user-friendly computerized methodology for screening out the most inappropriate treatment technologies for a specific waste at a specific site. The STEP model was developed for this purpose using knowledge-base expert system techniques. Object-oriented programming was used to interface multiple rule-bases, databases, and a simulation model. The STEP model was applied to a case study involving the spillage of 27,000 gallons of JP-4 jet fuel, due to the failure of an automatic shut-off valve, at an air facility.

  15. 2D bifurcations and Newtonian properties of memristive Chua's circuits

    NASA Astrophysics Data System (ADS)

    Marszalek, W.; Podhaisky, H.

    2016-01-01

    Two interesting properties of Chua's circuits are presented. First, two-parameter bifurcation diagrams of Chua's oscillatory circuits with memristors are shown. To obtain various 2D bifurcation images, a substantial numerical effort, possibly with parallel computations, is needed. The numerical algorithm is described first, and its numerical code for 2D bifurcation image creation is available for free download. Several color 2D images and the corresponding 1D greyscale bifurcation diagrams are included. Secondly, Chua's circuits are linked to Newton's law φ'' = F(t, φ, φ')/m, where φ denotes the flux, m > 0 is a constant, and the force term F(t, φ, φ') contains memory terms. Finally, the scalar jounce equations for Chua's circuits are also discussed.
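
    A minimal sketch of how such a two-parameter bifurcation image can be assembled (this is not the authors' code, and it uses the standard piecewise-linear Chua circuit rather than a memristive one): each point of an (α, β) grid is integrated, a transient is discarded, and the number of distinct peak amplitudes of x(t) is stored as a crude periodicity measure. The grid size, integration times and peak criterion are illustrative choices, and each grid point can be processed in parallel.

```python
# Sketch of building a two-parameter (2D) bifurcation image for the standard
# piecewise-linear Chua circuit (an illustrative stand-in for the memristive variants).
import numpy as np
from scipy.integrate import solve_ivp
from scipy.signal import find_peaks

M0, M1 = -8.0 / 7.0, -5.0 / 7.0                      # canonical Chua diode slopes

def chua(t, s, alpha, beta):
    x, y, z = s
    h = M1 * x + 0.5 * (M0 - M1) * (abs(x + 1.0) - abs(x - 1.0))
    return [alpha * (y - x - h), x - y + z, -beta * y]

def peak_measure(alpha, beta):
    sol = solve_ivp(chua, (0.0, 200.0), [0.1, 0.0, 0.0], args=(alpha, beta),
                    t_eval=np.linspace(120.0, 200.0, 3000), rtol=1e-6, atol=1e-8)
    x = sol.y[0] if sol.success else np.array([])
    if x.size == 0 or not np.all(np.isfinite(x)):
        return -1                                    # diverged or failed run
    peaks, _ = find_peaks(x)
    # 1 distinct peak height ~ period-1 orbit, a few ~ period-n, many ~ chaos
    return len(np.unique(np.round(x[peaks], 2)))

alphas = np.linspace(8.0, 10.0, 40)                  # illustrative parameter window
betas = np.linspace(13.0, 16.0, 40)
image = np.array([[peak_measure(a, b) for a in alphas] for b in betas])
# 'image' can now be rendered (e.g. with matplotlib imshow) as the 2D bifurcation picture
```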

  16. 2D molybdenum disulphide (2D-MoS2) modified electrodes explored towards the oxygen reduction reaction.

    PubMed

    Rowley-Neale, Samuel J; Fearn, Jamie M; Brownson, Dale A C; Smith, Graham C; Ji, Xiaobo; Banks, Craig E

    2016-08-21

    Two-dimensional molybdenum disulphide nanosheets (2D-MoS2) have proven to be an effective electrocatalyst, with particular attention being focused on their use towards increasing the efficiency of the reactions associated with hydrogen fuel cells. Whilst the majority of research has focused on the Hydrogen Evolution Reaction (HER), herein we explore the use of 2D-MoS2 as a potential electrocatalyst for the much less researched Oxygen Reduction Reaction (ORR). We stray from literature conventions and perform experiments in 0.1 M H2SO4 acidic electrolyte for the first time, evaluating the electrochemical performance of the ORR with 2D-MoS2 electrically wired/immobilised upon several carbon based electrodes (namely; Boron Doped Diamond (BDD), Edge Plane Pyrolytic Graphite (EPPG), Glassy Carbon (GC) and Screen-Printed Electrodes (SPE)) whilst exploring a range of 2D-MoS2 coverages/masses. Consequently, the findings of this study are highly applicable to real world fuel cell applications. We show that significant improvements in ORR activity can be achieved through the careful selection of the underlying/supporting carbon materials that electrically wire the 2D-MoS2 and utilisation of an optimal mass of 2D-MoS2. The ORR onset is observed to be reduced to ca. +0.10 V for EPPG, GC and SPEs at 2D-MoS2 (1524 ng cm(-2) modification), which is far closer to Pt at +0.46 V compared to bare/unmodified EPPG, GC and SPE counterparts. This report is the first to demonstrate such beneficial electrochemical responses in acidic conditions using a 2D-MoS2 based electrocatalyst material on a carbon-based substrate (SPEs in this case). Investigation of the beneficial reaction mechanism reveals the ORR to occur via a 4 electron process in specific conditions; elsewhere a 2 electron process is observed. This work offers valuable insights for those wishing to design, fabricate and/or electrochemically test 2D-nanosheet materials towards the ORR. PMID:27448174

  17. 2D molybdenum disulphide (2D-MoS2) modified electrodes explored towards the oxygen reduction reaction.

    PubMed

    Rowley-Neale, Samuel J; Fearn, Jamie M; Brownson, Dale A C; Smith, Graham C; Ji, Xiaobo; Banks, Craig E

    2016-08-21

    Two-dimensional molybdenum disulphide nanosheets (2D-MoS2) have proven to be an effective electrocatalyst, with particular attention being focused on their use towards increasing the efficiency of the reactions associated with hydrogen fuel cells. Whilst the majority of research has focused on the Hydrogen Evolution Reaction (HER), herein we explore the use of 2D-MoS2 as a potential electrocatalyst for the much less researched Oxygen Reduction Reaction (ORR). We stray from literature conventions and perform experiments in 0.1 M H2SO4 acidic electrolyte for the first time, evaluating the electrochemical performance of the ORR with 2D-MoS2 electrically wired/immobilised upon several carbon based electrodes (namely; Boron Doped Diamond (BDD), Edge Plane Pyrolytic Graphite (EPPG), Glassy Carbon (GC) and Screen-Printed Electrodes (SPE)) whilst exploring a range of 2D-MoS2 coverages/masses. Consequently, the findings of this study are highly applicable to real world fuel cell applications. We show that significant improvements in ORR activity can be achieved through the careful selection of the underlying/supporting carbon materials that electrically wire the 2D-MoS2 and utilisation of an optimal mass of 2D-MoS2. The ORR onset is observed to be reduced to ca. +0.10 V for EPPG, GC and SPEs at 2D-MoS2 (1524 ng cm(-2) modification), which is far closer to Pt at +0.46 V compared to bare/unmodified EPPG, GC and SPE counterparts. This report is the first to demonstrate such beneficial electrochemical responses in acidic conditions using a 2D-MoS2 based electrocatalyst material on a carbon-based substrate (SPEs in this case). Investigation of the beneficial reaction mechanism reveals the ORR to occur via a 4 electron process in specific conditions; elsewhere a 2 electron process is observed. This work offers valuable insights for those wishing to design, fabricate and/or electrochemically test 2D-nanosheet materials towards the ORR.

  18. Mean flow and anisotropic cascades in decaying 2D turbulence

    NASA Astrophysics Data System (ADS)

    Liu, Chien-Chia; Cerbus, Rory; Gioia, Gustavo; Chakraborty, Pinaki

    2015-11-01

    Many large-scale atmospheric and oceanic flows are decaying 2D turbulent flows embedded in a non-uniform mean flow. Despite its importance for large-scale weather systems, the effect of non-uniform mean flows on decaying 2D turbulence remains unknown. In the absence of a mean flow, it is well known that decaying 2D turbulent flows exhibit the enstrophy cascade. More generally, for any 2D turbulent flow, all computational, experimental and field data amassed to date indicate that the spectra of longitudinal and transverse velocity fluctuations correspond to the same cascade, signifying isotropy of cascades. Here we report experiments on decaying 2D turbulence in soap films with a non-uniform mean flow. We find that the flow transitions from the usual isotropic enstrophy cascade to a series of unusual and, to our knowledge, never before observed or predicted, anisotropic cascades where the longitudinal and transverse spectra are mutually independent. We discuss implications of our results for decaying geophysical turbulence.
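
    The quantity at issue, the separate spectra of longitudinal and transverse velocity fluctuations, can be estimated from a 2D velocity field as sketched below; the synthetic random field is only an assumption used to keep the example self-contained, and matching spectral slopes would indicate an isotropic cascade.

```python
# Illustrative sketch (not from the paper): 1D longitudinal and transverse velocity
# spectra of a 2D field, computed along the streamwise direction.
import numpy as np

n = 256
rng = np.random.default_rng(1)
u = rng.standard_normal((n, n))        # streamwise velocity u, arrays indexed as [y, x]
v = rng.standard_normal((n, n))        # transverse velocity v, arrays indexed as [y, x]

def spectrum_along_x(field):
    """1D power spectrum along the streamwise (x) direction, averaged over y."""
    fhat = np.fft.rfft(field, axis=1) / field.shape[1]
    return np.mean(np.abs(fhat) ** 2, axis=0)

k = np.fft.rfftfreq(n)                 # streamwise wavenumbers (cycles per grid unit)
E_longitudinal = spectrum_along_x(u)   # spectrum of u fluctuations along x
E_transverse = spectrum_along_x(v)     # spectrum of v fluctuations along x

# For an isotropic cascade the two spectra share one power-law slope; mutually
# independent slopes would signal the anisotropic cascades reported above.
print(k[1:4], E_longitudinal[1:4], E_transverse[1:4])
```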

  19. Computational tool for the early screening of monoclonal antibodies for their viscosities

    PubMed Central

    Agrawal, Neeraj J; Helk, Bernhard; Kumar, Sandeep; Mody, Neil; Sathish, Hasige A.; Samra, Hardeep S.; Buck, Patrick M; Li, Li; Trout, Bernhardt L

    2016-01-01

    Highly concentrated antibody solutions often exhibit high viscosities, which present a number of challenges for antibody-drug development, manufacturing and administration. The antibody sequence is a key determinant for high viscosity of highly concentrated solutions; therefore, a sequence- or structure-based tool that can identify highly viscous antibodies from their sequence would be effective in ensuring that only antibodies with low viscosity progress to the development phase. Here, we present a spatial charge map (SCM) tool that can accurately identify highly viscous antibodies from their sequence alone (using homology modeling to determine the 3-dimensional structures). The SCM tool has been extensively validated at 3 different organizations, and has proved successful in correctly identifying highly viscous antibodies. As a quantitative tool, SCM is amenable to high-throughput automated analysis, and can be effectively implemented during the antibody screening or engineering phase for the selection of low-viscosity antibodies. PMID:26399600

  20. Computer methodology for transportation agencies to screen technologies for hazardous waste remediation. Volume 3. Technical report

    SciTech Connect

    Grenney, W.J.; Penmetsa, R.K.

    1993-03-01

    When transportation agencies become involved in the remediation of hazardous waste on their sites, the common practice is to hire consultants and contractors for the clean-up process. Because the field of hazardous waste site remediation is changing so rapidly, agency personnel evaluating the consultant's recommendations need to have access to the most recent regulatory and remediation information. Early stages of the remediation process typically involve site assessment and the identification of feasible technologies for treatment. A user-friendly computerized methodology was developed for screening out the most inappropriate treatment technologies for a specific waste at a specific site. The STEP model was developed for this purpose using knowledge-base expert system techniques. Object-oriented programming was used to interface multiple rule-bases, databases, and a simulation model.

  1. Identification of Potent Chemotypes Targeting Leishmania major Using a High-Throughput, Low-Stringency, Computationally Enhanced, Small Molecule Screen

    PubMed Central

    Sharlow, Elizabeth R.; Close, David; Shun, Tongying; Leimgruber, Stephanie; Reed, Robyn; Mustata, Gabriela; Wipf, Peter; Johnson, Jacob; O'Neil, Michael; Grögl, Max; Magill, Alan J.; Lazo, John S.

    2009-01-01

    Patients with clinical manifestations of leishmaniasis, including cutaneous leishmaniasis, have limited treatment options, and existing therapies frequently have significant untoward liabilities. Rapid expansion in the diversity of available cutaneous leishmanicidal chemotypes is the initial step in finding alternative efficacious treatments. To this end, we combined a low-stringency Leishmania major promastigote growth inhibition assay with a structural computational filtering algorithm. After a rigorous assay validation process, we interrogated ∼200,000 unique compounds for L. major promastigote growth inhibition. Using iterative computational filtering of the compounds exhibiting >50% inhibition, we identified 553 structural clusters and 640 compound singletons. Secondary confirmation assays yielded 93 compounds with EC50s ≤ 1 µM, with none of the identified chemotypes being structurally similar to known leishmanicidals and most having favorable in silico predicted bioavailability characteristics. The leishmanicidal activity of a representative subset of 15 chemotypes was confirmed in two independent assay formats, and L. major parasite specificity was demonstrated by assaying against a panel of human cell lines. Thirteen chemotypes inhibited the growth of a L. major axenic amastigote-like population. Murine in vivo efficacy studies using one of the new chemotypes document inhibition of footpad lesion development. These results authenticate that low stringency, large-scale compound screening combined with computational structure filtering can rapidly expand the chemotypes targeting in vitro and in vivo Leishmania growth and viability. PMID:19888337

  2. An Official American Thoracic Society/American College of Chest Physicians Policy Statement: Implementation of Low-Dose Computed Tomography Lung Cancer Screening Programs in Clinical Practice

    PubMed Central

    Wiener, Renda Soylemez; Gould, Michael K.; Arenberg, Douglas A.; Au, David H.; Fennig, Kathleen; Lamb, Carla R.; Mazzone, Peter J.; Midthun, David E.; Napoli, Maryann; Ost, David E.; Powell, Charles A.; Rivera, M. Patricia; Slatore, Christopher G.; Tanner, Nichole T.; Vachani, Anil; Wisnivesky, Juan P.; Yoon, Sue H.

    2015-01-01

    Rationale: Annual low-radiation-dose computed tomography (LDCT) screening for lung cancer has been shown to reduce lung cancer mortality among high-risk individuals and is now recommended by multiple organizations. However, LDCT screening is complex, and implementation requires careful planning to ensure benefits outweigh harms. Little guidance has been provided for sites wishing to develop and implement lung cancer screening programs. Objectives: To promote successful implementation of comprehensive LDCT screening programs that are safe, effective, and sustainable. Methods: The American Thoracic Society (ATS) and American College of Chest Physicians (CHEST) convened a committee with expertise in lung cancer screening, pulmonary nodule evaluation, and implementation science. The committee reviewed the evidence from systematic reviews, clinical practice guidelines, surveys, and the experience of early-adopting LDCT screening programs and summarized potential strategies to implement LDCT screening programs successfully. Measurements and Main Results: We address steps that sites should consider during the main three phases of developing an LDCT screening program: planning, implementation, and maintenance. We present multiple strategies to implement the nine core elements of comprehensive lung cancer screening programs enumerated in a recent CHEST/ATS statement, which will allow sites to select the strategy that best fits with their local context and workflow patterns. Although we do not comment on cost-effectiveness of LDCT screening, we outline the necessary costs associated with starting and sustaining a high-quality LDCT screening program. Conclusions: Following the strategies delineated in this policy statement may help sites to develop comprehensive LDCT screening programs that are safe and effective. PMID:26426785

  3. Application of computer-extracted breast tissue texture features in predicting false-positive recalls from screening mammography

    NASA Astrophysics Data System (ADS)

    Ray, Shonket; Choi, Jae Y.; Keller, Brad M.; Chen, Jinbo; Conant, Emily F.; Kontos, Despina

    2014-03-01

    Mammographic texture features have been shown to have value in breast cancer risk assessment. Previous models have also been developed that use computer-extracted mammographic features of breast tissue complexity to predict the risk of false-positive (FP) recall from breast cancer screening with digital mammography. This work details a novel locally-adaptive parenchymal texture analysis algorithm that identifies and extracts mammographic features of local parenchymal tissue complexity potentially relevant for false-positive biopsy prediction. This algorithm has two important aspects: (1) the adaptive nature of automatically determining an optimal number of regions of interest (ROIs) in the image and each ROI's corresponding size based on the parenchymal tissue distribution over the whole breast region and (2) characterizing both the local and global mammographic appearances of the parenchymal tissue that could provide more discriminative information for FP biopsy risk prediction. Preliminary results show that this locally-adaptive texture analysis algorithm, in conjunction with logistic regression, can predict the likelihood of false-positive biopsy with an ROC performance value of AUC=0.92 (p<0.001) with a 95% confidence interval [0.77, 0.94]. Significant texture feature predictors (p<0.05) included contrast, sum variance and difference average. Sensitivity for false-positives was 51% at the 100% cancer detection operating point. Although preliminary, clinical implications of using prediction models incorporating these texture features may include the future development of better tools and guidelines regarding personalized breast cancer screening recommendations. Further studies are warranted to prospectively validate our findings in larger screening populations and evaluate their clinical utility.
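
    A hedged sketch of the general approach (not the authors' algorithm): extract a grey-level co-occurrence matrix (GLCM) texture feature such as contrast from each ROI and feed it to a logistic-regression model of false-positive recall risk. The synthetic ROIs, the labels and the single-feature model are placeholder assumptions; scikit-image 0.19 or later and scikit-learn are assumed available.

```python
# Hedged sketch: one GLCM texture feature per ROI feeding a logistic-regression model
# of false-positive recall risk. All ROIs and labels below are synthetic placeholders.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

def glcm_contrast(roi_uint8):
    glcm = graycomatrix(roi_uint8, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return graycoprops(glcm, "contrast")[0, 0]

rois = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(100)]   # placeholder ROIs
labels = rng.integers(0, 2, 100)                                              # placeholder recall labels

X = np.array([[glcm_contrast(r)] for r in rois])        # one texture feature per ROI
model = LogisticRegression().fit(X, labels)
print("AUC:", roc_auc_score(labels, model.predict_proba(X)[:, 1]))
```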

  4. 2D materials for nanophotonic devices

    NASA Astrophysics Data System (ADS)

    Xu, Renjing; Yang, Jiong; Zhang, Shuang; Pei, Jiajie; Lu, Yuerui

    2015-12-01

    Two-dimensional (2D) materials have become very important building blocks for electronic, photonic, and phononic devices. The 2D material family has four key members, including the metallic graphene, transition metal dichalcogenide (TMD) layered semiconductors, semiconducting black phosphorous, and the insulating h-BN. Owing to the strong quantum confinements and defect-free surfaces, these atomically thin layers have offered us perfect platforms to investigate the interactions among photons, electrons and phonons. The unique interactions in these 2D materials are very important for both scientific research and application engineering. In this talk, I would like to briefly summarize and highlight the key findings, opportunities and challenges in this field. Next, I will introduce/highlight our recent achievements. We demonstrated atomically thin micro-lens and gratings using 2D MoS2, which is the thinnest optical component around the world. These devices are based on our discovery that the elastic light-matter interactions in highindex 2D materials is very strong. Also, I would like to introduce a new two-dimensional material phosphorene. Phosphorene has strongly anisotropic optical response, which creates 1D excitons in a 2D system. The strong confinement in phosphorene also enables the ultra-high trion (charged exciton) binding energies, which have been successfully measured in our experiments. Finally, I will briefly talk about the potential applications of 2D materials in energy harvesting.

  5. Internal Photoemission Spectroscopy of 2-D Materials

    NASA Astrophysics Data System (ADS)

    Nguyen, Nhan; Li, Mingda; Vishwanath, Suresh; Yan, Rusen; Xiao, Shudong; Xing, Huili; Cheng, Guangjun; Hight Walker, Angela; Zhang, Qin

    Recent research has shown the great benefits of using 2-D materials in the tunnel field-effect transistor (TFET), which is considered a promising candidate for beyond-CMOS technology. The on-state current of the TFET can be enhanced by engineering the band alignment of different 2D-2D or 2D-3D heterostructures. Here we present the internal photoemission spectroscopy (IPE) approach to determine the band alignments of various 2-D materials, in particular SnSe2 and WSe2, which have been proposed for new TFET designs. Metal-oxide-2-D semiconductor test structures are fabricated and characterized by IPE, where the band offsets from the 2-D semiconductor to the oxide conduction band minimum are determined from the threshold of the cube root of the IPE yield as a function of photon energy. In particular, we find that SnSe2 has a larger electron affinity than most semiconductors and can be combined with other semiconductors to form near broken-gap heterojunctions with low barrier heights, which can produce a higher on-state current. The details of the IPE data analysis and the results from Raman spectroscopy and spectroscopic ellipsometry measurements will also be presented and discussed.
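
    As a schematic illustration of the cube-root threshold analysis mentioned above (not taken from the record itself), the sketch below fits the linear near-threshold region of the cube root of the yield versus photon energy and extrapolates to zero yield; the arrays and fit window are hypothetical.

    ```python
    # Illustrative sketch (not from the record): estimating an IPE barrier height by
    # extrapolating the cube root of the photoemission yield to zero. `photon_ev`
    # and `yield_au` are assumed arrays of photon energy (eV) and measured yield.
    import numpy as np

    def ipe_threshold(photon_ev, yield_au, fit_window):
        """Linear fit of Y^(1/3) vs photon energy; the x-intercept is the barrier (eV)."""
        y13 = np.cbrt(yield_au)
        lo, hi = fit_window
        mask = (photon_ev >= lo) & (photon_ev <= hi)      # near-threshold linear region
        slope, intercept = np.polyfit(photon_ev[mask], y13[mask], 1)
        return -intercept / slope                          # energy where Y^(1/3) -> 0

    # Example with synthetic data following a (hv - 3.2 eV)^3 yield law:
    hv = np.linspace(3.0, 4.5, 60)
    y = np.clip(hv - 3.2, 0, None) ** 3
    print(round(ipe_threshold(hv, y, (3.3, 4.0)), 2))      # ~3.2
    ```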

  6. Gold silver alloy nanoparticles (GSAN): an imaging probe for breast cancer screening with dual-energy mammography or computed tomography.

    PubMed

    Naha, Pratap C; Lau, Kristen C; Hsu, Jessica C; Hajfathalian, Maryam; Mian, Shaameen; Chhour, Peter; Uppuluri, Lahari; McDonald, Elizabeth S; Maidment, Andrew D A; Cormode, David P

    2016-07-14

    Earlier detection of breast cancer reduces mortality from this disease. As a result, the development of better screening techniques is a topic of intense interest. Contrast-enhanced dual-energy mammography (DEM) is a novel technique that has improved sensitivity for cancer detection. However, the development of contrast agents for this technique is in its infancy. We herein report gold-silver alloy nanoparticles (GSAN) that have potent DEM contrast properties and improved biocompatibility. GSAN formulations containing a range of gold : silver ratios and capped with m-PEG were synthesized and characterized using various analytical methods. DEM and computed tomography (CT) phantom imaging showed that GSAN produced robust contrast that was comparable to silver alone. Cell viability, reactive oxygen species generation and DNA damage results revealed that the formulations with 30% or higher gold content are cytocompatible with Hep G2 and J774A.1 cells. In vivo imaging was performed in mice with and without breast tumors. The results showed that GSAN produced strong DEM and CT contrast and accumulated in tumors. Furthermore, both in vivo imaging and ex vivo analysis indicated the excretion of GSAN via both urine and feces. In summary, GSAN produce strong DEM and CT contrast and have potential for both blood pool imaging and breast cancer screening. PMID:27412458

  7. Interobserver variations on interpretation of multislice CT lung cancer screening studies, and the implications for computer-aided diagnosis

    NASA Astrophysics Data System (ADS)

    Novak, Carol L.; Qian, JianZhong; Fan, Li; Ko, Jane P.; Rubinowitz, Ami N.; McGuinness, Georgeann; Naidich, David

    2002-04-01

    With low dose multi-slice CT for screening of lung cancer, physicians are now finding and examining increasingly smaller nodules. However as the size of detectable nodules becomes smaller, there may be greater differences among physicians as to what is detected and what constitutes a nodule. In this study, 10 CT screening studies of smokers were individually evaluated by three thoracic radiologists. After consensus to determine a gold standard, the number of nodules detected by individual radiologists ranged from 1.4 to 2.1 detections per patient. Each radiologist detected nodules missed by the other two. Although a total of 26 true nodules were detected by one or more radiologists, only 8 (31%) were detected by all three radiologists. The number of true nodules detected by an integrated automatic detection algorithm was 3.2 per patient after radiologist validation. Including these nodules in the gold standard set reduced the sensitivity of nodule detection by each radiologist to less than half. The sensitivity of nodule detection by the computer was better at 64%, proving especially efficacious for detecting smaller and more central nodules. Use of the automatic detection module would allow individual radiologists to increase the number of detected nodules by 114% to 207%.

  8. Assisting people with multiple disabilities to improve computer typing efficiency through a mouse wheel and on-screen keyboard software.

    PubMed

    Shih, Ching-Hsiang

    2014-09-01

    The main purpose of this study was to find out whether three students with multiple disabilities could increase their keyboard typing performance by poking the standard mouse scroll wheel with the newly developed Dynamic Typing Assistive Program (DTAP) and the built-in On-Screen Keyboard (OSK) computer software. The DTAP is a software solution that allows users to complete typing tasks with OSK software easily, quickly, and accurately by poking the mouse wheel. This study was performed according to a multiple baseline design across participants, and the experimental data showed that all of the participants significantly increased their typing efficiency in the intervention phase. Moreover, this improved performance was maintained during the maintenance phase. Practical and developmental implications of the findings were discussed.

  9. A Study for Visual Realism of Designed Pictures on Computer Screens by Investigation and Brain-Wave Analyses.

    PubMed

    Wang, Lan-Ting; Lee, Kun-Chou

    2016-08-01

    In this article, the visual realism of designed pictures on computer screens is studied by investigation and brain-wave analyses. The practical electroencephalogram (EEG) measurement is always time-varying and fluctuating so that conventional statistical techniques are not adequate for analyses. This study proposes a new scheme based on "fingerprinting" to analyze the EEG. Fingerprinting is a technique of probabilistic pattern recognition used in electrical engineering, very like the identification of human fingerprinting in a criminal investigation. The goal of this study was to assess whether subjective preference for pictures could be manifested physiologically by EEG fingerprinting analyses. The most important advantage of the fingerprinting technique is that it does not require accurate measurement. Instead, it uses probabilistic classification. Participants' preference for pictures can be assessed using fingerprinting analyses of physiological EEG measurements.

  10. A Study for Visual Realism of Designed Pictures on Computer Screens by Investigation and Brain-Wave Analyses.

    PubMed

    Wang, Lan-Ting; Lee, Kun-Chou

    2016-08-01

    In this article, the visual realism of designed pictures on computer screens is studied by investigation and brain-wave analyses. The practical electroencephalogram (EEG) measurement is always time-varying and fluctuating so that conventional statistical techniques are not adequate for analyses. This study proposes a new scheme based on "fingerprinting" to analyze the EEG. Fingerprinting is a technique of probabilistic pattern recognition used in electrical engineering, very like the identification of human fingerprinting in a criminal investigation. The goal of this study was to assess whether subjective preference for pictures could be manifested physiologically by EEG fingerprinting analyses. The most important advantage of the fingerprinting technique is that it does not require accurate measurement. Instead, it uses probabilistic classification. Participants' preference for pictures can be assessed using fingerprinting analyses of physiological EEG measurements. PMID:27324166

  11. Computational redesign of bacterial biotin carboxylase inhibitors using structure-based virtual screening of combinatorial libraries.

    PubMed

    Brylinski, Michal; Waldrop, Grover L

    2014-01-01

    As the spread of antibiotic resistant bacteria steadily increases, there is an urgent need for new antibacterial agents. Because fatty acid synthesis is only used for membrane biogenesis in bacteria, the enzymes in this pathway are attractive targets for antibacterial agent development. Acetyl-CoA carboxylase catalyzes the committed and regulated step in fatty acid synthesis. In bacteria, the enzyme is composed of three distinct protein components: biotin carboxylase, biotin carboxyl carrier protein, and carboxyltransferase. Fragment-based screening revealed that amino-oxazole inhibits biotin carboxylase activity and also exhibits antibacterial activity against Gram-negative organisms. In this report, we redesigned previously identified lead inhibitors to expand the spectrum of bacteria sensitive to the amino-oxazole derivatives by including Gram-positive species. Using 9,411 small organic building blocks, we constructed a diverse combinatorial library of 1.2×10⁸ amino-oxazole derivatives. A subset of 9×10⁶ of these compounds were subjected to structure-based virtual screening against seven biotin carboxylase isoforms using similarity-based docking by eSimDock. Potentially broad-spectrum antibiotic candidates were selected based on the consensus ranking by several scoring functions including non-linear statistical models implemented in eSimDock and traditional molecular mechanics force fields. The analysis of binding poses of the top-ranked compounds docked to biotin carboxylase isoforms suggests that: (1) binding of the amino-oxazole anchor is stabilized by a network of hydrogen bonds to residues 201, 202 and 204; (2) halogenated aromatic moieties attached to the amino-oxazole scaffold enhance interactions with a hydrophobic pocket formed by residues 157, 169, 171 and 203; and (3) larger substituents reach deeper into the binding pocket to form additional hydrogen bonds with the side chains of residues 209 and 233. These structural insights into drug
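
    One common way to combine several scoring functions, consistent with the consensus ranking mentioned above but not necessarily the exact scheme used by the authors, is to average per-function ranks; the short sketch below uses made-up scores.

    ```python
    # Hedged sketch: a simple consensus ranking across docking scoring functions
    # (average of per-function ranks). The paper's exact consensus scheme is not
    # reproduced here; the `scores` dictionary is a made-up example.
    import numpy as np

    def consensus_rank(scores):
        """scores: {function_name: array of scores, lower = better}. Returns compound order."""
        ranks = np.array([np.argsort(np.argsort(s)) for s in scores.values()], dtype=float)
        mean_rank = ranks.mean(axis=0)              # average rank per compound
        return np.argsort(mean_rank)                # best (lowest mean rank) first

    scores = {
        "force_field": np.array([-9.1, -7.4, -8.8, -6.2]),
        "statistical": np.array([-8.5, -8.9, -7.0, -6.6]),
    }
    print(consensus_rank(scores))                   # compound indices ordered by consensus
    ```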

  12. Computational redesign of bacterial biotin carboxylase inhibitors using structure-based virtual screening of combinatorial libraries.

    PubMed

    Brylinski, Michal; Waldrop, Grover L

    2014-04-02

    As the spread of antibiotic resistant bacteria steadily increases, there is an urgent need for new antibacterial agents. Because fatty acid synthesis is only used for membrane biogenesis in bacteria, the enzymes in this pathway are attractive targets for antibacterial agent development. Acetyl-CoA carboxylase catalyzes the committed and regulated step in fatty acid synthesis. In bacteria, the enzyme is composed of three distinct protein components: biotin carboxylase, biotin carboxyl carrier protein, and carboxyltransferase. Fragment-based screening revealed that amino-oxazole inhibits biotin carboxylase activity and also exhibits antibacterial activity against Gram-negative organisms. In this report, we redesigned previously identified lead inhibitors to expand the spectrum of bacteria sensitive to the amino-oxazole derivatives by including Gram-positive species. Using 9,411 small organic building blocks, we constructed a diverse combinatorial library of 1.2×10⁸ amino-oxazole derivatives. A subset of 9×10⁶ of these compounds were subjected to structure-based virtual screening against seven biotin carboxylase isoforms using similarity-based docking by eSimDock. Potentially broad-spectrum antibiotic candidates were selected based on the consensus ranking by several scoring functions including non-linear statistical models implemented in eSimDock and traditional molecular mechanics force fields. The analysis of binding poses of the top-ranked compounds docked to biotin carboxylase isoforms suggests that: (1) binding of the amino-oxazole anchor is stabilized by a network of hydrogen bonds to residues 201, 202 and 204; (2) halogenated aromatic moieties attached to the amino-oxazole scaffold enhance interactions with a hydrophobic pocket formed by residues 157, 169, 171 and 203; and (3) larger substituents reach deeper into the binding pocket to form additional hydrogen bonds with the side chains of residues 209 and 233. These structural insights into drug

  13. 2D materials: to graphene and beyond.

    PubMed

    Mas-Ballesté, Rubén; Gómez-Navarro, Cristina; Gómez-Herrero, Julio; Zamora, Félix

    2011-01-01

    This review is an attempt to illustrate the different alternatives in the field of 2D materials. Graphene seems to be just the tip of the iceberg and we show how the discovery of alternative 2D materials is starting to show the rest of this iceberg. The review comprises the current state-of-the-art of the vast literature in concepts and methods already known for isolation and characterization of graphene, and rationalizes the quite disperse literature in other 2D materials such as metal oxides, hydroxides and chalcogenides, and metal-organic frameworks.

  14. Discovery of small molecule inhibitors of MyD88-dependent signaling pathways using a computational screen

    PubMed Central

    Olson, Mark A.; Lee, Michael S.; Kissner, Teri L.; Alam, Shahabuddin; Waugh, David S.; Saikh, Kamal U.

    2015-01-01

    In this study, we used high-throughput computational screening to discover drug-like inhibitors of the host MyD88 protein-protein signaling interaction implicated in the potentially lethal immune response associated with Staphylococcal enterotoxins. We built a protein-protein dimeric docking model of the Toll-interleukin receptor (TIR)-domain of MyD88 and identified a binding site for docking small molecules. Computational screening of 5 million drug-like compounds led to testing of 30 small molecules; one of these molecules inhibits the TIR-TIR domain interaction and attenuates pro-inflammatory cytokine production in human primary cell cultures. Compounds chemically similar to this hit from the PubChem database were observed to be more potent with improved drug-like properties. Most of these 2nd generation compounds inhibit Staphylococcal enterotoxin B (SEB)-induced TNF-α, IFN-γ, IL-6, and IL-1β production at 2–10 μM in human primary cells. Biochemical analysis and a cell-based reporter assay revealed that the most promising compound, T6167923, disrupts MyD88 homodimeric formation, which is critical for its signaling function. Furthermore, we observed that administration of a single dose of T6167923 completely protects mice from lethal SEB-induced toxic shock. In summary, our in silico approach has identified anti-inflammatory inhibitors against in vitro and in vivo toxin exposure with promise to treat other MyD88-related pro-inflammatory diseases. PMID:26381092

  15. Discovery of earth-abundant nitride semiconductors by computational screening and high-pressure synthesis

    NASA Astrophysics Data System (ADS)

    Hinuma, Yoyo; Hatakeyama, Taisuke; Kumagai, Yu; Burton, Lee A.; Sato, Hikaru; Muraba, Yoshinori; Iimura, Soshi; Hiramatsu, Hidenori; Tanaka, Isao; Hosono, Hideo; Oba, Fumiyasu

    2016-06-01

    Nitride semiconductors are attractive because they can be environmentally benign, comprised of abundant elements and possess favourable electronic properties. However, those currently commercialized are mostly limited to gallium nitride and its alloys, despite the rich composition space of nitrides. Here we report the screening of ternary zinc nitride semiconductors using first-principles calculations of electronic structure, stability and dopability. This approach identifies as-yet-unreported CaZn2N2 that has earth-abundant components, smaller carrier effective masses than gallium nitride and a tunable direct bandgap suited for light emission and harvesting. High-pressure synthesis realizes this phase, verifying the predicted crystal structure and band-edge red photoluminescence. In total, we propose 21 promising systems, including Ca2ZnN2, Ba2ZnN2 and Zn2PN3, which have not been reported as semiconductors previously. Given the variety in bandgaps of the identified compounds, the present study expands the potential suitability of nitride semiconductors for a broader range of electronic, optoelectronic and photovoltaic applications.

  16. Discovery of earth-abundant nitride semiconductors by computational screening and high-pressure synthesis.

    PubMed

    Hinuma, Yoyo; Hatakeyama, Taisuke; Kumagai, Yu; Burton, Lee A; Sato, Hikaru; Muraba, Yoshinori; Iimura, Soshi; Hiramatsu, Hidenori; Tanaka, Isao; Hosono, Hideo; Oba, Fumiyasu

    2016-01-01

    Nitride semiconductors are attractive because they can be environmentally benign, comprised of abundant elements and possess favourable electronic properties. However, those currently commercialized are mostly limited to gallium nitride and its alloys, despite the rich composition space of nitrides. Here we report the screening of ternary zinc nitride semiconductors using first-principles calculations of electronic structure, stability and dopability. This approach identifies as-yet-unreported CaZn2N2 that has earth-abundant components, smaller carrier effective masses than gallium nitride and a tunable direct bandgap suited for light emission and harvesting. High-pressure synthesis realizes this phase, verifying the predicted crystal structure and band-edge red photoluminescence. In total, we propose 21 promising systems, including Ca2ZnN2, Ba2ZnN2 and Zn2PN3, which have not been reported as semiconductors previously. Given the variety in bandgaps of the identified compounds, the present study expands the potential suitability of nitride semiconductors for a broader range of electronic, optoelectronic and photovoltaic applications. PMID:27325228

  17. Computational Systems Bioinformatics and Bioimaging for Pathway Analysis and Drug Screening

    PubMed Central

    Zhou, Xiaobo; Wong, Stephen T. C.

    2009-01-01

    The premise of today’s drug development is that the mechanism of a disease is highly dependent upon underlying signaling and cellular pathways. Such pathways are often composed of complexes of physically interacting genes, proteins, or biochemical activities coordinated by metabolic intermediates, ions, and other small solutes and are investigated with molecular biology approaches in genomics, proteomics, and metabonomics. Nevertheless, the recent declines in the pharmaceutical industry’s revenues indicate such approaches alone may not be adequate in creating successful new drugs. Our observation is that combining methods of genomics, proteomics, and metabonomics with techniques of bioimaging will systematically provide powerful means to decode or better understand molecular interactions and pathways that lead to disease and potentially generate new insights and indications for drug targets. The former methods provide the profiles of genes, proteins, and metabolites, whereas the latter techniques generate objective, quantitative phenotypes correlating to the molecular profiles and interactions. In this paper, we describe pathway reconstruction and target validation based on the proposed systems biologic approach and show selected application examples for pathway analysis and drug screening. PMID:20011613

  18. Computational screening and selection of cyclic peptide hairpin mimetics by molecular simulation and kinetic network models.

    PubMed

    Razavi, Asghar M; Wuest, William M; Voelz, Vincent A

    2014-05-27

    Designing peptidomimetic compounds to have a preorganized structure in solution is highly nontrivial. To show how simulation-based approaches can help speed this process, we performed an extensive simulation study of designed cyclic peptide mimics of a β-hairpin from bacterial protein LapD involved in a protein-protein interaction (PPI) pertinent to bacterial biofilm formation. We used replica exchange molecular dynamics (REMD) simulation to screen 20 covalently cross-linked designs with varying stereochemistry and selected the most favorable of these for massively parallel simulation on Folding@home in explicit solvent. Markov state models (MSMs) built from the trajectory data reveal how subtle chemical modifications can have a significant effect on conformational populations, leading to the overall stabilization of the target structure. In particular, we identify a key steric interaction between a methyl substituent and a valine side chain that acts to allosterically shift population between native and near-native states, which could be exploited in future designs. Visualization of this mechanism is aided considerably by the tICA method, which identifies degrees of freedom most important in slow conformational transitions. The combination of quantitative detail and human comprehension provided by MSMs suggests such approaches will be increasingly useful for design.
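
    As a toy illustration of the Markov state model machinery referred to above (far simpler than the massively parallel simulation and MSM pipeline actually used), the sketch below estimates a row-stochastic transition matrix from a discretized trajectory and extracts its stationary distribution.

    ```python
    # Toy sketch (not the study's pipeline): estimating a Markov state model
    # transition matrix from a discretized trajectory and computing its stationary
    # (equilibrium) distribution. `dtraj` is an assumed array of state indices.
    import numpy as np

    def estimate_msm(dtraj, n_states, lag=1):
        counts = np.zeros((n_states, n_states))
        for a, b in zip(dtraj[:-lag], dtraj[lag:]):
            counts[a, b] += 1                           # count transitions at the chosen lag
        T = counts / counts.sum(axis=1, keepdims=True)  # row-normalize to probabilities
        evals, evecs = np.linalg.eig(T.T)
        pi = np.real(evecs[:, np.argmax(np.real(evals))])
        return T, pi / pi.sum()                         # stationary distribution sums to 1

    dtraj = np.array([0, 0, 1, 1, 2, 1, 0, 0, 1, 2, 2, 1])
    T, pi = estimate_msm(dtraj, n_states=3, lag=1)
    print(np.round(pi, 3))
    ```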

  19. Discovery of earth-abundant nitride semiconductors by computational screening and high-pressure synthesis

    PubMed Central

    Hinuma, Yoyo; Hatakeyama, Taisuke; Kumagai, Yu; Burton, Lee A.; Sato, Hikaru; Muraba, Yoshinori; Iimura, Soshi; Hiramatsu, Hidenori; Tanaka, Isao; Hosono, Hideo; Oba, Fumiyasu

    2016-01-01

    Nitride semiconductors are attractive because they can be environmentally benign, comprised of abundant elements and possess favourable electronic properties. However, those currently commercialized are mostly limited to gallium nitride and its alloys, despite the rich composition space of nitrides. Here we report the screening of ternary zinc nitride semiconductors using first-principles calculations of electronic structure, stability and dopability. This approach identifies as-yet-unreported CaZn2N2 that has earth-abundant components, smaller carrier effective masses than gallium nitride and a tunable direct bandgap suited for light emission and harvesting. High-pressure synthesis realizes this phase, verifying the predicted crystal structure and band-edge red photoluminescence. In total, we propose 21 promising systems, including Ca2ZnN2, Ba2ZnN2 and Zn2PN3, which have not been reported as semiconductors previously. Given the variety in bandgaps of the identified compounds, the present study expands the potential suitability of nitride semiconductors for a broader range of electronic, optoelectronic and photovoltaic applications. PMID:27325228

  20. Computer based screening of compound databases: 1. Preselection of benzamidine-based thrombin inhibitors.

    PubMed

    Fox, T; Haaksma, E E

    2000-07-01

    We present a computational protocol which uses the known three-dimensional structure of a target enzyme to identify possible ligands from databases of compounds with low molecular weight. This is accomplished by first mapping the essential interactions in the binding site with the program GRID. The resulting regions of favorable interaction between target and ligand are translated into a database query, and with UNITY a flexible 3D database search is performed. The feasibility of this approach is calibrated with thrombin as the target. Our results show that the resulting hit lists are enriched with thrombin inhibitors compared to the total database.
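
    A standard way to quantify such enrichment, though not necessarily the exact metric used in the original study, is the enrichment factor; the sketch below uses made-up numbers.

    ```python
    # Hedged illustration: a standard enrichment-factor calculation, one common way
    # to quantify how much a hit list is enriched in true actives relative to the
    # whole database (not necessarily the metric used in the original paper).
    def enrichment_factor(actives_in_hits, hits, actives_total, database_size):
        hit_rate = actives_in_hits / hits
        base_rate = actives_total / database_size
        return hit_rate / base_rate

    # Made-up numbers: 12 known thrombin inhibitors among 400 retrieved compounds,
    # out of 60 actives seeded in a 100,000-compound database.
    print(round(enrichment_factor(12, 400, 60, 100_000), 1))   # ~50.0
    ```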

  1. 2D molybdenum disulphide (2D-MoS2) modified electrodes explored towards the oxygen reduction reaction

    NASA Astrophysics Data System (ADS)

    Rowley-Neale, Samuel J.; Fearn, Jamie M.; Brownson, Dale A. C.; Smith, Graham C.; Ji, Xiaobo; Banks, Craig E.

    2016-08-01

    Two-dimensional molybdenum disulphide nanosheets (2D-MoS2) have proven to be an effective electrocatalyst, with particular attention being focused on their use towards increasing the efficiency of the reactions associated with hydrogen fuel cells. Whilst the majority of research has focused on the Hydrogen Evolution Reaction (HER), herein we explore the use of 2D-MoS2 as a potential electrocatalyst for the much less researched Oxygen Reduction Reaction (ORR). We stray from literature conventions and perform experiments in 0.1 M H2SO4 acidic electrolyte for the first time, evaluating the electrochemical performance of the ORR with 2D-MoS2 electrically wired/immobilised upon several carbon based electrodes (namely; Boron Doped Diamond (BDD), Edge Plane Pyrolytic Graphite (EPPG), Glassy Carbon (GC) and Screen-Printed Electrodes (SPE)) whilst exploring a range of 2D-MoS2 coverages/masses. Consequently, the findings of this study are highly applicable to real world fuel cell applications. We show that significant improvements in ORR activity can be achieved through the careful selection of the underlying/supporting carbon materials that electrically wire the 2D-MoS2 and utilisation of an optimal mass of 2D-MoS2. The ORR onset is observed to be reduced to ca. +0.10 V for EPPG, GC and SPEs at 2D-MoS2 (1524 ng cm-2 modification), which is far closer to Pt at +0.46 V compared to bare/unmodified EPPG, GC and SPE counterparts. This report is the first to demonstrate such beneficial electrochemical responses in acidic conditions using a 2D-MoS2 based electrocatalyst material on a carbon-based substrate (SPEs in this case). Investigation of the beneficial reaction mechanism reveals the ORR to occur via a 4 electron process in specific conditions; elsewhere a 2 electron process is observed. This work offers valuable insights for those wishing to design, fabricate and/or electrochemically test 2D-nanosheet materials towards the ORR.

  2. Matrix models of 2d gravity

    SciTech Connect

    Ginsparg, P.

    1991-01-01

    These are introductory lectures for a general audience that give an overview of the subject of matrix models and their application to random surfaces, 2d gravity, and string theory. They are intentionally 1.5 years out of date.

  3. Matrix models of 2d gravity

    SciTech Connect

    Ginsparg, P.

    1991-12-31

    These are introductory lectures for a general audience that give an overview of the subject of matrix models and their application to random surfaces, 2d gravity, and string theory. They are intentionally 1.5 years out of date.

  4. Brittle damage models in DYNA2D

    SciTech Connect

    Faux, D.R.

    1997-09-01

    DYNA2D is an explicit Lagrangian finite element code used to model dynamic events where stress wave interactions influence the overall response of the system. DYNA2D is often used to model penetration problems involving ductile-to-ductile impacts; however, with the advent of the use of ceramics in the armor-anti-armor community and the need to model damage to laser optics components, good brittle damage models are now needed in DYNA2D. This report will detail the implementation of four brittle damage models in DYNA2D, three scalar damage models and one tensor damage model. These new brittle damage models are then used to predict experimental results from three distinctly different glass damage problems.

  5. An inverse design method for 2D airfoil

    NASA Astrophysics Data System (ADS)

    Liang, Zhi-Yong; Cui, Peng; Zhang, Gen-Bao

    2010-03-01

    Computational methods for the aerodynamic design of aircraft are now applied more widely than before, and airfoil design is an active problem within this area. Most related papers discuss the forward problem, but the inverse method is more useful in practical design. In this paper, the inverse design of a 2D airfoil was investigated. A finite element method based on the variational principle was used to carry out the design. The simulations showed that the method is suitable for airfoil design.

  6. To Screen or not to Screen: Low Dose Computed Tomography in Comparison to Chest Radiography or Usual Care in Reducing Morbidity and Mortality from Lung Cancer.

    PubMed

    Dajac, Joshua; Kamdar, Jay; Moats, Austin; Nguyen, Brenda

    2016-01-01

    Lung cancer has the highest mortality rate of all cancers. This paper seeks to address the question: Can the mortality of lung cancer be decreased by screening with low-dose computerized tomography (LDCT) in higher risk patients compared to chest X-rays (CXR) or regular patient care? Currently, CXR screening is recommended for certain high-risk patients. Several recent trials have examined the effectiveness of LDCT versus chest radiography or usual care as a control. These trials include the National Lung Screening Trial (NLST), Detection And screening of early lung cancer with Novel imaging TEchnology (DANTE), Lung Screening Study (LSS), Depiscan, Italian Lung (ITALUNG), and the Dutch-Belgian Randomized Lung Cancer Screening Trial (Dutch acronym: NELSON study). NLST, the largest trial (n=53,454), demonstrated a decrease in mortality from lung cancer in the LDCT group (RRR=20%, P=0.004). LSS demonstrated a greater sensitivity in detecting both early stage and any stage of lung cancer in comparison to traditional CXR. Although the DANTE trial yielded data consistent with the findings of LSS, it also showed that with LDCT screening a greater proportion of patients underwent unnecessary surgical procedures. The Depiscan trial yielded a high nodule detection rate at the cost of a high false-positive rate compared to CXR screening. The ITALUNG and NELSON trials demonstrated the early detection capabilities of LDCT for lung cancers compared to usual care without surveillance imaging. False-positive findings with unnecessary workup, intervention, and radiation exposure remain significant concerns for routine LDCT screening. However, current data suggest LDCT may provide a highly sensitive and specific means for detecting lung cancers and reducing mortality. PMID:27375974

  7. To Screen or not to Screen: Low Dose Computed Tomography in Comparison to Chest Radiography or Usual Care in Reducing Morbidity and Mortality from Lung Cancer

    PubMed Central

    Kamdar, Jay; Moats, Austin; Nguyen, Brenda

    2016-01-01

    Lung cancer has the highest mortality rate of all cancers. This paper seeks to address the question: Can the mortality of lung cancer be decreased by screening with low-dose computerized tomography (LDCT) in higher risk patients compared to chest X-rays (CXR) or regular patient care? Currently, CXR screening is recommended for certain high-risk patients. Several recent trials have examined the effectiveness of LDCT versus chest radiography or usual care as a control. These trials include the National Lung Screening Trial (NLST), Detection And screening of early lung cancer with Novel imaging TEchnology (DANTE), Lung Screening Study (LSS), Depiscan, Italian Lung (ITALUNG), and the Dutch-Belgian Randomized Lung Cancer Screening Trial (Dutch acronym: NELSON study). NLST, the largest trial (n=53,454), demonstrated a decrease in mortality from lung cancer in the LDCT group (RRR=20%, P=0.004). LSS demonstrated a greater sensitivity in detecting both early stage and any stage of lung cancer in comparison to traditional CXR. Although the DANTE trial yielded data consistent with the findings of LSS, it also showed that with LDCT screening a greater proportion of patients underwent unnecessary surgical procedures. The Depiscan trial yielded a high nodule detection rate at the cost of a high false-positive rate compared to CXR screening. The ITALUNG and NELSON trials demonstrated the early detection capabilities of LDCT for lung cancers compared to usual care without surveillance imaging. False-positive findings with unnecessary workup, intervention, and radiation exposure remain significant concerns for routine LDCT screening. However, current data suggest LDCT may provide a highly sensitive and specific means for detecting lung cancers and reducing mortality. PMID:27375974

  8. Organ Dose and Attributable Cancer Risk in Lung Cancer Screening with Low-Dose Computed Tomography

    PubMed Central

    Saltybaeva, Natalia; Martini, Katharina; Frauenfelder, Thomas; Alkadhi, Hatem

    2016-01-01

    Purpose Lung cancer screening with CT has been recently recommended for decreasing lung cancer mortality. The radiation dose of CT, however, must be kept as low as reasonably achievable for reducing potential stochastic risks from ionizing radiation. The purpose of this study was to calculate individual patients’ lung doses and to estimate cancer risks in low-dose CT (LDCT) in comparison with a standard dose CT (SDCT) protocol. Materials and Methods This study included 47 adult patients (mean age 63.0 ± 5.7 years) undergoing chest CT on a third-generation dual-source scanner. 23/47 patients (49%) had a non-enhanced chest SDCT, 24 patients (51%) underwent LDCT at 100 kVp with spectral shaping at a dose equivalent to a chest x-ray. 3D-dose distributions were obtained from Monte Carlo simulations for each patient, taking into account their body size and individual CT protocol. Based on the dose distributions, patient-specific lung doses were calculated and relative cancer risk was estimated according to BEIR VII recommendations. Results As compared to SDCT, the LDCT protocol allowed for significant organ dose and cancer risk reductions (p<0.001). On average, lung dose was reduced from 7.7 mGy to 0.3 mGy when using LDCT, which was associated with lowering of the cancer risk from 8.6 to 0.35 per 100’000 cases. A strong linear correlation between lung dose and patient effective diameter was found for both protocols (R2 = 0.72 and R2 = 0.75 for SDCT and LDCT, respectively). Conclusion Use of a LDCT protocol for chest CT with a dose equivalent to a chest x-ray allows for significant lung dose and cancer risk reduction from ionizing radiation. PMID:27203720
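
    For readers who want to reproduce the kind of correlation analysis quoted above, the sketch below computes the coefficient of determination R² of a linear fit; the dose and diameter values are synthetic, not the study's data.

    ```python
    # Minimal sketch (synthetic data, not the study's): fitting lung dose against
    # patient effective diameter and reporting the coefficient of determination R².
    import numpy as np

    def linear_r2(x, y):
        slope, intercept = np.polyfit(x, y, 1)
        residuals = y - (slope * x + intercept)
        ss_res = np.sum(residuals ** 2)
        ss_tot = np.sum((y - y.mean()) ** 2)
        return 1.0 - ss_res / ss_tot

    diameter_cm = np.array([24.0, 26.5, 28.0, 30.2, 32.5, 35.0])   # effective diameter
    lung_dose_mgy = np.array([9.1, 8.4, 7.9, 7.1, 6.3, 5.6])       # hypothetical doses
    print(round(linear_r2(diameter_cm, lung_dose_mgy), 2))
    ```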

  9. Formulation pre-screening of inhalation powders using computational atom-atom systematic search method.

    PubMed

    Ramachandran, Vasuki; Murnane, Darragh; Hammond, Robert B; Pickering, Jonathan; Roberts, Kevin J; Soufian, Majeed; Forbes, Ben; Jaffari, Sara; Martin, Gary P; Collins, Elizabeth; Pencheva, Klimentina

    2015-01-01

    The synthonic modeling approach provides a molecule-centered understanding of the surface properties of crystals. It has been applied extensively to understand crystallization processes. This study aimed to investigate the functional relevance of synthonic modeling to the formulation of inhalation powders by assessing cohesivity of three active pharmaceutical ingredients (APIs, fluticasone propionate (FP), budesonide (Bud), and salbutamol base (SB)) and the commonly used excipient, α-lactose monohydrate (LMH). It is found that FP (-11.5 kcal/mol) has a higher cohesive strength than Bud (-9.9 kcal/mol) or SB (-7.8 kcal/mol). The prediction correlated directly to cohesive strength measurements using laser diffraction, where the airflow pressure required for complete dispersion (CPP) was 3.5, 2.0, and 1.0 bar for FP, Bud, and SB, respectively. The highest cohesive strength was predicted for LMH (-15.9 kcal/mol), which did not correlate with the CPP value of 2.0 bar (i.e., ranking lower than FP). High FP-LMH adhesive forces (-11.7 kcal/mol) were predicted. However, aerosolization studies revealed that the FP-LMH blends consisted of agglomerated FP particles with a large median diameter (∼4-5 μm) that were not disrupted by LMH. Modeling of the crystal and surface chemistry of LMH identified high electrostatic and H-bond components of its cohesive energy due to the presence of water and hydroxyl groups in lactose, unlike the APIs. A direct comparison of the predicted and measured cohesive balance of LMH with APIs will require a more in-depth understanding of highly hydrogen-bonded systems with respect to the synthonic engineering modeling tool, as well as the influence of agglomerate structure on surface-surface contact geometry. Overall, this research has demonstrated the possible application and relevance of synthonic engineering tools for rapid pre-screening in drug formulation and design.

  10. Chemical Approaches to 2D Materials.

    PubMed

    Samorì, Paolo; Palermo, Vincenzo; Feng, Xinliang

    2016-08-01

    Chemistry plays an ever-increasing role in the production, functionalization, processing and applications of graphene and other 2D materials. This special issue highlights a selection of enlightening chemical approaches to 2D materials, which nicely reflect the breadth of the field and convey the excitement of the individuals involved in it, who are trying to translate graphene and related materials from the laboratory into a real, high-impact technology. PMID:27478083

  11. Chemical Approaches to 2D Materials.

    PubMed

    Samorì, Paolo; Palermo, Vincenzo; Feng, Xinliang

    2016-08-01

    Chemistry plays an ever-increasing role in the production, functionalization, processing and applications of graphene and other 2D materials. This special issue highlights a selection of enlightening chemical approaches to 2D materials, which nicely reflect the breadth of the field and convey the excitement of the individuals involved in it, who are trying to translate graphene and related materials from the laboratory into a real, high-impact technology.

  12. Glitter in a 2D monolayer.

    PubMed

    Yang, Li-Ming; Dornfeld, Matthew; Frauenheim, Thomas; Ganz, Eric

    2015-10-21

    We predict a highly stable and robust atomically thin gold monolayer with a hexagonal close packed lattice stabilized by metallic bonding with contributions from strong relativistic effects and aurophilic interactions. We have shown that the framework of the Au monolayer can survive 10 ps MD annealing simulations up to 1400 K. The framework is also able to survive large motions out of the plane. Due to the smaller number of bonds per atom in the 2D layer compared to the 3D bulk we observe significantly enhanced energy per bond (0.94 vs. 0.52 eV per bond). This is similar to the increase in bond strength going from 3D diamond to 2D graphene. It is a non-magnetic metal, and was found to be the global minimum in the 2D space. Phonon dispersion calculations demonstrate high kinetic stability with no negative modes. This 2D gold monolayer corresponds to the top monolayer of the bulk Au(111) face-centered cubic lattice. The close-packed lattice maximizes the aurophilic interactions. We find that the electrons are completely delocalized in the plane and behave as a 2D nearly free electron gas. We hope that the present work can inspire the experimental fabrication of novel free standing 2D metal systems.

  13. Computational screen and experimental validation of anti-influenza effects of quercetin and chlorogenic acid from traditional Chinese medicine

    PubMed Central

    Liu, Zekun; Zhao, Junpeng; Li, Weichen; Shen, Li; Huang, Shengbo; Tang, Jingjing; Duan, Jie; Fang, Fang; Huang, Yuelong; Chang, Haiyan; Chen, Ze; Zhang, Ran

    2016-01-01

    The Influenza A virus poses a great threat to human health, and its various subtypes make it difficult to develop drugs. With the development of state-of-the-art computational chemistry, computational molecular docking can serve as a virtual screen for potential lead compounds. In this study, we performed molecular docking for influenza A H1N1 (A/PR/8/34) with small molecules such as quercetin and chlorogenic acid, which were derived from traditional Chinese medicine. The results showed that these small molecules have strong binding abilities with the neuraminidase of H1N1 (A/PR/8/34). Further analysis showed that the structural features of the molecules might be helpful for further drug design and development. In vitro and in vivo experiments validated the anti-influenza effects of quercetin and chlorogenic acid, indicating protective effects comparable to those of zanamivir. Taken together, it is proposed that chlorogenic acid and quercetin could be employed as effective lead compounds against influenza A H1N1. PMID:26754609

  14. Computational screen and experimental validation of anti-influenza effects of quercetin and chlorogenic acid from traditional Chinese medicine.

    PubMed

    Liu, Zekun; Zhao, Junpeng; Li, Weichen; Shen, Li; Huang, Shengbo; Tang, Jingjing; Duan, Jie; Fang, Fang; Huang, Yuelong; Chang, Haiyan; Chen, Ze; Zhang, Ran

    2016-01-12

    The Influenza A virus poses a great threat to human health, and its various subtypes make it difficult to develop drugs. With the development of state-of-the-art computational chemistry, computational molecular docking can serve as a virtual screen for potential lead compounds. In this study, we performed molecular docking for influenza A H1N1 (A/PR/8/34) with small molecules such as quercetin and chlorogenic acid, which were derived from traditional Chinese medicine. The results showed that these small molecules have strong binding abilities with the neuraminidase of H1N1 (A/PR/8/34). Further analysis showed that the structural features of the molecules might be helpful for further drug design and development. In vitro and in vivo experiments validated the anti-influenza effects of quercetin and chlorogenic acid, indicating protective effects comparable to those of zanamivir. Taken together, it is proposed that chlorogenic acid and quercetin could be employed as effective lead compounds against influenza A H1N1.

  15. A Malaria Diagnostic Tool Based on Computer Vision Screening and Visualization of Plasmodium falciparum Candidate Areas in Digitized Blood Smears

    PubMed Central

    Walliander, Margarita; Mårtensson, Andreas; Diwan, Vinod; Rahtu, Esa; Pietikäinen, Matti; Lundin, Mikael; Lundin, Johan

    2014-01-01

    Introduction Microscopy is the gold standard for diagnosis of malaria, however, manual evaluation of blood films is highly dependent on skilled personnel in a time-consuming, error-prone and repetitive process. In this study we propose a method using computer vision detection and visualization of only the diagnostically most relevant sample regions in digitized blood smears. Methods Giemsa-stained thin blood films with P. falciparum ring-stage trophozoites (n = 27) and uninfected controls (n = 20) were digitally scanned with an oil immersion objective (0.1 µm/pixel) to capture approximately 50,000 erythrocytes per sample. Parasite candidate regions were identified based on color and object size, followed by extraction of image features (local binary patterns, local contrast and Scale-invariant feature transform descriptors) used as input to a support vector machine classifier. The classifier was trained on digital slides from ten patients and validated on six samples. Results The diagnostic accuracy was tested on 31 samples (19 infected and 12 controls). From each digitized area of a blood smear, a panel with the 128 most probable parasite candidate regions was generated. Two expert microscopists were asked to visually inspect the panel on a tablet computer and to judge whether the patient was infected with P. falciparum. The method achieved a diagnostic sensitivity and specificity of 95% and 100% as well as 90% and 100% for the two readers respectively using the diagnostic tool. Parasitemia was separately calculated by the automated system and the correlation coefficient between manual and automated parasitemia counts was 0.97. Conclusion We developed a decision support system for detecting malaria parasites using a computer vision algorithm combined with visualization of sample areas with the highest probability of malaria infection. The system provides a novel method for blood smear screening with a significantly reduced need for visual examination and
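
    The following sketch illustrates only the local-binary-pattern feature extraction and support vector machine classification step of a pipeline like the one described, under the assumption that candidate patches and their labels are already available; it is not the authors' implementation.

    ```python
    # Minimal sketch (assumptions: `patches` is a list of grayscale candidate-region
    # images and `labels` marks parasite vs. artifact). It covers only the
    # LBP-feature + SVM-classification step of a pipeline like the one described.
    import numpy as np
    from skimage.feature import local_binary_pattern
    from sklearn.svm import SVC

    def lbp_histogram(patch, radius=1, n_points=8):
        lbp = local_binary_pattern(patch, n_points, radius, method="uniform")
        hist, _ = np.histogram(lbp, bins=n_points + 2, range=(0, n_points + 2), density=True)
        return hist

    X = np.array([lbp_histogram(p) for p in patches])
    clf = SVC(kernel="rbf", probability=True).fit(X, labels)
    # Rank candidate regions by predicted parasite probability for reviewer display.
    candidate_order = np.argsort(-clf.predict_proba(X)[:, 1])
    ```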

  16. ELLIPT2D: A Flexible Finite Element Code Written in Python

    SciTech Connect

    Pletzer, A.; Mollis, J.C.

    2001-03-22

    The use of the Python scripting language for scientific applications, and in particular to solve partial differential equations, is explored. It is shown that Python's rich data structures and object-oriented features can be exploited to write programs that are not only significantly more concise than their counterparts written in Fortran, C or C++, but are also numerically efficient. To illustrate this, a two-dimensional finite element code (ELLIPT2D) has been written. ELLIPT2D provides a flexible and easy-to-use framework for solving a large class of second-order elliptic problems. The program allows for structured or unstructured meshes. All functions defining the elliptic operator are user supplied, and so are the boundary conditions, which can be of Dirichlet, Neumann or Robbins type. ELLIPT2D makes extensive use of dictionaries (hash tables) as a way to represent sparse matrices. Other key features of the Python language that have been widely used include: operator overloading, error handling, array slicing, and the Tkinter module for building graphical user interfaces. As an example of the utility of ELLIPT2D, a nonlinear solution of the Grad-Shafranov equation is computed using a Newton iterative scheme. A second application focuses on a solution of the toroidal Laplace equation coupled to a magnetohydrodynamic stability code, a problem arising in the context of magnetic fusion research.
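
    To make the dictionary-as-sparse-matrix idea concrete, here is a minimal Python sketch of dictionary-of-keys storage with a matrix-vector product; it illustrates the general technique only and is not ELLIPT2D source code.

    ```python
    # Sketch of the general idea only (not ELLIPT2D source): a dictionary-of-keys
    # sparse matrix, as suggested by the abstract's mention of dictionaries, with a
    # matrix-vector product such as a finite element solver would need.
    class DictSparseMatrix:
        def __init__(self, n):
            self.n = n
            self.data = {}                      # {(row, col): value}, zeros not stored

        def add(self, i, j, value):
            self.data[(i, j)] = self.data.get((i, j), 0.0) + value

        def matvec(self, x):
            y = [0.0] * self.n
            for (i, j), v in self.data.items():
                y[i] += v * x[j]
            return y

    # Assemble a tiny 1D Laplacian stencil and apply it to a vector.
    A = DictSparseMatrix(4)
    for i in range(4):
        A.add(i, i, 2.0)
        if i > 0:
            A.add(i, i - 1, -1.0)
        if i < 3:
            A.add(i, i + 1, -1.0)
    print(A.matvec([1.0, 2.0, 3.0, 4.0]))
    ```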

  17. Computational screening of iodine uptake in zeolitic imidazolate frameworks in a water-containing system.

    PubMed

    Yuan, Yue; Dong, Xiuqin; Chen, Yifei; Zhang, Minhua

    2016-08-17

    Iodine capture is of great environmental significance due to the high toxicity and volatility of I2. Here we conduct a systematic computational investigation of iodine adsorption in zeolitic imidazolate frameworks (ZIFs) by adopting grand canonical Monte Carlo (GCMC) simulations and the density functional theory (DFT) method. The results confirm the structural factors that are vital for iodine adsorption at 298 K and moderate pressures, including metal sites, organic linkers, symmetry, and topology types. Uptake is enhanced by active metal sites, the simple imidazolate linker and single asymmetric linkers with polar functional groups. The symmetry effect is stronger than the surface properties. Meanwhile, low steric hindrance is more beneficial than polar functional groups to iodine adsorption. Specific topology types such as mer, which bring large surface areas and large-diameter cages, result in high iodine capacities. Iodine molecules tend to locate in cages with large diameters and aggregate along the sides of the cages. In contrast, water prefers small-diameter cages. In hydrophilic materials, water has a negative impact on iodine uptake because its adsorption sites are similar to those of iodine. The selectivity of iodine over water increases with increasing water content due to the large-diameter cages of ZIFs. This work proves that ZIFs can be identified as efficient and economical adsorbents with high diversity for iodine in a water-containing system. Furthermore, it provides comprehensive insights into the key structural factors for iodine uptake and separation in silver-free porous solids. PMID:27499079

  18. Gold silver alloy nanoparticles (GSAN): an imaging probe for breast cancer screening with dual-energy mammography or computed tomography

    NASA Astrophysics Data System (ADS)

    Naha, Pratap C.; Lau, Kristen C.; Hsu, Jessica C.; Hajfathalian, Maryam; Mian, Shaameen; Chhour, Peter; Uppuluri, Lahari; McDonald, Elizabeth S.; Maidment, Andrew D. A.; Cormode, David P.

    2016-07-01

    Earlier detection of breast cancer reduces mortality from this disease. As a result, the development of better screening techniques is a topic of intense interest. Contrast-enhanced dual-energy mammography (DEM) is a novel technique that has improved sensitivity for cancer detection. However, the development of contrast agents for this technique is in its infancy. We herein report gold-silver alloy nanoparticles (GSAN) that have potent DEM contrast properties and improved biocompatibility. GSAN formulations containing a range of gold : silver ratios and capped with m-PEG were synthesized and characterized using various analytical methods. DEM and computed tomography (CT) phantom imaging showed that GSAN produced robust contrast that was comparable to silver alone. Cell viability, reactive oxygen species generation and DNA damage results revealed that the formulations with 30% or higher gold content are cytocompatible with Hep G2 and J774A.1 cells. In vivo imaging was performed in mice with and without breast tumors. The results showed that GSAN produced strong DEM and CT contrast and accumulated in tumors. Furthermore, both in vivo imaging and ex vivo analysis indicated the excretion of GSAN via both urine and feces. In summary, GSAN produce strong DEM and CT contrast and have potential for both blood pool imaging and breast cancer screening.

  19. Features of Undiagnosed Breast Cancers at Screening Breast MR Imaging and Potential Utility of Computer-Aided Evaluation

    PubMed Central

    Seo, Mirinae; Bae, Min Sun; Koo, Hye Ryoung; Kim, Won Hwa; Lee, Su Hyun; Chu, Ajung

    2016-01-01

    Objective To retrospectively evaluate the features of undiagnosed breast cancers on prior screening breast magnetic resonance (MR) images in patients who were subsequently diagnosed with breast cancer, as well as the potential utility of MR-computer-aided evaluation (CAE). Materials and Methods Between March 2004 and May 2013, of the 72 consecutive pairs of prior negative MR images and subsequent MR images with diagnosed cancers (median interval, 32.8 months; range, 5.4-104.6 months), 36 (50%) had visible findings (mean size, 1.0 cm; range, 0.3-5.2 cm). The visible findings were divided into either actionable or underthreshold groups by a blinded review by 5 radiologists. MR imaging features, reasons for missed cancer, and MR-CAE features according to actionability were evaluated. Results Of the 36 visible findings on prior MR images, 33.3% (12 of 36) of the lesions were determined to be actionable and 66.7% (24 of 36) were underthreshold; 85.7% (6 of 7) of masses and 31.6% (6 of 19) of non-mass enhancements were classified as actionable lesions. Mimicking physiologic enhancements (27.8%, 10 of 36) and small lesion size (27.8%, 10 of 36) were the most common reasons for missed cancer. Actionable findings tended to show more washout or plateau kinetic patterns on MR-CAE than underthreshold findings, with 100% of actionable findings and 46.7% of underthreshold findings showing washout or plateau (p = 0.008). Conclusion MR-CAE has the potential for reducing the number of undiagnosed breast cancers on screening breast MR images, the majority of which are caused by mimicking physiologic enhancements or small lesion size. PMID:26798217

  20. Targeting multiple types of tumors using NKG2D-coated iron oxide nanoparticles.

    PubMed

    Wu, Ming-Ru; Cook, W James; Zhang, Tong; Sentman, Charles L

    2014-11-28

    Iron oxide nanoparticles (IONPs) hold great potential for cancer therapy. Actively targeting IONPs to tumor cells can further increase therapeutic efficacy and decrease off-target side effects. To target tumor cells, a natural killer (NK) cell activating receptor, NKG2D, was utilized to develop pan-tumor targeting IONPs. NKG2D ligands are expressed on many tumor types but are not found on most normal tissues under steady-state conditions. The data showed that mouse and human fragment crystallizable (Fc)-fusion NKG2D (Fc-NKG2D) coated IONPs (NKG2D/NPs) can target multiple NKG2D ligand positive tumor types in vitro in a dose-dependent manner by magnetic cell sorting. The tumor targeting effect was robust even at a very low tumor cell to normal cell ratio, and targeting efficiency correlated with NKG2D ligand expression level on tumor cells. Furthermore, the magnetic separation platform utilized to test NKG2D/NP specificity has the potential to be developed into high-throughput screening strategies to identify ideal fusion proteins or antibodies for targeting IONPs. In conclusion, NKG2D/NPs can be used to target multiple tumor types, and the magnetic separation platform can facilitate the proof-of-concept phase of tumor-targeting IONP development.

  1. Coronary artery calcium score on low-dose computed tomography for lung cancer screening

    PubMed Central

    Arcadi, Teresa; Maffei, Erica; Sverzellati, Nicola; Mantini, Cesare; Guaricci, Andrea I; Tedeschi, Carlo; Martini, Chiara; La Grutta, Ludovico; Cademartiri, Filippo

    2014-01-01

    AIM: To evaluate the feasibility of coronary artery calcium score (CACS) on low-dose non-gated chest CT (ngCCT). METHODS: Sixty consecutive individuals (30 males; 73 ± 7 years) scheduled for risk stratification by means of unenhanced ECG-triggered cardiac computed tomography (gCCT) underwent additional unenhanced ngCCT. All CT scans were performed on a 64-slice CT scanner (Somatom Sensation 64 Cardiac, Siemens, Germany). CACS was calculated using conventional methods/scores (Volume, Mass, Agatston) as previously described in the literature. The CACS values obtained with the two methods were compared. The Mayo Clinic classification was used to stratify cardiovascular risk based on Agatston CACS. Differences and correlations between the two methods were assessed. A P-value < 0.05 was considered significant. RESULTS: Mean CACS values were significantly higher for gCCT as compared to ngCCT (Volume: 418 ± 747 vs 332 ± 597; Mass: 89 ± 151 vs 78 ± 141; Agatston: 481 ± 854 vs 428 ± 776; P < 0.05). The correlation between the two values was always very high (Volume: r = 0.95; Mass: r = 0.97; Agatston: r = 0.98). Of the 6 patients with 0 Agatston score on gCCT, 2 (33%) showed an Agatston score > 0 on ngCCT. Of the 3 patients with 1-10 Agatston score on gCCT, 1 (33%) showed an Agatston score of 0 on ngCCT. Overall, 23 (38%) patients were reclassified into a different cardiovascular risk category, mostly (18/23; 78%) shifting to a lower risk on ngCCT. The estimated radiation dose was significantly higher for gCCT (DLP 115.8 ± 50.7 vs 83.8 ± 16.3; Effective dose 1.6 ± 0.7 mSv vs 1.2 ± 0.2 mSv; P < 0.01). CONCLUSION: CACS assessment is feasible on ngCCT; the variability of CACS values and the associated re-stratification of patients into cardiovascular risk groups should be taken into account. PMID:24976939
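
    For orientation, the sketch below applies the standard Agatston weighting (lesion area times a density factor of 1-4 for peak attenuations of 130-199, 200-299, 300-399 and ≥400 HU) to a single axial slice; it is a simplified illustration, not the scoring software used in the study.

    ```python
    # Hedged sketch of the standard Agatston weighting (not the study's software):
    # per axial slice, each connected calcified lesion >=130 HU contributes
    # area (mm^2) x a density weight (1: 130-199, 2: 200-299, 3: 300-399, 4: >=400 HU).
    import numpy as np
    from scipy import ndimage

    def agatston_slice(hu_slice, pixel_area_mm2, min_area_mm2=1.0):
        mask = hu_slice >= 130
        labeled, n = ndimage.label(mask)
        score = 0.0
        for lesion in range(1, n + 1):
            region = labeled == lesion
            area = region.sum() * pixel_area_mm2
            if area < min_area_mm2:
                continue                          # ignore tiny, likely-noise specks
            peak = hu_slice[region].max()
            weight = 1 + min(int((peak - 100) // 100), 3)   # 130-199->1 ... >=400->4
            score += area * weight
        return score

    # total = sum(agatston_slice(s, pixel_area_mm2=0.25) for s in ct_slices)  # hypothetical volume
    ```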

  2. Orthotropic Piezoelectricity in 2D Nanocellulose

    NASA Astrophysics Data System (ADS)

    García, Y.; Ruiz-Blanco, Yasser B.; Marrero-Ponce, Yovani; Sotomayor-Torres, C. M.

    2016-10-01

    The control of electromechanical responses within bonding regions is essential to face frontier challenges in nanotechnologies, such as molecular electronics and biotechnology. Here, we present Iβ-nanocellulose as a potentially new orthotropic 2D piezoelectric crystal. The predicted in-layer piezoelectricity originates from a sui generis hydrogen-bond pattern. Building on this fact and using a combination of ab initio and ad hoc models, we introduce a description of electrical profiles along chemical bonds. These developments provide a rationale for modelling the extended piezoelectric effect that originates at the bond scale. The order of magnitude estimated for the 2D Iβ-nanocellulose piezoelectric response, ~pm V‑1, ranks this material at the level of currently used piezoelectric energy generators and new artificial 2D designs. Such a finding would be crucial for developing alternative materials to drive emerging nanotechnologies.

  3. Orthotropic Piezoelectricity in 2D Nanocellulose

    PubMed Central

    García, Y.; Ruiz-Blanco, Yasser B.; Marrero-Ponce, Yovani; Sotomayor-Torres, C. M.

    2016-01-01

    The control of electromechanical responses within bonding regions is essential to face frontier challenges in nanotechnologies, such as molecular electronics and biotechnology. Here, we present Iβ-nanocellulose as a potentially new orthotropic 2D piezoelectric crystal. The predicted in-layer piezoelectricity originates from a sui generis hydrogen-bond pattern. Building on this fact and using a combination of ab initio and ad hoc models, we introduce a description of electrical profiles along chemical bonds. These developments provide a rationale for modelling the extended piezoelectric effect that originates at the bond scale. The order of magnitude estimated for the 2D Iβ-nanocellulose piezoelectric response, ~pm V−1, ranks this material at the level of currently used piezoelectric energy generators and new artificial 2D designs. Such a finding would be crucial for developing alternative materials to drive emerging nanotechnologies. PMID:27708364

  4. Optical modulators with 2D layered materials

    NASA Astrophysics Data System (ADS)

    Sun, Zhipei; Martinez, Amos; Wang, Feng

    2016-04-01

    Light modulation is an essential operation in photonics and optoelectronics. With existing and emerging technologies increasingly demanding compact, efficient, fast and broadband optical modulators, high-performance light modulation solutions are becoming indispensable. The recent realization that 2D layered materials could modulate light with superior performance has prompted intense research and significant advances, paving the way for realistic applications. In this Review, we cover the state of the art of optical modulators based on 2D materials, including graphene, transition metal dichalcogenides and black phosphorus. We discuss recent advances employing hybrid structures, such as 2D heterostructures, plasmonic structures, and silicon and fibre integrated structures. We also take a look at the future perspectives and discuss the potential of yet relatively unexplored mechanisms, such as magneto-optic and acousto-optic modulation.

  5. Differentiation of normal and leukemic cells by 2D light scattering label-free static cytometry.

    PubMed

    Xie, Linyan; Liu, Qiao; Shao, Changshun; Su, Xuantao

    2016-09-19

    Two-dimensional (2D) light scattering patterns of single microspheres, normal granulocytes and leukemic cells are obtained by label-free static cytometry. Statistical results of experimental 2D light scattering patterns obtained from standard microspheres with a mean diameter of 4.19 μm agree well with theoretical simulations. High accuracy rates (greater than 92%) for label-free differentiation of normal granulocytes and leukemic cells, both the acute and chronic leukemic cells, are achieved by analyzing the 2D light scattering patterns. Our label-free static cytometry is promising for leukemia screening in clinics. PMID:27661908

  6. Inkjet printing of 2D layered materials.

    PubMed

    Li, Jiantong; Lemme, Max C; Östling, Mikael

    2014-11-10

    Inkjet printing of 2D layered materials, such as graphene and MoS2, has attracted great interest in emerging electronics. However, incompatible rheology, low concentration, severe aggregation and toxicity of solvents constitute critical challenges which hamper the manufacturing efficiency and product quality. Here, we introduce a simple and general technology concept (distillation-assisted solvent exchange) to efficiently overcome these challenges. By implementing the concept, we have demonstrated excellent jetting performance, ideal printing patterns and a variety of promising applications for inkjet printing of 2D layered materials. PMID:25169938

  7. Inkjet printing of 2D layered materials.

    PubMed

    Li, Jiantong; Lemme, Max C; Östling, Mikael

    2014-11-10

    Inkjet printing of 2D layered materials, such as graphene and MoS2, has attracted great interest in emerging electronics. However, incompatible rheology, low concentration, severe aggregation and toxicity of solvents constitute critical challenges which hamper the manufacturing efficiency and product quality. Here, we introduce a simple and general technology concept (distillation-assisted solvent exchange) to efficiently overcome these challenges. By implementing the concept, we have demonstrated excellent jetting performance, ideal printing patterns and a variety of promising applications for inkjet printing of 2D layered materials.

  8. TOPAZ2D heat transfer code users manual and thermal property data base

    NASA Astrophysics Data System (ADS)

    Shapiro, A. B.; Edwards, A. L.

    1990-05-01

    TOPAZ2D is a two dimensional implicit finite element computer code for heat transfer analysis. This user's manual provides information on the structure of a TOPAZ2D input file. Also included is a material thermal property data base. This manual is supplemented with The TOPAZ2D Theoretical Manual and the TOPAZ2D Verification Manual. TOPAZ2D has been implemented on the CRAY, SUN, and VAX computers. TOPAZ2D can be used to solve for the steady state or transient temperature field on two dimensional planar or axisymmetric geometries. Material properties may be temperature dependent and either isotropic or orthotropic. A variety of time and temperature dependent boundary conditions can be specified including temperature, flux, convection, and radiation. Time or temperature dependent internal heat generation can be defined locally by element or globally by material. TOPAZ2D can solve problems of diffuse and specular band radiation in an enclosure coupled with conduction in material surrounding the enclosure. Additional features include thermally controlled reactive chemical mixtures, thermal contact resistance across an interface, bulk fluid flow, phase change, and energy balances. Thermal stresses can be calculated using the solid mechanics code NIKE2D which reads the temperature state data calculated by TOPAZ2D. A three dimensional version of the code, TOPAZ3D is available.
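
    TOPAZ2D itself is an implicit finite-element code; purely as a toy illustration of the class of problem described (2D transient conduction with fixed-temperature boundaries), the sketch below advances the temperature field with an explicit finite-difference update on a uniform grid with isotropic, temperature-independent properties. All numerical values are made-up examples, and nothing here reflects the TOPAZ2D input format:

      import numpy as np

      def step_heat_2d(T, alpha, dx, dt):
          """One explicit (FTCS) update of dT/dt = alpha * laplacian(T) on a uniform grid.
          Stable for dt <= dx**2 / (4 * alpha). Boundary rows/columns stay fixed (Dirichlet)."""
          lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
                 np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T) / dx**2
          T_new = T + alpha * dt * lap
          # Restore the boundary values (roll wraps around, so overwrite the edges).
          T_new[0, :], T_new[-1, :], T_new[:, 0], T_new[:, -1] = T[0, :], T[-1, :], T[:, 0], T[:, -1]
          return T_new

      # Example: 1 cm plate with a hot left edge (illustrative values only).
      T = np.zeros((50, 50)); T[:, 0] = 100.0
      alpha, dx = 1e-4, 0.01 / 50
      dt = 0.2 * dx**2 / alpha
      for _ in range(500):
          T = step_heat_2d(T, alpha, dx, dt)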

  9. Evaluating Computer Screen Time and Its Possible Link to Psychopathology in the Context of Age: A Cross-Sectional Study of Parents and Children

    PubMed Central

    Ross, Sharon; Silman, Zmira; Maoz, Hagai; Bloch, Yuval

    2015-01-01

    Background Several studies have suggested that high levels of computer use are linked to psychopathology. However, there is ambiguity about what should be considered normal or over-use of computers. Furthermore, the nature of the link between computer usage and psychopathology is controversial. The current study utilized the context of age to address these questions. Our hypothesis was that the context of age will be paramount for differentiating normal from excessive use, and that this context will allow a better understanding of the link to psychopathology. Methods In a cross-sectional study, 185 parents and children aged 3–18 years were recruited in clinical and community settings. They were asked to fill out questionnaires regarding demographics, functional and academic variables, computer use as well as psychiatric screening questionnaires. Using a regression model, we identified 3 groups of normal-use, over-use and under-use and examined known factors as putative differentiators between the over-users and the other groups. Results After modeling computer screen time according to age, factors linked to over-use were: decreased socialization (OR 3.24, Confidence interval [CI] 1.23–8.55, p = 0.018), difficulty to disengage from the computer (OR 1.56, CI 1.07–2.28, p = 0.022) and age, though borderline-significant (OR 1.1 each year, CI 0.99–1.22, p = 0.058). While psychopathology was not linked to over-use, post-hoc analysis revealed that the link between increased computer screen time and psychopathology was age-dependent and solidified as age progressed (p = 0.007). Unlike computer usage, the use of small-screens and smartphones was not associated with psychopathology. Conclusions The results suggest that computer screen time follows an age-based course. We conclude that differentiating normal from over-use as well as defining over-use as a possible marker for psychiatric difficulties must be performed within the context of age. If verified by

  10. 2-D Magnetohydrodynamic Modeling of A Pulsed Plasma Thruster

    NASA Technical Reports Server (NTRS)

    Thio, Y. C. Francis; Cassibry, J. T.; Wu, S. T.; Rodgers, Stephen L. (Technical Monitor)

    2002-01-01

    Experiments are being performed on the NASA Marshall Space Flight Center (MSFC) MK-1 pulsed plasma thruster. Data produced from the experiments provide an opportunity to further understand the plasma dynamics in these thrusters via detailed computational modeling. The detailed and accurate understanding of the plasma dynamics in these devices holds the key towards extending their capabilities in a number of applications, including their applications as high power (greater than 1 MW) thrusters, and their use for producing high-velocity, uniform plasma jets for experimental purposes. For this study, the 2-D MHD modeling code, MACH2, is used to provide detailed interpretation of the experimental data. At the same time, a 0-D physics model of the plasma initial phase is developed to guide our 2-D modeling studies.

  11. 2D FEM Heat Transfer & E&M Field Code

    SciTech Connect

    1992-04-02

    TOPAZ and TOPAZ2D are two-dimensional implicit finite element computer codes for heat transfer analysis. TOPAZ2D can also be used to solve electrostatic and magnetostatic problems. The programs solve for the steady-state or transient temperature or electrostatic and magnetostatic potential field on two-dimensional planar or axisymmetric geometries. Material properties may be temperature or potential-dependent and either isotropic or orthotropic. A variety of time and temperature-dependent boundary conditions can be specified including temperature, flux, convection, and radiation. By implementing the user subroutine feature, users can model chemical reaction kinetics and allow for any type of functional representation of boundary conditions and internal heat generation. The programs can solve problems of diffuse and specular band radiation in an enclosure coupled with conduction in the material surrounding the enclosure. Additional features include thermal contact resistance across an interface, bulk fluids, phase change, and energy balances.

  12. 2D FEM Heat Transfer & E&M Field Code

    1992-04-02

    TOPAZ and TOPAZ2D are two-dimensional implicit finite element computer codes for heat transfer analysis. TOPAZ2D can also be used to solve electrostatic and magnetostatic problems. The programs solve for the steady-state or transient temperature or electrostatic and magnetostatic potential field on two-dimensional planar or axisymmetric geometries. Material properties may be temperature or potential-dependent and either isotropic or orthotropic. A variety of time and temperature-dependent boundary conditions can be specified including temperature, flux, convection, and radiation. By implementing the user subroutine feature, users can model chemical reaction kinetics and allow for any type of functional representation of boundary conditions and internal heat generation. The programs can solve problems of diffuse and specular band radiation in an enclosure coupled with conduction in the material surrounding the enclosure. Additional features include thermal contact resistance across an interface, bulk fluids, phase change, and energy balances.

  13. FPCAS2D user's guide, version 1.0

    NASA Astrophysics Data System (ADS)

    Bakhle, Milind A.

    1994-12-01

    The FPCAS2D computer code has been developed for aeroelastic stability analysis of bladed disks such as those in fans, compressors, turbines, propellers, or propfans. The aerodynamic analysis used in this code is based on the unsteady two-dimensional full potential equation which is solved for a cascade of blades. The structural analysis is based on a two degree-of-freedom rigid typical section model for each blade. Detailed explanations of the aerodynamic analysis, the numerical algorithms, and the aeroelastic analysis are not given in this report. This guide can be used to assist in the preparation of the input data required by the FPCAS2D code. A complete description of the input data is provided in this report. In addition, four test cases, including inputs and outputs, are provided.

  14. COYOTE: A computer program for 2-D reactive flow simulations

    SciTech Connect

    Cloutman, L.D.

    1990-04-01

    We describe the numerical algorithm used in the COYOTE two- dimensional, transient, Eulerian hydrodynamics program for reactive flows. The program has a variety of options that provide capabilities for a wide range of applications, and it is designed to be robust and relatively easy to use while maintaining adequate accuracy and efficiency to solve realistic problems. It is based on the ICE method, and it includes a general species and chemical reaction network for simulating reactive flows. It also includes swirl, turbulence transport models, and a nonuniform mesh capability. We describe several applications of the program. 33 refs., 4 figs.

  15. Grid Cell Responses in 1D Environments Assessed as Slices through a 2D Lattice.

    PubMed

    Yoon, KiJung; Lewallen, Sam; Kinkhabwala, Amina A; Tank, David W; Fiete, Ila R

    2016-03-01

    Grid cells, defined by their striking periodic spatial responses in open 2D arenas, appear to respond differently on 1D tracks: the multiple response fields are not periodically arranged, peak amplitudes vary across fields, and the mean spacing between fields is larger than in 2D environments. We ask whether such 1D responses are consistent with the system's 2D dynamics. Combining analytical and numerical methods, we show that the 1D responses of grid cells with stable 1D fields are consistent with a linear slice through a 2D triangular lattice. Further, the 1D responses of comodular cells are well described by parallel slices, and the offsets in the starting points of the 1D slices can predict the measured 2D relative spatial phase between the cells. From these results, we conclude that the 2D dynamics of these cells is preserved in 1D, suggesting a common computation during both types of navigation behavior. PMID:26898777
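
    The slice picture described above can be reproduced with the standard idealization of a grid-cell firing map as a sum of three cosines whose wave vectors are 60° apart; sampling that 2D function along a line gives aperiodic-looking 1D fields with varying peak amplitudes. A minimal sketch, in which the grid spacing, slice angle, and phase offset are arbitrary illustrative choices:

      import numpy as np

      def grid_response(xy, spacing=0.5, phase=np.zeros(2)):
          """Idealized triangular-lattice grid field: sum of three cosines at 60 degrees."""
          k = 4 * np.pi / (np.sqrt(3) * spacing)          # wave number for the given spacing
          angles = np.array([0.0, np.pi / 3, 2 * np.pi / 3])
          kvecs = k * np.stack([np.cos(angles), np.sin(angles)], axis=1)
          return np.cos((xy - phase) @ kvecs.T).sum(axis=-1)

      # Sample the 2D pattern along a 1D track, i.e. a slice through the lattice.
      t = np.linspace(0.0, 4.0, 1000)
      track = np.stack([t * np.cos(0.2), t * np.sin(0.2)], axis=1)   # slice at ~11 degrees
      rate_1d = grid_response(track)      # fields with irregular spacing and varying peaks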

  16. Syndrome identification based on 2D analysis software.

    PubMed

    Boehringer, Stefan; Vollmar, Tobias; Tasse, Christiane; Wurtz, Rolf P; Gillessen-Kaesbach, Gabriele; Horsthemke, Bernhard; Wieczorek, Dagmar

    2006-10-01

    Clinical evaluation of children with developmental delay continues to present a challenge to the clinicians. In many cases, the face provides important information to diagnose a condition. However, database support with respect to facial traits is limited at present. Computer-based analyses of 2D and 3D representations of faces have been developed, but it is unclear how well a larger number of conditions can be handled by such systems. We have therefore analysed 2D pictures of patients each being affected with one of 10 syndromes (fragile X syndrome; Cornelia de Lange syndrome; Williams-Beuren syndrome; Prader-Willi syndrome; Mucopolysaccharidosis type III; Cri-du-chat syndrome; Smith-Lemli-Opitz syndrome; Sotos syndrome; Microdeletion 22q11.2; Noonan syndrome). We can show that a classification accuracy of >75% can be achieved for a computer-based diagnosis among the 10 syndromes, which is about the same accuracy achieved for five syndromes in a previous study. Pairwise discrimination of syndromes ranges from 80 to 99%. Furthermore, we can demonstrate that the criteria used by the computer decisions match clinical observations in many cases. These findings indicate that computer-based picture analysis might be a helpful addition to existing database systems, which are meant to assist in syndrome diagnosis, especially as data acquisition is straightforward and involves off-the-shelf digital camera equipment. PMID:16773127

  17. The NH2D hyperfine structure revealed by astrophysical observations

    NASA Astrophysics Data System (ADS)

    Daniel, F.; Coudert, L. H.; Punanova, A.; Harju, J.; Faure, A.; Roueff, E.; Sipilä, O.; Caselli, P.; Güsten, R.; Pon, A.; Pineda, J. E.

    2016-02-01

    Context. The 111-101 lines of ortho- and para-NH2D (o/p-NH2D) at 86 and 110 GHz, respectively, are commonly observed to provide constraints on the deuterium fractionation in the interstellar medium. In cold regions, the hyperfine structure that is due to the nitrogen (14N) nucleus is resolved. To date, this splitting is the only one that is taken into account in the NH2D column density estimates. Aims: We investigate how including the hyperfine splitting caused by the deuterium (D) nucleus affects the analysis of the rotational lines of NH2D. Methods: We present 30 m IRAM observations of the above mentioned lines and APEX o/p-NH2D observations of the 101-000 lines at 333 GHz. The hyperfine patterns of the observed lines were calculated taking into account the splitting induced by the D nucleus. The analysis then relies on line lists that either neglect or include the splitting induced by the D nucleus. Results: The hyperfine spectra are first analyzed with a line list that only includes the hyperfine splitting that is due to the 14N nucleus. We find inconsistencies between the line widths of the 101-000 and 111-101 lines, the latter being larger by a factor of ~1.6 ± 0.3. Such a large difference is unexpected because the two sets of lines probably originate from the same region. We next employed a newly computed line list for the o/p-NH2D transitions where the hyperfine structure induced by both nitrogen and deuterium nuclei was included. With this new line list, the analysis of the previous spectra leads to compatible line widths. Conclusions: Neglecting the hyperfine structure caused by D leads to overestimating the line widths of the o/p-NH2D lines at 3 mm. The error for a cold molecular core is about 50%. This error propagates directly to the column density estimate. We therefore recommend to take the hyperfine splittings caused by both the 14N and D nuclei into account in any analysis that relies on these lines. Based on observations carried out with the IRAM

  18. A comparative study of adult patient doses in film screen and computed radiography in some Sudanese hospitals.

    PubMed

    Elshiekh, E; Suliman, I I; Habbani, F

    2015-07-01

    A study was performed to compare adult patient doses in film screen (FS) and computed radiography (CR) diagnostic X-ray examinations in some hospitals in Sudan over a period of 1 y; during this period of time, the CR systems were introduced to replace FS systems. Radiation doses were estimated for 354 patients in five hospitals (two FS units and three CR units). Entrance surface air kerma (ESAK) was estimated from incident air kerma using patient exposure parameters and tube output. Dose calculations were performed using CALDOSE X 3.5 Monte Carlo-based software. In FS, third quartile of ESAK values for skull PA, skull LAT, chest PA, pelvis AP, lumbar spine AP and lumbar spine LAT were 1.5, 1.3, 0.3, 1.9, 2.8 and 5.9 mGy, respectively, while in CR, third quartile of ESAK values for the same examinations were 2.7, 1.7, 0.18, 1.7, 3.2 and 10.8 mGy, respectively. Comparable ESAK values were presented in FS and CR units. The results are important for future dose optimisation and setting national diagnostic reference levels. PMID:25889604
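
    One common way to estimate entrance surface air kerma of the kind reported above is to scale the measured tube output (per mAs at a reference distance) by the mAs, apply an inverse-square correction to the focus-to-skin distance, and multiply by a backscatter factor. The sketch below shows only that textbook estimate; all numerical values and the backscatter factor are illustrative assumptions, not parameters from this study or from CALDOSE X:

      def esak_mGy(output_mGy_per_mAs_at_1m, mAs, focus_detector_dist_m,
                   patient_thickness_m, backscatter_factor=1.35):
          """Estimate entrance surface air kerma from tube output (at 1 m), mAs,
          and the focus-to-skin distance, with a backscatter factor applied."""
          fsd = focus_detector_dist_m - patient_thickness_m      # focus-to-skin distance
          incident_air_kerma = output_mGy_per_mAs_at_1m * mAs * (1.0 / fsd) ** 2
          return incident_air_kerma * backscatter_factor

      # Illustrative chest PA example: 0.05 mGy/mAs at 1 m, 4 mAs, FDD 1.8 m, 0.23 m patient.
      print(round(esak_mGy(0.05, 4, 1.8, 0.23), 3))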

  19. 2D OR NOT 2D: THE EFFECT OF DIMENSIONALITY ON THE DYNAMICS OF FINGERING CONVECTION AT LOW PRANDTL NUMBER

    SciTech Connect

    Garaud, Pascale; Brummell, Nicholas

    2015-12-10

    Fingering convection (otherwise known as thermohaline convection) is an instability that occurs in stellar radiative interiors in the presence of unstable compositional gradients. Numerical simulations have been used in order to estimate the efficiency of mixing induced by this instability. However, fully three-dimensional (3D) computations in the parameter regime appropriate for stellar astrophysics (i.e., low Prandtl number) are prohibitively expensive. This raises the question of whether two-dimensional (2D) simulations could be used instead to achieve the same goals. In this work, we address this issue by comparing the outcome of 2D and 3D simulations of fingering convection at low Prandtl number. We find that 2D simulations are never appropriate. However, we also find that the required 3D computational domain does not have to be very wide: the third dimension only needs to contain a minimum of two wavelengths of the fastest-growing linearly unstable mode to capture the essentially 3D dynamics of small-scale fingering. Narrow domains, however, should still be used with caution since they could limit the subsequent development of any large-scale dynamics typically associated with fingering convection.

  20. 2D or Not 2D: The Effect of Dimensionality on the Dynamics of Fingering Convection at Low Prandtl Number

    NASA Astrophysics Data System (ADS)

    Garaud, Pascale; Brummell, Nicholas

    2015-12-01

    Fingering convection (otherwise known as thermohaline convection) is an instability that occurs in stellar radiative interiors in the presence of unstable compositional gradients. Numerical simulations have been used in order to estimate the efficiency of mixing induced by this instability. However, fully three-dimensional (3D) computations in the parameter regime appropriate for stellar astrophysics (i.e., low Prandtl number) are prohibitively expensive. This raises the question of whether two-dimensional (2D) simulations could be used instead to achieve the same goals. In this work, we address this issue by comparing the outcome of 2D and 3D simulations of fingering convection at low Prandtl number. We find that 2D simulations are never appropriate. However, we also find that the required 3D computational domain does not have to be very wide: the third dimension only needs to contain a minimum of two wavelengths of the fastest-growing linearly unstable mode to capture the essentially 3D dynamics of small-scale fingering. Narrow domains, however, should still be used with caution since they could limit the subsequent development of any large-scale dynamics typically associated with fingering convection.

  1. Parallel stitching of 2D materials

    DOE PAGES

    Ling, Xi; Wu, Lijun; Lin, Yuxuan; Ma, Qiong; Wang, Ziqiang; Song, Yi; Yu, Lili; Huang, Shengxi; Fang, Wenjing; Zhang, Xu; et al

    2016-01-27

    Diverse parallel stitched 2D heterostructures, including metal–semiconductor, semiconductor–semiconductor, and insulator–semiconductor, are synthesized directly through selective “sowing” of aromatic molecules as the seeds in the chemical vapor deposition (CVD) method. The methodology enables the large-scale fabrication of lateral heterostructures, which offers tremendous potential for its application in integrated circuits.

  2. Parallel Stitching of 2D Materials.

    PubMed

    Ling, Xi; Lin, Yuxuan; Ma, Qiong; Wang, Ziqiang; Song, Yi; Yu, Lili; Huang, Shengxi; Fang, Wenjing; Zhang, Xu; Hsu, Allen L; Bie, Yaqing; Lee, Yi-Hsien; Zhu, Yimei; Wu, Lijun; Li, Ju; Jarillo-Herrero, Pablo; Dresselhaus, Mildred; Palacios, Tomás; Kong, Jing

    2016-03-23

    Diverse parallel stitched 2D heterostructures, including metal-semiconductor, semiconductor-semiconductor, and insulator-semiconductor, are synthesized directly through selective "sowing" of aromatic molecules as the seeds in the chemical vapor deposition (CVD) method. The methodology enables the large-scale fabrication of lateral heterostructures, which offers tremendous potential for its application in integrated circuits.

  3. LOCA hydroloads calculations with multidimensional nonlinear fluid/structure interaction. Volume 2: STEALTH 2D/WHAMSE 2D single-phase fluid and elastic structure studies. Final report. [PWR]

    SciTech Connect

    Chang, F.H.; Santee, G.E. Jr.; Mortensen, G.A.; Brockett, G.F.; Gross, M.B.; Silling, S.A.; Belytschko, T.

    1981-03-01

    This report, the second in a series of reports for RP-1065, describes the second step in the stepwise approach for developing the three-dimensional, nonlinear, fluid/structure interaction methodology to assess the hydroloads on a large PWR during the subcooled portions of a hypothetical LOCA. The second step in the methodology considers enhancements and special modifications to the 2D STEALTH-HYDRO computer program and the 2D WHAMSE computer program. The 2D STEALTH-HYDRO enhancements consist of a fluid-fluid coupling control-volume model and an orifice control-volume model. The enhancements to 2D WHAMSE include elimination of the implicit integration routines, material models, and structural elements not required for the hydroloads application. In addition the logic for coupling the 2D STEALTH-HYDRO computer program to the 2D WHAMSE computer program is discussed.

  4. Application of 2D Non-Graphene Materials and 2D Oxide Nanostructures for Biosensing Technology

    PubMed Central

    Shavanova, Kateryna; Bakakina, Yulia; Burkova, Inna; Shtepliuk, Ivan; Viter, Roman; Ubelis, Arnolds; Beni, Valerio; Starodub, Nickolaj; Yakimova, Rositsa; Khranovskyy, Volodymyr

    2016-01-01

    The discovery of graphene and its unique properties has inspired researchers to try to invent other two-dimensional (2D) materials. After considerable research effort, a distinct “beyond graphene” domain has been established, comprising the library of non-graphene 2D materials. It is significant that some 2D non-graphene materials possess solid advantages over their predecessor, such as having a direct band gap, and therefore are highly promising for a number of applications. These applications are not limited to nano- and opto-electronics, but have a strong potential in biosensing technologies, as one example. However, since most of the 2D non-graphene materials have been newly discovered, most of the research efforts are concentrated on material synthesis and the investigation of the properties of the material. Applications of 2D non-graphene materials are still at the embryonic stage, and the integration of 2D non-graphene materials into devices is scarcely reported. However, in recent years, numerous reports have blossomed about 2D material-based biosensors, evidencing the growing potential of 2D non-graphene materials for biosensing applications. This review highlights the recent progress in research on the potential of using 2D non-graphene materials and similar oxide nanostructures for different types of biosensors (optical and electrochemical). A wide range of biological targets, such as glucose, dopamine, cortisol, DNA, IgG, bisphenol, ascorbic acid, cytochrome and estradiol, has been reported to be successfully detected by biosensors with transducers made of 2D non-graphene materials. PMID:26861346

  5. Application of 2D Non-Graphene Materials and 2D Oxide Nanostructures for Biosensing Technology.

    PubMed

    Shavanova, Kateryna; Bakakina, Yulia; Burkova, Inna; Shtepliuk, Ivan; Viter, Roman; Ubelis, Arnolds; Beni, Valerio; Starodub, Nickolaj; Yakimova, Rositsa; Khranovskyy, Volodymyr

    2016-01-01

    The discovery of graphene and its unique properties has inspired researchers to try to invent other two-dimensional (2D) materials. After considerable research effort, a distinct "beyond graphene" domain has been established, comprising the library of non-graphene 2D materials. It is significant that some 2D non-graphene materials possess solid advantages over their predecessor, such as having a direct band gap, and therefore are highly promising for a number of applications. These applications are not limited to nano- and opto-electronics, but have a strong potential in biosensing technologies, as one example. However, since most of the 2D non-graphene materials have been newly discovered, most of the research efforts are concentrated on material synthesis and the investigation of the properties of the material. Applications of 2D non-graphene materials are still at the embryonic stage, and the integration of 2D non-graphene materials into devices is scarcely reported. However, in recent years, numerous reports have blossomed about 2D material-based biosensors, evidencing the growing potential of 2D non-graphene materials for biosensing applications. This review highlights the recent progress in research on the potential of using 2D non-graphene materials and similar oxide nanostructures for different types of biosensors (optical and electrochemical). A wide range of biological targets, such as glucose, dopamine, cortisol, DNA, IgG, bisphenol, ascorbic acid, cytochrome and estradiol, has been reported to be successfully detected by biosensors with transducers made of 2D non-graphene materials.

  6. A comparative analysis of 2D and 3D CAD for calcifications in digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Acciavatti, Raymond J.; Ray, Shonket; Keller, Brad M.; Maidment, Andrew D. A.; Conant, Emily F.

    2015-03-01

    Many medical centers offer digital breast tomosynthesis (DBT) and 2D digital mammography acquired under the same compression (i.e., "Combo" examination) for screening. This paper compares a conventional 2D CAD algorithm (Hologic® ImageChecker® CAD v9.4) for calcification detection against a prototype 3D algorithm (Hologic® ImageChecker® 3D Calc CAD v1.0). Due to the newness of DBT, the development of this 3D CAD algorithm is ongoing, and it is currently not FDA-approved in the United States. For this study, DBT screening cases with suspicious calcifications were identified retrospectively at the University of Pennsylvania. An expert radiologist (E.F.C.) reviewed images with both 2D and DBT CAD marks, and compared the marks to biopsy results. Control cases with one-year negative follow-up were also studied; these cases either possess clearly benign calcifications or lacked calcifications. To allow the user to alter the sensitivity for cancer detection, an operating point is assigned to each CAD mark. As expected from conventional 2D CAD, increasing the operating point in 3D CAD increases sensitivity and reduces specificity. Additionally, we showed that some cancers are occult to 2D CAD at all operating points. By contrast, 3D CAD allows for detection of some cancers that are missed on 2D CAD. We also demonstrated that some non-cancerous CAD marks in 3D are not present at analogous locations in the 2D image. Hence, there are additional marks when using both 2D and 3D CAD in combination, leading to lower specificity than with conventional 2D CAD alone.

  7. 2D Fast Vessel Visualization Using a Vessel Wall Mask Guiding Fine Vessel Detection.

    PubMed

    Raptis, Sotirios; Koutsouris, Dimitris

    2010-01-01

    The paper addresses the fine retinal-vessel detection issue faced in diagnostic applications and aims at assisting in better recognizing fine vessel anomalies in 2D. Our innovation relies on separating the key visual features that vessels exhibit in order to make possible retinopathologies easier to detect. This allows focusing on vessel segments which present fine changes detectable at different sampling scales. We advocate that these changes can be addressed as subsequent stages of the same vessel detection procedure. We first carry out an initial estimate of the basic vessel-wall network, define the main wall body, and then try to approach the ridges and branches of the vasculature using fine detection. Fine vessel screening looks into local structural inconsistencies in vessel properties, into noise, or into unexpected intensity variations observed inside pre-known vessel-body areas. The vessels are first modelled sufficiently but not precisely by their walls, with a tubular model structure that is the result of an initial segmentation. This provides a chart of likely Vessel Wall Pixels (VWPs), yielding a form of likelihood vessel map mainly based on gradient filter intensity and spatial arrangement parameters (e.g., linear consistency). Specific vessel parameters (centerline, width, location, fall-away rate, main orientation) are post-computed by convolving the image with a set of pre-tuned spatial filters called Matched Filters (MFs). These are easily computed as Gaussian-like 2D forms that use a limited range of sub-optimal parameters adjusted to the dominant vessel characteristics obtained by Spatial Grey Level Difference statistics, limiting the search to vessel widths of 16, 32, and 64 pixels. Sparse pixels are effectively eliminated by applying a limited-range Hough Transform (HT) or region growing. Major benefits are limiting the range of parameters and reducing the search space for post-convolution to only masked regions
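
    The Matched Filter step described above can be illustrated with a small kernel whose cross-section approximates the Gaussian-like intensity profile of a vessel, rotated over a set of orientations and convolved with the image. The sketch below is a generic version of that idea; the kernel length, sigma, and number of orientations are illustrative choices rather than the paper's tuned parameters, and the sign convention assumes dark vessels on a brighter background:

      import numpy as np
      from scipy.ndimage import convolve, rotate

      def matched_filter_kernel(sigma=2.0, length=9):
          """Oriented kernel with an inverted-Gaussian cross profile and zero mean,
          matching a dark vessel on a brighter background (vessel axis along y)."""
          half = int(3 * sigma)
          x = np.arange(-half, half + 1)
          profile = -np.exp(-x**2 / (2 * sigma**2))
          kernel = np.tile(profile, (length, 1))
          return kernel - kernel.mean()

      def vessel_response(image, n_angles=12, sigma=2.0):
          """Maximum response over a bank of rotated matched filters."""
          base = matched_filter_kernel(sigma)
          responses = [convolve(image.astype(float),
                                rotate(base, ang, reshape=False, order=1))
                       for ang in np.linspace(0, 180, n_angles, endpoint=False)]
          return np.max(responses, axis=0)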

  8. Design Application Translates 2-D Graphics to 3-D Surfaces

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Fabric Images Inc., specializing in the printing and manufacturing of fabric tension architecture for the retail, museum, and exhibit/tradeshow communities, designed software to translate 2-D graphics for 3-D surfaces prior to print production. Fabric Images' fabric-flattening design process models a 3-D surface based on computer-aided design (CAD) specifications. The surface geometry of the model is used to form a 2-D template, similar to a flattening process developed by NASA's Glenn Research Center. This template or pattern is then applied in the development of a 2-D graphic layout. Benefits of this process include 11.5 percent time savings per project, less material wasted, and the ability to improve upon graphic techniques and offer new design services. Partners include Exhibitgroup/Giltspur (end-user client: TAC Air, a division of Truman Arnold Companies Inc.), Jack Morton Worldwide (end-user client: Nickelodeon), as well as 3D Exhibits Inc., and MG Design Associates Corp.

  9. Numerical Simulation of Supersonic Compression Corners and Hypersonic Inlet Flows Using the RPLUS2D Code

    NASA Technical Reports Server (NTRS)

    Kapoor, Kamlesh; Anderson, Bernhard H.; Shaw, Robert J.

    1994-01-01

    A two-dimensional computational code, RPLUS2D, which was developed for the reactive propulsive flows of ramjets and scramjets, was validated for two-dimensional shock-wave/turbulent-boundary-layer interactions. The problem of compression corners at supersonic speeds was solved using the RPLUS2D code. To validate the RPLUS2D code for hypersonic speeds, it was applied to a realistic hypersonic inlet geometry. Both the Baldwin-Lomax and the Chien two-equation turbulence models were used. Computational results showed that the RPLUS2D code compared very well with experimentally obtained data for supersonic compression corner flows, except in the case of large separated flows resulting from the interactions between the shock wave and turbulent boundary layer. The computational results compared well with the experimental results in a hypersonic NASA P8 inlet case, with the Chien two-equation turbulence model performing better than the Baldwin-Lomax model.

  10. PARCEQ2D heat transfer grid sensitivity analysis

    SciTech Connect

    Saladino, A.J.; Praharaj, S.C.; Collins, F.G. (Tennessee Univ., Tullahoma)

    1991-01-01

    The material presented in this paper is an extension of two-dimensional Aeroassist Flight Experiment (AFE) results shown previously. This study has focused on the heating rate calculations to the AFE obtained from an equilibrium real gas code, with attention placed on the sensitivity of grid dependence and wall temperature. Heat transfer results calculated by the PARCEQ2D code compare well with those computed by other researchers. Temperature convergence in the case of kinetic transport has been accomplished by increasing the wall temperature gradually from 300 K to the wall temperature of 1700 K. 28 refs.

  11. PARCEQ2D heat transfer grid sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Saladino, Anthony J.; Praharaj, Sarat C.; Collins, Frank G.

    1991-01-01

    The material presented in this paper is an extension of two-dimensional Aeroassist Flight Experiment (AFE) results shown previously. This study has focused on the heating rate calculations to the AFE obtained from an equilibrium real gas code, with attention placed on the sensitivity of grid dependence and wall temperature. Heat transfer results calculated by the PARCEQ2D code compare well with those computed by other researchers. Temperature convergence in the case of kinetic transport has been accomplished by increasing the wall temperature gradually from 300 K to the wall temperature of 1700 K.

  12. A parallel splitting wavelet method for 2D conservation laws

    NASA Astrophysics Data System (ADS)

    Schmidt, Alex A.; Kozakevicius, Alice J.; Jakobsson, Stefan

    2016-06-01

    The current work presents a parallel formulation using the MPI protocol for an adaptive high order finite difference scheme to solve 2D conservation laws. Adaptivity is achieved at each time iteration by the application of an interpolating wavelet transform in each space dimension. High order approximations for the numerical fluxes are computed by ENO and WENO schemes. Since time evolution is made by a TVD Runge-Kutta space splitting scheme, the problem is naturally suitable for parallelization. Numerical simulations and speedup results are presented for Euler equations in gas dynamics problems.
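
    The scheme above combines an adaptive interpolating-wavelet grid with ENO/WENO fluxes and TVD Runge-Kutta time stepping; the sketch below strips all of that away and shows only the space-splitting idea, using first-order upwind sweeps for 2D linear advection on a periodic grid. Grid size, advection speeds, and time step are illustrative assumptions:

      import numpy as np

      def upwind_sweep(u, c, dt, dx, axis):
          """One 1D first-order upwind update along the given axis (c > 0, periodic)."""
          return u - c * dt / dx * (u - np.roll(u, 1, axis=axis))

      def split_step(u, cx, cy, dt, dx, dy):
          """Strang-like dimensional splitting: half x-sweep, full y-sweep, half x-sweep."""
          u = upwind_sweep(u, cx, dt / 2, dx, axis=1)
          u = upwind_sweep(u, cy, dt, dy, axis=0)
          return upwind_sweep(u, cx, dt / 2, dx, axis=1)

      # Advect a Gaussian blob diagonally across a periodic box.
      n, L = 128, 1.0
      dx = dy = L / n
      x = np.linspace(0, L, n, endpoint=False)
      X, Y = np.meshgrid(x, x)
      u = np.exp(-((X - 0.3)**2 + (Y - 0.3)**2) / 0.01)
      cx = cy = 1.0
      dt = 0.4 * dx / max(cx, cy)
      for _ in range(200):
          u = split_step(u, cx, cy, dt, dx, dy)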

  13. Stochastic Inversion of 2D Magnetotelluric Data

    SciTech Connect

    Chen, Jinsong

    2010-07-01

    The algorithm is developed to invert 2D magnetotelluric (MT) data based on sharp boundary parametrization using a Bayesian framework. Within the algorithm, we consider the locations and the resistivities of the regions formed by the interfaces as unknowns. We use a parallel, adaptive finite-element algorithm to forward simulate frequency-domain MT responses of a 2D conductivity structure. Those unknown parameters are spatially correlated and are described by a geostatistical model. The joint posterior probability distribution function is explored by Markov Chain Monte Carlo (MCMC) sampling methods. The developed stochastic model is effective for estimating the interface locations and resistivity. Most importantly, it provides detailed uncertainty information on each unknown parameter. Hardware requirements: PC, Supercomputer, Multi-platform, Workstation; Software requirements: C and Fortran; Operating Systems/version: Linux/Unix or Windows
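
    The MCMC sampling at the heart of the approach can be illustrated with a stripped-down Metropolis random walk on a toy one-parameter problem; the "forward model" below is a made-up linear stand-in, not the 2D finite-element MT response, and all numbers are placeholders:

      import numpy as np

      rng = np.random.default_rng(0)

      def log_posterior(log10_rho, data, noise=0.05):
          """Toy posterior: Gaussian likelihood around a fake forward response
          plus a broad Gaussian prior on log10 resistivity."""
          predicted = 2.0 * log10_rho + 1.0          # stand-in for an MT forward response
          log_like = -0.5 * np.sum((data - predicted)**2) / noise**2
          log_prior = -0.5 * (log10_rho - 2.0)**2 / 1.0**2
          return log_like + log_prior

      data = np.array([5.1])                          # synthetic observation
      samples, current = [], 2.0
      current_lp = log_posterior(current, data)
      for _ in range(5000):                           # Metropolis random-walk sampling
          proposal = current + rng.normal(scale=0.1)
          proposal_lp = log_posterior(proposal, data)
          if np.log(rng.uniform()) < proposal_lp - current_lp:
              current, current_lp = proposal, proposal_lp
          samples.append(current)
      posterior_mean, posterior_std = np.mean(samples[1000:]), np.std(samples[1000:])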

  14. Explicit 2-D Hydrodynamic FEM Program

    1996-08-07

    DYNA2D* is a vectorized, explicit, two-dimensional, axisymmetric and plane strain finite element program for analyzing the large deformation dynamic and hydrodynamic response of inelastic solids. DYNA2D* contains 13 material models and 9 equations of state (EOS) to cover a wide range of material behavior. The material models implemented in all machine versions are: elastic, orthotropic elastic, kinematic/isotropic elastic plasticity, thermoelastoplastic, soil and crushable foam, linear viscoelastic, rubber, high explosive burn, isotropic elastic-plastic, temperature-dependent elastic-plastic. The isotropic and temperature-dependent elastic-plastic models determine only the deviatoric stresses. Pressure is determined by one of 9 equations of state including linear polynomial, JWL high explosive, Sack Tuesday high explosive, Gruneisen, ratio of polynomials, linear polynomial with energy deposition, ignition and growth of reaction in HE, tabulated compaction, and tabulated.

  15. Stochastic Inversion of 2D Magnetotelluric Data

    2010-07-01

    The algorithm is developed to invert 2D magnetotelluric (MT) data based on sharp boundary parametrization using a Bayesian framework. Within the algorithm, we consider the locations and the resistivities of the regions formed by the interfaces as unknowns. We use a parallel, adaptive finite-element algorithm to forward simulate frequency-domain MT responses of a 2D conductivity structure. Those unknown parameters are spatially correlated and are described by a geostatistical model. The joint posterior probability distribution function is explored by Markov Chain Monte Carlo (MCMC) sampling methods. The developed stochastic model is effective for estimating the interface locations and resistivity. Most importantly, it provides detailed uncertainty information on each unknown parameter. Hardware requirements: PC, Supercomputer, Multi-platform, Workstation; Software requirements: C and Fortran; Operating Systems/version: Linux/Unix or Windows

  16. Explicit 2-D Hydrodynamic FEM Program

    SciTech Connect

    Lin, Jerry

    1996-08-07

    DYNA2D* is a vectorized, explicit, two-dimensional, axisymmetric and plane strain finite element program for analyzing the large deformation dynamic and hydrodynamic response of inelastic solids. DYNA2D* contains 13 material models and 9 equations of state (EOS) to cover a wide range of material behavior. The material models implemented in all machine versions are: elastic, orthotropic elastic, kinematic/isotropic elastic plasticity, thermoelastoplastic, soil and crushable foam, linear viscoelastic, rubber, high explosive burn, isotropic elastic-plastic, temperature-dependent elastic-plastic. The isotropic and temperature-dependent elastic-plastic models determine only the deviatoric stresses. Pressure is determined by one of 9 equations of state including linear polynomial, JWL high explosive, Sack Tuesday high explosive, Gruneisen, ratio of polynomials, linear polynomial with energy deposition, ignition and growth of reaction in HE, tabulated compaction, and tabulated.

  17. 2D photonic-crystal optomechanical nanoresonator.

    PubMed

    Makles, K; Antoni, T; Kuhn, A G; Deléglise, S; Briant, T; Cohadon, P-F; Braive, R; Beaudoin, G; Pinard, L; Michel, C; Dolique, V; Flaminio, R; Cagnoli, G; Robert-Philip, I; Heidmann, A

    2015-01-15

    We present the optical optimization of an optomechanical device based on a suspended InP membrane patterned with a 2D near-wavelength grating (NWG) based on a 2D photonic-crystal geometry. We first identify by numerical simulation a set of geometrical parameters providing a reflectivity higher than 99.8% over a 50-nm span. We then study the limitations induced by the finite value of the optical waist and lateral size of the NWG pattern using different numerical approaches. The NWG grating, pierced in a suspended InP 265-nm thick membrane, is used to form a compact microcavity involving the suspended nanomembrane as an end mirror. The resulting cavity has a waist size smaller than 10 μm and a finesse in the 200 range. It is used to probe the Brownian motion of the mechanical modes of the nanomembrane. PMID:25679837

  18. Compact 2-D graphical representation of DNA

    NASA Astrophysics Data System (ADS)

    Randić, Milan; Vračko, Marjan; Zupan, Jure; Novič, Marjana

    2003-05-01

    We present a novel 2-D graphical representation for DNA sequences which has an important advantage over the existing graphical representations of DNA in being very compact. It is based on: (1) use of binary labels for the four nucleic acid bases, and (2) use of the 'worm' curve as template on which binary codes are placed. The approach is illustrated on DNA sequences of the first exon of human β-globin and gorilla β-globin.

  19. 2D materials: Graphene and others

    NASA Astrophysics Data System (ADS)

    Bansal, Suneev Anil; Singh, Amrinder Pal; Kumar, Suresh

    2016-05-01

    Present report reviews the recent advancements in new atomically thick 2D materials. Materials covered in this review are Graphene, Silicene, Germanene, Boron Nitride (BN) and Transition metal chalcogenides (TMC). These materials show extraordinary mechanical, electronic and optical properties which make them suitable candidates for future applications. Apart from unique properties, tune-ability of highly desirable properties of these materials is also an important area to be emphasized on.

  20. Layer Engineering of 2D Semiconductor Junctions.

    PubMed

    He, Yongmin; Sobhani, Ali; Lei, Sidong; Zhang, Zhuhua; Gong, Yongji; Jin, Zehua; Zhou, Wu; Yang, Yingchao; Zhang, Yuan; Wang, Xifan; Yakobson, Boris; Vajtai, Robert; Halas, Naomi J; Li, Bo; Xie, Erqing; Ajayan, Pulickel

    2016-07-01

    A new concept for junction fabrication by connecting multiple regions with varying layer thicknesses, based on the thickness dependence, is demonstrated. This type of junction is only possible in super-thin-layered 2D materials, and exhibits similar characteristics as p-n junctions. Rectification and photovoltaic effects are observed in chemically homogeneous MoSe2 junctions between domains of different thicknesses. PMID:27136275

  1. 2D Spinodal Decomposition in Forced Turbulence

    NASA Astrophysics Data System (ADS)

    Fan, Xiang; Diamond, Patrick; Chacon, Luis; Li, Hui

    2015-11-01

    Spinodal decomposition is a second-order phase transition of a binary fluid mixture, from one thermodynamic phase into two coexisting phases. The governing equation for this coarsening process below the critical temperature, the Cahn-Hilliard equation, is very similar to the 2D MHD equations; in particular, the conserved quantities have a close correspondence, so theories for MHD turbulence are used to study spinodal decomposition in forced turbulence. Domain size increases with time along with the inverse cascade, and the length scale can be arrested by forced turbulence with a direct cascade. The two competing mechanisms lead to a stabilized domain-size length scale, which can be characterized by the Hinze scale. The 2D spinodal decomposition in forced turbulence is studied by both theory and simulation with "pixie2d". This work focuses on the relation between the Hinze scale, spectra, and cascades. Similarities and differences between spinodal decomposition and MHD are investigated. Some transport properties are also studied following MHD theories. This work is supported by the Department of Energy under Award Number DE-FG02-04ER54738.
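
    The Cahn-Hilliard dynamics referred to above, ∂φ/∂t = ∇²(φ³ − φ − γ∇²φ), coarsens an initially mixed state into domains; a minimal semi-implicit pseudo-spectral sketch on a periodic 2D box is given below. There is no turbulent forcing or coupling to MHD here, and the grid size, γ, mobility (set to 1), and time step are illustrative choices:

      import numpy as np

      n, dx, gamma, dt = 128, 1.0, 2.0, 0.1
      k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
      KX, KY = np.meshgrid(k, k)
      k2 = KX**2 + KY**2

      rng = np.random.default_rng(1)
      phi = 0.1 * rng.standard_normal((n, n))           # small random initial mixture

      for _ in range(2000):
          mu_hat = np.fft.fft2(phi**3 - phi)            # chemical potential without the gradient term
          phi_hat = np.fft.fft2(phi)
          # Semi-implicit update: the stiff -gamma*k^4 term is treated implicitly.
          phi_hat = (phi_hat - dt * k2 * mu_hat) / (1.0 + dt * gamma * k2**2)
          phi = np.real(np.fft.ifft2(phi_hat))          # domains coarsen over time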

  2. Engineering light outcoupling in 2D materials.

    PubMed

    Lien, Der-Hsien; Kang, Jeong Seuk; Amani, Matin; Chen, Kevin; Tosun, Mahmut; Wang, Hsin-Ping; Roy, Tania; Eggleston, Michael S; Wu, Ming C; Dubey, Madan; Lee, Si-Chen; He, Jr-Hau; Javey, Ali

    2015-02-11

    When light is incident on 2D transition metal dichalcogenides (TMDCs), it engages in multiple reflections within underlying substrates, producing interferences that lead to enhancement or attenuation of the incoming and outgoing strength of light. Here, we report a simple method to engineer the light outcoupling in semiconducting TMDCs by modulating their dielectric surroundings. We show that by modulating the thicknesses of underlying substrates and capping layers, the interference caused by substrate can significantly enhance the light absorption and emission of WSe2, resulting in a ∼11 times increase in Raman signal and a ∼30 times increase in the photoluminescence (PL) intensity of WSe2. On the basis of the interference model, we also propose a strategy to control the photonic and optoelectronic properties of thin-layer WSe2. This work demonstrates the utilization of outcoupling engineering in 2D materials and offers a new route toward the realization of novel optoelectronic devices, such as 2D LEDs and solar cells.

  3. Large-scale virtual high-throughput screening for the identification of new battery electrolyte solvents: computing infrastructure and collective properties.

    PubMed

    Husch, Tamara; Yilmazer, Nusret Duygu; Balducci, Andrea; Korth, Martin

    2015-02-01

    A volunteer computing approach is presented for the purpose of screening a large number of molecular structures with respect to their suitability as new battery electrolyte solvents. Collective properties like melting, boiling and flash points are evaluated using COSMOtherm and quantitative structure-property relationship (QSPR) based methods, while electronic structure theory methods are used for the computation of electrochemical stability window estimators. Two application examples are presented: first, the results of a previous large-scale screening test (PCCP, 2014, 16, 7919) are re-evaluated with respect to the mentioned collective properties. As a second application example, all reasonable nitrile solvents up to 12 heavy atoms are generated and used to illustrate a suitable filter protocol for picking Pareto-optimal candidates.
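
    The Pareto-optimal filtering mentioned at the end can be illustrated with a small non-dominated-candidate selection; the property names and values below are made-up placeholders (one "higher is better" stability window and a negated melting point), not outputs of the actual COSMOtherm/QSPR screening:

      def pareto_front(candidates, keys=("window_V", "neg_melting_C")):
          """Return candidates not dominated in every listed objective (all maximized)."""
          front = []
          for c in candidates:
              dominated = any(
                  all(o[k] >= c[k] for k in keys) and any(o[k] > c[k] for k in keys)
                  for o in candidates if o is not c
              )
              if not dominated:
                  front.append(c)
          return front

      # Hypothetical solvent entries; melting point is negated so both objectives are maximized.
      solvents = [
          {"name": "nitrile_A", "window_V": 5.8, "neg_melting_C": 45},
          {"name": "nitrile_B", "window_V": 6.1, "neg_melting_C": 20},
          {"name": "nitrile_C", "window_V": 5.2, "neg_melting_C": 10},
      ]
      print([s["name"] for s in pareto_front(solvents)])   # nitrile_C is dominated by nitrile_A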

  4. Large-scale virtual high-throughput screening for the identification of new battery electrolyte solvents: computing infrastructure and collective properties.

    PubMed

    Husch, Tamara; Yilmazer, Nusret Duygu; Balducci, Andrea; Korth, Martin

    2015-02-01

    A volunteer computing approach is presented for the purpose of screening a large number of molecular structures with respect to their suitability as new battery electrolyte solvents. Collective properties like melting, boiling and flash points are evaluated using COSMOtherm and quantitative structure-property relationship (QSPR) based methods, while electronic structure theory methods are used for the computation of electrochemical stability window estimators. Two application examples are presented: first, the results of a previous large-scale screening test (PCCP, 2014, 16, 7919) are re-evaluated with respect to the mentioned collective properties. As a second application example, all reasonable nitrile solvents up to 12 heavy atoms are generated and used to illustrate a suitable filter protocol for picking Pareto-optimal candidates. PMID:25529013

  5. Toward fully automated high performance computing drug discovery: a massively parallel virtual screening pipeline for docking and molecular mechanics/generalized Born surface area rescoring to improve enrichment.

    PubMed

    Zhang, Xiaohua; Wong, Sergio E; Lightstone, Felice C

    2014-01-27

    In this work we announce and evaluate a high throughput virtual screening pipeline for in-silico screening of virtual compound databases using high performance computing (HPC). Notable features of this pipeline are an automated receptor preparation scheme with unsupervised binding site identification. The pipeline includes receptor/target preparation, ligand preparation, VinaLC docking calculation, and molecular mechanics/generalized Born surface area (MM/GBSA) rescoring using the GB model by Onufriev and co-workers [J. Chem. Theory Comput. 2007, 3, 156-169]. Furthermore, we leverage HPC resources to perform an unprecedented, comprehensive evaluation of MM/GBSA rescoring when applied to the DUD-E data set (Directory of Useful Decoys: Enhanced), in which we selected 38 protein targets and a total of ∼0.7 million actives and decoys. The computer wall time for virtual screening has been reduced drastically on HPC machines, which increases the feasibility of extremely large ligand database screening with more accurate methods. HPC resources allowed us to rescore 20 poses per compound and evaluate the optimal number of poses to rescore. We find that keeping 5-10 poses is a good compromise between accuracy and computational expense. Overall the results demonstrate that MM/GBSA rescoring has higher average receiver operating characteristic (ROC) area under curve (AUC) values and consistently better early recovery of actives than Vina docking alone. Specifically, the enrichment performance is target-dependent. MM/GBSA rescoring significantly outperforms Vina docking for the folate enzymes, kinases, and several other enzymes. The more accurate energy function and solvation terms of the MM/GBSA method allow MM/GBSA to achieve better enrichment, but the rescoring is still limited by the docking method to generate the poses with the correct binding modes. PMID:24358939
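
    The enrichment comparison above reduces to ranking actives against decoys by their (re)scored binding energies; a minimal sketch of computing the ROC AUC (via the rank-sum identity) and an early-recovery fraction from such a ranking is given below. The scores and labels are synthetic placeholders, and lower scores are treated as better, as for predicted binding free energies:

      import numpy as np

      def roc_auc_and_early_recovery(scores, is_active, top_fraction=0.01):
          """ROC AUC via the Mann-Whitney rank identity, plus the fraction of
          actives recovered in the top `top_fraction` of the ranked list."""
          scores, is_active = np.asarray(scores, float), np.asarray(is_active, bool)
          order = np.argsort(scores)                    # ascending: best (lowest) score first
          ranks = np.empty_like(order)
          ranks[order] = np.arange(len(scores))
          n_act, n_dec = is_active.sum(), (~is_active).sum()
          auc = (ranks[~is_active].sum() - n_dec * (n_dec - 1) / 2) / (n_act * n_dec)
          top_n = max(1, int(top_fraction * len(scores)))
          early = is_active[order[:top_n]].sum() / n_act
          return auc, early

      # Synthetic example: 50 actives scoring (on average) better than 950 decoys.
      rng = np.random.default_rng(0)
      scores = np.concatenate([rng.normal(-9, 1, 50), rng.normal(-7, 1, 950)])
      labels = np.array([True] * 50 + [False] * 950)
      print(roc_auc_and_early_recovery(scores, labels))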

  6. Toward fully automated high performance computing drug discovery: a massively parallel virtual screening pipeline for docking and molecular mechanics/generalized Born surface area rescoring to improve enrichment.

    PubMed

    Zhang, Xiaohua; Wong, Sergio E; Lightstone, Felice C

    2014-01-27

    In this work we announce and evaluate a high throughput virtual screening pipeline for in-silico screening of virtual compound databases using high performance computing (HPC). Notable features of this pipeline are an automated receptor preparation scheme with unsupervised binding site identification. The pipeline includes receptor/target preparation, ligand preparation, VinaLC docking calculation, and molecular mechanics/generalized Born surface area (MM/GBSA) rescoring using the GB model by Onufriev and co-workers [J. Chem. Theory Comput. 2007, 3, 156-169]. Furthermore, we leverage HPC resources to perform an unprecedented, comprehensive evaluation of MM/GBSA rescoring when applied to the DUD-E data set (Directory of Useful Decoys: Enhanced), in which we selected 38 protein targets and a total of ∼0.7 million actives and decoys. The computer wall time for virtual screening has been reduced drastically on HPC machines, which increases the feasibility of extremely large ligand database screening with more accurate methods. HPC resources allowed us to rescore 20 poses per compound and evaluate the optimal number of poses to rescore. We find that keeping 5-10 poses is a good compromise between accuracy and computational expense. Overall the results demonstrate that MM/GBSA rescoring has higher average receiver operating characteristic (ROC) area under curve (AUC) values and consistently better early recovery of actives than Vina docking alone. Specifically, the enrichment performance is target-dependent. MM/GBSA rescoring significantly outperforms Vina docking for the folate enzymes, kinases, and several other enzymes. The more accurate energy function and solvation terms of the MM/GBSA method allow MM/GBSA to achieve better enrichment, but the rescoring is still limited by the docking method to generate the poses with the correct binding modes.

  7. Using computational modeling to assess the impact of clinical decision support on cancer screening improvement strategies within the community health centers.

    PubMed

    Carney, Timothy Jay; Morgan, Geoffrey P; Jones, Josette; McDaniel, Anna M; Weaver, Michael; Weiner, Bryan; Haggstrom, David A

    2014-10-01

    Our conceptual model demonstrates our goal to investigate the impact of clinical decision support (CDS) utilization on cancer screening improvement strategies in the community health care (CHC) setting. We employed a dual modeling technique using both statistical and computational modeling to evaluate impact. Our statistical model used the Spearman's Rho test to evaluate the strength of the relationship between our proximal outcome measures (CDS utilization) and our distal outcome measure (provider self-reported cancer screening improvement). Our computational model relied on network evolution theory and made use of a tool called Construct-TM to model the use of CDS measured by the rate of organizational learning. We used previously collected survey data from the community health centers' Cancer Health Disparities Collaborative (HDCC). Our intent is to demonstrate the added value gained by using a computational modeling tool in conjunction with a statistical analysis when evaluating the impact of a health information technology, in the form of CDS, on health care quality process outcomes such as facility-level screening improvement. Significant simulated disparities in organizational learning over time were observed between community health centers beginning the simulation with high and low clinical decision support capability.
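
    The statistical half of the dual-modelling approach is a rank correlation; a minimal Spearman's Rho computation on made-up facility-level numbers (toy values, not the HDCC survey data) looks like this:

      from scipy.stats import spearmanr

      # Hypothetical facility-level values: CDS utilization vs self-reported screening improvement.
      cds_utilization = [0.2, 0.5, 0.55, 0.7, 0.8, 0.9]
      screening_improvement = [1, 2, 2, 4, 3, 5]

      rho, p_value = spearmanr(cds_utilization, screening_improvement)
      print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")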

  8. GBL-2D Version 1.0: a 2D geometry boolean library.

    SciTech Connect

    McBride, Cory L. (Elemental Technologies, American Fort, UT); Schmidt, Rodney Cannon; Yarberry, Victor R.; Meyers, Ray J.

    2006-11-01

    This report describes version 1.0 of GBL-2D, a geometric Boolean library for 2D objects. The library is written in C++ and consists of a set of classes and routines. The classes primarily represent geometric data and relationships. Classes are provided for 2D points, lines, arcs, edge uses, loops, surfaces and mask sets. The routines contain algorithms for geometric Boolean operations and utility functions. Routines are provided that incorporate the Boolean operations: Union(OR), XOR, Intersection and Difference. A variety of additional analytical geometry routines and routines for importing and exporting the data in various file formats are also provided. The GBL-2D library was originally developed as a geometric modeling engine for use with a separate software tool, called SummitView [1], that manipulates the 2D mask sets created by designers of Micro-Electro-Mechanical Systems (MEMS). However, many other practical applications for this type of software can be envisioned because the need to perform 2D Boolean operations can arise in many contexts.
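
    The Boolean operations listed (Union/OR, XOR, Intersection, Difference) are the standard 2D region set operations; the sketch below illustrates them on two overlapping squares using the Python shapely package rather than the GBL-2D C++ classes, so the calls shown are shapely's API, not GBL-2D's:

      from shapely.geometry import Polygon

      a = Polygon([(0, 0), (2, 0), (2, 2), (0, 2)])          # 2x2 square
      b = Polygon([(1, 1), (3, 1), (3, 3), (1, 3)])          # overlapping 2x2 square

      union = a.union(b)                      # OR
      intersection = a.intersection(b)        # AND
      difference = a.difference(b)            # A minus B
      xor = a.symmetric_difference(b)         # XOR

      print(union.area, intersection.area, difference.area, xor.area)   # 7.0 1.0 3.0 6.0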

  9. 2D quantum gravity at three loops: A counterterm investigation

    NASA Astrophysics Data System (ADS)

    Leduc, Lætitia; Bilal, Adel

    2016-02-01

    We analyze the divergences of the three-loop partition function at fixed area in 2D quantum gravity. Considering the Liouville action in the Kähler formalism, we extract the coefficient of the leading divergence ~AΛ²(ln AΛ²)². This coefficient is non-vanishing. We discuss the counterterms one can and must add and compute their precise contribution to the partition function. This allows us to conclude that every local and non-local divergence in the partition function can be balanced by local counterterms, with the only exception of the maximally non-local divergence (ln AΛ²)³. Yet, this latter is computed and does cancel between the different three-loop diagrams. Thus, requiring locality of the counterterms is enough to renormalize the partition function. Finally, the structure of the new counterterms strongly suggests that they can be understood as a renormalization of the measure action.

  10. Periodically sheared 2D Yukawa systems

    SciTech Connect

    Kovács, Anikó Zsuzsa; Hartmann, Peter; Donkó, Zoltán

    2015-10-15

    We present non-equilibrium molecular dynamics simulation studies on the dynamic (complex) shear viscosity of a 2D Yukawa system. We have identified a non-monotonic frequency dependence of the viscosity at high frequencies and shear rates, an energy absorption maximum (local resonance) at the Einstein frequency of the system at medium shear rates, an enhanced collective wave activity, when the excitation is near the plateau frequency of the longitudinal wave dispersion, and the emergence of significant configurational anisotropy at small frequencies and high shear rates.

  11. ENERGY LANDSCAPE OF 2D FLUID FORMS

    SciTech Connect

    Y. JIANG; ET AL

    2000-04-01

    The equilibrium states of 2D non-coarsening fluid foams, which consist of bubbles with fixed areas, correspond to local minima of the total perimeter. (1) The authors find an approximate value of the global minimum, and determine directly from an image how far a foam is from its ground state. (2) For (small) area disorder, small bubbles tend to sort inwards and large bubbles outwards. (3) Topological charges of the same sign repel while charges of opposite sign attract. (4) They discuss boundary conditions and the uniqueness of the pattern for fixed topology.

  12. TOPAZ2D heat transfer code users manual and thermal property data base

    SciTech Connect

    Shapiro, A.B.; Edwards, A.L.

    1990-05-01

    TOPAZ2D is a two dimensional implicit finite element computer code for heat transfer analysis. This user's manual provides information on the structure of a TOPAZ2D input file. Also included is a material thermal property data base. This manual is supplemented with The TOPAZ2D Theoretical Manual and the TOPAZ2D Verification Manual. TOPAZ2D has been implemented on the CRAY, SUN, and VAX computers. TOPAZ2D can be used to solve for the steady state or transient temperature field on two dimensional planar or axisymmetric geometries. Material properties may be temperature dependent and either isotropic or orthotropic. A variety of time and temperature dependent boundary conditions can be specified including temperature, flux, convection, and radiation. Time or temperature dependent internal heat generation can be defined locally by element or globally by material. TOPAZ2D can solve problems of diffuse and specular band radiation in an enclosure coupled with conduction in material surrounding the enclosure. Additional features include thermally controlled reactive chemical mixtures, thermal contact resistance across an interface, bulk fluid flow, phase change, and energy balances. Thermal stresses can be calculated using the solid mechanics code NIKE2D which reads the temperature state data calculated by TOPAZ2D. A three dimensional version of the code, TOPAZ3D, is available. The material thermal property data base, Chapter 4, included in this manual was originally published in 1969 by Art Edwards for use with his TRUMP finite difference heat transfer code. The format of the data has been altered to be compatible with TOPAZ2D. Bob Bailey is responsible for adding the high explosive thermal property data.

  13. Contributions of the Computer-Administered Neuropsychological Screen for Mild Cognitive Impairment (CANS-MCI) for the diagnosis of MCI in Brazil.

    PubMed

    Memória, Cláudia M; Yassuda, Mônica S; Nakano, Eduardo Y; Forlenza, Orestes V

    2014-05-01

    Background: The Computer-Administered Neuropsychological Screen for Mild Cognitive Impairment (CANS-MCI) is a computer-based cognitive screening instrument that involves automated administration and scoring and immediate analyses of test sessions. The objective of this study was to translate and culturally adapt the Brazilian Portuguese version of the CANS-MCI (CANS-MCI-BR) and to evaluate its reliability and validity for the diagnostic screening of MCI and dementia due to Alzheimer's disease. Methods: The test was administered to 97 older adults (mean age 73.41 ± 5.27 years) with at least four years of formal education (mean education 12.23 ± 4.48 years). Participants were classified into three diagnostic groups according to global cognitive status (normal controls, n = 41; MCI, n = 35; AD, n = 21) based on clinical data and formal neuropsychological assessments. Results: The results indicated high internal consistency (Cronbach's α = 0.77) in the total sample. Three-month test-retest reliability correlations were significant and robust (0.875; p < 0.001). A moderate level of concurrent validity was attained relative to the screening test for MCI (MoCA test, r = 0.76, p < 0.001). Confirmatory factor analysis supported the three-factor model of the original test, i.e., memory, language/spatial fluency, and executive function/mental control. Goodness of fit indicators were strong (Bentler Comparative Fit Index = 0.96, Root Mean Square Error of Approximation = 0.09). Receiver operating characteristic curve analyses suggested high sensitivity and specificity (81% and 73%, respectively) to screen for possible MCI cases. Conclusions: The CANS-MCI-BR maintains adequate psychometric characteristics that render it suitable to identify elderly adults with probable cognitive impairment for whom a more extensive evaluation by formal neuropsychological tests may be required.
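
    For readers unfamiliar with the internal-consistency statistic reported above, the sketch below computes Cronbach's alpha from an item-score matrix; the scores are hypothetical, not CANS-MCI-BR data.

    ```python
    # Minimal sketch (hypothetical data): Cronbach's alpha for an item-score
    # matrix with rows = participants and columns = items.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: 2D array of shape (n_participants, n_items)."""
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1)
        total_variance = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

    scores = np.array([[3, 4, 3, 5],
                       [2, 2, 3, 3],
                       [4, 5, 4, 5],
                       [1, 2, 2, 2],
                       [3, 3, 4, 4]])
    print(f"alpha = {cronbach_alpha(scores):.2f}")
    ```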

  14. Computational Analysis and In silico Predictive Modeling for Inhibitors of PhoP Regulon in S. typhi on High-Throughput Screening Bioassay Dataset.

    PubMed

    Kaur, Harleen; Ahmad, Mohd; Scaria, Vinod

    2016-03-01

    Multidrug-resistant Salmonella enterica serotype Typhi is emerging in pandemic proportions throughout the world, so there is a need to speed up the discovery of novel molecules that have different modes of action, are less affected by resistance formation, and could be used as drugs for the treatment of salmonellosis, particularly typhoid fever. The PhoP regulon is well studied and has been shown to be a critical regulator of a number of genes required for intracellular survival of S. enterica and for the pathophysiology of diseases such as typhoid. The evident roles of two-component PhoP-/PhoQ-regulated products in Salmonella virulence have motivated attempts to target them therapeutically. Although the discovery of biologically active compounds for the treatment of typhoid relies on hit-finding procedures, high-throughput screening technology alone is very expensive and time consuming when performed at large scale. Recent advances in combinatorial chemistry and contemporary compound-synthesis techniques have greatly expanded the diversity of available compound libraries, but the time and effort required to screen such large, unfocused libraries have been reduced only slightly in recent years. Hence, there is demand to improve the hit quality and success rate of high-throughput screening, which requires compound libraries focused and biased toward the particular target. We therefore still need a convenient and inexpensive method to prioritize the molecules to be taken forward into biological screens. In this context, in silico methods such as machine learning are widely applicable techniques used to build computational models for high-throughput virtual screens that prioritize molecules for further study. Furthermore, in computational analysis, we extended our study to identify the common enriched
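
    A minimal sketch of the general machine-learning virtual-screening idea described above (not the authors' pipeline): a classifier is trained on assay outcomes and then used to rank untested molecules. The fingerprint matrix and activity labels below are random placeholders.

    ```python
    # Hedged sketch: train a classifier on high-throughput screening outcomes and
    # rank molecules by predicted probability of activity. X and y are placeholders
    # standing in for molecular fingerprints and assay labels.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(500, 1024))   # placeholder binary fingerprints
    y = rng.integers(0, 2, size=500)           # placeholder active/inactive labels

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

    scores = model.predict_proba(X_test)[:, 1]         # predicted probability of activity
    print("ROC AUC:", roc_auc_score(y_test, scores))   # ~0.5 on random placeholder data
    ranking = np.argsort(scores)[::-1]                 # molecules prioritized for assay
    ```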

  15. Rapid Computer Aided Ligand Design and Screening of Precious Metal Extractants from TRUEX Raffinate with Experimental Validation

    SciTech Connect

    Clark, Aurora Sue; Wall, Nathalie; Benny, Paul

    2015-11-16

    through the design of a software program that uses state-of-the-art computational combinatorial chemistry, and is developed and validated with experimental data acquisition; the resulting tool allows for rapid design and screening of new ligands for the extraction of precious metals from SNF. This document describes the software that has been produced, ligands that have been designed, and fundamental new understandings of the extraction process of Rh(III) as a function of solution phase conditions (pH, nature of acid, etc.).

  16. Comparison of 1D and 2D CSR Models with Application to the FERMI@ELETTRA Bunch Compressors

    SciTech Connect

    Bassi, G.; Ellison, J.A.; Heinemann, K.

    2011-03-28

    We compare our 2D mean field (Vlasov-Maxwell) treatment of coherent synchrotron radiation (CSR) effects with 1D approximations of the CSR force which are commonly implemented in CSR codes. In our model we track particles in 4D phase space and calculate 2D forces [1]. The major cost in our calculation is the computation of the 2D force. To speed up the computation and improve 1D models we also investigate approximations to our exact 2D force. As an application, we present numerical results for the FERMI@Elettra first bunch compressor with the configuration described in [1].

  17. [Lung cancer screening].

    PubMed

    Sánchez González, M

    2014-01-01

    Lung cancer is a very important disease, curable in early stages. For decades, there have been trials trying to show the utility of chest x-ray or computed tomography in lung cancer screening. In 2011, the National Lung Screening Trial results were published, showing a 20% reduction in lung cancer mortality in patients screened with low-dose computed tomography for three years. These results are very promising and several scientific societies have included lung cancer screening in their guidelines. Nevertheless, we have to be aware of lung cancer screening risks, such as overdiagnosis, radiation, and false positive results. Moreover, there are many issues to be solved, including choosing the appropriate group to be screened, the duration of the screening program, intervals between screenings, and cost-effectiveness. Ongoing trials will probably answer some of these questions. This article reviews the current evidence on lung cancer screening.

  18. WFR-2D: an analytical model for PWAS-generated 2D ultrasonic guided wave propagation

    NASA Astrophysics Data System (ADS)

    Shen, Yanfeng; Giurgiutiu, Victor

    2014-03-01

    This paper presents WaveFormRevealer 2-D (WFR-2D), an analytical predictive tool for the simulation of 2-D ultrasonic guided wave propagation and interaction with damage. The design of structural health monitoring (SHM) systems and self-aware smart structures requires the exploration of a wide range of parameters to achieve best detection and quantification of certain types of damage. Such need for parameter exploration on sensor dimension, location, guided wave characteristics (mode type, frequency, wavelength, etc.) can be best satisfied with analytical models which are fast and efficient. The analytical model was constructed based on the exact 2-D Lamb wave solution using Bessel and Hankel functions. Damage effects were inserted in the model by considering the damage as a secondary wave source with complex-valued directivity scattering coefficients containing both amplitude and phase information from wave-damage interaction. The analytical procedure was coded with MATLAB, and a predictive simulation tool called WaveFormRevealer 2-D was developed. The wave-damage interaction coefficients (WDICs) were extracted from harmonic analysis of local finite element model (FEM) with artificial non-reflective boundaries (NRB). The WFR-2D analytical simulation results were compared and verified with full scale multiphysics finite element models and experiments with scanning laser vibrometer. First, Lamb wave propagation in a pristine aluminum plate was simulated with WFR-2D, compared with finite element results, and verified by experiments. Then, an inhomogeneity was machined into the plate to represent damage. Analytical modeling was carried out, and verified by finite element simulation and experiments. This paper finishes with conclusions and suggestions for future work.
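
    As a small illustration of the Hankel-function building block mentioned above (not the WFR-2D code itself), the sketch below evaluates the outgoing 2-D cylindrical wave H0^(1)(kr) on a grid; the wavenumber is an assumed value.

    ```python
    # Illustrative sketch only: the outgoing 2-D cylindrical wave from a point
    # source is proportional to the Hankel function H0^(1)(k r). The wavenumber
    # is an assumed value, not a WFR-2D parameter.
    import numpy as np
    from scipy.special import hankel1

    k = 2 * np.pi / 0.02                     # assumed wavenumber for a 20 mm wavelength
    x = np.linspace(-0.2, 0.2, 201)
    y = np.linspace(-0.2, 0.2, 201)
    X, Y = np.meshgrid(x, y)
    r = np.sqrt(X**2 + Y**2) + 1e-9          # avoid the singularity at r = 0

    field = hankel1(0, k * r)                # complex outgoing wave amplitude
    print(np.abs(field[100, 150]))           # |field| at a sample point
    ```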

  19. Microwave Assisted 2D Materials Exfoliation

    NASA Astrophysics Data System (ADS)

    Wang, Yanbin

    Two-dimensional materials have emerged as extremely important materials with applications ranging from energy and environmental science to electronics and biology. Here we report our discovery of a universal, ultrafast, green, solvo-thermal technology for producing excellent-quality, few-layered nanosheets in liquid phase from well-known 2D materials such as hexagonal boron nitride (h-BN), graphite, and MoS2. We start by mixing the uniform bulk-layered material with a common organic solvent that matches its surface energy to reduce the van der Waals attractive interactions between the layers; next, the solutions are heated in a commercial microwave oven to overcome the energy barrier between the bulk and few-layer states. We discovered that the minutes-long rapid exfoliation process is highly temperature dependent, which requires precise thermal management to obtain high-quality inks. We hypothesize a possible mechanism of this proposed solvo-thermal process; our theory supports the basis of this novel technique for exfoliating high-quality, layered 2D materials through an as yet unrecognized role of the solvent.

  20. Multienzyme Inkjet Printed 2D Arrays.

    PubMed

    Gdor, Efrat; Shemesh, Shay; Magdassi, Shlomo; Mandler, Daniel

    2015-08-19

    The use of printing to produce 2D arrays is well established, and should be relatively facile to adapt for the purpose of printing biomaterials; however, very few studies have been published using enzyme solutions as inks. Among the printing technologies, inkjet printing is highly suitable for printing biomaterials and specifically enzymes, as it offers many advantages. Formulation of the inkjet inks is relatively simple and can be adjusted to a variety of biomaterials, while providing nonharmful environment to the enzymes. Here we demonstrate the applicability of inkjet printing for patterning multiple enzymes in a predefined array in a very straightforward, noncontact method. Specifically, various arrays of the enzymes glucose oxidase (GOx), invertase (INV) and horseradish peroxidase (HP) were printed on aminated glass surfaces, followed by immobilization using glutardialdehyde after printing. Scanning electrochemical microscopy (SECM) was used for imaging the printed patterns and to ascertain the enzyme activity. The successful formation of 2D arrays consisting of enzymes was explored as a means of developing the first surface confined enzyme based logic gates. Principally, XOR and AND gates, each consisting of two enzymes as the Boolean operators, were assembled, and their operation was studied by SECM. PMID:26214072

  1. 2-D or not 2-D, that is the question: A Northern California test

    SciTech Connect

    Mayeda, K; Malagnini, L; Phillips, W S; Walter, W R; Dreger, D

    2005-06-06

    Reliable estimates of the seismic source spectrum are necessary for accurate magnitude, yield, and energy estimation. In particular, how seismic radiated energy scales with increasing earthquake size has been the focus of recent debate within the community and has direct implications on earthquake source physics studies as well as hazard mitigation. The 1-D coda methodology of Mayeda et al. has provided the lowest variance estimate of the source spectrum when compared against traditional approaches that use direct S-waves, thus making it ideal for networks that have sparse station distribution. The 1-D coda methodology has been mostly confined to regions of approximately uniform complexity. For larger, more geophysically complicated regions, 2-D path corrections may be required. The complicated tectonics of the northern California region coupled with high quality broadband seismic data provides for an ideal "apples-to-apples" test of 1-D and 2-D path assumptions on direct waves and their coda. Using the same station and event distribution, we compared 1-D and 2-D path corrections and observed the following results: (1) 1-D coda results reduced the amplitude variance relative to direct S-waves by roughly a factor of 8 (800%); (2) Applying a 2-D correction to the coda resulted in up to 40% variance reduction from the 1-D coda results; (3) 2-D direct S-wave results, though better than 1-D direct waves, were significantly worse than the 1-D coda. We found that coda-based moment-rate source spectra derived from the 2-D approach were essentially identical to those from the 1-D approach for frequencies less than ~0.7 Hz, however for the high frequencies (0.7 ≤ f ≤ 8.0 Hz), the 2-D approach resulted in inter-station scatter that was generally 10-30% smaller. For complex regions where data are plentiful, a 2-D approach can significantly improve upon the simple 1-D assumption. In regions where only 1-D coda correction is available it is still preferable over 2

  2. Canard configured aircraft with 2-D nozzle

    NASA Technical Reports Server (NTRS)

    Child, R. D.; Henderson, W. P.

    1978-01-01

    A closely-coupled canard fighter with vectorable two-dimensional nozzle was designed for enhanced transonic maneuvering. The HiMAT maneuver goal of a sustained 8g turn at a free-stream Mach number of 0.9 and 30,000 feet was the primary design consideration. The aerodynamic design process was initiated with a linear theory optimization minimizing the zero percent suction drag including jet effects and refined with three-dimensional nonlinear potential flow techniques. Allowances were made for mutual interference and viscous effects. The design process to arrive at the resultant configuration is described, and the design of a powered 2-D nozzle model to be tested in the LRC 16-foot Propulsion Wind Tunnel is shown.

  3. 2D Electrostatic Actuation of Microshutter Arrays

    NASA Technical Reports Server (NTRS)

    Burns, Devin E.; Oh, Lance H.; Li, Mary J.; Kelly, Daniel P.; Kutyrev, Alexander S.; Moseley, Samuel H.

    2015-01-01

    Electrostatically actuated microshutter arrays consisting of rotational microshutters (shutters that rotate about a torsion bar) were designed and fabricated through the use of models and experiments. Design iterations focused on minimizing the torsional stiffness of the microshutters, while maintaining their structural integrity. Mechanical and electromechanical test systems were constructed to measure the static and dynamic behavior of the microshutters. The torsional stiffness was reduced by a factor of four over initial designs without sacrificing durability. Analysis of the resonant behavior of the microshutters demonstrates that the first resonant mode is a torsional mode occurring around 3000 Hz. At low vacuum pressures, this resonant mode can be used to significantly reduce the drive voltage necessary for actuation requiring as little as 25V. 2D electrostatic latching and addressing was demonstrated using both a resonant and pulsed addressing scheme.

  4. 2D Electrostatic Actuation of Microshutter Arrays

    NASA Technical Reports Server (NTRS)

    Burns, Devin E.; Oh, Lance H.; Li, Mary J.; Jones, Justin S.; Kelly, Daniel P.; Zheng, Yun; Kutyrev, Alexander S.; Moseley, Samuel H.

    2015-01-01

    An electrostatically actuated microshutter array consisting of rotational microshutters (shutters that rotate about a torsion bar) was designed and fabricated through the use of models and experiments. Design iterations focused on minimizing the torsional stiffness of the microshutters, while maintaining their structural integrity. Mechanical and electromechanical test systems were constructed to measure the static and dynamic behavior of the microshutters. The torsional stiffness was reduced by a factor of four over initial designs without sacrificing durability. Analysis of the resonant behavior of the microshutter arrays demonstrates that the first resonant mode is a torsional mode occurring around 3000 Hz. At low vacuum pressures, this resonant mode can be used to significantly reduce the drive voltage necessary for actuation, requiring as little as 25V. 2D electrostatic latching and addressing was demonstrated using both a resonant and pulsed addressing scheme.

  5. 2D quantum gravity from quantum entanglement.

    PubMed

    Gliozzi, F

    2011-01-21

    In quantum systems with many degrees of freedom the replica method is a useful tool to study the entanglement of arbitrary spatial regions. We apply it in a way that allows these regions to backreact. As a consequence, they become dynamical subsystems whose position, form, and extension are determined by their interaction with the whole system. We analyze, in particular, quantum spin chains described at criticality by a conformal field theory. Its coupling to the Gibbs ensemble of all possible subsystems is relevant and drives the system into a new fixed point, which is argued to be that of 2D quantum gravity coupled to this system. Numerical experiments on the critical Ising model show that the new critical exponents agree with those predicted by the formula of Knizhnik, Polyakov, and Zamolodchikov.

  6. Graphene suspensions for 2D printing

    NASA Astrophysics Data System (ADS)

    Soots, R. A.; Yakimchuk, E. A.; Nebogatikova, N. A.; Kotin, I. A.; Antonova, I. V.

    2016-04-01

    It is shown that, by processing a graphite suspension in ethanol or water by ultrasound and centrifuging, it is possible to obtain particles with thicknesses within 1-6 nm and, in the most interesting cases, 1-1.5 nm. Analogous treatment of a graphite suspension in an organic solvent eventually yields thicker particles (up to 6-10 nm thick), even upon long-term treatment. Using the proposed ink based on graphene and aqueous ethanol with ethylcellulose and terpineol additives for 2D printing, thin (~5 nm thick) films with a sheet resistance after annealing of ~30 MΩ/□ were obtained. With the ink based on the aqueous graphene suspension, the sheet resistance was ~5-12 kΩ/□ for 6- to 15-nm-thick layers with a carrier mobility of ~30-50 cm2/(V s).

  7. Metrology for graphene and 2D materials

    NASA Astrophysics Data System (ADS)

    Pollard, Andrew J.

    2016-09-01

    The application of graphene, a one atom-thick honeycomb lattice of carbon atoms with superlative properties, such as electrical conductivity, thermal conductivity and strength, has already shown that it can be used to benefit metrology itself as a new quantum standard for resistance. However, there are many application areas where graphene and other 2D materials, such as molybdenum disulphide (MoS2) and hexagonal boron nitride (h-BN), may be disruptive, areas such as flexible electronics, nanocomposites, sensing and energy storage. Applying metrology to the area of graphene is now critical to enable the new, emerging global graphene commercial world and bridge the gap between academia and industry. Measurement capabilities and expertise in a wide range of scientific areas are required to address this challenge. The combined and complementary approach of varied characterisation methods for structural, chemical, electrical and other properties, will allow the real-world issues of commercialising graphene and other 2D materials to be addressed. Here, examples of metrology challenges that have been overcome through a multi-technique or new approach are discussed. Firstly, the structural characterisation of defects in both graphene and MoS2 via Raman spectroscopy is described, and how nanoscale mapping of vacancy defects in graphene is also possible using tip-enhanced Raman spectroscopy (TERS). Furthermore, the chemical characterisation and removal of polymer residue on chemical vapour deposition (CVD) grown graphene via secondary ion mass spectrometry (SIMS) is detailed, as well as the chemical characterisation of iron films used to grow large domain single-layer h-BN through CVD growth, revealing how contamination of the substrate itself plays a role in the resulting h-BN layer. In addition, the role of international standardisation in this area is described, outlining the current work ongoing in both the International Organization of Standardization (ISO) and the

  8. Progress in 2D photonic crystal Fano resonance photonics

    NASA Astrophysics Data System (ADS)

    Zhou, Weidong; Zhao, Deyin; Shuai, Yi-Chen; Yang, Hongjun; Chuwongin, Santhad; Chadha, Arvinder; Seo, Jung-Hun; Wang, Ken X.; Liu, Victor; Ma, Zhenqiang; Fan, Shanhui

    2014-01-01

    In contrast to a conventional symmetric Lorentzian resonance, Fano resonance is predominantly used to describe asymmetric-shaped resonances, which arise from the constructive and destructive interference of discrete resonance states with broadband continuum states. This phenomenon and the underlying mechanisms, being common and ubiquitous in many realms of physical sciences, can be found in a wide variety of nanophotonic structures and quantum systems, such as quantum dots, photonic crystals, plasmonics, and metamaterials. The asymmetric and steep dispersion of the Fano resonance profile promises applications for a wide range of photonic devices, such as optical filters, switches, sensors, broadband reflectors, lasers, detectors, slow-light and non-linear devices, etc. With advances in nanotechnology, impressive progress has been made in the emerging field of nanophotonic structures. One of the most attractive nanophotonic structures for integrated photonics is the two-dimensional photonic crystal slab (2D PCS), which can be integrated into a wide range of photonic devices. The objective of this manuscript is to provide an in depth review of the progress made in the general area of Fano resonance photonics, focusing on the photonic devices based on 2D PCS structures. General discussions are provided on the origins and characteristics of Fano resonances in 2D PCSs. A nanomembrane transfer printing fabrication technique is also reviewed, which is critical for the heterogeneous integrated Fano resonance photonics. The majority of the remaining sections review progress made on various photonic devices and structures, such as high quality factor filters, membrane reflectors, membrane lasers, detectors and sensors, as well as structures and phenomena related to Fano resonance slow light effect, nonlinearity, and optical forces in coupled PCSs. It is expected that further advances in the field will lead to more significant advances towards 3D integrated photonics, flat

  9. 2D to 3D conversion implemented in different hardware

    NASA Astrophysics Data System (ADS)

    Ramos-Diaz, Eduardo; Gonzalez-Huitron, Victor; Ponomaryov, Volodymyr I.; Hernandez-Fragoso, Araceli

    2015-02-01

    Conversion of available 2D data into 3D content is a hot topic for providers and, in general, for the success of 3D applications. It relies entirely on virtual view synthesis of a second view given the original 2D video. Disparity map (DM) estimation is a central task in 3D generation but remains a very difficult problem for rendering novel images precisely. There are different approaches to DM reconstruction, among them manual and semiautomatic methods that can produce high-quality DMs, but they are time consuming and computationally expensive. In this paper, several hardware implementations of frameworks for automatic 3D color video generation from a real 2D video sequence are proposed. The novel framework includes simultaneous processing of stereo pairs using the following blocks: CIE L*a*b* color space conversion, stereo matching via a pyramidal scheme, color segmentation by k-means on the a*b* color plane, adaptive post-filtering, DM estimation using stereo matching between left and right images (or neighboring frames in a video), adaptive post-filtering, and finally anaglyph 3D scene generation. The technique has been implemented on a DSP TMS320DM648, in Matlab's Simulink on a PC with Windows 7, and on a graphics card (NVIDIA Quadro K2000), demonstrating that the proposed approach can be applied in real-time processing mode. The processing times, mean Structural Similarity Index Measure (SSIM), and Bad Matching Pixels (B) values for the different hardware implementations (GPU, single CPU, and DSP) are reported in this paper.
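
    A simplified sketch of two stages of such a pipeline, disparity-map estimation and anaglyph generation, using OpenCV; it does not reproduce the paper's pyramidal matching, k-means segmentation, adaptive post-filtering, or the DSP/GPU implementations, and the file names are placeholders.

    ```python
    # Simplified sketch of disparity estimation and anaglyph generation with OpenCV.
    # File names are placeholders, not the paper's data.
    import cv2
    import numpy as np

    left = cv2.imread("left_frame.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right_frame.png", cv2.IMREAD_GRAYSCALE)
    assert left is not None and right is not None, "replace the placeholder paths"

    # Block-matching disparity map (DM) estimation
    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left, right).astype(np.float32) / 16.0

    # Red-cyan anaglyph: red channel from the left view, green/blue from the right
    # (OpenCV stores channels in BGR order).
    anaglyph = cv2.merge([right, right, left])
    cv2.imwrite("anaglyph.png", anaglyph)
    ```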

  10. 2-D Model for Normal and Sickle Cell Blood Microcirculation

    NASA Astrophysics Data System (ADS)

    Tekleab, Yonatan; Harris, Wesley

    2011-11-01

    Sickle cell disease (SCD) is a genetic disorder that alters the red blood cell (RBC) structure and function such that hemoglobin (Hb) cannot effectively bind and release oxygen. Previous computational models have been designed to study the microcirculation for insight into blood disorders such as SCD. Our novel 2-D computational model represents a fast, time-efficient method developed to analyze flow dynamics, O2 diffusion, and cell deformation in the microcirculation. The model uses a finite difference, Crank-Nicolson scheme to compute the flow and O2 concentration, and the level set computational method to advect the RBC membrane on a staggered grid. Several sets of initial and boundary conditions were tested. Simulation data indicate that a few parameters are significant in the perturbation of the blood flow and O2 concentration profiles. Specifically, the Hill coefficient, arterial O2 partial pressure, O2 partial pressure at 50% Hb saturation, and cell membrane stiffness are significant factors. Results were found to be consistent with those of Le Floch [2010] and Secomb [2006].
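
    For reference, a minimal sketch of the Crank-Nicolson idea applied to plain 1-D diffusion (not the paper's coupled flow, O2, and level-set solver); grid and material parameters are illustrative.

    ```python
    # Minimal Crank-Nicolson sketch for 1-D diffusion: A u^{n+1} = B u^{n}.
    # Parameters are illustrative, not the paper's values.
    import numpy as np

    n, dx, dt, D = 101, 0.01, 0.001, 1e-3        # grid points, spacing, time step, diffusivity
    r = D * dt / (2 * dx**2)

    A = (np.diag(np.full(n, 1.0 + 2.0 * r))
         + np.diag(np.full(n - 1, -r), 1)
         + np.diag(np.full(n - 1, -r), -1))
    B = (np.diag(np.full(n, 1.0 - 2.0 * r))
         + np.diag(np.full(n - 1, r), 1)
         + np.diag(np.full(n - 1, r), -1))

    u = np.exp(-((np.arange(n) * dx - 0.5) ** 2) / 0.005)   # initial concentration bump
    u[0] = u[-1] = 0.0                                      # Dirichlet boundaries
    for _ in range(200):
        u = np.linalg.solve(A, B @ u)
        u[0] = u[-1] = 0.0
    print(u.max())
    ```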

  11. Stereoscopic Vascular Models of the Head and Neck: A Computed Tomography Angiography Visualization

    ERIC Educational Resources Information Center

    Cui, Dongmei; Lynch, James C.; Smith, Andrew D.; Wilson, Timothy D.; Lehman, Michael N.

    2016-01-01

    Computer-assisted 3D models are used in some medical and allied health science schools; however, they are often limited to online use and 2D flat screen-based imaging. Few schools take advantage of 3D stereoscopic learning tools in anatomy education and clinically relevant anatomical variations when teaching anatomy. A new approach to teaching…

  12. CYP2D6*36 gene arrangements within the cyp2d6 locus: association of CYP2D6*36 with poor metabolizer status.

    PubMed

    Gaedigk, Andrea; Bradford, L Dianne; Alander, Sarah W; Leeder, J Steven

    2006-04-01

    Unexplained cases of CYP2D6 genotype/phenotype discordance continue to be discovered. In previous studies, several African Americans with a poor metabolizer phenotype carried the reduced-function CYP2D6*10 allele in combination with a nonfunctional allele. We pursued the possibility that these alleles harbor either a known sequence variation (i.e., CYP2D6*36 carrying a gene conversion in exon 9 along with the CYP2D6*10-defining 100C>T single-nucleotide polymorphism) or novel sequence variation(s). Discordant cases were evaluated by long-range polymerase chain reaction (PCR) to test for gene rearrangement events, and a 6.6-kilobase pair PCR product encompassing the CYP2D6 gene was cloned and entirely sequenced. Thereafter, allele frequencies were determined in different study populations comprising whites, African Americans, and Asians. Analyses covering the CYP2D7 to 2D6 gene region established that CYP2D6*36 exists not only as a gene duplication (CYP2D6*36x2) or in tandem with *10 (CYP2D6*36+*10), as previously reported, but also by itself. This "single" CYP2D6*36 allele was found in nine African Americans and one Asian, but was absent in the whites tested. Ultimately, the presence of CYP2D6*36 resolved genotype/phenotype discordance in three cases. We also discovered an exon 9 conversion-positive CYP2D6*4 gene in a duplication arrangement (CYP2D6*4Nx2) and a CYP2D6*4 allele lacking 100C>T (CYP2D6*4M) in two white subjects. The discovery of an allele that carries only one CYP2D6*36 gene copy provides unequivocal evidence that both CYP2D6*36 and *36x2 are associated with a poor metabolizer phenotype. Given a combined frequency of between 0.5 and 3% in African Americans and Asians, genotyping for CYP2D6*36 should improve the accuracy of genotype-based phenotype prediction in these populations.

  13. Facial biometrics based on 2D vector geometry

    NASA Astrophysics Data System (ADS)

    Malek, Obaidul; Venetsanopoulos, Anastasios; Androutsos, Dimitrios

    2014-05-01

    The main challenge of facial biometrics is its robustness and ability to adapt to changes in position, orientation, facial expression, and illumination. This research addresses the predominant deficiencies in this regard and systematically investigates a facial authentication system in the Euclidean domain. In the proposed method, Euclidean geometry in a 2D vector space is constructed for feature extraction and authentication. In particular, each assigned point of the candidates' biometric features is considered to be a 2D geometrical coordinate in the Euclidean vector space. Algebraic shapes of the extracted candidate features are also computed and compared. The proposed authentication method is tested on images from the public "Put Face Database". The performance of the proposed method is evaluated based on Correct Recognition (CRR), False Acceptance (FAR), and False Rejection (FRR) rates. The theoretical foundation of the proposed method, along with the experimental results, is also presented in this paper. The experimental results demonstrate the effectiveness of the proposed method.

  14. 2D Quantum Transport Modeling in Nanoscale MOSFETs

    NASA Technical Reports Server (NTRS)

    Svizhenko, Alexei; Anantram, M. P.; Govindan, T. R.; Biegel, B.

    2001-01-01

    We have developed physical approximations and computer code capable of realistically simulating 2-D nanoscale transistors, using the non-equilibrium Green's function (NEGF) method. This is the most accurate full quantum model yet applied to 2-D device simulation. Open boundary conditions, oxide tunneling and phase-breaking scattering are treated on an equal footing. Electron bandstructure is treated within the anisotropic effective mass approximation. We present the results of our simulations of MIT 25 and 90 nm "well-tempered" MOSFETs and compare them to those of classical and quantum corrected models. An important feature of the quantum model is the smaller slope of the Id-Vg curve and, consequently, the higher threshold voltage. These results are consistent with 1D Schroedinger-Poisson calculations. The effect of gate length on gate-oxide leakage and subthreshold current has been studied. The shorter gate length device has an order of magnitude smaller leakage current than the longer gate length device without a significant trade-off in on-current.

  15. E-2D Advanced Hawkeye: primary flight display

    NASA Astrophysics Data System (ADS)

    Paolillo, Paul W.; Saxena, Ragini; Garruba, Jonathan; Tripathi, Sanjay; Blanchard, Randy

    2006-05-01

    This paper is a response to the challenge of providing a large area avionics display for the E-2D AHE aircraft. The resulting display design provides a pilot with high-resolution visual information content covering an image area of almost three square feet (Active Area of Samsung display = 33.792cm x 27.0336 cm = 13.304" x 10.643" = 141.596 square inches = 0.983 sq. ft x 3 = 2.95 sq. ft). The avionics display application, design and performance being described is the Primary Flight Display for the E-2D Advanced Hawkeye aircraft. This cockpit display has a screen diagonal size of 17 inches. Three displays, with minimum bezel width, just fit within the available instrument panel area. The significant design constraints of supporting an upgrade installation have been addressed. These constraints include a display image size that is larger than the mounting opening in the instrument panel. This, therefore, requires that the Electromagnetic Interference (EMI) window, LCD panel and backlight all fit within the limited available bezel depth. High brightness and a wide dimming range are supported with a dual mode Cold Cathode Fluorescent Tube (CCFT) and LED backlight. Packaging constraints dictated the use of multiple U shaped fluorescent lamps in a direct view backlight design for a maximum display brightness of 300 foot-Lamberts. The low intensity backlight levels are provided by remote LEDs coupled through a fiber optic mesh. This architecture generates luminous uniformity within a minimum backlight depth. Cross-cockpit viewing is supported by ultra-wide field-of-view performance, including the contrast and color stability of an advanced LCD cell design. Display system design tradeoffs prioritized high optical efficiency for minimum power and weight.

  16. A new inversion method for (T2, D) 2D NMR logging and fluid typing

    NASA Astrophysics Data System (ADS)

    Tan, Maojin; Zou, Youlong; Zhou, Cancan

    2013-02-01

    One-dimensional nuclear magnetic resonance (1D NMR) logging technology has some significant limitations in fluid typing. However, not only can two-dimensional nuclear magnetic resonance (2D NMR) provide some accurate porosity parameters, but it can also identify fluids more accurately than 1D NMR. In this paper, based on the relaxation mechanism of (T2, D) 2D NMR in a gradient magnetic field, a hybrid inversion method that combines least-squares-based QR decomposition (LSQR) and truncated singular value decomposition (TSVD) is examined in the 2D NMR inversion of various fluid models. The forward modeling and inversion tests are performed in detail with different acquisition parameters, such as magnetic field gradients (G) and echo spacing (TE) groups. The simulated results are discussed and described in detail, the influence of the above-mentioned observation parameters on the inversion accuracy is investigated and analyzed, and the observation parameters in multi-TE activation are optimized. Furthermore, the hybrid inversion can be applied to quantitatively determine the fluid saturation. To study the effects of noise level on the hybrid method and inversion results, the numerical simulation experiments are performed using different signal-to-noise-ratios (SNRs), and the effect of different SNRs on fluid typing using three fluid models are discussed and analyzed in detail.
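
    A hedged sketch of the truncated-SVD component of such a hybrid inversion (the LSQR component is available as scipy.sparse.linalg.lsqr); the kernel matrix, data, and truncation level below are placeholders, not the paper's (T2, D) kernels.

    ```python
    # Hedged sketch: truncated-SVD solution of K m ~= d, the TSVD half of the
    # hybrid LSQR/TSVD idea discussed above. K, d and k are placeholders.
    import numpy as np

    def tsvd_solve(K: np.ndarray, d: np.ndarray, k: int) -> np.ndarray:
        """Solve K m ~= d keeping only the k largest singular values."""
        U, s, Vt = np.linalg.svd(K, full_matrices=False)
        s_inv = np.zeros_like(s)
        s_inv[:k] = 1.0 / s[:k]
        return Vt.T @ (s_inv * (U.T @ d))

    K = np.random.default_rng(1).normal(size=(80, 60))   # placeholder kernel matrix
    m_true = np.zeros(60)
    m_true[20:30] = 1.0
    d = K @ m_true + 0.01 * np.random.default_rng(2).normal(size=80)

    m_est = tsvd_solve(K, d, k=25)
    print(np.linalg.norm(m_est - m_true))
    ```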

  17. High-Throughput Computational Screening of Electrical and Phonon Properties of Two-Dimensional Transition Metal Dichalcogenides

    NASA Astrophysics Data System (ADS)

    Williamson, Izaak; Hernandez, Andres Correa; Wong-Ng, Winnie; Li, Lan

    2016-08-01

    Two-dimensional transition metal dichalcogenides (2D-TMDs) are of broadening research interest due to their novel physical, electrical, and thermoelectric properties. Having the chemical formula MX2, where M is a transition metal and X is a chalcogen, there are many possible combinations to consider for materials-by-design exploration. By identifying novel compositions and utilizing the lower dimensionality, which allows for improved thermoelectric performance (e.g., increased Seebeck coefficients without sacrificing electron concentration), MX2 materials are promising candidates for thermoelectric applications. However, to develop these materials into wide-scale use, it is crucial to comprehensively understand the compositional effects. This work investigates the structure, electronic, and phonon properties of 18 different MX2 material compositions as a benchmark to explore the impact of various elements. There is significant correlation between properties of constituent transition metals (atomic mass and radius) and the structure/properties of the corresponding 2D-TMDs. As the mass of M increases, the n-type power factor and phonon frequency gap increase. Similarly, increases in the radius of M lead to increased layer thickness and Seebeck coefficient S. Our results identify key factors to optimize MX2 compositions for desired performance.

  18. High-Throughput Computational Screening of Electrical and Phonon Properties of Two-Dimensional Transition Metal Dichalcogenides

    NASA Astrophysics Data System (ADS)

    Williamson, Izaak; Hernandez, Andres Correa; Wong-Ng, Winnie; Li, Lan

    2016-10-01

    Two-dimensional transition metal dichalcogenides (2D-TMDs) are of broadening research interest due to their novel physical, electrical, and thermoelectric properties. Having the chemical formula MX2, where M is a transition metal and X is a chalcogen, there are many possible combinations to consider for materials-by-design exploration. By identifying novel compositions and utilizing the lower dimensionality, which allows for improved thermoelectric performance (e.g., increased Seebeck coefficients without sacrificing electron concentration), MX2 materials are promising candidates for thermoelectric applications. However, to develop these materials into wide-scale use, it is crucial to comprehensively understand the compositional effects. This work investigates the structure, electronic, and phonon properties of 18 different MX2 material compositions as a benchmark to explore the impact of various elements. There is significant correlation between properties of constituent transition metals (atomic mass and radius) and the structure/properties of the corresponding 2D-TMDs. As the mass of M increases, the n-type power factor and phonon frequency gap increase. Similarly, increases in the radius of M lead to increased layer thickness and Seebeck coefficient S. Our results identify key factors to optimize MX2 compositions for desired performance.

  19. Learning from graphically integrated 2D and 3D representations improves retention of neuroanatomy

    NASA Astrophysics Data System (ADS)

    Naaz, Farah

    Visualizations in the form of computer-based learning environments are highly encouraged in science education, especially for teaching spatial material. Some spatial material, such as sectional neuroanatomy, is very challenging to learn. It involves learning the two dimensional (2D) representations that are sampled from the three dimensional (3D) object. In this study, a computer-based learning environment was used to explore the hypothesis that learning sectional neuroanatomy from a graphically integrated 2D and 3D representation will lead to better learning outcomes than learning from a sequential presentation. The integrated representation explicitly demonstrates the 2D-3D transformation and should lead to effective learning. This study was conducted using a computer graphical model of the human brain. There were two learning groups: Whole then Sections, and Integrated 2D3D. Both groups learned whole anatomy (3D neuroanatomy) before learning sectional anatomy (2D neuroanatomy). The Whole then Sections group then learned sectional anatomy using 2D representations only. The Integrated 2D3D group learned sectional anatomy from a graphically integrated 3D and 2D model. A set of tests for generalization of knowledge to interpreting biomedical images was conducted immediately after learning was completed. The order of presentation of the tests of generalization of knowledge was counterbalanced across participants to explore a secondary hypothesis of the study: preparation for future learning. If the computer-based instruction programs used in this study are effective tools for teaching anatomy, the participants should continue learning neuroanatomy with exposure to new representations. A test of long-term retention of sectional anatomy was conducted 4-8 weeks after learning was completed. The Integrated 2D3D group was better than the Whole then Sections

  20. 2D/3D Visual Tracker for Rover Mast

    NASA Technical Reports Server (NTRS)

    Bajracharya, Max; Madison, Richard W.; Nesnas, Issa A.; Bandari, Esfandiar; Kunz, Clayton; Deans, Matt; Bualat, Maria

    2006-01-01

    A visual-tracker computer program controls an articulated mast on a Mars rover to keep a designated feature (a target) in view while the rover drives toward the target, avoiding obstacles. Several prior visual-tracker programs have been tested on rover platforms; most require very small and well-estimated motion between consecutive image frames a requirement that is not realistic for a rover on rough terrain. The present visual-tracker program is designed to handle large image motions that lead to significant changes in feature geometry and photometry between frames. When a point is selected in one of the images acquired from stereoscopic cameras on the mast, a stereo triangulation algorithm computes a three-dimensional (3D) location for the target. As the rover moves, its body-mounted cameras feed images to a visual-odometry algorithm, which tracks two-dimensional (2D) corner features and computes their old and new 3D locations. The algorithm rejects points, the 3D motions of which are inconsistent with a rigid-world constraint, and then computes the apparent change in the rover pose (i.e., translation and rotation). The mast pan and tilt angles needed to keep the target centered in the field-of-view of the cameras (thereby minimizing the area over which the 2D-tracking algorithm must operate) are computed from the estimated change in the rover pose, the 3D position of the target feature, and a model of kinematics of the mast. If the motion between the consecutive frames is still large (i.e., 3D tracking was unsuccessful), an adaptive view-based matching technique is applied to the new image. This technique uses correlation-based template matching, in which a feature template is scaled by the ratio between the depth in the original template and the depth of pixels in the new image. This is repeated over the entire search window and the best correlation results indicate the appropriate match. The program could be a core for building application programs for systems
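
    A rough sketch of the depth-scaled, correlation-based template matching step described above (not the flight software); the images and depth values are placeholders.

    ```python
    # Illustrative sketch: scale the template by the ratio of the original depth to
    # the current depth, then find the best normalized cross-correlation match.
    # Images and depths are placeholders.
    import cv2

    template = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)
    new_image = cv2.imread("new_frame.png", cv2.IMREAD_GRAYSCALE)
    assert template is not None and new_image is not None, "replace the placeholder paths"

    depth_at_template = 4.0      # assumed depth (m) when the template was captured
    depth_in_new_image = 2.5     # assumed current depth of the target (m)

    scale = depth_at_template / depth_in_new_image   # closer target appears larger
    scaled = cv2.resize(template, None, fx=scale, fy=scale, interpolation=cv2.INTER_LINEAR)

    result = cv2.matchTemplate(new_image, scaled, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    print("best correlation", max_val, "at", max_loc)
    ```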

  1. Advecting Procedural Textures for 2D Flow Animation

    NASA Technical Reports Server (NTRS)

    Kao, David; Pang, Alex; Moran, Pat (Technical Monitor)

    2001-01-01

    This paper proposes the use of specially generated 3D procedural textures for visualizing steady state 2D flow fields. We use the flow field to advect and animate the texture over time. However, using standard texture advection techniques and arbitrary textures will introduce some undesirable effects such as: (a) expanding texture from a critical source point, (b) streaking pattern from the boundary of the flowfield, (c) crowding of advected textures near an attracting spiral or sink, and (d) absent or lack of textures in some regions of the flow. This paper proposes a number of strategies to solve these problems. We demonstrate how the technique works using both synthetic data and computational fluid dynamics data.
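
    The basic advection step can be sketched as a backward (semi-Lagrangian) lookup: each output pixel samples the texture at the location the flow carried it from. The sketch below uses a synthetic rotational flow and a random texture rather than the paper's procedural textures.

    ```python
    # Rough sketch of the basic texture-advection step: backward lookup along a
    # steady 2-D flow field. Flow and texture are synthetic placeholders.
    import numpy as np
    from scipy.ndimage import map_coordinates

    n = 256
    yy, xx = np.mgrid[0:n, 0:n].astype(float)

    # Placeholder steady flow: solid-body rotation about the image centre
    u = -(yy - n / 2) * 0.02          # x-component of velocity
    v = (xx - n / 2) * 0.02           # y-component of velocity

    texture = np.random.default_rng(0).random((n, n))   # placeholder texture

    frame = texture
    for _ in range(30):                                  # animate over time
        # Sample where each pixel's particle came from one step earlier
        frame = map_coordinates(frame, [yy - v, xx - u], order=1, mode="wrap")
    ```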

  2. Fast and robust recognition and localization of 2D objects

    NASA Astrophysics Data System (ADS)

    Otterbach, Rainer; Gerdes, Rolf; Kammueller, R.

    1994-11-01

    The paper presents a vision system which provides robust model-based identification and localization of 2-D objects in industrial scenes. A symbolic image description based on the polygonal approximation of the object silhouettes is extracted in video real time by the use of dedicated hardware. Candidate objects are selected from the model database using a time and memory efficient hashing algorithm. Any candidate object is submitted to the next computation stage, which generates pose hypotheses by assigning model to image contours. Corresponding continuous measures of similarity are derived from the turning functions of the curves. Finally, the previously generated hypotheses are verified using a voting scheme in transformation space. Experimental results reveal the fault tolerance of the vision system with regard to noisy and split image contours as well as partial occlusion of objects. The short cycle time and the easy adaptability of the vision system make it well suited for a wide variety of applications in industrial automation.

  3. Radiofrequency Spectroscopy and Thermodynamics of Fermi Gases in the 2D to Quasi-2D Dimensional Crossover

    NASA Astrophysics Data System (ADS)

    Cheng, Chingyun; Kangara, Jayampathi; Arakelyan, Ilya; Thomas, John

    2016-05-01

    We tune the dimensionality of a strongly interacting degenerate 6 Li Fermi gas from 2D to quasi-2D, by adjusting the radial confinement of pancake-shaped clouds to control the radial chemical potential. In the 2D regime with weak radial confinement, the measured pair binding energies are in agreement with 2D-BCS mean field theory, which predicts dimer pairing energies in the many-body regime. In the qausi-2D regime obtained with increased radial confinement, the measured pairing energy deviates significantly from 2D-BCS theory. In contrast to the pairing energy, the measured radii of the cloud profiles are not fit by 2D-BCS theory in either the 2D or quasi-2D regimes, but are fit in both regimes by a beyond mean field polaron-model of the free energy. Supported by DOE, ARO, NSF, and AFOSR.

  4. Competing coexisting phases in 2D water

    PubMed Central

    Zanotti, Jean-Marc; Judeinstein, Patrick; Dalla-Bernardina, Simona; Creff, Gaëlle; Brubach, Jean-Blaise; Roy, Pascale; Bonetti, Marco; Ollivier, Jacques; Sakellariou, Dimitrios; Bellissent-Funel, Marie-Claire

    2016-01-01

    The properties of bulk water come from a delicate balance of interactions on length scales encompassing several orders of magnitude: i) the Hydrogen Bond (HBond) at the molecular scale and ii) the extension of this HBond network up to the macroscopic level. Here, we address the physics of water when the three dimensional extension of the HBond network is frustrated, so that the water molecules are forced to organize in only two dimensions. We account for the large scale fluctuating HBond network by an analytical mean-field percolation model. This approach provides a coherent interpretation of the different events detected experimentally (calorimetry, neutron, NMR, near and far infra-red spectroscopies) in interfacial water at 160, 220 and 250 K. Starting from an amorphous state of water at low temperature, these transitions are respectively interpreted as the onset of creation of transient low density patches of 4-HBonded molecules at 160 K, the percolation of these domains at 220 K and finally the total invasion of the surface by them at 250 K. The source of this surprising behaviour in 2D is the frustration of the natural bulk tetrahedral local geometry and the underlying very significant increase in entropy of the interfacial water molecules. PMID:27185018

  5. Phase Engineering of 2D Tin Sulfides.

    PubMed

    Mutlu, Zafer; Wu, Ryan J; Wickramaratne, Darshana; Shahrezaei, Sina; Liu, Chueh; Temiz, Selcuk; Patalano, Andrew; Ozkan, Mihrimah; Lake, Roger K; Mkhoyan, K A; Ozkan, Cengiz S

    2016-06-01

    Tin sulfides can exist in a variety of phases and polytypes due to the different oxidation states of Sn. A subset of these phases and polytypes take the form of layered 2D structures that give rise to a wide host of electronic and optical properties. Hence, achieving control over the phase, polytype, and thickness of tin sulfides is necessary to utilize this wide range of properties exhibited by the compound. This study reports on phase-selective growth of both hexagonal tin (IV) sulfide SnS2 and orthorhombic tin (II) sulfide SnS crystals with diameters of over tens of microns on SiO2 substrates through atmospheric pressure vapor-phase method in a conventional horizontal quartz tube furnace with SnO2 and S powders as the source materials. Detailed characterization of each phase of tin sulfide crystals is performed using various microscopy and spectroscopy methods, and the results are corroborated by ab initio density functional theory calculations. PMID:27099950

  6. Phase Engineering of 2D Tin Sulfides.

    PubMed

    Mutlu, Zafer; Wu, Ryan J; Wickramaratne, Darshana; Shahrezaei, Sina; Liu, Chueh; Temiz, Selcuk; Patalano, Andrew; Ozkan, Mihrimah; Lake, Roger K; Mkhoyan, K A; Ozkan, Cengiz S

    2016-06-01

    Tin sulfides can exist in a variety of phases and polytypes due to the different oxidation states of Sn. A subset of these phases and polytypes take the form of layered 2D structures that give rise to a wide host of electronic and optical properties. Hence, achieving control over the phase, polytype, and thickness of tin sulfides is necessary to utilize this wide range of properties exhibited by the compound. This study reports on phase-selective growth of both hexagonal tin (IV) sulfide SnS2 and orthorhombic tin (II) sulfide SnS crystals with diameters of over tens of microns on SiO2 substrates through atmospheric pressure vapor-phase method in a conventional horizontal quartz tube furnace with SnO2 and S powders as the source materials. Detailed characterization of each phase of tin sulfide crystals is performed using various microscopy and spectroscopy methods, and the results are corroborated by ab initio density functional theory calculations.

  7. Competing coexisting phases in 2D water

    NASA Astrophysics Data System (ADS)

    Zanotti, Jean-Marc; Judeinstein, Patrick; Dalla-Bernardina, Simona; Creff, Gaëlle; Brubach, Jean-Blaise; Roy, Pascale; Bonetti, Marco; Ollivier, Jacques; Sakellariou, Dimitrios; Bellissent-Funel, Marie-Claire

    2016-05-01

    The properties of bulk water come from a delicate balance of interactions on length scales encompassing several orders of magnitude: i) the Hydrogen Bond (HBond) at the molecular scale and ii) the extension of this HBond network up to the macroscopic level. Here, we address the physics of water when the three dimensional extension of the HBond network is frustrated, so that the water molecules are forced to organize in only two dimensions. We account for the large scale fluctuating HBond network by an analytical mean-field percolation model. This approach provides a coherent interpretation of the different events detected experimentally (calorimetry, neutron, NMR, near and far infra-red spectroscopies) in interfacial water at 160, 220 and 250 K. Starting from an amorphous state of water at low temperature, these transitions are respectively interpreted as the onset of creation of transient low density patches of 4-HBonded molecules at 160 K, the percolation of these domains at 220 K and finally the total invasion of the surface by them at 250 K. The source of this surprising behaviour in 2D is the frustration of the natural bulk tetrahedral local geometry and the underlying very significant increase in entropy of the interfacial water molecules.

  8. Ab initio modeling of 2D layered organohalide lead perovskites.

    PubMed

    Fraccarollo, Alberto; Cantatore, Valentina; Boschetto, Gabriele; Marchese, Leonardo; Cossi, Maurizio

    2016-04-28

    A number of 2D layered perovskites A2PbI4 and BPbI4, with A and B mono- and divalent ammonium and imidazolium cations, have been modeled with different theoretical methods. The periodic structures have been optimized (both in monoclinic and in triclinic systems, corresponding to eclipsed and staggered arrangements of the inorganic layers) at the DFT level, with hybrid functionals, Gaussian-type orbitals and dispersion energy corrections. With the same methods, the various contributions to the solid stabilization energy have been discussed, separating electrostatic and dispersion energies, organic-organic intralayer interactions and H-bonding effects, when applicable. Then the electronic band gaps have been computed with plane waves, at the DFT level with scalar and full relativistic potentials, and including the correlation energy through the GW approximation. Spin orbit coupling and GW effects have been combined in an additive scheme, validated by comparing the computed gap with well known experimental and theoretical results for a model system. Finally, various contributions to the computed band gaps have been discussed on some of the studied systems, by varying some geometrical parameters and by substituting one cation in another's place. PMID:27131557

  9. Comparing the Effect of Two Types of Computer Screen Background Lighting on Students' Reading Engagement and Achievement

    ERIC Educational Resources Information Center

    Botello, Jennifer A.

    2014-01-01

    With increased dependence on computer-based standardized tests to assess academic achievement, technological literacy has become an essential skill. Yet, because students have unequal access to technology, they may not have equal opportunities to perform well on these computer-based tests. The researcher had observed students taking the STAR…

  10. WormAssay: A Novel Computer Application for Whole-Plate Motion-based Screening of Macroscopic Parasites

    PubMed Central

    Marcellino, Chris; Gut, Jiri; Lim, K. C.; Singh, Rahul; McKerrow, James; Sakanari, Judy

    2012-01-01

    Lymphatic filariasis is caused by filarial nematode parasites, including Brugia malayi. Adult worms live in the lymphatic system and cause a strong immune reaction that leads to the obstruction of lymph vessels and swelling of the extremities. Chronic disease leads to the painful and disfiguring condition known as elephantiasis. Current drug therapy is effective against the microfilariae (larval stage) of the parasite, but no drugs are effective against the adult worms. One of the major stumbling blocks toward developing effective macrofilaricides to kill the adult worms is the lack of a high throughput screening method for candidate drugs. Current methods utilize systems that measure one well at a time and are time consuming and often expensive. We have developed a low-cost and simple visual imaging system to automate and quantify screening entire plates based on parasite movement. This system can be applied to the study of many macroparasites as well as other macroscopic organisms. PMID:22303493
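
    As an illustration of motion-based whole-plate scoring (a minimal sketch, not the published WormAssay implementation, which provides its own algorithms), per-well movement can be quantified by frame differencing over a fixed well grid; OpenCV and an assumed 8x12 plate layout are used here.

    ```python
    # Minimal sketch: quantify per-well motion in a whole-plate video by frame
    # differencing. Assumes OpenCV and a fixed 8x12 well grid (an assumption,
    # not the WormAssay algorithm itself).
    import cv2
    import numpy as np

    def well_motion_scores(frame_a, frame_b, rows=8, cols=12):
        """Return a (rows, cols) array of mean absolute pixel change per well."""
        gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY).astype(np.float32)
        gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY).astype(np.float32)
        diff = np.abs(gray_a - gray_b)
        h, w = diff.shape
        scores = np.zeros((rows, cols))
        for r in range(rows):
            for c in range(cols):
                cell = diff[r * h // rows:(r + 1) * h // rows,
                            c * w // cols:(c + 1) * w // cols]
                scores[r, c] = cell.mean()   # high value -> moving (live) worms
        return scores
    ```

    Wells whose score stays near the camera-noise floor over the recording would be flagged as immotile, i.e. as candidate hits for a macrofilaricide.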

  11. 2D Wavefront Sensor Analysis and Control

    1996-02-19

    This software is designed for data acquisition and analysis of two-dimensional wavefront sensors. The software includes data acquisition and control functions for an EPIX frame grabber to acquire data from a computer and all the appropriate analysis functions necessary to produce and display intensity and phase information. This software is written in Visual Basic for Windows.

  12. The Harvard Clean Energy Project. Large-scale computational screening and design of molecular motifs for organic photovoltaics on the World Community Grid

    NASA Astrophysics Data System (ADS)

    Hachmann, Johannes; Olivares-Amaya, Roberto; Atahan-Evrenk, Sule; Amador-Bedolla, Carlos; Aspuru-Guzik, Alan

    2011-03-01

    Organic solar cells are one of the promising approaches to ubiquitously establishing renewable energy sources; alas, the necessary 10% energy conversion efficiency remains elusive. We present the Harvard Clean Energy Project (CEP, http://cleanenergy.harvard.edu), which is concerned with the screening and design of organic photovoltaics (and organic electronics in general) by means of first-principles computational quantum chemistry. We use modern DFT to assess the quality of candidate structures and systematically improve upon these based on the gathered understanding of structure-property relations. The CEP is a high-throughput investigation which utilizes the massive computational resource of the IBM World Community Grid, which allows us to characterize millions of molecules of interest in the course of the next year. We address the combinatorial generation of our molecular library, our database, workflow organization and automation, data calibration and cheminformatics analysis, and the closure of the development cycle provided by our experimental collaborators.

  13. ELRIS2D: A MATLAB Package for the 2D Inversion of DC Resistivity/IP Data

    NASA Astrophysics Data System (ADS)

    Akca, Irfan

    2016-04-01

    ELRIS2D is an open source code written in MATLAB for the two-dimensional inversion of direct current resistivity (DCR) and time domain induced polarization (IP) data. The user interface of the program is designed for functionality and ease of use. All available settings of the program can be reached from the main window. The subsurface is discretized using a hybrid mesh generated by the combination of structured and unstructured meshes, which reduces the computational cost of the whole inversion procedure. The inversion routine is based on the smoothness constrained least squares method. In order to verify the program, responses of two test models and field data sets were inverted. The models inverted from the synthetic data sets are consistent with the original test models in both DC resistivity and IP cases. A field data set acquired in an archaeological site is also used for the verification of outcomes of the program in comparison with the excavation results.
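
    The smoothness-constrained least-squares step described above can be summarized as one model update per iteration. The sketch below is illustrative only (it is not extracted from the ELRIS2D MATLAB code); J, C and lam denote the Jacobian, a roughness operator and the regularization weight, all assumed to be supplied by the forward-modeling part of the program.

    ```python
    # Illustrative sketch of one smoothness-constrained least-squares update,
    # as used in many DC-resistivity/IP inversions (not the ELRIS2D source).
    import numpy as np

    def smoothness_constrained_update(m, d_obs, d_pred, J, C, lam):
        """Return an updated model vector for one inversion iteration."""
        A = J.T @ J + lam * (C.T @ C)                  # normal equations + roughness penalty
        rhs = J.T @ (d_obs - d_pred) - lam * (C.T @ C) @ m
        dm = np.linalg.solve(A, rhs)                   # model perturbation
        return m + dm
    ```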

  14. Generates 2D Input for DYNA NIKE & TOPAZ

    SciTech Connect

    Hallquist, J. O.; Sanford, Larry

    1996-07-15

    MAZE is an interactive program that serves as an input and two-dimensional mesh generator for DYNA2D, NIKE2D, TOPAZ2D, and CHEMICAL TOPAZ2D. MAZE also generates a basic template for ISLAND input. MAZE has been applied to the generation of input data to study the response of two-dimensional solids and structures undergoing finite deformations under a wide variety of large deformation transient dynamic and static problems and heat transfer analyses.

  15. MAZE96. Generates 2D Input for DYNA NIKE & TOPAZ

    SciTech Connect

    Sanford, L.; Hallquist, J.O.

    1992-02-24

    MAZE is an interactive program that serves as an input and two-dimensional mesh generator for DYNA2D, NIKE2D, TOPAZ2D, and CHEMICAL TOPAZ2D. MAZE also generates a basic template for ISLAND input. MAZE has been applied to the generation of input data to study the response of two-dimensional solids and structures undergoing finite deformations under a wide variety of large deformation transient dynamic and static problems and heat transfer analyses.

  16. Growth of Large and Highly Ordered 2D Crystals of a K+ Channel, Structural Role of Lipidic Environment

    PubMed Central

    De Zorzi, Rita; Nicholson, William V.; Guigner, Jean-Michel; Erne-Brand, Françoise; Vénien-Bryan, Catherine

    2013-01-01

    2D crystallography has proven to be an excellent technique to determine the 3D structure of membrane proteins. Compared to 3D crystallography, it has the advantage of visualizing the protein in an environment closer to the native one. However, producing good 2D crystals is still a challenge and little statistical knowledge can be gained from literature. Here, we present a thorough screening of 2D crystallization conditions for a prokaryotic inwardly rectifying potassium channel (>130 different conditions). Key parameters leading to very large and well-organized 2D crystals are discussed. In addition, the problem of formation of multilayers during the growth of 2D crystals is also addressed. An intermediate resolution projection map of KirBac3.1 at 6 Å is presented, which sheds (to our knowledge) new light on the structure of this channel in a lipid environment. PMID:23870261

  17. 2d PDE Linear Symmetric Matrix Solver

    1983-10-01

    ICCG2 (Incomplete Cholesky factorized Conjugate Gradient algorithm for 2d symmetric problems) was developed to solve a linear symmetric matrix system arising from a 9-point discretization of two-dimensional elliptic and parabolic partial differential equations found in plasma physics applications, such as resistive MHD, spatial diffusive transport, and phase space transport (Fokker-Planck equation) problems. These problems share the common feature of being stiff and requiring implicit solution techniques. When these parabolic or elliptic PDEs are discretized with finite-difference or finite-element methods, the resulting matrix system is frequently of block-tridiagonal form. To use ICCG2, the discretization of the two-dimensional partial differential equation and its boundary conditions must result in a block-tridiagonal supermatrix composed of elementary tridiagonal matrices. The incomplete Cholesky conjugate gradient algorithm is used to solve the linear symmetric matrix equation. Loops are arranged to vectorize on the Cray1 with the CFT compiler, wherever possible. Recursive loops, which cannot be vectorized, are written for optimum scalar speed. For matrices lacking symmetry, ILUCG2 should be used. Similar methods in three dimensions are available in ICCG3 and ILUCG3. A general source containing extensions and macros, which must be processed by a pre-compiler to obtain the standard FORTRAN source, is provided along with the standard FORTRAN source because it is believed to be more readable. The pre-compiler is not included, but pre-compilation may be performed by a text editor as described in the UCRL-88746 Preprint.
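
    For readers unfamiliar with the method, the sketch below illustrates the same idea, an incomplete-factorization-preconditioned conjugate gradient solve of a sparse symmetric system, using SciPy on a small 2-D model problem. It is a conceptual stand-in, not a port of the vectorized FORTRAN code, and SciPy's spilu is used here in place of a true incomplete Cholesky factorization.

    ```python
    # Conceptual sketch only: preconditioned conjugate gradients on a sparse
    # symmetric system built from a 2-D model Laplacian.
    import numpy as np
    import scipy.sparse as sp
    import scipy.sparse.linalg as spla

    n = 64                                    # 64 x 64 grid
    T = sp.diags([-1, 2, -1], [-1, 0, 1], (n, n))
    A = sp.kronsum(T, T).tocsc()              # SPD 2-D model operator
    b = np.ones(A.shape[0])

    ilu = spla.spilu(A, drop_tol=1e-4)        # incomplete factorization
    M = spla.LinearOperator(A.shape, matvec=ilu.solve)
    x, info = spla.cg(A, b, M=M)
    print("converged" if info == 0 else f"cg returned {info}")
    ```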

  18. 2d PDE Linear Asymmetric Matrix Solver

    1983-10-01

    ILUCG2 (Incomplete LU factorized Conjugate Gradient algorithm for 2d problems) was developed to solve a linear asymmetric matrix system arising from a 9-point discretization of two-dimensional elliptic and parabolic partial differential equations found in plasma physics applications, such as plasma diffusion, equilibria, and phase space transport (Fokker-Planck equation) problems. These equations share the common feature of being stiff and requiring implicit solution techniques. When these parabolic or elliptic PDEs are discretized with finite-difference or finite-element methods, the resulting matrix system is frequently of block-tridiagonal form. To use ILUCG2, the discretization of the two-dimensional partial differential equation and its boundary conditions must result in a block-tridiagonal supermatrix composed of elementary tridiagonal matrices. A generalization of the incomplete Cholesky conjugate gradient algorithm is used to solve the matrix equation. Loops are arranged to vectorize on the Cray1 with the CFT compiler, wherever possible. Recursive loops, which cannot be vectorized, are written for optimum scalar speed. For problems having a symmetric matrix ICCG2 should be used since it runs up to four times faster and uses approximately 30% less storage. Similar methods in three dimensions are available in ICCG3 and ILUCG3. A general source, containing extensions and macros, which must be processed by a pre-compiler to obtain the standard FORTRAN source, is provided along with the standard FORTRAN source because it is believed to be more readable. The pre-compiler is not included, but pre-compilation may be performed by a text editor as described in the UCRL-88746 Preprint.

  19. Position control using 2D-to-2D feature correspondences in vision guided cell micromanipulation.

    PubMed

    Zhang, Yanliang; Han, Mingli; Shee, Cheng Yap; Ang, Wei Tech

    2007-01-01

    Conventional camera calibration that utilizes the extrinsic and intrinsic parameters of the camera and the objects has certain limitations for micro-level cell operations due to the presence of hardware deviations and external disturbances during the experimental process, thereby invalidating the extrinsic parameters. This invalidation is often neglected in macro-world visual servoing and affects the visual image processing quality, causing deviation from the desired position in micro-level cell operations. To increase the success rate of vision-guided biological micromanipulations, a novel algorithm monitoring the changing image pattern of the manipulators, including the injection micropipette and cell holder, is designed and implemented based on two-dimensional (2D)-to-2D feature correspondences; it can adjust the manipulator and perform position control simultaneously. When any deviation is found, the manipulator is retracted to the initial focusing plane before continuing the operation.

  20. Electron dynamics and valley relaxation in 2D semiconductors

    NASA Astrophysics Data System (ADS)

    Gundogdu, Kenan

    2015-03-01

    Single layer transition metal dichalcogenides are 2D semiconducting systems with a unique electronic band structure. Two-valley energy bands along with strong spin-orbit coupling lead to valley-dependent carrier spin polarization, which is the basis for recently proposed valleytronic applications. Since the duration of the valley population provides the time window in which valley-specific processes take place, it is an essential parameter for developing valleytronic devices. These systems also exhibit unusually strong many-body effects, such as strong exciton and trion binding, due to reduced dielectric screening of Coulomb interactions. But not much is known about the impact of strong many-particle correlations on spin and valley polarization dynamics. Here we report direct measurements of ultrafast valley-specific relaxation dynamics in single layer MoS2 and WS2. We found that excitonic many-body interactions significantly contribute to the relaxation process. Biexciton formation reveals the hole valley spin relaxation time. Our results also suggest that initial fast intervalley electron scattering and electron spin relaxation lead to loss of electron valley polarization, which then facilitates hole valley relaxation via excitonic spin-exchange interaction.

  1. A Planar Quantum Transistor Based on 2D-2D Tunneling in Double Quantum Well Heterostructures

    SciTech Connect

    Baca, W.E.; Blount, M.A.; Hafich, M.J.; Lyo, S.K.; Moon, J.S.; Reno, J.L.; Simmons, J.A.; Wendt, J.R.

    1998-12-14

    We report on our work on the double electron layer tunneling transistor (DELTT), based on the gate-control of two-dimensional -- two-dimensional (2D-2D) tunneling in a double quantum well heterostructure. While previous quantum transistors have typically required tiny laterally-defined features, by contrast the DELTT is entirely planar and can be reliably fabricated in large numbers. We use a novel epoxy-bond-and-stop-etch (EBASE) flip-chip process, whereby submicron gating on opposite sides of semiconductor epitaxial layers as thin as 0.24 microns can be achieved. Because both electron layers in the DELTT are 2D, the resonant tunneling features are unusually sharp, and can be easily modulated with one or more surface gates. We demonstrate DELTTs with peak-to-valley ratios in the source-drain I-V curve of order 20:1 below 1 K. Both the height and position of the resonant current peak can be controlled by gate voltage over a wide range. DELTTs with larger subband energy offsets (approximately 21 meV) exhibit characteristics that are nearly as good at 77 K, in good agreement with our theoretical calculations. Using these devices, we also demonstrate bistable memories operating at 77 K. Finally, we briefly discuss the prospects for room temperature operation, increases in gain, and high-speed operation.

  2. 'Brukin2D': a 2D visualization and comparison tool for LC-MS data

    PubMed Central

    Tsagkrasoulis, Dimosthenis; Zerefos, Panagiotis; Loudos, George; Vlahou, Antonia; Baumann, Marc; Kossida, Sophia

    2009-01-01

    Background: Liquid Chromatography-Mass Spectrometry (LC-MS) is a commonly used technique to resolve complex protein mixtures. The visualization of the large data sets produced by LC-MS, namely the chromatogram and the mass spectra corresponding to its compounds, is the focus of this work. Results: The in-house developed 'Brukin2D' software, built in Matlab 7.4, which is presented here, uses the compound data that are exported from the Bruker 'DataAnalysis' program and depicts the mean mass spectra of all the chromatogram compounds from one LC-MS run in one 2D contour/density plot. Two contour plots from different chromatograph runs can then be viewed in the same window and automatically compared, in order to find their similarities and differences. The results of the comparison can be examined through detailed mass quantification tables, while chromatogram compound statistics are also calculated during the procedure. Conclusion: 'Brukin2D' provides a user-friendly platform for a quick, easy and integrated view of complex LC-MS data. The software is available at . PMID:19534737
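
    A minimal sketch of the same kind of 2D contour/density view (this is not Brukin2D, which is a MATLAB tool; the file name and column names below are assumptions about an exported compound table):

    ```python
    # Illustrative sketch: render exported LC-MS compound data
    # (retention time, m/z, intensity) as a 2D contour/density map.
    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    compounds = pd.read_csv("run1_compounds.csv")     # hypothetical export file
    rt, mz, inten = compounds["rt_min"], compounds["mz"], compounds["intensity"]

    # Bin the scattered compound list onto a regular retention-time x m/z grid.
    grid, rt_edges, mz_edges = np.histogram2d(rt, mz, bins=(200, 400), weights=inten)
    plt.contourf(rt_edges[:-1], mz_edges[:-1], np.log1p(grid).T, levels=30, cmap="viridis")
    plt.xlabel("retention time (min)"); plt.ylabel("m/z")
    plt.title("LC-MS run, binned intensity (log scale)")
    plt.show()
    ```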

  3. Inhibition of human cytochrome P450 2D6 (CYP2D6) by methadone.

    PubMed Central

    Wu, D; Otton, S V; Sproule, B A; Busto, U; Inaba, T; Kalow, W; Sellers, E M

    1993-01-01

    1. In microsomes prepared from three human livers, methadone competitively inhibited the O-demethylation of dextromethorphan, a marker substrate for CYP2D6. The apparent Ki value of methadone ranged from 2.5 to 5 microM. 2. Two hundred and fifty-two (252) white Caucasians, including 210 unrelated healthy volunteers and 42 opiate abusers undergoing treatment with methadone were phenotyped using dextromethorphan as the marker drug. Although the frequency of poor metabolizers was similar in both groups, the extensive metabolizers among the opiate abusers tended to have higher O-demethylation metabolic ratios and to excrete less of the dose as dextromethorphan metabolites than control extensive metabolizer subjects. These data suggest inhibition of CYP2D6 by methadone in vivo as well. 3. Because methadone is widely used in the treatment of opiate abuse, inhibition of CYP2D6 activity in these patients might contribute to exaggerated response or unexpected toxicity from drugs that are substrates of this enzyme. PMID:8448065

  4. An assessment of a modern touch-screen tablet computer with reference to core physical characteristics necessary for clinical vision testing.

    PubMed

    Aslam, Tariq M; Murray, Ian J; Lai, Michael Y T; Linton, Emma; Tahir, Humza J; Parry, Neil R A

    2013-07-01

    There are a multitude of applications using modern tablet computers for vision testing that are accessible to ophthalmology patients. While these may be of potential future benefit, they are often unsupported by scientific assessment. This report investigates the pertinent physical characteristics of one of the most common, highest-specification tablet computers with regard to its capacity for vision testing. We demonstrate through plotting of a gamma curve that it is feasible to produce a precise programmable range of central luminance levels on the device, even with varying background luminance levels. It may not be possible to display very low levels of contrast, but careful use of the gamma curve information allows a reasonable range of contrast sensitivity to be tested. When the screen is first powered on, it may require up to 15 min for the luminance values to stabilize. Finally, the luminance of objects varies towards the edge of the screen and when viewed at an angle. However, the resulting effective contrast of objects is less variable. Details of our assessments are important to developers, users and prescribers of tablet clinical vision tests. Without awareness of such findings, these tests may never reach satisfactory levels of clinical validity and reliability.
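
    A minimal sketch of how a measured gamma curve is used to program luminance: fit L = L_black + (L_max - L_black)(v/v_max)^gamma to photometer readings, then invert the fit to find the drive value for a requested luminance. The numbers below are placeholder example readings, not measurements from the cited tablet.

    ```python
    # Fit a display gamma curve to example photometer data and invert it.
    import numpy as np
    from scipy.optimize import curve_fit

    v = np.array([0, 32, 64, 96, 128, 160, 192, 224, 255], dtype=float)   # drive values
    L = np.array([0.4, 2.1, 7.8, 17.5, 32.0, 52.0, 78.0, 110.0, 148.0])   # cd/m^2 (example)

    def gamma_model(v, L_black, L_max, g):
        return L_black + (L_max - L_black) * (v / 255.0) ** g

    (p_black, p_max, p_g), _ = curve_fit(gamma_model, v, L, p0=[0.5, 150.0, 2.2])

    def drive_for_luminance(target):
        """Invert the fitted curve to get the 8-bit drive value for a target luminance."""
        frac = (target - p_black) / (p_max - p_black)
        return 255.0 * np.clip(frac, 0.0, 1.0) ** (1.0 / p_g)

    print(round(drive_for_luminance(60.0)))   # drive level needed for 60 cd/m^2
    ```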

  5. Presence of activatable Shiga toxin genotype (stx(2d)) in Shiga toxigenic Escherichia coli from livestock sources.

    PubMed

    Gobius, Kari S; Higgs, Glen M; Desmarchelier, Patricia M

    2003-08-01

    Stx2d is a recently described Shiga toxin whose cytotoxicity is activated 10- to 1000-fold by the elastase present in mouse or human intestinal mucus. We examined Shiga toxigenic Escherichia coli (STEC) strains isolated from food and livestock sources for the presence of activatable stx(2d). The stx(2) operons of STEC were first analyzed by PCR-restriction fragment length polymorphism (RFLP) analysis and categorized as stx(2), stx(2c vha), stx(2c vhb), or stx(2d EH250). Subsequently, the stx(2c vha) and stx(2c vhb) operons were screened for the absence of a PstI site in the stx(2A) subunit gene, a restriction site polymorphism which is a predictive indicator for the stx(2d) (activatable) genotype. Twelve STEC isolates carrying putative stx(2d) operons were identified, and nucleotide sequencing was used to confirm the identification of these operons as stx(2d). The complete nucleotide sequences of seven representative stx(2d) operons were determined. Shiga toxin expression in stx(2d) isolates was confirmed by immunoblotting. stx(2d) isolates were induced for the production of bacteriophages carrying stx. Two isolates were able to produce bacteriophages phi1662a and phi1720a carrying the stx(2d) operons. RFLP analysis of bacteriophage genomic DNA revealed that phi1662a and phi1720a were highly related to each other; however, the DNA sequences of these two stx(2d) operons were distinct. The STEC strains carrying these operons were isolated from retail ground beef. Surveillance for STEC strains expressing activatable Stx2d Shiga toxin among clinical cases may indicate the significance of this toxin subtype to human health.

  6. Calculating tissue shear modulus and pressure by 2D Log-Elastographic methods

    PubMed Central

    McLaughlin, Joyce R; Zhang, Ning; Manduca, Armando

    2010-01-01

    Shear modulus imaging, often called elastography, enables detection and characterization of tissue abnormalities. In this paper the data are two displacement components obtained from successive MR or ultrasound data sets acquired while the tissue is excited mechanically. A 2D plane-strain elastic model is assumed to govern the 2D displacement, u. The shear modulus, μ, is unknown, and whether or not the first Lamé parameter, λ, is known, the pressure p = λ∇ · u, which is present in the plane-strain model, cannot be measured, is unreliably computed from measured data, and can be shown to be an order-one quantity in units of kPa. So here we present a 2D Log-Elastographic inverse algorithm that: (1) simultaneously reconstructs the shear modulus, μ, and p, which together satisfy a first-order partial differential equation system, with the goal of imaging μ; (2) controls potential exponential growth in the numerical error; and (3) reliably reconstructs the quantity p in the inverse algorithm as compared to the same quantity computed with a forward algorithm. This work generalizes the Log-Elastographic algorithm in [20], which uses one displacement component, is derived assuming the component satisfies the wave equation, and is tested on synthetic data computed with the wave equation model. The 2D Log-Elastographic algorithm is tested on 2D synthetic data and 2D in-vivo data from the Mayo Clinic. We also exhibit examples to show that the 2D Log-Elastographic algorithm improves the quality of the recovered images as compared to the Log-Elastographic and Direct Inversion algorithms. PMID:21822349

  7. Correlated Electron Phenomena in 2D Materials

    NASA Astrophysics Data System (ADS)

    Lambert, Joseph G.

    In this thesis, I present experimental results on coherent electron phenomena in layered two-dimensional materials: single layer graphene and van der Waals coupled 2D TiSe2. Graphene is a two-dimensional single-atom thick sheet of carbon atoms first derived from bulk graphite by the mechanical exfoliation technique in 2004. Low-energy charge carriers in graphene behave like massless Dirac fermions, and their density can be easily tuned between electron-rich and hole-rich quasiparticles with electrostatic gating techniques. The sharp interfaces between regions of different carrier densities form barriers with selective transmission, making them behave as partially reflecting mirrors. When two of these interfaces are set at a separation distance within the phase coherence length of the carriers, they form an electronic version of a Fabry-Perot cavity. I present measurements and analysis of multiple Fabry-Perot modes in graphene with parallel electrodes spaced a few hundred nanometers apart. Transition metal dichalcogenide (TMD) TiSe2 is part of the family of materials that coined the term "materials beyond graphene". It contains van der Waals coupled trilayer stacks of Se-Ti-Se. Many TMD materials exhibit a host of interesting correlated electronic phases. In particular, TiSe2 exhibits chiral charge density waves (CDW) below TCDW ˜ 200 K. Upon doping with copper, the CDW state gets suppressed with Cu concentration, and CuxTiSe2 becomes superconducting with critical temperature of T c = 4.15 K. There is still much debate over the mechanisms governing the coexistence of the two correlated electronic phases---CDW and superconductivity. I will present some of the first conductance spectroscopy measurements of proximity coupled superconductor-CDW systems. Measurements reveal a proximity-induced critical current at the Nb-TiSe2 interfaces, suggesting pair correlations in the pure TiSe2. The results indicate that superconducting order is present concurrently with CDW in

  8. Computer aided screening of potent inhibitor compounds against inhibitor resistant TEM β-lactamase mutants from traditional Chinese medicine

    PubMed Central

    Zhu, Qifeng; Yin, Yanxia; Liu, Hanjie; Tian, Jinhong

    2014-01-01

    Inhibitor-resistant TEM (IRT) type β-lactamase mutations are widely known. Therefore, it is of interest to identify new yet improved leads against IRT from traditional Chinese medicine. Hence, we screened more than 10,000 compounds from Chinese medicine (tcm@taiwan database) against molecular models of the IRT mutants through docking techniques. This exercise identified the compounds caffeic acid, curcumin, salvianolic acid E, ferulic acid and p-coumaric acid with high binding scores against the mutants. This was further validated in vitro, where salvianolic acid E combined with cefoperazone and sulbactam effectively inhibits the R244S mutant. PMID:25670878

  9. FRANC2D: A two-dimensional crack propagation simulator. Version 2.7: User's guide

    NASA Technical Reports Server (NTRS)

    Wawrzynek, Paul; Ingraffea, Anthony

    1994-01-01

    FRANC 2D (FRacture ANalysis Code, 2 Dimensions) is a menu driven, interactive finite element computer code that performs fracture mechanics analyses of 2-D structures. The code has an automatic mesh generator for triangular and quadrilateral elements. FRANC2D calculates the stress intensity factor using linear elastic fracture mechanics and evaluates crack extension using several methods that may be selected by the user. The code features a mesh refinement and adaptive mesh generation capability that is automatically developed according to the predicted crack extension direction and length. The code also has unique features that permit the analysis of layered structures with load transfer through simulated mechanical fasteners or bonded joints. The code was written for UNIX workstations with X-windows graphics and may be executed on the following computers: DEC DecStation 3000 and 5000 series, IBM RS/6000 series, Hewlett-Packard 9000/700 series, SUN Sparc stations, and most Silicon Graphics models.

  10. Single-scan 2D NMR: An Emerging Tool in Analytical Spectroscopy

    PubMed Central

    Giraudeau, Patrick; Frydman, Lucio

    2016-01-01

    Two-dimensional Nuclear Magnetic Resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan manner. A decade ago a so-called "ultrafast" (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or hetero-nuclear correlations in a single scan. During the intervening years the performance of this sub-second 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool witnessing an expanded scope of applications. The present review summarizes the principles and the main developments which have contributed to the success of this approach, and focuses on applications which have been recently demonstrated in various areas of analytical chemistry, from the real-time monitoring of chemical and biochemical processes to extensions in hyphenated techniques and in quantitative applications. PMID:25014342

  11. CYP2D6 polymorphism and mental and personality disorders in suicide attempters.

    PubMed

    Blasco-Fontecilla, Hilario; Peñas-Lledó, Eva; Vaquero-Lorenzo, Concepción; Dorado, Pedro; Saiz-Ruiz, Jerónimo; Llerena, Adrián; Baca-García, Enrique

    2014-12-01

    Prior studies on the association between the CYP2D6 polymorphism and suicide did not explore whether mental and personality disorders mediate this association. The main objective of the present study was to test an association between the CYP2D6 polymorphism and mental and personality disorders among suicide attempters. The MINI and the DSM-IV version of the International Personality Disorder Examination Screening Questionnaire were used to diagnose mental and personality disorders, respectively, in 342 suicide attempters. Suicide attempters were divided into groups according to their number of active CYP2D6 genes (zero, one, and two or more). Differences in mental and personality disorders across the groups were measured using linear-by-linear association, the chi-square test, and 95% confidence intervals. Suicide attempters carrying two or more active CYP2D6 genes were more likely to be diagnosed with at least one personality disorder than those with one or zero active CYP2D6 genes.

  12. CYP2D7 Sequence Variation Interferes with TaqMan CYP2D6*15 and *35 Genotyping

    PubMed Central

    Riffel, Amanda K.; Dehghani, Mehdi; Hartshorne, Toinette; Floyd, Kristen C.; Leeder, J. Steven; Rosenblatt, Kevin P.; Gaedigk, Andrea

    2016-01-01

    TaqMan™ genotyping assays are widely used to genotype CYP2D6, which encodes a major drug metabolizing enzyme. Assay design for CYP2D6 can be challenging owing to the presence of two pseudogenes, CYP2D7 and CYP2D8, structural and copy number variation and numerous single nucleotide polymorphisms (SNPs) some of which reflect the wild-type sequence of the CYP2D7 pseudogene. The aim of this study was to identify the mechanism causing false-positive CYP2D6*15 calls and remediate those by redesigning and validating alternative TaqMan genotype assays. Among 13,866 DNA samples genotyped by the CompanionDx® lab on the OpenArray platform, 70 samples were identified as heterozygotes for 137Tins, the key SNP of CYP2D6*15. However, only 15 samples were confirmed when tested with the Luminex xTAG CYP2D6 Kit and sequencing of CYP2D6-specific long range (XL)-PCR products. Genotype and gene resequencing of CYP2D6 and CYP2D7-specific XL-PCR products revealed a CC>GT dinucleotide SNP in exon 1 of CYP2D7 that reverts the sequence to CYP2D6 and allows a TaqMan assay PCR primer to bind. Because CYP2D7 also carries a Tins, a false-positive mutation signal is generated. This CYP2D7 SNP was also responsible for generating false-positive signals for rs769258 (CYP2D6*35) which is also located in exon 1. Although alternative CYP2D6*15 and *35 assays resolved the issue, we discovered a novel CYP2D6*15 subvariant in one sample that carries additional SNPs preventing detection with the alternate assay. The frequency of CYP2D6*15 was 0.1% in this ethnically diverse U.S. population sample. In addition, we also discovered linkage between the CYP2D7 CC>GT dinucleotide SNP and the 77G>A (rs28371696) SNP of CYP2D6*43. The frequency of this tentatively functional allele was 0.2%. Taken together, these findings emphasize that regardless of how careful genotyping assays are designed and evaluated before being commercially marketed, rare or unknown SNPs underneath primer and/or probe regions can impact

  13. CYP2D7 Sequence Variation Interferes with TaqMan CYP2D6 (*) 15 and (*) 35 Genotyping.

    PubMed

    Riffel, Amanda K; Dehghani, Mehdi; Hartshorne, Toinette; Floyd, Kristen C; Leeder, J Steven; Rosenblatt, Kevin P; Gaedigk, Andrea

    2015-01-01

    TaqMan™ genotyping assays are widely used to genotype CYP2D6, which encodes a major drug metabolizing enzyme. Assay design for CYP2D6 can be challenging owing to the presence of two pseudogenes, CYP2D7 and CYP2D8, structural and copy number variation and numerous single nucleotide polymorphisms (SNPs) some of which reflect the wild-type sequence of the CYP2D7 pseudogene. The aim of this study was to identify the mechanism causing false-positive CYP2D6 (*) 15 calls and remediate those by redesigning and validating alternative TaqMan genotype assays. Among 13,866 DNA samples genotyped by the CompanionDx® lab on the OpenArray platform, 70 samples were identified as heterozygotes for 137Tins, the key SNP of CYP2D6 (*) 15. However, only 15 samples were confirmed when tested with the Luminex xTAG CYP2D6 Kit and sequencing of CYP2D6-specific long range (XL)-PCR products. Genotype and gene resequencing of CYP2D6 and CYP2D7-specific XL-PCR products revealed a CC>GT dinucleotide SNP in exon 1 of CYP2D7 that reverts the sequence to CYP2D6 and allows a TaqMan assay PCR primer to bind. Because CYP2D7 also carries a Tins, a false-positive mutation signal is generated. This CYP2D7 SNP was also responsible for generating false-positive signals for rs769258 (CYP2D6 (*) 35) which is also located in exon 1. Although alternative CYP2D6 (*) 15 and (*) 35 assays resolved the issue, we discovered a novel CYP2D6 (*) 15 subvariant in one sample that carries additional SNPs preventing detection with the alternate assay. The frequency of CYP2D6 (*) 15 was 0.1% in this ethnically diverse U.S. population sample. In addition, we also discovered linkage between the CYP2D7 CC>GT dinucleotide SNP and the 77G>A (rs28371696) SNP of CYP2D6 (*) 43. The frequency of this tentatively functional allele was 0.2%. Taken together, these findings emphasize that regardless of how careful genotyping assays are designed and evaluated before being commercially marketed, rare or unknown SNPs underneath primer

  14. Mesophases in nearly 2D room-temperature ionic liquids.

    PubMed

    Manini, N; Cesaratto, M; Del Pópolo, M G; Ballone, P

    2009-11-26

    Computer simulations of (i) a [C(12)mim][Tf(2)N] film of nanometric thickness squeezed at kbar pressure by a piecewise parabolic confining potential reveal a mesoscopic in-plane density and composition modulation reminiscent of mesophases seen in 3D samples of the same room-temperature ionic liquid (RTIL). Near 2D confinement, enforced by a high normal load, as well as relatively long aliphatic chains are strictly required for the mesophase formation, as confirmed by computations for two related systems made of (ii) the same [C(12)mim][Tf(2)N] adsorbed at a neutral solid surface and (iii) a shorter-chain RTIL ([C(4)mim][Tf(2)N]) trapped in the potential well of part i. No in-plane modulation is seen for ii and iii. In case ii, the optimal arrangement of charge and neutral tails is achieved by layering parallel to the surface, while, in case iii, weaker dispersion and packing interactions are unable to bring aliphatic tails together into mesoscopic islands, against overwhelming entropy and Coulomb forces. The onset of in-plane mesophases could greatly affect the properties of long-chain RTILs used as lubricants. PMID:19886615

  15. Measurement of astrophysical S factors and electron screening potentials for the d(d, n)3He reaction in ZrD2, TiD2, D2O, and CD2 targets in the ultralow energy region using plasma accelerators

    SciTech Connect

    Bystritsky, V. M.; Bystritskii, Vit. M.; Dudkin, G. N.; Filipowicz, M.; Gazi, S.; Huran, J.; Kobzev, A. P.; Mesyats, G. A.; Nechaev, B. A.; Padalko, V. N.; Parzhitskii, S. S.; Pen'kov, F. M.; Philippov, A. V.; Kaminskii, V. L.; Tuleushev, Yu. Zh.; Wozniak, J.

    2012-01-15

    The paper is devoted to studying the influence of the electron screening effect on the rate of the d(d, n)3He reaction in the ultralow deuteron collision energy range in deuterated polyethylene (CD2), frozen heavy water (D2O) and deuterated metals (ZrD2 and TiD2). The ZrD2 and TiD2 targets were fabricated via magnetron sputtering of titanium and zirconium in a gas (deuterium) environment. The experiments have been carried out using a high-current plasma pulsed accelerator with forming of an inverse Z pinch (HCEIRAS, Russia) and a pulsed Hall plasma accelerator (NPI at TPU, Russia). The detection of neutrons with energy of 2.5 MeV from the dd reaction was done with plastic scintillation spectrometers. As a result of the experiments, the energy dependences of the astrophysical S factor for the dd reaction in the deuteron collision energy range of 2-7 keV and the values of the electron screening potential Ue of the interacting deuterons have been measured for the targets indicated above: Ue(CD2) ≤ 40 eV; Ue(D2O) ≤ 26 eV; Ue(ZrD2) = 157 ± 43 eV; Ue(TiD2) = 125 ± 34 eV. The value of the astrophysical S factor corresponding to zero deuteron collision energy in the experiments with the D2O target is found to be Sb(0) = 58.6 ± 3.6 keV b. The paper compares our results with other available published experimental and calculated data.
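
    For context, analyses of this kind conventionally relate the screened and bare cross sections through the Sommerfeld parameter η(E); a standard form (our summary, not quoted from the paper) is

    ```latex
    \sigma_{\mathrm{screened}}(E) \;\simeq\; \sigma_{\mathrm{bare}}(E+U_e)
      \;\simeq\; \sigma_{\mathrm{bare}}(E)\,\exp\!\left(\frac{\pi\,\eta(E)\,U_e}{E}\right),
    \qquad
    \sigma(E) \;=\; \frac{S(E)}{E}\,e^{-2\pi\eta(E)}
    ```

    so that U_e is extracted from the low-energy enhancement of the measured yield over the bare-nucleus extrapolation of S(E).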

  16. 2D Quantum Transport Modeling in Nanoscale MOSFETs

    NASA Technical Reports Server (NTRS)

    Svizhenko, Alexei; Anantram, M. P.; Govindan, T. R.; Biegel, Bryan

    2001-01-01

    With the onset of quantum confinement in the inversion layer in nanoscale MOSFETs, behavior of the resonant level inevitably determines all device characteristics. While most classical device simulators take quantization into account in some simplified manner, the important details of electrostatics are missing. Our work addresses this shortcoming and provides: (a) a framework to quantitatively explore device physics issues such as the source-drain and gate leakage currents, DIBL, and threshold voltage shift due to quantization, and (b) a means of benchmarking quantum corrections to semiclassical models (such as density-gradient and quantum-corrected MEDICI). We have developed physical approximations and computer code capable of realistically simulating 2-D nanoscale transistors, using the non-equilibrium Green's function (NEGF) method. This is the most accurate full quantum model yet applied to 2-D device simulation. Open boundary conditions, oxide tunneling and phase-breaking scattering are treated on equal footing. Electrons in the ellipsoids of the conduction band are treated within the anisotropic effective mass approximation. Quantum simulations are focused on MIT 25, 50 and 90 nm "well-tempered" MOSFETs and compared to classical and quantum corrected models. The important feature of the quantum model is the smaller slope of the Id-Vg curve and consequently a higher threshold voltage. These results are quantitatively consistent with 1D Schroedinger-Poisson calculations. The effect of gate length on gate-oxide leakage and sub-threshold current has been studied. The shorter gate length device has an order of magnitude smaller current at zero gate bias than the longer gate length device without a significant trade-off in on-current. This should be a device design consideration.

  17. 2D Quantum Mechanical Study of Nanoscale MOSFETs

    NASA Technical Reports Server (NTRS)

    Svizhenko, Alexei; Anantram, M. P.; Govindan, T. R.; Biegel, B.; Kwak, Dochan (Technical Monitor)

    2000-01-01

    With the onset of quantum confinement in the inversion layer in nanoscale MOSFETs, behavior of the resonant level inevitably determines all device characteristics. While most classical device simulators take quantization into account in some simplified manner, the important details of electrostatics are missing. Our work addresses this shortcoming and provides: (a) a framework to quantitatively explore device physics issues such as the source-drain and gate leakage currents, DIBL, and threshold voltage shift due to quantization, and (b) a means of benchmarking quantum corrections to semiclassical models (such as density-gradient and quantum-corrected MEDICI). We have developed physical approximations and computer code capable of realistically simulating 2-D nanoscale transistors, using the non-equilibrium Green's function (NEGF) method. This is the most accurate full quantum model yet applied to 2-D device simulation. Open boundary conditions and oxide tunneling are treated on an equal footing. Electrons in the ellipsoids of the conduction band are treated within the anisotropic effective mass approximation. We present the results of our simulations of MIT 25, 50 and 90 nm "well-tempered" MOSFETs and compare them to those of classical and quantum corrected models. The important feature of the quantum model is the smaller slope of the Id-Vg curve and consequently a higher threshold voltage. Surprisingly, the self-consistent potential profile shows a lower injection barrier in the channel in the quantum case. These results are qualitatively consistent with 1D Schroedinger-Poisson calculations. The effect of gate length on gate-oxide leakage and subthreshold current has been studied. The shorter gate length device has an order of magnitude smaller current at zero gate bias than the longer gate length device without a significant trade-off in on-current. This should be a device design consideration.

  18. Evaluation of 2D shallow-water model for spillway flow with a complex geometry

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Although the two-dimensional (2D) shallow water model is formulated based on several assumptions, such as a hydrostatic pressure distribution and negligible vertical velocity, it has been used as a simple alternative to the complex 3D model to compute water flows in which these assumptions may be ...
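
    The 2D shallow-water model referred to here is commonly written in the conservative form below (a standard textbook formulation, not necessarily the exact one used in the study), with water depth h, depth-averaged velocities u and v, bed elevation z_b and friction source terms S_f:

    ```latex
    \begin{aligned}
    &\partial_t h + \partial_x (hu) + \partial_y (hv) = 0,\\
    &\partial_t (hu) + \partial_x\!\left(hu^2 + \tfrac{1}{2}g h^2\right) + \partial_y (huv)
        = -g h\,\partial_x z_b + S_{fx},\\
    &\partial_t (hv) + \partial_x (huv) + \partial_y\!\left(hv^2 + \tfrac{1}{2}g h^2\right)
        = -g h\,\partial_y z_b + S_{fy}.
    \end{aligned}
    ```

    The hydrostatic-pressure and negligible-vertical-velocity assumptions are exactly what permit the (1/2)gh^2 pressure term and the purely depth-averaged 2-D velocity field.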

  19. Parallel algorithms for 2-D cylindrical transport equations of Eigenvalue problem

    SciTech Connect

    Wei, J.; Yang, S.

    2013-07-01

    In this paper, aimed at the eigenvalue problem for the neutron transport equations in 2-D cylindrical geometry on an unstructured grid, a discrete scheme based on the Sn discrete-ordinates method and discontinuous finite elements is built, and parallel computation for the scheme is realized on MPI systems. Numerical experiments indicate that the designed parallel algorithm can reach perfect speedup; it has good practicality and scalability. (authors)

  20. Nano-scale electronic and optoelectronic devices based on 2D crystals

    NASA Astrophysics Data System (ADS)

    Zhu, Wenjuan

    In the last few years, the research community has shown rapidly growing interest in two-dimensional (2D) crystals and their applications. The properties of these 2D crystals are diverse -- ranging from semi-metals such as graphene and semiconductors such as MoS2 to insulators such as boron nitride. These 2D crystals have many unique properties compared to their bulk counterparts due to their reduced dimensionality and symmetry. A key difference is the band structures, which lead to distinct electronic and photonic properties. The 2D nature of the material also plays an important role in defining their exceptional properties of mechanical strength, surface sensitivity, thermal conductivity, tunable band-gap and their interaction with light. These unique properties of 2D crystals open up a broad territory of applications in computing, communication, energy, and medicine. In this talk, I will present our work on understanding the electrical properties of graphene and MoS2, in particular current transport and band-gap engineering in graphene, the interface between gate dielectrics and graphene, and gap states in MoS2. I will also present our work on nano-scale electronic devices (RF and logic devices) and photonic devices (plasmonic devices and photo-detectors) based on these 2D crystals.

  1. Seeing through the Screen: Is Evaluative Feedback Communicated More Effectively in Face-to-Face or Computer-Mediated Exchanges?

    ERIC Educational Resources Information Center

    Hebert, Brenda G.; Vorauer, Jacquie D.

    2003-01-01

    Describes a study of college students that examined how the use of computer mediated communication affected the transmission of performance and interpersonal appraisal information. Examined whether interpersonal judgments obtained through face-to-face communication resulted in greater positivity, but compromised accuracy, relative to…

  2. A real-time multi-scale 2D Gaussian filter based on FPGA

    NASA Astrophysics Data System (ADS)

    Luo, Haibo; Gai, Xingqin; Chang, Zheng; Hui, Bin

    2014-11-01

    Multi-scale 2-D Gaussian filters have been widely used in feature extraction (e.g. SIFT, edge detection), image segmentation, image enhancement, image noise removal, multi-scale shape description etc. However, their computational complexity remains an issue for real-time image processing systems. To address this problem, we propose a framework for a multi-scale 2-D Gaussian filter based on an FPGA in this paper. Firstly, a full-hardware architecture based on a parallel pipeline was designed to achieve a high throughput rate. Secondly, in order to save multipliers, the 2-D convolution is separated into two 1-D convolutions. Thirdly, a dedicated first-in first-out memory named CAFIFO (Column Addressing FIFO) was designed to avoid the error propagation induced by sparks on the clock. Finally, a shared memory framework was designed to reduce memory costs. As a demonstration, we realized a three-scale 2-D Gaussian filter on a single ALTERA Cyclone III FPGA chip. Experimental results show that the proposed framework can compute a multi-scale 2-D Gaussian filter within one pixel clock period and is thus suitable for real-time image processing. Moreover, the main principle can be extended to other convolution-based operators, such as the Gabor filter, the Sobel operator and so on.
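
    A minimal software sketch of the separability trick noted above: the 2-D Gaussian blur is computed as two 1-D convolutions (rows, then columns), which is what saves multipliers in the hardware design. This is plain NumPy/SciPy for illustration, not the FPGA pipeline.

    ```python
    # 2-D Gaussian blur implemented as two separable 1-D convolutions.
    import numpy as np
    from scipy.ndimage import convolve1d

    def gaussian_kernel_1d(sigma, radius=None):
        radius = radius or int(3 * sigma)
        x = np.arange(-radius, radius + 1)
        k = np.exp(-x**2 / (2.0 * sigma**2))
        return k / k.sum()

    def gaussian_blur_separable(image, sigma):
        k = gaussian_kernel_1d(sigma)
        tmp = convolve1d(image.astype(float), k, axis=1, mode="nearest")  # along rows
        return convolve1d(tmp, k, axis=0, mode="nearest")                 # along columns

    # Multi-scale usage: apply the same image at several sigmas.
    img = np.random.rand(240, 320)
    pyramid = [gaussian_blur_separable(img, s) for s in (1.0, 2.0, 4.0)]
    ```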

  3. PLAN2D - A PROGRAM FOR ELASTO-PLASTIC ANALYSIS OF PLANAR FRAMES

    NASA Technical Reports Server (NTRS)

    Lawrence, C.

    1994-01-01

    PLAN2D is a FORTRAN computer program for the plastic analysis of planar rigid frame structures. Given a structure and loading pattern as input, PLAN2D calculates the ultimate load that the structure can sustain before collapse. Element moments and plastic hinge rotations are calculated for the ultimate load. The location of hinges required for a collapse mechanism to form are also determined. The program proceeds in an iterative series of linear elastic analyses. After each iteration the resulting elastic moments in each member are compared to the reserve plastic moment capacity of that member. The member or members that have moments closest to their reserve capacity will determine the minimum load factor and the site where the next hinge is to be inserted. Next, hinges are inserted and the structural stiffness matrix is reformulated. This cycle is repeated until the structure becomes unstable. At this point the ultimate collapse load is calculated by accumulating the minimum load factor from each previous iteration and multiplying them by the original input loads. PLAN2D is based on the program STAN, originally written by Dr. E.L. Wilson at U.C. Berkeley. PLAN2D has several limitations: 1) Although PLAN2D will detect unloading of hinges it does not contain the capability to remove hinges; 2) PLAN2D does not allow the user to input different positive and negative moment capacities and 3) PLAN2D does not consider the interaction between axial and plastic moment capacity. Axial yielding and buckling is ignored as is the reduction in moment capacity due to axial load. PLAN2D is written in FORTRAN and is machine independent. It has been tested on an IBM PC and a DEC MicroVAX. The program was developed in 1988.

  4. Mechanical characterization of 2D, 2D stitched, and 3D braided/RTM materials

    NASA Technical Reports Server (NTRS)

    Deaton, Jerry W.; Kullerd, Susan M.; Portanova, Marc A.

    1993-01-01

    Braided composite materials have potential for application in aircraft structures. Fuselage frames, floor beams, wing spars, and stiffeners are examples where braided composites could find application if cost effective processing and damage tolerance requirements are met. Another important consideration for braided composites relates to their mechanical properties and how they compare to the properties of composites produced by other textile composite processes being proposed for these applications. Unfortunately, mechanical property data for braided composites do not appear extensively in the literature. Data are presented in this paper on the mechanical characterization of 2D triaxial braid, 2D triaxial braid plus stitching, and 3D (through-the-thickness) braid composite materials. The braided preforms all had the same graphite tow size and the same nominal braid architectures, (+/- 30 deg/0 deg), and were resin transfer molded (RTM) using the same mold for each of two different resin systems. Static data are presented for notched and unnotched tension, notched and unnotched compression, and compression after impact strengths at room temperature. In addition, some static results, after environmental conditioning, are included. Baseline tension and compression fatigue results are also presented, but only for the 3D braided composite material with one of the resin systems.

  5. A computational perspective of molecular interactions through virtual screening, pharmacokinetic and dynamic prediction on ribosome toxin A chain and inhibitors of Ricinus communis

    PubMed Central

    Kumar, R. Barani; Suresh, M. Xavier

    2012-01-01

    Background: Ricin is considered to be one of the most deadly toxins and has gained favor as a bioweapon that has a serious social and biological impact, due to its widespread nature and abundant availability. The hazardous effects of this toxin in human beings are seen in almost all parts of the organ system. The severe consequences of the toxin necessitate the development of potential inhibitors that can effectively block its interaction with the host system. Materials and Methods: In order to identify potential inhibitors that can effectively block ricin, we employed various computational approaches. In this work, we computationally screened and analyzed 66 analogs and further tested their ADME/T profiles. From the kinetic and toxicity studies we selected six analogs that possessed appropriate pharmacokinetic and dynamic properties. We have also performed computational docking of these analogs with the target. Results: On the basis of the dock scores and hydrogen bond interactions we have identified analog 64 as the best interacting molecule. Molecule 64 seems to have stable interactions with the residues Tyr80, Arg180, and Val81. The pharmacophore features that describe the key functional features of a molecule were also studied and presented. Conclusion: The pharmacophore features of the drugs provided suggest the key functional groups that can aid in the design and synthesis of more potential inhibitors. PMID:22224054

  6. Comparison of 2-D and 3-D estimates of placental volume in early pregnancy.

    PubMed

    Aye, Christina Y L; Stevenson, Gordon N; Impey, Lawrence; Collins, Sally L

    2015-03-01

    Ultrasound estimation of placental volume (PlaV) between 11 and 13 wk has been proposed as part of a screening test for small-for-gestational-age babies. A semi-automated 3-D technique, validated against the gold standard of manual delineation, has been found at this stage of gestation to predict small-for-gestational-age at term. Recently, when used in the third trimester, an estimate obtained using a 2-D technique was found to correlate with placental weight at delivery. Given its greater simplicity, the 2-D technique might be more useful as part of an early screening test. We investigated if the two techniques produced similar results when used in the first trimester. The correlation between PlaV values calculated by the two different techniques was assessed in 139 first-trimester placentas. The agreement on PlaV and derived "standardized placental volume," a dimensionless index correcting for gestational age, was explored with the Mann-Whitney test and Bland-Altman plots. Placentas were categorized into five different shape subtypes, and a subgroup analysis was performed. Agreement was poor for both PlaV and standardized PlaV (p < 0.001 and p < 0.001), with the 2-D technique yielding larger estimates for both indices compared with the 3-D method. The mean difference in standardized PlaV values between the two methods was 0.007 (95% confidence interval: 0.006-0.009). The best agreement was found for regular rectangle-shaped placentas (p = 0.438 and p = 0.408). The poor correlation between the 2-D and 3-D techniques may result from the heterogeneity of placental morphology at this stage of gestation. In early gestation, the simpler 2-D estimates of PlaV do not correlate strongly with those obtained with the validated 3-D technique.

  7. Single particle 3D reconstruction for 2D crystal images of membrane proteins.

    PubMed

    Scherer, Sebastian; Arheit, Marcel; Kowal, Julia; Zeng, Xiangyan; Stahlberg, Henning

    2014-03-01

    In cases where ultra-flat cryo-preparations of well-ordered two-dimensional (2D) crystals are available, electron crystallography is a powerful method for the determination of the high-resolution structures of membrane and soluble proteins. However, crystal unbending and Fourier-filtering methods in electron crystallography three-dimensional (3D) image processing are generally limited in their performance for 2D crystals that are badly ordered or non-flat. Here we present a single particle image processing approach, which is implemented as an extension of the 2D crystallographic pipeline realized in the 2dx software package, for the determination of high-resolution 3D structures of membrane proteins. The algorithm presented addresses the low signal-to-noise ratio (SNR) of 2D crystal images by exploiting neighborhood correlation between adjacent proteins in the 2D crystal. Compared with conventional single particle processing for randomly oriented particles, the computational costs are greatly reduced due to the crystal-induced limited search space, which allows a much finer search compared to classical single particle processing. To reduce the considerable computational costs, our software features a hybrid parallelization scheme for multi-CPU clusters and computers with high-end graphics processing units (GPUs). We successfully apply the new refinement method to the structure of the potassium channel MloK1. The calculated 3D reconstruction shows more structural details and contains less noise than the map obtained by conventional Fourier-filtering based processing of the same 2D crystal images.

  8. CAS2D- NONROTATING BLADE-TO-BLADE, STEADY, POTENTIAL TRANSONIC CASCADE FLOW ANALYSIS CODE

    NASA Technical Reports Server (NTRS)

    Dulikravich, D. S.

    1994-01-01

    An exact, full-potential-equation model for the steady, irrotational, homoentropic, and homoenergetic flow of a compressible, inviscid fluid through a two-dimensional planar cascade together with its appropriate boundary conditions has been derived. The CAS2D computer program numerically solves an artificially time-dependent form of the actual full-potential-equation, providing a nonrotating blade-to-blade, steady, potential transonic cascade flow analysis code. Comparisons of results with test data and theoretical solutions indicate very good agreement. In CAS2D, the governing equation is discretized by using type-dependent, rotated finite differencing and the finite area technique. The flow field is discretized by providing a boundary-fitted, nonuniform computational mesh. This mesh is generated by using a sequence of conformal mapping, nonorthogonal coordinate stretching, and local, isoparametric, bilinear mapping functions. The discretized form of the full-potential equation is solved iteratively by using successive line over relaxation. Possible isentropic shocks are captured by the explicit addition of an artificial viscosity in a conservative form. In addition, a four-level, consecutive, mesh refinement feature makes CAS2D a reliable and fast algorithm for the analysis of transonic, two-dimensional cascade flows. The results from CAS2D are not directly applicable to three-dimensional, potential, rotating flows through a cascade of blades because CAS2D does not consider the effects of the Coriolis force that would be present in the three-dimensional case. This program is written in FORTRAN IV for batch execution and has been implemented on an IBM 370 series computer with a central memory requirement of approximately 200K of 8 bit bytes. The CAS2D program was developed in 1980.
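
    For reference, the conservative full-potential model that CAS2D solves can be written in a standard textbook form (using stagnation density ρ₀ and stagnation speed of sound a₀; this is the generic continuous equation, not a transcription of the code's discretized form):

    ```latex
    \partial_x\!\left(\rho\,\phi_x\right) + \partial_y\!\left(\rho\,\phi_y\right) = 0,
    \qquad
    \rho = \rho_0\left[1-\frac{\gamma-1}{2}\,\frac{\phi_x^2+\phi_y^2}{a_0^2}\right]^{1/(\gamma-1)}
    ```

    which is exact for steady, irrotational, homoentropic and homoenergetic flow; the artificial time dependence and artificial viscosity mentioned in the abstract are added only to drive the iteration and to capture isentropic shocks.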

  9. A search for mosquito larvicidal compounds by blocking the sterol carrying protein, AeSCP-2, through computational screening and docking strategies

    PubMed Central

    Kumar, R. Barani; Shanmugapriya, B.; Thiyagesan, K.; Kumar, S. Raj; Xavier, Suresh M.

    2010-01-01

    Background: Sterol is a very vital compound for most insects and mosquitoes to complete their life cycle. Unfortunately, mosquitoes cannot synthesize the sterol and depend on mammals for it. Mosquitoes take the sterol from plant decay during their larval stage in the form of phytosterol, which is then converted to cholesterol for further growth and reproduction. This conversion occurs with the help of the sterol carrier protein 2 (SCP2). Methods: Mosquito populations are controlled by plant-based inhibitors, which inhibit sterol carrier protein (SCPI, sterol carrier protein inhibitor) activity. In this article, we explain the methods of inhibiting Aedes aegypti SCP2 by in silico methods, including natural inhibitor selection and filtration by virtual screening and interaction studies. Results: In this study, protein-ligand interactions were carried out with various phytochemicals; as a result of virtual screening, alpha-mangostin and panthenol were found to be good analogs and were allowed to dock with the mosquito cholesterol carrier protein AeSCP-2. Conclusion: Computational selection of SCPIs is a highly reliable and novel method for discovering new and more effective compounds to control mosquitoes. PMID:21808576

  10. Experimental study of heavy-ion computed tomography using a scintillation screen and an electron-multiplying charged coupled device camera for human head imaging

    NASA Astrophysics Data System (ADS)

    Muraishi, Hiroshi; Hara, Hidetake; Abe, Shinji; Yokose, Mamoru; Watanabe, Takara; Takeda, Tohoru; Koba, Yusuke; Fukuda, Shigekazu

    2016-03-01

    We have developed a heavy-ion computed tomography (IonCT) system using a scintillation screen and an electron-multiplying charged coupled device (EMCCD) camera that can measure a large object such as a human head. In this study, the objective was to investigate whether the system could be applied to heavy-ion treatment planning, judged by the spatial resolution of the reconstructed image. Experiments were carried out on a rotation phantom using 12C accelerated up to 430 MeV/u by the Heavy-Ion Medical Accelerator in Chiba (HIMAC) at the National Institute of Radiological Sciences (NIRS). We demonstrated that the reconstructed image of an object with a water equivalent thickness (WET) of approximately 18 cm was successfully achieved with a spatial resolution of 1 mm, which would make this IonCT system worth applying to heavy-ion treatment planning for head and neck cancers.

  11. Screening Method for calculating Global Warming Potential through computational and experimental investigations of radiative forcing and atmospheric lifetime

    NASA Astrophysics Data System (ADS)

    Bevington, C. B.; Betowski, D.; Ottinger, D.; Sheppard, M.; Elrod, M. J.; Offenberg, J.; Hetfield, C.; Libelo, E. L.

    2011-12-01

    The universe of chemical substances in commerce that may have significant atmospheric impacts such as global warming potential, ozone depletion potential, and ozone creation potential is not well defined. Staff from the U.S. E.P.A. have developed a screening method and evaluated chemicals using criteria indicative of potential atmospheric impact. Screening criteria included physical chemical properties such as boiling point and vapor pressure as well as structural characteristics such as molecular weight and number of halogen atoms. Preliminary results show that there are over 1,000 chemicals with a 100-year time horizon Global Warming Potential (GWP) of greater than 1 and over 700 chemicals with a GWP of greater than 10, relative to a value of 1 for CO2. The primary goal of this scoping project is to calculate the GWP for each of these chemicals. GWP is calculated using three primary inputs: molecular weight, atmospheric lifetime, and radiative forcing. Where available, experimentally derived radiative forcing and atmospheric lifetime values have been identified and are utilized. Surprisingly, measured values were only available for approximately 20% of chemicals. Where measured data were not available, values were estimated in various ways. Besides calculating these values, characterizing the accuracy and efficacy of these various estimation methods is also of interest. Radiative efficiency was calculated using quantum mechanical ab initio methods, utilizing Gaussian software. In addition, a preliminary Quantitative Structure Activity Relationship (QSAR) building on Bera et al.'s "Design strategies to minimize the radiative efficiency of global warming molecules" (2010) was used to estimate radiative forcing for over 800 fluorinated chemicals. For atmospheric lifetime, QSARs were used to estimate OH rate constants and atmospheric lifetime values. Recognizing the limitations and uncertainty introduced by using QSARs for atmospheric lifetime estimation
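
    For context, the standard (IPCC-style) definition that these three inputs feed into is the ratio of the time-integrated radiative forcing of the chemical to that of CO2 over the chosen horizon H; assuming a simple exponential decay with lifetime \tau_i, the numerator has a closed form (molecular weight enters when converting the radiative efficiency RE to a per-kilogram basis):

        \mathrm{GWP}_i(H) = \frac{\int_0^H RE_i \, e^{-t/\tau_i} \, dt}{\int_0^H RE_{\mathrm{CO_2}} \, x_{\mathrm{CO_2}}(t) \, dt}
        = \frac{RE_i \, \tau_i \left( 1 - e^{-H/\tau_i} \right)}{\mathrm{AGWP}_{\mathrm{CO_2}}(H)},

    where x_{\mathrm{CO_2}}(t) is the CO2 impulse response function and AGWP denotes the absolute (integrated) GWP.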

  12. Synthetic Covalent and Non-Covalent 2D Materials.

    PubMed

    Boott, Charlotte E; Nazemi, Ali; Manners, Ian

    2015-11-16

    The creation of synthetic 2D materials represents an attractive challenge that is ultimately driven by their prospective uses in, for example, electronics, biomedicine, catalysis, sensing, and as membranes for separation and filtration. This Review illustrates some recent advances in this diverse field with a focus on covalent and non-covalent 2D polymers and frameworks, and self-assembled 2D materials derived from nanoparticles, homopolymers, and block copolymers.

  13. A Geometric Boolean Library for 2D Objects

    2006-01-05

    The 2D Boolean Library is a collection of C++ classes, which primarily represent 2D geometric data and relationships, and routines, which contain algorithms for 2D geometric Boolean operations and utility functions. Classes are provided for 2D points, lines, arcs, edgeuses, loops, surfaces and mask sets. Routines are provided that incorporate the Boolean operations Union (OR), XOR, Intersection and Difference. Various analytical geometry routines and routines for importing and exporting the data in various file formats are also provided in the library.

  14. VizieR Online Data Catalog: The 2dF Galaxy Redshift Survey (2dFGRS) (2dFGRS Team, 1998-2003)

    NASA Astrophysics Data System (ADS)

    Colless, M.; Dalton, G.; Maddox, S.; Sutherland, W.; Norberg, P.; Cole, S.; Bland-Hawthorn, J.; Bridges, T.; Cannon, R.; Collins, C.; Couch, W.; Cross, N.; Deeley, K.; de Propris, R.; Driver, S. P.; Efstathiou, G.; Ellis, R. S.; Frenk, C. S.; Glazebrook, K.; Jackson, C.; Lahav, O.; Lewis, I.; Lumsden, S.; Madgwick, D.; Peacock, J. A.; Peterson, B. A.; Price, I.; Seaborne, M.; Taylor, K.

    2007-11-01

    The 2dF Galaxy Redshift Survey (2dFGRS) is a major spectroscopic survey taking full advantage of the unique capabilities of the 2dF facility built by the Anglo-Australian Observatory. The 2dFGRS is integrated with the 2dF QSO survey (2QZ, Cat. VII/241). The 2dFGRS obtained spectra for 245591 objects, mainly galaxies, brighter than a nominal extinction-corrected magnitude limit of bJ=19.45. Reliable (quality>=3) redshifts were obtained for 221414 galaxies. The galaxies cover an area of approximately 1500 square degrees selected from the extended APM Galaxy Survey in three regions: a North Galactic Pole (NGP) strip, a South Galactic Pole (SGP) strip, and random fields scattered around the SGP strip. Redshifts are measured from spectra covering 3600-8000 Angstroms at a two-pixel resolution of 9.0 Angstrom and a median S/N of 13 per pixel. All redshift identifications are visually checked and assigned a quality parameter Q in the range 1-5; Q>=3 redshifts are 98.4% reliable and have an rms uncertainty of 85 km/s. The overall redshift completeness for Q>=3 redshifts is 91.8% but this varies with magnitude from 99% for the brightest galaxies to 90% for objects at the survey limit. The 2dFGRS data base is available on the World Wide Web at http://www.mso.anu.edu.au/2dFGRS/. (6 data files).

  15. A comment on the rank correlation merit function for 2D/3D registration

    NASA Astrophysics Data System (ADS)

    Figl, Michael; Bloch, Christoph; Birkfellner, Wolfgang

    2010-02-01

    Many procedures in computer-assisted interventions register pre-interventionally generated 3D data sets to the intraoperative situation using quickly and simply generated 2D images, e.g. from a C-arm, a B-mode ultrasound, etc. Registration is typically done by generating a 2D image out of the 3D data set, comparing it to the original 2D image using a planar similarity measure, and subsequently optimising. As these two images can be very different, many different comparison functions are in use. In a recent article, Stochastic Rank Correlation, a merit function based on Spearman's rank correlation coefficient, was presented. By comparing randomly chosen subsets of the images, the authors wanted to avoid the computational expense of sorting all the points in the image. In the current paper we show that, because of the limited grey level range in medical images, full image rank correlation can be computed almost as fast as Pearson's correlation coefficient. A run time estimation is illustrated with numerical results using a 2D Shepp-Logan phantom at different sizes, and a sample data set of a pig.
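
    A minimal sketch of the point being made (illustrative only, not the authors' code): with an 8-bit grey-level range, ranks can be assigned to all pixels via a histogram in linear time, after which Spearman's coefficient is simply Pearson's correlation of the ranks.

        import numpy as np

        def midranks(img, levels=256):
            """Rank every pixel of an integer-valued image in O(N) with a histogram (counting sort).
            Tied grey values receive their average (mid) rank, as Spearman's coefficient requires."""
            flat = img.ravel()
            hist = np.bincount(flat, minlength=levels)
            cum = np.cumsum(hist)
            avg_rank = cum - (hist - 1) / 2.0   # mid-rank of each grey level (1-based)
            return avg_rank[flat]

        def spearman_rank_correlation(img_a, img_b, levels=256):
            """Full-image Spearman coefficient: Pearson correlation of the mid-ranks."""
            ra, rb = midranks(img_a, levels), midranks(img_b, levels)
            return np.corrcoef(ra, rb)[0, 1]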

  16. 2D-3D hybrid stabilized finite element method for tsunami runup simulations

    NASA Astrophysics Data System (ADS)

    Takase, S.; Moriguchi, S.; Terada, K.; Kato, J.; Kyoya, T.; Kashiyama, K.; Kotani, T.

    2016-09-01

    This paper presents a two-dimensional (2D)-three-dimensional (3D) hybrid stabilized finite element method that enables us to predict a propagation process of tsunami generated in a hypocentral region, which ranges from offshore propagation to runup to urban areas, with high accuracy and relatively low computational costs. To be more specific, the 2D shallow water equation is employed to simulate the propagation of offshore waves, while the 3D Navier-Stokes equation is employed for the runup in urban areas. The stabilized finite element method is utilized for numerical simulations for both of the 2D and 3D domains that are independently discretized with unstructured meshes. The multi-point constraint and transmission methods are applied to satisfy the continuity of flow velocities and pressures at the interface between the resulting 2D and 3D meshes, since neither their spatial dimensions nor node arrangements are consistent. Numerical examples are presented to demonstrate the performance of the proposed hybrid method to simulate tsunami behavior, including offshore propagation and runup to urban areas, with substantially lower computation costs in comparison with full 3D computations.
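
    For reference, the 2D shallow water model used offshore can be written, in one common non-conservative form with friction and Coriolis terms omitted (a textbook statement, not quoted from the paper), as:

        \frac{\partial \eta}{\partial t} + \frac{\partial (hu)}{\partial x} + \frac{\partial (hv)}{\partial y} = 0, \qquad
        \frac{\partial u}{\partial t} + u \frac{\partial u}{\partial x} + v \frac{\partial u}{\partial y} = -g \frac{\partial \eta}{\partial x}, \qquad
        \frac{\partial v}{\partial t} + u \frac{\partial v}{\partial x} + v \frac{\partial v}{\partial y} = -g \frac{\partial \eta}{\partial y},

    where \eta is the free-surface elevation, h the total water depth, (u, v) the depth-averaged velocity, and g the gravitational acceleration.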

  17. Classification of Standard Planes in 2D Echocardiography Using 2D-3D Image Registration

    NASA Astrophysics Data System (ADS)

    Bergmeir, Christoph; Subramanian, Navneeth

    For the purpose of developing a system that guides an inexperienced ultrasound (US) user to acquire relevant anatomical structures, we investigate the feasibility of 2D-US to 3D-CT registration. We use US images of standard planes of the heart, which are registered to a 3D CT model. Our algorithm subjects both the US images and the CT data set to preprocessing steps that reduce the data, by segmentation, to the essential information in the form of labels for muscle and blood. These labels are then used for registration by means of the match-cardinality metric. By registering repeatedly with different initializations, we determine the standard plane visible in the US image. We evaluated the method on seven US images of standard planes. Five of them were assigned correctly.

  18. Epitaxial 2D SnSe2/ 2D WSe2 van der Waals Heterostructures.

    PubMed

    Aretouli, Kleopatra Emmanouil; Tsoutsou, Dimitra; Tsipas, Polychronis; Marquez-Velasco, Jose; Aminalragia Giamini, Sigiava; Kelaidis, Nicolaos; Psycharis, Vassilis; Dimoulas, Athanasios

    2016-09-01

    van der Waals heterostructures of 2D semiconductor materials can be used to realize a number of (opto)electronic devices including tunneling field effect devices (TFETs). It is shown in this work that high-quality SnSe2/WSe2 vdW heterostructures can be grown by molecular beam epitaxy on AlN(0001)/Si(111) substrates using a Bi2Se3 buffer layer. A valence band offset of 0.8 eV matches the energy gap of SnSe2 in such a way that the VB edge of WSe2 and the CB edge of SnSe2 are lined up, making this materials combination suitable for (nearly) broken gap TFETs. PMID:27537619

  19. CVMAC 2D Program: A method of converting 3D to 2D

    SciTech Connect

    Lown, J.

    1990-06-20

    This paper presents the user with a method of converting a three-dimensional wire frame model into a technical illustration, detail, or assembly drawing. By using the 2D Program, entities can be mapped from three-dimensional model space into two-dimensional model space, as if they were being traced. Selected entities to be mapped can include circles, arcs, lines, and points. This program prompts the user to digitize the view to be mapped, specify the layers in which the new two-dimensional entities will reside, and select the entities, either by digitizing or windowing. The new two-dimensional entities are displayed in a small view which the program creates in the lower left corner of the drawing. 9 figs.

  20. Multiple Strategies for Spatial Integration of 2D Layouts within Working Memory

    PubMed Central

    Meilinger, Tobias; Watanabe, Katsumi

    2016-01-01

    Prior results on the spatial integration of layouts within a room differed regarding the reference frame that participants used for integration. We asked whether these differences also occur when integrating 2D screen views and, if so, what the reasons for this might be. In four experiments we showed that integrating reference frames varied as a function of task familiarity combined with processing time, cues for spatial transformation, and information about action requirements paralleling results in the 3D case. Participants saw part of an object layout in screen 1, another part in screen 2, and reacted on the integrated layout in screen 3. Layout presentations between two screens coincided or differed in orientation. Aligning misaligned screens for integration is known to increase errors/latencies. The error/latency pattern was thus indicative of the reference frame used for integration. We showed that task familiarity combined with self-paced learning, visual updating, and knowing from where to act prioritized the integration within the reference frame of the initial presentation, which was updated later, and from where participants acted respectively. Participants also heavily relied on layout intrinsic frames. The results show how humans flexibly adjust their integration strategy to a wide variety of conditions. PMID:27101011

  1. Multiple Strategies for Spatial Integration of 2D Layouts within Working Memory.

    PubMed

    Meilinger, Tobias; Watanabe, Katsumi

    2016-01-01

    Prior results on the spatial integration of layouts within a room differed regarding the reference frame that participants used for integration. We asked whether these differences also occur when integrating 2D screen views and, if so, what the reasons for this might be. In four experiments we showed that integrating reference frames varied as a function of task familiarity combined with processing time, cues for spatial transformation, and information about action requirements paralleling results in the 3D case. Participants saw part of an object layout in screen 1, another part in screen 2, and reacted on the integrated layout in screen 3. Layout presentations between two screens coincided or differed in orientation. Aligning misaligned screens for integration is known to increase errors/latencies. The error/latency pattern was thus indicative of the reference frame used for integration. We showed that task familiarity combined with self-paced learning, visual updating, and knowing from where to act prioritized the integration within the reference frame of the initial presentation, which was updated later, and from where participants acted respectively. Participants also heavily relied on layout intrinsic frames. The results show how humans flexibly adjust their integration strategy to a wide variety of conditions.

  2. Multiple Strategies for Spatial Integration of 2D Layouts within Working Memory.

    PubMed

    Meilinger, Tobias; Watanabe, Katsumi

    2016-01-01

    Prior results on the spatial integration of layouts within a room differed regarding the reference frame that participants used for integration. We asked whether these differences also occur when integrating 2D screen views and, if so, what the reasons for this might be. In four experiments we showed that integrating reference frames varied as a function of task familiarity combined with processing time, cues for spatial transformation, and information about action requirements paralleling results in the 3D case. Participants saw part of an object layout in screen 1, another part in screen 2, and reacted on the integrated layout in screen 3. Layout presentations between two screens coincided or differed in orientation. Aligning misaligned screens for integration is known to increase errors/latencies. The error/latency pattern was thus indicative of the reference frame used for integration. We showed that task familiarity combined with self-paced learning, visual updating, and knowing from where to act prioritized the integration within the reference frame of the initial presentation, which was updated later, and from where participants acted respectively. Participants also heavily relied on layout intrinsic frames. The results show how humans flexibly adjust their integration strategy to a wide variety of conditions. PMID:27101011

  3. Routine Self-administered, Touch-Screen Computer Based Suicidal Ideation Assessment Linked to Automated Response Team Notification in an HIV Primary Care Setting

    PubMed Central

    Lawrence, Sarah T.; Willig, James H.; Crane, Heidi M.; Ye, Jiatao; Aban, Inmaculada; Lober, William; Nevin, Christa R.; Batey, D. Scott; Mugavero, Michael J.; McCullumsmith, Cheryl; Wright, Charles; Kitahata, Mari; Raper, James L.; Saag, Micheal S.; Schumacher, Joseph E.

    2010-01-01

    Summary The implementation of routine computer-based screening for suicidal ideation and other psychosocial domains through standardized patient reported outcome instruments in two high volume urban HIV clinics is described. Factors associated with an increased risk of self-reported suicidal ideation were determined. Background HIV/AIDS continues to be associated with an under-recognized risk for suicidal ideation, attempted as well as completed suicide. Suicidal ideation represents an important predictor for subsequent attempted and completed suicide. We sought to implement routine screening of suicidal ideation and associated conditions using computerized patient reported outcome (PRO) assessments. Methods Two geographically distinct academic HIV primary care clinics enrolled patients attending scheduled visits from 12/2005 to 2/2009. Touch-screen-based, computerized PRO assessments were implemented into routine clinical care. Substance abuse (ASSIST), alcohol consumption (AUDIT-C), depression (PHQ-9) and anxiety (PHQ-A) were assessed. The PHQ-9 assesses the frequency of suicidal ideation in the preceding two weeks. A response of “nearly every day” triggered an automated page to pre-determined clinic personnel who completed more detailed self-harm assessments. Results Overall 1,216 (UAB= 740; UW= 476) patients completed initial PRO assessment during the study period. Patients were white (53%; n=646), predominantly males (79%; n=959) with a mean age of 44 (± 10). Among surveyed patients, 170 (14%) endorsed some level of suicidal ideation, while 33 (3%) admitted suicidal ideation nearly every day. In multivariable analysis, suicidal ideation risk was lower with advancing age (OR=0.74 per 10 years;95%CI=0.58-0.96) and was increased with current substance abuse (OR=1.88;95%CI=1.03-3.44) and more severe depression (OR=3.91 moderate;95%CI=2.12-7.22; OR=25.55 severe;95%CI=12.73-51.30). Discussion Suicidal ideation was associated with current substance abuse and

  4. Breast Cancer Detection in a Screening Population: Comparison of Digital Mammography, Computer-Aided Detection Applied to Digital Mammography and Breast Ultrasound

    PubMed Central

    Cho, Kyu Ran; Woo, Ok Hee; Song, Sung Eun; Choi, Jungsoon; Whang, Shin Young; Park, Eun Kyung; Park, Ah Young; Shin, Hyeseon; Chung, Hwan Hoon

    2016-01-01

    Purpose We aimed to compare the detection of breast cancer using full-field digital mammography (FFDM), FFDM with computer-aided detection (FFDM+CAD), ultrasound (US), and FFDM+CAD plus US (FFDM+CAD+US), and to investigate the factors affecting cancer detection. Methods In this retrospective study conducted from 2008 to 2012, 48,251 women underwent FFDM and US for cancer screening. One hundred seventy-one breast cancers were detected: 115 invasive cancers and 56 carcinomas in situ. Two radiologists evaluated the imaging findings of FFDM, FFDM+CAD, and US, based on the Breast Imaging Reporting and Data System lexicon of the American College of Radiology by consensus. We reviewed the clinical and the pathological data to investigate factors affecting cancer detection. We statistically used generalized estimation equations with a logit link to compare the cancer detectability of different imaging modalities. To compare the various factors affecting detection versus nondetection, we used Wilcoxon rank sum, chi-square, or Fisher exact test. Results The detectability of breast cancer by US (96.5%) or FFDM+CAD+US (100%) was superior to that of FFDM (87.1%) (p=0.019 or p<0.001, respectively) or FFDM+CAD (88.3%) (p=0.050 or p<0.001, respectively). However, cancer detectability was not significantly different between FFDM versus FFDM+CAD (p=1.000) and US alone versus FFDM+CAD+US (p=0.126). The tumor size influenced cancer detectability by all imaging modalities (p<0.050). In FFDM and FFDM+CAD, the nondetecting group consisted of younger patients and patients with a denser breast composition (p<0.050). In breast US, carcinoma in situ was more frequent in the nondetecting group (p=0.014). Conclusion For breast cancer screening, breast US alone is satisfactory for all age groups, although FFDM+CAD+US is the perfect screening method. Patient age, breast composition, and pathological tumor size and type may influence cancer detection during screening. PMID:27721882

  5. 2D modeling of electromagnetic waves in cold plasmas

    SciTech Connect

    Crombé, K.; Van Eester, D.; Koch, R.; Kyrytsya, V.

    2014-02-12

    The consequences of sheath (rectified) electric fields, resulting from the different mobility of electrons and ions in response to radio frequency (RF) fields, are a concern for RF antenna design, as they can cause damage to antenna parts, limiters and other in-vessel components. As a first step towards a more complete description, the usual cold plasma dielectric description has been adopted, and the density profile was assumed to be known as input. Ultimately, the relevant equations describing the wave-particle interaction on both the fast and slow timescales will need to be tackled, but prior to doing so it was felt necessary to get a feeling for the wave dynamics involved. Maxwell's equations are solved for a cold plasma in a 2D antenna box with strongly varying density profiles that also cross lower hybrid and ion-ion hybrid resonance layers. Numerical modelling quickly becomes demanding on computer power, since a fine grid spacing is required to capture the small-wavelength effects of strongly evanescent modes.
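
    For reference, the cold plasma dielectric description adopted here is conventionally written in Stix notation (a textbook form, assuming the background magnetic field along z; not quoted from the paper):

        \epsilon = \begin{pmatrix} S & -iD & 0 \\ iD & S & 0 \\ 0 & 0 & P \end{pmatrix}, \qquad
        S = 1 - \sum_s \frac{\omega_{ps}^2}{\omega^2 - \omega_{cs}^2}, \quad
        D = \sum_s \frac{\omega_{cs}}{\omega} \, \frac{\omega_{ps}^2}{\omega^2 - \omega_{cs}^2}, \quad
        P = 1 - \sum_s \frac{\omega_{ps}^2}{\omega^2},

    with \omega_{ps} and \omega_{cs} the plasma and (signed) cyclotron frequencies of species s.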

  6. Quantum Simulation with 2D Arrays of Trapped Ions

    NASA Astrophysics Data System (ADS)

    Richerme, Philip

    2016-05-01

    The computational difficulty of solving fully quantum many-body spin problems is a significant obstacle to understanding the behavior of strongly correlated quantum matter. This work proposes the design and construction of a 2D quantum spin simulator to investigate the physics of frustrated materials, highly entangled states, mechanisms potentially underpinning high-temperature superconductivity, and other topics inaccessible to current 1D systems. The effective quantum spins will be encoded within the well-isolated electronic levels of trapped ions, confined in a two-dimensional planar geometry, and made to interact using phonon-mediated optical dipole forces. The system will be scalable to 100+ quantum particles, far beyond the realm of classical intractability, while maintaining individual-ion control, long quantum coherence times, and site-resolved projective spin measurements. Once constructed, the two-dimensional quantum simulator will implement a broad range of spin models on a variety of reconfigurable lattices and characterize their behavior through measurements of spin-spin correlations and entanglement. This versatile tool will serve as an important experimental resource for exploring difficult quantum many-body problems in a regime where classical methods fail.

  7. Functional characterization of CYP2D6 enhancer polymorphisms

    PubMed Central

    Wang, Danxin; Papp, Audrey C.; Sun, Xiaochun

    2015-01-01

    CYP2D6 metabolizes nearly 25% of clinically used drugs. Genetic polymorphisms cause large inter-individual variability in CYP2D6 enzyme activity and are currently used as biomarker to predict CYP2D6 metabolizer phenotype. Previously, we had identified a region 115 kb downstream of CYP2D6 as enhancer for CYP2D6, containing two completely linked single nucleotide polymorphisms (SNPs), rs133333 and rs5758550, associated with enhanced transcription. However, the enhancer effect on CYP2D6 expression, and the causative variant, remained to be ascertained. To characterize the CYP2D6 enhancer element, we applied chromatin conformation capture combined with the next-generation sequencing (4C assays) and chromatin immunoprecipitation with P300 antibody, in HepG2 and human primary culture hepatocytes. The results confirmed the role of the previously identified enhancer region in CYP2D6 expression, expanding the number of candidate variants to three highly linked SNPs (rs133333, rs5758550 and rs4822082). Among these, only rs5758550 demonstrated regulating enhancer activity in a reporter gene assay. Use of clustered regularly interspaced short palindromic repeats mediated genome editing in HepG2 cells targeting suspected enhancer regions decreased CYP2D6 mRNA expression by 70%, only upon deletion of the rs5758550 region. These results demonstrate robust effects of both the enhancer element and SNP rs5758550 on CYP2D6 expression, supporting consideration of rs5758550 for CYP2D6 genotyping panels to yield more accurate phenotype prediction. PMID:25381333

  8. Computer simulated screening of dentin bonding primer monomers through analysis of their chemical functions and their spatial 3D alignment.

    PubMed

    Vaidyanathan, J; Vaidyanathan, T K; Ravichandran, S

    2009-02-01

    Binding interactions between dentin bonding primer monomers and dentinal collagen were studied by an analysis of their chemical functions and their spatial 3D alignment. A trial set of 12 monomers used as primers in dentin adhesives was characterized to assess them for binding to a complementary target. The HipHop utility in the Catalyst software from Accelrys was used for the study. Ten hypotheses were generated by HipHop procedures involving (a) conformational generation using a poling technique to promote conformational variation, (b) extraction of functions to remodel ligands as function-based structures, and (c) identification of common patterns of functional alignment displayed by low energy conformations. The hypotheses, designated as pharmacophores, were also scored and ranked. Analysis of pharmacophore models through mapping of ligands revealed important differences between ligands. Top-ranked poses from direct docking simulations using type 1 collagen target were mapped in a rigid manner to the highest ranked pharmacophore model. The visual match observed in mapping and associated fit values suggest a strong correspondence between direct and indirect docking simulations. The results elegantly demonstrate that an indirect approach used to identify pharmacophore models from adhesive ligands without a target may be a simple and viable approach to assess their intermolecular interactions with an intended target. Inexpensive indirect/direct virtual screening of hydrophilic monomer candidates may be a practical way to assess their initial promise for dentin primer use well before additional experimental evaluation of their priming/bonding efficacy. This is also of value in the search/design of new compounds for priming dentin. PMID:18546179

  9. Computer simulated screening of dentin bonding primer monomers through analysis of their chemical functions and their spatial 3D alignment.

    PubMed

    Vaidyanathan, J; Vaidyanathan, T K; Ravichandran, S

    2009-02-01

    Binding interactions between dentin bonding primer monomers and dentinal collagen were studied by an analysis of their chemical functions and their spatial 3D alignment. A trial set of 12 monomers used as primers in dentin adhesives was characterized to assess them for binding to a complementary target. The HipHop utility in the Catalyst software from Accelrys was used for the study. Ten hypotheses were generated by HipHop procedures involving (a) conformational generation using a poling technique to promote conformational variation, (b) extraction of functions to remodel ligands as function-based structures, and (c) identification of common patterns of functional alignment displayed by low energy conformations. The hypotheses, designated as pharmacophores, were also scored and ranked. Analysis of pharmacophore models through mapping of ligands revealed important differences between ligands. Top-ranked poses from direct docking simulations using type 1 collagen target were mapped in a rigid manner to the highest ranked pharmacophore model. The visual match observed in mapping and associated fit values suggest a strong correspondence between direct and indirect docking simulations. The results elegantly demonstrate that an indirect approach used to identify pharmacophore models from adhesive ligands without a target may be a simple and viable approach to assess their intermolecular interactions with an intended target. Inexpensive indirect/direct virtual screening of hydrophilic monomer candidates may be a practical way to assess their initial promise for dentin primer use well before additional experimental evaluation of their priming/bonding efficacy. This is also of value in the search/design of new compounds for priming dentin.

  10. Whole-exome sequencing defines the mutational landscape of pheochromocytoma and identifies KMT2D as a recurrently mutated gene.

    PubMed

    Juhlin, C Christofer; Stenman, Adam; Haglund, Felix; Clark, Victoria E; Brown, Taylor C; Baranoski, Jacob; Bilguvar, Kaya; Goh, Gerald; Welander, Jenny; Svahn, Fredrika; Rubinstein, Jill C; Caramuta, Stefano; Yasuno, Katsuhito; Günel, Murat; Bäckdahl, Martin; Gimm, Oliver; Söderkvist, Peter; Prasad, Manju L; Korah, Reju; Lifton, Richard P; Carling, Tobias

    2015-09-01

    As subsets of pheochromocytomas (PCCs) lack a defined molecular etiology, we sought to characterize the mutational landscape of PCCs to identify novel gene candidates involved in disease development. A discovery cohort of 15 PCCs wild type for mutations in PCC susceptibility genes underwent whole-exome sequencing, and an additional 83 PCCs served as a verification cohort for targeted sequencing of candidate mutations. A low rate of nonsilent single nucleotide variants (SNVs) was detected (6.1/sample). Somatic HRAS and EPAS1 mutations were observed in one case each, whereas the remaining 13 cases did not exhibit variants in established PCC genes. SNVs aggregated in apoptosis-related pathways, and mutations in COSMIC genes not previously reported in PCCs included ZAN, MITF, WDTC1, and CAMTA1. Two somatic mutations and one constitutional variant in the well-established cancer gene lysine (K)-specific methyltransferase 2D (KMT2D, MLL2) were discovered in one sample each, prompting KMT2D screening using focused exome-sequencing in the verification cohort. An additional 11 PCCs displayed KMT2D variants, of which two were recurrent. In total, missense KMT2D variants were found in 14 (11 somatic, two constitutional, one undetermined) of 99 PCCs (14%). Five cases displayed somatic mutations in the functional FYR/SET domains of KMT2D, constituting 36% of all KMT2D-mutated PCCs. KMT2D expression was upregulated in PCCs compared to normal adrenals, and KMT2D overexpression positively affected cell migration in a PCC cell line. We conclude that KMT2D represents a recurrently mutated gene with potential implication for PCC development. PMID:26032282

  11. Implicit adaptive mesh refinement for 2D reduced resistive magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Philip, Bobby; Chacón, Luis; Pernice, Michael

    2008-10-01

    An implicit structured adaptive mesh refinement (SAMR) solver for 2D reduced magnetohydrodynamics (MHD) is described. The time-implicit discretization is able to step over fast normal modes, while the spatial adaptivity resolves thin, dynamically evolving features. A Jacobian-free Newton-Krylov method is used for the nonlinear solver engine. For preconditioning, we have extended the optimal "physics-based" approach developed in [L. Chacón, D.A. Knoll, J.M. Finn, An implicit, nonlinear reduced resistive MHD solver, J. Comput. Phys. 178 (2002) 15-36] (which employed multigrid solver technology in the preconditioner for scalability) to SAMR grids using the well-known Fast Adaptive Composite grid (FAC) method [S. McCormick, Multilevel Adaptive Methods for Partial Differential Equations, SIAM, Philadelphia, PA, 1989]. A grid convergence study demonstrates that the solver performance is independent of the number of grid levels and only depends on the finest resolution considered, and that it scales well with grid refinement. The study of error generation and propagation in our SAMR implementation demonstrates that high-order (cubic) interpolation during regridding, combined with a robustly damping second-order temporal scheme such as BDF2, is required to minimize impact of grid errors at coarse-fine interfaces on the overall error of the computation for this MHD application. We also demonstrate that our implementation features the desired property that the overall numerical error is dependent only on the finest resolution level considered, and not on the base-grid resolution or on the number of refinement levels present during the simulation. We demonstrate the effectiveness of the tool on several challenging problems.
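
    For reference, the robustly damping second-order temporal scheme mentioned above (BDF2) advances a semi-discrete system du/dt = f(u) as (standard textbook form):

        \frac{3u^{n+1} - 4u^{n} + u^{n-1}}{2\,\Delta t} = f\!\left(u^{n+1}\right),

    i.e. each step requires the nonlinear solve for u^{n+1} that the Jacobian-free Newton-Krylov machinery described above provides.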

  12. Tensor representation of color images and fast 2D quaternion discrete Fourier transform

    NASA Astrophysics Data System (ADS)

    Grigoryan, Artyom M.; Agaian, Sos S.

    2015-03-01

    In this paper, a general, efficient, split algorithm to compute the two-dimensional quaternion discrete Fourier transform (2-D QDFT) by using a special partitioning in the frequency domain is introduced. The partition determines an effective transformation, or color image representation, in the form of 1-D quaternion signals which allow for splitting the N × M-point 2-D QDFT into a set of 1-D QDFTs. Comparative estimates revealing the efficiency of the proposed algorithms with respect to the known ones are given. In particular, a proposed method of calculating the 2^r × 2^r-point 2-D QDFT uses 18N^2 fewer multiplications than the well-known column-row method and the method of calculation based on the symplectic decomposition. The proposed algorithm is simple to apply and design, which makes it very practical in color image processing in the frequency domain.
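
    One common convention for the transform discussed here (the left-sided 2-D QDFT, stated for context and not quoted from the paper) is

        F(u,v) = \sum_{n=0}^{N-1} \sum_{m=0}^{M-1} e^{-\mu 2\pi \left( \frac{nu}{N} + \frac{mv}{M} \right)} \, f(n,m), \qquad \mu^2 = -1,

    where f(n,m) is a quaternion-valued (color) image and \mu is a unit pure quaternion, often chosen as \mu = (i + j + k)/\sqrt{3}.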

  13. Advanced 2D-3D registration for endovascular aortic interventions: addressing dissimilarity in images

    NASA Astrophysics Data System (ADS)

    Demirci, Stefanie; Kutter, Oliver; Manstad-Hulaas, Frode; Bauernschmitt, Robert; Navab, Nassir

    2008-03-01

    In the current clinical workflow of minimally invasive aortic procedures, navigation tasks are performed under 2D or 3D angiographic imaging. Many solutions for navigation enhancement suggest an integration of the preoperatively acquired computed tomography angiography (CTA) in order to provide the physician with more image information and reduce contrast injection and radiation exposure. This requires exact registration algorithms that align the CTA volume to the intraoperative 2D or 3D images. In addition to the real-time constraint, the registration accuracy should be independent of image dissimilarities due to the varying presence of medical instruments and contrast agent. In this paper, we propose efficient solutions for image-based 2D-3D and 3D-3D registration that reduce the dissimilarities by image preprocessing, e.g. implicit detection and segmentation, and adaptive weights introduced into the registration procedure. Experiments and evaluations are conducted on real patient data.

  14. An Incompressible 2D Didactic Model with Singularity and Explicit Solutions of the 2D Boussinesq Equations

    NASA Astrophysics Data System (ADS)

    Chae, Dongho; Constantin, Peter; Wu, Jiahong

    2014-09-01

    We give an example of a well posed, finite energy, 2D incompressible active scalar equation with the same scaling as the surface quasi-geostrophic equation and prove that it can produce finite time singularities. In spite of its simplicity, this seems to be the first such example. Further, we construct explicit solutions of the 2D Boussinesq equations whose gradients grow exponentially in time for all time. In addition, we introduce a variant of the 2D Boussinesq equations which is perhaps a more faithful companion of the 3D axisymmetric Euler equations than the usual 2D Boussinesq equations.
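
    For reference, the (inviscid) 2D Boussinesq system referred to here is conventionally written as (standard form, not quoted from the paper):

        \partial_t u + (u \cdot \nabla) u = -\nabla p + \theta \, e_2, \qquad
        \partial_t \theta + (u \cdot \nabla) \theta = 0, \qquad
        \nabla \cdot u = 0,

    where u is the velocity field, p the pressure, \theta the temperature (or buoyancy), and e_2 the vertical unit vector.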

  15. Lung Cancer Screening Update.

    PubMed

    Ruchalski, Kathleen L; Brown, Kathleen

    2016-07-01

    Since the release of the US Preventive Services Task Force and Centers for Medicare and Medicaid Services recommendations for lung cancer screening, low-dose chest computed tomography screening has moved from the research arena to clinical practice. Lung cancer screening programs must reach beyond image acquisition and interpretation and engage in a multidisciplinary effort of clinical shared decision-making, standardization of imaging and nodule management, smoking cessation, and patient follow-up. Standardization of radiologic reports and nodule management will systematize patient care, provide quality assurance, further reduce harm, and contain health care costs. Although the National Lung Screening Trial results and eligibility criteria of a heavy smoking history are the foundation for the standard guidelines for low-dose chest computed tomography screening in the United States, currently only 27% of patients diagnosed with lung cancer would meet US lung cancer screening recommendations. Current and future efforts must be directed to better delineate those patients who would most benefit from screening and to ensure that the benefits of screening reach all socioeconomic strata and racial and ethnic minorities. Further optimization of lung cancer screening program design and patient eligibility will assure that lung cancer screening benefits will outweigh the potential risks to our patients. PMID:27306387

  16. General Purpose 2D and 3D Similarity Approach to Identify hERG Blockers.

    PubMed

    Schyman, Patric; Liu, Ruifeng; Wallqvist, Anders

    2016-01-25

    Screening compounds for human ether-à-go-go-related gene (hERG) channel inhibition is an important component of early stage drug development and assessment. In this study, we developed a high-confidence (p-value < 0.01) hERG prediction model based on a combined two-dimensional (2D) and three-dimensional (3D) modeling approach. We developed a 3D similarity conformation approach (SCA) based on examining a limited fixed number of pairwise 3D similarity scores between a query molecule and a set of known hERG blockers. By combining 3D SCA with 2D similarity ensemble approach (SEA) methods, we achieved a maximum sensitivity in hERG inhibition prediction with an accuracy not achieved by either method separately. The combined model achieved 69% sensitivity and 95% specificity on an independent external data set. Further validation showed that the model correctly picked up documented hERG inhibition or interactions among the Food and Drug Administration-approved drugs with the highest similarity scores, with 18 of 20 correctly identified. The combination of ascertaining 2D and 3D similarity of compounds allowed us to synergistically use 2D fingerprint matching with 3D shape and chemical complementarity matching. PMID:26718126
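
    A minimal sketch of the 2D fingerprint-matching ingredient (illustrative only; the function names are hypothetical and this is not the authors' SEA/SCA implementation): a query compound is scored by its Tanimoto similarity to known hERG blockers.

        import numpy as np

        def tanimoto(fp_a, fp_b):
            """Tanimoto (Jaccard) similarity of two binary fingerprint vectors."""
            a = np.asarray(fp_a, dtype=bool)
            b = np.asarray(fp_b, dtype=bool)
            union = np.logical_or(a, b).sum()
            return np.logical_and(a, b).sum() / union if union else 0.0

        def best_match_to_known_blockers(query_fp, blocker_fps):
            """Score a query compound by its highest similarity to any known hERG blocker."""
            return max(tanimoto(query_fp, fp) for fp in blocker_fps)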

  17. General Purpose 2D and 3D Similarity Approach to Identify hERG Blockers.

    PubMed

    Schyman, Patric; Liu, Ruifeng; Wallqvist, Anders

    2016-01-25

    Screening compounds for human ether-à-go-go-related gene (hERG) channel inhibition is an important component of early stage drug development and assessment. In this study, we developed a high-confidence (p-value < 0.01) hERG prediction model based on a combined two-dimensional (2D) and three-dimensional (3D) modeling approach. We developed a 3D similarity conformation approach (SCA) based on examining a limited fixed number of pairwise 3D similarity scores between a query molecule and a set of known hERG blockers. By combining 3D SCA with 2D similarity ensemble approach (SEA) methods, we achieved a maximum sensitivity in hERG inhibition prediction with an accuracy not achieved by either method separately. The combined model achieved 69% sensitivity and 95% specificity on an independent external data set. Further validation showed that the model correctly picked up documented hERG inhibition or interactions among the Food and Drug Administration-approved drugs with the highest similarity scores, with 18 of 20 correctly identified. The combination of ascertaining 2D and 3D similarity of compounds allowed us to synergistically use 2D fingerprint matching with 3D shape and chemical complementarity matching.

  18. Highly Omnidirectional and Frequency Controllable Carbon/Polyaniline-based 2D and 3D Monopole Antenna.

    PubMed

    Shin, Keun-Young; Kim, Minkyu; Lee, James S; Jang, Jyongsik

    2015-01-01

    Highly omnidirectional and frequency controllable carbon/polyaniline (C/PANI)-based, two- (2D) and three-dimensional (3D) monopole antennas were fabricated using screen-printing and a one-step, dimensionally confined hydrothermal strategy, respectively. Solvated C/PANI was synthesized by low-temperature interfacial polymerization, during which strong π-π interactions between graphene and the quinoid rings of PANI resulted in an expanded PANI conformation with enhanced crystallinity and improved mechanical and electrical properties. Compared to antennas composed of pristine carbon or PANI-based 2D monopole structures, 2D monopole antennas composed of this enhanced hybrid material were highly efficient and amenable to high-frequency, omnidirectional electromagnetic waves. The mean frequency of C/PANI fiber-based 3D monopole antennas could be controlled by simply cutting and stretching the antenna. These antennas attained high peak gain (3.60 dBi), high directivity (3.91 dBi) and radiation efficiency (92.12%) relative to the 2D monopole antennas. These improvements were attributed to the high packing density and aspect ratios of C/PANI fibers and the removal of the flexible substrate. This approach offers a valuable and promising tool for producing highly omnidirectional and frequency-controllable, carbon-based monopole antennas for use in wireless networking communications on industrial, scientific, and medical (ISM) bands. PMID:26338090

  19. Genomics of Dementia: APOE- and CYP2D6-Related Pharmacogenetics

    PubMed Central

    Cacabelos, Ramón; Martínez, Rocío; Fernández-Novoa, Lucía; Carril, Juan C.; Lombardi, Valter; Carrera, Iván; Corzo, Lola; Tellado, Iván; Leszek, Jerzy; McKay, Adam; Takeda, Masatoshi

    2012-01-01

    Dementia is a major problem of health in developed societies. Alzheimer's disease (AD), vascular dementia, and mixed dementia account for over 90% of the most prevalent forms of dementia. Both genetic and environmental factors are determinant for the phenotypic expression of dementia. AD is a complex disorder in which many different gene clusters may be involved. Most genes screened to date belong to different proteomic and metabolomic pathways potentially affecting AD pathogenesis. The ε4 variant of the APOE gene seems to be a major risk factor for both degenerative and vascular dementia. Metabolic factors, cerebrovascular disorders, and epigenetic phenomena also contribute to neurodegeneration. Five categories of genes are mainly involved in pharmacogenomics: genes associated with disease pathogenesis, genes associated with the mechanism of action of a particular drug, genes associated with phase I and phase II metabolic reactions, genes associated with transporters, and pleiotropic genes and/or genes associated with concomitant pathologies. The APOE and CYP2D6 genes have been extensively studied in AD. The therapeutic response to conventional drugs in patients with AD is genotype specific, with CYP2D6-PMs, CYP2D6-UMs, and APOE-4/4 carriers acting as the worst responders. APOE and CYP2D6 may cooperate, as pleiotropic genes, in the metabolism of drugs and hepatic function. The introduction of pharmacogenetic procedures into AD pharmacological treatment may help to optimize therapeutics. PMID:22482072

  20. Highly Omnidirectional and Frequency Controllable Carbon/Polyaniline-based 2D and 3D Monopole Antenna

    PubMed Central

    Shin, Keun-Young; Kim, Minkyu; Lee, James S.; Jang, Jyongsik

    2015-01-01

    Highly omnidirectional and frequency controllable carbon/polyaniline (C/PANI)-based, two- (2D) and three-dimensional (3D) monopole antennas were fabricated using screen-printing and a one-step, dimensionally confined hydrothermal strategy, respectively. Solvated C/PANI was synthesized by low-temperature interfacial polymerization, during which strong π–π interactions between graphene and the quinoid rings of PANI resulted in an expanded PANI conformation with enhanced crystallinity and improved mechanical and electrical properties. Compared to antennas composed of pristine carbon or PANI-based 2D monopole structures, 2D monopole antennas composed of this enhanced hybrid material were highly efficient and amenable to high-frequency, omnidirectional electromagnetic waves. The mean frequency of C/PANI fiber-based 3D monopole antennas could be controlled by simply cutting and stretching the antenna. These antennas attained high peak gain (3.60 dBi), high directivity (3.91 dBi) and radiation efficiency (92.12%) relative to the 2D monopole antennas. These improvements were attributed to the high packing density and aspect ratios of C/PANI fibers and the removal of the flexible substrate. This approach offers a valuable and promising tool for producing highly omnidirectional and frequency-controllable, carbon-based monopole antennas for use in wireless networking communications on industrial, scientific, and medical (ISM) bands. PMID:26338090

  1. Highly Omnidirectional and Frequency Controllable Carbon/Polyaniline-based 2D and 3D Monopole Antenna

    NASA Astrophysics Data System (ADS)

    Shin, Keun-Young; Kim, Minkyu; Lee, James S.; Jang, Jyongsik

    2015-09-01

    Highly omnidirectional and frequency controllable carbon/polyaniline (C/PANI)-based, two- (2D) and three-dimensional (3D) monopole antennas were fabricated using screen-printing and a one-step, dimensionally confined hydrothermal strategy, respectively. Solvated C/PANI was synthesized by low-temperature interfacial polymerization, during which strong π-π interactions between graphene and the quinoid rings of PANI resulted in an expanded PANI conformation with enhanced crystallinity and improved mechanical and electrical properties. Compared to antennas composed of pristine carbon or PANI-based 2D monopole structures, 2D monopole antennas composed of this enhanced hybrid material were highly efficient and amenable to high-frequency, omnidirectional electromagnetic waves. The mean frequency of C/PANI fiber-based 3D monopole antennas could be controlled by simply cutting and stretching the antenna. These antennas attained high peak gain (3.60 dBi), high directivity (3.91 dBi) and radiation efficiency (92.12%) relative to the 2D monopole antennas. These improvements were attributed to the high packing density and aspect ratios of C/PANI fibers and the removal of the flexible substrate. This approach offers a valuable and promising tool for producing highly omnidirectional and frequency-controllable, carbon-based monopole antennas for use in wireless networking communications on industrial, scientific, and medical (ISM) bands.

  2. Computer-aided detection system for clustered microcalcifications: comparison of performance on full-field digital mammograms and digitized screen-film mammograms

    NASA Astrophysics Data System (ADS)

    Ge, Jun; Hadjiiski, Lubomir M.; Sahiner, Berkman; Wei, Jun; Helvie, Mark A.; Zhou, Chuan; Chan, Heang-Ping

    2007-02-01

    We have developed a computer-aided detection (CAD) system to detect clustered microcalcifications automatically on full-field digital mammograms (FFDMs) and a CAD system for screen-film mammograms (SFMs). The two systems used the same computer vision algorithms but their false positive (FP) classifiers were trained separately with sample images of each modality. In this study, we compared the performance of the CAD systems for detection of clustered microcalcifications on pairs of FFDM and SFM obtained from the same patient. For case-based performance evaluation, the FFDM CAD system achieved detection sensitivities of 70%, 80% and 90% at an average FP cluster rate of 0.07, 0.16 and 0.63 per image, compared with an average FP cluster rate of 0.15, 0.38 and 2.02 per image for the SFM CAD system. The difference was statistically significant with the alternative free-response receiver operating characteristic (AFROC) analysis. When evaluated on data sets negative for microcalcification clusters, the average FP cluster rates of the FFDM CAD system were 0.04, 0.11 and 0.33 per image at detection sensitivity level of 70%, 80% and 90% compared with an average FP cluster rate of 0.08, 0.14 and 0.50 per image for the SFM CAD system. When evaluated for malignant cases only, the difference of the performance of the two CAD systems was not statistically significant with AFROC analysis.

  3. Adaptation algorithms for 2-D feedforward neural networks.

    PubMed

    Kaczorek, T

    1995-01-01

    The generalized weight adaptation algorithms presented by J.G. Kuschewski et al. (1993) and by S.H. Zak and H.J. Sira-Ramirez (1990) are extended for 2-D madaline and 2-D two-layer feedforward neural nets (FNNs).

  4. Integrating Mobile Multimedia into Textbooks: 2D Barcodes

    ERIC Educational Resources Information Center

    Uluyol, Celebi; Agca, R. Kagan

    2012-01-01

    The major goal of this study was to empirically compare text-plus-mobile phone learning using an integrated 2D barcode tag in a printed text with three other conditions described in multimedia learning theory. The method examined in the study involved modifications of the instructional material such that: a 2D barcode was used near the text, the…

  5. Efficient Visible Quasi-2D Perovskite Light-Emitting Diodes.

    PubMed

    Byun, Jinwoo; Cho, Himchan; Wolf, Christoph; Jang, Mi; Sadhanala, Aditya; Friend, Richard H; Yang, Hoichang; Lee, Tae-Woo

    2016-09-01

    Efficient quasi-2D-structure perovskite light-emitting diodes (4.90 cd A^-1) are demonstrated by mixing a 3D-structured perovskite material (methyl ammonium lead bromide) and a 2D-structured perovskite material (phenylethyl ammonium lead bromide), which can be ascribed to better film uniformity, enhanced exciton confinement, and reduced trap density. PMID:27334788

  6. CYP2D6: novel genomic structures and alleles

    PubMed Central

    Kramer, Whitney E.; Walker, Denise L.; O’Kane, Dennis J.; Mrazek, David A.; Fisher, Pamela K.; Dukek, Brian A.; Bruflat, Jamie K.; Black, John L.

    2010-01-01

    Objective CYP2D6 is a polymorphic gene. It has been observed to be deleted, to be duplicated and to undergo recombination events involving the CYP2D7 pseudogene and surrounding sequences. The objective of this study was to discover the genomic structure of CYP2D6 recombinants that interfere with clinical genotyping platforms that are available today. Methods Clinical samples containing rare homozygous CYP2D6 alleles, ambiguous readouts, and those with duplication signals and two different alleles were analyzed by long-range PCR amplification of individual genes, PCR fragment analysis, allele-specific primer extension assay, and DNA sequencing to characterize alleles and genomic structure. Results Novel alleles, genomic structures, and the DNA sequence of these structures are described. Interestingly, in 49 of 50 DNA samples that had CYP2D6 gene duplications or multiplications where two alleles were detected, the chromosome containing the duplication or multiplication had identical tandem alleles. Conclusion Several new CYP2D6 alleles and genomic structures are described which will be useful for CYP2D6 genotyping. The findings suggest that the recombination events responsible for CYP2D6 duplications and multiplications are because of mechanisms other than interchromosomal crossover during meiosis. PMID:19741566

  7. Efficient Visible Quasi-2D Perovskite Light-Emitting Diodes.

    PubMed

    Byun, Jinwoo; Cho, Himchan; Wolf, Christoph; Jang, Mi; Sadhanala, Aditya; Friend, Richard H; Yang, Hoichang; Lee, Tae-Woo

    2016-09-01

    Efficient quasi-2D-structure perovskite light-emitting diodes (4.90 cd A^-1) are demonstrated by mixing a 3D-structured perovskite material (methyl ammonium lead bromide) and a 2D-structured perovskite material (phenylethyl ammonium lead bromide), which can be ascribed to better film uniformity, enhanced exciton confinement, and reduced trap density.

  8. A 2D driven 3D vessel segmentation algorithm for 3D digital subtraction angiography data

    NASA Astrophysics Data System (ADS)

    Spiegel, M.; Redel, T.; Struffert, T.; Hornegger, J.; Doerfler, A.

    2011-10-01

    Cerebrovascular disease is among the leading causes of death in western industrial nations. 3D rotational angiography delivers indispensable information on vessel morphology and pathology. Physicians make use of this to analyze vessel geometry in detail, i.e. vessel diameters, location and size of aneurysms, to come up with a clinical decision. 3D segmentation is a crucial step in this pipeline. Although a lot of different methods are available nowadays, all of them lack a method to validate the results for the individual patient. Therefore, we propose a novel 2D digital subtraction angiography (DSA)-driven 3D vessel segmentation and validation framework. 2D DSA projections are clinically considered as gold standard when it comes to measurements of vessel diameter or the neck size of aneurysms. An ellipsoid vessel model is applied to deliver the initial 3D segmentation. To assess the accuracy of the 3D vessel segmentation, its forward projections are iteratively overlaid with the corresponding 2D DSA projections. Local vessel discrepancies are modeled by a global 2D/3D optimization function to adjust the 3D vessel segmentation toward the 2D vessel contours. Our framework has been evaluated on phantom data as well as on ten patient datasets. Three 2D DSA projections from varying viewing angles have been used for each dataset. The novel 2D driven 3D vessel segmentation approach shows superior results against state-of-the-art segmentations like region growing, i.e. an improvement of 7.2% points in precision and 5.8% points for the Dice coefficient. This method opens up future clinical applications requiring the greatest vessel accuracy, e.g. computational fluid dynamic modeling.

  9. Pareto joint inversion of 2D magnetotelluric and gravity data

    NASA Astrophysics Data System (ADS)

    Miernik, Katarzyna; Bogacz, Adrian; Kozubal, Adam; Danek, Tomasz; Wojdyła, Marek

    2015-04-01

    In this contribution, the first results of the "Innovative technology of petrophysical parameters estimation of geological media using joint inversion algorithms" project were described. At this stage of the development, a Pareto joint inversion scheme for 2D MT and gravity data was used. Additionally, seismic data were provided to set some constraints for the inversion. A Sharp Boundary Interface (SBI) approach and a model description with a set of polygons were used to limit the dimensionality of the solution space. The main engine was based on modified Particle Swarm Optimization (PSO). This algorithm was adapted to handle two or more target functions at once. An additional algorithm was used to eliminate unrealistic solution proposals. Because PSO is a method of stochastic global optimization, it requires a lot of proposals to be evaluated to find a single Pareto solution and then compose a Pareto front. To optimize this stage, parallel computing was used for both the inversion engine and the 2D MT forward solver. There are many advantages of the proposed solution of joint inversion problems. First of all, the Pareto scheme eliminates cumbersome rescaling of the target functions, which can strongly affect the final solution. Secondly, the whole set of solutions is created in one optimization run, providing a choice of the final solution. This choice can be based on qualitative data that are usually very hard to incorporate into a regular inversion scheme. SBI parameterisation not only limits the problem of dimensionality, but also makes constraining of the solution easier. At this stage of work, the decision was made to test the approach using MT and gravity data, because this combination is often used in practice. It is important to mention that the general solution is not limited to these two methods and is flexible enough to be used with more than two sources of data. Presented results were obtained for synthetic models, imitating real geological conditions, where
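
    A minimal sketch of the Pareto bookkeeping involved (illustrative only, with hypothetical function names; the project's modified PSO is more involved): an evaluated proposal belongs to the front only if no other evaluated proposal dominates it in all target functions.

        import numpy as np

        def dominates(f_a, f_b):
            """True if objective vector f_a Pareto-dominates f_b (minimization of all targets)."""
            f_a, f_b = np.asarray(f_a), np.asarray(f_b)
            return bool(np.all(f_a <= f_b) and np.any(f_a < f_b))

        def pareto_front(objective_vectors):
            """Return the indices of the non-dominated proposals among those evaluated."""
            front = []
            for i, fi in enumerate(objective_vectors):
                if not any(dominates(fj, fi) for j, fj in enumerate(objective_vectors) if j != i):
                    front.append(i)
            return front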

  10. 2D materials and van der Waals heterostructures.

    PubMed

    Novoselov, K S; Mishchenko, A; Carvalho, A; Castro Neto, A H

    2016-07-29

    The physics of two-dimensional (2D) materials and heterostructures based on such crystals has been developing extremely fast. With these new materials, truly 2D physics has begun to appear (for instance, the absence of long-range order, 2D excitons, commensurate-incommensurate transition, etc.). Novel heterostructure devices--such as tunneling transistors, resonant tunneling diodes, and light-emitting diodes--are also starting to emerge. Composed from individual 2D crystals, such devices use the properties of those materials to create functionalities that are not accessible in other heterostructures. Here we review the properties of novel 2D crystals and examine how their properties are used in new heterostructure devices.

  11. Van der Waals stacked 2D layered materials for optoelectronics

    NASA Astrophysics Data System (ADS)

    Zhang, Wenjing; Wang, Qixing; Chen, Yu; Wang, Zhuo; Wee, Andrew T. S.

    2016-06-01

    The band gaps of many atomically thin 2D layered materials such as graphene, black phosphorus, monolayer semiconducting transition metal dichalcogenides and hBN range from 0 to 6 eV. These isolated atomic planes can be reassembled into hybrid heterostructures made layer by layer in a precisely chosen sequence. Thus, the electronic properties of 2D materials can be engineered by van der Waals stacking, and the interlayer coupling can be tuned, which opens up avenues for creating new material systems with rich functionalities and novel physical properties. Early studies suggest that van der Waals stacked 2D materials work exceptionally well, dramatically enriching the optoelectronics applications of 2D materials. Here we review recent progress in van der Waals stacked 2D materials, and discuss their potential applications in optoelectronics.

  12. Casting process modeling using CAST2D: The part mold interface

    SciTech Connect

    Shapiro, A.B.

    1991-10-01

    Correctly modeling the physics across the part-mold interface is crucial in predicting the quality of a cast part. Most metals undergo a volume change on solidification (e.g., aluminum -6.6%) and shrinkage on cooling. As the cast metal shrinks, it pulls away from the mold wall, creating a gap. This gap affects the thermal contact resistance between the part and mold. The thermal contact resistance increases as the gap widens. This directly affects the cooling rate and ultimately the final cast shape, stress state, and quality of the cast part. CAST2D is a coupled thermal-stress finite element computer code for casting process modeling. This code can be used to predict the final shape and stress state of cast parts. CAST2D couples the heat transfer code TOPAZ2D and the solid mechanics code NIKE2D. CAST2D is a code in development. This report presents the status of a general purpose thermal-mechanical interface algorithm. 3 refs., 3 figs.
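
    A minimal sketch of the gap-dependent interface behavior described above: the heat transfer coefficient across the part-mold interface falls as the gap widens, modeled here as solid contact conductance in series with conduction through a gas layer. The formula and the numerical values are illustrative assumptions, not CAST2D's interface algorithm.

```python
def gap_conductance(gap_m: float, k_gas: float = 0.026, h_contact: float = 5.0e3) -> float:
    """
    Toy interface model for the heat transfer coefficient (W/m^2K) across the part-mold interface.
    Before a gap opens, conduction is dominated by the assumed solid-contact conductance h_contact;
    once the casting shrinks away, conduction across the gas layer (k_gas / gap) adds a growing
    resistance in series. All values are illustrative, not CAST2D parameters.
    """
    if gap_m <= 0.0:
        return h_contact
    return 1.0 / (1.0 / h_contact + gap_m / k_gas)

# conductance drops, i.e. contact resistance rises, as the shrinkage gap widens
for gap_um in (0, 10, 50, 200):
    print(f"gap = {gap_um:4d} um -> h = {gap_conductance(gap_um * 1e-6):8.1f} W/m^2K")
```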

  13. Estrogen-Induced Cholestasis Leads to Repressed CYP2D6 Expression in CYP2D6-Humanized Mice

    PubMed Central

    Pan, Xian

    2015-01-01

    Cholestasis activates bile acid receptor farnesoid X receptor (FXR) and subsequently enhances hepatic expression of small heterodimer partner (SHP). We previously demonstrated that SHP represses the transactivation of cytochrome P450 2D6 (CYP2D6) promoter by hepatocyte nuclear factor (HNF) 4α. In this study, we investigated the effects of estrogen-induced cholestasis on CYP2D6 expression. Estrogen-induced cholestasis occurs in subjects receiving estrogen for contraception or hormone replacement, or in susceptible women during pregnancy. In CYP2D6-humanized transgenic (Tg-CYP2D6) mice, cholestasis triggered by administration of 17α-ethinylestradiol (EE2) at a high dose led to 2- to 3-fold decreases in CYP2D6 expression. This was accompanied by increased hepatic SHP expression and subsequent decreases in the recruitment of HNF4α to CYP2D6 promoter. Interestingly, estrogen-induced cholestasis also led to increased recruitment of estrogen receptor (ER) α, but not that of FXR, to Shp promoter, suggesting a predominant role of ERα in transcriptional regulation of SHP in estrogen-induced cholestasis. EE2 at a low dose (that does not cause cholestasis) also increased SHP (by ∼50%) and decreased CYP2D6 expression (by 1.5-fold) in Tg-CYP2D6 mice, the magnitude of differences being much smaller than that shown in EE2-induced cholestasis. Taken together, our data indicate that EE2-induced cholestasis increases SHP and represses CYP2D6 expression in Tg-CYP2D6 mice in part through ERα transactivation of Shp promoter. PMID:25943116

  14. Infant Imitation from Television Using Novel Touch Screen Technology

    ERIC Educational Resources Information Center

    Zack, Elizabeth; Barr, Rachel; Gerhardstein, Peter; Dickerson, Kelly; Meltzoff, Andrew N.

    2009-01-01

    Infants learn less from a televised demonstration than from a live demonstration, the "video deficit effect." The present study employs a novel approach, using touch screen technology to examine 15-month-olds' transfer of learning. Infants were randomly assigned either to within-dimension (2D/2D or 3D/3D) or cross-dimension (3D/2D or 2D/3D)…

  15. Bond-based bilinear indices for computational discovery of novel trypanosomicidal drug-like compounds through virtual screening.

    PubMed

    Castillo-Garit, Juan Alberto; del Toro-Cortés, Oremia; Vega, Maria C; Rolón, Miriam; Rojas de Arias, Antonieta; Casañola-Martin, Gerardo M; Escario, José A; Gómez-Barrio, Alicia; Marrero-Ponce, Yovani; Torrens, Francisco; Abad, Concepción

    2015-01-01

    Two-dimensional bond-based bilinear indices and linear discriminant analysis are used in this report to perform a quantitative structure-activity relationship study to identify new trypanosomicidal compounds. A data set of 440 organic chemicals, 143 with antitrypanosomal activity and 297 having other clinical uses, is used to develop the theoretical models. Two discriminant models, computed using bond-based bilinear indices, are developed, and both show accuracies higher than 86% for the training and test sets. The stochastic model correctly identifies nine out of eleven compounds of a set of organic chemicals obtained from our synthetic collaborators. The in vitro antitrypanosomal activity of this set against epimastigote forms of Trypanosoma cruzi is assayed. Both models show good agreement between theoretical predictions and experimental results. Three compounds showed IC50 values for epimastigote elimination (AE) lower than 50 μM, compared with an IC50 of 54.7 μM for benznidazole, which was used as the reference compound. The IC50 value for cytotoxicity of these compounds is at least 5 times greater than their IC50 value for AE. Finally, the present algorithm constitutes a step forward in the search for efficient ways of discovering new antitrypanosomal compounds.
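
    To illustrate the discriminant-analysis step in general terms, the Python sketch below fits a linear discriminant classifier to a descriptor matrix with the same class sizes as the reported data set (143 actives, 297 inactives). The descriptors here are random placeholders rather than bond-based bilinear indices, so the code demonstrates the classification workflow only, not the published models.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder descriptor matrix: in the study these columns would be
# bond-based bilinear indices; here they are random numbers for illustration.
rng = np.random.default_rng(42)
X = rng.normal(size=(440, 12))
y = np.concatenate([np.ones(143, dtype=int), np.zeros(297, dtype=int)])  # 1 = antitrypanosomal

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)
model = LinearDiscriminantAnalysis().fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```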