Generating high-quality single droplets for optical particle characterization with an easy setup
NASA Astrophysics Data System (ADS)
Xu, Jie; Ge, Baozhen; Meng, Rui
2018-06-01
High-quality, micro-sized single droplets are important for optical particle characterization. We develop a single-droplet generator (SDG) based on a piezoelectric inkjet technique, with the advantages of low cost and easy setup. By optimizing the pulse parameters, we achieve single droplets of various sizes. Further investigations reveal that the SDG generates single droplets of high quality, demonstrating good sphericity, good monodispersity and stability over a length of several millimeters.
NASA Technical Reports Server (NTRS)
Engwirda, Darren
2017-01-01
An algorithm for the generation of non-uniform, locally orthogonal staggered unstructured spheroidal grids is described. This technique is designed to generate very high-quality staggered Voronoi-Delaunay meshes appropriate for general circulation modelling on the sphere, including applications to atmospheric simulation, ocean-modelling and numerical weather prediction. Using a recently developed Frontal-Delaunay refinement technique, a method for the construction of high-quality unstructured spheroidal Delaunay triangulations is introduced. A locally orthogonal polygonal grid, derived from the associated Voronoi diagram, is computed as the staggered dual. It is shown that use of the Frontal-Delaunay refinement technique allows for the generation of very high-quality unstructured triangulations, satisfying a priori bounds on element size and shape. Grid quality is further improved through the application of hill-climbing-type optimisation techniques. Overall, the algorithm is shown to produce grids with very high element quality and smooth grading characteristics, while imposing relatively low computational expense. A selection of uniform and non-uniform spheroidal grids appropriate for high-resolution, multi-scale general circulation modelling are presented. These grids are shown to satisfy the geometric constraints associated with contemporary unstructured C-grid-type finite-volume models, including the Model for Prediction Across Scales (MPAS-O). The use of user-defined mesh-spacing functions to generate smoothly graded, non-uniform grids for multi-resolution-type studies is discussed in detail.
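The staggered Voronoi-Delaunay duality at the heart of this method can be illustrated with a minimal sketch (plain spherical Delaunay via the convex hull, not the paper's Frontal-Delaunay refinement; the point count and seed are illustrative): for points on a sphere, the convex-hull facets are the spherical Delaunay triangles, and the dual Voronoi vertices are their circumcentres projected back onto the sphere.

```python
import numpy as np
from scipy.spatial import ConvexHull

def sphere_points(n, seed=0):
    # Normalised Gaussian samples are uniformly distributed on the sphere.
    rng = np.random.default_rng(seed)
    p = rng.normal(size=(n, 3))
    return p / np.linalg.norm(p, axis=1, keepdims=True)

def delaunay_voronoi_dual(pts):
    # For points on a sphere, the convex-hull facets form the spherical
    # Delaunay triangulation.
    tris = ConvexHull(pts).simplices
    # The dual Voronoi vertex of each triangle is its circumcentre on the
    # sphere, i.e. the normalised facet normal, oriented to the triangle's side.
    a, b, c = pts[tris[:, 0]], pts[tris[:, 1]], pts[tris[:, 2]]
    normals = np.cross(b - a, c - a)
    centres = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    centres[np.sum(centres * a, axis=1) < 0] *= -1.0
    return tris, centres

pts = sphere_points(200)
tris, voronoi_vertices = delaunay_voronoi_dual(pts)
```

Connecting the Voronoi vertices of triangles sharing an edge yields the staggered polygonal dual grid used by C-grid finite-volume models.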
Ultrafast treatment plan optimization for volumetric modulated arc therapy (VMAT)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Men, Chunhua; Romeijn, H. Edwin; Jia, Xun; Jiang, Steve B
2010-11-15
Purpose: To develop a novel aperture-based algorithm for volumetric modulated arc therapy (VMAT) treatment plan optimization with high quality and high efficiency. Methods: The VMAT optimization problem is formulated as a large-scale convex programming problem solved by a column generation approach. The authors consider a cost function consisting of two terms, the first enforcing a desired dose distribution and the second guaranteeing a smooth dose rate variation between successive gantry angles. A gantry rotation is discretized into 180 beam angles and for each beam angle, only one MLC aperture is allowed. The apertures are generated one by one in a sequential way. At each iteration of the column generation method, a deliverable MLC aperture is generated for one of the unoccupied beam angles by solving a subproblem that accounts for MLC mechanical constraints. A subsequent master problem is then solved to determine the dose rate at all currently generated apertures by minimizing the cost function. When all 180 beam angles are occupied, the optimization completes, yielding a set of deliverable apertures and associated dose rates that produce a high quality plan. Results: The algorithm was preliminarily tested on five prostate and five head-and-neck clinical cases, each with one full gantry rotation without any couch/collimator rotations. High quality VMAT plans were generated for all ten cases with extremely high efficiency: it takes only 5-8 min on CPU (MATLAB code on an Intel Xeon 2.27 GHz CPU) and 18-31 s on GPU (CUDA code on an NVIDIA Tesla C1060 GPU card) to generate such plans. Conclusions: The authors have developed an aperture-based VMAT optimization algorithm which can generate clinically deliverable high quality treatment plans at very high efficiency.
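The column-generation structure of the optimization can be sketched on a toy problem (illustrative only: the real pricing subproblem enforces MLC mechanical constraints and the master problem uses the paper's two-term cost function, neither of which is modelled here). Each candidate aperture contributes a dose column; the pricing step picks the most improving unused column, and the restricted master problem re-fits nonnegative dose rates.

```python
import numpy as np
from scipy.optimize import nnls

def column_generation(D, target, n_rounds):
    # Toy column-generation loop: D[:, j] is the dose per unit rate of
    # candidate aperture j; minimise ||D_S x - target||^2 over nonnegative
    # rates x, growing the active set S by one aperture per round.
    active = []
    rates = np.zeros(0)
    residual = target.copy()
    for _ in range(n_rounds):
        # Pricing step: the aperture whose dose column best matches the
        # current residual gives the steepest decrease of the objective.
        scores = D.T @ residual
        scores[active] = -np.inf
        j = int(np.argmax(scores))
        if scores[j] <= 0:
            break  # no improving column remains
        active.append(j)
        # Restricted master problem: nonnegative rates for active apertures.
        rates, _ = nnls(D[:, active], target)
        residual = target - D[:, active] @ rates
    return active, rates, residual

rng = np.random.default_rng(1)
D = rng.random((50, 30))                               # 50 voxels, 30 candidate apertures
target = D @ np.maximum(rng.normal(size=30), 0.0)      # a reachable prescription
active, rates, res = column_generation(D, target, 10)
```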
Single-electron random-number generator (RNG) for highly secure ubiquitous computing applications
NASA Astrophysics Data System (ADS)
Uchida, Ken; Tanamoto, Tetsufumi; Fujita, Shinobu
2007-11-01
Since the security of all modern cryptographic techniques relies on unpredictable and irreproducible digital keys generated by random-number generators (RNGs), the realization of a high-quality RNG is essential for secure communications. In this report, a new RNG that utilizes single-electron phenomena is proposed. A room-temperature silicon single-electron transistor (SET) with a nearby electron pocket is used as a high-quality, ultra-small RNG. In the proposed RNG, stochastic single-electron capture/emission processes to/from the electron pocket are detected with high sensitivity by the SET and result in giant random telegraph signals (GRTS) on the SET current. It is experimentally demonstrated that the single-electron RNG generates extremely high-quality random digital sequences at room temperature, in spite of its simple configuration. Because of its small size and low power consumption, the single-electron RNG is promising as a key nanoelectronic device for future ubiquitous computing systems with highly secure mobile communication capabilities.
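The post-processing idea of turning a two-level random telegraph signal into unbiased bits can be sketched as follows (a generic illustration, not the authors' circuit; slow-clock sampling and von Neumann debiasing are standard techniques assumed here, and all signal parameters are synthetic):

```python
import numpy as np

def sample_bits(signal, threshold, clock):
    # Sample the two-level signal on a slow clock (much longer than the
    # mean dwell time) so successive samples are nearly independent.
    return (signal[::clock] > threshold).astype(np.uint8)

def von_neumann(bits):
    # Classic von Neumann extractor: 01 -> 0, 10 -> 1, discard 00/11.
    # Removes bias from independent but possibly unfair raw bits.
    pairs = bits[: len(bits) // 2 * 2].reshape(-1, 2)
    keep = pairs[:, 0] != pairs[:, 1]
    return pairs[keep, 0]

# Synthetic random telegraph signal: exponentially distributed dwell times
# between two current levels, plus measurement noise.
rng = np.random.default_rng(0)
dwells = rng.exponential(scale=20, size=2000).astype(int) + 1
trace, level = [], 0
for d in dwells:
    trace.extend([level] * d)
    level ^= 1
trace = np.asarray(trace, float) + rng.normal(0.0, 0.1, size=len(trace))

raw = sample_bits(trace, threshold=0.5, clock=100)
bits = von_neumann(raw)
```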
Verification and Validation in a Rapid Software Development Process
NASA Technical Reports Server (NTRS)
Callahan, John R.; Easterbrook, Steve M.
1997-01-01
The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in their use in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing of the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.
Mixed Element Type Unstructured Grid Generation for Viscous Flow Applications
NASA Technical Reports Server (NTRS)
Marcum, David L.; Gaither, J. Adam
2000-01-01
A procedure is presented for efficient generation of high-quality unstructured grids suitable for CFD simulation of high Reynolds number viscous flow fields. Layers of anisotropic elements are generated by advancing along prescribed normals from solid boundaries. The points are generated such that either pentahedral or tetrahedral elements with an implied connectivity can be directly recovered. As points are generated they are temporarily attached to a volume triangulation of the boundary points. This triangulation allows efficient local search algorithms to be used when checking merging layers. The existing advancing-front/local-reconnection procedure is used to generate isotropic elements outside of the anisotropic region. Results are presented for a variety of applications. The results demonstrate that high-quality anisotropic unstructured grids can be efficiently and consistently generated for complex configurations.
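The advancing-normals step can be sketched with a geometric-growth spacing rule (a common boundary-layer meshing recipe; the initial height, growth ratio and isotropic size below are illustrative, not values from the paper):

```python
def layer_heights(h0, ratio, isotropic_size):
    # Cumulative normal offsets for anisotropic boundary layers, growing
    # geometrically from the wall. Marching stops once a single layer's
    # height reaches the isotropic field size, where the advancing-front
    # isotropic mesher takes over.
    offsets, h, total = [], h0, 0.0
    while h < isotropic_size:
        total += h
        offsets.append(total)
        h *= ratio
    return offsets

offsets = layer_heights(h0=1e-3, ratio=1.2, isotropic_size=0.1)
```

The geometric ratio keeps adjacent layer heights smoothly graded, which is what allows the anisotropic region to blend into the isotropic advancing-front mesh.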
Lucyk, Kelsey; Tang, Karen; Quan, Hude
2017-11-22
Administrative health data are increasingly used for research and surveillance to inform decision-making because of their large sample sizes, geographic coverage, comprehensiveness, and possibility for longitudinal follow-up. Within Canadian provinces, individuals are assigned unique personal health numbers that allow for linkage of administrative health records in that jurisdiction. It is therefore necessary to ensure that these data are of high quality, and that chart information is accurately coded. Our objective is to explore the potential barriers to high-quality data coding through qualitative inquiry into the roles and responsibilities of medical chart coders. We conducted semi-structured interviews with 28 medical chart coders from Alberta, Canada. We used thematic analysis and open-coded each transcript to understand the process of administrative health data generation and identify barriers to its quality. The process of generating administrative health data is highly complex and involves a diverse workforce. As such, there are multiple points in this process that introduce challenges for high-quality data. For coders, the main barriers to data quality occurred around chart documentation, variability in the interpretation of chart information, and high quota expectations. This study illustrates the complex nature of barriers to high-quality coding in the context of administrative data generation. The findings from this study may be of use to data users, researchers, and decision-makers who wish to better understand the limitations of their data or pursue interventions to improve data quality.
USDA-ARS?s Scientific Manuscript database
Using next-generation-sequencing technology to assess entire transcriptomes requires high quality starting RNA. Currently, RNA quality is routinely judged using automated microfluidic gel electrophoresis platforms and associated algorithms. Here we report that such automated methods generate false-n...
NASA Astrophysics Data System (ADS)
Sadat Hashemi, Somayeh; Ghavami Sabouri, Saeed; Khorsandi, Alireza
2018-04-01
We present a theoretical model to study the effect of a thermally loaded crystal on the quality of a second-harmonic (SH) beam generated in a high-power pumping regime. The model is based on a particular oven structure for the MgO:PPsLT nonlinear crystal that compensates for the thermal de-phasing effect which, as the pumping power reaches 50 W, degrades the conversion efficiency and beam quality of the interacting beams. The quality of the fundamental beam is included in the model to investigate its final effect on the beam quality of the generated SH beam. Beam quality is subsequently simulated using a Hermite-Gaussian modal decomposition approach for a range of fundamental beam qualities varying from 1 to 3 and for different levels of input power. To provide a meaningful comparison, the numerical simulation is correlated with real data from a high-power SH generation (SHG) experimental device. It is found that when using the open-top oven scheme and fixing the fundamental M²-factor at nearly 1, for input powers ranging from 15 to 30 W, the M²-factor of the SHG beam is degraded by 9% to 24%, respectively, confirming very good consistency with the reported experimental results.
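For an incoherent superposition of 1-D Hermite-Gaussian modes, the beam quality factor follows directly from the mode powers via the standard relation M² = Σₙ pₙ(2n + 1), which is the quantity a modal-decomposition analysis estimates. A minimal sketch (the mode-power vectors below are illustrative):

```python
import numpy as np

def m_squared(mode_powers):
    # Beam-quality factor of an incoherent superposition of 1-D
    # Hermite-Gaussian modes: M^2 = sum_n p_n * (2n + 1), with the
    # mode powers p_n normalised to unit total power.
    p = np.asarray(mode_powers, float)
    p = p / p.sum()
    n = np.arange(len(p))
    return float(np.sum(p * (2 * n + 1)))
```

A pure TEM00 beam gives M² = 1; moving half the power into the n = 2 mode raises it to 3, matching the 1-to-3 range of fundamental beam qualities considered in the paper.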
Meyer, Mathias; Haubenreisser, Holger; Raupach, Rainer; Schmidt, Bernhard; Lietzmann, Florian; Leidecker, Christianne; Allmendinger, Thomas; Flohr, Thomas; Schad, Lothar R; Schoenberg, Stefan O; Henzler, Thomas
2015-01-01
To prospectively evaluate radiation dose and image quality of a third-generation dual-source CT (DSCT) without a z-axis filter behind the patient for temporal bone CT. Forty-five patients were examined on either a first-, second-, or third-generation DSCT in an ultra-high-resolution (UHR) temporal bone-imaging mode. On the third-generation DSCT system, the tighter focal spot of 0.2 mm² removes the necessity for an additional z-axis filter, leading to improved z-axis radiation dose efficiency. Images of 0.4 mm were reconstructed using a standard filtered back-projection or iterative reconstruction (IR) technique for previous generations of DSCT and a novel IR algorithm for the third-generation DSCT. Radiation dose and image quality were compared between the three DSCT systems. The statistically significantly highest subjective and objective image quality was found for the third-generation DSCT when compared to the first- or second-generation DSCT systems (all p < 0.05). Total effective dose was 63%/39% lower for the third-generation examination as compared to the first- and second-generation DSCT. Temporal bone imaging without a z-axis UHR filter and with a novel third-generation IR algorithm allows for significantly higher image quality while lowering effective dose when compared to the first two generations of DSCTs. • Omitting the z-axis filter allows a reduction in radiation dose of 50% • A smaller focal spot of 0.2 mm² significantly improves spatial resolution • Ultra-high-resolution temporal-bone CT helps to gain diagnostic information of the middle/inner ear.
2016-09-01
TECHNICAL REPORT 3046, September 2016. GENERATION OF QUALITY PULSES FOR CONTROL OF QUBIT/QUANTUM MEMORY SPIN STATES: EXPERIMENTAL AND SIMULATION... nuclear spin states of qubits/quantum memory applicable to semiconductor, superconductor, ionic, and superconductor-ionic hybrid technologies. As the... pulse quality and need for development of single pulses with very high quality will impact directly the coherence time of the qubit/memory, we present
Multi-Resolution Unstructured Grid-Generation for Geophysical Applications on the Sphere
NASA Technical Reports Server (NTRS)
Engwirda, Darren
2015-01-01
An algorithm for the generation of non-uniform unstructured grids on ellipsoidal geometries is described. This technique is designed to generate high quality triangular and polygonal meshes appropriate for general circulation modelling on the sphere, including applications to atmospheric and ocean simulation, and numerical weather prediction. Using a recently developed Frontal-Delaunay-refinement technique, a method for the construction of high-quality unstructured ellipsoidal Delaunay triangulations is introduced. A dual polygonal grid, derived from the associated Voronoi diagram, is also optionally generated as a by-product. Compared to existing techniques, it is shown that the Frontal-Delaunay approach typically produces grids with near-optimal element quality and smooth grading characteristics, while imposing relatively low computational expense. Initial results are presented for a selection of uniform and non-uniform ellipsoidal grids appropriate for large-scale geophysical applications. The use of user-defined mesh-sizing functions to generate smoothly graded, non-uniform grids is discussed.
A high-throughput Sanger strategy for human mitochondrial genome sequencing
2013-01-01
Background A population reference database of complete human mitochondrial genome (mtGenome) sequences is needed to enable the use of mitochondrial DNA (mtDNA) coding region data in forensic casework applications. However, the development of entire mtGenome haplotypes to forensic data quality standards is difficult and laborious. A Sanger-based amplification and sequencing strategy that is designed for automated processing, yet routinely produces high quality sequences, is needed to facilitate high-volume production of these mtGenome data sets. Results We developed a robust 8-amplicon Sanger sequencing strategy that regularly produces complete, forensic-quality mtGenome haplotypes in the first pass of data generation. The protocol works equally well on samples representing diverse mtDNA haplogroups and DNA input quantities ranging from 50 pg to 1 ng, and can be applied to specimens of varying DNA quality. The complete workflow was specifically designed for implementation on robotic instrumentation, which increases throughput and reduces both the opportunities for error inherent to manual processing and the cost of generating full mtGenome sequences. Conclusions The described strategy will assist efforts to generate complete mtGenome haplotypes which meet the highest data quality expectations for forensic genetic and other applications. Additionally, high-quality data produced using this protocol can be used to assess mtDNA data developed using newer technologies and chemistries. Further, the amplification strategy can be used to enrich for mtDNA as a first step in sample preparation for targeted next-generation sequencing. PMID:24341507
SUGAR: graphical user interface-based data refiner for high-throughput DNA sequencing.
Sato, Yukuto; Kojima, Kaname; Nariai, Naoki; Yamaguchi-Kabata, Yumi; Kawai, Yosuke; Takahashi, Mamoru; Mimori, Takahiro; Nagasaki, Masao
2014-08-08
Next-generation sequencers (NGSs) have become one of the main tools of current biology. To obtain useful insights from NGS data, it is essential to control low-quality portions of the data affected by technical errors such as air bubbles in the sequencing fluidics. We developed SUGAR (subtile-based GUI-assisted refiner), software that can handle ultra-high-throughput data through a user-friendly graphical user interface (GUI) with interactive analysis capability. SUGAR generates high-resolution quality heatmaps of the flowcell, enabling users to find possible signatures of technical errors during the sequencing run. The sequencing data generated from the error-affected regions of a flowcell can be selectively removed by automated analysis or GUI-assisted operations implemented in SUGAR. The automated data-cleaning function, based on sequence read quality (Phred) scores, was applied to public whole-human-genome sequencing data, and we show that the overall mapping quality was improved. The detailed data evaluation and cleaning enabled by SUGAR can reduce technical problems in sequence read mapping, improving subsequent variant analyses that require high-quality sequence data and mapping results. The software will therefore be especially useful for controlling the quality of variant calls for low-frequency cell populations, e.g., cancers, in samples affected by technical errors in the sequencing procedure.
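The Phred-score-based cleaning step can be sketched in miniature (a stand-in for SUGAR's pipeline, not its implementation; FASTQ Phred+33 encoding and a mean-quality cutoff of 30 are assumptions for illustration):

```python
def mean_phred(quality_string, offset=33):
    # Mean Phred score of a FASTQ quality string (Phred+33 encoding:
    # ASCII code minus 33 gives the per-base quality).
    return sum(ord(c) - offset for c in quality_string) / len(quality_string)

def filter_reads(reads, min_mean_q=30):
    # Keep (sequence, quality) pairs whose mean Phred score passes the cutoff.
    return [(s, q) for s, q in reads if mean_phred(q) >= min_mean_q]

# 'I' encodes Q40 (high confidence), '!' encodes Q0 (no confidence).
reads = [("ACGT", "IIII"), ("ACGT", "!!!!")]
good = filter_reads(reads)
```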
ERIC Educational Resources Information Center
Oppenheim, Jerrold; MacGregor, Theo
Noting that high-quality preschool increases the ability of low-income children to profit from elementary and secondary education, thereby increasing their high school graduation rate and generating economic and other returns for taxpayers, this report articulates and analyzes the economic benefits of providing a high-quality preschool education…
NASA Astrophysics Data System (ADS)
Tukhareli, V. D.; Tukhareli, A. V.; Cherednichenko, T. F.
2017-11-01
The creation of composite materials for structural elements with desired properties has always been, and remains, a relevant task. The basis of modern concrete technology is the creation of a high-quality artificial stone characterized by low defectiveness and structural stability. Improving the quality of concrete compositions can be achieved by using chemical admixtures derived from local raw materials, a promising direction in modern materials science for creating a new generation of concretes. The new-generation concretes are high-tech, high-quality, multicomponent concrete mixes and compositions with admixtures that preserve the required properties in service under all operating conditions. The growing compositional complexity of concrete, with systemic effects that allow structure formation to be controlled at all stages of the technology, ensures composites of "directional" quality, composition, structure and properties. The possibility of using the organic fraction from oil refining as a multifunctional hydrophobic-plasticizing admixture in effective cement concrete is examined.
Micalastic high-voltage insulation: Design features and experience
NASA Astrophysics Data System (ADS)
Wichmann, A.
1981-12-01
High-quality mica, carefully selected epoxy resins and a well-matched vacuum/pressure impregnation process determine the characteristics of the MICALASTIC insulation for large turbine-generators. Systematic development and in-process manufacturing quality control have led to an insulation system of high quality and operating reliability. The first turbine-generator winding to be impregnated and cured under vacuum with solvent-free synthetic resin, in 1958, was designed for a rated voltage of 10.5 kV. Ever since, Siemens AG and Kraftwerk Union AG have used this type of insulation for all direct-cooled windings and also for an increasing number of indirect-cooled windings. At present, 240 turbine-generators with a total output of more than 115,000 MVA have been built. Since 1960, this insulation system has been registered for Siemens AG under the trade name MICALASTIC. The stator windings of the largest single-shaft generators to date, rated 1560 MVA at 27 kV, have been built with MICALASTIC insulation.
ERIC Educational Resources Information Center
Hrin, Tamara; Milenkovic, Dušica; Segedinac, Mirjana
2018-01-01
The importance of well-elaborated cognitive structures in a science knowledge domain has been noted in many studies. Therefore, the main aim of this particular study was to employ a new diagrammatic assessment approach, students' generated systemic synthesis questions (SSynQs), to evaluate and compare the quality of high school students' and…
NASA Astrophysics Data System (ADS)
Ota, Junko; Umehara, Kensuke; Ishimaru, Naoki; Ohno, Shunsuke; Okamoto, Kentaro; Suzuki, Takanori; Shirai, Naoki; Ishida, Takayuki
2017-02-01
As the capability of high-resolution displays grows, high-resolution images are often required in Computed Tomography (CT). However, acquiring high-resolution images takes a higher radiation dose and a longer scanning time. In this study, we applied the Sparse-coding-based Super-Resolution (ScSR) method to generate high-resolution images without increasing the radiation dose. We prepared an over-complete dictionary that learned the mapping between low- and high-resolution patches and sought a sparse representation of each patch of the low-resolution input. These coefficients were used to generate the high-resolution output. For evaluation, 44 CT cases were used as the test dataset. We up-sampled images by a factor of 2 or 4 and compared the image quality of the ScSR scheme with that of bilinear and bicubic interpolation, the traditional interpolation schemes. We also compared the image quality obtained with three learning datasets: a total of 45 CT images, 91 non-medical images, and 93 chest radiographs, respectively, were used for dictionary preparation. Image quality was evaluated by measuring peak signal-to-noise ratio (PSNR) and structural similarity (SSIM). The differences in PSNR and SSIM between the ScSR method and the interpolation methods were statistically significant. Visual assessment confirmed that the ScSR method generated high-resolution images with sharpness, whereas the conventional interpolation methods generated over-smoothed images. Comparing the three training datasets, there was no significant difference among the CT, chest radiograph, and non-medical datasets. These results suggest that ScSR provides a robust approach for up-sampling CT images and yields substantially higher image quality for the enlarged images.
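The PSNR metric used for evaluation is straightforward to reproduce (a generic implementation; the 8-bit peak value and the toy images are assumptions for illustration):

```python
import numpy as np

def psnr(reference, estimate, peak=255.0):
    # Peak signal-to-noise ratio in dB (higher is better); `peak` assumes
    # 8-bit image data.
    mse = np.mean((np.asarray(reference, float) - np.asarray(estimate, float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

reference = np.zeros((8, 8))
estimate = reference + 16.0        # uniform error of 16 grey levels
val = psnr(reference, estimate)    # about 24 dB
```

An over-smoothed interpolation and a sharp super-resolved image can have similar PSNR, which is why the study pairs it with SSIM and visual assessment.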
Khomtchouk, Bohdan B; Van Booven, Derek J; Wahlestedt, Claes
2014-01-01
The graphical visualization of gene expression data using heatmaps has become an integral component of modern-day medical research. Heatmaps are used extensively to plot quantitative differences in gene expression levels, such as those measured with RNA-seq and microarray experiments, to provide qualitative large-scale views of the transcriptomic landscape. Creating high-quality heatmaps is a computationally intensive task, often requiring considerable programming experience, particularly for customizing features to a specific dataset at hand. The software to create publication-quality heatmaps is developed with the R programming language, the C++ programming language, and the OpenGL application programming interface (API) to create industry-grade high-performance graphics. We create a graphical user interface (GUI) software package called HeatmapGenerator for Windows OS and Mac OS X as an intuitive, user-friendly alternative for researchers with minimal prior coding experience, allowing them to create publication-quality heatmaps using R graphics without sacrificing their desired level of customization. HeatmapGenerator requires only that the user upload a preformatted input file and download the publicly available R software language, among a few other operating-system-specific requirements. Advanced features such as color, text labels, scaling, legend construction, and even database storage can be easily customized with no prior programming knowledge. We provide an intuitive and user-friendly software package, HeatmapGenerator, to create high-quality, customizable heatmaps generated using the high-resolution color graphics capabilities of R. The software is available for Microsoft Windows and Apple Mac OS X. HeatmapGenerator is released under the GNU General Public License and publicly available at: http://sourceforge.net/projects/heatmapgenerator/.
The Mac OS X direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_MAC_OSX.tar.gz/download. The Windows OS direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_WINDOWS.zip/download.
3D conditional generative adversarial networks for high-quality PET image estimation at low dose.
Wang, Yan; Yu, Biting; Wang, Lei; Zu, Chen; Lalush, David S; Lin, Weili; Wu, Xi; Zhou, Jiliu; Shen, Dinggang; Zhou, Luping
2018-07-01
Positron emission tomography (PET) is a widely used imaging modality, providing insight into both the biochemical and physiological processes of the human body. Usually, a full-dose radioactive tracer is required to obtain high-quality PET images for clinical needs. This inevitably raises concerns about potential health hazards. On the other hand, dose reduction may increase the noise in the reconstructed PET images, which impacts the image quality to a certain extent. In this paper, in order to reduce radiation exposure while maintaining high PET image quality, we propose a novel method based on 3D conditional generative adversarial networks (3D c-GANs) to estimate high-quality full-dose PET images from low-dose ones. Generative adversarial networks (GANs) include a generator network and a discriminator network which are trained simultaneously, with the goal of one beating the other. Similar to GANs, in the proposed 3D c-GANs we condition the model on an input low-dose PET image and generate a corresponding output full-dose PET image. Specifically, to capture the shared underlying information between the low-dose and full-dose PET images, a 3D U-net-like deep architecture which can combine hierarchical features via skip connections is designed as the generator network to synthesize the full-dose image. In order to guarantee that the synthesized PET image is close to the real one, we take the estimation-error loss into account in addition to the discriminator feedback when training the generator network. Furthermore, a concatenated 3D c-GANs based progressive refinement scheme is also proposed to further improve the quality of the estimated images. Validation was done on a real human brain dataset including both normal subjects and subjects diagnosed with mild cognitive impairment (MCI).
Experimental results show that our proposed 3D c-GANs method outperforms the benchmark methods and achieves much better performance than the state-of-the-art methods in both qualitative and quantitative measures. Copyright © 2018 Elsevier Inc. All rights reserved.
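The generator objective described above, adversarial feedback plus an estimation-error term, can be sketched numerically (illustrative only: the weighting `lam` and the exact loss form are assumptions, and the real model is a 3D U-net trained on PET volumes):

```python
import numpy as np

def generator_loss(d_fake, synthesized, real, lam=10.0):
    # Generator objective: adversarial term -log D(G(x)) from the
    # discriminator feedback, plus a weighted L1 estimation-error term
    # keeping the synthesized full-dose image close to the real one.
    adv = -np.mean(np.log(d_fake + 1e-12))
    l1 = np.mean(np.abs(synthesized - real))
    return adv + lam * l1
```

When the discriminator is fully fooled (D = 1) and the synthesized image matches the real one, the loss vanishes; any residual estimation error or discriminator confidence pushes it up, which is what drives the generator toward realistic and accurate full-dose estimates.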
NASA Astrophysics Data System (ADS)
Liu, Qiong; Wang, Wen-xi; Zhu, Ke-ren; Zhang, Chao-yong; Rao, Yun-qing
2014-11-01
Mixed-model assembly line sequencing is significant in reducing the production time and overall cost of production. To improve production efficiency, a mathematical model that simultaneously minimizes overtime, idle time and total set-up costs is developed. To obtain high-quality and stable solutions, an advanced scatter search approach is proposed. In the proposed algorithm, a new diversification generation method based on a genetic algorithm is presented to generate a set of diverse, high-quality initial solutions. Several methods, including reference-set update, subset generation, solution combination and improvement methods, are designed to maintain the diversity of the population and to obtain high-quality solutions. The proposed model and algorithm are applied and validated in a case company. The results indicate that the proposed advanced scatter search approach is effective for mixed-model assembly line sequencing in this company.
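The scatter-search skeleton (diversification, reference-set update, subset generation, combination, improvement) can be sketched on a toy continuous problem (far simpler than the paper's GA-based diversification and discrete sequencing model; all parameters are illustrative):

```python
import numpy as np

def scatter_search(f, dim, iters=30, pool=20, ref=5, seed=0):
    # Minimal scatter-search skeleton minimising f over R^dim.
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5, 5, size=(pool, dim))             # diversification
    for _ in range(iters):
        # Reference-set update: keep the `ref` best solutions.
        pop = pop[np.argsort([f(x) for x in pop])][:ref]
        # Subset generation + combination: midpoints of all reference pairs.
        kids = [(pop[i] + pop[j]) / 2
                for i in range(ref) for j in range(i + 1, ref)]
        # Improvement: jitter combined solutions to keep exploring.
        kids = [k - 0.1 * rng.normal(size=dim) for k in kids]
        pop = np.vstack([pop, kids])
    best = min(pop, key=f)
    return best, f(best)

best, val = scatter_search(lambda x: float(np.sum(x ** 2)), dim=3)
```

In the paper's setting the solutions are assembly sequences rather than vectors, so the combination and improvement operators work on permutations, but the control flow is the same.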
Darwin Assembly: fast, efficient, multi-site bespoke mutagenesis
Cozens, Christopher
2018-01-01
Engineering proteins for designer functions and biotechnological applications almost invariably requires (or at least benefits from) multiple mutations to non-contiguous residues. Several methods for multiple site-directed mutagenesis exist, but there remains a need for fast and simple methods to efficiently introduce such mutations – particularly for generating large, high quality libraries for directed evolution. Here, we present Darwin Assembly, which can deliver high quality libraries of >10^8 transformants, targeting multiple (>10) distal sites with minimal wild-type contamination (<0.25% of total population) and which takes a single working day from purified plasmid to library transformation. We demonstrate its efficacy with whole gene codon reassignment of chloramphenicol acetyl transferase, mutating 19 codons in a single reaction in KOD DNA polymerase and generating high quality, multiple-site libraries in T7 RNA polymerase and Tgo DNA polymerase. Darwin Assembly uses commercially available enzymes, can be readily automated, and offers a cost-effective route to highly complex and customizable library generation. PMID:29409059
Chirped-pulse coherent-OTDR with predistortion
NASA Astrophysics Data System (ADS)
Xiong, Ji; Jiang, Jialin; Wu, Yue; Chen, Yongxiang; Xie, Lianlian; Fu, Yun; Wang, Zinan
2018-03-01
In this paper, a novel method for generating high-quality chirped pulses with an IQ modulator is studied theoretically and experimentally; such pulses are a crucial building block for high-performance coherent optical time-domain reflectometry (COTDR). To compensate for the nonlinearity of the modulator transfer function, we present a predistortion technique for chirped-pulse COTDR (CP-COTDR): arcsin predistortion combined with single-sideband suppressed-carrier analog modulation is used to generate the high-quality chirped optical pulse. The high-order sidebands, caused by the large amplitude of the modulation signal and the nonlinear transfer function of the IQ modulator, are suppressed by the predistortion process, so both the power and the quality of the generated chirped pulse are improved. In the experiment, this method increases the peak power of the chirped pulse by 4.2 dB compared to the case without predistortion; for the CP-COTDR system, it increases the signal-to-noise ratio of the demodulated phase variation by 6.3 dB.
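The arcsin predistortion idea can be illustrated with an idealised sinusoidal modulator transfer function: driving the modulator with arcsin of the target waveform exactly inverts the sin-shaped response. This sketch assumes a normalised sin() response and ignores the single-sideband modulation stage:

```python
import numpy as np

def modulator_out(v):
    """Idealised sinusoidal transfer function of an intensity modulator
    (amplitude response versus normalised drive voltage)."""
    return np.sin(v)

def predistort(target):
    """Arcsin predistortion: drive the modulator with arcsin(target) so
    the sinusoidal transfer is inverted and the output tracks the
    intended chirp waveform linearly (no extra harmonics)."""
    return np.arcsin(np.clip(target, -1.0, 1.0))

t = np.linspace(0.0, 1.0, 1000)
target = 0.9 * np.sin(2 * np.pi * (5 * t + 10 * t**2))  # toy chirped waveform
out_plain = modulator_out(target)             # distorted: harmonic content added
out_pre = modulator_out(predistort(target))   # linearised: matches the target
```

With large drive amplitudes the undistorted output deviates visibly from the target, which is exactly the sideband problem the predistortion removes.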
Air quality impacts of projections of natural gas-fired distributed generation
NASA Astrophysics Data System (ADS)
Horne, Jeremy R.; Carreras-Sospedra, Marc; Dabdub, Donald; Lemar, Paul; Nopmongcol, Uarporn; Shah, Tejas; Yarwood, Greg; Young, David; Shaw, Stephanie L.; Knipping, Eladio M.
2017-11-01
This study assesses the potential impacts on emissions and air quality from the increased adoption of natural gas-fired distributed generation of electricity (DG), including displacement of power from central power generation, in the contiguous United States. The study includes four major tasks: (1) modeling of distributed generation market penetration; (2) modeling of central power generation systems; (3) modeling of spatially and temporally resolved emissions; and (4) photochemical grid modeling to evaluate the potential air quality impacts of increased DG penetration, which includes both power-only DG and combined heat and power (CHP) units, for 2030. Low and high DG penetration scenarios estimate the largest penetration of future DG units in three regions - New England, New York, and California. Projections of DG penetration in the contiguous United States estimate 6.3 GW and 24 GW of market adoption in 2030 for the low DG penetration and high DG penetration scenarios, respectively. High DG penetration (all of which is natural gas-fired) serves to offset 8 GW of new natural gas combined cycle (NGCC) units, and 19 GW of solar photovoltaic (PV) installations by 2030. In all scenarios, air quality in the central United States and the northwest remains unaffected as there is little to no DG penetration in those states. California and several states in the northeast are the most impacted by emissions from DG units. Peak increases in maximum daily 8-h average ozone concentrations exceed 5 ppb, which may impede attainment of ambient air quality standards. Overall, air quality impacts from DG vary greatly based on meteorological conditions, proximity to emissions sources, the number and type of DG installations, and the emissions factors used for DG units.
Automatic structured grid generation using Gridgen (some restrictions apply)
NASA Technical Reports Server (NTRS)
Chawner, John R.; Steinbrenner, John P.
1995-01-01
The authors have noticed in the recent grid generation literature an emphasis on the automation of structured grid generation. The motivation behind such work is clear; grid generation is easily the most despised task in the grid-analyze-visualize triad of computational analysis (CA). However, because grid generation is closely coupled to both the design and analysis software and because quantitative measures of grid quality are lacking, 'push button' grid generation usually results in a compromise between speed, control, and quality. Overt emphasis on automation obscures the substantive issues of providing users with flexible tools for generating and modifying high quality grids in a design environment. In support of this paper's tongue-in-cheek title, many features of the Gridgen software are described. Gridgen is by no stretch of the imagination an automatic grid generator. Despite this fact, the code does utilize many automation techniques that permit interesting regenerative features.
NASA Astrophysics Data System (ADS)
Shaw, Amelia R.; Smith Sawyer, Heather; LeBoeuf, Eugene J.; McDonald, Mark P.; Hadjerioua, Boualem
2017-11-01
Hydropower operations optimization subject to environmental constraints is limited by challenges associated with dimensionality and spatial and temporal resolution. The need for high-fidelity hydrodynamic and water quality models within optimization schemes is driven by improved computational capabilities, increased requirements to meet specific points of compliance with greater resolution, and the need to optimize operations of not just single reservoirs but systems of reservoirs. This study describes an important advancement for computing hourly power generation schemes for a hydropower reservoir using high-fidelity models, surrogate modeling techniques, and optimization methods. The predictive power of the high-fidelity hydrodynamic and water quality model CE-QUAL-W2 is successfully emulated by an artificial neural network, then integrated into a genetic algorithm optimization approach to maximize hydropower generation subject to constraints on dam operations and water quality. This methodology is applied to a multipurpose reservoir near Nashville, Tennessee, USA. The model successfully reproduced high-fidelity reservoir information while enabling 6.8% and 6.6% increases in hydropower production value relative to actual operations for dissolved oxygen (DO) limits of 5 and 6 mg/L, respectively, while witnessing an expected decrease in power generation at more restrictive DO constraints. Exploration of simultaneous temperature and DO constraints revealed capability to address multiple water quality constraints at specified locations. The reduced computational requirements of the new modeling approach demonstrated an ability to provide decision support for reservoir operations scheduling while maintaining high-fidelity hydrodynamic and water quality information as part of the optimization decision support routines.
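The surrogate-plus-genetic-algorithm structure described above can be caricatured in a few lines. The quadratic "surrogate" functions and the penalty weight below are invented stand-ins for the ANN emulator of CE-QUAL-W2, purely to show the constrained-maximisation loop:

```python
import random

def surrogate_power(q):   # stand-in for the emulator's power output (MW)
    return 12.0 * q - 0.05 * q * q

def surrogate_do(q):      # stand-in DO prediction (mg/L): falls as release grows
    return 9.0 - 0.04 * q

def fitness(q, do_limit):
    """Penalised objective: hydropower value minus a heavy penalty
    whenever the predicted DO falls below the compliance limit."""
    penalty = 1e4 * max(0.0, do_limit - surrogate_do(q))
    return surrogate_power(q) - penalty

def ga_maximise(do_limit, pop=30, gens=60, seed=1):
    """Tiny real-coded GA over a single release variable q in [0, 120]."""
    rng = random.Random(seed)
    xs = [rng.uniform(0.0, 120.0) for _ in range(pop)]
    for _ in range(gens):
        xs.sort(key=lambda q: fitness(q, do_limit), reverse=True)
        elite = xs[:pop // 2]
        # blend crossover of two random elites, plus Gaussian mutation
        xs = elite + [min(120.0, max(0.0,
                     (rng.choice(elite) + rng.choice(elite)) / 2 + rng.gauss(0, 2)))
                     for _ in range(pop - len(elite))]
    return max(xs, key=lambda q: fitness(q, do_limit))

q5 = ga_maximise(do_limit=5.0)   # optimal release under a 5 mg/L DO floor
q6 = ga_maximise(do_limit=6.0)   # tighter limit, so less generation
```

Tightening the DO limit pushes the optimum to a smaller release and lower power, mirroring the decrease in generation the study reports at more restrictive DO constraints.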
Sharma, Davinder; Golla, Naresh; Singh, Dheer; Onteru, Suneel K
2018-03-01
The next-generation sequencing (NGS) based RNA sequencing (RNA-Seq) and transcriptome profiling offers an opportunity to unveil complex biological processes. Successful RNA-Seq and transcriptome profiling requires a large amount of high-quality RNA. However, NGS-quality RNA isolation is extremely difficult from recalcitrant adipose tissue (AT) with its high lipid content and low cell numbers. Further, the amount and biochemical composition of AT lipid vary between animal species, which can pose different degrees of resistance to RNA extraction. Currently available approaches may work effectively in one species but can be almost unproductive in another. Herein, we report a two-step protocol for the extraction of NGS-quality RNA from AT across a broad range of animal species. © 2017 Wiley Periodicals, Inc.
The World Meteorological Organization’s (WMO) Global Atmosphere Watch (GAW) Programme coordinates high-quality observations of atmospheric composition from global to local scales with the aim to drive high-quality and high-impact science while co-producing a new generation of pro...
Fast and accurate de novo genome assembly from long uncorrected reads
Vaser, Robert; Sović, Ivan; Nagarajan, Niranjan
2017-01-01
The assembly of long reads from Pacific Biosciences and Oxford Nanopore Technologies typically requires resource-intensive error-correction and consensus-generation steps to obtain high-quality assemblies. We show that the error-correction step can be omitted and that high-quality consensus sequences can be generated efficiently with a SIMD-accelerated, partial-order alignment–based, stand-alone consensus module called Racon. Based on tests with PacBio and Oxford Nanopore data sets, we show that Racon coupled with miniasm enables consensus genomes with similar or better quality than state-of-the-art methods while being an order of magnitude faster. PMID:28100585
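The claim that consensus alone (without a separate error-correction step) can recover a high-quality sequence can be shown with a toy column-wise majority vote. Racon's actual partial-order alignment graph handles unaligned, indel-rich reads and is far more capable than this sketch, which assumes the reads are already aligned:

```python
from collections import Counter

def consensus(aligned_reads):
    """Column-wise majority vote over already-aligned reads ('-' = gap).
    Illustrates how per-column agreement across noisy reads recovers
    the underlying sequence without prior error correction."""
    length = len(aligned_reads[0])
    out = []
    for i in range(length):
        counts = Counter(r[i] for r in aligned_reads)
        base, _ = counts.most_common(1)[0]
        if base != '-':          # drop columns where the majority is a gap
            out.append(base)
    return ''.join(out)

reads = [
    "ACGTAC-T",   # one deletion error
    "ACGAACGT",   # one substitution error
    "AC-TACGT",   # one deletion error
    "ACGTACGT",   # error-free read
]
seq = consensus(reads)   # majority at each column recovers "ACGTACGT"
```

Even though every read but one contains an error, no column's majority is wrong, so the consensus is exact.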
Schilling, Birgit; Gibson, Bradford W.; Hunter, Christie L.
2017-01-01
Data-independent acquisition is a powerful mass spectrometry technique that enables comprehensive MS and MS/MS analysis of all detectable species, providing an information rich data file that can be mined deeply. Here, we describe how to acquire high-quality SWATH® Acquisition data to be used for large quantitative proteomic studies. We specifically focus on using variable sized Q1 windows for acquisition of MS/MS data for generating higher specificity quantitative data. PMID:28188533
NASA Astrophysics Data System (ADS)
Kirmani, Sheeraz; Kumar, Brijesh
2018-01-01
“Electric Power Quality (EPQ) is a term that refers to maintaining the near-sinusoidal waveform of power distribution bus voltages and currents at rated magnitude and frequency.” Today, customers are more aware of the seriousness of power quality issues, which prompts utilities to assure good quality of power to their customers. Power quality is fundamentally customer-centric. Increased focus by utilities on maintaining a reliable power supply through power quality improvement tools has considerably reduced power outages and blackouts. Good power quality is the characteristic of a reliable power supply. Low power factor, harmonic pollution, load imbalance, and fast voltage variations are some common parameters used to define power quality. If power quality issues are not checked, i.e., if the parameters that define power quality do not fall within the predefined standards, the result is high electricity bills, high running costs in industry, malfunctioning equipment, and challenges in connecting renewables. Capacitor banks, FACTS devices, harmonic filters, SVCs (static VAR compensators), and STATCOMs (static synchronous compensators) are solutions for achieving power quality. The performance of wind turbine generators is affected by poor-quality power, and at the same time these wind power generating plants affect power quality negatively. This paper presents a STATCOM-BESS (battery energy storage system) configuration and studies its impact on power quality in a system consisting of a wind turbine generator, a nonlinear load, a hysteresis controller governing the STATCOM, and the grid. The model is simulated in MATLAB/Simulink. This scheme mitigates power quality issues, improves the voltage profile, and reduces harmonic distortion of the waveforms. The BESS levels out the imbalances in real power caused by the intermittent nature of wind power under varying wind speeds.
PMG: online generation of high-quality molecular pictures and storyboarded animations
Autin, Ludovic; Tufféry, Pierre
2007-01-01
The Protein Movie Generator (PMG) is an online service able to generate high-quality pictures and animations for which one can then define simple storyboards. The PMG can therefore efficiently illustrate concepts such as molecular motion or formation/dissociation of complexes. Emphasis is put on the simplicity of animation generation. Rendering is achieved using Dino coupled to POV-Ray. In order to produce highly informative images, the PMG includes capabilities of using different molecular representations at the same time to highlight particular molecular features. Moreover, sophisticated rendering concepts including scene definition, as well as modeling light and materials are available. The PMG accepts Protein Data Bank (PDB) files as input, which may include series of models or molecular dynamics trajectories and produces images or movies under various formats. PMG can be accessed at http://bioserv.rpbs.jussieu.fr/PMG.html. PMID:17478496
Wind resource quality affected by high levels of renewables
Diakov, Victor
2015-06-17
For solar photovoltaic (PV) and wind resources, the capacity factor is an important parameter describing the quality of the resource. As the share of variable renewable resources (such as PV and wind) on the electric system increases, so does curtailment (and the fraction of time when it cannot be avoided). At high levels of renewable generation, curtailment effectively changes the practical measure of resource quality from the capacity factor to the incremental capacity factor. The latter accounts only for generation during hours of no curtailment and is directly connected with the marginal capital cost of renewable generators for a given level of renewable generation during the year. Western U.S. wind generation is analyzed hourly for a system with 75% of annual generation from wind, and it is found that the value to the system of resources with equal capacity factors can vary by a factor of 2, which highlights the importance of using the incremental capacity factor instead. Finally, the effect is expected to be more pronounced in smaller geographic areas (or when transmission limitations are imposed) and less pronounced at lower levels of renewable energy in the system with less curtailment.
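The distinction between the conventional and incremental capacity factor reduces to a small calculation. The six-hour profiles below are invented to show how two resources with identical capacity factors can differ by exactly a factor of 2 once curtailed hours are excluded:

```python
def capacity_factor(gen_mw, cap_mw):
    """Conventional capacity factor: average output over nameplate."""
    return sum(gen_mw) / (cap_mw * len(gen_mw))

def incremental_capacity_factor(gen_mw, curtailed, cap_mw):
    """Incremental capacity factor: count generation only during hours
    with no curtailment; extra output in curtailed hours has no
    marginal value to the system."""
    useful = sum(g for g, c in zip(gen_mw, curtailed) if not c)
    return useful / (cap_mw * len(gen_mw))

# Two toy 6-hour wind profiles with the same conventional CF (0.5):
gen_a = [10, 10, 10, 10, 10, 10]   # flat output across all hours
gen_b = [20, 20, 20, 0, 0, 0]      # output bunched into the windy hours
# System-wide curtailment falls in the windiest hours:
curtail = [True, True, False, False, False, False]

cf_a = capacity_factor(gen_a, cap_mw=20)
cf_b = capacity_factor(gen_b, cap_mw=20)
icf_a = incremental_capacity_factor(gen_a, curtail, cap_mw=20)
icf_b = incremental_capacity_factor(gen_b, curtail, cap_mw=20)
```

Resource B loses most of its output to curtailed hours, so its incremental capacity factor is half of A's despite an identical conventional capacity factor.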
High-Throughput Fabrication of Quality Nanofibers Using a Modified Free Surface Electrospinning.
Shao, Zhongbiao; Yu, Liang; Xu, Lan; Wang, Mingdi
2017-12-01
Based on bubble electrospinning (BE), a modified free surface electrospinning (MFSE) using a cone-shaped air nozzle combined with a solution reservoir made of copper tubes was presented to increase the production of quality nanofibers. In the MFSE process, sodium dodecyl benzene sulfonate (SDBS) was added to the spinning solution to generate bubbles on the liquid surface. The effects of applied voltage and generated bubbles on the morphology and production of nanofibers were investigated experimentally and theoretically. The theoretical analysis of the electric field was in good agreement with the experimental data and showed that the quality and production of nanofibers improved as the applied voltage increased, while the generated bubbles decreased the quality and production of nanofibers.
Analysis of quality raw data of second generation sequencers with Quality Assessment Software.
Ramos, Rommel Tj; Carneiro, Adriana R; Baumbach, Jan; Azevedo, Vasco; Schneider, Maria Pc; Silva, Artur
2011-04-18
Second-generation technologies have advantages over Sanger sequencing; however, they have introduced new challenges for the genome construction process, especially because of the small size of the reads, despite the high degree of coverage. Independent of the program chosen for the construction process, DNA sequences are superimposed, based on identity, to extend the reads, generating contigs; mismatches indicate a lack of homology and are not included. This process improves our confidence in the sequences that are generated. We developed Quality Assessment Software, with which one can review graphs showing the distribution of quality values from the sequencing reads. This software allows us to adopt more stringent quality standards for sequence data, based on quality-graph analysis and estimated coverage after applying the quality filter, providing acceptable sequence coverage for genome construction from short reads. Quality filtering is a fundamental step in the process of constructing genomes, as it reduces the frequency of incorrect alignments caused by measurement errors, which, given the small read size, can provoke misassemblies during the construction process. Application of quality filters to sequence data using the Quality Assessment software, along with graph analyses, provided greater precision in the definition of cutoff parameters, which increased the accuracy of genome construction.
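A mean-quality read filter of the kind the abstract describes can be sketched in a few lines. The threshold and the Sanger FASTQ offset below are conventional choices, not values taken from the paper:

```python
def mean_phred(qual_string, offset=33):
    """Mean Phred score of a FASTQ quality string (Sanger offset 33:
    character '!' encodes Phred 0, 'I' encodes Phred 40)."""
    return sum(ord(c) - offset for c in qual_string) / len(qual_string)

def quality_filter(reads, threshold=20):
    """Keep only reads whose mean quality meets the cutoff, the kind of
    filtering step that quality-graph analysis helps calibrate."""
    return [(seq, q) for seq, q in reads if mean_phred(q) >= threshold]

reads = [
    ("ACGT", "IIII"),   # Phred 40 at every base: kept
    ("ACGT", "!!!!"),   # Phred 0 at every base: discarded as noise
    ("ACGT", "5555"),   # Phred 20 exactly: kept at the boundary
]
kept = quality_filter(reads, threshold=20)
```

Raising the threshold discards more reads, so the practical cutoff trades filtering stringency against the remaining coverage, which is exactly the balance the quality graphs are meant to inform.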
Harvey-Knowles, Jacquelyn; Faw, Meara H
2018-04-01
Cancer caregivers often experience significant challenges in their motivation and ability to comfort cancer survivors, particularly in a spousal or romantic context. Spousal cancer caregivers have been known to report even greater levels of burden and distress than cancer sufferers, yet still take on the role of acting as an informal caregiver so they can attend to their partner's needs. The current study tested whether a theoretical model of supportive outcomes, the dual-process model of supportive communication, explained variations in cancer caregivers' motivation and ability to create high-quality support messages. The study also tested whether participant engagement with reflective journaling on supportive acts was associated with increased motivation or ability to generate high-quality support messages. Based upon the dual-process model, we posited that, following supportive journaling tasks, caregivers of spouses currently managing a cancer experience would report greater motivation but also greater difficulty in generating high-quality support messages, while individuals caring for a patient in remission would report lower motivation but greater ability to create high-quality support messages. Findings provided support for these assertions and suggested that reflective journaling tasks might be a useful tool for improving remission caregivers' ability to provide high-quality social support to survivors. Corresponding theoretical and applied implications are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daum, Christopher; Zane, Matthew; Han, James
2011-01-31
The U.S. Department of Energy (DOE) Joint Genome Institute's (JGI) Production Sequencing group is committed to the generation of high-quality genomic DNA sequence to support the mission areas of renewable energy generation, global carbon management, and environmental characterization and clean-up. Within the JGI's Production Sequencing group, a robust Illumina Genome Analyzer and HiSeq pipeline has been established. Optimization of the sequencer pipelines has been ongoing with the aim of continual process improvement of the laboratory workflow, reducing operational costs and project cycle times to increase sample throughput, and improving the overall quality of the sequence generated. A sequence QC analysis pipeline has been implemented to automatically generate read- and assembly-level quality metrics. The foremost of these optimization projects, along with sequencing and operational strategies, throughput numbers, and sequencing quality results, will be presented.
NASA Technical Reports Server (NTRS)
Tabib-Azar, M.; Akinwande, D.; Ponchak, George E.; LeClair, S. R.
1999-01-01
In this article we report the design, fabrication, and characterization of very high quality factor 10 GHz microstrip resonators on high-resistivity (high-rho) silicon substrates. Our experiments show that an external quality factor of over 13 000 can be achieved on microstripline resonators on high-rho silicon substrates. Such a high Q factor enables integration of arrays of previously reported evanescent microwave probe (EMP) on silicon cantilever beams. We also demonstrate that electron-hole pair recombination and generation lifetimes of silicon can be conveniently measured by illuminating the resonator using a pulsed light. Alternatively, the EMP was also used to nondestructively monitor excess carrier generation and recombination process in a semiconductor placed near the two-dimensional resonator.
1984-09-01
Joint DOD versus Navy Specific Lead Generation Advertising: Comparison of Conversion Rates to Quality … Technical report, Fuqua School of Business, Durham, NC; R. C. Morey. Keywords: upper-mental, high school degree, enlistment contracts, national leads, joint DOD advertising, service-specific advertising, conversion rates.
Quantum Random Number Generation Using a Quanta Image Sensor
Amri, Emna; Felk, Yacine; Stucki, Damien; Ma, Jiaju; Fossum, Eric R.
2016-01-01
A new quantum random number generation method is proposed. The method is based on the randomness of the photon emission process and the single photon counting capability of the Quanta Image Sensor (QIS). It has the potential to generate high-quality random numbers with remarkable data output rate. In this paper, the principle of photon statistics and theory of entropy are discussed. Sample data were collected with QIS jot device, and its randomness quality was analyzed. The randomness assessment method and results are discussed. PMID:27367698
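The count-based extraction route can be sketched as follows: take the parity of each photon count as one output bit. Note the Poisson photon counts here are simulated with a pseudo-random generator, so this only illustrates the extraction step, not a true quantum entropy source, and all parameters are our own:

```python
import random

def photon_bits(n_bits, mean_photons=5.0, seed=42):
    """Toy count-parity extraction: simulate Poisson-distributed photon
    counts per exposure (pseudo-random stand-in for real photon shot
    noise) and keep the least significant bit of each count. For
    Poisson(5) the parity bias is about e^(-10)/2, i.e. negligible."""
    rng = random.Random(seed)
    bits = []
    while len(bits) < n_bits:
        # Poisson sampler: count unit-rate exponential arrivals in [0, mean)
        count, t = 0, rng.expovariate(1.0)
        while t < mean_photons:
            count += 1
            t += rng.expovariate(1.0)
        bits.append(count & 1)
    return bits

bits = photon_bits(10000)
ones = sum(bits)   # should hover near 5000 for (near-)unbiased parity
```

A real device would also apply randomness extraction and statistical test suites before using the bits, as the randomness assessment in the paper discusses.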
Complementary DNA libraries: an overview.
Ying, Shao-Yao
2004-07-01
The generation of complete, full-length cDNA libraries for potential functional assays of specific gene sequences is essential for most work in biotechnology and biomedical research. The field of cDNA library generation has changed rapidly in the past 10 years. This review presents an overview of the methods available for generating cDNA libraries, covering the definition of a cDNA library, the different kinds of cDNA libraries, the differences between conventional approaches and a novel strategy for cDNA library generation, and the quality of the resulting libraries. It is anticipated that the high-quality cDNA libraries so generated will facilitate studies involving gene chips and microarrays, differential display, subtractive hybridization, gene cloning, and peptide library generation.
High Density Aerial Image Matching: State-Of and Future Prospects
NASA Astrophysics Data System (ADS)
Haala, N.; Cavegn, S.
2016-06-01
Ongoing innovations in matching algorithms are continuously improving the quality of geometric surface representations generated automatically from aerial images. This development motivated the launch of the joint ISPRS/EuroSDR project "Benchmark on High Density Aerial Image Matching", which aims at the evaluation of photogrammetric 3D data capture in view of current developments in dense multi-view stereo image matching. Originally, the test focused on image-based DSM computation from conventional aerial image flights for different land-use and image block configurations. The second phase then put an additional focus on high-quality, high-resolution 3D geometric data capture in complex urban areas. This includes both the extension of the test scenario to oblique aerial image flights and the generation of filtered point clouds as an additional output of the respective multi-view reconstruction. The paper uses the preliminary outcomes of the benchmark to demonstrate the state of the art in airborne image matching, with a special focus on high-quality geometric data capture in urban scenarios.
Instructor's Corner: Tips for Publishing and Reviewing Qualitative Studies in Applied Disciplines
ERIC Educational Resources Information Center
Storberg-Walker, Julia
2012-01-01
This "Instructor's Corner" describes a step forward on the journey to write, review, and publish high-quality qualitative research manuscripts. This article examines two existing perspectives on generating high-quality qualitative manuscripts and then compares and contrasts the different elements of each. First, an overview of Rocco's (2010) eight…
Stevens, Adrienne L.; Wilczynski, Nancy L.; McKibbon, K. Ann; Haynes, R. Brian
2001-01-01
Objective: To identify a journal subset that publishes reports of high quality studies and reviews relating to age-specific clinical specialties, such as pediatrics and geriatrics. Design: Handsearch of 172 journals using explicit criteria to determine methodologic quality for generating evidence for clinical practice. Main outcome measure: Frequency of high quality articles and their top yielding journals. Results: Between 17% and 33% of articles published in age-specific specialties are of high quality for clinical use. The number of top-yielding journals for the specialties ranged from 16 to 130. Conclusion: Handsearch of the clinical literature for the year 2000 reveals that high quality articles for some age-specific specialties are concentrated in a small subset of journals (eg, obstetrics), whereas articles for other specialties are widely scattered among a large number of journals (eg, adult medicine).
User manual for two simple postscript output FORTRAN plotting routines
NASA Technical Reports Server (NTRS)
Nguyen, T. X.
1991-01-01
Graphics is one of the important tools in engineering analysis and design. However, plotting routines that generate output on high-quality laser printers normally come in graphics packages, which tend to be expensive and system dependent. These factors become important for small computer systems or desktop computers, especially when only a simple plotting routine is needed. With the PostScript language becoming popular, more and more PostScript laser printers are now available. Simple, versatile, low-cost plotting routines that can generate output on high-quality laser printers are needed, and standard FORTRAN plotting routines producing PostScript output are a logical choice. The purpose here is to explain two simple FORTRAN plotting routines that generate output in the PostScript language.
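The approach the manual describes, emitting plain-text PostScript operators from ordinary code, is easy to illustrate. This sketch is in Python rather than FORTRAN, and the routine name, page margins, and scaling are our own choices:

```python
def plot_to_postscript(xs, ys, path="plot.ps", width=400, height=300):
    """Write a minimal PostScript polyline plot: emit plain-text PS
    operators (moveto/lineto/stroke) so any PostScript printer or
    viewer renders the curve, no graphics package required."""
    xmin, xmax = min(xs), max(xs)
    ymin, ymax = min(ys), max(ys)

    def to_pt(x, y):   # scale data coordinates into the page box
        px = 50 + (x - xmin) / (xmax - xmin) * width
        py = 50 + (y - ymin) / (ymax - ymin) * height
        return px, py

    lines = ["%!PS-Adobe-3.0", "1 setlinewidth", "newpath"]
    x0, y0 = to_pt(xs[0], ys[0])
    lines.append(f"{x0:.2f} {y0:.2f} moveto")
    for x, y in zip(xs[1:], ys[1:]):
        px, py = to_pt(x, y)
        lines.append(f"{px:.2f} {py:.2f} lineto")
    lines += ["stroke", "showpage"]
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return path

plot_to_postscript([0, 1, 2, 3], [0, 1, 4, 9], path="parabola.ps")
```

Because the output is plain text, the same idea ports directly to FORTRAN WRITE statements, which is precisely what makes such routines portable and inexpensive.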
Automatic Generation of High Quality DSM Based on IRS-P5 Cartosat-1 Stereo Data
NASA Astrophysics Data System (ADS)
d'Angelo, Pablo; Uttenthaler, Andreas; Carl, Sebastian; Barner, Frithjof; Reinartz, Peter
2010-12-01
IRS-P5 Cartosat-1 high-resolution stereo satellite imagery is well suited for the creation of digital surface models (DSMs). A system for highly automated and operational DSM and orthoimage generation based on IRS-P5 Cartosat-1 imagery is presented, with an emphasis on automated processing and product quality. The proposed system processes IRS-P5 level-1 stereo scenes using the rational polynomial coefficients (RPC) universal sensor model. The described method uses an RPC correction based on DSM alignment instead of reference images with lower lateral accuracy, which results in improved geolocation of the DSMs and orthoimages. Following RPC correction, highly detailed DSMs with 5 m grid spacing are derived using Semiglobal Matching. The proposed method is part of an operational Cartosat-1 processor for the generation of high-resolution DSMs. Evaluation of 18 scenes against independent ground truth measurements indicates a mean lateral error (CE90) of 6.7 meters and a mean vertical accuracy (LE90) of 5.1 meters.
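CE90 and LE90 are the 90th-percentile horizontal (radial) and vertical errors of the DSM against independent check points. A minimal sketch of how such accuracy figures are computed, using synthetic error samples rather than the paper's data:

```python
import numpy as np

def ce90_le90(dx, dy, dz):
    """Estimate CE90 (horizontal) and LE90 (vertical) accuracy from
    per-check-point DSM errors: 90th percentile of the radial horizontal
    error and of the absolute vertical error, respectively."""
    horizontal = np.hypot(dx, dy)             # radial horizontal error
    ce90 = np.percentile(horizontal, 90)
    le90 = np.percentile(np.abs(dz), 90)
    return ce90, le90

# Synthetic check-point errors in metres -- illustrative only.
rng = np.random.default_rng(0)
dx = rng.normal(0, 3.0, 1000)
dy = rng.normal(0, 3.0, 1000)
dz = rng.normal(0, 2.5, 1000)
ce90, le90 = ce90_le90(dx, dy, dz)
```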
589 nm sum-frequency generation laser for the LGS/AO of Subaru Telescope
NASA Astrophysics Data System (ADS)
Saito, Yoshihiko; Hayano, Yutaka; Saito, Norihito; Akagawa, Kazuyuki; Takazawa, Akira; Kato, Mayumi; Ito, Meguru; Colley, Stephen; Dinkins, Matthew; Eldred, Michael; Golota, Taras; Guyon, Olivier; Hattori, Masayuki; Oya, Shin; Watanabe, Makoto; Takami, Hideki; Iye, Masanori; Wada, Satoshi
2006-06-01
We developed a high-power, high-beam-quality 589 nm coherent light source based on sum-frequency generation, for use as a laser guide star at the Subaru telescope. Sum-frequency generation is a nonlinear frequency conversion in which two mode-locked Nd:YAG lasers oscillating at 1064 and 1319 nm are mixed in a nonlinear crystal to generate a wave at the sum frequency. We achieved the qualities required for the laser guide star. The output power reaches 4.5 W when mixing 15.65 W at 1064 nm with 4.99 W at 1319 nm, with the wavelength adjusted to 589.159 nm. The wavelength is tunable from 589.060 to 589.170 nm with an accuracy of 0.1 pm. The output power is stable to within 1.3% over seven hours of operation. The transverse mode of the beam is TEM00, and the M² of the beam is smaller than 1.2. We achieved these qualities through the following technical elements: (1) a simple oscillator construction for high beam quality, (2) synchronization of the mode-locked pulses at 1064 and 1319 nm by controlling the phase difference between the two radio frequencies fed to the acousto-optic mode lockers, (3) precise tunability of wavelength and spectral bandwidth, and (4) proper selection of the nonlinear optical crystal. In this paper we report how each technical element was developed and how they were combined.
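The output wavelength of sum-frequency generation follows from energy conservation: 1/λ3 = 1/λ1 + 1/λ2. A quick check with the nominal Nd:YAG line centres (the experiment tunes the exact laser wavelengths to hit 589.159 nm precisely):

```python
def sum_frequency_wavelength(lam1_nm, lam2_nm):
    """Vacuum wavelength produced by sum-frequency mixing:
    1/lam3 = 1/lam1 + 1/lam2 (energy conservation)."""
    return 1.0 / (1.0 / lam1_nm + 1.0 / lam2_nm)

# Nominal Nd:YAG line centres; the real system fine-tunes both lasers.
lam3 = sum_frequency_wavelength(1064.0, 1319.0)  # approximately 589 nm
```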
Full-color large-scaled computer-generated holograms using RGB color filters.
Tsuchiyama, Yasuhiro; Matsushima, Kyoji
2017-02-06
A technique using RGB color filters is proposed for creating high-quality full-color computer-generated holograms (CGHs). The fringe of these CGHs is composed of more than a billion pixels. The CGHs reconstruct full-parallax three-dimensional color images with a deep sensation of depth caused by natural motion parallax. The simulation technique as well as the principle and challenges of high-quality full-color reconstruction are presented to address the design of filter properties suitable for large-scaled CGHs. Optical reconstructions of actual fabricated full-color CGHs are demonstrated in order to verify the proposed techniques.
NASA Astrophysics Data System (ADS)
Kelley, Jay Paul
As the Navy's demands for high-power transient loads evolve, so too does the need for alternative energy sources to back up more traditional power generation. Applications in need of such support include electrical grid backup and directed energy weapon systems such as electromagnetic launchers, laser systems, and high-power microwave generators, among others. Among the alternative generation sources receiving considerable attention are energy storage devices such as rechargeable electrochemical batteries and capacitors. In applications such as those mentioned above, these energy storage devices can serve a dual role: as a power source to the various loads, and as high-power loads on the continual generation when the high-power transient loads are in periods of downtime. With recent developments in electrochemical energy storage, lithium-ion batteries (LIBs) seem like the obvious choice, but previous research has shown that elevated charging rates can be detrimental to both the cycle life and the operational life span of the device. In order to preserve the batteries, their charge rate must be limited. One proposed method to accomplish the dual-role task mentioned above, while preserving the life of the batteries, is to combine high-energy-density LIBs with high-power-density electric double layer capacitors (EDLCs) or lithium-ion capacitors (LICs) using controllable power electronics to adjust the flow of power to and from each device. Such a configuration is typically referred to as a hybrid energy storage module (HESM). While shipboard generators start up, the combined high energy density and high power density of the HESM provide the capability to source critical loads for an extended period of time at the high rates they demand. Once the generator is operationally efficient, the HESM can act as a high-energy reservoir to harvest energy from the generator while the loads are in short periods of inactivity.
This enables the generator to maintain operation at high-efficiency levels, thereby increasing the power quality of the AC bus. The work discussed here evaluates how the use of energy storage impacts power quality on a microgrid's AC bus when high-rate DC and AC loads are sourced simultaneously. A HESM has also been developed and evaluated as a means of optimizing both the power and energy density of the installed energy storage.
Microwave plasma assisted supersonic gas jet deposition of thin film materials
Schmitt, III, Jerome J.; Halpern, Bret L.
1993-01-01
An apparatus for fabricating thin film materials utilizing high speed gas dynamics relies on supersonic free jets of carrier gas to transport depositing vapor species generated in a microwave discharge to the surface of a prepared substrate where the vapor deposits to form a thin film. The present invention generates high rates of deposition and thin films of unexpectedly high quality at low temperatures.
Generation and applications of an ultrahigh-fidelity four-photon Greenberger-Horne-Zeilinger state.
Zhang, Chao; Huang, Yun-Feng; Zhang, Cheng-Jie; Wang, Jian; Liu, Bi-Heng; Li, Chuan-Feng; Guo, Guang-Can
2016-11-28
High-quality entangled photon pairs generated via spontaneous parametric down-conversion have made great contributions to modern quantum information science and to fundamental tests of quantum mechanics. However, the quality of the entangled states decreases sharply when moving from biphoton to multiphoton experiments, mainly due to the lack of interactions between photons. Here, for the first time, we generate a four-photon Greenberger-Horne-Zeilinger state with a fidelity of 98%, which is comparable to the best fidelity of biphoton entangled states. This enables us to demonstrate ultrahigh-fidelity entanglement swapping, the key ingredient in various quantum information tasks. Our results push the fidelity of multiphoton entanglement generation to a new level and should be useful in demanding tasks; for example, we successfully demonstrate the genuine multipartite nonlocality of the observed state in the nonsignaling scenario by violating a novel Hardy-like inequality, which requires very high state fidelity.
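For a pure target state, fidelity reduces to F = ⟨GHZ|ρ|GHZ⟩. The sketch below evaluates this for a four-photon GHZ state mixed with white noise; the noise model is an assumption for illustration, not the experimental state:

```python
import numpy as np

n = 4                      # number of photons (qubits)
dim = 2 ** n
# |GHZ> = (|0000> + |1111>) / sqrt(2)
ghz = np.zeros(dim)
ghz[0] = ghz[-1] = 1 / np.sqrt(2)

# Toy noisy state: GHZ projector mixed with the maximally mixed state.
p = 0.98
rho = p * np.outer(ghz, ghz) + (1 - p) * np.eye(dim) / dim

# Fidelity against a pure target: F = <GHZ| rho |GHZ>
fidelity = ghz @ rho @ ghz   # = p + (1 - p)/16 = 0.98125 here
```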
2012-01-01
Background Genetic mapping and QTL detection are powerful methodologies in plant improvement and breeding. Construction of a high-density and high-quality genetic map would be of great benefit in the production of superior grapes to meet human demand. High throughput and low cost of the recently developed next generation sequencing (NGS) technology have resulted in its wide application in genome research. Sequencing restriction-site associated DNA (RAD) might be an efficient strategy to simplify genotyping. Combining NGS with RAD has proven to be powerful for single nucleotide polymorphism (SNP) marker development. Results An F1 population of 100 individual plants was developed. In-silico digestion-site prediction was used to select an appropriate restriction enzyme for construction of a RAD sequencing library. Next generation RAD sequencing was applied to genotype the F1 population and its parents. Applying a cluster strategy for SNP modulation, a total of 1,814 high-quality SNP markers were developed: 1,121 of these were mapped to the female genetic map, 759 to the male map, and 1,646 to the integrated map. A comparison of the genetic maps to the published Vitis vinifera genome revealed both conservation and variations. Conclusions The applicability of next generation RAD sequencing for genotyping a grape F1 population was demonstrated, leading to the successful development of a genetic map with high density and quality using our designed SNP markers. Detailed analysis revealed that this newly developed genetic map can be used for a variety of genome investigations, such as QTL detection, sequence assembly and genome comparison. PMID:22908993
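In-silico digestion-site prediction amounts to scanning a genome sequence for a candidate enzyme's recognition site and tallying the predicted cut positions. A toy illustration (EcoRI is chosen here only as an example enzyme; the study selected its enzyme from digestion-site prediction on the grape genome):

```python
import re

def predicted_cut_sites(seq, site):
    """Return start positions of a restriction recognition site in seq,
    the core of an in-silico digestion used to choose a RAD enzyme."""
    return [m.start() for m in re.finditer(site, seq)]

seq = "AAGAATTCTTGGAATTCCGT"          # toy sequence, not grape genome data
cuts = predicted_cut_sites(seq, "GAATTC")  # EcoRI recognition site
# len(cuts) sites implies len(cuts) + 1 fragments from a linear molecule
```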
Engineering High Assurance Distributed Cyber Physical Systems
2015-01-15
decisions: number of interacting agents and co-dependent decisions made in real time without causing interference. To engineer a high assurance DART... environment specification, architecture definition, domain-specific languages, design patterns, code generation, analysis, test generation, and simulation... include synchronization between the models and source code, debugging at the model level, expression of the design intent, and quality of service
NASA Astrophysics Data System (ADS)
Gidey, Amanuel
2018-06-01
Determining the suitability and vulnerability of groundwater quality for irrigation use is an essential early warning and first step toward careful management of groundwater resources to diminish impacts on irrigation. This study was conducted to determine the overall suitability of groundwater quality for irrigation use and to generate spatial distribution maps for the Elala catchment, Northern Ethiopia. Thirty-nine groundwater samples were collected to analyze and map the water quality variables. Atomic absorption spectrophotometry, ultraviolet spectrophotometry, titration and calculation methods were used for laboratory groundwater quality analysis. ArcGIS geospatial analysis tools, semivariogram model types and interpolation methods were used to generate the geospatial distribution maps. Twelve and eight water quality variables were used to produce the weighted overlay and irrigation water quality index models, respectively. Root-mean-square error, mean square error, absolute square error, mean error, root-mean-square standardized error, and measured versus predicted values were used for cross-validation. The overall weighted overlay model result showed that 146 km² of the area is highly suitable, 135 km² moderately suitable and 60 km² unsuitable for irrigation use. The irrigation water quality index result confirms 10.26% with no restriction, 23.08% with low restriction, 20.51% with moderate restriction, 15.38% with high restriction and 30.76% with severe restriction for irrigation use. GIS and the irrigation water quality index are better methods for irrigation water resources management to achieve full-yield irrigation production, to improve food security and sustain it over the long term, and to avoid increasing environmental problems for future generations.
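A weighted overlay reduces to rating each water-quality variable on a common suitability scale and combining the ratings with weights that sum to one. A sketch with hypothetical ratings and weights, not those of the study:

```python
def weighted_overlay(ratings, weights):
    """Combine per-variable suitability ratings (common ordinal scale)
    into one overlay score; weights must sum to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(r * w for r, w in zip(ratings, weights))

# Hypothetical ratings (1 = unsuitable .. 5 = highly suitable) for one
# map cell, e.g. classes for EC, SAR, chloride, pH -- illustrative only.
ratings = [5, 4, 3, 5]
weights = [0.4, 0.3, 0.2, 0.1]
score = weighted_overlay(ratings, weights)  # higher = more suitable
```

In a GIS the same arithmetic is applied per raster cell, which is what produces the suitability map.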
Microwave plasma assisted supersonic gas jet deposition of thin film materials
Schmitt, J.J. III; Halpern, B.L.
1993-10-26
An apparatus for fabricating thin film materials utilizing high speed gas dynamics relies on supersonic free jets of carrier gas to transport depositing vapor species generated in a microwave discharge to the surface of a prepared substrate where the vapor deposits to form a thin film. The present invention generates high rates of deposition and thin films of unexpectedly high quality at low temperatures. 5 figures.
Kim, S-J; Kim, D-K; Kang, D-H
2016-04-01
We investigated and compared the efficacy of a new apparatus for detaching micro-organisms from meat samples. The efficacy of the Spindle and the stomacher in detaching micro-organisms from meat samples was evaluated, and the appropriateness of the suspensions generated by both methods for molecular biological analysis was assessed. A nearly identical correlation and high R² were obtained between the Spindle and the stomacher in aerobic plate counts (APC), and no significant differences were observed in detachment of three major foodborne pathogens. The suspension generated by the Spindle showed lower turbidity and total protein concentration. Significantly different threshold cycles were also observed in real-time PCR analysis using suspensions generated by the two methods. The Spindle shows nearly identical efficacy to stomacher treatment in detaching micro-organisms from meat samples. Furthermore, the high quality of suspensions generated by the Spindle, in terms of turbidity and total protein, allows for a lower threshold cycle than stomached suspension in real-time PCR. The Spindle could be an alternative method for detaching micro-organisms, yielding higher-quality suspensions that may be better suited for further molecular microbiological analysis. © 2016 The Society for Applied Microbiology.
Zhang, Xiaoyan; Kim, Daeseung; Shen, Shunyao; Yuan, Peng; Liu, Siting; Tang, Zhen; Zhang, Guangming; Zhou, Xiaobo; Gateno, Jaime
2017-01-01
Accurate surgical planning and prediction of craniomaxillofacial surgery outcomes require simulation of soft tissue changes following osteotomy. This can only be achieved by using an anatomically detailed facial soft tissue model. The current state of the art in model generation is not appropriate for clinical applications due to the time-intensive nature of manual segmentation and volumetric mesh generation. Conventional patient-specific finite element (FE) mesh generation methods deform a template FE mesh to match the shape of a patient based on registration. However, these methods commonly produce element distortion. Additionally, the mesh density for patients depends on that of the template model and cannot be adjusted to conduct mesh-density sensitivity analysis. In this study, we propose a new framework for patient-specific facial soft tissue FE mesh generation. The goal of the developed method is to efficiently generate a high-quality patient-specific hexahedral FE mesh with adjustable mesh density while preserving accuracy in anatomical structure correspondence. Our FE mesh is generated by eFace template deformation followed by volumetric parametrization. First, the patient-specific anatomically detailed facial soft tissue model (including skin, mucosa, and muscles) is generated by deforming an eFace template model. The adaptation of the eFace template model is achieved by using a hybrid landmark-based morphing and dense surface fitting approach followed by a thin-plate spline interpolation. Then, a high-quality hexahedral mesh is constructed by using volumetric parameterization. The user can control the resolution of the hexahedral mesh to best reflect clinicians' needs. Our approach was validated using 30 patient models and 4 visible human datasets. The generated patient-specific FE meshes showed high surface matching accuracy, element quality, and internal structure matching accuracy.
They can be directly and effectively used for clinical simulation of facial soft tissue change. PMID:29027022
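Hexahedral element quality is commonly scored with the scaled Jacobian: the determinant of the normalized edge vectors at each corner, which is 1.0 for a perfect cube corner and approaches 0 (or goes negative) for distorted or inverted elements. A minimal corner-based version of this metric (one common choice; the abstract does not specify which metric the authors used):

```python
import numpy as np

def corner_scaled_jacobian(p0, p1, p2, p3):
    """Scaled Jacobian at hex corner p0 with adjacent nodes p1, p2, p3:
    determinant of the three normalized edge vectors leaving p0.
    1.0 = ideal cube corner; near 0 or negative = distorted/inverted."""
    edges = [np.asarray(p, dtype=float) - np.asarray(p0, dtype=float)
             for p in (p1, p2, p3)]
    edges = [e / np.linalg.norm(e) for e in edges]
    return float(np.linalg.det(np.column_stack(edges)))

q_cube = corner_scaled_jacobian([0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1])
q_skew = corner_scaled_jacobian([0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 0, 1])
```

Scanning the minimum of this value over all corners of all elements is one way to flag the element distortion that template-deformation methods tend to produce.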
Zhang, Xiaoyan; Kim, Daeseung; Shen, Shunyao; Yuan, Peng; Liu, Siting; Tang, Zhen; Zhang, Guangming; Zhou, Xiaobo; Gateno, Jaime; Liebschner, Michael A K; Xia, James J
2018-04-01
Accurate surgical planning and prediction of craniomaxillofacial surgery outcomes require simulation of soft tissue changes following osteotomy. This can only be achieved by using an anatomically detailed facial soft tissue model. The current state of the art in model generation is not appropriate for clinical applications due to the time-intensive nature of manual segmentation and volumetric mesh generation. Conventional patient-specific finite element (FE) mesh generation methods deform a template FE mesh to match the shape of a patient based on registration. However, these methods commonly produce element distortion. Additionally, the mesh density for patients depends on that of the template model and cannot be adjusted to conduct mesh-density sensitivity analysis. In this study, we propose a new framework for patient-specific facial soft tissue FE mesh generation. The goal of the developed method is to efficiently generate a high-quality patient-specific hexahedral FE mesh with adjustable mesh density while preserving accuracy in anatomical structure correspondence. Our FE mesh is generated by eFace template deformation followed by volumetric parametrization. First, the patient-specific anatomically detailed facial soft tissue model (including skin, mucosa, and muscles) is generated by deforming an eFace template model. The adaptation of the eFace template model is achieved by using a hybrid landmark-based morphing and dense surface fitting approach followed by a thin-plate spline interpolation. Then, a high-quality hexahedral mesh is constructed by using volumetric parameterization. The user can control the resolution of the hexahedral mesh to best reflect clinicians' needs. Our approach was validated using 30 patient models and 4 visible human datasets. The generated patient-specific FE meshes showed high surface matching accuracy, element quality, and internal structure matching accuracy.
They can be directly and effectively used for clinical simulation of facial soft tissue change.
Morsbach, Fabian; Gordic, Sonja; Desbiolles, Lotus; Husarik, Daniela; Frauenfelder, Thomas; Schmidt, Bernhard; Allmendinger, Thomas; Wildermuth, Simon; Alkadhi, Hatem; Leschka, Sebastian
2014-08-01
To evaluate image quality, the maximal heart rate allowing for diagnostic imaging, and the radiation dose of turbo high-pitch dual-source coronary computed tomographic angiography (CCTA). First, a cardiac motion phantom simulating heart rates (HRs) from 60 to 90 bpm in 5-bpm steps was examined on a third-generation dual-source 192-slice CT (prospective ECG-triggering; pitch, 3.2; rotation time, 250 ms). Subjective image quality regarding the presence of motion artefacts was interpreted by two readers on a four-point scale (1, excellent; 4, non-diagnostic). Objective image quality was assessed by calculating distortion vectors. Thereafter, 20 consecutive patients (median age, 50 years) undergoing clinically indicated CCTA were included. In the phantom study, image quality was rated diagnostic up to an HR of 75 bpm, with object distortion being 1 mm or less. Distortion increased above 1 mm at HRs of 80-90 bpm. Patients had a mean HR of 66 bpm (47-78 bpm). Coronary segments were of diagnostic image quality for all patients with HRs up to 73 bpm. The average effective radiation dose in patients was 0.6 ± 0.3 mSv. Our combined phantom and patient study indicates that CCTA with turbo high-pitch third-generation dual-source 192-slice CT can be performed at HRs up to 75 bpm while maintaining diagnostic image quality, and is associated with an average radiation dose of 0.6 mSv. • CCTA is feasible with the turbo high-pitch mode. • Turbo high-pitch CCTA provides diagnostic image quality up to 73 bpm. • The radiation dose of high-pitch CCTA is 0.6 mSv on average.
Appleton, P L; Quyn, A J; Swift, S; Näthke, I
2009-05-01
Visualizing overall tissue architecture in three dimensions is fundamental for validating and integrating biochemical, cell biological and visual data from less complex systems such as cultured cells. Here, we describe a method to generate high-resolution three-dimensional image data of intact mouse gut tissue. Regions of highest interest lie between 50 and 200 µm within this tissue. The quality and usefulness of three-dimensional image data at such depths is limited owing to problems associated with scattered light, photobleaching and spherical aberration. Furthermore, the highest-quality oil-immersion lenses are designed to work at a maximum distance of about 10-15 µm into the sample, further compounding the ability to image at high resolution deep within tissue. We show that manipulating the refractive index of the mounting media and decreasing sample opacity greatly improves image quality such that the limiting factor for a standard, inverted multi-photon microscope is the working distance of the objective as opposed to detectable fluorescence. This method negates the need for mechanical sectioning of tissue and enables the routine generation of high-quality, quantitative image data that can significantly advance our understanding of tissue architecture and physiology.
NASA Astrophysics Data System (ADS)
Wang, Yang; Ma, Guowei; Ren, Feng; Li, Tuo
2017-12-01
A constrained Delaunay discretization method is developed to generate high-quality doubly adaptive meshes of highly discontinuous geological media. Complex features such as three-dimensional discrete fracture networks (DFNs), tunnels, shafts, slopes, boreholes, water curtains, and drainage systems are taken into account in the mesh generation. The constrained Delaunay triangulation method is used to create adaptive triangular elements on planar fractures. Persson's algorithm (Persson, 2005), based on an analogy between triangular elements and spring networks, is enriched to automatically discretize a planar fracture into mesh points with varying density and smooth-quality gradient. The triangulated planar fractures are treated as planar straight-line graphs (PSLGs) to construct piecewise-linear complex (PLC) for constrained Delaunay tetrahedralization. This guarantees the doubly adaptive characteristic of the resulted mesh: the mesh is adaptive not only along fractures but also in space. The quality of elements is compared with the results from an existing method. It is verified that the present method can generate smoother elements and a better distribution of element aspect ratios. Two numerical simulations are implemented to demonstrate that the present method can be applied to various simulations of complex geological media that contain a large number of discontinuities.
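Triangle quality in Delaunay meshing is typically measured by the radius ratio q = 2·r_in/r_circ, which equals 1 for an equilateral element and approaches 0 for the slivers that mesh optimization tries to eliminate. A minimal implementation from side lengths (the standard metric; the paper's exact quality measure and aspect-ratio definition may differ):

```python
import math

def triangle_quality(a, b, c):
    """Radius-ratio quality q = 2 * inradius / circumradius from side
    lengths: 1.0 for equilateral, -> 0 for degenerate slivers."""
    s = 0.5 * (a + b + c)                              # semi-perimeter
    area = math.sqrt(s * (s - a) * (s - b) * (s - c))  # Heron's formula
    r_in = area / s                                    # inradius
    r_circ = a * b * c / (4.0 * area)                  # circumradius
    return 2.0 * r_in / r_circ

q_eq = triangle_quality(1.0, 1.0, 1.0)      # equilateral: quality 1
q_sliver = triangle_quality(1.0, 1.0, 1.9)  # near-degenerate: low quality
```

Histogramming this value over all elements is one way to compare the smoothness and aspect-ratio distribution of two meshing methods, as done in the paper's comparison.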
Frimpong, Joseph Asamoah; Amo-Addae, Maame Pokuah; Adewuyi, Peter Adebayo; Hall, Casey Daniel; Park, Meeyoung Mattie; Nagbe, Thomas Knue
2017-01-01
Public health officials depend on timely, complete, and accurate surveillance data for decision making. The quality of data generated from surveillance is highly dependent on external and internal factors which may either impede or enhance surveillance activities. One way of identifying challenges affecting the quality of data generated is to conduct a data quality audit. This case study, based on an audit conducted by residents of the Liberia Frontline Field Epidemiology Training Program, was designed to be a classroom simulation of a data quality audit in a health facility. It is suited to reinforce theoretical lectures on surveillance data quality and auditing. The target group is public health trainees, who should be able to complete this exercise in approximately 2 hours and 30 minutes.
NASA Astrophysics Data System (ADS)
Cheng, Ming-Yuan; Chang, Yu-Chung; Galvanauskas, Almantas; Mamidipudi, Pri; Changkakoti, Rupak; Gatchell, Peter
2005-02-01
We explored high-energy and high-peak-power pulse generation in large-core multimode fiber amplifiers, achieving what is to our knowledge the highest reported energies, up to 82 mJ for 500-ns pulses, 27 mJ for 50-ns pulses, and 2.4-MW peak power for 4-ns pulses at 1064 nm, using 200-µm-diameter and 0.062-N.A. core Yb-doped double-clad fiber amplifiers. The highly multimode nature of the fiber core was mitigated by use of a coiling-induced mode-filtering effect to yield a significant improvement in output-beam quality, from M² = 25 for an uncoiled fiber to M² = 6.5 for a properly coiled fiber, with a corresponding reduction in the number of propagating transverse modes from ≥200 to ≤20.
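The quoted mode counts can be sanity-checked from the fiber's normalized frequency V: for a highly multimode step-index core, the number of guided modes is roughly V²/2. A back-of-envelope estimate for the 200-µm, 0.062-NA core at 1064 nm (order-of-magnitude only; it gives hundreds of modes, consistent with the ≥200 quoted):

```python
import math

def v_number(core_diameter_um, na, wavelength_um):
    """Normalized frequency V of a step-index fiber:
    V = pi * d * NA / lambda (with d the core diameter)."""
    return math.pi * core_diameter_um * na / wavelength_um

def approx_mode_count(v):
    """Usual large-V estimate of guided mode count, ~ V^2 / 2
    (counting both polarizations)."""
    return v ** 2 / 2

v = v_number(200.0, 0.062, 1.064)   # V ~ 37 for this amplifier core
modes = approx_mode_count(v)        # hundreds of modes
```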
Scotti, Dennis J; Harmon, Joel; Behson, Scott J
2007-01-01
Healthcare managers must deliver high-quality patient services that generate highly satisfied and loyal customers. In this article, we examine how a high-involvement approach to the work environment of healthcare employees may lead to exceptional service quality, satisfied patients, and ultimately to loyal customers. Specifically, we investigate the chain of events through which high-performance work systems (HPWS) and customer orientation influence employee and customer perceptions of service quality and patient satisfaction in a national sample of 113 Veterans Health Administration ambulatory care centers. We present a conceptual model for linking work environment to customer satisfaction and test this model using structural equations modeling. The results suggest that (1) HPWS is linked to employee perceptions of their ability to deliver high-quality customer service, both directly and through their perceptions of customer orientation; (2) employee perceptions of customer service are linked to customer perceptions of high-quality service; and (3) perceived service quality is linked with customer satisfaction. Theoretical and practical implications of our findings, including suggestions of how healthcare managers can implement changes to their work environments, are discussed.
2016-08-02
epitaxy platform, it is essential that malignant defects, such as in-grown stacking faults (IGSFs) and basal plane dislocations (BPDs), be... crystal quality. (5) Even though the inlet C/Si ratio is kept fixed, the C/Si ratio at the growth surface varies depending on the different gas... morphology, and quality (generation of additional defects). Two CVD reactor types, a chimney reactor and an inverted chimney reactor, are assembled; the
Quality Function Deployment for Large Systems
NASA Technical Reports Server (NTRS)
Dean, Edwin B.
1992-01-01
Quality Function Deployment (QFD) is typically applied to small subsystems. This paper describes efforts to extend QFD to large-scale systems. It links QFD to the systems engineering process, the concurrent engineering process, the robust design process, and the costing process. The effect is to generate a tightly linked project management process of high dimensionality, which flushes out issues early to provide a high-quality, low-cost, and hence competitive product. A pre-QFD matrix linking customers to customer desires is described.
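At its numerical core, a QFD matrix maps customer desires to technical characteristics: customer importance weights multiply a relationship matrix to yield technical priority scores. A toy numeric sketch with hypothetical weights and relationship strengths (the conventional 9/3/1 scoring, not data from the paper):

```python
import numpy as np

# Hypothetical example: 3 customer desires x 4 technical characteristics.
importance = np.array([5, 3, 1])     # customer-assigned importance weights

relationship = np.array([            # 9 = strong, 3 = medium, 1 = weak, 0 = none
    [9, 3, 0, 1],
    [1, 9, 3, 0],
    [0, 1, 9, 3],
])

# Technical priority = importance vector times relationship matrix;
# the highest-scoring characteristics get engineering attention first.
priority = importance @ relationship
```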
Generating Options for Active Risk Control (GO-ARC): introducing a novel technique.
Card, Alan J; Ward, James R; Clarkson, P John
2014-01-01
After investing significant amounts of time and money in conducting formal risk assessments, such as root cause analysis (RCA) or failure mode and effects analysis (FMEA), healthcare workers are left to their own devices in generating high-quality risk control options. They often experience difficulty in doing so, and tend toward an overreliance on administrative controls (the weakest category in the hierarchy of risk controls). This has important implications for patient safety and the cost effectiveness of risk management operations. This paper describes a before and after pilot study of the Generating Options for Active Risk Control (GO-ARC) technique, a novel tool to improve the quality of the risk control options generation process. The quantity, quality (using the three-tiered hierarchy of risk controls), variety, and novelty of risk controls generated. Use of the GO-ARC technique was associated with improvement on all measures. While this pilot study has some notable limitations, it appears that the GO-ARC technique improved the risk control options generation process. Further research is needed to confirm this finding. It is also important to note that improved risk control options are a necessary, but not sufficient, step toward the implementation of more robust risk controls. © 2013 National Association for Healthcare Quality.
Domain Decomposition By the Advancing-Partition Method for Parallel Unstructured Grid Generation
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.; Zagaris, George
2009-01-01
A new method of domain decomposition has been developed for generating unstructured grids in subdomains either sequentially or using multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method, which partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.
Properties and Frequency Conversion of High-Brightness Diode-Laser Systems
NASA Astrophysics Data System (ADS)
Boller, Klaus-Jochen; Beier, Bernard; Wallenstein, Richard
An overview of recent developments in the field of high-power, high-brightness diode-lasers, and the optically nonlinear conversion of their output into other wavelength ranges, is given. We describe the generation of continuous-wave (CW) laser beams at power levels of several hundreds of milliwatts to several watts with near-perfect spatial and spectral properties using Master-Oscillator Power-Amplifier (MOPA) systems. With single- or double-stage systems, using amplifiers of tapered or rectangular geometry, up to 2.85 W high-brightness radiation is generated at wavelengths around 810 nm with AlGaAs diodes. Even higher powers, up to 5.2 W of single-frequency and high spatial quality beams at 925 nm, are obtained with InGaAs diodes. We describe the basic properties of the oscillators and amplifiers used. A strict proof-of-quality for the diode radiation is provided by direct and efficient nonlinear optical conversion of the diode MOPA output into other wavelength ranges. We review recent experiments with the highest power levels obtained so far by direct frequency doubling of diode radiation. In these experiments, 100 mW single-frequency ultraviolet light at 403 nm was generated, as well as 1 W of single-frequency blue radiation at 465 nm. Nonlinear conversion of diode radiation into widely tunable infrared radiation has recently yielded record values. We review the efficient generation of widely tunable single-frequency radiation in the infrared with diode-pumped Optical Parametric Oscillators (OPOs). With this system, single-frequency output radiation with powers of more than 0.5 W was generated, widely tunable around wavelengths of 2.1 µm and 1.65 µm and with excellent spectral and spatial quality. These developments are clear indicators of recent advances in the field of high-brightness diode-MOPA systems, and may emphasize their future central importance for applications within a vast range of optical wavelengths.
Dynamic stability and handling qualities tests on a highly augmented, statically unstable airplane
NASA Technical Reports Server (NTRS)
Gera, Joseph; Bosworth, John T.
1987-01-01
Novel flight test and analysis techniques in the flight dynamics and handling qualities area are described. These techniques were utilized at NASA Ames-Dryden during the initial flight envelope clearance of the X-29A aircraft. It is shown that the open-loop frequency response of an aircraft with highly relaxed static stability can be successfully computed on the ground from telemetry data. Postflight closed-loop frequency response data were obtained from pilot-generated frequency sweeps and it is found that the current handling quality requirements for high-maneuverability aircraft are generally applicable to the X-29A.
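The kind of frequency-response extraction described here can be sketched with standard cross-spectral estimation. The sketch below is illustrative only: the sample rate, sweep range, and the second-order closed-loop model are assumptions of this sketch, not X-29A telemetry parameters.

```python
import numpy as np
from scipy import signal

fs = 200.0  # Hz, assumed telemetry sample rate
t = np.arange(0, 60, 1 / fs)

# Simulated pilot frequency sweep (chirp) as the input u, and a
# hypothetical well-damped second-order closed loop as the output y.
u = signal.chirp(t, f0=0.1, f1=5.0, t1=t[-1])
wn, zeta = 2 * np.pi * 1.5, 0.6
sys = signal.TransferFunction([wn**2], [1, 2 * zeta * wn, wn**2])
_, y, _ = signal.lsim(sys, u, t)

# Non-parametric frequency-response estimate H(f) = Pyu / Puu
# from cross- and auto-spectra of the sweep data.
f, Puu = signal.csd(u, u, fs=fs, nperseg=4096)
_, Pyu = signal.csd(u, y, fs=fs, nperseg=4096)
H = Pyu / Puu

idx = np.argmin(np.abs(f - 1.5))
print(f"|H| near the natural frequency: {abs(H[idx]):.2f}")
```

With real sweep data, u and y would come from the telemetered pilot input and aircraft response; the ratio of cross- to auto-spectrum is a standard estimate of the transfer function between them.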
Sanal, Madhusudana Girija
2014-01-01
Even several years after the discovery of human embryonic stem cells and induced pluripotent stem cells (iPSC), we are still unable to derive any significant therapeutic benefit from them, such as cell therapy or the generation of organs for transplantation. Recent success in somatic cell nuclear transfer (SCNT) has made it possible to generate diploid embryonic stem cells, which opens up the way to make high-quality pluripotent stem cells. However, the process is highly inefficient and hence expensive compared with the generation of iPSC. Even with the latest SCNT technology, we cannot be sure that therapeutic-quality pluripotent stem cells can be made from any patient's somatic cells or with oocytes from any donor. Combining iPSC technology with SCNT, that is, using the nucleus of a candidate somatic cell that has been reprogrammed to the pluripotent state instead of its unmodified nucleus, would boost the efficiency of the technique and allow the generation of therapeutic-quality pluripotent stem cells. This technique, which may be called induced pluripotent stem cell nuclear transfer (iPSCNT), combines the efficiency of iPSC generation with the speed and natural reprogramming environment of SCNT. It could prove to have revolutionary benefits for humankind: it could be useful for generating organs for transplantation and for reproductive cloning, especially for childless men and women who cannot have children by any other technique. When combined with advanced gene-editing techniques (such as the CRISPR-Cas system), it might also prove useful to those who want healthy children but suffer from inherited diseases. The current code of ethics may be against reproductive cloning. However, this will change with time, as it has with most revolutionary scientific breakthroughs.
After all, it is the right of every human to have healthy offspring, and this is a question of reproductive freedom and existence.
NASA Astrophysics Data System (ADS)
Schneider, Thomas
2015-03-01
High-quality frequency comb sources such as femtosecond lasers have revolutionized the metrology of fundamental physical constants. The generated comb consists of frequency lines with an equidistant separation over a bandwidth of several THz. This bandwidth can be broadened further, to a supercontinuum of more than an octave, through propagation in nonlinear media. The frequency separation between the lines is defined by the repetition rate, and the width of each comb line can be below 1 Hz, even without external stabilization. By extracting just one of these lines, an ultra-narrow-linewidth, tunable laser line for applications in communications and spectroscopy can be generated. If two lines are extracted, the superposition of these lines in an appropriate photomixer produces high-quality millimeter- and THz-waves. The extraction of several lines can be used for the creation of almost ideally sinc-shaped Nyquist pulses, which enable optical communications at the maximum possible baud rate. Combs generated by low-cost, small-footprint fs-fiber lasers are especially promising. However, due to the resonator length, the comb frequencies have a typical separation of 80-100 MHz, far too narrow for the selection of single tones with standard optical filters. Here, the extraction of single lines of an fs-fiber laser by polarization-pulling-assisted stimulated Brillouin scattering is presented. The application of these extracted lines as ultra-narrow, stable, and tunable laser lines, for the generation of very high-quality mm- and THz-waves with ultra-narrow linewidth and low phase noise, and for the generation of sinc-shaped Nyquist pulses with arbitrary bandwidth and repetition rate is discussed.
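The sinc-shaped Nyquist pulses mentioned here arise directly from superimposing equally spaced, phase-locked comb lines; a minimal numerical sketch, with an arbitrary line count and spacing chosen purely for illustration:

```python
import numpy as np

def comb_superposition(t, n_lines, df):
    """Real field of n_lines equal-amplitude, phase-locked comb lines
    spaced df (Hz), normalized so the pulse peak equals 1. The result is
    the periodic sinc (Dirichlet) kernel."""
    m = n_lines // 2
    field = sum(np.cos(2 * np.pi * k * df * t) for k in range(-m, m + 1))
    return field / n_lines

N, df = 9, 10e9  # 9 extracted lines at 10 GHz spacing (illustrative values)
print(comb_superposition(0.0, N, df))           # pulse peak: 1.0
print(comb_superposition(1 / (N * df), N, df))  # first zero of the periodic sinc
print(comb_superposition(1 / df, N, df))        # next pulse, one repetition period later
```

The pulse bandwidth is N·df and the repetition rate is df, so extracting more lines narrows the pulse without changing the repetition period.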
Maize - GO annotation methods, evaluation, and review (Maize-GAMER)
USDA-ARS?s Scientific Manuscript database
Making a genome sequence accessible and useful involves three basic steps: genome assembly, structural annotation, and functional annotation. The quality of data generated at each step influences the accuracy of inferences that can be made, with high-quality analyses producing better datasets resultin...
A practical workflow for making anatomical atlases for biological research.
Wan, Yong; Lewis, A Kelsey; Colasanto, Mary; van Langeveld, Mark; Kardon, Gabrielle; Hansen, Charles
2012-01-01
The anatomical atlas has been at the intersection of science and art for centuries. These atlases are essential to biological research, but high-quality atlases are often scarce. Recent advances in imaging technology have made high-quality 3D atlases possible. However, until now there has been a lack of practical workflows using standard tools to generate atlases from images of biological samples. With certain adaptations, CG artists' workflow and tools, traditionally used in the film industry, are practical for building high-quality biological atlases. Researchers have developed a workflow for generating a 3D anatomical atlas using accessible artists' tools. They used this workflow to build a mouse limb atlas for studying the musculoskeletal system's development. This research aims to raise the awareness of using artists' tools in scientific research and promote interdisciplinary collaborations between artists and scientists. This video (http://youtu.be/g61C-nia9ms) demonstrates a workflow for creating an anatomical atlas.
Recent progress in opto-electronic oscillator
NASA Technical Reports Server (NTRS)
Maleki, Lute
2005-01-01
The optoelectronic oscillator (OEO) is a unique device based on photonics techniques to generate highly spectrally pure microwave signals [1]. The development of the OEO was motivated by the need for high-performance oscillators at frequencies above 10 GHz, where conventional electronic oscillators have a number of limitations. These limitations typically stem from the product f·Q, where f is the oscillator frequency and Q is the quality factor of the resonator in the oscillator. In conventional resonators, whether electromagnetic or piezoelectric, this product is usually a constant. Thus, as the oscillator frequency is pushed higher, the quality factor degrades, resulting in degradation of the phase noise of the oscillator. One approach to mitigate the problem is to start with a very high-quality signal in the 5 to 100 MHz range generated by a quartz oscillator and multiply the frequency to achieve the desired microwave signal. Here again, frequency multiplication increases the phase noise by 20·log10(N) dB, where N is the multiplication factor.
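The multiplication penalty is easy to compute directly; for example, multiplying a 100 MHz quartz reference up to 10 GHz costs 40 dB in phase noise:

```python
import math

def multiplication_penalty_db(n):
    """Phase-noise degradation (in dB) incurred by multiplying a
    reference oscillator's frequency by the factor n: 20*log10(n)."""
    return 20 * math.log10(n)

# 100 MHz quartz reference multiplied to 10 GHz: n = 100 -> 40 dB penalty.
print(multiplication_penalty_db(100))
```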
Pharmacophore-Map-Pick: A Method to Generate Pharmacophore Models for All Human GPCRs.
Dai, Shao-Xing; Li, Gong-Hua; Gao, Yue-Dong; Huang, Jing-Fei
2016-02-01
GPCR-based drug discovery is hindered by a lack of effective screening methods for most GPCRs, which have neither known ligands nor high-quality structures. With the aim of identifying lead molecules for these GPCRs, we developed a new method, called Pharmacophore-Map-Pick, to generate pharmacophore models for all human GPCRs. The model of ADRB2 generated using this method not only predicts the binding mode of ADRB2 ligands correctly but also performs well in virtual screening. The average enrichment for the pharmacophore models of the 15 targets in different GPCR families reached 15-fold at a 0.5% false-positive rate, demonstrating that the method is powerful for generating high-quality pharmacophore models. The pharmacophore models can therefore be applied directly in virtual screening, with no requirement for ligand information or shape constraints. A total of 2386 pharmacophore models for 819 different GPCRs (99% coverage; 819/825) were generated and are available at http://bsb.kiz.ac.cn/GPCRPMD. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Domain Decomposition By the Advancing-Partition Method
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar Z.
2008-01-01
A new method of domain decomposition has been developed for generating unstructured grids in subdomains, either sequentially or on multiple computers in parallel. Domain decomposition is a crucial and challenging step for parallel grid generation. Prior methods are generally based on auxiliary, complex, and computationally intensive operations for defining partition interfaces and usually produce grids of lower quality than those generated in single domains. The new technique, referred to as "Advancing Partition," is based on the Advancing-Front method and partitions a domain as part of the volume mesh generation in a consistent and "natural" way. The benefits of this approach are: 1) the process of domain decomposition is highly automated, 2) partitioning of the domain does not compromise the quality of the generated grids, and 3) the computational overhead for domain decomposition is minimal. The new method has been implemented in NASA's unstructured grid generation code VGRID.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nicolae, Alexandru; Department of Medical Physics, Odette Cancer Center, Sunnybrook Health Sciences Centre, Toronto, Ontario; Morton, Gerard
Purpose: This work presents the application of a machine learning (ML) algorithm to automatically generate high-quality, prostate low-dose-rate (LDR) brachytherapy treatment plans. The ML algorithm can mimic characteristics of preoperative treatment plans deemed clinically acceptable by brachytherapists. The planning efficiency, dosimetry, and quality (as assessed by experts) of preoperative plans generated with an ML planning approach was retrospectively evaluated in this study. Methods and Materials: Preimplantation and postimplantation treatment plans were extracted from 100 high-quality LDR treatments and stored within a training database. The ML training algorithm matches similar features from a new LDR case to those within the training database to rapidly obtain an initial seed distribution; plans were then further fine-tuned using stochastic optimization. Preimplantation treatment plans generated by the ML algorithm were compared with brachytherapist (BT) treatment plans in terms of planning time (Wilcoxon rank sum, α = 0.05) and dosimetry (1-way analysis of variance, α = 0.05). Qualitative preimplantation plan quality was evaluated by expert LDR radiation oncologists using a Likert scale questionnaire. Results: The average planning time for the ML approach was 0.84 ± 0.57 minutes, compared with 17.88 ± 8.76 minutes for the expert planner (P=.020). Preimplantation plans were dosimetrically equivalent to the BT plans; the average prostate V150% was 4% lower for ML plans (P=.002), although the difference was not clinically significant. Respondents ranked the ML-generated plans as equivalent to expert BT treatment plans in terms of target coverage, normal tissue avoidance, implant confidence, and the need for plan modifications. Respondents had difficulty differentiating between plans generated by a human and those generated by the ML algorithm.
Conclusions: Prostate LDR preimplantation treatment plans of equivalent quality to plans created by brachytherapists can be rapidly generated using ML. The adoption of ML in the brachytherapy workflow is expected to improve LDR treatment plan uniformity while reducing planning time and resources.
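The feature-matching step can be sketched as a nearest-neighbour lookup over stored plans; the feature set and values below are invented for illustration and are not the study's actual features:

```python
import numpy as np

# Hypothetical case features: prostate volume (cc), needle count, target D90 goal.
training_features = np.array([
    [35.0, 16, 110.0],
    [42.5, 18, 112.0],
    [28.0, 14, 108.0],
])
training_plans = ["plan_A", "plan_B", "plan_C"]  # stand-ins for stored seed distributions

def match_initial_plan(new_case, features, plans):
    """Return the stored plan whose (scale-normalized) features are closest;
    this initial plan would then be fine-tuned by stochastic optimization."""
    scale = features.std(axis=0)
    d = np.linalg.norm((features - new_case) / scale, axis=1)
    return plans[int(np.argmin(d))]

print(match_initial_plan(np.array([36.0, 16, 110.5]),
                         training_features, training_plans))  # "plan_A"
```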
NASA Astrophysics Data System (ADS)
Zhou, Weijun; Hong, Xueren; Xie, Baisong; Yang, Yang; Wang, Li; Tian, Jianmin; Tang, Rongan; Duan, Wenshan
2018-02-01
To generate high-quality ion beams through relatively uniform radiation pressure acceleration (RPA) of a common flat foil, a new scheme is proposed to overcome the bending of the target irradiated by a single transversely Gaussian laser. In this scheme, two matched counterpropagating transversely Gaussian laser pulses, a main pulse and an auxiliary pulse, impinge on the foil target at the same time. Two-dimensional (2D) particle-in-cell (PIC) simulations show that the restraint provided by the auxiliary laser effectively suppresses the bending of the foil. As a result, a high-quality monoenergetic ion beam is generated through efficient RPA of the foil target. For example, when two counterpropagating, transversely Gaussian, circularly polarized lasers with normalized amplitudes a1 = 120 and a2 = 30, respectively, impinge on the foil target simultaneously, a 1.3 GeV monoenergetic proton beam with high collimation is finally obtained. Furthermore, the effects of different auxiliary-laser parameters on the ion acceleration are also investigated.
CerebroMatic: A Versatile Toolbox for Spline-Based MRI Template Creation
Wilke, Marko; Altaye, Mekibib; Holland, Scott K.
2017-01-01
Brain image spatial normalization and tissue segmentation rely on prior tissue probability maps. Appropriately selecting these tissue maps becomes particularly important when investigating “unusual” populations, such as young children or elderly subjects. When creating such priors, the disadvantage of applying more deformation must be weighed against the benefit of achieving a crisper image. We have previously suggested that statistically modeling demographic variables, instead of simply averaging images, is advantageous. Both aspects (more vs. less deformation and modeling vs. averaging) were explored here. We used imaging data from 1914 subjects, aged 13 months to 75 years, and employed multivariate adaptive regression splines to model the effects of age, field strength, gender, and data quality. Within the spm/cat12 framework, we compared an affine-only with a low- and a high-dimensional warping approach. As expected, more deformation on the individual level results in lower group dissimilarity. Consequently, effects of age in particular are less apparent in the resulting tissue maps when using a more extensive deformation scheme. Using statistically-described parameters, high-quality tissue probability maps could be generated for the whole age range; they are consistently closer to a gold standard than conventionally-generated priors based on 25, 50, or 100 subjects. Distinct effects of field strength, gender, and data quality were seen. We conclude that an extensive matching for generating tissue priors may model much of the variability inherent in the dataset which is then not contained in the resulting priors. Further, the statistical description of relevant parameters (using regression splines) allows for the generation of high-quality tissue probability maps while controlling for known confounds. The resulting CerebroMatic toolbox is available for download at http://irc.cchmc.org/software/cerebromatic.php. PMID:28275348
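As an illustration of "modeling instead of averaging", a toy hinge-basis (spline) regression in the spirit of MARS; the knot location, age trend, and noise level are invented for the demonstration and are not CerebroMatic's actual model:

```python
import numpy as np

def hinge_basis(x, knots):
    """Design matrix with intercept, linear term, and hinge functions max(0, x-k)."""
    cols = [np.ones_like(x), x]
    for k in knots:
        cols.append(np.maximum(0.0, x - k))
    return np.column_stack(cols)

rng = np.random.default_rng(1)
age = rng.uniform(1, 75, 500)
# Invented piecewise-linear "tissue property" trend with a change at age 20.
trend = np.where(age < 20, 2.0 * age, 40.0 + 0.1 * (age - 20))
y = trend + rng.normal(0.0, 1.0, 500)

X = hinge_basis(age, knots=[20.0])          # one assumed knot
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = X @ coef
print(np.abs(pred - trend).mean())           # fitted trend tracks the true trend
```

Averaging the samples would blur the age effect; the spline basis recovers it explicitly, which is the point of statistically modeling demographic covariates when building priors.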
NASA Technical Reports Server (NTRS)
Whalen, Michael; Schumann, Johann; Fischer, Bernd
2002-01-01
Code certification is a lightweight approach to demonstrating software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to simultaneously generate, from a high-level specification, both the code and all annotations required to certify it. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
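As a generic illustration of the kind of annotation involved (not AUTOBAYES output), here is a hypothetical loop whose invariant is stated explicitly; a certifier would turn such annotations into proof obligations discharged by a theorem prover, whereas this sketch merely checks them at runtime:

```python
def sum_of_squares(n):
    """Sum 0^2 + 1^2 + ... + (n-1)^2, with its loop invariant made explicit."""
    total, i = 0, 0
    while i < n:
        # Loop invariant (a verification condition generator would emit
        # this as a proof obligation; here it is checked at runtime):
        #   total == sum(k*k for k in range(i)) and 0 <= i <= n
        assert total == sum(k * k for k in range(i)) and 0 <= i <= n
        total += i * i
        i += 1
    # Postcondition
    assert total == sum(k * k for k in range(n))
    return total

print(sum_of_squares(5))  # 30
```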
The SCALE Verified, Archived Library of Inputs and Data - VALID
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, William BJ J; Rearden, Bradley T
The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high-quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high-quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate.
Future plans for the VALID library include expansion to additional experiments from the IHECSBE, to experiments from areas beyond criticality safety, such as reactor physics and shielding, and to application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.
Experimental Study of Hysteresis behavior of Foam Generation in Porous Media.
Kahrobaei, S; Vincent-Bonnieu, S; Farajzadeh, R
2017-08-21
Foam can be used for gas mobility control in different subsurface applications. The success of the foam-injection process depends on the foam-generation and propagation rate inside the porous medium. In some cases, foam properties depend on the history of the flow or the concentration of the surfactant, i.e., a hysteresis effect. Foam may show hysteresis by exhibiting multiple states at the same injection conditions, where coarse-textured foam is converted into strong foam with fine texture at a critical injection velocity or pressure gradient. This study investigates the effects of injection velocity and surfactant concentration on foam generation and hysteresis behavior as a function of foam quality. We find that the transition from coarse foam to strong foam (i.e., the minimum pressure gradient for foam generation) is almost independent of flow rate, surfactant concentration, and foam quality. Moreover, hysteresis in foam generation occurs only in the high-quality regime and when the pressure gradient is below a certain value, regardless of the total flow rate and surfactant concentration. We also observe that the rheological behavior of foam is strongly dependent on liquid velocity.
Lundin, Margareta; Lidén, Mats; Magnuson, Anders; Mohammed, Ahmed Abdulilah; Geijer, Håkan; Andersson, Torbjörn; Persson, Anders
2012-07-01
Dual-energy computed tomography (DECT) has been shown to be useful for subtracting bone or calcium in CT angiography and offers the opportunity to produce a virtual non-contrast-enhanced (VNC) image from a series in which contrast agent has been given intravenously. High noise levels and low resolution previously limited the diagnostic value of VNC images created with the first generation of DECT. With the recent introduction of a second generation of DECT, it may be possible to obtain VNC images with better image quality at a lower radiation dose than with the previous generation. The aim was to compare the image quality of the single-energy series with a VNC series obtained with two generations of DECT scanners, using CT of the urinary tract as a model. Thirty patients referred for evaluation of hematuria were examined with an older system (Somatom Definition) and another 30 patients with a newer system (Somatom Definition Flash). One single-energy series was obtained before, and one dual-energy series after, administration of intravenous contrast media. We created a VNC series from the contrast-enhanced images. Image quality was assessed with a visual grading scale evaluation of the VNC series, with the single-energy series as the gold standard. The image quality of the VNC images was rated inferior to the single-energy series for both scanners: OR 11.5-67.3 for the Definition and OR 2.1-2.8 for the Definition Flash. Visual noise and overall quality were rated better with the Flash than with the Definition. Image quality of VNC images obtained with the new generation of DECT is still slightly inferior to native images; however, the difference is smaller with the new system than with the older one.
Video quality assessment method motivated by human visual perception
NASA Astrophysics Data System (ADS)
He, Meiling; Jiang, Gangyi; Yu, Mei; Song, Yang; Peng, Zongju; Shao, Feng
2016-11-01
Research on video quality assessment (VQA) plays a crucial role in improving the efficiency of video coding and the performance of video processing. It is well acknowledged that the motion energy model generates motion energy responses in the middle temporal area by simulating the receptive field of neurons in V1 for the motion perception of the human visual system. Motivated by this biological evidence for visual motion perception, a VQA method is proposed in this paper which comprises a motion perception quality index and a spatial index. To be more specific, the motion energy model is applied to evaluate the temporal distortion severity of each frequency component generated from a difference-of-Gaussian filter bank, which produces the motion perception quality index, and a gradient similarity measure is used to evaluate the spatial distortion of the video sequence to get the spatial quality index. Experimental results on the LIVE, CSIQ, and IVP video databases demonstrate that the random forests regression technique trained on the generated quality indices corresponds closely to human visual perception and offers significant improvements over comparable well-performing methods. The proposed method has higher consistency with subjective perception and higher generalization capability.
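A gradient-similarity spatial index of the kind described can be sketched as follows; the stabilizing constant c and the use of numpy's central-difference gradients are assumptions of this sketch, not the paper's exact formulation:

```python
import numpy as np

def gradient_similarity(ref, dist, c=0.01):
    """Mean gradient-magnitude similarity between two grayscale frames.
    Each pixel's similarity is 2*g_r*g_d / (g_r^2 + g_d^2), stabilized by c;
    identical frames score exactly 1, distortion pushes the score below 1."""
    gy_r, gx_r = np.gradient(ref.astype(float))
    gy_d, gx_d = np.gradient(dist.astype(float))
    g_r = np.hypot(gx_r, gy_r)
    g_d = np.hypot(gx_d, gy_d)
    sim = (2 * g_r * g_d + c) / (g_r**2 + g_d**2 + c)
    return sim.mean()

rng = np.random.default_rng(0)
frame = rng.random((64, 64))
print(gradient_similarity(frame, frame))                               # 1.0
print(gradient_similarity(frame, frame + 0.5 * rng.random((64, 64))))  # < 1.0
```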
A protocol for generating a high-quality genome-scale metabolic reconstruction.
Thiele, Ines; Palsson, Bernhard Ø
2010-01-01
Network reconstructions are a common denominator in systems biology. Bottom-up metabolic network reconstructions have been developed over the last 10 years. These reconstructions represent structured knowledge bases that abstract pertinent information on the biochemical transformations taking place within specific target organisms. The conversion of a reconstruction into a mathematical format facilitates a myriad of computational biological studies, including evaluation of network content, hypothesis testing and generation, analysis of phenotypic characteristics and metabolic engineering. To date, genome-scale metabolic reconstructions for more than 30 organisms have been published and this number is expected to increase rapidly. However, these reconstructions differ in quality and coverage that may minimize their predictive potential and use as knowledge bases. Here we present a comprehensive protocol describing each step necessary to build a high-quality genome-scale metabolic reconstruction, as well as the common trials and tribulations. Therefore, this protocol provides a helpful manual for all stages of the reconstruction process.
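The "conversion of a reconstruction into a mathematical format" usually means assembling a stoichiometric matrix S and analyzing the steady-state system S·v = 0, e.g. by flux balance analysis. A deliberately tiny, hypothetical three-reaction network, solved with scipy's linear programming rather than a dedicated COBRA toolbox:

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites A, B; columns: reactions
# v1 = uptake of A, v2 = A -> B, v3 = B -> biomass). Steady state: S v = 0.
S = np.array([
    [1.0, -1.0,  0.0],   # metabolite A
    [0.0,  1.0, -1.0],   # metabolite B
])
bounds = [(0, 10), (0, 1000), (0, 1000)]  # uptake capped at 10 flux units
c = [0, 0, -1]  # maximize biomass flux v3 (linprog minimizes, hence the sign)

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)     # optimal flux distribution
print(-res.fun)  # maximal biomass flux, limited by the uptake bound
```

A genome-scale reconstruction is the same object with thousands of rows and columns; the protocol's quality-control steps determine whether such predictions are trustworthy.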
A large flat panel multifunction display for military and space applications
NASA Astrophysics Data System (ADS)
Pruitt, James S.
1992-09-01
A flat panel multifunction display (MFD) that offers the size and reliability benefits of liquid crystal display technology while achieving near-CRT display quality is presented. Display generation algorithms that provide exceptional display quality are being implemented in custom VLSI components to minimize MFD size. A high-performance processor converts user-specified display lists to graphics commands used by these components, resulting in high-speed updates of two-dimensional and three-dimensional images. The MFD uses the MIL-STD-1553B data bus for compatibility with virtually all avionics systems. The MFD can generate displays directly from display lists received from the MIL-STD-1553B bus. Complex formats can be stored in the MFD and displayed using parameters from the data bus. The MFD also accepts direct video input and performs special processing on this input to enhance image quality.
Thermally-enhanced oil recovery method and apparatus
Stahl, Charles R.; Gibson, Michael A.; Knudsen, Christian W.
1987-01-01
A thermally-enhanced oil recovery method and apparatus for exploiting deep well reservoirs utilizes electric downhole steam generators to provide supplemental heat to generate high quality steam from hot pressurized water which is heated at the surface. A downhole electric heater placed within a well bore for local heating of the pressurized liquid water into steam is powered by electricity from the above-ground gas turbine-driven electric generators fueled by any clean fuel such as natural gas, distillate or some crude oils, or may come from the field being stimulated. Heat recovered from the turbine exhaust is used to provide the hot pressurized water. Electrical power may be cogenerated and sold to an electric utility to provide immediate cash flow and improved economics. During the cogeneration period (no electrical power to some or all of the downhole units), the oil field can continue to be stimulated by injecting hot pressurized water, which will flash into lower quality steam at reservoir conditions. The heater includes electrical heating elements supplied with three-phase alternating current or direct current. The injection fluid flows through the heater elements to generate high quality steam to exit at the bottom of the heater assembly into the reservoir. The injection tube is closed at the bottom and has radial orifices for expanding the injection fluid to reservoir pressure.
Semi-automated ontology generation within OBO-Edit.
Wächter, Thomas; Schroeder, Michael
2010-06-15
Ontologies and taxonomies have proven highly beneficial for biocuration. The Open Biomedical Ontology (OBO) Foundry alone lists over 90 ontologies mainly built with OBO-Edit. Creating and maintaining such ontologies is a labour-intensive, difficult, manual process. Automating parts of it is of great importance for the further development of ontologies and for biocuration. We have developed the Dresden Ontology Generator for Directed Acyclic Graphs (DOG4DAG), a system which supports the creation and extension of OBO ontologies by semi-automatically generating terms, definitions and parent-child relations from text in PubMed, the web and PDF repositories. DOG4DAG is seamlessly integrated into OBO-Edit. It generates terms by identifying statistically significant noun phrases in text. For definitions and parent-child relations it employs pattern-based web searches. We systematically evaluate each generation step using manually validated benchmarks. The term generation leads to high-quality terms also found in manually created ontologies. Up to 78% of definitions are valid and up to 54% of child-ancestor relations can be retrieved. There is no other validated system that achieves comparable results. By combining the prediction of high-quality terms, definitions and parent-child relations with the ontology editor OBO-Edit we contribute a thoroughly validated tool for all OBO ontology engineers. DOG4DAG is available within OBO-Edit 2.1 at http://www.oboedit.org. Supplementary data are available at Bioinformatics online.
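The statistical ranking of candidate terms can be sketched with a simple log-odds score against a background corpus; this is a stand-in for DOG4DAG's actual significance test, and the phrases and counts below are invented:

```python
from collections import Counter
import math

def significant_phrases(target_counts, background_counts, top=3):
    """Rank candidate noun phrases by a smoothed log-odds score of their
    frequency in the target corpus versus a background corpus."""
    n_t = sum(target_counts.values())
    n_b = sum(background_counts.values())
    scores = {}
    for phrase, k in target_counts.items():
        p_t = k / n_t
        p_b = (background_counts.get(phrase, 0) + 1) / (n_b + len(background_counts))
        scores[phrase] = math.log(p_t / p_b)
    return sorted(scores, key=scores.get, reverse=True)[:top]

# Invented counts: domain-specific phrases should outrank generic ones.
target = Counter({"signal transduction": 8, "cell": 5, "protein kinase": 6, "result": 2})
background = Counter({"cell": 500, "result": 800, "protein kinase": 20, "signal transduction": 10})
print(significant_phrases(target, background))
```

Generic phrases like "result" are frequent everywhere and score low; phrases over-represented in the target text surface as candidate ontology terms.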
A novel approach to generating CER hypotheses based on mining clinical data.
Zhang, Shuo; Li, Lin; Yu, Yiqin; Sun, Xingzhi; Xu, Linhao; Zhao, Wei; Teng, Xiaofei; Pan, Yue
2013-01-01
Comparative effectiveness research (CER) is a scientific method for investigating the effectiveness of alternative intervention methods. In a CER study, clinical researchers typically start with a CER hypothesis and aim to evaluate it by applying a series of medical statistical methods. Traditionally, CER hypotheses are defined manually by clinical researchers. This makes hypothesis generation very time-consuming and leaves the quality of hypotheses heavily dependent on the researchers' skills. Recently, as more electronic medical data are collected, it has become highly promising to apply computerized methods to discover CER hypotheses from clinical data sets. In this poster, we propose a novel approach to automatically generating CER hypotheses based on mining clinical data, and present a case study showing that the approach can help clinical researchers identify potentially valuable hypotheses and eventually define high-quality CER studies.
Rosendo, A; Druet, T; Péry, C; Bidanel, J P
2010-03-01
Correlated effects of selection for components of litter size on carcass and meat quality traits were estimated using data from 3 lines of pigs derived from the same Large White base population. Two lines were selected for 6 generations on high ovulation rate at puberty (OR) or high prenatal survival corrected for ovulation rate in the first 2 parities (PS). The third line was an unselected control (CON). The 3 lines were kept for a 7th generation, but without any selection. Carcass and meat quality traits were recorded on the 5th to 7th generation of the experiment. Carcass traits included dressing percentage, carcass length (LGTH), average backfat thickness (ABT), estimated lean meat content, and 8 carcass joint weight traits. Meat quality traits included pH recorded 24 h after slaughter (pH24) of LM, gluteus superficialis (GS), biceps femoris (BF), and adductor femoris (AD) muscles, as well as reflectance and water-holding capacity (WHC) of GS and BF muscles. Heritabilities of carcass and meat quality traits and their genetic correlations with OR and PS were estimated using REML methodology applied to a multiple trait animal model. Correlated responses to selection were then estimated by computing differences between OR or PS and CON lines at generations 5 to 7 using least squares and mixed model methodology. Heritability (h(2)) estimates were 0.08 +/- 0.04, 0.58 +/- 0.10, 0.70 +/- 0.10, and 0.74 +/- 0.10 for dressing percentage, LGTH, ABT, and lean meat content, respectively, ranged from 0.28 to 0.72 for carcass joint traits, from 0.28 to 0.45 for pH24 and reflectance measurements, and from 0.03 to 0.11 for WHC measurements. Both OR and PS had weak genetic correlations with carcass (r(G) = -0.09 to 0.17) and most meat quality traits. Selection for OR did not affect any carcass composition or meat quality trait. 
Correlated responses to selection for PS were also limited, with the exception of a decrease in pH24 of GS and BF muscles (-0.12 to -0.14 after 6 generations; P < 0.05), a decrease in WHC of GS muscle (-18.9 s after 6 generations; P < 0.05), and a tendency toward an increase in loin weight (0.44 kg after 6 generations; P < 0.10).
Cahenzli, Fabian; Wenk, Barbara A; Erhardt, Andreas
2015-07-01
Recent studies with diverse taxa have shown that parents can use their experience of the environment to adapt their offspring's phenotype to the same environmental conditions. Offspring would then perform best under the environmental conditions experienced by their parents, owing to transgenerational phenotypic plasticity; this effect has been dubbed transgenerational acclimatization. However, it had never been demonstrated that parents can subsequently ensure the appropriate environmental conditions so that offspring actually benefit from transgenerational acclimatization. We reared Pieris rapae larvae in the parental generation on high-nitrogen and low-nitrogen host plants, and reared the offspring (F1) of both treatments again on high- and low-nitrogen plants. Furthermore, we tested whether females prefer to oviposit on high- or low-nitrogen host plants in two-way choice tests. We show not only that females adapt their offspring's phenotype to the host-plant quality that they themselves experienced, but also that females mainly oviposit on hosts of the quality to which they adapt their offspring. Moreover, effects of the larval host plant on female oviposition preference increased across two generations in F1 females acclimatized to low-nitrogen host plants, showing an adaptive host shift from one generation to the next. These findings may have profound implications for host-race formation and sympatric speciation.
USDA-ARS?s Scientific Manuscript database
Tomato (Solanum lycopersicum L.) is an excellent plant model for unraveling physiological processes, fruit quality and fruit shelf determinants, stress responsive signaling, pathogenicity, and ripening development in climacteric fruits. Tomato is a popular vegetable, and along with potato, it is cla...
Study Methods for Improving Quality Learning and Performance in Higher Education
ERIC Educational Resources Information Center
Mutsotso, S. N.; Abenga, E. S. B.
2010-01-01
Education is an investment to development and poor study methods should not compromise the mandate of higher education institutions to generate, preserve and disseminate knowledge and produce high quality graduates. Universities admit students with varying backgrounds in terms of learning/study styles, levels of preparedness and concepts of…
New generation of compact high power disk lasers
NASA Astrophysics Data System (ADS)
Feuchtenbeiner, Stefanie; Zaske, Sebastian; Schad, Sven-Silvius; Gottwald, Tina; Kuhn, Vincent; Kumkar, Sören; Metzger, Bernd; Killi, Alexander; Haug, Patrick; Speker, Nicolai
2018-02-01
New technological developments in high power disk lasers emitting at 1030 nm are presented. These include the latest generation of TRUMPF's TruDisk product line offering high power disk lasers with up to 6 kW output power and beam qualities of up to 4 mm·mrad. With these compact devices a footprint reduction of 50% compared to the previous model could be achieved while at the same time improving robustness and increasing system efficiency. In the context of Industry 4.0, the new generation of TruDisk lasers features synchronized data recording from all sensors, offering high-quality data for virtual analyses. The lasers therefore provide optimal hardware requirements for services like Condition Monitoring and Predictive Maintenance. We will also discuss their innovative and space-saving cooling architecture, which allows operation of the laser under very critical ambient conditions. Furthermore, an outlook on extending the new disk laser platform to higher power levels will be given. We will present a disk laser with 8 kW laser power out of a single disk with a beam quality of 5 mm·mrad using a 125 μm fiber, which makes it ideally suited for cutting and welding applications. The flexibility of the disk laser platform also enables the realization of a wide variety of beam guiding setups. As an example a new scheme called BrightLine Weld will be discussed. This technology allows for an almost spatter-free laser welding process, even at high feed rates.
Monroe, J Grey; Allen, Zachariah A; Tanger, Paul; Mullen, Jack L; Lovell, John T; Moyers, Brook T; Whitley, Darrell; McKay, John K
2017-01-01
Recent advances in nucleic acid sequencing technologies have led to a dramatic increase in the number of markers available for generating genetic linkage maps. This increased marker density can be used to improve genome assemblies as well as add much-needed resolution for loci controlling variation in ecologically and agriculturally important traits. However, traditional genetic map construction methods applied to these large marker datasets can be computationally prohibitive and highly error prone. We present TSPmap, a method which implements both approximate and exact Traveling Salesperson Problem solvers to generate linkage maps. We demonstrate that for datasets with large numbers of genomic markers (e.g. 10,000) and in multiple population types generated from inbred parents, TSPmap can rapidly produce high quality linkage maps with low sensitivity to missing and erroneous genotyping data compared to two other benchmark methods, JoinMap and MSTmap. TSPmap is open source and freely available as an R package. With the advancement of low cost sequencing technologies, the number of markers used in the generation of genetic maps is expected to continue to rise. TSPmap will be a useful tool to handle such large datasets into the future, quickly producing high quality maps using a large number of genomic markers.
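The core idea of ordering markers by solving a Traveling Salesperson Problem over pairwise marker distances can be illustrated with a greedy nearest-neighbour heuristic. TSPmap itself uses approximate and exact TSP solvers, so this is only a toy stand-in under assumed inputs (a precomputed distance matrix, e.g. estimated recombination fractions):

```python
def nearest_neighbour_order(dist):
    """Order markers with a greedy nearest-neighbour TSP heuristic.
    dist[i][j] is a pairwise marker distance; the path visiting all
    markers with small total distance approximates the linkage order."""
    n = len(dist)
    unvisited = set(range(1, n))
    order = [0]  # start from marker 0 (assumed to be a chromosome end)
    while unvisited:
        last = order[-1]
        nxt = min(unvisited, key=lambda j: dist[last][j])
        order.append(nxt)
        unvisited.remove(nxt)
    return order

# Markers laid out on a line; distance = separation, so the heuristic
# should recover the linear order when started from an end marker.
pos = [0.0, 1.0, 2.5, 4.0, 6.0]
dist = [[abs(a - b) for b in pos] for a in pos]
order = nearest_neighbour_order(dist)  # → [0, 1, 2, 3, 4]
```

Real TSP solvers (as used by TSPmap) additionally handle missing and erroneous genotypes, where a greedy heuristic is easily misled.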
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, X. Q., E-mail: xq-shen@aist.go.jp; Takahashi, T.; Ide, T.
2015-09-28
We investigate the generation mechanisms of micro-cracks (MCs) in an ultra-thin AlN/GaN superlattice (SL) structure grown on Si(110) substrates by metalorganic chemical vapor deposition. The SL is intended to be used as an interlayer (IL) for relaxing tensile stress and obtaining high-quality crack-free GaN grown on Si substrates. It is found that the MCs can be generated by two different mechanisms, in which large mismatches of the lattice constant (LC) and the coefficient of thermal expansion (CTE) play key roles. Different MC configurations (low-density and high-density MCs) are observed, which are considered to be formed during the different growth stages (SL growth and cooling-down processes) due to the LC and CTE effects. In-situ and ex-situ experimental results support these interpretations of the MC generation mechanisms. This mechanistic understanding makes it possible to optimize the SL IL structure for growing high-quality crack-free GaN films on Si substrates for optical and electronic device applications.
NASA Astrophysics Data System (ADS)
Cernansky, Robert; Martini, Francesco; Politi, Alberto
2018-02-01
We demonstrate on-chip generation of correlated pairs of photons in the near-visible spectrum using a CMOS-compatible PECVD silicon nitride photonic device. Photons are generated via spontaneous four-wave mixing enhanced by a ring resonator with a high quality factor (Q) of 320,000, resulting in a generation rate of 950,000 pairs/mW. The high brightness of this source offers the opportunity to expand photonic quantum technologies over a broad wavelength range and provides a path to develop fully integrated quantum chips working at room temperature.
USDA-ARS?s Scientific Manuscript database
Next-generation sequencing technology such as genotyping-by-sequencing (GBS) made low-cost, but often low-coverage, whole-genome sequencing widely available. Extensive inbreeding in crop plants provides an untapped, high quality source of phased haplotypes for imputing missing genotypes. We introduc...
Peck, Michelle A; Sturk-Andreaggi, Kimberly; Thomas, Jacqueline T; Oliver, Robert S; Barritt-Ross, Suzanne; Marshall, Charla
2018-05-01
Generating mitochondrial genome (mitogenome) data from reference samples in a rapid and efficient manner is critical to harnessing the greater power of discrimination of the entire mitochondrial DNA (mtDNA) marker. The method of long-range target enrichment, Nextera XT library preparation, and Illumina sequencing on the MiSeq is a well-established technique for generating mitogenome data from high-quality samples. To this end, a validation was conducted for this mitogenome method processing up to 24 samples simultaneously along with analysis in the CLC Genomics Workbench and utilizing the AQME (AFDIL-QIAGEN mtDNA Expert) tool to generate forensic profiles. This validation followed the Federal Bureau of Investigation's Quality Assurance Standards (QAS) for forensic DNA testing laboratories and the Scientific Working Group on DNA Analysis Methods (SWGDAM) validation guidelines. The evaluation of control DNA, non-probative samples, blank controls, mixtures, and nonhuman samples demonstrated the validity of this method. Specifically, the sensitivity was established at ≥25 pg of nuclear DNA input for accurate mitogenome profile generation. Unreproducible low-level variants were observed in samples with low amplicon yields. Further, variant quality was shown to be a useful metric for identifying sequencing error and crosstalk. Success of this method was demonstrated with a variety of reference sample substrates and extract types. These studies further demonstrate the advantages of using NGS techniques by highlighting the quantitative nature of heteroplasmy detection. The results presented herein from more than 175 samples processed in ten sequencing runs, show this mitogenome sequencing method and analysis strategy to be valid for the generation of reference data. Copyright © 2018 Elsevier B.V. All rights reserved.
Recent developments in novel freezing and thawing technologies applied to foods.
Wu, Xiao-Fei; Zhang, Min; Adhikari, Benu; Sun, Jincai
2017-11-22
This article reviews the recent developments in novel freezing and thawing technologies applied to foods. These novel technologies improve the quality of frozen and thawed foods and are energy efficient. The novel technologies applied to freezing include pulsed electric field pre-treatment, ultra-low temperature, ultra-rapid freezing, ultra-high pressure and ultrasound. The novel technologies applied to thawing include ultra-high pressure, ultrasound, high voltage electrostatic field (HVEF), and radio frequency. Ultra-low temperature and ultra-rapid freezing promote the formation and uniform distribution of small ice crystals throughout frozen foods. Ultra-high pressure and ultrasound assisted freezing are non-thermal methods and shorten the freezing time and improve product quality. Ultra-high pressure and HVEF thawing generate high heat transfer rates and accelerate the thawing process. Ultrasound and radio frequency thawing can facilitate thawing process by volumetrically generating heat within frozen foods. It is anticipated that these novel technologies will be increasingly used in food industries in the future.
A Procedure for High Resolution Satellite Imagery Quality Assessment
Crespi, Mattia; De Vendictis, Laura
2009-01-01
Data products generated from High Resolution Satellite Imagery (HRSI) are routinely evaluated during the so-called in-orbit test period, in order to verify if their quality fits the desired features and, if necessary, to obtain the image correction parameters to be used at the ground processing center. Nevertheless, it is often useful to have tools to evaluate image quality also at the final user level. Image quality is defined by some parameters, such as the radiometric resolution and its accuracy, represented by the noise level, and the geometric resolution and sharpness, described by the Modulation Transfer Function (MTF). This paper proposes a procedure to evaluate these image quality parameters; the procedure was implemented in a suitable software and tested on high resolution imagery acquired by the QuickBird, WorldView-1 and Cartosat-1 satellites. PMID:22412312
High-power Broadband Organic THz Generator
Jeong, Jae-Hyeok; Kang, Bong-Joo; Kim, Ji-Soo; Jazbinsek, Mojca; Lee, Seung-Heon; Lee, Seung-Chul; Baek, In-Hyung; Yun, Hoseop; Kim, Jongtaek; Lee, Yoon Sup; Lee, Jae-Hyeok; Kim, Jae-Ho; Rotermund, Fabian; Kwon, O-Pil
2013-01-01
The high-power broadband terahertz (THz) generator is an essential tool for a wide range of THz applications. Here, we present a novel highly efficient electro-optic quinolinium single crystal for THz wave generation. For obtaining intense and broadband THz waves by optical-to-THz frequency conversion, a quinolinium crystal was developed to fulfill all the requirements, which are in general extremely difficult to maintain simultaneously in a single medium, such as a large macroscopic electro-optic response and excellent crystal characteristics including a large crystal size with desired facets, good environmental stability, high optical quality, wide transparency range, and controllable crystal thickness. Compared to the benchmark inorganic and organic crystals, the new quinolinium crystal possesses excellent crystal properties and THz generation characteristics with broader THz spectral coverage and higher THz conversion efficiency at the technologically important pump wavelength of 800 nm. Therefore, the quinolinium crystal offers great potential for efficient and gap-free broadband THz wave generation. PMID:24220234
Predictive model for CO2 generation and decay in building envelopes
NASA Astrophysics Data System (ADS)
Aglan, Heshmat A.
2003-01-01
Understanding carbon dioxide generation and decay patterns in buildings with high occupancy levels helps characterize their indoor air quality, air change rates, percent fresh-air makeup, and occupancy pattern, and indicates how a variable-air-volume system can be modulated to offset undesirable CO2 levels. A mathematical model governing the generation and decay of CO2 in building envelopes with forced ventilation due to high occupancy is developed. The model has been verified experimentally in a newly constructed, energy-efficient healthy house. It was shown that the model accurately predicts the CO2 concentration at any time during the generation and decay processes.
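The generation/decay behaviour such a model captures is, in its simplest form, a single-zone well-mixed mass balance, V dC/dt = G + Q(C_out − C), with exponential approach to a steady state. A minimal sketch follows; the parameter values (room volume, ventilation rate, per-person generation) are illustrative assumptions, not the paper's:

```python
import math

def co2_concentration(t, c0, c_out, g, q, v):
    """Single-zone CO2 mass balance: V dC/dt = G + Q*(C_out - C).
    t: time (h), c0: initial CO2 (ppm), c_out: outdoor CO2 (ppm),
    g: generation rate (ppm*m^3/h), q: ventilation (m^3/h), v: volume (m^3).
    Closed-form solution: C(t) = C_ss + (C0 - C_ss) * exp(-(Q/V) t)."""
    c_ss = c_out + g / q          # steady-state concentration (ppm)
    k = q / v                     # air-change rate (1/h)
    return c_ss + (c0 - c_ss) * math.exp(-k * t)

# Occupied phase: 10 people, ~0.02 m^3 CO2/h each, in a 500 m^3 room
# ventilated at 250 m^3/h (all values illustrative).
G = 10 * 0.02 * 1e6               # ppm*m^3/h
rise = co2_concentration(2.0, 400, 400, G, 250, 500)
# Decay phase after occupants leave (G = 0), starting from the risen level.
decay = co2_concentration(2.0, rise, 400, 0.0, 250, 500)
```

The same closed form covers both generation (rising toward C_ss = C_out + G/Q) and decay (falling back toward C_out), which is what makes such models useful for inferring air-change rates from measured CO2 traces.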
Irwin, Jodi A; Saunier, Jessica L; Strouss, Katharine M; Sturk, Kimberly A; Diegoli, Toni M; Just, Rebecca S; Coble, Michael D; Parson, Walther; Parsons, Thomas J
2007-06-01
In an effort to increase the quantity, breadth and availability of mtDNA databases suitable for forensic comparisons, we have developed a high-throughput process to generate approximately 5000 control region sequences per year from regional US populations, global populations from which the current US population is derived and global populations currently under-represented in available forensic databases. The system utilizes robotic instrumentation for all laboratory steps from pre-extraction through sequence detection, and a rigorous eight-step, multi-laboratory data review process with entirely electronic data transfer. Over the past 3 years, nearly 10,000 control region sequences have been generated using this approach. These data are being made publicly available and should further address the need for consistent, high-quality mtDNA databases for forensic testing.
NASA Astrophysics Data System (ADS)
Kai, Takaaki; Tanaka, Yuji; Kaneda, Hirotoshi; Kobayashi, Daichi; Tanaka, Akio
Recently, doubly fed induction generators (DFIGs) and synchronous generators have been widely applied to wind power generation, with variable-speed control and power-factor control used to capture wind energy with high efficiency and to maintain high power-system voltage quality. In variable-speed control, a wind speed or a generator speed is used for maximum power point tracking. However, the behavior of wind-power fluctuations due to wind-speed variation has not yet been investigated for these controls. The authors discuss power smoothing by these controls for a DFIG interconnected to a 6.6 kV distribution line. The performance is verified using the power system simulation software PSCAD/EMTDC with actual wind speed data and is examined via an approximate equation relating wind-power fluctuation to wind-speed variation.
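The maximum power point tracking mentioned above amounts to holding the tip-speed ratio λ = ωR/v at its optimum, since captured power scales as P = ½ρπR²C_p v³. A minimal sketch; the rotor radius, C_p, and λ_opt values are illustrative assumptions, not the paper's:

```python
import math

def aerodynamic_power(v_wind, radius, cp=0.45, rho=1.225):
    """Captured wind power P = 0.5 * rho * pi * R^2 * Cp * v^3.
    cp and rho are illustrative (air density at sea level, typical
    peak power coefficient), not values from the paper."""
    return 0.5 * rho * math.pi * radius**2 * cp * v_wind**3

def mppt_speed_reference(v_wind, radius, lambda_opt=8.0):
    """Rotor-speed reference (rad/s) that holds the tip-speed ratio
    lambda = omega * R / v at its optimum for maximum power capture."""
    return lambda_opt * v_wind / radius

# A 40 m rotor in a 10 m/s wind (illustrative case)
p = aerodynamic_power(10.0, 40.0)      # ~1.4 MW
w = mppt_speed_reference(10.0, 40.0)   # 2.0 rad/s
```

The cubic dependence on wind speed is exactly why wind-speed variation translates into large power fluctuations, motivating the smoothing controls the paper studies.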
Boiler for generating high quality vapor
NASA Technical Reports Server (NTRS)
Gray, V. H.; Marto, P. J.; Joslyn, A. W.
1972-01-01
The boiler supplies vapor for use in turbines by imparting a high angular velocity to the liquid annulus in a heated rotating drum. The drum boiler provides a sharp interface between boiling liquid and vapor, thereby inhibiting the formation of unwanted liquid droplets.
Spectral and spatial characterisation of laser-driven positron beams
Sarri, G.; Warwick, J.; Schumaker, W.; ...
2016-10-18
The generation of high-quality relativistic positron beams is a central area of research in experimental physics, due to their potential relevance in a wide range of scientific and engineering areas, ranging from fundamental science to practical applications. There is now growing interest in developing hybrid machines that will combine plasma-based acceleration techniques with more conventional radio-frequency accelerators, in order to minimise the size and cost of these machines. Here we report on recent experiments on laser-driven generation of high-quality positron beams using a relatively low-energy and potentially table-top laser system. Lastly, the results obtained indicate that current technology allows the creation, in a compact setup, of positron beams suitable for injection into radio-frequency accelerators.
Library construction for next-generation sequencing: Overviews and challenges
Head, Steven R.; Komori, H. Kiyomi; LaMere, Sarah A.; Whisenant, Thomas; Van Nieuwerburgh, Filip; Salomon, Daniel R.; Ordoukhanian, Phillip
2014-01-01
High-throughput sequencing, also known as next-generation sequencing (NGS), has revolutionized genomic research. In recent years, NGS technology has steadily improved, with costs dropping and the number and range of sequencing applications increasing exponentially. Here, we examine the critical role of sequencing library quality and consider important challenges when preparing NGS libraries from DNA and RNA sources. Factors such as the quantity and physical characteristics of the RNA or DNA source material as well as the desired application (e.g., genome sequencing, targeted sequencing, RNA-seq, ChIP-seq, RIP-seq, and methylation) are addressed in the context of preparing high quality sequencing libraries. In addition, the current methods for preparing NGS libraries from single cells are also discussed. PMID:24502796
DOE Office of Scientific and Technical Information (OSTI.GOV)
Niu, K; Li, K; Smilowitz, J
2014-06-15
Purpose: To develop a high-quality 4D cone beam CT (4DCBCT) method that is immune to patient/couch truncations and to investigate its application in adaptive replanning of lung XRT. Methods: In this study, IRB-approved human-subject CBCT data were acquired using a Varian on-board imager with a 1-minute rotation time. The acquired projection data were retrospectively sorted into 20 respiratory phase bins, from which 4DCBCT images with high SNR and high temporal resolution were generated using Prior Image Constrained Compressed Sensing (PICCS). Couch and patient truncations generate strong data inconsistency in the projection data and artifacts in the 4DCBCT image. They were addressed using an adaptive PICCS method. The artifact-free PICCS-4DCBCT images were used to generate adaptive treatment plans for the same patient at the 10th (day 21) and 30th (day 47) fractions. Dosimetric impacts with and without PICCS-4DCBCT were evaluated by isodose distributions, DVHs, and other dosimetric factors. Results: The adaptive PICCS-4DCBCT method improves image quality by removing residual truncation artifacts; the measured universal image quality increased by 37%. The isodose lines and DVHs with PICCS-4DCBCT-based adaptive replanning were significantly more conformal to the PTV than without replanning, due to changes in patient anatomy over the course of treatment. The mean dose to the PTV at the 10th fraction was 63.1 Gy with replanning and 64.2 Gy without, where the prescribed dose was 60 Gy in 2 Gy × 30 fractions. The mean dose to the PTV at the 30th fraction was 61.6 Gy with replanning and 64.9 Gy without. Lung V20 was 37.1%, 41.9% and 43.3% for the original, 10th-fraction and 30th-fraction plans without replanning; with replanning, lung V20 was 37.1%, 32% and 27.8%. Conclusion: 4DCBCT imaging using adaptive PICCS can generate high-quality, artifact-free images that can potentially be used for replanning to improve radiotherapy of the lung.
K Niu, K Li, J Smilowitz: Nothing to Disclose. G Chen: General Electric Company Research funded, Siemens AG Research funded, Varian Medical Systems Research funded, Hologic Research funded.
What Do We Learn from Recall Consumption Data?
ERIC Educational Resources Information Center
Battistin, Erich; Miniaci, Raffaele; Weber, Guglielmo
2003-01-01
In this paper, we use two complementary Italian data sources (the 1995 ISTAT and Bank of Italy household surveys) to generate household-specific nondurable expenditure in the Bank of Italy sample that contains relatively high-quality income data. We show that food expenditure data are of comparable quality and informational content across the two…
USDA-ARS?s Scientific Manuscript database
Predicted rising global temperatures due to climate change have generated a demand for crops that are resistant to yield and quality losses from heat stress. Broccoli (Brassica oleracea var. italica) is a cool weather crop with high temperatures during production decreasing both head quality and yie...
USDA-ARS?s Scientific Manuscript database
Grain yield and semolina quality traits are essential selection criteria in durum wheat breeding. However, high cost of phenotypic screening limited the selection only on small number of lines and at later generations. This leads to relatively low selection efficiency due to the advancement of undes...
Moll, Karen M; Zhou, Peng; Ramaraj, Thiruvarangan; Fajardo, Diego; Devitt, Nicholas P; Sadowsky, Michael J; Stupar, Robert M; Tiffin, Peter; Miller, Jason R; Young, Nevin D; Silverstein, Kevin A T; Mudge, Joann
2017-08-04
Third-generation sequencing technologies, with sequencing reads in the tens of kilobases, facilitate genome assembly by spanning ambiguous regions and improving continuity. This has been critical for plant genomes, which are difficult to assemble due to high repeat content, gene family expansions, segmental and tandem duplications, and polyploidy. Recently, high-throughput mapping and scaffolding strategies have further improved continuity. Together, these long-range technologies enable quality draft assemblies of complex genomes in a cost-effective and timely manner. Here, we present high quality genome assemblies of the model legume plant Medicago truncatula (R108) using PacBio, Dovetail Chicago (hereafter, Dovetail) and BioNano technologies. To test these technologies for plant genome assembly, we generated five assemblies using all possible combinations and orderings of the three technologies in the R108 assembly. While the BioNano and Dovetail joins overlapped, they also showed complementary gains in continuity and join numbers. Both technologies spanned repetitive regions that PacBio alone was unable to bridge. Combining technologies, particularly Dovetail followed by BioNano, resulted in notable improvements compared to Dovetail or BioNano alone. A combination of PacBio, Dovetail, and BioNano was used to generate a high quality draft assembly of R108, a M. truncatula accession widely used in studies of functional genomics. As a test of the usefulness of the resulting genome sequence, the new R108 assembly was used to pinpoint breakpoints and characterize flanking sequence of a previously identified translocation between chromosomes 4 and 8, identifying more than 22.7 Mb of novel sequence not present in the earlier A17 reference assembly. Adding Dovetail followed by BioNano data yielded complementary improvements in continuity over the original PacBio assembly.
This strategy proved efficient and cost-effective for developing a quality draft assembly compared to traditional reference assemblies.
Hansen, Christian Rønn; Nielsen, Morten; Bertelsen, Anders Smedegaard; Hazell, Irene; Holtved, Eva; Zukauskaite, Ruta; Bjerregaard, Jon Kroll; Brink, Carsten; Bernchou, Uffe
2017-11-01
The quality of radiotherapy planning has improved substantially in the last decade with the introduction of intensity modulated radiotherapy. The purpose of this study was to analyze the plan quality and efficacy of automatically (AU) generated VMAT plans for inoperable esophageal cancer patients. Thirty-two consecutive inoperable patients with esophageal cancer originally treated with manually (MA) generated volumetric modulated arc therapy (VMAT) plans were retrospectively replanned using an auto-planning engine. All plans were optimized with one full 6MV VMAT arc giving 60 Gy to the primary target and 50 Gy to the elective target. The planning techniques were blinded before clinical evaluation by three specialized oncologists. To supplement the clinical evaluation, the optimization time for the AU plan was recorded along with DVH parameters for all plans. Upon clinical evaluation, the AU plan was preferred for 31/32 patients, and for one patient, there was no difference in the plans. In terms of DVH parameters, similar target coverage was obtained between the two planning methods. The mean dose for the spinal cord increased by 1.8 Gy using AU (p = .002), whereas the mean lung dose decreased by 1.9 Gy (p < .001). The AU plans were more modulated as seen by the increase of 12% in mean MUs (p = .001). The median optimization time for AU plans was 117 min. The AU plans were in general preferred and showed a lower mean dose to the lungs. The automation of the planning process generated esophageal cancer treatment plans quickly and with high quality.
Axis: Generating Explanations at Scale with Learnersourcing and Machine Learning
ERIC Educational Resources Information Center
Williams, Joseph Jay; Kim, Juho; Rafferty, Anna; Heffernan, Neil; Maldonado, Samuel; Gajos, Krzysztof Z.; Lasecki, Walter S.; Heffernan, Neil
2016-01-01
While explanations may help people learn by providing information about why an answer is correct, many problems on online platforms lack high-quality explanations. This paper presents AXIS (Adaptive eXplanation Improvement System), a system for obtaining explanations. AXIS asks learners to generate, revise, and evaluate explanations as they solve…
DESIGN OF AN ENGINE GENERATOR FOR THE RURAL POOR: A SUSTAINABLE SYSTEMS APPROACH
The system consists of a fuel source (a biodiesel system), a combustion/boiler system, and a steam engine/generator. The biodiesel system proved to be simple in design and low in cost; it successfully made high-quality biodiesel in an efficient manner. The main issues to ...
Cross-species Extrapolation of EDC Toxicity: Consequences for Screening Programs
Many structural and functional aspects of the vertebrate hypothalamic-pituitary-gonadal (HPG) axis are known to be highly conserved, but the full significance of this from a toxicological perspective has received comparatively little attention. High-quality data generated through...
A revision of the subtract-with-borrow random number generators
NASA Astrophysics Data System (ADS)
Sibidanov, Alexei
2017-12-01
The most popular and widely used subtract-with-borrow generator, also known as RANLUX, is reimplemented as a linear congruential generator using large-integer arithmetic with a modulus size of 576 bits. Modern computers, as well as the specific structure of the modulus inferred from RANLUX, allow for the development of a fast modular multiplication - the core of the procedure - which was previously believed to be slow and too costly in terms of computing resources. Our tests show a significant gain in generation speed, comparable with other fast, high-quality random number generators. An additional feature is fast skipping of generator states, leading to a seeding scheme which guarantees the uniqueness of random number sequences. Licensing provisions: GPLv3. Programming languages: C++, C, Assembler.
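The subtract-with-borrow recurrence that RANLUX builds on is compact enough to sketch directly. The following is a minimal illustration assuming the standard Marsaglia-Zaman parameters (base b = 2^24, lags r = 24, s = 10); it shows only the plain recurrence, not the paper's 576-bit LCG reformulation, and the seed is a placeholder rather than the RANLUX seeding scheme.

```python
# Minimal sketch of the subtract-with-borrow recurrence underlying RANLUX:
# x_n = (x_{n-S} - x_{n-R} - carry) mod B, with a one-bit borrow.
B = 1 << 24          # base
R, S = 24, 10        # long and short lags

def swb_stream(seed_state, n):
    """Yield n subtract-with-borrow values from an initial state of R values."""
    state = list(seed_state)       # last R values, oldest first
    carry = 0
    out = []
    for _ in range(n):
        t = state[-S] - state[-R] - carry
        carry = 1 if t < 0 else 0  # borrow propagates to the next step
        x = t % B                  # Python's % maps negative t back into [0, B)
        state.pop(0)
        state.append(x)
        out.append(x)
    return out

vals = swb_stream(range(1, R + 1), 1000)
```

The published generator adds the "luxury" decimation step that discards part of each block to break lag correlations; this sketch omits it.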
Smooth and flat phase-locked Kerr frequency comb generation by higher order mode suppression
Huang, S.-W.; Liu, H.; Yang, J.; Yu, M.; Kwong, D.-L.; Wong, C. W.
2016-01-01
The high-Q microresonator is perceived as a promising platform for optical frequency comb generation via dissipative soliton formation. To achieve a higher quality factor and obtain the necessary anomalous dispersion, multi-mode waveguides were previously implemented in Si3N4 microresonators. However, coupling between different transverse mode families in multi-mode waveguides results in periodic disruption of the dispersion and quality factor, and consequently perturbs dissipative soliton formation and imposes amplitude modulation on the corresponding spectrum. Careful choice of the pump wavelength to avoid the mode-crossing region is thus critical in conventional Si3N4 microresonators. Here, we report a novel design of Si3N4 microresonator in which single-mode operation, high quality factor, and anomalous dispersion are attained simultaneously. The microresonator consists of uniform single-mode waveguides in the semi-circular regions, to eliminate bending-induced mode coupling, and adiabatically tapered waveguides in the straight regions, to avoid excitation of higher-order modes. The intrinsic quality factor of the microresonator reaches 1.36 × 10^6 while the group velocity dispersion remains anomalous at −50 fs^2/mm. With this microresonator, we demonstrate that broadband phase-locked Kerr frequency combs with flat and smooth spectra can be generated by pumping at any resonance in the optical C-band. PMID:27181420
NASA Astrophysics Data System (ADS)
Rolfe, John; Windle, Jill
2011-12-01
Policymakers wanting to increase protection of the Great Barrier Reef from pollutants generated by agriculture need to identify when measures to improve water quality generate benefits to society that outweigh the costs involved. The research reported in this paper contributes in several ways. First, it uses improved scientific understanding of the links between management changes and reef health to bring together the analysis of costs and benefits of marginal changes, helping to demonstrate the appropriate way of addressing policy questions relating to reef protection. Second, it uses these scientific relationships to frame a choice experiment to value the benefits of improved reef health, with the results of mixed logit (random parameter) models linking improvements explicitly to changes in "water quality units." Third, the research demonstrates how protection values are consistent across a broader population, with some limited evidence of distance effects. Fourth, the reported information on marginal costs and benefits provides policymakers with information to help improve management decisions. The results indicate that while there is potential for water quality improvements to generate net benefits, high-cost water quality improvements are generally uneconomic. A major policy implication is that cost thresholds for key pollutants should be set to avoid more expensive water quality proposals being selected.
High-quality uniform dry transfer of graphene to polymers.
Lock, Evgeniya H; Baraket, Mira; Laskoski, Matthew; Mulvaney, Shawn P; Lee, Woo K; Sheehan, Paul E; Hines, Daniel R; Robinson, Jeremy T; Tosado, Jacob; Fuhrer, Michael S; Hernández, Sandra C; Walton, Scott G
2012-01-11
In this paper we demonstrate high-quality, uniform dry transfer of graphene grown by chemical vapor deposition on copper foil to polystyrene. The dry transfer exploits an azide linker molecule to establish a covalent bond to graphene and to generate greater graphene-polymer adhesion compared to that of the graphene-metal foil. Thus, this transfer approach provides a novel alternative route for graphene transfer, which allows for the metal foils to be reused. © 2011 American Chemical Society
NASA Astrophysics Data System (ADS)
Millstein, D.; Zhai, P.; Menon, S.
2011-12-01
Over the past decade, significant reductions of NOx and SOx emissions from coal-burning power plants in the U.S. have been achieved due to regulatory action and substitution of new generation towards natural gas and wind power. Low natural gas prices, ever-decreasing solar generation costs, and proposed regulatory changes, such as to the Cross State Air Pollution Rule, promise further long-run coal power plant emission reductions. Reduced power plant emissions have the potential to affect ozone and particulate air quality and to influence regional climate through aerosol-cloud interactions and visibility effects. Here we investigate, on a national scale, the effects on future (~2030) air quality and regional climate of power plant emission regulations, in contrast to and in combination with policies designed to aggressively promote solar electricity generation. A sophisticated, economics- and engineering-based hourly power generation dispatch model is developed to explore the integration of significant solar generation resources (>10% on an energy basis) at various regions across the country, providing detailed estimates of the substitution of solar generation for fossil fuel generation resources. Future air pollutant emissions from all sectors of the economy are scaled based on the U.S. Environmental Protection Agency's National Emission Inventory to account for activity changes based on population and economic projections derived from county-level U.S. Census data and the Energy Information Administration's Annual Energy Outlook. Further adjustments are made for technological and regulatory changes applicable within various sectors, for example, emission-intensity adjustments to on-road diesel trucking due to exhaust treatment and improved engine design. The future year 2030 is selected for the emissions scenarios to allow for the development of significant solar generation resources.
A regional climate and air quality model (the Weather Research and Forecasting, WRF, model) is used to investigate the effects of the various solar generation scenarios given emissions projections that account for the changing regulatory environment, economic and population growth, and technological change. The results will help quantify the potential air quality benefits of promoting solar electricity generation in regions with high penetration of coal-fired power generation, in contrast to current national solar incentives, which are based only on solar generation capacity. Further investigation of changes to regional climate due to emission reductions of aerosols and relevant precursors will provide insight into the environmental effects that may occur if solar power generation becomes widespread.
Image thumbnails that represent blur and noise.
Samadani, Ramin; Mauer, Timothy A; Berfanger, David M; Clark, James H
2010-02-01
The information about the blur and noise of an original image is lost when a standard image thumbnail is generated by filtering and subsampling. Image browsing becomes difficult since the standard thumbnails do not distinguish between high-quality and low-quality originals. In this paper, an efficient algorithm with a blur-generating component and a noise-generating component preserves the local blur and the noise of the originals. The local blur is rapidly estimated using a scale-space expansion of the standard thumbnail and subsequently used to apply a space-varying blur to the thumbnail. The noise is estimated and rendered by using multirate signal transformations that allow most of the processing to occur at the lower spatial sampling rate of the thumbnail. The new thumbnails provide a quick, natural way for users to identify images of good quality. A subjective evaluation shows the new thumbnails are more representative of their originals for blurry images. The noise-generating component improves the results for noisy images but degrades the results for textured images. The blur-generating component of the new thumbnails may always be used to advantage. The decision to use the noise-generating component of the new thumbnails should be based on testing with the particular image mix expected for the application.
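As a toy illustration of the idea (not the paper's algorithm, which estimates *local* blur from a scale-space expansion and applies a space-varying blur), the sketch below builds a standard block-average thumbnail, measures a single global sharpness proxy for the original, and re-blurs the thumbnail when the original looks soft. The threshold `sharp_ref` is an arbitrary assumed value.

```python
# Toy blur-preserving thumbnail: downsample, then re-blur the thumbnail if
# the original image itself is blurry, so the thumbnail "looks like" its source.
import numpy as np

def thumbnail(img, k):
    """Standard thumbnail: k x k block averaging (img dims divisible by k)."""
    h, w = img.shape
    return img.reshape(h // k, k, w // k, k).mean(axis=(1, 3))

def sharpness(img):
    """Global sharpness proxy: mean squared finite-difference gradient."""
    gy, gx = np.gradient(img)
    return float((gx**2 + gy**2).mean())

def blur_aware_thumbnail(img, k, sharp_ref=100.0):
    """Re-blur the thumbnail when the original is blurrier than a reference."""
    t = thumbnail(img, k)
    if sharpness(img) < sharp_ref:          # original looks soft ...
        kernel = np.ones((3, 3)) / 9.0      # ... so soften the thumbnail too
        padded = np.pad(t, 1, mode="edge")
        t = sum(padded[i:i + t.shape[0], j:j + t.shape[1]] * kernel[i, j]
                for i in range(3) for j in range(3))
    return t
```

A production version would replace the global test with a per-pixel blur map and a spatially varying kernel, as the abstract describes.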
ToxCast Data Generation: Chemical Workflow
This page describes the process EPA follows to select chemicals, procure chemicals, register chemicals, conduct a quality review of the chemicals, and prepare the chemicals for high-throughput screening.
Impact of flying qualities on mission effectiveness for helicopter air combat
NASA Technical Reports Server (NTRS)
Harris, T. M.; Beerman, D. A.; Bivens, C. C.
1984-01-01
Battlefield nap-of-the-earth (NOE) helicopter operations are vital for the use of the helicopter in a high-threat environment. As the pilot's workload in this flight regime is very high, the helicopter's handling qualities become an important factor. The present investigation is concerned with overall mission effectiveness, flying qualities, and their interaction with other parameters. A description is presented of a study which generated a significant amount of data relating the importance of flying qualities to the ability to perform several specific mission tasks. It was found that flying qualities do have a major impact on the ability to perform a specific mission. The impact of flying qualities on Scout helicopter mission effectiveness is mainly related to the probability of being detected. The flying-qualities effect most critical to the Scout mission was found to be precision of hover control.
Argumentation Based Joint Learning: A Novel Ensemble Learning Approach
Xu, Junyi; Yao, Li; Li, Le
2015-01-01
Recently, ensemble learning methods have been widely used to improve classification performance in machine learning. In this paper, we present a novel ensemble learning method: argumentation based multi-agent joint learning (AMAJL), which integrates ideas from multi-agent argumentation, ensemble learning, and association rule mining. In AMAJL, argumentation technology is introduced as an ensemble strategy to integrate multiple base classifiers and generate a high performance ensemble classifier. We design an argumentation framework named Arena as a communication platform for knowledge integration. Through argumentation based joint learning, high quality individual knowledge can be extracted, and thus a refined global knowledge base can be generated and used independently for classification. We perform numerous experiments on multiple public datasets using AMAJL and other benchmark methods. The results demonstrate that our method can effectively extract high quality knowledge for ensemble classifier and improve the performance of classification. PMID:25966359
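The abstract does not spell out AMAJL's argumentation protocol, so the sketch below shows only the baseline ensemble strategy such methods compete with: combining base classifiers by majority vote. The classifier functions here are hypothetical stand-ins, not part of AMAJL.

```python
# Baseline ensemble combination: majority vote over base classifiers.
from collections import Counter

def majority_vote(classifiers, x):
    """Combine base classifiers by majority vote (ties go to the first seen)."""
    votes = [clf(x) for clf in classifiers]
    return Counter(votes).most_common(1)[0][0]

# three trivial base "classifiers" over a 1-D input (illustrative only)
clfs = [lambda x: int(x > 0), lambda x: int(x > 1), lambda x: int(x > -1)]
label = majority_vote(clfs, 0.5)   # votes: 1, 0, 1 -> majority is 1
```

AMAJL replaces this unweighted vote with an argumentation dialogue between agents, which is what lets it extract the "high-quality individual knowledge" the abstract refers to.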
NASA Astrophysics Data System (ADS)
Xu, Gang; Li, Ming; Mourrain, Bernard; Rabczuk, Timon; Xu, Jinlan; Bordas, Stéphane P. A.
2018-01-01
In this paper, we propose a general framework for constructing IGA-suitable planar B-spline parameterizations from given complex CAD boundaries consisting of a set of B-spline curves. Instead of forming the computational domain from a simple boundary, planar domains with high genus and more complex boundary curves are considered. Firstly, some pre-processing operations, including Bézier extraction and subdivision, are performed on each boundary curve in order to generate a high-quality planar parameterization; then a robust planar domain partition framework is proposed to construct high-quality patch-meshing results with few singularities from the discrete boundary formed by connecting the end points of the resulting boundary segments. After generation of the topology information of the quadrilateral decomposition, the optimal placement of interior Bézier curves corresponding to the interior edges of the quadrangulation is constructed by a global optimization method to achieve a patch partition of high quality. Finally, after the imposition of C1/G1-continuity constraints on the interfaces of neighboring Bézier patches with respect to each quad in the quadrangulation, the high-quality Bézier patch parameterization is obtained by a C1-constrained local optimization method that achieves uniform and orthogonal iso-parametric structures while keeping the continuity conditions between patches. The efficiency and robustness of the proposed method are demonstrated by several examples, which are compared to results obtained by the skeleton-based parameterization approach.
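For readers unfamiliar with the Bézier machinery the framework relies on, a point on a Bézier curve is evaluated by repeated linear interpolation of its control points (de Casteljau's algorithm). A minimal sketch with illustrative control points, unrelated to the paper's examples:

```python
# De Casteljau evaluation of a Bezier curve: repeatedly lerp adjacent control
# points at parameter t until a single point remains.
def de_casteljau(points, t):
    """Evaluate a Bezier curve with the given 2-D control points at t in [0,1]."""
    pts = [list(p) for p in points]
    while len(pts) > 1:
        pts = [[(1 - t) * a + t * b for a, b in zip(p, q)]
               for p, q in zip(pts, pts[1:])]
    return pts[0]

mid = de_casteljau([(0, 0), (1, 2), (2, 0)], 0.5)  # quadratic curve, midpoint
```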
Chalcogenide based rib waveguide for compact on-chip supercontinuum sources in mid-infrared domain
NASA Astrophysics Data System (ADS)
Saini, Than Singh; Tiwari, Umesh Kumar; Sinha, Ravindra Kumar
2017-08-01
We have designed and analysed a rib waveguide structure in a recently reported Ga-Sb-S-based highly nonlinear chalcogenide glass for nonlinear applications. The proposed waveguide structure possesses a very high nonlinear coefficient and can be used to generate broadband supercontinuum in the mid-infrared domain. The reported design of the chalcogenide waveguide offers two zero-dispersion wavelengths, at 1800 nm and 2900 nm. Such a rib waveguide structure is suitable for efficient supercontinuum generation spanning 500-7400 nm. The reported waveguide can be used for the realization of compact on-chip supercontinuum sources, which are highly applicable in optical imaging, optical coherence tomography, food quality control, security and sensing.
High energy, high average power solid state green or UV laser
Hackel, Lloyd A.; Norton, Mary; Dane, C. Brent
2004-03-02
A system for producing a green or UV output beam for illuminating a large area with relatively high beam fluence. A Nd:glass laser produces a near-infrared output by means of an oscillator that generates a high-quality but low-power output, followed by multi-pass amplification in a zig-zag slab amplifier with wavefront correction in a phase conjugator at the midway point of the multi-pass amplification. The green or UV output is generated by means of conversion crystals that follow the final propagation through the zig-zag slab amplifier.
De La Vega, Francisco M; Dailey, David; Ziegle, Janet; Williams, Julie; Madden, Dawn; Gilbert, Dennis A
2002-06-01
Since public and private efforts announced the first draft of the human genome last year, researchers have reported great numbers of single nucleotide polymorphisms (SNPs). We believe that the availability of well-mapped, quality SNP markers constitutes the gateway to a revolution in genetics and personalized medicine that will lead to better diagnosis and treatment of common complex disorders. A new generation of tools and public SNP resources for pharmacogenomic and genetic studies--specifically for candidate-gene, candidate-region, and whole-genome association studies--will form part of the new scientific landscape. This will only be possible through the greater accessibility of SNP resources and superior high-throughput instrumentation-assay systems that enable affordable, highly productive large-scale genetic studies. We are contributing to this effort by developing a high-quality linkage disequilibrium SNP marker map and an accompanying set of ready-to-use, validated SNP assays across every gene in the human genome. This effort incorporates both the public sequence and SNP data sources, and Celera Genomics' human genome assembly and enormous resource of physically mapped SNPs (approximately 4,000,000 unique records). This article discusses our approach and methodology for designing the map, choosing quality SNPs, designing and validating these assays, and obtaining population frequencies of the polymorphisms. We also discuss an advanced, high-performance SNP assay chemistry--a new generation of the TaqMan probe-based, 5' nuclease assay--and a high-throughput instrumentation-software system for large-scale genotyping. We provide the new SNP map and validation information, validated SNP assays and reagents, and instrumentation systems as a novel resource for genetic discoveries.
Lateral-Directional Eigenvector Flying Qualities Guidelines for High Performance Aircraft
NASA Technical Reports Server (NTRS)
Davidson, John B.; Andrisani, Dominick, II
1996-01-01
This report presents the development of lateral-directional flying qualities guidelines with application to eigenspace (eigenstructure) assignment methods. These guidelines will assist designers in choosing eigenvectors to achieve desired closed-loop flying qualities or performing trade-offs between flying qualities and other important design requirements, such as achieving realizable gain magnitudes or desired system robustness. This has been accomplished by developing relationships between the system's eigenvectors and the roll rate and sideslip transfer functions. Using these relationships, along with constraints imposed by system dynamics, key eigenvector elements are identified and guidelines for choosing values of these elements to yield desirable flying qualities have been developed. Two guidelines are developed - one for low roll-to-sideslip ratio and one for moderate-to-high roll-to-sideslip ratio. These flying qualities guidelines are based upon the Military Standard lateral-directional coupling criteria for high performance aircraft - the roll rate oscillation criteria and the sideslip excursion criteria. Example guidelines are generated for a moderate-to-large, an intermediate, and low value of roll-to-sideslip ratio.
Unstructured viscous grid generation by advancing-front method
NASA Technical Reports Server (NTRS)
Pirzadeh, Shahyar
1993-01-01
A new method of generating unstructured triangular/tetrahedral grids with high-aspect-ratio cells is proposed. The method is based on a new grid-marching strategy, referred to as 'advancing layers', for construction of highly stretched cells in the boundary layer, and the conventional advancing-front technique for generation of regular, equilateral cells in the inviscid-flow region. Unlike existing semi-structured viscous grid generation techniques, the new procedure relies on a totally unstructured advancing-front grid strategy, resulting in substantially enhanced grid flexibility and efficiency. The method is conceptually simple but powerful, capable of producing high-quality viscous grids for complex configurations with ease. A number of two-dimensional triangular grids are presented to demonstrate the methodology. The basic elements of the method, however, have been primarily designed with three-dimensional problems in mind, making it extendible to tetrahedral viscous grid generation.
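The advancing-layers strategy grows highly stretched cells away from the wall with heights that increase geometrically until they approach the isotropic cell size used by the advancing front. A one-dimensional sketch of that spacing rule (the first height, growth ratio, and isotropic size below are illustrative values, not taken from the paper):

```python
# Wall-normal layer spacing for a boundary-layer grid: geometric growth from a
# tiny first height until the layers are about as tall as an isotropic cell.
def layer_heights(h0, ratio, h_iso):
    """Heights h0, h0*ratio, h0*ratio**2, ... while still below h_iso."""
    heights = [h0]
    while heights[-1] * ratio < h_iso:
        heights.append(heights[-1] * ratio)
    return heights

hs = layer_heights(h0=1e-3, ratio=1.2, h_iso=0.1)
```

At the point where `hs` tops out, a real mesher would hand off to the conventional advancing-front technique to fill the inviscid region with equilateral cells.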
Dawes, Timothy D; Turincio, Rebecca; Jones, Steven W; Rodriguez, Richard A; Gadiagellan, Dhireshan; Thana, Peter; Clark, Kevin R; Gustafson, Amy E; Orren, Linda; Liimatta, Marya; Gross, Daniel P; Maurer, Till; Beresini, Maureen H
2016-02-01
Acoustic droplet ejection (ADE) as a means of transferring library compounds has had a dramatic impact on the way in which high-throughput screening campaigns are conducted in many laboratories. Two Labcyte Echo ADE liquid handlers form the core of the compound transfer operation in our 1536-well based ultra-high-throughput screening (uHTS) system. Use of these instruments has promoted flexibility in compound formatting in addition to minimizing waste and eliminating compound carryover. We describe the use of ADE for the generation of assay-ready plates for primary screening as well as for follow-up dose-response evaluations. Custom software has enabled us to harness the information generated by the ADE instrumentation. Compound transfer via ADE also contributes to the screening process outside of the uHTS system. A second fully automated ADE-based system has been used to augment the capacity of the uHTS system as well as to permit efficient use of previously picked compound aliquots for secondary assay evaluations. Essential to the utility of ADE in the high-throughput screening process is the high quality of the resulting data. Examples of data generated at various stages of high-throughput screening campaigns are provided. Advantages and disadvantages of the use of ADE in high-throughput screening are discussed. © 2015 Society for Laboratory Automation and Screening.
Curatr: a web application for creating, curating and sharing a mass spectral library.
Palmer, Andrew; Phapale, Prasad; Fay, Dominik; Alexandrov, Theodore
2018-04-15
We have developed a web application, curatr, for the rapid generation of high-quality mass spectral fragmentation libraries from liquid-chromatography mass spectrometry datasets. Curatr handles datasets from single or multiplexed standards and extracts chromatographic profiles and potential fragmentation spectra for multiple adducts. An intuitive interface helps users to select high-quality spectra that are stored along with searchable molecular information, the provenance of each standard and experimental metadata. Curatr supports exports to several standard formats for use with third-party software or submission to repositories. We demonstrate the use of curatr to generate the EMBL Metabolomics Core Facility spectral library http://curatr.mcf.embl.de. Source code and example data are at http://github.com/alexandrovteam/curatr/. palmer@embl.de. Supplementary data are available at Bioinformatics online.
Assembling short reads from jumping libraries with large insert sizes.
Vasilinetc, Irina; Prjibelski, Andrey D; Gurevich, Alexey; Korobeynikov, Anton; Pevzner, Pavel A
2015-10-15
Advances in Next-Generation Sequencing technologies and sample preparation have recently enabled generation of high-quality jumping libraries that have the potential to significantly improve short-read assemblies. However, assembly algorithms have to catch up with experimental innovations to benefit from them and to produce high-quality assemblies. We present a new algorithm that extends the recently described exSPAnder universal repeat resolution approach to enable its application to several challenging data types, including jumping libraries generated by the recently developed Illumina Nextera Mate Pair protocol. We demonstrate that, with these improvements, bacterial genomes often can be assembled in a few contigs using only a single Nextera Mate Pair library of short reads. The described algorithms are implemented in C++ as part of the SPAdes genome assembler, which is freely available at bioinf.spbau.ru/en/spades. ap@bioinf.spbau.ru. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Ahmed Ali, Usama; Reiber, Beata M M; Ten Hove, Joren R; van der Sluis, Pieter C; Gooszen, Hein G; Boermeester, Marja A; Besselink, Marc G
2017-11-01
The journal impact factor (IF) is often used as a surrogate marker for methodological quality. The objective of this study was to evaluate the relation between the journal IF and the methodological quality of surgical randomized controlled trials (RCTs). Surgical RCTs published in PubMed in 1999 and 2009 were identified. According to IF, RCTs were divided into groups of low (<2), medium (2-3) and high IF (>3), as well as into top-10 vs all other journals. Methodological quality characteristics and factors concerning funding, ethical approval and statistical significance of outcomes were extracted and compared between the IF groups. Additionally, a multivariate regression was performed. The median IF was 2.2 (IQR 2.37). The percentage of 'low-risk of bias' RCTs was 13% for top-10 journals vs 4% for other journals in 1999 (P < 0.02), and 30% vs 12% in 2009 (P < 0.02). Similar results were observed for the high vs low IF groups. The presence of a sample-size calculation, adequate generation of the allocation sequence and intention-to-treat analysis were independently associated with publication in higher-IF journals, as were multicentre trials and multiple authors. Publication of RCTs in high-IF journals is associated with a moderate improvement in methodological quality compared to RCTs published in lower-IF journals. RCTs with an adequate sample-size calculation, generation of the allocation sequence or intention-to-treat analysis were associated with publication in a high-IF journal. On the other hand, reporting a statistically significant outcome and being industry funded were not independently associated with publication in a higher-IF journal.
SkySat-1: very high-resolution imagery from a small satellite
NASA Astrophysics Data System (ADS)
Murthy, Kiran; Shearn, Michael; Smiley, Byron D.; Chau, Alexandra H.; Levine, Josh; Robinson, M. Dirk
2014-10-01
This paper presents details of the SkySat-1 mission, which is the first microsatellite-class commercial earth-observation system to generate sub-meter resolution panchromatic imagery, in addition to sub-meter resolution 4-band pan-sharpened imagery. SkySat-1 was built and launched for an order of magnitude lower cost than similarly performing missions. The low-cost design enables the deployment of a large imaging constellation that can provide imagery with both high temporal resolution and high spatial resolution. One key enabler of the SkySat-1 mission was simplifying the spacecraft design and instead relying on ground-based image processing to achieve high performance at the system level. The imaging instrument consists of a custom-designed high-quality optical telescope and commercially-available high frame rate CMOS image sensors. While each individually captured raw image frame shows moderate quality, ground-based image processing algorithms improve the raw data by combining data from multiple frames to boost image signal-to-noise ratio (SNR) and decrease the ground sample distance (GSD) in a process Skybox calls "digital TDI". Careful quality assessment and tuning of the spacecraft, payload, and algorithms was necessary to generate high-quality panchromatic, multispectral, and pan-sharpened imagery. Furthermore, the framing sensor configuration enabled the first commercial High-Definition full-frame rate panchromatic video to be captured from space, with approximately 1 meter ground sample distance. Details of the SkySat-1 imaging instrument and ground-based image processing system are presented, as well as an overview of the work involved with calibrating and validating the system. Examples of raw and processed imagery are shown, and the raw imagery is compared to pre-launch simulated imagery used to tune the image processing algorithms.
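The "digital TDI" idea of stacking many short-exposure frames can be illustrated in a few lines: signal adds coherently across N frames while uncorrelated noise averages down, so SNR improves by roughly sqrt(N). A toy numpy demonstration (the frame count, scene, and noise level are made up, and real processing must also register the frames before averaging):

```python
# Frame stacking: averaging N noisy frames of the same scene improves SNR
# by about sqrt(N) when the noise is uncorrelated between frames.
import numpy as np

rng = np.random.default_rng(0)
signal = np.full((16, 16), 50.0)                            # flat test scene
frames = signal + rng.normal(0.0, 10.0, size=(64, 16, 16))  # 64 noisy frames

single_snr = signal.mean() / frames[0].std()
stacked = frames.mean(axis=0)                # align-and-average (alignment omitted)
stacked_snr = signal.mean() / stacked.std()  # ~ sqrt(64) = 8x better
```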
The quantitative control and matching of an optical false color composite imaging system
NASA Astrophysics Data System (ADS)
Zhou, Chengxian; Dai, Zixin; Pan, Xizhe; Li, Yinxi
1993-10-01
Design of an imaging system for optical false color composite (OFCC) capable of high-precision density-exposure time control and color balance is presented. The system provides high-quality FCC image data that can be analyzed using a quantitative calculation method. The quality requirements for each part of the image generation system are defined, and the distribution of satellite remote sensing image information is analyzed. The proposed technology makes it possible to present the remote sensing image data more effectively and accurately.
Optically efficient InAsSb nanowires for silicon-based mid-wavelength infrared optoelectronics.
Zhuang, Q D; Alradhi, H; Jin, Z M; Chen, X R; Shao, J; Chen, X; Sanchez, Ana M; Cao, Y C; Liu, J Y; Yates, P; Durose, K; Jin, C J
2017-03-10
InAsSb nanowires (NWs) with a high Sb content have potential in the fabrication of advanced silicon-based optoelectronics such as infrared photodetectors/emitters and highly sensitive phototransistors, as well as in the generation of renewable electricity. However, producing optically efficient InAsSb NWs with a high Sb content remains a challenge, and optical emission has been limited to 4.0 μm by the quality of the nanowires. Here, we report, for the first time, the success of high-quality and optically efficient InAsSb NWs enabling silicon-based optoelectronics operating entirely in the mid-wavelength infrared. Pure zinc-blende InAsSb NWs were realized with efficient photoluminescence emission. We obtained room-temperature photoluminescence emission in InAs NWs and successfully extended the emission wavelength in InAsSb NWs to 5.1 μm. The realization of this optically efficient InAsSb NW material paves the way to next-generation devices combining advances in III-V semiconductors and silicon.
Efficient high-performance ultrasound beamforming using oversampling
NASA Astrophysics Data System (ADS)
Freeman, Steven R.; Quick, Marshall K.; Morin, Marc A.; Anderson, R. C.; Desilets, Charles S.; Linnenbrink, Thomas E.; O'Donnell, Matthew
1998-05-01
High-performance and efficient beamforming circuitry is very important in large-channel-count clinical ultrasound systems. Current state-of-the-art digital systems using multi-bit analog-to-digital converters (A/Ds) have matured to provide exquisite image quality with moderate levels of integration. A simplified oversampling beamforming architecture has been proposed that may allow integration of delta-sigma A/Ds onto the same chip as the digital delay and processing circuitry to form a monolithic ultrasound beamformer. Such a beamformer may enable low-power handheld scanners as well as high-end systems with very large channel-count arrays. This paper presents an oversampling beamformer architecture that generates high-quality images using very simple digitization, delay, and summing circuits. Additional performance may be obtained with this oversampled system for narrow-bandwidth excitations by mixing the RF signal down in frequency to a range where the electronic signal-to-noise ratio of the delta-sigma A/D is optimized. An oversampled transmit beamformer uses the same delay circuits as receive and eliminates the need for separate transmit function generators.
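Whatever the front-end digitization scheme, the receive side ultimately reduces to delay-and-sum: shift each channel by its geometric delay, then sum so echoes from the focal point add coherently. A minimal integer-sample sketch (the delays are hand-picked to realign a test pulse, not derived from an array geometry):

```python
# Integer-sample delay-and-sum receive beamforming: echoes arriving at
# different channels at different times add coherently after per-channel delays.
import numpy as np

def delay_and_sum(channels, delays):
    """Sum channels after shifting each one right by its sample delay."""
    n = channels.shape[1]
    out = np.zeros(n)
    for ch, d in zip(channels, delays):
        out[d:] += ch[:n - d]      # shift right by d samples, zero-fill front
    return out

# two channels carrying the same unit pulse, offset by 3 samples
pulse = np.zeros(32); pulse[5] = 1.0
chans = np.stack([np.roll(pulse, 3), pulse])
focused = delay_and_sum(chans, delays=[0, 3])  # delays realign the pulse
```

An oversampled delta-sigma implementation performs the same delays on 1-bit streams before decimation, which is what keeps the per-channel hardware so simple.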
Luo, Mingzhang; Li, Weijie; Wang, Junming; Chen, Xuemin; Song, Gangbing
2018-01-01
As a common approach to nondestructive testing and evaluation, guided wave-based methods have attracted much attention because of their wide detection range and high detection efficiency. It is highly desirable to develop a portable guided wave testing system with high actuating energy and variable frequency. In this paper, a novel giant magnetostrictive actuator with high actuation power is designed and implemented, based on the giant magnetostrictive (GMS) effect. The novel GMS actuator design involves a conical energy-focusing head that can focus the amplified mechanical energy generated by the GMS actuator. This design enables the generation of stress waves with high energy, and the focusing of the generated stress waves on the test object. The guided wave generation system enables two kinds of output modes: the coded pulse signal and the sweep signal. The functionality and the advantages of the developed system are validated through laboratory testing in the quality assessment of rock bolt-reinforced structures. In addition, the developed GMS actuator and the supporting system are successfully implemented and applied in field tests. The device can also be used in other nondestructive testing and evaluation applications that require high-power stress wave generation. PMID:29510540
Gaze-Aware Streaming Solutions for the Next Generation of Mobile VR Experiences.
Lungaro, Pietro; Sjoberg, Rickard; Valero, Alfredo Jose Fanghella; Mittal, Ashutosh; Tollmar, Konrad
2018-04-01
This paper presents a novel approach to content delivery for video streaming services. It exploits information from connected eye-trackers embedded in the next generation of VR Head Mounted Displays (HMDs). The proposed solution aims to deliver high visual quality, in real time, around the users' fixation points while lowering the quality everywhere else. The goal of the proposed approach is to substantially reduce the overall bandwidth requirements for supporting VR video experiences while delivering high levels of user perceived quality. The prerequisites to achieve these results are: (1) mechanisms that can cope with different degrees of latency in the system and (2) solutions that support fast adaptation of video quality in different parts of a frame, without requiring a large increase in bitrate. A novel codec configuration, capable of supporting near-instantaneous video quality adaptation in specific portions of a video frame, is presented. The proposed method exploits in-built properties of HEVC encoders and, while it introduces a moderate amount of error, these errors are undetectable by users. Fast adaptation is the key to enabling gaze-aware streaming and its bandwidth reduction. A testbed implementing gaze-aware streaming, together with a prototype HMD with in-built eye tracker, is presented and was used for testing with real users. The studies quantified the bandwidth savings achievable by the proposed approach and characterized the relationships between Quality of Experience (QoE) and network latency. The results showed that up to 83% less bandwidth is required to deliver high QoE levels to the users, as compared to conventional solutions.
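The core idea — high quality only around the fixation point, lower quality elsewhere — can be sketched as a per-tile quality map. The tile grid, radius, and two-level quality scheme below are illustrative assumptions, not the paper's actual codec configuration.

```python
def quality_map(tiles_x, tiles_y, gaze, radius=1):
    """Assign a high quality level to tiles near the gaze point, low elsewhere.

    gaze is a (tile_x, tile_y) fixation; radius is measured in tiles.
    Both are hypothetical parameters for this sketch.
    """
    gx, gy = gaze
    return [["high" if abs(x - gx) <= radius and abs(y - gy) <= radius else "low"
             for x in range(tiles_x)]
            for y in range(tiles_y)]

qmap = quality_map(4, 3, gaze=(1, 1))
# Only the neighbourhood of tile (1, 1) is streamed at high quality;
# the rest of the frame can use far fewer bits.
```

In a real system the map would be recomputed every time the eye tracker reports a new fixation, which is why fast quality switching is the enabling prerequisite.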
Ultrasonically Assisted Cutting of Bio-tissues in Microtomy
NASA Astrophysics Data System (ADS)
Wang, Dong; Roy, Anish; Silberschmidt, Vadim V.
Modern-day histology of bio-tissues supporting stratified-medicine diagnoses requires high-precision cutting to ensure the high-quality, extremely thin specimens used in analysis. Additionally, the cutting quality is significantly affected by the wide variety of soft and hard tissues in the samples. This paper deals with the development of a next-generation microtome that introduces controlled ultrasonic vibration to realise a hybrid cutting process for bio-tissues. The study is based on a combination of advanced experimental and numerical (finite-element) studies of the multi-body dynamics of a cutting system. The quality of cut samples produced with the prototype is compared with the state of the art.
High power, high beam quality regenerative amplifier
Hackel, L.A.; Dane, C.B.
1993-08-24
A regenerative laser amplifier system generates high peak power and high energy per pulse output beams enabling generation of X-rays used in X-ray lithography for manufacturing integrated circuits. The laser amplifier includes a ring shaped optical path with a limited number of components including a polarizer, a passive 90 degree phase rotator, a plurality of mirrors, a relay telescope, and a gain medium, the components being placed close to the image plane of the relay telescope to reduce diffraction or phase perturbations in order to limit high peak intensity spiking. In the ring, the beam makes two passes through the gain medium for each transit of the optical path to increase the amplifier gain to loss ratio. A beam input into the ring makes two passes around the ring, is diverted into an SBS phase conjugator and proceeds out of the SBS phase conjugator back through the ring in an equal but opposite direction for two passes, further reducing phase perturbations. A master oscillator inputs the beam through an isolation cell (Faraday or Pockels) which transmits the beam into the ring without polarization rotation. The isolation cell rotates polarization only in beams proceeding out of the ring to direct the beams out of the amplifier. The diffraction limited quality of the input beam is preserved in the amplifier so that a high power output beam having nearly the same diffraction limited quality is produced.
ChronQC: a quality control monitoring system for clinical next generation sequencing.
Tawari, Nilesh R; Seow, Justine Jia Wen; Perumal, Dharuman; Ow, Jack L; Ang, Shimin; Devasia, Arun George; Ng, Pauline C
2018-05-15
ChronQC is a quality control (QC) tracking system for clinical implementation of next-generation sequencing (NGS). ChronQC generates time series plots for various QC metrics to allow comparison of current runs to historical runs. ChronQC has multiple features for tracking QC data including Westgard rules for clinical validity, laboratory-defined thresholds and historical observations within a specified time period. Users can record their notes and corrective actions directly onto the plots for long-term recordkeeping. ChronQC facilitates regular monitoring of clinical NGS to enable adherence to high quality clinical standards. ChronQC is freely available on GitHub (https://github.com/nilesh-tawari/ChronQC), Docker (https://hub.docker.com/r/nileshtawari/chronqc/) and the Python Package Index. ChronQC is implemented in Python and runs on all common operating systems (Windows, Linux and Mac OS X). tawari.nilesh@gmail.com or pauline.c.ng@gmail.com. Supplementary data are available at Bioinformatics online.
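A minimal sketch of one Westgard rule that such QC tracking might apply is the 1-3s rule: flag a run whose metric falls outside the mean ± 3 SD of historical runs. The metric and numbers below are hypothetical, and this is not ChronQC's actual implementation.

```python
import statistics

def westgard_1_3s(history, value):
    """Flag a QC value that violates the Westgard 1-3s rule,
    i.e. falls outside mean ± 3 SD of the historical runs."""
    mean = statistics.mean(history)
    sd = statistics.stdev(history)
    return abs(value - mean) > 3 * sd

history = [30.1, 29.8, 30.3, 30.0, 29.9, 30.2]  # e.g. mean coverage per run
westgard_1_3s(history, 30.1)  # → False (within control limits)
westgard_1_3s(history, 25.0)  # → True  (out of control, needs review)
```

Plotting `value` against the control limits over time gives exactly the kind of time series chart the tool produces for clinical review.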
Apparatus and process for passivating an SRF cavity
Myneni, Ganapati Rao; Wallace, John P
2014-12-02
An apparatus and process for the production of a niobium cavity exhibiting high quality factors at high gradients is provided. The apparatus comprises a first chamber positioned within a second chamber, an RF generator and vacuum pumping systems. The process comprises placing the niobium cavity in the first chamber of the apparatus; thermally treating the cavity at high temperature in the first chamber while maintaining high vacuum in the first and second chambers; and applying a passivating thin-film layer to a surface of the cavity in the presence of a gaseous mixture and an RF field. Further, a niobium cavity exhibiting high quality factors at high gradients produced by the method of the invention is provided.
Low-grade geothermal energy conversion by organic Rankine cycle turbine generator
NASA Astrophysics Data System (ADS)
Zarling, J. P.; Aspnes, J. D.
Results of a demonstration project which helped determine the feasibility of converting low-grade thermal energy in 49 C water into electrical energy via an organic Rankine cycle 2500 watt (electrical) turbine-generator are presented. The geothermal source which supplied the water is located in a rural Alaskan village. The reasons an organic Rankine cycle turbine-generator was investigated as a possible source of electric power in rural Alaska are: (1) the high cost of operating diesel-electric units and their poor long-term reliability when high-quality maintenance is unavailable and (2) the extremely high level of long-term reliability reportedly attained by commercially available organic Rankine cycle turbines. Data are provided on the thermal and electrical operating characteristics of an experimental organic Rankine cycle turbine-generator operating at a uniquely low vaporizer temperature.
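The thermodynamic ceiling for such low-grade conversion is easy to sketch: the Carnot limit for a 49 C source is small, which is why operation at a uniquely low vaporizer temperature is noteworthy. The 5 C sink temperature below is an assumed ambient, not a figure from the project.

```python
def carnot_efficiency(t_hot_c, t_cold_c):
    """Upper bound on conversion efficiency for any heat engine
    operating between two reservoir temperatures given in Celsius."""
    t_hot = t_hot_c + 273.15   # convert to kelvin
    t_cold = t_cold_c + 273.15
    return 1.0 - t_cold / t_hot

# 49 C geothermal source against an assumed 5 C ambient sink:
eta = carnot_efficiency(49, 5)  # ≈ 0.137, i.e. under 14% even in the ideal limit
```

Real organic Rankine cycle hardware achieves only a fraction of this bound, so the economics hinge on fuel-free operation and reliability rather than raw efficiency.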
NASA Astrophysics Data System (ADS)
Boisson, Guillaume; Kerbiriou, Paul; Drazic, Valter; Bureller, Olivier; Sabater, Neus; Schubert, Arno
2014-03-01
Generating depth maps along with video streams is valuable for Cinema and Television production. Thanks to the improvements of depth acquisition systems, the challenge of fusion between depth sensing and disparity estimation is widely investigated in computer vision. This paper presents a new framework for generating depth maps from a rig made of a professional camera with two satellite cameras and a Kinect device. A new disparity-based calibration method is proposed so that registered Kinect depth samples become perfectly consistent with disparities estimated between rectified views. Also, a new hierarchical fusion approach is proposed for combining, on the fly, depth sensing and disparity estimation in order to circumvent their respective weaknesses. Depth is determined by minimizing a global energy criterion that takes into account the matching reliability and the consistency with the Kinect input. The depth maps thus generated are relevant both in uniform and textured areas, without holes due to occlusions or structured light shadows. Our GPU implementation reaches 20fps for generating quarter-pel accurate HD720p depth maps along with the main view, which is close to real-time performance for video applications. The estimated depth is of high quality and suitable for 3D reconstruction or virtual view synthesis.
ERIC Educational Resources Information Center
Thoenig, Jean-Claude; Paradeise, Catherine
2014-01-01
Does organizational governance contribute to academic quality? Two top research universities are observed in depth: Berkeley and MIT. Three key factors are listed that help generate consistent and lasting high performance. Priority is allocated to self-evaluation and to the development of talent. Values and norms such as community membership,…
Narinc, D; Aygun, A; Karaman, E; Aksoy, T
2015-07-01
The objective of the present study was to estimate heritabilities as well as genetic and phenotypic correlations for egg weight, specific gravity, shape index, shell ratio, egg shell strength, egg length, egg width and shell weight in Japanese quail eggs. External egg quality traits were measured on 5864 eggs of 934 female quails from a dam line selected for two generations. Within the Bayesian framework, using the Gibbs sampling algorithm, a multivariate animal model was applied to estimate heritabilities and genetic correlations for external egg quality traits. The heritability estimates for external egg quality traits were moderate to high and ranged from 0.29 to 0.81. The heritability estimates for egg and shell weight, of 0.81 and 0.76, were fairly high. The genetic and phenotypic correlations between egg shell strength and specific gravity, shell ratio and shell weight, ranging from 0.55 to 0.79, were relatively high. It can be concluded that it is possible to determine egg shell quality using egg specific gravity values, utilizing its high heritability and fairly high positive correlation with most of the egg shell quality traits. As a result, egg specific gravity may be the selection criterion of choice, rather than other external egg traits, for genetic improvement of egg shell quality in Japanese quails.
NASA Astrophysics Data System (ADS)
Ji, Zhong-Ye; Zhang, Xiao-Fang
2018-01-01
The mathematical relation between the beam quality β factor of a high-energy laser and the wavefront aberration of the laser beam is important in the beam quality control theory of high-energy laser weapon systems. In order to obtain this mathematical relation, numerical simulation is used in the research. Firstly, the Zernike representations of typically distorted atmospheric wavefront aberrations caused by Kolmogoroff turbulence are generated. Then, the corresponding beam quality β factors of the different distorted wavefronts are calculated numerically through the fast Fourier transform. Thus, the statistical distribution rule between the beam quality β factors of the high-energy laser and the wavefront aberrations of the beam can be established from the calculated results. Finally, curve fitting is chosen to establish the mathematical fitting relationship of these two parameters. The result of the curve fitting shows that there is a quadratic curve relation between the beam quality β factor of a high-energy laser and the wavefront aberration of the laser beam. In this paper, three fitting curves, in which the wavefront aberrations consist of Zernike polynomials of orders 20, 36 and 60 respectively, are established to express the relationship between the beam quality β factor and atmospheric wavefront aberrations with different spatial frequencies.
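The numerical procedure described — far field by FFT, then a β factor comparing the aberrated beam with the ideal one — can be sketched as follows. Here β is taken as the ratio of far-field RMS radii, which is one common convention; the grid size, aperture, and defocus phase are illustrative, and the paper's exact β definition may differ.

```python
import numpy as np

def beta_factor(phase, aperture):
    """Estimate a beam quality beta factor as the ratio of far-field RMS
    radii (aberrated vs. ideal), with the far field obtained by 2-D FFT.
    This second-moment definition is a sketch of one common convention."""
    def rms_radius(field):
        inten = np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2
        n = inten.shape[0]
        y, x = np.indices(inten.shape)
        c = (n - 1) / 2.0
        r2 = (x - c) ** 2 + (y - c) ** 2
        return np.sqrt((inten * r2).sum() / inten.sum())
    ideal = rms_radius(aperture)                        # diffraction-limited beam
    aberr = rms_radius(aperture * np.exp(1j * phase))   # beam with wavefront error
    return aberr / ideal

n = 64
y, x = np.indices((n, n))
r = np.hypot(x - n / 2, y - n / 2) / (n / 4)
aperture = (r <= 1.0).astype(float)                     # circular pupil
flat = beta_factor(np.zeros((n, n)), aperture)          # → 1.0 for no aberration
defocus = beta_factor(2.0 * r ** 2 * aperture, aperture)  # > 1.0: beam degraded
```

Adding any non-trivial phase strictly increases the far-field second moment (phase gradients add to the field's gradient energy), so β ≥ 1 with equality only for an unaberrated beam, consistent with the quadratic growth the paper fits.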
A filtering method to generate high quality short reads using illumina paired-end technology.
Eren, A Murat; Vineis, Joseph H; Morrison, Hilary G; Sogin, Mitchell L
2013-01-01
Consensus between independent reads improves the accuracy of genome and transcriptome analyses; however, lack of consensus between very similar sequences in metagenomic studies can, and often does, represent natural variation of biological significance. The common use of machine-assigned quality scores on next-generation platforms does not necessarily correlate with accuracy. Here, we describe using the overlap of paired-end, short sequence reads to identify error-prone reads in marker gene analyses and their contribution to spurious OTUs following clustering analysis using QIIME. Our approach can also reduce error in shotgun sequencing data generated from libraries with small, tightly constrained insert sizes. The open-source implementation of this algorithm in the Python programming language with user instructions can be obtained from https://github.com/meren/illumina-utils.
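The overlap-filtering idea can be sketched directly: where the two reads of a pair cover the same bases, disagreements mark likely errors. The helper names, the assumption of a known overlap length, and the pre-reverse-complemented second read are all simplifications for illustration, not the authors' implementation.

```python
def mismatches_in_overlap(read1, read2_rc, overlap_len):
    """Count disagreements between the 3' end of read 1 and the start of the
    reverse-complemented read 2 over their expected overlap region."""
    tail = read1[-overlap_len:]
    head = read2_rc[:overlap_len]
    return sum(1 for a, b in zip(tail, head) if a != b)

def keep_pair(read1, read2_rc, overlap_len, max_mismatch=0):
    """Retain a pair only if the overlapping bases agree within a threshold
    (the threshold here is illustrative, not the paper's setting)."""
    return mismatches_in_overlap(read1, read2_rc, overlap_len) <= max_mismatch

keep_pair("ACGTACGT", "ACGTTTTT", 4)  # overlap ACGT vs ACGT → True (kept)
keep_pair("ACGTACGA", "ACGTTTTT", 4)  # overlap ACGA vs ACGT → False (filtered)
```

Because both observations of the overlapped bases come from independent sequencing events, requiring agreement is a stronger accuracy signal than the machine-assigned quality scores alone.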
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ha, Gwanghui; Cho, Moo-Hyun; Conde, Manoel
Emittance exchange (EEX) based longitudinal current profile shaping is one of the most promising current profile shaping techniques. This method can generate high-quality arbitrary current profiles under ideal conditions. The double dog-leg EEX beamline was recently installed at the Argonne Wakefield Accelerator (AWA) to explore this shaping capability and confirm the quality of the method. To demonstrate arbitrary current profile generation, several different transverse masks are applied to generate different final current profiles. The phase space slopes and the charge of the incoming beam are varied to observe and suppress aberrations on the ideal profile. We present current profile shaping results, aberrations on the shaped profile, and their suppression.
On the virtues of automated quantitative structure-activity relationship: the new kid on the block.
de Oliveira, Marcelo T; Katekawa, Edson
2018-02-01
Quantitative structure-activity relationship (QSAR) has proved to be an invaluable tool in medicinal chemistry. Data availability at unprecedented levels through various databases has contributed to a resurgence of interest in QSAR. In this context, rapid generation of quality predictive models is highly desirable for hit identification and lead optimization. We showcase the application of an automated QSAR approach, which randomly selects multiple training/test sets and utilizes machine-learning algorithms to generate predictive models. Results demonstrate that AutoQSAR produces models of improved or similar quality to those generated by practitioners in the field, but in just a fraction of the time. Despite the potential of the concept to benefit the community, the AutoQSAR opportunity has been largely undervalued.
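The random training/test-set strategy can be sketched with a stand-in model: repeat the split, fit, score on the held-out set, and keep the best model. Plain least squares below substitutes for the machine-learning algorithms an automated QSAR tool would actually try; the descriptor data and split counts are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def auto_qsar(X, y, n_splits=10, test_frac=0.25):
    """Fit a model on many random training/test splits and keep the one
    with the best test-set R^2. Least squares stands in here for the
    machine-learning models an automated QSAR workflow would evaluate."""
    best_r2, best_w = -np.inf, None
    n = len(y)
    for _ in range(n_splits):
        idx = rng.permutation(n)
        cut = int(n * test_frac)
        test, train = idx[:cut], idx[cut:]
        w, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        pred = X[test] @ w
        ss_res = ((y[test] - pred) ** 2).sum()
        ss_tot = ((y[test] - y[test].mean()) ** 2).sum()
        best_r2, best_w = max((best_r2, best_w), (1 - ss_res / ss_tot, w),
                              key=lambda t: t[0])
    return best_r2, best_w

# Toy descriptor matrix (two hypothetical molecular descriptors + bias column):
X = np.column_stack([np.linspace(0, 1, 40), np.linspace(1, 0, 40) ** 2, np.ones(40)])
y = X @ np.array([2.0, -1.0, 0.5])  # noiseless "activity", so R^2 should be ~1
r2, w = auto_qsar(X, y)
```

Repeating the split guards against a lucky or unlucky single partition, which is the main quality argument for automating the procedure.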
Center for Inherited Disease Research (CIDR)
The Center for Inherited Disease Research (CIDR) Program at The Johns Hopkins University provides high-quality next generation sequencing and genotyping services to investigators working to discover genes that contribute to common diseases.
Square2 - A Web Application for Data Monitoring in Epidemiological and Clinical Studies
Schmidt, Carsten Oliver; Krabbe, Christine; Schössow, Janka; Albers, Martin; Radke, Dörte; Henke, Jörg
2017-01-01
Valid scientific inferences from epidemiological and clinical studies require high data quality. Data generating departments therefore aim to detect data irregularities as early as possible in order to guide quality management processes. In addition, after the completion of data collections the obtained data quality must be evaluated. This can be challenging in complex studies due to a wide scope of examinations, numerous study variables, multiple examiners, devices, and examination centers. This paper describes a Java EE web application used to monitor and evaluate data quality in institutions with complex and multiple studies, named Square2. It uses the Java libraries Apache MyFaces 2, extended by BootsFaces for layout and style. RServe and REngine manage calls to R server processes. All study data and metadata are stored in PostgreSQL. R is the statistics backend and LaTeX is used for the generation of print-ready PDF reports. A GUI manages the entire workflow. Square2 covers all steps in the data monitoring workflow, including the setup of studies and their structure, the handling of metadata for data monitoring purposes, selection of variables, upload of data, statistical analyses, and the generation as well as inspection of quality reports. To take into account data protection issues, Square2 comprises an extensive user rights and roles concept.
Fuel cells - a new contributor to stationary power
NASA Astrophysics Data System (ADS)
Dufour, Angelo U.
Stationary power generation historically started as distributed generation near the user, with the configuration of a very open market, where many small competing utilities were offering electricity to the customers. Later, it became a `monopolistic' business for technical reasons. Big steam turbines and electric generators, allowing better efficiencies, were more conveniently installed in very large power plants, necessarily located far away from where the power was needed, and the transmission losses were kept within bounds by high-voltage AC technology. Governments therefore tried to balance the power of the monopolies, which were limiting the economic development of the countries, by strengthening public control of electricity prices, by establishing rules to allow a free flow of electricity from one region to another, or by taking direct control through ownership of big and small utilities. The most effective way of making the electric energy system competitive has proved to be the opening of partial competition in the generation field, by forcing the utilities to compare the cost of their energy, produced with new centralised plants, to the price of the available energy coming from combined heat and power dispersed generators. With reference to this cost, all the peculiar features of large central stations and dispersed generators were taken into account, such as the widespread use of natural gas, the reduced investment risk of smaller single increments of capacity, the siting difficulties and high costs of transmission and distribution, the improved system reliability and, finally, the high-quality electric power.
Fuel cells are a recently available technology for distributed electrical energy production, since they share the main characteristics relevant to a distributed power system: compatibility with other modular subsystem packages, the possibility of full automation, very low noise and emissions, high efficiency both directly as fuel cells (38-55%) and in integrated cycles (50-65% with fossil fuels), delivered `power quality' and reliability. Focus is principally kept on the impact fuel cells could have on electrical grid management and control, through their voltage support and active filtering capabilities, their response speed and their quick load connection capabilities. The cost is for the moment high, but some technologies, like phosphoric acid, are in the market-entry phase. Cost analysis for the main subsystems, that is, fuel cell stacks, fuel processors, and power electronics and controls, indicates that prices will be driven down to the required levels both through technology refinements and through increased production volumes. In any case, a new phase is beginning, in which centralised power plants face competition from distributed generators, like fuel cells, small gas turbines and internal combustion engines, and from other renewable energy generators, like photovoltaics and wind generators. These are all modular, dispersed throughout the utility distribution system to provide power closer to the end user; they do not compete with existing transmission and distribution systems, but improve their utilisation. The plants will initially be directly owned and operated by gas or energy distributors, and customers could easily overcome their mistrust by paying only for the energy they actually use, leaving aside worries about investment costs and the risks of poor operation.
An `intelligent grid', delivering high-quality electrical energy to millions of household consumers who, a second later, become non-polluting energy producers, appears poised to make a very relevant contribution to `the town of the future' envisaged also by the European Commission, where the quality of our lives depends mainly on the quality of the energy.
Active Semi-Supervised Community Detection Based on Must-Link and Cannot-Link Constraints
Cheng, Jianjun; Leng, Mingwei; Li, Longjie; Zhou, Hanhai; Chen, Xiaoyun
2014-01-01
Community structure detection is of great importance because it can help in discovering the relationship between the function and the topology structure of a network. Many community detection algorithms have been proposed, but how to incorporate prior knowledge in the detection process remains a challenging problem. In this paper, we propose a semi-supervised community detection algorithm, which makes full use of must-link and cannot-link constraints to guide the process of community detection and thereby extracts high-quality community structures from networks. To acquire high-quality must-link and cannot-link constraints, we also propose a semi-supervised component generation algorithm based on active learning, which actively selects nodes with maximum utility for the proposed semi-supervised community detection algorithm step by step, and then generates the must-link and cannot-link constraints by accessing a noiseless oracle. Extensive experiments were carried out, and the experimental results show that the introduction of active learning into the problem of community detection is a success. Our proposed method can extract high-quality community structures from networks, and significantly outperforms other comparison methods. PMID:25329660
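How must-link and cannot-link constraints seed such a method can be sketched with a union-find pass: must-link pairs merge nodes into seed groups, and cannot-link pairs are checked for consistency against those groups. This is only the constraint-handling skeleton, not the authors' detection algorithm.

```python
def constraint_components(n, must_link, cannot_link):
    """Group n nodes into seed communities from must-link pairs (union-find
    with path halving), then verify that no cannot-link pair ended up in the
    same group. A minimal sketch of constraint-guided seeding."""
    parent = list(range(n))

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    for a, b in must_link:
        parent[find(a)] = find(b)
    for a, b in cannot_link:
        if find(a) == find(b):
            raise ValueError(f"inconsistent constraints on nodes {a}, {b}")
    groups = {}
    for v in range(n):
        groups.setdefault(find(v), []).append(v)
    return list(groups.values())

seeds = constraint_components(5, must_link=[(0, 1), (3, 4)], cannot_link=[(1, 3)])
# → groups {0, 1}, {2}, {3, 4}; nodes 1 and 3 are guaranteed to stay apart
```

A full semi-supervised detector would then grow these seed groups using the network topology while never merging across a cannot-link pair.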
ERIC Educational Resources Information Center
Immerwahr, John; Johnson, Jean
2007-01-01
Traditionally, the United States higher education system has been the envy of the world for its high quality, accessibility to millions of Americans, ability to train generations of skilled workers, and its contribution to creating the vast American middle class. Today, however, higher education is experiencing new pressures. A new generation of…
Cornelia Pinchot; Stacy Clark; Scott Schlarbaum; Arnold Saxton; Shi-Jean Sung; Frederick. Hebard
2015-01-01
Blight-resistant American chestnut (Castanea dentata) may soon be commercially available, but few studies have tested methods to produce high quality seedlings that will be competitive after planting. This study evaluated the performance of one American, one Chinese (C. mollissima), one second-generation backcross (BC3...
Biofuel as an Integrated Farm Drainage Management crop: A bioeconomic analysis
NASA Astrophysics Data System (ADS)
Levers, L. R.; Schwabe, K. A.
2017-04-01
Irrigated agricultural lands in arid regions often suffer from soil salinization and lack of drainage, which affect environmental quality and productivity. Integrated Farm Drainage Management (IFDM) systems, where drainage water generated from higher-valued crops grown on high-quality soils is used to irrigate salt-tolerant crops grown on marginal soils, are one possible strategy for managing salinity and drainage problems. If the IFDM crop were a biofuel crop, both environmental and private benefits might be generated; however, little is known about this possibility. As such, we develop a bioeconomic programming model of irrigated agricultural production to examine the role salt-tolerant biofuel crops might play within an IFDM system. Our results, generated by optimizing profits over land, water, and crop choice decisions subject to resource constraints, suggest that on private profits alone, biofuel crops can be a competitive alternative to the common practices of land retirement and non-biofuel crop production under low to high drainage water salinity. Moreover, IFDM biofuel crop production generates 30-35% lower GHG emissions than the other strategies. The private market competitiveness coupled with the public good benefits may justify policy changes encouraging the growth of IFDM biofuel crops in arid agricultural areas globally.
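The land/water/crop trade-off can be sketched as a toy optimization: split marginal land between a salt-tolerant biofuel crop and land retirement, with the biofuel area capped by the drainage water available for reuse. All coefficients below are hypothetical illustrations, not values from the study.

```python
def best_allocation(total_ha, profit_per_ha, water_per_ha, water_cap, step=5):
    """Brute-force profit maximisation over how many hectares of marginal
    land go to the biofuel crop, subject to the drainage-water supply.
    Profit and water coefficients are hypothetical."""
    best = (float("-inf"), 0)
    for ha in range(0, total_ha + 1, step):
        if ha * water_per_ha > water_cap:  # biofuel irrigated only with drainage water
            break
        profit = (ha * profit_per_ha["biofuel"]
                  + (total_ha - ha) * profit_per_ha["retirement"])
        if profit > best[0]:
            best = (profit, ha)
    return best

profit, biofuel_ha = best_allocation(
    100, {"biofuel": 120.0, "retirement": 15.0},
    water_per_ha=2.0, water_cap=120.0)  # drainage supply caps biofuel at 60 ha
```

Even in this toy version, the binding constraint is the drainage water supply rather than land, which mirrors how the bioeconomic model's resource constraints shape the optimal crop mix.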
NASA Astrophysics Data System (ADS)
Ndiaye, Maty; Quinquis, Catherine; Larabi, Mohamed Chaker; Le Lay, Gwenael; Saadane, Hakim; Perrine, Clency
2014-01-01
During the last decade, the important advances and widespread availability of mobile technology (operating systems, GPUs, terminal resolution and so on) have encouraged a fast development of voice and video services like video-calling. While multimedia services have grown substantially on mobile devices, the resulting increase in data consumption is leading to the saturation of mobile networks. In order to provide data at high bit-rates and maintain performance as close as possible to traditional networks, the 3GPP (The 3rd Generation Partnership Project) worked on a high-performance standard for mobile called Long Term Evolution (LTE). In this paper, we aim at expressing recommendations related to audio and video media profiles (selection of audio and video codecs, bit-rates, frame-rates, audio and video formats) for a typical video-calling service held over LTE/4G mobile networks. These profiles are defined according to the targeted devices (smartphones, tablets), so as to ensure the best possible quality of experience (QoE). Obtained results indicate that for the CIF format (352 x 288 pixels), which is usually used for smartphones, the VP8 codec provides better image quality than the H.264 codec at low bitrates (from 128 to 384 kbps). For sequences with high motion, however, H.264 in slow mode is preferred. Regarding audio, better results are globally achieved using wideband codecs, which offer good quality, except for the Opus codec (at 12.2 kbps).
Alyusuf, Raja H; Prasad, Kameshwar; Abdel Satir, Ali M; Abalkhail, Ali A; Arora, Roopa K
2013-01-01
The exponential use of the internet as a learning resource, coupled with the varied quality of many websites, leads to a need to identify suitable websites for teaching purposes. The aim of this study is to develop and validate a tool that evaluates the quality of undergraduate medical educational websites, and to apply it to the field of pathology. The tool was devised through several steps: item generation, reduction, weightage, pilot testing, post-pilot modification of the tool, and validation. Tool validation included measurement of inter-observer reliability and generation of criterion-related, construct-related and content-related validity. The validated tool was subsequently tested by applying it to a population of pathology websites. Reliability testing showed high internal consistency reliability (Cronbach's alpha = 0.92), high inter-observer reliability (Pearson's r = 0.88), intraclass correlation coefficient = 0.85 and κ = 0.75. It showed high criterion-related, construct-related and content-related validity. The tool showed moderately high concordance with the gold standard (κ = 0.61); 92.2% sensitivity, 67.8% specificity, 75.6% positive predictive value and 88.9% negative predictive value. The validated tool was applied to 278 websites; 29.9% were rated as recommended, 41.0% as recommended with caution and 29.1% as not recommended. A systematic tool was devised to evaluate the quality of websites for medical educational purposes. The tool was shown to yield reliable and valid inferences through its application to pathology websites.
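The validation statistics quoted above come from a standard 2x2 confusion table; a sketch of the computation follows. The counts below are invented to be of the same order as the reported percentages, and are not the study's raw data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table,
    the quantities used to validate a rating tool against a gold standard."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among all actual positives
        "specificity": tn / (tn + fp),  # true negatives among all actual negatives
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts chosen only to illustrate the style of figures reported:
m = diagnostic_metrics(tp=47, fp=15, fn=4, tn=34)
# m["sensitivity"] ≈ 0.92, m["specificity"] ≈ 0.69, m["ppv"] ≈ 0.76
```

High sensitivity with moderate specificity, as reported, means the tool rarely misses a good website but flags some acceptable ones for manual review.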
Negative ion-driven associated particle neutron generator
Antolak, A. J.; Leung, K. N.; Morse, D. H.; ...
2015-10-09
We describe an associated particle neutron generator that employs a negative ion source to produce high neutron flux from a small source size. Negative ions produced in an rf-driven plasma source are extracted through a small aperture to form a beam which bombards a positively biased, high-voltage target electrode. Electrons co-extracted with the negative ions are removed by a permanent-magnet electron filter. The use of negative ions enables high neutron output (100% atomic ion beam), high-quality imaging (small neutron source size), and reliable operation (no high-voltage breakdowns). The neutron generator can operate in either pulsed or continuous-wave (cw) mode and has been demonstrated to produce 10⁶ D-D n/s (equivalent to ~10⁸ D-T n/s) from a 1 mm-diameter neutron source size to facilitate high-fidelity associated particle imaging.
Traditional ecological knowledge underlying herding decisions of pastoralists.
Tamou, C; de Boer, I J M; Ripoll-Bosch, R; Oosting, S J
2018-04-01
Pastoralists have traditional ecological knowledge (TEK), which is important for their livelihoods and for policies and interventions. Pastoralism is under pressure, however, which may result in a decline of the pastoral lifestyle and its related TEK. We, therefore, addressed the following objectives: (i) to inventorise and assess how pastoralists characterise and value soils and forages in their environment, (ii) to analyse how soil, forage and livestock (i.e. cattle) characteristics relate to herding decisions and (iii) to determine whether TEK underlying herding decisions differs across generations. Data were collected through focus groups and individual interviews with 72 pastoralists, belonging to three generations and to three agro-ecological zones. Using a three-point scale (high, medium, low), four grasses and three tree forages were assessed in terms of nutritional quality for milk, meat, health and strength. Using their own visual criteria, pastoralists identified five different soils, which they selected for herding at different times of the year. Pastoralists stated that Pokuri was the best soil because of its low moisture content, whereas Karaal was the worst because forage hardly grows on it. They stated that perennials, such as Andropogon gayanus and Loxoderra ledermannii, were of high nutritional quality, whereas annuals such as Andropogon pseudapricus and Hyparrhenia involucrata were of low nutritional quality. Afzelia africana was perceived to be of high quality for milk production, whereas Khaya senegalensis had the highest quality for meat, health and strength. Pastoralists first used soil, then forage and finally livestock characteristics in their herding decisions. Pastoralists' TEK was not associated with their generations, but with their agro-ecological zones. This study suggests that pastoralists had common and detailed TEK about soil, forage and livestock characteristics underlying their herding decisions.
To conclude, pastoralists use a holistic approach, combining soil, vegetation and livestock TEK in herding decisions. Such TEK can guide restoration or improvement of grazing lands, and land use planning.
2014-01-01
Background Maternal and newborn mortality remain unacceptably high in sub-Saharan Africa. Tanzania and Uganda are committed to reduce maternal and newborn mortality, but progress has been limited and many essential interventions are unavailable in primary and referral facilities. Quality management has the potential to overcome low implementation levels by assisting teams of health workers and others finding local solutions to problems in delivering quality care and the underutilization of health services by the community. Existing evidence of the effect of quality management on health worker performance in these contexts has important limitations, and the feasibility of expanding quality management to the community level is unknown. We aim to assess quality management at the district, facility, and community levels, supported by information from high-quality, continuous surveys, and report effects of the quality management intervention on the utilization and quality of services in Tanzania and Uganda. Methods In Uganda and Tanzania, the Expanded Quality Management Using Information Power (EQUIP) intervention is implemented in one intervention district and evaluated using a plausibility design with one non-randomly selected comparison district. The quality management approach is based on the collaborative model for improvement, in which groups of quality improvement teams test new implementation strategies (change ideas) and periodically meet to share results and identify the best strategies. The teams use locally-generated community and health facility data to monitor improvements. In addition, data from continuous health facility and household surveys are used to guide prioritization and decision making by quality improvement teams as well as for evaluation of the intervention. These data include input, process, output, coverage, implementation practice, and client satisfaction indicators in both intervention and comparison districts. 
Thus, intervention districts receive quality management and continuous surveys, while comparison districts receive only continuous surveys. Discussion EQUIP is a district-scale, proof-of-concept study that evaluates a quality management approach for maternal and newborn health including communities, health facilities, and district health managers, supported by high-quality data from independent continuous household and health facility surveys. The study will generate robust evidence about the effectiveness of quality management and will inform future nationwide implementation approaches for health system strengthening in low-resource settings. Trial registration: PACTR201311000681314. PMID:24690284
Hanson, Claudia; Waiswa, Peter; Marchant, Tanya; Marx, Michael; Manzi, Fatuma; Mbaruku, Godfrey; Rowe, Alex; Tomson, Göran; Schellenberg, Joanna; Peterson, Stefan
2014-04-02
Wang, Yuker; Carlton, Victoria EH; Karlin-Neumann, George; Sapolsky, Ronald; Zhang, Li; Moorhead, Martin; Wang, Zhigang C; Richardson, Andrea L; Warren, Robert; Walther, Axel; Bondy, Melissa; Sahin, Aysegul; Krahe, Ralf; Tuna, Musaffe; Thompson, Patricia A; Spellman, Paul T; Gray, Joe W; Mills, Gordon B; Faham, Malek
2009-01-01
Background A major challenge facing DNA copy number (CN) studies of tumors is that most banked samples with extensive clinical follow-up information are Formalin-Fixed Paraffin Embedded (FFPE). DNA from FFPE samples generally underperforms or suffers high failure rates compared to fresh frozen samples because of DNA degradation and cross-linking during FFPE fixation and processing. As FFPE protocols may vary widely between labs and samples may be stored for decades at room temperature, an ideal FFPE CN technology should work on diverse sample sets. Molecular Inversion Probe (MIP) technology has been applied successfully to obtain high quality CN and genotype data from cell line and frozen tumor DNA. Since the MIP probes require only a small (~40 bp) target binding site, we reasoned they may be well suited to assess degraded FFPE DNA. We assessed CN with a MIP panel of 50,000 markers in 93 FFPE tumor samples from 7 diverse collections. For 38 FFPE samples from three collections we were also able to assess CN in matched fresh frozen tumor tissue. Results Using an input of 37 ng genomic DNA, we generated high quality CN data with MIP technology in 88% of FFPE samples from seven diverse collections. When matched fresh frozen tissue was available, the performance of FFPE DNA was comparable to that of DNA obtained from matched frozen tumor (genotype concordance averaged 99.9%), with only a modest loss in performance in FFPE. Conclusion MIP technology can be used to generate high quality CN and genotype data in FFPE as well as fresh frozen samples. PMID:19228381
Power Supplies for Space Systems Quality Assurance by Sandia Laboratories
DOE R&D Accomplishments Database
Hannigan, R. L.; Harnar, R. R.
1976-07-01
The Sandia Laboratories' participation in Quality Assurance programs for Radioisotopic Thermoelectric Generators which have been used in space systems over the past 10 years is summarized. Basic elements of this QA program are briefly described, and recognition of assistance from other Sandia organizations is included. Descriptions of the various systems for which Sandia has had the QA responsibility are presented, including SNAP 19 (Nimbus, Pioneer, Viking), SNAP 27 (Apollo), Transit, Multi Hundred Watt (LES 8/9 and MJS), and a new program, the High Performance Generator Mod 3. The outlook for Sandia participation in RTG programs for the next several years is noted.
Unassigned MS/MS Spectra: Who Am I?
Pathan, Mohashin; Samuel, Monisha; Keerthikumar, Shivakumar; Mathivanan, Suresh
2017-01-01
Recent advances in high-resolution tandem mass spectrometry (MS) have resulted in the accumulation of high-quality data. In parallel with these advances in instrumentation, bioinformatics software has been developed to analyze such datasets. In spite of these advances, data analysis remains a critical step in protein identification. In addition, the complexity of the generated MS/MS spectra, the unpredictable nature of peptide fragmentation, sequence annotation errors, and posttranslational modifications have impeded the protein identification process. In a typical MS data analysis, about 60% of the MS/MS spectra remain unassigned. While some of these can be attributed to low spectral quality, a proportion can be classified as high quality. Further analysis may reveal how much of the unassigned MS/MS spectra are attributable to search space limitations, sequence annotation errors, mutations, and/or posttranslational modifications. In this chapter, the tools used to identify proteins and ways to assign unassigned tandem MS spectra are discussed.
Testing Overload in America's Schools
ERIC Educational Resources Information Center
Lazarín, Melissa
2014-01-01
It appears that schools and families are at a crossroads when it comes to testing. High-quality assessments generate rich data and can provide valuable information about student progress to teachers and parents, support accountability, promote high expectations, and encourage equity for students of color and low-income students. But it is…
High-Quality Carbohydrates and Physical Performance
Kanter, Mitch
2018-01-01
While all experts agreed that protein needs for performance are likely greater than believed in past generations, particularly for strength training athletes, and that dietary fat could sustain an active person through lower-intensity training bouts, current research still points to carbohydrate as an indispensable energy source for high-intensity performance. PMID:29449746
Generation of monoenergetic ion beams via ionization dynamics (Conference Presentation)
NASA Astrophysics Data System (ADS)
Lin, Chen; Kim, I. Jong; Yu, Jinqing; Choi, Il Woo; Ma, Wenjun; Yan, Xueqing; Nam, Chang Hee
2017-05-01
Research on ion acceleration driven by high-intensity laser pulses has attracted significant interest in recent decades owing to developments in laser technology. The intensive study of energetic ion bunches is particularly stimulated by wide applications in nuclear fusion, medical treatment, warm dense matter production and high-energy-density physics. However, to implement such compact accelerators, challenges remain in terms of beam quality and stability, especially in applications that require higher-energy, narrow-bandwidth ion beams. We report on the acceleration of quasi-monoenergetic ion beams via ionization dynamics in the interaction of an intense laser pulse with a solid target. Using an ionization dynamics model in 2D particle-in-cell (PIC) simulations, we found that high-charge-state contaminant ions can only be ionized in the central spot area, where the intensity of the sheath field surpasses their ionization threshold. These ions automatically form a microstructured target with a width of a few microns, which is conducive to generating monoenergetic beams. In experiments with ultraintense (<10^21 W/cm^2) laser pulses irradiating ultrathin targets, each covered with a contamination layer of nm thickness, high-quality monoenergetic ion bunches below 100 MeV were generated. The peak energy of the self-generated microstructured-target ions was also examined with respect to different contamination-layer thicknesses. This relatively new finding is confirmed by the consistency between the experimental data and the simulation results.
Construction of High-Quality Camel Immune Antibody Libraries.
Romão, Ema; Poignavent, Vianney; Vincke, Cécile; Ritzenthaler, Christophe; Muyldermans, Serge; Monsion, Baptiste
2018-01-01
Single-domain antibody libraries of heavy-chain-only immunoglobulins from camelids or shark are enriched for high-affinity antigen-specific binders by a short in vivo immunization. Thus, potent binders are readily retrieved from relatively small-sized libraries of 10^7-10^8 individual transformants, mostly after phage display and panning on a purified target. However, the remaining drawback of this strategy arises from the need to generate a dedicated library for nearly every envisaged target. Therefore, all procedures that shorten and facilitate the construction of an immune library of the best possible quality are definitely a step forward. In this chapter, we provide the protocol to generate a high-quality immune VHH library using the Golden Gate Cloning strategy, employing an adapted phage display vector in which a lethal ccdB gene has to be substituted by the VHH gene. With this procedure, the construction of the library can be shortened to less than a week starting from bleeding the animal. Our libraries exceed 10^8 individual transformants, and close to 100% of the clones harbor a phage display vector having an insert with the length of a VHH gene. These libraries are also more economic to make than previous standard approaches using classical restriction enzymes and ligations. The quality of the Nanobodies that are retrieved from immune libraries obtained by Golden Gate Cloning is identical to those from immune libraries made according to the classical procedure.
Wimmer, Isabella; Tröscher, Anna R; Brunner, Florian; Rubino, Stephen J; Bien, Christian G; Weiner, Howard L; Lassmann, Hans; Bauer, Jan
2018-04-20
Formalin-fixed paraffin-embedded (FFPE) tissues are valuable resources commonly used in pathology. However, formalin fixation modifies nucleic acids challenging the isolation of high-quality RNA for genetic profiling. Here, we assessed feasibility and reliability of microarray studies analysing transcriptome data from fresh, fresh-frozen (FF) and FFPE tissues. We show that reproducible microarray data can be generated from only 2 ng FFPE-derived RNA. For RNA quality assessment, fragment size distribution (DV200) and qPCR proved most suitable. During RNA isolation, extending tissue lysis time to 10 hours reduced high-molecular-weight species, while additional incubation at 70 °C markedly increased RNA yields. Since FF- and FFPE-derived microarrays constitute different data entities, we used indirect measures to investigate gene signal variation and relative gene expression. Whole-genome analyses revealed high concordance rates, while reviewing on single-genes basis showed higher data variation in FFPE than FF arrays. Using an experimental model, gene set enrichment analysis (GSEA) of FFPE-derived microarrays and fresh tissue-derived RNA-Seq datasets yielded similarly affected pathways confirming the applicability of FFPE tissue in global gene expression analysis. Our study provides a workflow comprising RNA isolation, quality assessment and microarray profiling using minimal RNA input, thus enabling hypothesis-generating pathway analyses from limited amounts of precious, pathologically significant FFPE tissues.
NASA Astrophysics Data System (ADS)
Huang, Shengzhou; Li, Mujun; Shen, Lianguan; Qiu, Jinfeng; Zhou, Youquan
2018-03-01
A novel fabrication method for high-quality aspheric microlens arrays (MLAs) was developed by combining dose-modulated DMD-based lithography with a surface thermal reflow process. In this method, the complex shape of the aspheric microlens is pre-modeled via dose modulation in a digital micromirror device (DMD) based maskless projection lithography, the dose modulation depending mainly on the distribution of the exposure dose in the photoresist. The pre-shaped aspheric microlens is then polished by a subsequent non-contact thermal reflow (NCTR) process. Different from the normal process, the reflow process here is designed to improve the surface quality while keeping the pre-modeled shape unchanged, thus avoiding the difficulties of generating the aspheric surface during reflow. Fabrication of a designed aspheric MLA with this method was demonstrated in experiments. Results showed that the obtained aspheric MLA was good in both shape accuracy and surface quality. The presented method may be a promising approach for rapidly fabricating high-quality aspheric microlenses with complex surfaces.
Quality issues in blue noise halftoning
NASA Astrophysics Data System (ADS)
Yu, Qing; Parker, Kevin J.
1998-01-01
The blue noise mask (BNM) is a halftone screen that produces unstructured, visually pleasing dot patterns. The BNM combines the blue-noise characteristics of error diffusion and the simplicity of ordered dither. A BNM is constructed by designing a set of interdependent binary patterns for individual gray levels. In this paper, we investigate the quality issues in blue-noise binary pattern design and mask generation as well as in application to color reproduction. Using a global filtering technique and a local 'force' process for rearranging black and white pixels, we are able to generate a series of binary patterns, all representing a certain gray level, ranging from white-noise patterns to highly structured patterns. The quality of these individual patterns is studied in terms of low-frequency structure and graininess. Typically, the low-frequency structure (LF) is identified with a measurement of the energy around dc in the spatial frequency domain, while the graininess is quantified by a measurement of the average minimum distance (AMD) between minority dots as well as the kurtosis of the local kurtosis distribution (KLK) for minority pixels of the binary pattern. A set of partial BNMs are generated by using the different patterns as unique starting 'seeds.' In this way, we are able to study the quality of binary patterns over a range of gray levels. We observe that the optimality of a binary pattern for mask generation is related to its own quality metric values as well as the transition smoothness of those quality metric values over neighboring levels. Several schemes have been developed to apply blue-noise halftoning to color reproduction. Different schemes generate halftone patterns with different textures. In a previous paper, a human visual system (HVS) model was used to study color halftone quality in terms of luminance and chrominance error in CIELAB color space.
In this paper, a new series of psycho-visual experiments address the 'preferred' color rendering among four different blue noise halftoning schemes. The experimental results will be interpreted with respect to the proposed halftone quality metrics.
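One of the graininess measures described above, the average minimum distance (AMD) between minority dots, can be sketched in a few lines. This is an illustrative reconstruction assuming toroidal (wrap-around) distances, since halftone tiles repeat; it is not the authors' exact code:

```python
import math

def average_min_distance(pattern):
    """pattern: 2D list of 0/1; minority pixels are the less frequent value."""
    h, w = len(pattern), len(pattern[0])
    ones = [(r, c) for r in range(h) for c in range(w) if pattern[r][c] == 1]
    minority = ones if len(ones) <= h * w / 2 else [
        (r, c) for r in range(h) for c in range(w) if pattern[r][c] == 0]
    total = 0.0
    for i, (r1, c1) in enumerate(minority):
        # distance to the nearest other minority pixel (toroidal wrap)
        best = min(
            math.hypot(min(abs(r1 - r2), h - abs(r1 - r2)),
                       min(abs(c1 - c2), w - abs(c1 - c2)))
            for j, (r2, c2) in enumerate(minority) if i != j)
        total += best
    return total / len(minority)

# A perfectly regular pattern on an 8x8 grid: every minority dot is exactly
# 4 pixels (wrapped) from its nearest neighbour, so AMD = 4.0.
grid = [[1 if (r % 4 == 0 and c % 4 == 0) else 0 for c in range(8)] for r in range(8)]
print(average_min_distance(grid))  # 4.0
```

A blue-noise pattern would show a large, tightly clustered minimum-distance distribution; a grainy white-noise pattern would show a smaller, more dispersed one.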
Recovery of condensate water quality in power generator's surface condenser
NASA Astrophysics Data System (ADS)
Kurniawan, Lilik Adib
2017-03-01
In the PT Badak NGL plant, steam turbines are used to drive major power generators, compressors, and pumps. Steam exiting the turbines is condensed in surface condensers to be returned to the boilers; surface condenser performance and condensate water quality are therefore very important. One recent problem was caused by a leak in the surface condenser of a steam turbine power generator. The steam turbine was overhauled, leaving the surface condenser idle and exposed to air for more than 1.5 years. Sea water ingress due to tube leaks worsened the corrosion of the condenser shell. On restart, the combination of mineral scale and corrosion products produced condensate of high conductivity at the condenser outlet, beyond the acceptable limit. After assessing several options, chemical cleaning was found to be the best way to overcome the problem given the condenser configuration. An 8-hour circulation of 5 wt% citric acid succeeded in reducing water conductivity from 50 μmhos/cm to below 5 μmhos/cm. The condensate water then met the required quality, i.e. pH 8.3-9.0 and conductivity ≤ 5 μmhos/cm, and the power generator has since operated normally without any concern.
Highly Integrated Quality Assurance – An Empirical Case
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drake Kirkham; Amy Powell; Lucas Rich
2011-02-01
The Radioisotope Power Systems (RPS) Program of the Idaho National Laboratory makes an empirical case for a highly integrated Quality Assurance function pertaining to the preparation, assembly, testing, storage and transportation of 238Pu-fueled radioisotope thermoelectric generators. Case data represent multiple campaigns, including the Pluto/New Horizons mission, the Mars Science Laboratory mission in progress, and other related projects. Traditional Quality Assurance models attempt to reduce cost by minimizing the role of dedicated Quality Assurance personnel in favor of either functional tasking or peer-based implementations. Highly integrated Quality Assurance adds value by placing trained quality inspectors on the production floor side-by-side with nuclear facility operators to enhance team dynamics, reduce inspection wait time, and provide immediate, independent feedback. Value is also added by maintaining dedicated Quality Engineers to provide rapid identification and resolution of corrective actions, enhanced and expedited supply chain interfaces, improved bonded storage capabilities, and technical resources for requirements management, including data package development and Certificates of Inspection. A broad cost-benefit examination indicates that highly integrated Quality Assurance can reduce cost through the mitigation of risk and reduced administrative burden, thereby allowing engineers to be engineers, nuclear operators to be nuclear operators, and the cross-functional team to operate more efficiently. The applicability of this case extends to any high-value, long-term project where traceability and accountability are determining factors.
Mavromatis, Konstantinos; Land, Miriam L; Brettin, Thomas S; Quest, Daniel J; Copeland, Alex; Clum, Alicia; Goodwin, Lynne; Woyke, Tanja; Lapidus, Alla; Klenk, Hans Peter; Cottingham, Robert W; Kyrpides, Nikos C
2012-01-01
The emergence of next generation sequencing (NGS) has provided the means for rapid, high-throughput sequencing and data generation at low cost, while concomitantly creating a new set of challenges. The number of available assembled microbial genomes continues to grow rapidly, and their quality reflects the quality of the sequencing technology used, but also of the analysis software employed for assembly and annotation. In this work, we have explored the quality of microbial draft genomes across various sequencing technologies. We have compared the draft and finished assemblies of 133 microbial genomes sequenced at the Department of Energy-Joint Genome Institute (JGI) and finished at the Los Alamos National Laboratory using a variety of combinations of sequencing technologies, reflecting the transition of the institute from Sanger-based sequencing platforms to NGS platforms. The quality of the public assemblies and of the associated gene annotations was evaluated using various metrics. Results obtained with the different sequencing technologies, as well as their effects on downstream processes, were analyzed. Our results demonstrate that the Illumina HiSeq 2000 sequencing system, the primary sequencing technology currently used for de novo genome sequencing and assembly at JGI, has various advantages in terms of total sequence throughput and cost, but it also introduces challenges for the downstream analyses. In all cases, assembly results, although on average of high quality, need to be viewed critically, and sources of error in them should be considered prior to analysis. These data follow the evolution of microbial sequencing and downstream processing at the JGI, from draft genome sequences with large gaps corresponding to missing genes of significant biological role, to assemblies with multiple small gaps (Illumina), and finally to assemblies that generate almost complete genomes (Illumina+PacBio).
High quality fuel gas from biomass pyrolysis with calcium oxide.
Zhao, Baofeng; Zhang, Xiaodong; Chen, Lei; Sun, Laizhi; Si, Hongyu; Chen, Guanyi
2014-03-01
The removal of CO2 and tar in fuel gas produced by biomass thermal conversion has aroused more attention due to their adverse effects on the subsequent fuel gas application. High quality fuel gas production from sawdust pyrolysis with CaO was studied in this paper. The results of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) experiments indicate that the mass ratio of CaO to sawdust (Ca/S) remarkably affects the behavior of sawdust pyrolysis. On the basis of the Py-GC/MS results, a system comprising a moving bed pyrolyzer coupled with a fluid bed combustor has been developed to produce high quality fuel gas. The lower heating value (LHV) of the fuel gas was above 16 MJ/Nm^3 and the content of tar was under 50 mg/Nm^3, which is suitable for gas turbine application to generate electricity and heat. Therefore, this technology may be a promising route to achieve high quality fuel gas for biomass utilization. Copyright © 2014 Elsevier Ltd. All rights reserved.
INTEGRATED POWER GENERATION SYSTEMS FOR COAL MINE WASTE METHANE UTILIZATION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peet M. Soot; Dale R. Jesse; Michael E. Smith
2005-08-01
An integrated system to utilize the waste coal mine methane (CMM) at the Federal No. 2 Coal Mine in West Virginia was designed and built. The system includes power generation, using internal combustion engines, along with gas processing equipment to upgrade sub-quality waste methane to pipeline quality standards. The power generation has a nominal capacity of 1,200 kW and the gas processing system can treat about 1 million cubic feet per day (1 MMCFD) of gas. The gas processing is based on the Northwest Fuel Development, Inc. (NW Fuel) proprietary continuous pressure swing adsorption (CPSA) process that can remove nitrogen from CMM streams. The two major components of the integrated system are synergistic. The byproduct gas stream from the gas processing equipment can be used as fuel for the power generating equipment. In return, the power generating equipment provides the nominal power requirements of the gas processing equipment. This Phase III effort followed Phase I, which was comprised of a feasibility study for the project, and Phase II, where the final design for the commercial-scale demonstration was completed. The fact that NW Fuel is desirous of continuing to operate the equipment on a commercial basis provides the validation for having advanced the project through all of these phases. The limitation experienced by the project during Phase III was that the CMM available to operate the CPSA system on a commercial basis was not of sufficiently high quality. NW Fuel's CPSA process is limited in its applicability, requiring a relatively high quality of gas as the feed to the process. The CPSA process was demonstrated during Phase III for a limited time, during which the processing capabilities met the expected results, but the process was never capable of providing pipeline quality gas from the available low quality CMM. The NW Fuel CPSA process is a low-cost ''polishing unit'' capable of removing a few percent nitrogen.
It was never intended to process CMM streams containing high levels of nitrogen, as is now the case at the Federal No. 2 Mine. Even lacking the CPSA pipeline-delivery demonstration, the project was successful in laying the groundwork for future commercial applications of the integrated system. The operation can still provide a guide for other coal mines that need options for utilizing their methane resources. The designed system can be used as a complete template, or individual components of the system can be segregated and utilized separately at other mines. The use of the CMM not only provides an energy fuel from an otherwise wasted resource, but also yields an environmental benefit by reducing greenhouse gas emissions: methane has roughly twenty times the greenhouse effect of the carbon dioxide that its combustion generates, so the net greenhouse gas mitigation is substantial.
Le, Van So; Do, Zoe Phuc-Hien; Le, Minh Khoi; Le, Vicki; Le, Natalie Nha-Truc
2014-06-10
Methods of increasing the performance of radionuclide generators used in nuclear medicine radiotherapy and SPECT/PET imaging were developed and detailed for the 99Mo/99mTc and 68Ge/68Ga radionuclide generators as case studies. Optimisation methods relating daughter nuclide build-up to stand-by time and/or specific activity, using mean progress functions, were developed for increasing the performance of radionuclide generators. As a result of this optimisation, the separation of the daughter nuclide from its parent should be performed at a defined optimal time to avoid deterioration in the specific activity of the daughter nuclide and wasted generator stand-by time, while maintaining a reasonably high daughter nuclide yield. A new characteristic parameter of the formation-decay kinetics of the parent/daughter nuclide system was found and used effectively in the practice of generator production and utilisation. A method of "early elution schedule" was also developed for increasing the daughter nuclide production yield and specific radioactivity, thus saving the cost of the generator and improving the quality of the daughter radionuclide solution. These newly developed optimisation methods, in combination with a recently developed integrated elution-purification-concentration system for radionuclide generators, are the most suitable way to operate the generator effectively, balancing economic use against suitable quality and specific activity of the produced daughter radionuclides. All these features benefit the economic use of the generator, improve the quality of labelling/scans, and lower the cost of nuclear medicine procedures.
In addition, a new quality control (QC) protocol set-up method for post-delivery testing of radionuclidic purity has been developed, based on the relationship between the gamma-ray spectrometric detection limit, the required limit of impure-radionuclide activity, and its measurement certainty with respect to optimising the decay/measurement time and the product sample activity used for QC. The optimisation ensures certainty in the measurement of the specific impure radionuclide and avoids wasting the useful amount of valuable purified/concentrated daughter-nuclide product. This process is important for the spectrometric measurement of very low activities of impure-radionuclide contamination in radioisotope products of much higher activity used in medical imaging and targeted radiotherapy.
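The build-up-versus-stand-by-time optimisation described above follows from the Bateman equations for a parent/daughter pair: after a complete elution, the daughter activity rises, peaks, and then decays with the parent. A minimal sketch for the 99Mo/99mTc case (half-lives and branching fraction are textbook values, not taken from this paper):

```python
import math

# Half-lives (hours) and the fraction of Mo-99 decays feeding Tc-99m (~87.6%)
T_MO99, T_TC99M, BRANCH = 65.94, 6.01, 0.876

lam_p = math.log(2) / T_MO99   # parent decay constant (1/h)
lam_d = math.log(2) / T_TC99M  # daughter decay constant (1/h)

def tc99m_activity(t_h, a_parent0=1.0):
    """Relative Tc-99m activity (Bateman solution) t_h hours after a
    complete elution, for unit parent activity at t = 0."""
    return (a_parent0 * BRANCH * lam_d / (lam_d - lam_p)
            * (math.exp(-lam_p * t_h) - math.exp(-lam_d * t_h)))

# Stand-by time that maximizes the daughter build-up between elutions
t_opt = math.log(lam_d / lam_p) / (lam_d - lam_p)
print(f"optimal elution interval = {t_opt:.1f} h")  # ~22.9 h
```

Eluting much later than this peak wastes stand-by time and degrades the daughter's specific activity, while eluting much earlier sacrifices yield, which is exactly the trade-off the abstract's optimisation targets.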
Internet Interventions for Long-Term Conditions: Patient and Caregiver Quality Criteria
Murray, Elizabeth; Stevenson, Fiona; Gore, Charles; Nazareth, Irwin
2006-01-01
Background Interactive health communication applications (IHCAs) that combine high-quality health information with interactive components, such as self-assessment tools, behavior change support, peer support, or decision support, are likely to benefit people with long-term conditions. IHCAs are now largely Web-based and are becoming known as "Internet interventions." Although there are numerous professionally generated criteria to assess health-related websites, to date there has been scant exploration of patient-generated assessment criteria even though patients and professionals use different criteria for assessing the quality of traditional sources of health information. Objective We aimed to determine patients' and caregivers' requirements of IHCAs for long-term conditions as well as their criteria for assessing the quality of different programs. Methods This was a qualitative study with focus groups. Patients and caregivers managing long-term conditions used three (predominantly Web-based) IHCAs relevant to their condition and subsequently discussed the strengths and weaknesses of the different IHCAs in focus groups. Participants in any one focus group all shared the same long-term condition and viewed the same three IHCAs. Patient and caregiver criteria for IHCAs emerged from the data. Results There were 40 patients and caregivers who participated in 10 focus groups. Participants welcomed the potential of Internet interventions but felt that many were not achieving their full potential. Participants generated detailed and specific quality criteria relating to information content, presentation, interactivity, and trustworthiness, which can be used by developers and purchasers of Internet interventions. Conclusions The user-generated quality criteria reported in this paper should help developers and purchasers provide Internet interventions that better meet user needs. PMID:16954123
NASA Astrophysics Data System (ADS)
Hua, Yi-Lin; Zhou, Zong-Quan; Liu, Xiao; Yang, Tian-Shu; Li, Zong-Feng; Li, Pei-Yun; Chen, Geng; Xu, Xiao-Ye; Tang, Jian-Shun; Xu, Jin-Shi; Li, Chuan-Feng; Guo, Guang-Can
2018-01-01
A photon pair can be entangled in many degrees of freedom such as polarization, time bins, and orbital angular momentum (OAM). Among them, the OAM of photons can be entangled in an infinite-dimensional Hilbert space, which enhances the channel capacity for sharing information in a network. Twisted photons generated by spontaneous parametric down-conversion offer an opportunity to create this high-dimensional entanglement, but a photon pair generated by this process is typically wideband, which makes it difficult to interface with the quantum memories in a network. Here we propose an annular-ring-type quasi-phase-matching (QPM) crystal for generation of narrowband high-dimensional entanglement. The structure of the QPM crystal is designed by tracking the geometric divergences of the OAM modes that comprise the entangled state. The dimensionality and the quality of the entanglement can be greatly enhanced with the annular-ring-type QPM crystal.
NASA Astrophysics Data System (ADS)
Wang, Yuan; Chen, Zhidong; Sang, Xinzhu; Li, Hui; Zhao, Linmin
2018-03-01
Holographic displays can provide the complete optical wave field of a three-dimensional (3D) scene, including depth perception. However, producing traditional computer-generated holograms (CGHs) often takes a long computation time, even without complex and photorealistic rendering. The backward ray-tracing technique is able to render photorealistic, high-quality images and noticeably reduces the computation time owing to its high degree of parallelism. Here, a high-efficiency photorealistic computer-generated hologram method based on the ray-tracing technique is presented. Rays are launched and traced in parallel under different illuminations and circumstances. Experimental results demonstrate the effectiveness of the proposed method. Compared with the traditional point-cloud CGH, the computation time is decreased to 24 s to reconstruct a 3D object of 100 × 100 rays with continuous depth change.
High Resolution X-ray-Induced Acoustic Tomography
Xiang, Liangzhong; Tang, Shanshan; Ahmad, Moiz; Xing, Lei
2016-01-01
Absorption-based CT imaging has been an invaluable tool in medical diagnosis, biology, and materials science. However, CT requires a large set of projection data and a high radiation dose to achieve superior image quality. In this letter, we report a new imaging modality, X-ray-Induced Acoustic Tomography (XACT), which takes advantage of high sensitivity to X-ray absorption and high ultrasonic resolution in a single modality. A single-projection X-ray exposure is sufficient to generate acoustic signals in 3D space because the X-ray-generated acoustic waves are of a spherical nature and propagate in all directions from their point of generation. We demonstrate the successful reconstruction of gold fiducial markers with a spatial resolution of about 350 μm. XACT reveals a new imaging mechanism and provides uncharted opportunities for structural determination with X-rays. PMID:27189746
Sanchez, Travis; Baral, Stefan; Mee, Paul; Sabin, Keith; Garcia-Calleja, Jesus M; Hargreaves, James
2018-01-01
To guide HIV prevention and treatment activities up to 2020, we need to generate and make better use of high quality HIV surveillance data. To highlight our surveillance needs, a special collection of papers in JMIR Public Health and Surveillance has been released under the title “Improving Global and National Responses to the HIV Epidemic Through High Quality HIV Surveillance Data.” We provide a summary of these papers and highlight methods for developing a new HIV surveillance architecture. PMID:29444766
Generation Y in healthcare: leading millennials in an era of reform.
Piper, Llewellyn E
2012-01-01
The healthcare workforce has grown with the addition of a new group of physicians, nurses, allied health professionals, administrators, and support staff who belong to America's youngest generation now in the workforce: generation Y, or the millennials. This generation consists of more than 70 million people, the oldest of whom are now in their late 20s and early 30s. With traits and workplace expectations that differ from those observed in other generations, and with a size that threatens to overtake the total number of baby boomers, generation Yers are positioned to influence (if not drastically change) current leadership approaches. The common traits that define or are associated with generation Y workers are often regarded as barriers yet provide healthcare leaders with a clear guide to understanding these employees and drawing out their best qualities and performance. For the organization to fulfill its social contract to provide high-quality, cost-effective, and safe healthcare, it must satisfy the needs and manage the expectations of those who directly deliver these services. This is especially important in today's environment, which is marked by the still-fluid stipulations of the Affordable Care Act (ACA), changed consumer expectations, and public demands for transparency and accountability.
Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura
2016-01-01
Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been proven able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel-weighting factor-based re-optimization algorithm, which was enhanced into a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, the IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for the planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. In addition, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans.
Moreover, the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be satisfied when the TPS-QC tool generated re-optimized plans, without sacrificing other dosimetric endpoints. Beyond its feasibility and accuracy, the proposed TPS-QC tool is also user-friendly and easy to operate, both of which are necessary characteristics for clinical use.
Jun, Goo; Wing, Mary Kate; Abecasis, Gonçalo R; Kang, Hyun Min
2015-06-01
The analysis of next-generation sequencing data is computationally and statistically challenging because of the massive volume of data and imperfect data quality. We present GotCloud, a pipeline for efficiently detecting and genotyping high-quality variants from large-scale sequencing data. GotCloud automates sequence alignment, sample-level quality control, variant calling, filtering of likely artifacts using machine-learning techniques, and genotype refinement using haplotype information. The pipeline can process thousands of samples in parallel and requires less computational resources than current alternatives. Experiments with whole-genome and exome-targeted sequence data generated by the 1000 Genomes Project show that the pipeline provides effective filtering against false positive variants and high power to detect true variants. Our pipeline has already contributed to variant detection and genotyping in several large-scale sequencing projects, including the 1000 Genomes Project and the NHLBI Exome Sequencing Project. We hope it will now prove useful to many medical sequencing studies. © 2015 Jun et al.; Published by Cold Spring Harbor Laboratory Press.
Gilmore, Elisabeth A; Adams, Peter J; Lave, Lester B
2010-05-01
Generators installed for backup power during blackouts could help satisfy peak electricity demand; however, many are diesel generators with nonnegligible air emissions that may damage air quality and human health. The full (private and social) costs of using diesel generators with and without emission control retrofits for fine particulate matter (PM2.5) and nitrogen oxides (NOx) were compared with a new natural gas turbine peaking plant. Lower private costs were found for the backup generators because the capital costs are mostly ascribed to reliability. To estimate the social costs from air quality, the changes in ambient concentrations of ozone (O3) and PM2.5 were modeled using the Particulate Matter Comprehensive Air Quality Model with extensions (PMCAMx) chemical transport model. These air quality changes were translated to their equivalent human health effects using concentration-response functions and then into dollars using estimates of "willingness-to-pay" to avoid ill health. As a case study, 1000 MW of backup generation operating for 12 hr/day for 6 days in each of four eastern U.S. cities (Atlanta, Chicago, Dallas, and New York) was modeled. In all cities, modeled PM2.5 concentrations increased (up to 5 microg/m3), due mainly to primary emissions. Smaller increases and decreases were observed for secondary PM2.5, with more variation between cities. Increases in NOx emissions resulted in significant nitrate formation (up to 1 microg/m3) in Atlanta and Chicago. The NOx emissions also caused O3 decreases in the urban centers and increases in the surrounding areas. For PM2.5, a social cost of approximately $2/kWh was calculated for uncontrolled diesel generators in highly populated cities but was under 10 cents/kWh with PM2.5 and NOx controls. On a full cost basis, it was found that properly controlled diesel generators are cost-effective for meeting peak electricity demand. The authors recommend NOx and PM2.5 controls.
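The health-impact step the authors describe (concentration changes translated into health effects via concentration-response functions, then into dollars via willingness-to-pay) can be sketched with a standard log-linear impact function. All constants below are illustrative assumptions, not values from this study:

```python
import math

def excess_deaths(delta_pm, population, baseline_rate, beta=0.0058):
    """Log-linear concentration-response sketch (BenMAP-style):
    excess deaths = y0 * pop * (1 - exp(-beta * dPM)).
    beta is an illustrative all-cause mortality coefficient per ug/m3
    of PM2.5, not a value taken from this study."""
    return baseline_rate * population * (1.0 - math.exp(-beta * delta_pm))

def social_cost(deaths, vsl=7.4e6):
    """Monetize with an assumed willingness-to-pay value of a statistical life."""
    return deaths * vsl

# Hypothetical city: 5 million people, 0.8% baseline mortality, +5 ug/m3 PM2.5
d = excess_deaths(delta_pm=5.0, population=5e6, baseline_rate=0.008)
print(round(d), "excess deaths,", round(social_cost(d) / 1e9, 1), "$ billion")
```

Dividing a monetized total of this kind by the kilowatt-hours generated is what yields the per-kWh social costs the abstract reports.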
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Chuan S.; Shao, Xi
2016-06-14
The main objective of our work is to provide a theoretical basis and modeling support for the design and experimental setup of a compact laser proton accelerator to produce high-quality proton beams tunable in energy from 50 to 250 MeV using a short-pulse sub-petawatt laser. We performed theoretical and computational studies of energy scaling and Rayleigh-Taylor instability development in laser radiation pressure acceleration (RPA) and developed novel RPA-based schemes to remedy/suppress instabilities for high-quality quasimonoenergetic proton beam generation, as we proposed. During the project period, we published nine peer-reviewed journal papers and made twenty conference presentations, including six invited talks, on our work. The project supported one graduate student, who received his PhD degree in physics in 2013, and two post-doctoral associates. We also mentored three high school students and one undergraduate physics major by inspiring their interest and involving them in the project.
ERIC Educational Resources Information Center
Vick, Matthew
2016-01-01
Science teaching continues to move away from teaching science as merely a body of facts and figures to be memorized to a process of exploring and drawing conclusions. The Next Generation Science Standards (NGSS) emphasize eight science and engineering practices that ask students to apply scientific and engineering reasoning and explanation. This…
Dynamical aspects of behavior generation under constraints
Harter, Derek; Achunala, Srinivas
2007-01-01
Dynamic adaptation is a key feature of brains, helping to maintain the quality of their performance in the face of increasingly difficult constraints. How to achieve high-quality performance under demanding real-time conditions is an important question in the study of cognitive behaviors. Animals and humans are embedded in and constrained by their environments. Our goal is to improve the understanding of the dynamics of the interacting brain-environment system by studying human behaviors when completing constrained tasks and by modeling the observed behavior. In this article we present results of experiments with humans performing tasks on the computer under variable time and resource constraints. We compare various models of behavior generation in order to describe the observed human performance. Finally, we speculate on mechanisms by which chaotic neurodynamics can contribute to the generation of flexible human behaviors under constraints. PMID:19003514
Development and Validation of a High-Quality Composite Real-World Mortality Endpoint.
Curtis, Melissa D; Griffith, Sandra D; Tucker, Melisa; Taylor, Michael D; Capra, William B; Carrigan, Gillis; Holzman, Ben; Torres, Aracelis Z; You, Paul; Arnieri, Brandon; Abernethy, Amy P
2018-05-14
To create a high-quality electronic health record (EHR)-derived mortality dataset for retrospective and prospective real-world evidence generation. Oncology EHR data, supplemented with external commercial and US Social Security Death Index data, were benchmarked to the National Death Index (NDI). We developed a recent, linkable, high-quality mortality variable amalgamated from multiple data sources to supplement EHR data, benchmarked against the most complete source of US mortality data, the NDI. The data quality of the mortality variable version 2.0 is reported here. For advanced non-small-cell lung cancer, the sensitivity of mortality information improved from 66 percent in EHR structured data to 91 percent in the composite dataset, with high date agreement compared to the NDI. For advanced melanoma, metastatic colorectal cancer, and metastatic breast cancer, the sensitivity of the final variable was 85 to 88 percent. Kaplan-Meier survival analyses showed that improving mortality data completeness minimized overestimation of survival relative to NDI-based estimates. For EHR-derived data to yield reliable real-world evidence, it needs to be of known and sufficiently high quality. Considering the impact of mortality data completeness on survival endpoints, we highlight the importance of data quality assessment and advocate benchmarking to the NDI. © 2018 The Authors. Health Services Research published by Wiley Periodicals, Inc. on behalf of Health Research and Educational Trust.
Choices of capture chromatography technology in antibody manufacturing processes.
DiLeo, Michael; Ley, Arthur; Nixon, Andrew E; Chen, Jie
2017-11-15
The capture process employed in monoclonal antibody downstream purification is not only the process most critically impacted by the increased antibody titers resulting from optimized mammalian cell culture expression systems, but also the most important purification step in determining overall process throughput, product quality, and economics. Advances in separation technology for capturing antibodies from complex feedstocks have been one focus of downstream purification process innovation for the past 10 years. In this study, we evaluated new-generation chromatography resins used in the antibody capture process, including Protein A, cation exchange, and mixed-mode chromatography, to address the benefits and unique challenges posed by each chromatography approach. Our results demonstrate the benefit of the improved binding capacity of new-generation Protein A resins, address the concern of aggregation caused by high-concentration surges when using new-generation cation exchange resins with over 100 mg/mL binding capacity, and highlight the potential of multimodal cation exchange resins for capture process design. The new landscape of capture chromatography technologies provides options to achieve the overall downstream purification outcome with high product quality and process efficiency. Copyright © 2017 Elsevier B.V. All rights reserved.
SeSaM-Tv-II generates a protein sequence space that is unobtainable by epPCR.
Mundhada, Hemanshu; Marienhagen, Jan; Scacioc, Andreea; Schenk, Alexander; Roccatano, Danilo; Schwaneberg, Ulrich
2011-07-04
Generating high-quality mutant libraries in which each amino acid is equally targeted and substituted in a chemically diverse manner is crucial for obtaining improved variants in small mutant libraries. The sequence saturation mutagenesis method (SeSaM-Tv(+)) offers the opportunity to generate such high-quality mutant libraries by introducing consecutive mutations and by enriching transversions. In this study, automated gel electrophoresis, real-time quantitative PCR, and a phosphorimager quantification system were developed and employed to optimize each step of the previously reported SeSaM-Tv(+) method. Advancements of the SeSaM-Tv(+) protocol and the use of a novel DNA polymerase quadrupled the number of transversions by doubling the fraction of consecutive mutations (from 16.7 to 37.1%). About 33% of all amino acid substitutions observed in a model library are rarely introduced by epPCR methods, and around 10% of all clones carried amino acid substitutions that are unobtainable by epPCR. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Modular design and implementation of field-programmable-gate-array-based Gaussian noise generator
NASA Astrophysics Data System (ADS)
Li, Yuan-Ping; Lee, Ta-Sung; Hwang, Jeng-Kuang
2016-05-01
The modular design of a Gaussian noise generator (GNG) based on field-programmable gate array (FPGA) technology was studied. A new range-reduction architecture was included in a series of elementary function evaluation modules and was integrated into the GNG system. The approximation and quantisation errors for the square root module with a first-order polynomial approximation were high; therefore, we used the central limit theorem (CLT) to improve the noise quality, which resulted in an output rate of one sample per clock cycle. We subsequently applied Newton's method for the square root module, eliminating the need for the CLT and raising the output rate to two samples per clock cycle (>200 million samples per second). Two statistical tests confirmed that our GNG is of high quality. Furthermore, the range reduction, which is used to address the limited interval of the function approximation algorithms of the System Generator platform using Xilinx FPGAs, appeared to have higher numerical accuracy, operated at >350 MHz, and can suitably be applied to any function evaluation.
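The CLT trick referred to above, summing several uniform variates so their total approaches a Gaussian, can be illustrated in software (the hardware design sums fixed-point uniform samples; this Python sketch only demonstrates the statistics):

```python
import random
import statistics

def clt_gaussian(n_uniform=12, rng=random.random):
    """Approximate N(0,1) sample via the central limit theorem: sum n
    uniform(0,1) variates, then centre and scale. With n_uniform = 12 the
    variance of the sum is exactly 1 (each uniform contributes 1/12), the
    classic choice in hardware Gaussian noise generators."""
    s = sum(rng() for _ in range(n_uniform))
    return (s - n_uniform / 2.0) / (n_uniform / 12.0) ** 0.5

samples = [clt_gaussian() for _ in range(100_000)]
print(round(statistics.mean(samples), 2), round(statistics.stdev(samples), 2))
```

Summing smooths the residual approximation error of the per-module polynomial evaluations, at the cost of consuming several uniform samples per output, which is why replacing it with a more accurate Newton-iteration square root doubled the generator's output rate.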
The New Tropospheric Product of the International GNSS Service
NASA Technical Reports Server (NTRS)
Byun, Sung H.; Bar-Sever, Yoaz E.; Gendt, Gerd
2005-01-01
We compare this new approach for generating the IGS tropospheric products with the previous approach, which was based on explicit combination of total zenith delay contributions from the IGS ACs. The new approach enables the IGS to rapidly generate highly accurate and highly reliable total zenith delay time series for many hundreds of sites, thus increasing the utility of the products to weather modelers, climatologists, and GPS analysts. In this paper we describe this new method, and discuss issues of accuracy, quality control, utility of the new products and assess its benefits.
NASA Astrophysics Data System (ADS)
Bisadi, Zahra; Acerbi, Fabio; Fontana, Giorgio; Zorzi, Nicola; Piemonte, Claudio; Pucker, Georg; Pavesi, Lorenzo
2018-02-01
A small-sized photonic quantum random number generator that is easy to implement in small electronic devices is in high demand nowadays for secure data encryption and other applications. Here, we propose a compact configuration with a silicon-nanocrystal large-area light-emitting device (LED) coupled to a silicon photomultiplier to generate random numbers. The random number generation methodology is based on the photon arrival time and is robust against the non-idealities of the detector and the source of quantum entropy. The raw data show a high quality of randomness and pass all the statistical tests in the National Institute of Standards and Technology (NIST) suite without a post-processing algorithm. The highest bit rate is 0.5 Mbps, with an efficiency of 4 bits per detected photon.
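One common arrival-time extraction scheme, comparing consecutive inter-arrival intervals, is naturally insensitive to slow drifts in source intensity or detector efficiency, which is the kind of robustness the abstract claims. The paper's actual bit-extraction rule may differ; this is a generic sketch using simulated Poisson photon statistics:

```python
import random

def bits_from_arrivals(intervals):
    """Illustrative extractor: compare consecutive inter-arrival times and
    emit a 0 or 1 depending on which is longer (ties are discarded).
    Comparing neighbouring intervals makes the output insensitive to slow
    drifts in the mean photon rate."""
    bits = []
    for t1, t2 in zip(intervals[::2], intervals[1::2]):
        if t1 != t2:
            bits.append(0 if t1 < t2 else 1)
    return bits

# Simulate exponentially distributed inter-arrival times (Poisson photon stream)
intervals = [random.expovariate(1e6) for _ in range(20_000)]
bits = bits_from_arrivals(intervals)
print(len(bits), bits.count(1) / len(bits))  # ~10000 bits, frequency near 0.5
```

Schemes that digitize the arrival time itself into several clock bins can extract multiple bits per photon, which is how efficiencies like the reported 4 bits per detected photon become possible.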
NASA Astrophysics Data System (ADS)
Farkas, C. M.; Moeller, M.; Carlton, A. G.
2013-12-01
Photochemical transport models routinely underpredict peak air quality events. This deficiency may be due, in part, to inadequate temporalization of emissions from the electricity generating sector. The National Emissions Inventory (NEI) reports emissions from Electric Generating Units (EGUs) either from Continuous Emission Monitors (CEMs) that report hourly values or as an annual total. The Sparse Matrix Operator Kernel Emissions (SMOKE) preprocessor, used to prepare emissions data for modeling with the CMAQ air quality model, allocates annual emission totals throughout the year using specific monthly, weekly, and hourly weights according to standard classification code (SCC) and location. This approach represents average diurnal and seasonal patterns of electricity generation but does not capture spikes in emissions due to episodic use, as with peaking units, or due to extreme weather events. In this project we use a combination of state air quality permits, CEM data, and EPA emission factors to more accurately temporalize emissions of NOx, SO2, and particulate matter (PM) during the extensive heat wave of July and August 2006. Two CMAQ simulations are conducted: the first with the base NEI emissions and the second with improved temporalization that is more representative of actual emissions during the heat wave. Predictions from both simulations are evaluated with O3 and PM measurement data from EPA's National Air Monitoring Stations (NAMS) and State and Local Air Monitoring Stations (SLAMS) during the heat wave, for which ambient concentrations of criteria pollutants were often above the NAAQS. During periods of increased photochemistry and high pollutant concentrations, it is critical that emissions are accurately represented in air quality models.
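The temporal allocation step described above, splitting an annual EGU total with monthly, day-of-week, and hour-of-day weight profiles, can be sketched as follows. The profiles here are hypothetical, not real SCC profiles, and the day-of-week step is simplified:

```python
def hourly_emission(annual_total, month_w, day_w, hour_w, month, weekday, hour):
    """SMOKE-style temporal allocation sketch: split an annual EGU total
    using normalized monthly, day-of-week, and hour-of-day weight profiles.
    (The real preprocessor also accounts for how many of each weekday fall
    in the month; that detail is omitted here for brevity.)"""
    return (annual_total
            * month_w[month] / sum(month_w)
            * day_w[weekday] / sum(day_w)
            * hour_w[hour] / sum(hour_w))

# Hypothetical flat monthly/weekly profiles and a toy afternoon-peaking diurnal
flat_month = [1.0] * 12
flat_day = [1.0] * 7
diurnal = [0.5] * 8 + [1.0] * 6 + [2.0] * 4 + [1.0] * 6  # 24 hourly weights

annual = 8760.0  # e.g., tons of NOx per year
print(round(hourly_emission(annual, flat_month, flat_day, diurnal, 6, 2, 15), 2))
```

Because the weights are fixed climatological averages, every July afternoon gets the same allocation; an actual heat-wave spike in generation (and hence emissions) is flattened away, which is the deficiency the project addresses with permit, CEM, and emission-factor data.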
Zheng, Xiasheng; Zhang, Peng; Liao, Baosheng; Li, Jing; Liu, Xingyun; Shi, Yuhua; Cheng, Jinle; Lai, Zhitian; Xu, Jiang; Chen, Shilin
2017-01-01
Herbal medicine is a major component of complementary and alternative medicine, contributing significantly to the health of many people and communities. Quality control of herbal medicine is crucial to ensure that it is safe and sound for use. Here, we investigated a comprehensive quality evaluation system for a classic herbal medicine, Danggui Buxue Formula, by applying genetic-based and analytical chemistry approaches to authenticate and evaluate the quality of its samples. For authenticity, we successfully applied two novel technologies, third-generation sequencing and PCR-DGGE (denaturing gradient gel electrophoresis), to analyze the ingredient composition of the tested samples. For quality evaluation, we used high performance liquid chromatography assays to determine the content of chemical markers to help estimate the dosage relationship between its two raw materials, plant roots of Huangqi and Danggui. A series of surveys was then conducted for several exogenous contaminations, aiming to further assess the efficacy and safety of the samples. In conclusion, the quality evaluation system demonstrated here can potentially address the authenticity, quality, and safety of herbal medicines, thus providing novel insight for enhancing their overall quality control. Highlight: We established a comprehensive quality evaluation system for herbal medicine by combining two genetic-based approaches, third-generation sequencing and DGGE (denaturing gradient gel electrophoresis), with analytical chemistry approaches to establish the authenticity and quality profile of the samples. PMID:28955365
The viability of ADVANTG deterministic method for synthetic radiography generation
NASA Astrophysics Data System (ADS)
Bingham, Andrew; Lee, Hyoung K.
2018-07-01
Fast simulation techniques for generating high-resolution synthetic radiographic images are helpful when new radiation imaging systems are designed. However, the standard stochastic approach requires lengthy run times, with poorer statistics at higher resolution. The viability of a deterministic approach to synthetic radiography image generation was explored, with the aim of analyzing the computational time decrease over the stochastic method. ADVANTG was compared to MCNP in multiple scenarios, including a small radiography system prototype, to simulate high-resolution radiography images. By using the ADVANTG deterministic code to simulate radiography images, the computational time was found to decrease by a factor of 10 to 13 compared to the MCNP stochastic approach while retaining image quality.
Colour computer-generated holography for point clouds utilizing the Phong illumination model.
Symeonidou, Athanasia; Blinder, David; Schelkens, Peter
2018-04-16
A technique integrating the bidirectional reflectance distribution function (BRDF) is proposed to generate realistic, high-quality colour computer-generated holograms (CGHs). We build on prior work, namely a fast computer-generated holography method for point clouds that handles occlusions. We extend the method by integrating the Phong illumination model so that the properties of the objects' surfaces are taken into account to achieve natural light phenomena such as reflections and shadows. Our experiments show that rendering holograms with the proposed algorithm produces realistic-looking objects without any noteworthy increase in the computational cost.
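The Phong model the method integrates combines ambient, diffuse, and specular terms at each surface point. A minimal scalar sketch (the coefficients are illustrative, not the paper's; in the CGH setting a per-point weight of this form would scale each point source's amplitude before hologram accumulation):

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def phong(normal, light_dir, view_dir, ka=0.1, kd=0.7, ks=0.2, shininess=32):
    """Scalar Phong intensity at a surface point: ambient + diffuse + specular.
    All vectors point away from the surface; coefficients are illustrative."""
    n, l, v = normalize(normal), normalize(light_dir), normalize(view_dir)
    ndotl = sum(a * b for a, b in zip(n, l))
    if ndotl <= 0:
        return ka  # light behind the surface: ambient term only
    # reflection of the light vector about the normal: r = 2(n.l)n - l
    r = tuple(2 * ndotl * nc - lc for nc, lc in zip(n, l))
    spec = max(0.0, sum(a * b for a, b in zip(r, v))) ** shininess
    return ka + kd * ndotl + ks * spec

print(round(phong((0, 0, 1), (0, 0, 1), (0, 0, 1)), 6))  # head-on: 1.0
```

The specular exponent (`shininess`) controls highlight tightness; evaluating such a weight per point is cheap relative to hologram accumulation, consistent with the reported negligible cost increase.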
Low-Temperature Fumigation of Harvested Lettuce Using a Phosphine Generator.
Liu, Yong-Biao
2018-02-28
A research-scale phosphine generator, QuickPHlo-R, from United Phosphorus Ltd. (Mumbai, India) was tested to determine whether it was suitable for low-temperature fumigation and oxygenated phosphine fumigation of harvested lettuce. Vacuum-cooled Iceberg and Romaine lettuce (Lactuca sativa) were fumigated in 442-liter chambers at 2°C for 24 and 72 h for control of western flower thrips [Frankliniella occidentalis (Pergande) (Thysanoptera: Thripidae)] and lettuce aphid [Nasonovia ribisnigri (Mosely) (Homoptera: Aphididae)]. Oxygenated phosphine fumigation for 48 h under 60% O2 was also conducted at 2°C with Iceberg and Romaine lettuce for control of lettuce aphid. The generator completed phosphine generation in 60-90 min. Complete control of western flower thrips was achieved in the 24-h treatment, and the 48-h oxygenated fumigation and 72-h regular fumigation treatments completely controlled lettuce aphid. Lettuce quality was evaluated 14 d after fumigation. There was increased incidence of brown stains on fumigated Iceberg lettuce, and the increases were more obvious in longer (≥48 h) treatments. Both Iceberg and Romaine lettuce from all treatments and controls had good visual quality, even though there was significantly higher brown stain incidence on fumigated Iceberg lettuce in the ≥48-h treatments and significant differences in quality score for both Iceberg and Romaine lettuce in the 72-h treatment. The brown stains were likely due to the high sensitivity of lettuce to carbon dioxide. The study indicated that the QuickPHlo-R phosphine generator has potential for low-temperature phosphine fumigation due to its quick establishment of desired phosphine levels, efficacy in pest control, and reasonable safety to product quality.
USDA-ARS?s Scientific Manuscript database
The rapid advancement in high-throughput SNP genotyping technologies along with next generation sequencing (NGS) platforms has decreased the cost, improved the quality of large-scale genome surveys, and allowed specialty crops with limited genomic resources such as carrot (Daucus carota) to access t...
Method to generate high efficient devices which emit high quality light for illumination
Krummacher, Benjamin C.; Mathai, Mathew; Choong, Vi-En; Choulis, Stelios A.
2009-06-30
An electroluminescent apparatus includes an OLED device emitting light in the blue and green spectrums, and at least one down conversion layer. The down conversion layer absorbs at least part of the green spectrum light and emits light in at least one of the orange spectra and red spectra.
USDA-ARS?s Scientific Manuscript database
Solid-liquid separation of the raw manure increases the capacity of decision making and opportunities for treatment. The high-rate separation up-front using flocculants allows recovery of most of the organic compounds, which can be used for manufacture of high-quality compost materials. However, t...
NASA Astrophysics Data System (ADS)
Ba Dinh, Khuong; Le, Hoang Vu; Hannaford, Peter; Van Dao, Lap
2017-08-01
A table-top coherent diffractive imaging experiment on a sample with biological-like characteristics is performed using a focused, narrow-bandwidth high-harmonic source around 30 nm. An approach involving a beam stop and a new reconstruction algorithm to enhance the quality of the reconstructed image is described.
Many structural and functional aspects of the vertebrate hypothalamic-pituitary-gonadal (HPG) axis are known to be highly conserved, but the full significance of this from a toxicological perspective has received comparatively little attention. High-quality data generated throug...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akou, H., E-mail: h.akou@nit.ac.ir; Hamedi, M.
2015-10-15
In this paper, the generation of a high-quality, high-energy micro electron beam in vacuum by a chirped Gaussian laser pulse in the presence of an axial magnetic field is numerically investigated. The energy and angular spectra, emittance, and position distribution of the electron beam are compared in two cases, i.e., in the presence and absence of an external magnetic field. The electron beam is accelerated to higher energy and has better spatial quality in the presence of the magnetic field. The axial magnetic field improves both the beam's spatial quality and its energy gain by keeping the electron motion parallel to the direction of propagation over longer distances. It is found that a 64 μm electron bunch with an initial energy of the order of MeV becomes a 20 μm electron beam with an energy of the order of GeV after interacting with a laser pulse in the presence of an external magnetic field.
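Charged-particle trajectories in combined electric and magnetic fields, as studied here, are typically integrated numerically. A minimal non-relativistic Boris pusher is sketched below as a standard building block for such simulations; it is not the paper's relativistic model, and the field values in the usage example are hypothetical:

```python
import numpy as np

def boris_push(v, E, B, q_m, dt):
    """One Boris velocity update under fields E and B.

    v, E, B are 3-vectors; q_m is the charge-to-mass ratio.
    The Boris scheme conserves kinetic energy exactly when E = 0,
    which is why it is a workhorse for magnetized-beam simulations.
    """
    v_minus = v + 0.5 * q_m * E * dt          # first half electric kick
    t = 0.5 * q_m * B * dt                    # magnetic rotation vector
    s = 2.0 * t / (1.0 + np.dot(t, t))
    v_prime = v_minus + np.cross(v_minus, t)  # rotation, step 1
    v_plus = v_minus + np.cross(v_prime, s)   # rotation, step 2
    return v_plus + 0.5 * q_m * E * dt        # second half electric kick
```

With a purely axial magnetic field, repeated calls rotate the transverse velocity while leaving the speed unchanged, which mirrors the guiding role the field plays in the paper.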
Adaptive Skin Meshes Coarsening for Biomolecular Simulation
Shi, Xinwei; Koehl, Patrice
2011-01-01
In this paper, we present efficient algorithms for generating hierarchical molecular skin meshes with decreasing size and guaranteed quality. Our algorithms generate a sequence of coarse meshes for both the surfaces and the bounded volumes. Each coarser surface mesh is adaptive to the surface curvature and maintains the topology of the skin surface with guaranteed mesh quality. The corresponding tetrahedral mesh conforms to the interface surface mesh and contains high-quality tetrahedra that decompose both the interior of the molecule and the surrounding region (enclosed in a sphere). Our hierarchical tetrahedral meshes have a number of advantages that will facilitate fast and accurate multigrid PDE solvers. Firstly, the quality of both the surface triangulations and tetrahedral meshes is guaranteed. Secondly, the interface in the tetrahedral mesh is an accurate approximation of the molecular boundary. In particular, all the boundary points lie on the skin surface. Thirdly, our meshes are Delaunay meshes. Finally, the meshes are adaptive to the geometry. PMID:21779137
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dawson, F.N. Jr.
The Dynacracking process developed by Hydrocarbon Research, Inc., is a non-catalytic process capable of upgrading heavy oil whose sulfur, metal, and carbon contents may be high. It converts residual stocks to distillates with high naphtha yields, and to synthetic fuel gas of high quality (700-800 Btu/ft³). It has essentially no air pollution emissions and requires a relatively small amount of water and utilities. The process generates sufficient heat internally such that, except for start-up, no boilers, furnaces, or external heaters are required to operate the plant. Several aspects of the process are discussed: chemistry, hardware, feedstock, flexibility in the product mix, product quality, and economics.
Weinstein, Dana; Bhave, Sunil A
2010-04-14
This paper introduces the resonant body transistor (RBT), a silicon-based dielectrically transduced nanoelectromechanical (NEM) resonator embedding a sense transistor directly into the resonator body. Combining the benefits of FET sensing with the frequency scaling capabilities and high quality factors (Q) of internal dielectrically transduced bar resonators, the resonant body transistor achieves >10 GHz frequencies and can be integrated into a standard CMOS process for on-chip clock generation, high-Q microwave circuits, fundamental quantum-state preparation and observation, and high-sensitivity measurements. An 11.7 GHz bulk-mode RBT is demonstrated with a quality factor Q of 1830, marking the highest frequency acoustic resonance measured to date on a silicon wafer.
3 CFR 8510 - Proclamation 8510 of April 29, 2010. National Charter Schools Week, 2010
Code of Federal Regulations, 2011 CFR
2011-01-01
... generation of leaders by reaching beyond standardized methods and promoting creative teaching strategies and... in teaching and learning at high quality charter schools and ensuring all our students have a chance...
WASTEWATER RENOVATION AND RETRIEVAL ON CAPE COD
A rapidly increasing population on maritime Cape Cod has generated considerable interest in alternative wastewater disposal techniques which promise to maintain high groundwater quality and promote its conservation. The authors undertake an assessment of agricultural spray-irriga...
Subjective evaluation of next-generation video compression algorithms: a case study
NASA Astrophysics Data System (ADS)
De Simone, Francesca; Goldmann, Lutz; Lee, Jong-Seok; Ebrahimi, Touradj; Baroncini, Vittorio
2010-08-01
This paper describes the details and the results of the subjective quality evaluation performed at EPFL as a contribution to the effort of the Joint Collaborative Team on Video Coding (JCT-VC) toward the definition of the next-generation video coding standard. The performance of 27 coding technologies has been evaluated with respect to two H.264/MPEG-4 AVC anchors, considering high-definition (HD) test material. The test campaign involved a total of 494 naive observers and took place over a period of four weeks. While similar tests have been conducted as part of the standardization process of previous video coding technologies, the test campaign described in this paper is by far the most extensive in the history of video coding standardization. The obtained subjective quality scores show high consistency and support an accurate comparison of the performance of the different coding solutions.
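Campaigns of this kind are usually summarized per test condition as a mean opinion score (MOS) with a confidence interval. A minimal sketch follows; the normal approximation and the sample ratings in the test are illustrative, not the paper's data:

```python
import numpy as np

def mos_with_ci(scores, z=1.96):
    """Mean opinion score and normal-approximation 95% CI.

    `scores` is a 1-D sequence of per-observer ratings for one
    test condition (e.g., on a 1-5 or 0-10 scale).
    """
    s = np.asarray(scores, dtype=float)
    mos = s.mean()
    half = z * s.std(ddof=1) / np.sqrt(len(s))  # half-width of the CI
    return mos, (mos - half, mos + half)
```

Non-overlapping intervals between two coding technologies are the usual basis for claiming a statistically meaningful quality difference.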
Wang, Yajun; Laughner, Jacob I.; Efimov, Igor R.; Zhang, Song
2013-01-01
This paper presents a two-frequency binary phase-shifting technique to measure the three-dimensional (3D) absolute shape of beating rabbit hearts. Due to the low contrast of the cardiac surface, the projector and the camera must remain focused, which poses challenges for existing binary methods, whose measurement accuracy is low under these conditions. To address this challenge, this paper proposes to utilize the optimal pulse width modulation (OPWM) technique to generate high-frequency fringe patterns, and the error-diffusion dithering technique to produce low-frequency fringe patterns. Furthermore, this paper shows that fringe patterns produced with blue light provide the best-quality measurements compared to fringe patterns generated with red or green light, and that the minimum data acquisition speed for high-quality measurements is around 800 Hz for a rabbit heart beating at 180 beats per minute. PMID:23482151
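Fringe-projection techniques like this one ultimately recover a wrapped phase map from N equally phase-shifted patterns. A minimal sketch of the standard N-step algorithm follows; the cosine fringe model is a generic assumption, not a detail taken from the paper:

```python
import numpy as np

def wrapped_phase(images):
    """Wrapped phase map from N equally phase-shifted fringe images.

    Assumes the standard fringe model I_k = A + B*cos(phi + 2*pi*k/N)
    for frames k = 0..N-1; `images` has shape (N, H, W).
    """
    imgs = np.asarray(images, dtype=float)
    n = imgs.shape[0]
    deltas = 2 * np.pi * np.arange(n) / n
    num = (imgs * np.sin(deltas)[:, None, None]).sum(axis=0)
    den = (imgs * np.cos(deltas)[:, None, None]).sum(axis=0)
    return -np.arctan2(num, den)  # wrapped to (-pi, pi]
```

A two-frequency scheme such as the paper's then combines a high- and a low-frequency wrapped map to unwrap the phase into absolute depth.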
Advanced Computational Methods for High-accuracy Refinement of Protein Low-quality Models
NASA Astrophysics Data System (ADS)
Zang, Tianwu
Predicting the three-dimensional structure of proteins has been a major interest in modern computational biology. While many successful methods can generate models with 3-5 Å root-mean-square deviation (RMSD) from the solution, progress in refining these models has been slow. It is therefore urgently necessary to develop effective methods to bring low-quality models into higher-accuracy ranges (e.g., less than 2 Å RMSD). In this thesis, I present several novel computational methods to address the high-accuracy refinement problem. First, an enhanced sampling method, named parallel continuous simulated tempering (PCST), is developed to accelerate molecular dynamics (MD) simulation. Second, two energy biasing methods, the Structure-Based Model (SBM) and the Ensemble-Based Model (EBM), are introduced to perform targeted sampling around important conformations. Third, a three-step method is developed to blindly select high-quality models along the MD simulation. These methods work together to achieve significant refinement of low-quality models without any knowledge of the solution. Their effectiveness is examined in different applications. Using the PCST-SBM method, models with higher global distance test scores (GDT_TS) are generated and selected in MD simulations of 18 targets from the refinement category of the 10th Critical Assessment of Structure Prediction (CASP10). In addition, a refinement test on two CASP10 targets using the PCST-EBM method indicates that EBM may bring the initial model to even higher-quality levels. Furthermore, a multi-round PCST-SBM refinement protocol improves the model quality of a protein to a level sufficiently high for molecular replacement in X-ray crystallography. Our results confirm the crucial role of enhanced sampling in protein structure prediction and demonstrate that considerable improvement of low-accuracy structures is still achievable with current force fields.
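The RMSD figures quoted above are conventionally computed after optimal rigid superposition of model and reference coordinates. A minimal Kabsch-algorithm sketch follows as a generic illustration; it is not the thesis's model-selection pipeline:

```python
import numpy as np

def kabsch_rmsd(P, Q):
    """RMSD between two (N, 3) coordinate sets after optimal rigid
    superposition (Kabsch algorithm)."""
    P = np.asarray(P, dtype=float)
    Q = np.asarray(Q, dtype=float)
    P = P - P.mean(axis=0)  # remove translation
    Q = Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T    # optimal rotation of P onto Q
    diff = P @ R.T - Q
    return float(np.sqrt((diff ** 2).sum() / len(P)))
```

For a model that differs from the reference only by a rotation and translation, the value is zero; refinement aims to push this number below the 2 Å threshold mentioned in the abstract.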
NASA Astrophysics Data System (ADS)
Iliev, Marin
Good pulse quality, high peak power and a tunable central wavelength are amongst the most desired qualities in modern lasers. The nonlinear effect of cross-polarized wave generation (XPW) can be used in ultrafast laser systems to achieve various pulse-quality enhancements. The XPW yield depends on the cube of the input intensity, so the process acts as a spatio-temporal filter. The XPW signal is polarized orthogonally to the input pulse and is highly Gaussian. If the input pulse is well compressed, the output spectrum is smoother and broader. These features make XPW an ideal reference signal in pulse characterization techniques. This thesis presents a detailed analysis of the XPW conversion process and describes novel applications to pulse characterization and high-quality pulse cleaning. An extensive computer model was developed to describe XPW generation via solution of the full coupled nonlinear differential equations. The model accounts for dispersion inside the nonlinear crystal and uses split-step Fourier-optics beam propagation to simulate the evolution of the electromagnetic fields of the pump and XPW through free space and imaging systems. A novel extension to the self-referenced spectral interferometry (SRSI) pulse characterization technique allows the retrieval of the energy and spectral content of the amplified spontaneous emission (ASE) present in ultrashort pulse amplifier systems. A novel double-pass XPW conversion scheme is presented, in which the beam passes through a single XPW crystal (BaF2) and is re-imaged with a curved mirror. The technique resulted in good (~30%) efficiency without the spatial aberrations commonly seen in another arrangement that uses two crystals in succession. The modeling sheds light on the complicated nonlinear beam dynamics of the double-crystal conversion, including self- and cross-phase modulation, self-focusing, and the effects of relative on-axis phase difference, relative beam sizes, and wave-front curvature matching on seeded XPW conversion.
Finally, a design is presented for exploiting the clean-up properties of XPW at the output of an optical parametric generation (OPA) setup in conjunction with an extremely compact prism compressor. The prisms' material, separation and geometry are designed carefully to work at the correct wavelength of the OPA setup, and the design is extrapolated to accommodate longer wavelengths, such as the 2 μm output of parametric wave generation.
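The split-step Fourier propagation the thesis's model relies on can be sketched in one dimension. The skeleton below uses a generic Kerr-type cubic nonlinearity rather than the full coupled pump/XPW equations, and all parameter values are illustrative:

```python
import numpy as np

def split_step(field, dz, steps, beta2, gamma, dt):
    """Symmetric split-step Fourier propagator in 1-D time.

    Each step applies half a dispersion step (beta2, in the frequency
    domain), a full Kerr-type nonlinear phase step (gamma, in the time
    domain), then the second dispersion half-step.
    """
    a = np.asarray(field, dtype=complex)
    w = 2 * np.pi * np.fft.fftfreq(a.size, d=dt)
    half_disp = np.exp(1j * (beta2 / 2.0) * w**2 * (dz / 2.0))
    for _ in range(steps):
        a = np.fft.ifft(half_disp * np.fft.fft(a))       # half dispersion
        a = a * np.exp(1j * gamma * np.abs(a)**2 * dz)   # nonlinear phase
        a = np.fft.ifft(half_disp * np.fft.fft(a))       # half dispersion
    return a
```

Both sub-steps are unitary, so the pulse energy is conserved; a full XPW model would replace the single nonlinear phase factor with coupled equations for the pump and cross-polarized fields.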
2011-01-01
Background Biodiesel or ethanol derived from lipids or starch produced by microalgae may overcome many of the sustainability challenges previously ascribed to petroleum-based fuels and first generation plant-based biofuels. The paucity of microalgae genome sequences, however, limits gene-based biofuel feedstock optimization studies. Here we describe the sequencing and de novo transcriptome assembly for the non-model microalgae species, Dunaliella tertiolecta, and identify pathways and genes of importance related to biofuel production. Results Next generation DNA pyrosequencing technology applied to D. tertiolecta transcripts produced 1,363,336 high quality reads with an average length of 400 bases. Following quality and size trimming, ~ 45% of the high quality reads were assembled into 33,307 isotigs with a 31-fold coverage and 376,482 singletons. Assembled sequences and singletons were subjected to BLAST similarity searches and annotated with Gene Ontology (GO) and Kyoto Encyclopedia of Genes and Genomes (KEGG) orthology (KO) identifiers. These analyses identified the majority of lipid and starch biosynthesis and catabolism pathways in D. tertiolecta. Conclusions The construction of metabolic pathways involved in the biosynthesis and catabolism of fatty acids, triacylglycerols, and starch in D. tertiolecta as well as the assembled transcriptome provide a foundation for the molecular genetics and functional genomics required to direct metabolic engineering efforts that seek to enhance the quantity and character of microalgae-based biofuel feedstock. PMID:21401935
Research in the Optical Sciences
1994-02-01
Topics include gain asymmetry and the generation of new frequencies when a stable coherent beam is injected into a VCSEL lasing just above threshold; development and testing of an optical microscope; growth of high-quality single-crystal beryllium layers on germanium by molecular beam epitaxy (MBE); and optical elements for X-UV wavelengths, for which measurements indicate an increase in crystalline quality as the deposition temperature T is increased.
A Secure Test Technique for Pipelined Advanced Encryption Standard
NASA Astrophysics Data System (ADS)
Shi, Youhua; Togawa, Nozomu; Yanagisawa, Masao; Ohtsuki, Tatsuo
In this paper, we present a Design-for-Secure-Test (DFST) technique for pipelined AES that guarantees both security and test quality during testing. Unlike previous works, the proposed method keeps all secrets inside the chip while providing high test quality and fault-diagnosis ability. Furthermore, the proposed DFST technique can significantly reduce test application time, test data volume, and test generation effort as additional benefits.
Littlefair, Joanne E; Knell, Robert J
2016-01-01
It is increasingly clear that parental environment can play an important role in determining offspring phenotype. These "transgenerational effects" have been linked to many different components of the environment, including toxin exposure, infection with pathogens and parasites, temperature and food quality. In this study, we focus on the latter, asking how variation in the quantity and quality of nutrition affects future generations. Previous studies have shown that artificial diets are a useful tool to examine the within-generation effects of variation in macronutrient content on life history traits, and could therefore be applied to investigations of the transgenerational effects of parental diet. Synthetic diets varying in total macronutrient content and protein: carbohydrate ratios were used to examine both within- and trans-generational effects on life history traits in a generalist stored product pest, the Indian meal moth Plodia interpunctella. The macronutrient composition of the diet was important for shaping within-generation life history traits, including pupal weight, adult weight, and phenoloxidase activity, and had indirect effects via maternal weight on fecundity. Despite these clear within-generation effects on the biology of P. interpunctella, diet composition had no transgenerational effects on the life history traits of offspring. P. interpunctella mothers were able to maintain their offspring quality, possibly at the expense of their own somatic condition, despite high variation in dietary macronutrient composition. This has important implications for the plastic biology of this successful generalist pest.
Transgenerational acclimatization in an herbivore–host plant relationship
Cahenzli, Fabian; Erhardt, Andreas
2013-01-01
Twenty years ago, scientists began to recognize that parental effects are one of the most important influences on progeny phenotype. Consequently, it was postulated that herbivorous insects could produce progeny that are acclimatized to the host plant experienced by the parents to improve progeny fitness, because host plants vary greatly in quality and quantity, and can thus provide important cues about the resources encountered by the next generation. However, despite the possible profound implications for our understanding of host-use evolution of herbivores, host-race formation and sympatric speciation, intense research has been unable to verify transgenerational acclimatization in herbivore–host plant relationships. We reared Coenonympha pamphilus larvae in the parental generation (P) on high- and low-quality host plants, and reared the offspring (F1) of both treatments again on high- and low-quality plants. We tested not only for maternal effects, as most previous studies, but also for paternal effects. Our results show that parents experiencing predictive cues on their host plant can indeed adjust progeny's phenotype to anticipated host plant quality. Maternal effects affected female and male offspring, whereas paternal effects affected only male progeny. We here verify, for the first time to our knowledge, the long postulated transgenerational acclimatization in an herbivore–host plant interaction. PMID:23407834
Alyusuf, Raja H.; Prasad, Kameshwar; Abdel Satir, Ali M.; Abalkhail, Ali A.; Arora, Roopa K.
2013-01-01
Background: The exponential use of the internet as a learning resource, coupled with the varied quality of many websites, leads to a need to identify suitable websites for teaching purposes. Aim: The aim of this study is to develop and validate a tool that evaluates the quality of undergraduate medical educational websites, and to apply it to the field of pathology. Methods: A tool was devised through several steps of item generation, reduction, weightage, pilot testing, post-pilot modification of the tool, and validation. Tool validation included measurement of inter-observer reliability and generation of criterion-related, construct-related and content-related validity. The validated tool was subsequently tested by applying it to a population of pathology websites. Results and Discussion: Reliability testing showed high internal consistency reliability (Cronbach's alpha = 0.92), high inter-observer reliability (Pearson's correlation r = 0.88), intraclass correlation coefficient = 0.85 and κ = 0.75. The tool showed high criterion-related, construct-related and content-related validity, and moderately high concordance with the gold standard (κ = 0.61): 92.2% sensitivity, 67.8% specificity, 75.6% positive predictive value and 88.9% negative predictive value. The validated tool was applied to 278 websites; 29.9% were rated as recommended, 41.0% as recommended with caution and 29.1% as not recommended. Conclusion: A systematic tool was devised to evaluate the quality of websites for medical educational purposes. The tool was shown to yield reliable and valid inferences through its application to pathology websites. PMID:24392243
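The screening statistics reported above all follow from a standard 2×2 table against the gold standard. A minimal sketch of the formulas follows; the counts in the test are hypothetical, chosen only to exercise the arithmetic, not the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard screening metrics from a 2x2 confusion table.

    tp/fp/fn/tn are counts of true positives, false positives,
    false negatives and true negatives against the gold standard.
    """
    return {
        "sensitivity": tp / (tp + fn),  # recall on true positives
        "specificity": tn / (tn + fp),  # recall on true negatives
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }
```

Sensitivity and specificity describe the tool itself, while PPV and NPV also depend on how common "recommendable" websites are in the evaluated population.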
Paolantonacci, Philippe; Appourchaux, Philippe; Claudel, Béatrice; Ollivier, Monique; Dennett, Richard; Siret, Laurent
2018-01-01
Polyvalent human normal immunoglobulins for intravenous use (IVIg), indicated for rare and often severe diseases, are complex plasma-derived protein preparations. A quality by design approach has been used to develop the Laboratoire Français du Fractionnement et des Biotechnologies new-generation IVIg, targeting a high level of purity to generate an enhanced safety profile while maintaining a high level of efficacy. A modular approach of quality by design was implemented consisting of five consecutive steps to cover all the stages from the product design to the final product control strategy. A well-defined target product profile was translated into 27 product quality attributes that formed the basis of the process design. In parallel, a product risk analysis was conducted and identified 19 critical quality attributes among the product quality attributes. Process risk analysis was carried out to establish the links between process parameters and critical quality attributes. Twelve critical steps were identified, and for each of these steps a risk mitigation plan was established. Among the different process risk mitigation exercises, five process robustness studies were conducted at qualified small scale with a design of experiment approach. For each process step, critical process parameters were identified and, for each critical process parameter, proven acceptable ranges were established.
The quality risk management and risk mitigation outputs, including verification of proven acceptable ranges, were used to design the process verification exercise at industrial scale. Finally, the control strategy was established using a mix, or hybrid, of the traditional approach and elements of the enhanced quality by design approach, to more robustly assign material and process controls and to securely meet product specifications. The advantages of this quality by design approach were improved process knowledge for industrial design and process validation, and a clear justification of the process and product specifications as a basis for the control strategy and future comparability exercises. © PDA, Inc. 2018.
Methods for slow axis beam quality improvement of high power broad area diode lasers
NASA Astrophysics Data System (ADS)
An, Haiyan; Xiong, Yihan; Jiang, Ching-Long J.; Schmidt, Berthold; Treusch, Georg
2014-03-01
For high-brightness direct diode laser systems, it is of fundamental importance to improve the slow-axis beam quality of the incorporated laser diodes, regardless of what beam-combining technology is applied. To further advance our products in terms of increased brightness at high power levels, we must optimize the slow-axis beam quality despite the far-field blooming at high current levels. The latter is caused predominantly by the built-in index step in combination with the thermal lens effect. Most of the methods for beam-quality improvement reported in publications sacrifice device efficiency and reliable output power. In order to improve the beam quality while maintaining efficiency and reliable output power, we investigated methods of influencing local heat generation to reduce the thermal gradient across the slow-axis direction, optimizing the built-in index step, and discriminating against high-order modes. Based on our findings, we have combined different methods in our new device design. As a result, the beam parameter product (BPP) of a 10% fill-factor bar has improved by approximately 30% at 7 W/emitter without an efficiency penalty. This technology has enabled fiber-coupled, high-brightness, multi-kilowatt direct diode laser systems. In this paper, we elaborate on the methods used as well as the results achieved.
An Attention-Information-Based Spatial Adaptation Framework for Browsing Videos via Mobile Devices
NASA Astrophysics Data System (ADS)
Li, Houqiang; Wang, Yi; Chen, Chang Wen
2007-12-01
With the growing popularity of personal digital assistant devices and smart phones, more and more consumers are becoming quite enthusiastic about viewing videos on mobile devices. However, the limited display size of mobile devices imposes significant barriers for users who wish to browse high-resolution videos. In this paper, we present an attention-information-based spatial adaptation framework to address this problem. The whole framework includes two major parts: video content generation and the video adaptation system. During video compression, the attention information in video sequences is detected using an attention model and embedded into bitstreams with the proposed supplemental enhancement information (SEI) structure. Furthermore, we also develop an innovative scheme to adaptively adjust quantization parameters in order to simultaneously improve the quality of overall encoding and the quality of transcoding the attention areas. When the high-resolution bitstream is transmitted to mobile users, a fast transcoding algorithm we developed earlier is applied to generate a new bitstream for attention areas in frames. The new low-resolution bitstream containing mostly attention information, instead of the high-resolution one, is sent to users for display on the mobile devices. Experimental results show that the proposed spatial adaptation scheme is able to improve both subjective and objective video qualities.
Flexible NO(x) abatement from power plants in the eastern United States.
Sun, Lin; Webster, Mort; McGaughey, Gary; McDonald-Buller, Elena C; Thompson, Tammy; Prinn, Ronald; Ellerman, A Denny; Allen, David T
2012-05-15
Emission controls that provide incentives for maximizing reductions in emissions of ozone precursors on days when ozone concentrations are highest have the potential to be cost-effective ozone management strategies. Conventional prescriptive emissions controls or cap-and-trade programs treat all emissions alike regardless of when they occur, despite the fact that contributions to ozone formation may vary. In contrast, a time-differentiated approach targets emissions reductions on forecasted high-ozone days without imposing additional costs on lower-ozone days. This work examines simulations of such dynamic air quality management strategies for NO(x) emissions from electric generating units (EGUs). Results from a model of day-specific NO(x) pricing applied to the Pennsylvania-New Jersey-Maryland (PJM) portion of the northeastern U.S. electrical grid demonstrate (i) that sufficient flexibility in electricity generation is available to allow power production to be switched from high to low NO(x) emitting facilities, (ii) that the emission prices required to induce EGUs to change their power generation strategies are competitive with other control costs, (iii) that dispatching strategies, which can change the spatial and temporal distribution of emissions, lead to ozone concentration reductions comparable to other control technologies, and (iv) that air quality forecasting is sufficiently accurate to allow EGUs to adapt their power generation strategies.
Analysis of Expressed Sequence Tags (EST) in Date Palm.
Al-Faifi, Sulieman A; Migdadi, Hussein M; Algamdi, Salem S; Khan, Mohammad Altaf; Al-Obeed, Rashid S; Ammar, Megahed H; Jakse, Jerenj
2017-01-01
Expressed sequence tags (EST) were generated from a normalized cDNA library of the date palm Sukkari cv. to understand the high-quality and better field performance of this well-known commercial cultivar. A total of 6943 high-quality ESTs were generated, out of them 6671 are submitted to the GenBank dbEST (LIBEST_028537). The generated ESTs were assembled into 6362 unigenes, consisting of 494 (14.4%) contigs and 5868 (84.53%) singletons. The functional annotation shows that the majority of the ESTs are associated with binding (44%), catalytic (40%), transporter (5%), and structural molecular (5%) activities. The blastx results show that 73% of unigenes are significantly similar to known plant genes and 27% are novel. The latter could be of particular interest in date palm genetic studies. Further analysis shows that some ESTs are categorized as stress/defense- and fruit development-related genes. These newly generated ESTs could significantly enhance date palm EST databases in the public domain and are available to scientists and researchers across the globe. This knowledge will facilitate the discovery of candidate genes that govern important developmental and agronomical traits in date palm. It will provide important resources for developing genetic tools, comparative genomics, and genome evolution among date palm cultivars.
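Assembly summaries like the contig/singleton breakdown above are often accompanied by an N50 statistic over the assembled sequence lengths. A minimal sketch follows; the length lists in the test are illustrative, not the study's data:

```python
def n50(lengths):
    """N50 of an assembly: the length L such that sequences of
    length >= L together cover at least half of the total bases."""
    total = sum(lengths)
    covered = 0
    for length in sorted(lengths, reverse=True):
        covered += length
        if covered * 2 >= total:
            return length
```

A higher N50 indicates that more of the assembled bases sit in long contigs, which generally makes downstream annotation (such as the GO/KEGG assignments described above) more reliable.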
Wu, Grace C; Torn, Margaret S; Williams, James H
2015-02-17
The land-use implications of deep decarbonization of the electricity sector (e.g., 80% below 1990 emissions) have not been well-characterized quantitatively or spatially. We assessed the operational-phase land-use requirements of different low-carbon scenarios for California in 2050 and found that most scenarios have comparable direct land footprints. While the per MWh footprint of renewable energy (RE) generation is initially higher, that of fossil and nuclear generation increases over time with continued fuel use. We built a spatially explicit model to understand the interactions between resource quality and environmental constraints in a high RE scenario (>70% of total generation). We found that there is sufficient land within California to meet the solar and geothermal targets, but areas with the highest quality wind and solar resources also tend to be those with high conservation value. Development of some land with lower conservation value results in lower average capacity factors, but also provides opportunity for colocation of different generation technologies, which could significantly improve land-use efficiency and reduce permitting, leasing, and transmission infrastructure costs. Basing siting decisions on environmentally-constrained long-term RE build-out requirements produces significantly different results, including better conservation outcomes, than implied by the current piecemeal approach to planning.
Best Practices in Overset Grid Generation
NASA Technical Reports Server (NTRS)
Gomez, Reynaldo J., III
2002-01-01
Accurate geometry and high-quality grids are necessary for an accurate solution. Other requirements include: a) a verified/validated solver with appropriate physics; b) convergence criteria consistent with the application: 1) aerodynamics - forces and moments; 2) heat transfer - maximum and minimum heat transfer coefficients.
Kresse, Stine H; Namløs, Heidi M; Lorenz, Susanne; Berner, Jeanne-Marie; Myklebost, Ola; Bjerkehagen, Bodil; Meza-Zepeda, Leonardo A
2018-01-01
Nucleic acid material of adequate quality is crucial for successful high-throughput sequencing (HTS) analysis. DNA and RNA isolated from archival formalin-fixed, paraffin-embedded (FFPE) material are frequently degraded and not readily amplifiable due to chemical damage introduced during fixation. To identify optimal nucleic acid extraction kits, DNA and RNA quantity, quality and performance in HTS applications were evaluated. DNA and RNA were isolated from five sarcoma archival FFPE blocks, using eight extraction protocols from seven kits from three different commercial vendors. For DNA extraction, the truXTRAC FFPE DNA kit from Covaris gave higher yields and better amplifiable DNA, but all protocols gave comparable HTS library yields using Agilent SureSelect XT and performed well in downstream variant calling. For RNA extraction, all protocols gave comparable yields and amplifiable RNA. However, for fusion gene detection using the Archer FusionPlex Sarcoma Assay, the truXTRAC FFPE RNA kit from Covaris and the Agencourt FormaPure kit from Beckman Coulter showed the highest percentage of unique read pairs, providing higher complexity of HTS data and more frequent detection of recurrent fusion genes. truXTRAC simultaneous DNA and RNA extraction gave outputs similar to the individual protocols. These findings show that although successful HTS libraries could be generated in most cases, the different protocols gave variable quantity and quality of extracted FFPE nucleic acids. Selecting the optimal procedure is highly valuable and may generate results in borderline-quality specimens.
The effect of manager exclusion on nurse turnover intention and care quality.
Cottingham, Marci D; Erickson, Rebecca J; Diefendorff, James M; Bromley, Gail
2013-09-01
Little is known about how exclusionary practices (i.e., ignored, ostracized) by managers differ across demographics and influence nursing outcomes. This study examines whether managerial exclusion varies by generation, race, and gender, and the extent to which these variables, in turn, relate to turnover intention and perceived patient care among a sample of 747 nurses working in hospitals in a midwestern health system. Exclusion did not differ across most demographic groups, though men reported less exclusion than women. Younger nurses of the Millennial generation, those feeling excluded, and those with fewer years of experience reported lower quality patient care. Managerial exclusion, being a nurse of color, and less experience were associated with stronger intentions to leave. Nursing leaders should attend to factors that may contribute to racial minorities seeking other jobs, diminish younger nurses' ability to provide high-quality care, and minimize practices that might lead nurses to feel excluded.
Installation of new Generation General Purpose Computer (GPC) compact unit
NASA Technical Reports Server (NTRS)
1991-01-01
In the Kennedy Space Center's (KSC's) Orbiter Processing Facility (OPF) high bay 2, Spacecraft Electronics technician Ed Carter (right), wearing clean suit, prepares for (26864) and installs (26865) the new Generation General Purpose Computer (GPC) compact IBM unit in Atlantis', Orbiter Vehicle (OV) 104's, middeck avionics bay as Orbiter Systems Quality Control technician Doug Snider looks on. Both men work for NASA contractor Lockheed Space Operations Company. All three orbiters are being outfitted with the compact IBM unit, which replaces a two-unit earlier generation computer.
Analysis of Illumina Microbial Assemblies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clum, Alicia; Foster, Brian; Froula, Jeff
2010-05-28
Since the emergence of second-generation sequencing technologies, the evaluation of different sequencing approaches and their assembly strategies for different types of genomes has become an important undertaking. Next-generation sequencing technologies dramatically increase sequence throughput while decreasing cost, making them an attractive tool for whole-genome shotgun sequencing. To compare different approaches for de novo whole-genome assembly, appropriate tools and a solid understanding of both the quantity and quality of the underlying sequence data are crucial. Here, we performed an in-depth analysis of short-read Illumina sequence assembly strategies for bacterial and archaeal genomes. Different types of Illumina libraries as well as different trim parameters and assemblers were evaluated. Results of the comparative analysis and sequencing platforms will be presented. The goal of this analysis is to develop a cost-effective approach for increased throughput in the generation of high-quality microbial genomes.
A high-fidelity weather time series generator using the Markov Chain process on a piecewise level
NASA Astrophysics Data System (ADS)
Hersvik, K.; Endrerud, O.-E. V.
2017-12-01
A method is developed for generating a set of unique weather time series based on an existing weather series. The method allows statistically valid weather variations to take place within repeated simulations of offshore operations. The numerous generated time series need to share the same statistical qualities as the original time series; statistical qualities here refer mainly to the distribution of weather windows available for work, including the durations and frequencies of such weather windows, and seasonal characteristics. The method is based on the Markov chain process. The core new development lies in how the Markov process is used: small pieces of random-length time series are joined together, rather than joining individual weather states, each from a single time step, which is the common solution found in the literature. This new Markov model shows favorable characteristics with respect to the requirements set forth and all aspects of the validation performed.
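The piecewise joining idea can be sketched in a few lines. The sketch below is a toy illustration, not the authors' implementation: it assumes the weather series is a single variable (say, significant wave height) discretized into fixed-width bins to define Markov states, and it joins a random-length piece whenever the state at the jump target matches the state of the last emitted value. The function names and bin width are invented for the example.

```python
import random

def discretize(value, bin_width=0.5):
    """Map a continuous weather value (e.g. wave height in metres) to a state."""
    return int(value // bin_width)

def generate_series(original, length, min_piece=6, max_piece=48, seed=0, bin_width=0.5):
    """Piecewise Markov resampling: emit random-length pieces of the original
    series and, after each piece, jump to a position whose discretized state
    matches the last emitted value, then continue copying from there."""
    rng = random.Random(seed)
    by_state = {}                      # state -> list of positions in that state
    for i, v in enumerate(original):
        by_state.setdefault(discretize(v, bin_width), []).append(i)
    out = []
    pos = rng.randrange(len(original))
    while len(out) < length:
        piece_len = rng.randint(min_piece, max_piece)
        out.extend(original[pos:pos + piece_len])   # copy one random-length piece
        state = discretize(out[-1], bin_width)
        pos = rng.choice(by_state[state])           # state-matched jump
    return out[:length]
```

Because every emitted value is copied from the original series, the synthetic series inherits the original's marginal distribution while varying the sequence of weather windows from run to run.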
Baby boomers' use and perception of recommended assistive technology: a systematic review.
Steel, Dianne M; Gray, Marion A
2009-05-01
The objective of this article is to review published studies to describe issues and quality of evidence surrounding assistive technology (AT) use by the baby boomer generation. As the baby boomer generation are ageing, they represent a new era for aged health care. In terms of helping this generation maintain independence, it is expected that there will be an increased demand for AT. A systematic literature search of Medline, CINAHL and Cochrane was undertaken. Selected studies were critically appraised using a previously validated tool. Inclusion criteria were: research related to AT use by a population which includes baby boomers; published in peer-reviewed journals and full-text English language articles. Studies were based in acute rehabilitation units in the USA and Australia. Frequency of use and patient satisfaction surveys were the main outcome measures. A total of 11 eligible studies were reviewed. All were cross-sectional. Many studies indicated a significant rate of AT non-use; use rates ranged from 35% to 86.5%. Numerous factors influencing use were proposed. Study quality was upper-mid range. Baby boomers will place more demand on AT in the future. There is a need for high-quality research to verify current findings and highlight AT issues specific to this generation.
A Stirling engine for use with lower quality fuels
NASA Astrophysics Data System (ADS)
Paul, Christopher J.
There is increasing interest in using renewable fuels from biomass, or alternative fuels such as municipal waste, to reduce the need for fossil-based fuels. Due to their lower heating values and higher levels of impurities, small-scale electricity generation from such fuels is more problematic. Currently, there are not many technologically mature options for small-scale electricity generation using lower quality fuels. Even though there are few manufacturers of Stirling engines, their two centuries of development history offer significant guidance in developing a viable small-scale generator set using lower quality fuels. The history, development, and modeling of Stirling engines were reviewed to identify possible model and engine configurations. A Stirling engine model based on the finite-volume, ideal adiabatic model was developed. Flow dissipation losses are shown to need correcting, as they increase significantly at low mean engine pressure and high engine speed. A model of the complete engine, including external components, was developed. A simple yet effective method of evaluating the external heat transfer to the Stirling engine was created that can be used with any second-order Stirling engine model. A derivative of the General Motors Ground Power Unit 3 was designed. By significantly increasing heater, cooler and regenerator size at the expense of increased dead volume, and adding combustion gas recirculation, a generator set with good efficiency was designed.
An improved multi-exposure approach for high quality holographic femtosecond laser patterning
NASA Astrophysics Data System (ADS)
Zhang, Chenchu; Hu, Yanlei; Li, Jiawen; Lao, Zhaoxin; Ni, Jincheng; Chu, Jiaru; Huang, Wenhao; Wu, Dong
2014-12-01
High efficiency two-photon polymerization through single exposure via a spatial light modulator (SLM) has been used to decrease fabrication time and rapidly realize various micro/nanostructures, but surface quality remains a significant problem due to the speckle noise of the optical intensity distribution at the defocused plane. Here, a multi-exposure approach, in which tens of computer-generated holograms are successively loaded on the SLM, is presented to significantly improve optical uniformity without losing efficiency. By applying multi-exposure, we found that the uniformity at the defocused plane increased from ˜0.02 to ˜0.6 according to our simulation. Two series of letters, "HELLO" and "USTC", fabricated under single- and multi-exposure in our experiment also verified that the surface quality was greatly improved. Moreover, by this method, several kinds of high-quality beam splitters, e.g., 2 × 2 and 5 × 5 Dammann gratings and complex non-separable 5 × 5 gratings, were fabricated with both high quality and short fabrication time (<1 min, a 95% time saving). This multi-exposure SLM two-photon polymerization method shows promise for rapidly fabricating and integrating various binary optical devices and their systems.
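The benefit of averaging many exposures can be illustrated with a toy simulation: averaging N independent, fully developed speckle patterns reduces the relative intensity fluctuation roughly as 1/√N, so a uniformity metric rises with the number of exposures. This is a sketch of the statistical effect only; the exponential speckle model, the uniformity metric, and all names below are assumptions of the example, not the paper's simulation.

```python
import random

def speckle(n_pixels, rng):
    """Toy fully developed speckle: negative-exponential intensity statistics."""
    return [rng.expovariate(1.0) for _ in range(n_pixels)]

def uniformity(pattern):
    """A simple uniformity metric: 1 - (std / mean); 1.0 means perfectly flat."""
    mean = sum(pattern) / len(pattern)
    var = sum((v - mean) ** 2 for v in pattern) / len(pattern)
    return 1.0 - (var ** 0.5) / mean

def multi_exposure(n_exposures, n_pixels=2000, seed=0):
    """Average n independent speckle patterns, mimicking successive holograms."""
    rng = random.Random(seed)
    acc = [0.0] * n_pixels
    for _ in range(n_exposures):
        for i, v in enumerate(speckle(n_pixels, rng)):
            acc[i] += v
    return [v / n_exposures for v in acc]
```

A single exposure gives a uniformity near 0 (speckle contrast close to 1), while averaging a few tens of independent exposures pushes the metric toward 1, qualitatively matching the reported ˜0.02 to ˜0.6 improvement.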
Work of PZT ceramics sounder for sound source artificial larynx
NASA Astrophysics Data System (ADS)
Sugio, Yuuichi; Kanetake, Ryota; Tanaka, Akimitsu; Ooe, Katsutoshi
2007-04-01
We aim to develop an easy-to-use artificial larynx with high tone quality. We focus on a PZT ceramic sounder as its sound source because it is small, has low power consumption, and is harmless to humans. Conventional PZT ceramic sounders, however, cannot generate sufficient sound in the low frequency range and therefore cannot be used for an artificial larynx. We thus aim to develop a PZT ceramic sounder that can generate sufficient volume at low frequencies: if the resonance frequency of the sounder can be lowered, it can generate low-pitched sound easily. We therefore created a new diaphragm with a low resonance frequency and, in addition, obtained high amplitude by changing the driving method. Here we report a comparison of the characteristics of this new PZT ceramic sounder with a conventional one. Furthermore, for the new sounder we analyzed the optimal arrangement of the PZT ceramics and the shape of the diaphragm to obtain a low resonance frequency and large amplitude, i.e., we optimized the structure. The analysis was done by computer simulation in ANSYS and with a laser Doppler vibrometer. In the future, we will add intonation to the generated sound via an input waveform being developed concurrently, and implant the sounder inside the body using a method, also under development, for fixing metal to biomolecules; a high-tone-quality, convenient artificial larynx will then be complete.
How Are Questions That Students Ask in High Level Mathematics Classes Linked to General Giftedness?
ERIC Educational Resources Information Center
Leikin, Roza; Koichu, Boris; Berman, Avi; Dinur, Sariga
2017-01-01
This paper presents a part of a larger study, in which we asked "How are learning and teaching of mathematics at high level linked to students' general giftedness?" We consider asking questions, especially student-generated questions, as indicators of quality of instructional interactions. In the part of the study presented in this…
ERIC Educational Resources Information Center
Lawrence, Nancy; Sanders, Felicia; Christman, Jolley Bruce; Duffy, Mark
2011-01-01
The Bill and Melinda Gates Foundation has invested in the development and dissemination of high-quality formative assessment tools to support teachers' incorporation of the Common Core State Standards (CCSS) into their classroom instruction. Lessons from the first generation of standards-based reforms suggest that intense attention to high quality…
Pulse - Accelerator Science in Medicine
Many of medicine's most powerful diagnostic tools for imaging the human body incorporate accelerator technology. Magnetic resonance imaging (MRI) is a technique used to produce high-quality images of the inside of the human body; a new generation of high-field superconducting MRI magnets will help unlock the secrets of the human body.
NASA Astrophysics Data System (ADS)
Pua, Rizza; Park, Miran; Wi, Sunhee; Cho, Seungryong
2016-12-01
We propose a hybrid metal artifact reduction (MAR) approach for computed tomography (CT) that is computationally more efficient than a fully iterative reconstruction method, while achieving image quality superior to interpolation-based in-painting techniques. Our proposed MAR method, an image-based artifact subtraction approach, utilizes an intermediate prior image reconstructed via PDART to recover the background information underlying the high-density objects. For comparison, prior images generated by a total-variation minimization (TVM) algorithm, as a realization of the fully iterative approach, were also utilized as intermediate images. Simulation and real experimental results show that PDART drastically accelerates the reconstruction of prior images of acceptable quality. Incorporating PDART-reconstructed prior images in the proposed MAR scheme achieved higher quality images than a conventional in-painting method; furthermore, the results were comparable to fully iterative MAR using high-quality TVM prior images.
Assembly of cucumber (Cucumis sativus L.) somaclones
NASA Astrophysics Data System (ADS)
Skarzyńska, Agnieszka; Kuśmirek, Wiktor; Pawełkowicz, Magdalena; Pląder, Wojciech; Nowak, Robert M.
2017-08-01
The development of next-generation sequencing opens the possibility of using sequencing in various plant studies, such as finding structural changes and small polymorphisms between species and within them. Most analyses rely on genomic sequences, so it is crucial to use well-assembled genomes of high quality and completeness. Herein we compare commonly available genome assembly programs with newly developed software, dnaasm. Assemblies were tested on cucumber (Cucumis sativus L.) lines obtained by in vitro regeneration (somaclones), showing different phenotypes. The obtained results show that the dnaasm assembler is a good tool for short-read assembly, allowing genomes of high quality and completeness to be obtained.
Naval Research Lab Review 1999
1999-01-01
The Center offers high-quality output from computer-generated files in EPS, PostScript, PICT, TIFF, Photoshop, and PowerPoint formats, as well as photographic-quality color printing. Additional information about the research described in this NRL Review can be obtained from the Public Affairs Office, Code 1230, (202) 767-2541.
ATACseqQC: a Bioconductor package for post-alignment quality assessment of ATAC-seq data.
Ou, Jianhong; Liu, Haibo; Yu, Jun; Kelliher, Michelle A; Castilla, Lucio H; Lawson, Nathan D; Zhu, Lihua Julie
2018-03-01
ATAC-seq (Assay for Transposase-Accessible Chromatin using sequencing) is a recently developed technique for genome-wide analysis of chromatin accessibility. Compared to earlier methods for assaying chromatin accessibility, ATAC-seq is faster and easier to perform, does not require cross-linking, has a higher signal-to-noise ratio, and can be performed on small cell numbers. However, to ensure a successful ATAC-seq experiment, step-by-step quality assurance processes, including both wet lab quality control and in silico quality assessment, are essential. While several tools have been developed or adopted for assessing read quality, identifying nucleosome occupancy and accessible regions from ATAC-seq data, none of the tools provide a comprehensive set of functionalities for preprocessing and quality assessment of aligned ATAC-seq datasets. We have developed a Bioconductor package, ATACseqQC, for easily generating various diagnostic plots to help researchers quickly assess the quality of their ATAC-seq data. In addition, this package contains functions to preprocess aligned ATAC-seq data for subsequent peak calling. Here we demonstrate the utilities of our package using 25 publicly available ATAC-seq datasets from four studies. We also provide guidelines on what the diagnostic plots should look like for an ideal ATAC-seq dataset. This software package has been used successfully for preprocessing and assessing several in-house and public ATAC-seq datasets. Diagnostic plots generated by this package will facilitate the quality assessment of ATAC-seq data, and help researchers to evaluate their own ATAC-seq experiments as well as select high-quality ATAC-seq datasets from public repositories such as GEO to avoid generating hypotheses or drawing conclusions from low-quality ATAC-seq experiments. The software, source code, and documentation are freely available as a Bioconductor package at https://bioconductor.org/packages/release/bioc/html/ATACseqQC.html.
GENERATING HIGH QUALITY IMPERVIOUS COVER DATA
Nonpoint source pollution (NPS) from urban/ suburban areas is rapidly increasing as the population increases in the United States. Research in recent years has consistently shown a strong relationship between the percentage of impervious cover in a drainage basin and the health...
NASA Astrophysics Data System (ADS)
Chakrabarti, Debalay; Chakrabarti, Ajit Kumar; Roy, Sanat Kumar
2018-05-01
The causes of defect generation in Ag-7.5 wt% Cu coinage alloy billets and in rolled and polished blanks are evaluated in this paper. Microstructural and compositional study of the as-cast billets indicated that excessive formation of gas porosity and shrinkage cavities was responsible for crack formation during rolling. Carbon pick-up from the charcoal flux cover used during melting, formation of CuS inclusions due to high sulfur content, and rapid work-hardening also contributed to cracking during rolling. Several measures adopted to prevent defect generation significantly reduced defects and improved the surface luster of the trial rolled strips.
Generation of cylindrically polarized vector vortex beams with digital micromirror device
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gong, Lei; Liu, Weiwei; Wang, Meng
We propose a novel technique to directly transform a linearly polarized Gaussian beam into vector-vortex beams with various spatial patterns. Full high-quality control of amplitude and phase is implemented via Digital Micro-mirror Device (DMD) binary holography for generating Laguerre-Gaussian, Bessel-Gaussian, and helical Mathieu-Gaussian modes, while a radial polarization converter (S-waveplate) is employed to effectively convert the optical vortices into cylindrically polarized vortex beams. Additionally, the generated vector-vortex beams maintain their polarization symmetry after arbitrary polarization manipulation. Due to the high frame rates of the DMD, rapid switching among a series of vector modes carrying different orbital angular momenta paves the way for optical microscopy, trapping, and communication.
Riesgo, Ana; Pérez-Porro, Alicia R; Carmona, Susana; Leys, Sally P; Giribet, Gonzalo
2012-03-01
Transcriptome sequencing with next-generation sequencing technologies has the potential for addressing many long-standing questions about the biology of sponges. Transcriptome sequence quality depends on good cDNA libraries, which require high-quality mRNA. Standard protocols for preserving and isolating mRNA often require optimization for unusual tissue types. Our aim was to assess the efficiency of two preservation modes, (i) flash freezing with liquid nitrogen (LN₂) and (ii) immersion in RNAlater, for the recovery of high-quality mRNA from sponge tissues. We also tested whether long-term storage of samples at -80 °C affects the quantity and quality of mRNA. We extracted mRNA from nine sponge species and analysed the quantity and quality (A260/230 and A260/280 ratios) of mRNA according to preservation method, storage time, and taxonomy. The quantity and quality of mRNA depended significantly on the preservation method used (LN₂ outperforming RNAlater), the sponge species, and the interaction between them. When preservation was analysed in combination with either storage time or species, the quantity and A260/230 ratio were both significantly higher for LN₂-preserved samples. Interestingly, individual comparisons for each preservation method over time indicated that both methods performed equally efficiently during the first month, but RNAlater lost efficiency at storage times longer than 2 months compared with flash-frozen samples. In summary, we find that for long-term preservation of samples, flash freezing is the preferred method. If LN₂ is not available, RNAlater can be used, but mRNA extraction during the first month of storage is advised. © 2011 Blackwell Publishing Ltd.
Zeng, G; Murphy, J; Annis, S-L; Wu, X; Wang, Y; McGowan, T; Macpherson, M
2012-07-01
To report a quality control program in prostate radiation therapy at our center that includes a semi-automated planning process to generate high-quality plans and in-house software to track plan quality in subsequent clinical application. Arc planning in Eclipse v10.0 was performed for both intact-prostate and post-prostatectomy treatments. The planning focuses on DVH requirements and on dose distributions able to tolerate daily setup variations. A modified structure set is used to standardize the optimization, including short rectum and bladder structures in the fields to effectively tighten dose to the target, and a rectum expansion with 1 cm cropped from the PTV to block dose and shape the posterior isodose lines. Structure, plan and optimization templates are used to streamline plan generation. DVH files are exported from Eclipse to quality tracking software with a GUI written in Matlab that can report the dose-volume data either for an individual patient or over a patient population. For 100 intact-prostate patients treated with 78Gy, rectal D50, D25, D15 and D5 are 30.1±6.2Gy, 50.6±7.9Gy, 65.9±6.0Gy and 76.6±1.4Gy respectively, well below the respective limits of 50Gy, 65Gy, 75Gy and 78Gy. For prostate bed treatments with a prescription of 66Gy, rectal D50 is 35.9±6.9Gy. In both sites, the PTV is covered by 95% of the prescription dose and the hotspots are less than 5%. The semi-automated planning method can efficiently create high-quality plans while the tracking software monitors feedback from clinical application. Together they form a comprehensive and robust quality control program in radiation therapy. © 2012 American Association of Physicists in Medicine.
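The rectal metrics reported above (D50, D25, D15, D5) are dose-at-volume values read off a DVH: D_x is the minimum dose received by the hottest x% of the structure. A minimal sketch of such a check, assuming differential DVH input and a hypothetical `dose_at_volume` helper (not the authors' Matlab tool), might look like:

```python
def dose_at_volume(dose_bins, volumes, percent):
    """D_x: the minimum dose received by the hottest x% of the structure,
    computed from differential DVH data (dose_bins in Gy, volumes in cc)."""
    total = sum(volumes)
    target = total * percent / 100.0
    accum = 0.0
    # Walk from the highest dose bin downward until x% of the volume is covered.
    for dose, vol in sorted(zip(dose_bins, volumes), reverse=True):
        accum += vol
        if accum >= target:
            return dose
    return min(dose_bins)
```

A tracking tool would evaluate this for each patient's exported DVH and compare the result against the clinical limit (e.g. rectal D50 below 50 Gy for the 78 Gy prescription).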
Information retrieval based on single-pixel optical imaging with quick-response code
NASA Astrophysics Data System (ADS)
Xiao, Yin; Chen, Wen
2018-04-01
The quick-response (QR) code technique is combined with ghost imaging (GI) to recover original information with high quality. An image is first transformed into a QR code, and the QR code is then treated as the input image in the input plane of a ghost imaging setup. After measurements, the traditional correlation algorithm of ghost imaging is used to reconstruct an image (in QR code form) of low quality. With this low-quality image as an initial guess, a Gerchberg-Saxton-like algorithm is used to improve its contrast as a post-processing step. Taking advantage of the high error-correction capability of QR codes, the original information can be recovered with high quality. Compared to the previous method, our method can obtain a high-quality image with comparatively fewer measurements, which means that the time-consuming post-processing procedure can be avoided to some extent. In addition, for conventional ghost imaging, the larger the image size is, the more measurements are needed; for our method, however, images of different sizes can be converted into QR codes of the same small size by using a QR generator. Hence, for larger images, the time required to recover the original information with high quality is dramatically reduced. Our method also makes it easy to recover a color image in a ghost imaging setup, because it is not necessary to divide the color image into three channels and recover them separately.
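The "traditional correlation algorithm of ghost imaging" used for the low-quality first pass can be sketched in one dimension. The toy below correlates bucket-detector intensities with random illumination patterns, G(x) = <I*S(x)> - <I><S(x)>; the QR encoding/decoding and the Gerchberg-Saxton-like refinement are outside this sketch, and all names are invented for illustration.

```python
import random

def ghost_image(obj, n_patterns=3000, seed=1):
    """Traditional correlation ghost imaging: illuminate a (flattened, binary)
    object with random patterns, record the total transmitted intensity for
    each, then correlate intensities with the patterns:
        G(x) = <I * S(x)> - <I><S(x)>."""
    rng = random.Random(seed)
    n = len(obj)
    sum_i = 0.0
    sum_s = [0.0] * n
    sum_is = [0.0] * n
    for _ in range(n_patterns):
        s = [rng.random() for _ in range(n)]               # random speckle pattern
        i_bucket = sum(sv * ov for sv, ov in zip(s, obj))  # bucket detector reading
        sum_i += i_bucket
        for x in range(n):
            sum_s[x] += s[x]
            sum_is[x] += i_bucket * s[x]
    m = float(n_patterns)
    return [sum_is[x] / m - (sum_i / m) * (sum_s[x] / m) for x in range(n)]
```

Thresholding the correlation map recovers the binary object (here standing in for one row of QR modules); in the paper this noisy estimate is then cleaned up by the iterative phase-retrieval step and the QR code's error correction.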
Computer numeric control generation of toric surfaces
NASA Astrophysics Data System (ADS)
Bradley, Norman D.; Ball, Gary A.; Keller, John R.
1994-05-01
Until recently, the manufacture of toric ophthalmic lenses relied largely upon expensive, manual techniques for generation and polishing. Recent gains in computer numeric control (CNC) technology and tooling enable lens designers to employ single-point diamond, fly-cutting methods in the production of torics. Fly-cutting methods continue to improve, significantly expanding lens design possibilities while lowering production costs. Advantages of CNC fly cutting include precise control of surface geometry, rapid production with high throughput, and high-quality lens surface finishes requiring minimal polishing. As accessibility and affordability increase within the ophthalmic market, torics promise to dramatically expand the lens design choices available to consumers.
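The surface geometry a fly-cutting program must realize can be illustrated with the paraxial sag of a toric surface, which has different radii of curvature in two perpendicular meridians. This is a textbook approximation under assumed values (a refractive index of 1.53 and invented function names), not any CNC vendor's formulation.

```python
def sag_paraxial(x_mm, y_mm, rx_mm, ry_mm):
    """Paraxial sag of a toric surface with principal radii rx, ry (mm):
    z ~ x^2/(2*rx) + y^2/(2*ry). A CNC tool path would sample this (or the
    exact meridional expressions) on the machine's coordinate grid."""
    return x_mm ** 2 / (2.0 * rx_mm) + y_mm ** 2 / (2.0 * ry_mm)

def surface_power_diopters(r_mm, n_index=1.53):
    """Surface power F = (n - 1) / R, with R converted from mm to metres."""
    return (n_index - 1.0) * 1000.0 / r_mm
```

The difference between the two principal surface powers gives the cylinder of the toric surface, which is what distinguishes it from a simple sphere on the lathe.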
High Current Density Cathodes for Future Vacuum Electronics Applications
2008-05-30
Tube: a device for generating high levels of RF power; DARPA: Defense Advanced Research Projects Agency; PBG: photonic band gap; W-band: 75-111 GHz; dB: decibels; GHz... Extended interaction klystron. 1. Introduction. All RF vacuum electron sources require a high-quality electron beam for efficient operation. Research on...with long life. Presently, only thermionic dispenser cathodes are practical for high-power RF sources. Typical thermionic cathodes consist of a
A Generator-Produced Gallium-68 Radiopharmaceutical for PET Imaging of Myocardial Perfusion
Sharma, Vijay; Sivapackiam, Jothilingam; Harpstrite, Scott E.; Prior, Julie L.; Gu, Hannah; Rath, Nigam P.; Piwnica-Worms, David
2014-01-01
Lipophilic cationic technetium-99m complexes are widely used for myocardial perfusion imaging (MPI). However, inherent uncertainties in the supply chain of molybdenum-99, the parent isotope required for manufacturing 99Mo/99mTc generators, intensify the need for discovery of novel MPI agents incorporating alternative radionuclides. Recently, germanium/gallium (68Ge/68Ga) generators capable of producing high-quality 68Ga, an isotope with excellent emission characteristics for clinical PET imaging, have emerged. Herein, we report a novel 68Ga complex, identified through mechanism-based cell screening, that holds promise as a generator-produced radiopharmaceutical for PET MPI. PMID:25353349
Deshmukh, Nishikant P; Kang, Hyun Jae; Billings, Seth D; Taylor, Russell H; Hager, Gregory D; Boctor, Emad M
2014-01-01
A system for real-time ultrasound (US) elastography will advance interventions for the diagnosis and treatment of cancer by advancing methods such as thermal monitoring of tissue ablation. A multi-stream graphics processing unit (GPU) based accelerated normalized cross-correlation (NCC) elastography, with a maximum frame rate of 78 frames per second, is presented in this paper. A study of NCC window size is undertaken to determine the effect on frame rate and the quality of output elastography images. This paper also presents a novel system for Online Tracked Ultrasound Elastography (O-TRuE), which extends prior work on an offline method. By tracking the US probe with an electromagnetic (EM) tracker, the system selects in-plane radio frequency (RF) data frames for generating high quality elastograms. A novel method for evaluating the quality of an elastography output stream is presented, suggesting that O-TRuE generates more stable elastograms than generated by untracked, free-hand palpation. Since EM tracking cannot be used in all systems, an integration of real-time elastography and the da Vinci Surgical System is presented and evaluated for elastography stream quality based on our metric. The da Vinci surgical robot is outfitted with a laparoscopic US probe, and palpation motions are autonomously generated by customized software. It is found that a stable output stream can be achieved, which is affected by both the frequency and amplitude of palpation. The GPU framework is validated using data from in-vivo pig liver ablation; the generated elastography images identify the ablated region, outlined more clearly than in the corresponding B-mode US images.
NASA Astrophysics Data System (ADS)
Nakatsuji, Noriaki; Matsushima, Kyoji
2017-03-01
Full-parallax high-definition CGHs composed of more than a billion pixels have so far been created only by the polygon-based method because of its high performance. However, GPUs now allow us to generate CGHs much faster using the point-cloud method. In this paper, we measure the computation time of object fields for full-parallax high-definition CGHs composed of 4 billion pixels that reconstruct the same scene, using the point-cloud method on a GPU and the polygon-based method on a CPU. In addition, we compare the optical and simulated reconstructions of CGHs created by these techniques to verify image quality.
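The point-cloud method computes the object field on the hologram plane as a superposition of spherical waves from each point source. A minimal NumPy sketch of that summation (the wavelength and pixel pitch are assumed values, and no GPU acceleration is shown):

```python
import numpy as np

WAVELENGTH = 633e-9          # assumed red laser wavelength
PITCH = 1e-6                 # assumed hologram pixel pitch

def object_field(points, amps, nx, ny):
    """Superpose spherical waves from each point source on the
    hologram plane (z = 0); points are (x, y, z) tuples with z > 0."""
    xs = (np.arange(nx) - nx / 2) * PITCH
    ys = (np.arange(ny) - ny / 2) * PITCH
    X, Y = np.meshgrid(xs, ys)
    k = 2 * np.pi / WAVELENGTH
    field = np.zeros((ny, nx), dtype=complex)
    for (px, py, pz), a in zip(points, amps):
        r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
        field += a / r * np.exp(1j * k * r)   # spherical wave term
    return field
```

The cost scales as (number of points) x (number of pixels), which is why GPU parallelization over pixels pays off for billion-pixel CGHs.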
Musumeci, P; Moody, J T; Scoby, C M; Gutierrez, M S; Bender, H A; Wilcox, N S
2010-01-01
Single shot diffraction patterns using a 250-fs-long electron beam have been obtained at the UCLA Pegasus laboratory. High quality images with spatial resolution sufficient to distinguish closely spaced peaks in the Debye-Scherrer ring pattern have been recorded by scattering the 1.6 pC 3.5 MeV electron beam generated in the rf photoinjector off a 100-nm-thick Au foil. Dark current and high emittance particles are removed from the beam before sending it onto the diffraction target using a 1 mm diameter collimating hole. These results open the door to the study of irreversible phase transformations by single shot MeV electron diffraction.
NASA Astrophysics Data System (ADS)
Abel, David; Holloway, Tracey; Harkey, Monica; Rrushaj, Arber; Brinkman, Greg; Duran, Phillip; Janssen, Mark; Denholm, Paul
2018-02-01
We evaluate how fine particulate matter (PM2.5) and precursor emissions could be reduced if 17% of electricity generation were replaced with solar photovoltaics (PV) in the Eastern United States. Electricity generation is simulated using GridView, then used to scale electricity-sector emissions of sulfur dioxide (SO2) and nitrogen oxides (NOX) from an existing gridded inventory of air emissions. This approach offers a novel method to leverage advanced electricity simulations with state-of-the-art emissions inventories, without necessitating recalculation of emissions for each facility. The baseline and perturbed emissions are input to the Community Multiscale Air Quality Model (CMAQ version 4.7.1) for a full accounting of time- and space-varying air quality changes associated with the 17% PV scenario. These results offer a high-value opportunity to evaluate the reduced-form AVoided Emissions and geneRation Tool (AVERT), while using AVERT to test the sensitivity of results to changing base-years and levels of solar integration. We find that average NOX and SO2 emissions across the region decrease 20% and 15%, respectively. PM2.5 concentrations decrease on average 4.7% across the Eastern U.S., with nitrate (NO3-) PM2.5 decreasing 3.7% and sulfate (SO42-) PM2.5 decreasing 9.1%. In the five largest cities in the region, we find that the most polluted days show the most significant PM2.5 decrease under the 17% PV generation scenario, and that the greatest benefits accrue to cities in or near the Ohio River Valley. We find summer health benefits from reduced PM2.5 exposure estimated as 1424 avoided premature deaths (95% Confidence Interval (CI): 284 deaths, 2732 deaths) or a health savings of $13.1 billion (95% CI: $0.6 billion, $43.9 billion). These results highlight the potential for renewable energy as a tool for air quality managers to support current and future health-based air quality regulations.
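The emission-scaling step described above, multiplying gridded electricity-sector emissions by the change in simulated fossil generation rather than recomputing emissions facility by facility, can be sketched as follows. This is a schematic illustration only; the function names and per-cell granularity are assumptions, not the study's actual code:

```python
def scale_emissions(base_emissions, base_gen, pv_gen):
    """Scale gridded electricity-sector emissions by the change in
    fossil generation between scenarios.

    base_emissions: {pollutant: mass} for one grid cell and hour
    base_gen, pv_gen: fossil generation (MWh) in the baseline vs.
                      the PV scenario for the generators mapped to
                      that cell
    """
    factor = pv_gen / base_gen if base_gen > 0 else 0.0
    return {p: e * factor for p, e in base_emissions.items()}
```

Applied cell by cell and hour by hour, this produces the perturbed inventory that feeds the CMAQ run, while keeping non-electricity sectors untouched.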
NASA Astrophysics Data System (ADS)
Wang, Andong; Li, Xiaowei; Qu, Lianti; Lu, Yongfeng; Jiang, Lan
2017-03-01
Metal nanowire fabrication has drawn tremendous attention in recent years due to its wide application in electronics, optoelectronics, and plasmonics. However, conventional laser fabrication technologies are constrained by the diffraction limit, and thus the fabrication resolution cannot meet the increasingly high demands of modern devices. Herein we report on a novel method for high-resolution, high-quality metal nanowire fabrication by using a Hermite-Gaussian beam to ablate metal thin film. The nanowire is formed in the intensity valley at the center of the laser beam while the surrounding film is ablated. Arbitrary nanowire patterns can be generated on the substrate by dynamically adjusting the orientation of the intensity valley. This method shows clear advantages over conventional methods. First, the minimum nanowire width is 60 nm (≈1/13 of the laser wavelength), which is much smaller than the diffraction limit. The high resolution is achieved by combining the ultrashort nature of the femtosecond laser and the low thermal conductivity of the thin film. In addition, the fabricated nanowires have good internal quality: no nanopores or particle gaps are generated inside the nanowire, endowing it with good electronic characteristics: the conductivity of the nanowires is as high as 1.2×107 S/m (≈1/4 of bulk material), and the maximum current density is up to 1.66×108 A/m2. Last, the nanowire has good adhesion to the substrate and can withstand an ultrasonic bath for a long time. These advantages make our method a good approach for high-resolution, high-quality nanowire fabrication as a complement to conventional lithography methods.
NASA Astrophysics Data System (ADS)
Noh, M. J.; Howat, I. M.; Porter, C. C.; Willis, M. J.; Morin, P. J.
2016-12-01
The Arctic is undergoing rapid change associated with climate warming. Digital Elevation Models (DEMs) provide critical information for change measurement and infrastructure planning in this vulnerable region, yet the existing quality and coverage of DEMs in the Arctic is poor. Low-contrast and repeatedly-textured surfaces, such as snow, glacial ice and mountain shadows, all common in the Arctic, challenge existing stereo-photogrammetric techniques. Submeter-resolution stereoscopic satellite imagery with high geometric and radiometric quality and wide spatial coverage is becoming increasingly accessible to the scientific community. To utilize this imagery for extracting DEMs at large scale over glaciated and high-latitude regions, we developed the Surface Extraction from TIN-based Searchspace Minimization (SETSM) algorithm. SETSM is fully automatic (i.e., no search parameter settings are needed) and uses only the satellite rational polynomial coefficients (RPCs). Using SETSM, we have generated a large number of DEMs (>100,000 scene pairs) from WorldView, GeoEye and QuickBird stereo images collected by DigitalGlobe Inc. and archived by the Polar Geospatial Center (PGC) at the University of Minnesota through an academic licensing program maintained by the US National Geospatial-Intelligence Agency (NGA). SETSM is the primary DEM generation software for the US National Science Foundation's ArcticDEM program, with the objective of generating high-resolution (2-8 m) topography for the entire Arctic landmass, including seamless DEM mosaics and repeat DEM strips for change detection. ArcticDEM is a collaboration between multiple US universities, governmental agencies and private companies, as well as international partners assisting with quality control and registration. ArcticDEM is being produced using the petascale Blue Waters supercomputer at the National Center for Supercomputing Applications at the University of Illinois.
In this paper, we introduce the SETSM algorithm and the processing system used for the ArcticDEM project, as well as provide notable examples of ArcticDEM products.
ERIC Educational Resources Information Center
Yu, Fu-Yun; Wu, Chun-Ping
2016-01-01
The research objectives of this study were to examine the individual and combined predictive effects of the quality of online peer-feedback provided and received on primary school students' quality of question-generation. A correlational study was adopted, and performance data from 213 fifth-grade students engaged in online question-generation and…
Mirzaei, Ardalan; Carter, Stephen R; Chen, Jenny Yimin; Rittsteuer, Claudia; Schneider, Carl R
2018-06-11
Recent changes within community pharmacy have seen a shift towards some pharmacies providing "value-added" services. However, providing high levels of service is resource intensive, yet revenues from dispensing are declining. Of significance, therefore, is how consumers perceive service quality (SQ). At present there are no validated and reliable instruments to measure consumers' perceptions of SQ in Australian community pharmacies. The aim of this study was to build a theory-grounded model of SQ in community pharmacies and to create a valid survey instrument to measure consumers' perceptions of it. Stage 1 dealt with item generation using theory, prior research and qualitative interviews with pharmacy consumers. Selected items were then subjected to content validity and face validity. Stages 2 and 3 included psychometric testing among English-speaking adult consumers of Australian pharmacies. Exploratory factor analysis was used for item reduction and to explain the domains of SQ. In stage 1, item generation for SQ initially produced 113 items, which were then refined, through content and face validity, down to 61 items. In stage 2, after subjecting the questionnaire to psychometric testing on the data from the first pharmacy (n = 374), the use of the primary dimensions of SQ was abandoned, leaving 32 items representing 5 domains of SQ. In stage 3, the questionnaire was subject to further testing and item reduction in 3 other pharmacies (n = 320). SQ was best described using 23 items representing 6 domains: 'health and medicines advice', 'relationship quality', 'technical quality', 'environmental quality', 'non-prescription service', and 'health outcomes'. This research presents a theoretically-grounded and robust measurement scale for consumer perceptions of SQ in a community pharmacy. Copyright © 2018. Published by Elsevier Inc.
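The loading-based item reduction used in stages 2 and 3 can be illustrated with a small sketch. This uses a principal-components solution of the item correlation matrix as a stand-in for a full exploratory factor analysis, and the 0.4 loading cutoff is a common convention assumed here, not necessarily the study's criterion:

```python
import numpy as np

def retain_items(data, n_factors=2, cutoff=0.4):
    """Crude EFA-style item reduction: compute loadings from the item
    correlation matrix (principal-components solution) and keep items
    whose largest absolute loading on any retained factor meets the
    cutoff. data is (respondents x items)."""
    corr = np.corrcoef(data, rowvar=False)
    vals, vecs = np.linalg.eigh(corr)
    order = np.argsort(vals)[::-1][:n_factors]   # largest eigenvalues
    loadings = vecs[:, order] * np.sqrt(vals[order])
    keep = np.abs(loadings).max(axis=1) >= cutoff
    return keep
```

Items failing the cutoff are dropped and the analysis re-run, which is how a 61-item pool can shrink to the final 23-item, 6-domain instrument.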
Jia, Shiyu; Zhang, Rui; Lin, Guigao; Peng, Rongxue; Gao, Peng; Han, Yanxi; Fu, Yu; Ding, Jiansheng; Wu, Qisheng; Zhang, Kuo; Xie, Jiehong; Li, Jinming
2018-06-01
KRAS mutations are the key indicator for EGFR monoclonal antibody-targeted therapy and acquired drug resistance, and their accurate detection is critical to the clinical decision-making of colorectal cancer. However, no proper quality control material is available for the current detection methods, particularly next-generation sequencing (NGS). The ideal quality control material for NGS needs to provide both the tumor mutation gene and the matched background genomic DNA, which is uncataloged in public databases, to accurately distinguish germline polymorphisms and somatic mutations. We developed a novel KRAS G12V mutant cell line using the clustered regularly interspaced short palindromic repeat (CRISPR)/CRISPR-associated protein 9 (Cas9) technique to make up for the deficiencies in existing quality control material and further validated the feasibility of the cell line as quality control material by amplification refractory mutation system (ARMS), Sanger sequencing, digital PCR (dPCR), and NGS. We verified that the edited cell line specifically had the G12V mutation, and the validation results presented a high consistency among the four methods of detection. The three cell lines screened contained the G12V mutation and the mutation allele fractions of G12V-1, G12V-2, and G12V-3 were 52.01%, 82.06%, and 17.29%, respectively. The novel KRAS G12V cell line generated using the CRISPR/Cas9 gene editing system is suitable as a quality control material for all current detection methods and provides a new direction in the development of quality control material. © 2018 Wiley Periodicals, Inc.
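The mutation allele fractions reported above (e.g., 52.01% for G12V-1) correspond to the ratio of mutant reads to total informative reads at the locus. A trivial sketch of that calculation, illustrative only, since real NGS pipelines apply depth and base-quality filters first:

```python
def allele_fraction(alt_reads, ref_reads):
    """Mutation allele fraction from NGS-style read counts at a locus."""
    total = alt_reads + ref_reads
    return alt_reads / total if total else 0.0
```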
NASA Astrophysics Data System (ADS)
Wong, Derek N.
The US Navy is actively developing all-electric fleets, raising serious questions about what is required of onboard power supplies in order to properly power a ship's electrical systems. This is especially relevant when choosing a viable power source to drive high-power propulsion and electric weapon systems in addition to the conventional loads deployed aboard these types of vessels. Especially when high pulsed power loads are supplied, the issue of maintaining power quality becomes important and increasingly complex. Conventionally, a vessel's electrical power is generated using gas turbine or diesel driven motor-generator sets that are very inefficient when used outside of their most efficient load condition. This means that if the generator is not utilized continuously at its most efficient load capacity, the quality of the output power may be affected and fall outside the acceptable power quality limits imposed by military standards. As a solution to this potential problem, the Navy has proposed using electrochemical storage devices, since they are able to buffer conventional generators when the load is operating below the generator's most efficient power level, or to efficiently augment a generator when the load is operating in excess of the generator's most efficient power rating. Specifically, the US Navy is interested in using commercial off-the-shelf (COTS) lithium-ion batteries within an intelligently controlled energy storage module that could act either as a prime power supply for on-board pulsed power systems or as a backup generator to other shipboard power systems. Due to the unique load profile of high-rate pulsed power systems, the implementation of lithium-ion batteries within these complex systems requires them to be operated at very high rates, and the effects such operation has on cell degradation have been an area of focus.
There is very little published research into the effects that high-power transient or pulsed loading has on the degradation mechanisms of secondary lithium-ion cells. Prior to performing this work, it was unclear whether the implementation of lithium-ion batteries in highly transient load conditions at high rate would accelerate cell degradation mechanisms previously considered minor issues. This work has focused on answering these previously unanswered questions. In early experiments performed here, COTS lithium-iron-phosphate (LFP) cells were studied under high-rate, transient load conditions, and it was found that their capacity fade deviated from the traditional linear behavior and declined exponentially until no charge could be accepted when recharge was attempted at high rate. These findings indicated that subjecting LFP chemistries to transient, high-rate charge/discharge profiles induced rapid changes in the electrode/electrolyte interface that rendered the cells useless when high-rate recharge was required, and suggested that there were further phenomena to understand about how these cells degrade under high-rate pulsed conditions before they are fielded in Naval applications. Therefore, the research presented here has focused on understanding the degradation mechanisms that are unique to LFP cells when they are cycled under pulsed load profiles at high charge and discharge rates. In particular, the work has focused on identifying major degradation reactions by studying the surface chemistry of cycled electrode materials. Efforts have been made to map the impedance evolution of the cathode and anode half cells, respectively, using a novel three-electrode technique developed for this research. Using this technique, the progression of degradation has been mapped through analysis of differential capacitance spectra.
In both the three-electrode EIS mapping and the differential capacitance analysis that have been performed, electrical component models have been developed. The results presented will show that there are unique degradation mechanisms induced by high-rate pulsed loading conditions that are not normally seen in low-rate continuous cycling of LFP cells.
Laser source with high pulse energy at 3-5 μm and 8-12 μm based on nonlinear conversion in ZnGeP2
NASA Astrophysics Data System (ADS)
Lippert, Espen; Fonnum, Helge; Haakestad, Magnus W.
2014-10-01
We present a high energy infrared laser source where a Tm:fiber laser is used to pump a high-energy 2-μm cryogenically cooled Ho:YLF laser. We have achieved 550 mJ of output energy at 2.05 μm, and through non-linear conversion in ZnGeP2 generated 200 mJ in the 3-5-μm range. Using a numerical simulation tool we have also investigated a setup which should generate more than 70 mJ in the 8-12-μm range. The conversion stage uses a master-oscillator-power-amplifier architecture to enable high conversion efficiency and good beam quality.
Kim, Ki Hwan; Do, Won-Joon; Park, Sung-Hong
2018-05-04
The routine MRI scan protocol consists of multiple pulse sequences that acquire images of varying contrast. Since high-frequency content such as edges is not significantly affected by image contrast, down-sampled images acquired in one contrast may be improved using high-resolution (HR) images acquired in another contrast, reducing the total scan time. In this study, we propose a new deep learning framework that uses HR MR images in one contrast to generate HR MR images from highly down-sampled MR images in another contrast. The proposed convolutional neural network (CNN) framework consists of two CNNs: (a) a reconstruction CNN for generating HR images from the down-sampled images using HR images acquired with a different MRI sequence and (b) a discriminator CNN for improving the perceptual quality of the generated HR images. The proposed method was evaluated using a public brain tumor database and in vivo datasets. The performance of the proposed method was assessed in tumor and no-tumor cases separately, with perceptual image quality judged by a radiologist. To overcome the challenge of training the network with a small number of available in vivo datasets, the network was pretrained using the public database and then fine-tuned using the small number of in vivo datasets. The performance of the proposed method was also compared to that of several compressed sensing (CS) algorithms. Incorporating HR images of another contrast improved the quantitative assessments of the generated HR image in reference to ground truth. Also, incorporating a discriminator CNN yielded perceptually higher image quality. These results were verified in regions of normal tissue as well as tumors for various MRI sequences from pseudo k-space data generated from the public database. The combination of pretraining with the public database and fine-tuning with the small number of real k-space datasets enhanced the performance of CNNs in in vivo application compared to training CNNs from scratch.
The proposed method outperformed the compressed sensing methods. The proposed method can be a good strategy for accelerating routine MRI scanning. © 2018 American Association of Physicists in Medicine.
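The premise that high-frequency edge content transfers across contrasts can be illustrated without deep learning: add the high-pass component of the HR image in one contrast to the upsampled low-resolution image in the other. This classical sketch is only an analogue of the paper's motivation, not its two-CNN method; the blend weight and the 3x3 box filter are arbitrary choices:

```python
import numpy as np

def transfer_edges(lowres_up, hr_other, alpha=0.5):
    """Add the high-frequency content of an HR image in another
    contrast to an upsampled low-resolution image (same shape).
    alpha is an assumed blend weight."""
    h, w = hr_other.shape
    pad = np.pad(hr_other, 1, mode="edge")
    # 3x3 box blur as a crude low-pass filter
    lowpass = sum(pad[i:i + h, j:j + w]
                  for i in range(3) for j in range(3)) / 9.0
    highpass = hr_other - lowpass
    return lowres_up + alpha * highpass
```

The reconstruction CNN in the paper learns a far richer version of this mapping, and the discriminator CNN pushes the output toward perceptually realistic texture.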
Using management information systems to enhance health care quality assurance.
Rosser, L H; Kleiner, B H
1995-01-01
Examines how computers and quality assurance are being used to improve the quality of health care delivery. Traditional quality assurance methods have been limited in their ability to effectively manage the high volume of data generated by the health care process. Computers on the other hand are able to handle large volumes of data as well as monitor patient care activities in both the acute care and ambulatory care settings. Discusses the use of computers to collect and analyse patient data so that changes and problems can be identified. In addition, computer models for reminding physicians to order appropriate preventive health measures for their patients are presented. Concludes that the use of computers to augment quality improvement is essential if the quality of patient care and health promotion are to be improved.
Study on development system of increasing gearbox for high-performance wind-power generator
NASA Astrophysics Data System (ADS)
Xu, Hongbin; Yan, Kejun; Zhao, Junyu
2005-12-01
Based on an analysis of the development potential of wind-power generators and the domestic manufacture of their key parts in China, an independent development system for the Increasing Gearbox for High-performance Wind-power Generator (IGHPWG) is introduced. The main elements of the system are studied, including the procedure design, design analysis system, manufacturing technology and detection system, and the key technologies are analyzed, such as a combined transmission structure of a first-stage planetary drive with a two-stage parallel-shaft drive based on equal strength, tooth-root round-cutting technology applied before milling the hard tooth surface, high-precision tooth-grinding technology, optimized heat-treatment technology, complex surface techniques, and the rig test and detection technique of the IGHPWG. The development concept advances data sharing and quality assurance across all elements of the development system. Increasing gearboxes for 600 kW and 1 MW wind-power generators have been successfully developed through application of this development system.
NASA Astrophysics Data System (ADS)
Asfaw, Alemayehu; Shucksmith, James; Smith, Andrea; Cherry, Katherine
2015-04-01
Metaldehyde is an active ingredient in agricultural pesticides such as slug pellets, which are heavily applied to UK farmland during the autumn application season. There is current concern that existing drinking water treatment processes may be inadequate in reducing potentially high levels of metaldehyde in surface waters to below the UK drinking water quality regulation limit of 0.1 µg/l. In addition, current water quality monitoring methods can miss short term fluctuations in metaldehyde concentration caused by rainfall driven runoff, hampering prediction of the potential risk of exposure. Datasets describing levels, fate and transport of metaldehyde in river catchments are currently very scarce. This work presents results from an ongoing study to quantify the presence of metaldehyde in surface waters within a UK catchment used for drinking water abstraction. High resolution water quality data from auto-samplers installed in rivers are coupled with radar rainfall, catchment characteristics and land use data to i) understand which hydro-meteorological characteristics of the catchment trigger the peak migration of metaldehyde to surface waters; ii) assess the relationship between measured metaldehyde levels and catchment characteristics such as land use, topographic index, proximity to water bodies and runoff generation area; iii) describe the current risks to drinking water supply and discuss mitigation options based on modelling and real-time control of water abstraction. Identifying the correlation between catchment attributes and metaldehyde generation will help in the development of effective catchment management strategies, which can help to significantly reduce the amount of metaldehyde finding its way into river water. 
Furthermore, the effectiveness of the current water quality monitoring strategy in accurately quantifying metaldehyde generation from the catchment, and its ability to inform the development of effective catchment management practices, has also been investigated.
Redfern, S; Norman, I
1999-07-01
The aims of the study were to identify indicators of quality of nursing care from the perceptions of patients and nurses, and to determine the congruence between patients' and nurses' perceptions. The paper is presented in two parts. Part 1 includes the background and methods to the study and the findings from the comparison of patients' and nurses' perceptions. Part 2 describes the perceptions of patients and nurses, and the conclusions drawn from the study as a whole. Patients and nurses in hospital wards were interviewed using the critical incident technique. We grouped 4546 indicators of high and low quality nursing care generated from the interview transcripts into 316 subcategories, 68 categories and 31 themes. Congruence between patients' and nurses' perceptions of quality was high and significant, although there was some difference of emphasis.
NASA Astrophysics Data System (ADS)
Guo, Tongqing; Chen, Hao; Lu, Zhiliang
2018-05-01
To handle extremely large deformations, a novel predictor-corrector-based dynamic mesh method for multi-block structured grids is proposed. In this work, the dynamic mesh generation is completed in three steps. First, some typical dynamic positions are selected and high-quality multi-block grids with the same topology are generated at those positions. Then, the Lagrange interpolation method is adopted to predict the dynamic mesh at any dynamic position. Finally, a rapid elastic deformation technique is used to correct the small deviation between the interpolated geometric configuration and the actual instantaneous one. Compared with traditional methods, the results demonstrate that the present method shows stronger deformation capability and higher dynamic mesh quality.
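The predictor step, Lagrange interpolation of the pre-generated meshes across the dynamic parameter, can be sketched directly (the elastic corrector step is omitted, and the array shapes are illustrative):

```python
import numpy as np

def lagrange_predict(t, t_samples, meshes):
    """Predict mesh node coordinates at parameter t by Lagrange
    interpolation of pre-generated meshes with identical topology.
    meshes[i] holds the node coordinates at parameter t_samples[i]."""
    meshes = np.asarray(meshes, dtype=float)
    pred = np.zeros_like(meshes[0])
    for i, ti in enumerate(t_samples):
        L = 1.0
        for j, tj in enumerate(t_samples):
            if j != i:
                L *= (t - tj) / (ti - tj)   # Lagrange basis polynomial
        pred += L * meshes[i]
    return pred
```

Because all sample meshes share one topology, interpolation acts node-by-node; the corrector then relaxes any small mismatch with the true boundary geometry.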
Song, Ting; Li, Nan; Zarepisheh, Masoud; Li, Yongbao; Gautier, Quentin; Zhou, Linghong; Mell, Loren; Jiang, Steve; Cerviño, Laura
2016-01-01
Intensity-modulated radiation therapy (IMRT) currently plays an important role in radiotherapy, but its treatment plan quality can vary significantly among institutions and planners. Treatment plan quality control (QC) is a necessary component for individual clinics to ensure that patients receive treatments with high therapeutic gain ratios. The voxel-weighting factor-based plan re-optimization mechanism has been proved able to explore a larger Pareto surface (solution domain) and therefore increase the possibility of finding an optimal treatment plan. In this study, we incorporated additional modules into an in-house developed voxel weighting factor-based re-optimization algorithm, which was enhanced as a highly automated and accurate IMRT plan QC tool (TPS-QC tool). After importing an under-assessment plan, the TPS-QC tool was able to generate a QC report within 2 minutes. This QC report contains the plan quality determination as well as information supporting the determination. Finally, the IMRT plan quality can be controlled by approving quality-passed plans and replacing quality-failed plans using the TPS-QC tool. The feasibility and accuracy of the proposed TPS-QC tool were evaluated using 25 clinically approved cervical cancer patient IMRT plans and 5 manually created poor-quality IMRT plans. The results showed high consistency between the QC report quality determinations and the actual plan quality. In the 25 clinically approved cases that the TPS-QC tool identified as passed, a greater difference could be observed for dosimetric endpoints for organs at risk (OAR) than for planning target volume (PTV), implying that better dose sparing could be achieved in OAR than in PTV. In addition, the dose-volume histogram (DVH) curves of the TPS-QC tool re-optimized plans satisfied the dosimetric criteria more frequently than did the under-assessment plans. 
Moreover, the criteria for unsatisfied dosimetric endpoints in the 5 poor-quality plans could typically be satisfied when the TPS-QC tool generated re-optimized plans, without sacrificing other dosimetric endpoints. Beyond its feasibility and accuracy, the proposed TPS-QC tool is also user-friendly and easy to operate, both necessary characteristics for clinical use. PMID:26930204
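The voxel weighting-factor re-optimization mechanism can be illustrated schematically: solve a weighted least-squares dose objective, then increase the weights of the worst-violating voxels and repeat. This toy sketch is not the clinical algorithm; the weight-update rule and nonnegativity handling are assumptions for illustration:

```python
import numpy as np

def reoptimize(D, d_target, n_iter=20):
    """Toy voxel-weighted re-optimization: find beamlet intensities x
    minimizing sum_i w_i * (D @ x - d_target)_i**2, then raise the
    weights of the voxels with the largest dose error and repeat.
    D is the (voxels x beamlets) dose-deposition matrix."""
    w = np.ones(D.shape[0])
    for _ in range(n_iter):
        # weighted least squares via row scaling
        x, *_ = np.linalg.lstsq(np.sqrt(w)[:, None] * D,
                                np.sqrt(w) * d_target, rcond=None)
        x = np.maximum(x, 0)                    # intensities nonnegative
        err = np.abs(D @ x - d_target)
        w *= 1 + err / (err.max() + 1e-12)      # emphasize worst voxels
    return x
```

Shifting weight onto violated voxels moves the solution along the Pareto surface, which is the sense in which the mechanism "explores a larger solution domain" than a fixed-weight plan.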
Moses, J; Huang, S-W; Hong, K-H; Mücke, O D; Falcão-Filho, E L; Benedick, A; Ilday, F O; Dergachev, A; Bolger, J A; Eggleton, B J; Kärtner, F X
2009-06-01
We present a 9 GW peak power, three-cycle, 2.2 μm optical parametric chirped-pulse amplification source with 1.5% rms energy fluctuations and 150 mrad carrier-envelope phase fluctuations. These characteristics, in addition to excellent beam, wavefront, and pulse quality, make the source suitable for long-wavelength-driven high-harmonic generation. High stability is achieved by careful optimization of superfluorescence suppression, enabling energy scaling.
Gyroharmonic conversion experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirshfield, J. L.; LaPointe, M. A.; Yale University, New Haven, Connecticut 06511
1999-05-07
Generation of high-power microwaves has been observed in experiments where a 250-350 kV, 20-30 A electron beam accelerated in a cyclotron autoresonance accelerator (CARA) passes through a tuned cavity; results at gyroharmonics, including at 8.6 GHz (3rd harmonic), will be described. Theory indicates that high conversion efficiency can be obtained for a high-quality beam injected into CARA, provided mode competition can be controlled. Comparisons will be made between the experiments and theory. Planned 7th-harmonic experiments will also be described, in which phase matching between the TE-72 mode at 20 GHz and the TE-11 mode at 2.86 GHz allows efficient 20 GHz co-generation within the CARA waveguide itself.
David, Matthias; Borde, Theda; Brenne, Silke; Henrich, Wolfgang; Breckenkamp, Jürgen; Razum, Oliver
2015-01-01
Objective: The frequency of caesarean section delivery varies between countries and social groups. Among other factors, it is determined by the quality of obstetrics care. Rates of elective (planned) and emergency (in-labor) caesareans may also vary between immigrants (first generation), their offspring (second- and third-generation women), and non-immigrants because of access and language barriers. Other important points to be considered are whether caesarean section indications and the neonatal outcomes differ in babies delivered by caesarean between immigrants, their offspring, and non-immigrants. Methods: A standardized interview on admission to delivery wards at three Berlin obstetric hospitals was performed in a 12-month period in 2011/2012. Questions on socio-demographic and care aspects and on migration (immigrated herself vs. second- and third-generation women vs. non-immigrant) and acculturation status were included. Data was linked with information from the expectant mothers’ antenatal records and with perinatal data routinely documented in the hospital. Regression modeling was used to adjust for age, parity and socio-economic status. Results: The caesarean section rates for immigrants, second- and third-generation women, and non-immigrant women were similar. Neither indications for caesarean section delivery nor neonatal outcomes showed statistically significant differences. The only difference found was a somewhat higher rate of crash caesarean sections per 100 births among first generation immigrants compared to non-immigrants. Conclusion: Unlike earlier German studies and current studies from other European countries, this study did not find an increased rate of caesarean sections among immigrants, as well as second- and third-generation women, with the possible exception of a small high-risk group. This indicates an equally high quality of perinatal care for women with and without a migration history. PMID:25985437
Modeling and Grid Generation of Iced Airfoils
NASA Technical Reports Server (NTRS)
Vickerman, Mary B.; Baez, Marivell; Braun, Donald C.; Hackenberg, Anthony W.; Pennline, James A.; Schilling, Herbert W.
2007-01-01
SmaggIce Version 2.0 is a software toolkit for geometric modeling and grid generation for two-dimensional, single- and multi-element, clean and iced airfoils. A previous version of SmaggIce was described in Preparing and Analyzing Iced Airfoils, NASA Tech Briefs, Vol. 28, No. 8 (August 2004), page 32. To recapitulate: Ice shapes make it difficult to generate quality grids around airfoils, yet these grids are essential for predicting ice-induced complex flow. This software efficiently creates high-quality structured grids with tools that are uniquely tailored for various ice shapes. SmaggIce Version 2.0 significantly enhances the previous version primarily by adding the capability to generate grids for multi-element airfoils. This version of the software is an important step in streamlining the aeronautical analysis of iced airfoils using computational fluid dynamics (CFD) tools. The user may prepare the ice shape, define the flow domain, decompose it into blocks, generate grids, modify/divide/merge blocks, and control grid density and smoothness. All these steps may be performed efficiently even for the difficult glaze and rime ice shapes. Providing the means to generate highly controlled grids near rough ice, the software includes the creation of a wrap-around block (called the "viscous sublayer block"), which is a thin, C-type block around the wake line and iced airfoil. For multi-element airfoils, the software makes use of grids that wrap around and fill in the areas between the viscous sublayer blocks for all elements that make up the airfoil. A scripting feature records the history of interactive steps, which can be edited and replayed later to produce other grids. Using this version of SmaggIce, ice shape handling and grid generation can become a practical engineering process, rather than a laborious research effort.
David, Matthias; Borde, Theda; Brenne, Silke; Henrich, Wolfgang; Breckenkamp, Jürgen; Razum, Oliver
2015-01-01
The frequency of caesarean section delivery varies between countries and social groups. Among other factors, it is determined by the quality of obstetrics care. Rates of elective (planned) and emergency (in-labor) caesareans may also vary between immigrants (first generation), their offspring (second- and third-generation women), and non-immigrants because of access and language barriers. Other important points to be considered are whether caesarean section indications and the neonatal outcomes differ in babies delivered by caesarean between immigrants, their offspring, and non-immigrants. A standardized interview on admission to delivery wards at three Berlin obstetric hospitals was performed in a 12-month period in 2011/2012. Questions on socio-demographic and care aspects and on migration (immigrated herself vs. second- and third-generation women vs. non-immigrant) and acculturation status were included. Data were linked with information from the expectant mothers' antenatal records and with perinatal data routinely documented in the hospital. Regression modeling was used to adjust for age, parity and socio-economic status. The caesarean section rates for immigrants, second- and third-generation women, and non-immigrant women were similar. Neither indications for caesarean section delivery nor neonatal outcomes showed statistically significant differences. The only difference found was a somewhat higher rate of crash caesarean sections per 100 births among first generation immigrants compared to non-immigrants. Unlike earlier German studies and current studies from other European countries, this study did not find an increased rate of caesarean sections among immigrants or second- and third-generation women, with the possible exception of a small high-risk group. This indicates an equally high quality of perinatal care for women with and without a migration history.
NASA Astrophysics Data System (ADS)
Zhou, Yuhong; Klages, Peter; Tan, Jun; Chi, Yujie; Stojadinovic, Strahinja; Yang, Ming; Hrycushko, Brian; Medin, Paul; Pompos, Arnold; Jiang, Steve; Albuquerque, Kevin; Jia, Xun
2017-06-01
High dose rate (HDR) brachytherapy treatment planning is conventionally performed manually and/or with aids of preplanned templates. In general, the standard of care would be elevated by automating the planning process to improve treatment planning efficiency, eliminate human error, and reduce plan quality variations. Thus, our group is developing AutoBrachy, an automated HDR brachytherapy planning suite of modules used to augment a clinical treatment planning system. This paper describes our proof-of-concept module for vaginal cylinder HDR planning that has been fully developed. After a patient CT scan is acquired, the cylinder applicator is automatically segmented using image-processing techniques. The target CTV is generated based on physician-specified treatment depth and length. Locations of the dose calculation point, apex point and vaginal surface point, as well as the central applicator channel coordinates, and the corresponding dwell positions are determined according to their geometric relationship with the applicator and written to a structure file. Dwell times are computed through iterative quadratic optimization techniques. The planning information is then transferred to the treatment planning system through a DICOM-RT interface. The entire process was tested for nine patients. The AutoBrachy cylindrical applicator module was able to generate treatment plans for these cases with clinical-grade quality. Computation times varied between 1 and 3 min on an Intel Xeon CPU E3-1226 v3 processor. All geometric components in the automated treatment plans were generated accurately. The applicator channel tip positions agreed with the manually identified positions with submillimeter deviations and the channel orientations between the plans agreed within less than 1 degree. The automatically generated plans were of clinically acceptable quality.
NASA Astrophysics Data System (ADS)
Yamanaka, Masahito; Kawagoe, Hiroyuki; Nishizawa, Norihiko
2016-02-01
We describe the generation of a high-power, spectrally smooth supercontinuum (SC) in the 1600 nm spectral band for ultrahigh-resolution optical coherence tomography (UHR-OCT). A clean SC was achieved by using a highly nonlinear fiber with normal dispersion properties and a high-quality pedestal-free pulse obtained from a passively mode-locked erbium-doped fiber laser operating at 182 MHz. The center wavelength and spectral width were 1578 and 172 nm, respectively. The output power of the SC was 51 mW. Using the developed SC source, we demonstrated UHR-OCT imaging of biological samples with a sensitivity of 109 dB and an axial resolution of 4.9 µm in tissue.
Soft bilateral filtering volumetric shadows using cube shadow maps
Ali, Hatam H.; Sunar, Mohd Shahrizal; Kolivand, Hoshang
2017-01-01
Volumetric shadows increase the realism of rendered scenes in computer graphics, but typical volumetric-shadow techniques cannot provide a smooth transition effect in real time while preserving crisp boundaries. This research presents a new technique for generating high-quality volumetric shadows through sampling and interpolation. Contrary to the conventional ray-marching method, which is computationally expensive, the proposed technique downsamples the ray-marching calculation. Furthermore, light scattering is computed in a high-dynamic-range buffer to enable tone mapping. Bilateral interpolation along view rays then smooths the transition of the volumetric shadows while preserving edges. In addition, the technique applies a cube shadow map to create multiple shadows. The contribution of this technique is to reduce the number of sample points used in evaluating light scattering and to introduce bilateral interpolation that improves the volumetric shadows, significantly reducing the inherent deficiencies of shadow maps. The technique yields soft, high-quality volumetric shadows at good performance, which shows its potential for interactive applications. PMID:28632740
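The core idea, marching rays at reduced resolution and then upsampling with a depth-aware bilateral weight so shadow edges stay crisp, can be sketched as follows (the function names, the 2x downsampling ratio, and all parameters are illustrative assumptions, not the paper's code):

```python
import math

def bilateral_upsample(low, depth_hi, depth_lo, sigma_d=0.1):
    """Upsample a half-res light-scattering buffer `low` to the resolution of
    `depth_hi`, weighting each low-res neighbor by depth similarity so that
    shadow boundaries at depth discontinuities are preserved."""
    H, W = len(depth_hi), len(depth_hi[0])
    h, w = len(low), len(low[0])
    out = [[0.0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            ly, lx = min(y // 2, h - 1), min(x // 2, w - 1)  # nearest low-res sample
            w_sum = v_sum = 0.0
            # gather the (clamped) 2x2 low-res neighborhood
            for ny in (ly, min(ly + 1, h - 1)):
                for nx in (lx, min(lx + 1, w - 1)):
                    dd = depth_hi[y][x] - depth_lo[ny][nx]
                    # range weight: penalize depth disagreement
                    wgt = math.exp(-dd * dd / (2 * sigma_d ** 2))
                    w_sum += wgt
                    v_sum += wgt * low[ny][nx]
            out[y][x] = v_sum / w_sum
    return out
```

Where the full-res depth agrees with all low-res neighbors, this degrades to a plain neighborhood average; across a depth discontinuity, only neighbors on the same surface contribute, which is what keeps the shadow boundary crisp.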
Atropos: specific, sensitive, and speedy trimming of sequencing reads.
Didion, John P; Martin, Marcel; Collins, Francis S
2017-01-01
A key step in the transformation of raw sequencing reads into biological insights is the trimming of adapter sequences and low-quality bases. Read trimming has been shown to increase the quality and reliability while decreasing the computational requirements of downstream analyses. Many read trimming software tools are available; however, no tool simultaneously provides the accuracy, computational efficiency, and feature set required to handle the types and volumes of data generated in modern sequencing-based experiments. Here we introduce Atropos and show that it trims reads with high sensitivity and specificity while maintaining leading-edge speed. Compared to other state-of-the-art read trimming tools, Atropos achieves significant increases in trimming accuracy while remaining competitive in execution times. Furthermore, Atropos maintains high accuracy even when trimming data with elevated rates of sequencing errors. The accuracy, high performance, and broad feature set offered by Atropos make it an appropriate choice for the pre-processing of Illumina, ABI SOLiD, and other current-generation short-read sequencing datasets. Atropos is open source and free software written in Python (3.3+) and available at https://github.com/jdidion/atropos.
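As a rough illustration of what 3' adapter trimming involves, the sketch below does semi-global suffix/prefix matching with an error budget. It is didactic only; it is not Atropos's actual algorithm or command-line interface, and the parameter names are assumptions:

```python
def trim_adapter(read, adapter, min_overlap=3, max_err_rate=0.1):
    """Trim a 3' adapter: try each position where the adapter could begin and
    accept the leftmost one where the read suffix matches a prefix of the
    adapter within the allowed per-base error rate."""
    n = len(read)
    for start in range(n):                       # leftmost acceptable match wins
        overlap = min(n - start, len(adapter))
        if overlap < min_overlap:
            break                                # too short to call a match
        errors = sum(a != b for a, b in zip(read[start:start + overlap],
                                            adapter[:overlap]))
        if errors <= max_err_rate * overlap:
            return read[:start]                  # drop adapter and everything after
    return read                                  # no adapter found
```

The `min_overlap` threshold is the usual guard against trimming on spurious short matches; raising it reduces false positives at the cost of missing reads where only a few adapter bases were sequenced.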
Large-Scale Point-Cloud Visualization through Localized Textured Surface Reconstruction.
Arikan, Murat; Preiner, Reinhold; Scheiblauer, Claus; Jeschke, Stefan; Wimmer, Michael
2014-09-01
In this paper, we introduce a novel scene representation for the visualization of large-scale point clouds accompanied by a set of high-resolution photographs. Many real-world applications deal with very densely sampled point-cloud data, which are augmented with photographs that often reveal lighting variations and inaccuracies in registration. Consequently, the high-quality representation of the captured data, i.e., both point clouds and photographs together, is a challenging and time-consuming task. We propose a two-phase approach, in which the first (preprocessing) phase generates multiple overlapping surface patches and handles the problem of seamless texture generation locally for each patch. The second phase stitches these patches at render-time to produce a high-quality visualization of the data. As a result of the proposed localization of the global texturing problem, our algorithm is more than an order of magnitude faster than equivalent mesh-based texturing techniques. Furthermore, since our preprocessing phase requires only a minor fraction of the whole data set at once, we provide maximum flexibility when dealing with growing data sets.
Studies on Hot-Melt Prepregging of PMR-II-50 Polyimide Resin with Graphite Fibers
NASA Technical Reports Server (NTRS)
Shin, E. Eugene; Sutter, James K.; Juhas, John; Veverka, Adrienne; Klans, Ojars; Inghram, Linda; Scheiman, Dan; Papadopoulos, Demetrios; Zoha, John; Bubnick, Jim
2004-01-01
A second generation PMR (in situ Polymerization of Monomer Reactants) polyimide resin, PMR-II-50, has been considered for high temperature and high stiffness space propulsion composites applications for its improved high temperature performance. As part of composite processing optimization, two commercial prepregging methods, solution vs. hot-melt, were investigated with M40J fabrics from Toray. In a previous study, a systematic chemical, physical, thermal and mechanical characterization of these composites indicated that poor resin-fiber interfacial wetting, especially for the hot-melt process, resulted in poor composite quality. In order to improve the interfacial wetting, optimization of the resin viscosity and process variables was attempted in a commercial hot-melt prepregging line. In addition to presenting the results from the prepreg quality optimization trials, the combined effects of the prepregging method and two different composite cure methods, i.e., hot press vs. autoclave, on composite quality and properties are discussed.
Studies on Hot-Melt Prepregging of PMR-II-50 Polyimide Resin with Graphite Fibers
NASA Technical Reports Server (NTRS)
Shin, E. Eugene; Sutter, James K.; Juhas, John; Veverka, Adrienne; Klans, Ojars; Inghram, Linda; Scheiman, Dan; Papadopoulos, Demetrios; Zoha, John; Bubnick, Jim
2003-01-01
A second generation PMR (in situ Polymerization of Monomer Reactants) polyimide resin, PMR-II-50, has been considered for high temperature and high stiffness space propulsion composites applications for its improved high temperature performance. As part of composite processing optimization, two commercial prepregging methods, solution vs. hot-melt, were investigated with M40J fabrics from Toray. In a previous study, a systematic chemical, physical, thermal and mechanical characterization of these composites indicated that poor resin-fiber interfacial wetting, especially for the hot-melt process, resulted in poor composite quality. In order to improve the interfacial wetting, optimization of the resin viscosity and process variables was attempted in a commercial hot-melt prepregging line. In addition to presenting the results from the prepreg quality optimization trials, the combined effects of the prepregging method and two different composite cure methods, i.e., hot press vs. autoclave, on composite quality and properties are discussed.
Dimensions of vehicle sounds perception.
Wagner, Verena; Kallus, K Wolfgang; Foehl, Ulrich
2017-10-01
Vehicle sounds play an important role in customer satisfaction and can serve as a differentiating factor between brands. With an online survey of 1762 German and American customers, the requirement characteristics of high-quality vehicle sounds were determined. On the basis of these characteristics, a requirement profile was generated for every analyzed sound. These profiles were investigated in a second study with 78 customers using real vehicles. The assessment results of the vehicle sounds can be represented using the dimensions "timbre", "loudness", and "roughness/sharpness". The comparison of the requirement profiles and the assessment results shows that the sounds perceived as pleasant and high-quality correspond more often to the requirement profile. High-quality sounds are characterized as rather gentle, soft and reserved, rich, a bit dark, and not too rough. For the sounds assessed worse by the customers, recommendations for improvement can be derived. Copyright © 2017 Elsevier Ltd. All rights reserved.
MODELS-3/CMAQ APPLICATIONS WHICH ILLUSTRATE CAPABILITY AND FUNCTIONALITY
The Models-3/CMAQ system, developed by the U.S. Environmental Protection Agency (USEPA), is a third-generation multiscale, multi-pollutant air quality modeling system within a high-level, object-oriented computer framework (Models-3). It has been available to the scientific community ...
National Leadership for Children's Television.
ERIC Educational Resources Information Center
Heinz, John
1983-01-01
In view of the significant impact of television on children, the national leadership must work for increased production of high quality children's programs. Public and private actions are needed to generate both financial and nonfinancial incentives to encourage creativity in the television industry. (Author/MJL)
Feld, Gregory K
2004-11-01
Recent studies have demonstrated a high degree of efficacy of 8 mm electrode-tipped or saline-irrigated-tip catheters for ablation of atrial flutter (AFL). These catheters have a theoretical advantage as they produce a large ablation lesion. However, large-tip ablation catheters have a larger surface area and require a higher power radiofrequency (RF) generator with up to 100 W capacity to produce adequate ablation temperatures (50-60 degrees C). The potential advantages of a large-tip ablation catheter and high-power RF generator include the need for fewer energy applications, shorter procedure and fluoroscopy times, and greater efficacy. Therefore, the safety and efficacy of AFL ablation using 8 or 10 mm electrode catheters and a 100-W RF generator were studied using the Boston Scientific, Inc., EPT-1000 XP cardiac ablation system. The study involved 169 patients aged 61 +/- 12 years. Acute end points were bidirectional isthmus block and no inducible AFL. Following ablation, patients were seen at 1, 3 and 6 months, with event monitoring performed weekly and for any symptoms. Three quality of life surveys were completed during follow-up. Acute success was achieved in 158 patients (93%), with 12 +/- 11 RF energy applications. The efficacy of 8 and 10 mm electrodes did not differ significantly. The number of RF energy applications (10 +/- 8 vs. 14 +/- 8) and ablation time (0.5 +/- 0.4 vs. 0.8 +/- 0.6 h) were less with 10 mm compared with 8 mm electrodes (p < 0.01). Of 158 patients with acute success, 42 were not evaluated at 6 months due to study exclusions. Of the 116 patients evaluated at 6 months, 112 (97%) had no AFL recurrence. Of those without AFL recurrence at 6 months, 95 and 93% were free of symptoms at 12 and 24 months, respectively. Ablation of AFL improved quality of life scores (p < 0.05) and reduced anti-arrhythmic and rate control drug use (p < 0.05). Complications occurred in six out of 169 patients (3.6%) but there were no deaths.
It was concluded that ablation of AFL with 8 or 10 mm electrode catheters and a high-power RF generator was safe, effective and improved quality of life. The number and duration of RF applications were lower with 10 mm compared with 8 mm electrode catheters.
Pan, Yuchen; Sackmann, Eric K; Wypisniak, Karolina; Hornsby, Michael; Datwani, Sammy S; Herr, Amy E
2016-12-23
High-quality immunoreagents enhance the performance and reproducibility of immunoassays and, in turn, the quality of both biological and clinical measurements. High quality recombinant immunoreagents are generated using antibody-phage display. One metric of antibody quality - the binding affinity - is quantified through the dissociation constant (KD) of each recombinant antibody and the target antigen. To characterize the KD of recombinant antibodies and target antigen, we introduce affinity electrophoretic mobility shift assays (EMSAs) in a high-throughput format suitable for small volume samples. A microfluidic card comprised of free-standing polyacrylamide gel (fsPAG) separation lanes supports 384 concurrent EMSAs in 30 s using a single power source. Sample is dispensed onto the microfluidic EMSA card by acoustic droplet ejection (ADE), which reduces EMSA variability compared to sample dispensing using manual or pin tools. The KD for each of a six-member fragment antigen-binding fragment library is reported using ~25-fold less sample mass and ~5-fold less time than conventional heterogeneous assays. Given the form factor and performance of this micro- and mesofluidic workflow, we have developed a sample-sparing, high-throughput, solution-phase alternative for biomolecular affinity characterization.
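For context, under a simple 1:1 binding model the fraction of antigen bound at antibody concentration c is c / (KD + c), so KD can be recovered from titration data by least squares. Below is a minimal stdlib sketch of that idea (the function names and grid bounds are assumptions; real affinity pipelines use proper nonlinear regression, not this brute-force fit):

```python
import math

def fraction_bound(conc, kd):
    """Equilibrium fraction of antigen bound at free antibody concentration
    `conc` (molar) for a 1:1 interaction: f = conc / (Kd + conc)."""
    return conc / (kd + conc)

def fit_kd(concs, fracs, lo=1e-12, hi=1e-3, steps=2000):
    """Recover Kd from titration data by brute-force least squares over a
    logarithmic grid of candidate Kd values."""
    best_kd, best_err = lo, float("inf")
    for i in range(steps + 1):
        # log-spaced candidate between lo and hi
        kd = math.exp(math.log(lo) + (math.log(hi) - math.log(lo)) * i / steps)
        err = sum((fraction_bound(c, kd) - f) ** 2 for c, f in zip(concs, fracs))
        if err < best_err:
            best_kd, best_err = kd, err
    return best_kd
```

A useful sanity check is that the fitted KD equals the concentration at which half the antigen is bound, which is how mobility-shift titrations are typically read out.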
High speed micromachining with high power UV laser
NASA Astrophysics Data System (ADS)
Patel, Rajesh S.; Bovatsek, James M.
2013-03-01
Increasing demand for creating fine features with high accuracy in the manufacturing of electronic mobile devices has fueled growth for lasers in manufacturing. High-power, high-repetition-rate ultraviolet (UV) lasers provide an opportunity to implement a cost-effective, high-quality, high-throughput micromachining process in a 24/7 manufacturing environment. The energy available per pulse and the pulse repetition frequency (PRF) of diode-pumped solid-state (DPSS) nanosecond UV lasers have increased steadily over the years. Efficient use of the available energy from a laser is important to generate accurate fine features at high speed with high quality. To achieve maximum material removal and minimal thermal damage in any laser micromachining application, use of the optimal process parameters, including energy density or fluence (J/cm2), pulse width, and repetition rate, is important. In this study we present the new high-power, high-PRF Quasar® 355-40 laser from Spectra-Physics with TimeShift™ technology for unique software-adjustable pulse width, pulse splitting, and pulse shaping capabilities. The benefits of these features for micromachining include improved throughput and quality. A specific example and results of silicon scribing are described to demonstrate the processing benefits of the Quasar's available power, PRF, and TimeShift technology.
Impact of image quality on OCT angiography based quantitative measurements.
Al-Sheikh, Mayss; Ghasemi Falavarjani, Khalil; Akil, Handan; Sadda, SriniVas R
2017-01-01
To study the impact of image quality on quantitative measurements and the frequency of segmentation error with optical coherence tomography angiography (OCTA). Seventeen eyes of 10 healthy individuals were included in this study. OCTA was performed using a swept-source device (Triton, Topcon). Each subject underwent three scanning sessions 1-2 min apart; the first two scans were obtained under standard conditions and for the third session, the image quality index was reduced using application of a topical ointment. En face OCTA images of the retinal vasculature were generated using the default segmentation for the superficial and deep retinal layer (SRL, DRL). Intraclass correlation coefficient (ICC) was used as a measure for repeatability. The frequency of segmentation error, motion artifact, banding artifact and projection artifact was also compared among the three sessions. The frequency of segmentation error and motion artifact was statistically similar between high and low image quality sessions (P = 0.707 and P = 1, respectively). However, the frequency of projection and banding artifact was higher with a lower image quality. The vessel density in the SRL was highly repeatable in the high image quality sessions (ICC = 0.8); however, the repeatability was low when comparing the high and low image quality measurements (ICC = 0.3). In the DRL, the repeatability of the vessel density measurements was fair in the high quality sessions (ICC = 0.6 and ICC = 0.5, with and without automatic artifact removal, respectively) and poor when comparing high and low image quality sessions (ICC = 0.3 and ICC = 0.06, with and without automatic artifact removal, respectively). The frequency of artifacts is higher and the repeatability of the measurements is lower with lower image quality. The impact of the image quality index should always be considered in OCTA based quantitative measurements.
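For reference, a repeatability ICC of the kind reported above can be computed from between- and within-subject mean squares. The sketch below implements the one-way random-effects form ICC(1,1); the abstract does not state which ICC variant was used, so this choice and all identifiers are assumptions:

```python
def icc_oneway(sessions):
    """One-way random-effects ICC(1,1): (MSB - MSW) / (MSB + (k-1) * MSW),
    where each row of `sessions` holds one subject's repeated measurements
    (e.g. vessel density from repeated scanning sessions)."""
    n = len(sessions)             # number of subjects (eyes)
    k = len(sessions[0])          # repeated sessions per subject
    grand = sum(sum(row) for row in sessions) / (n * k)
    row_means = [sum(row) / k for row in sessions]
    # between-subject mean square
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    # within-subject mean square
    msw = sum((x - row_means[i]) ** 2
              for i, row in enumerate(sessions) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

Perfectly reproducible sessions give an ICC of 1; measurements dominated by session-to-session noise drive the ICC toward zero or below, which matches the interpretation of the low ICC values reported when comparing high and low image quality scans.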
MOSAIK: a hash-based algorithm for accurate next-generation sequencing short-read mapping.
Lee, Wan-Ping; Stromberg, Michael P; Ward, Alistair; Stewart, Chip; Garrison, Erik P; Marth, Gabor T
2014-01-01
MOSAIK is a stable, sensitive and open-source program for mapping second and third-generation sequencing reads to a reference genome. Uniquely among current mapping tools, MOSAIK can align reads generated by all the major sequencing technologies, including Illumina, Applied Biosystems SOLiD, Roche 454, Ion Torrent and Pacific BioSciences SMRT. Indeed, MOSAIK was the only aligner to provide consistent mappings for all the generated data (sequencing technologies, low-coverage and exome) in the 1000 Genomes Project. To provide highly accurate alignments, MOSAIK employs a hash clustering strategy coupled with the Smith-Waterman algorithm. This method is well-suited to capture mismatches as well as short insertions and deletions. To support the growing interest in larger structural variant (SV) discovery, MOSAIK provides explicit support for handling known-sequence SVs, e.g. mobile element insertions (MEIs) as well as generating outputs tailored to aid in SV discovery. All variant discovery benefits from an accurate description of the read placement confidence. To this end, MOSAIK uses a neural-network based training scheme to provide well-calibrated mapping quality scores, demonstrated by a correlation coefficient between MOSAIK assigned and actual mapping qualities greater than 0.98. In order to ensure that studies of any genome are supported, a training pipeline is provided to ensure optimal mapping quality scores for the genome under investigation. MOSAIK is multi-threaded, open source, and incorporated into our command and pipeline launcher system GKNO (http://gkno.me).
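The hash-clustering idea the abstract describes, indexing reference k-mers and clustering seed hits into candidate alignment positions before a banded Smith-Waterman pass, can be sketched as follows (identifiers and the k value are illustrative assumptions, not MOSAIK's code; the Smith-Waterman extension step is omitted):

```python
def build_hash_index(reference, k=11):
    """Index every k-mer position in the reference (the 'hash' part of
    hash-based read mapping)."""
    index = {}
    for i in range(len(reference) - k + 1):
        index.setdefault(reference[i:i + k], []).append(i)
    return index

def candidate_positions(read, index, k=11):
    """Cluster k-mer seed hits by the reference start position they imply.
    Each cluster is a candidate region that a gapped aligner such as
    Smith-Waterman would then score in full."""
    votes = {}
    for j in range(len(read) - k + 1):
        for pos in index.get(read[j:j + k], []):
            start = pos - j                    # implied read start on the reference
            votes[start] = votes.get(start, 0) + 1
    # best-supported candidate start positions first
    return sorted(votes, key=votes.get, reverse=True)
```

Mismatches and small indels only knock out the seeds that overlap them, so the true position still accumulates the most votes; this is why seed clustering pairs well with a local aligner for the final gapped alignment.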
Joseph, Paul N; Sasson, Daniel A; Allen, Pablo E; Somjee, Ummat; Miller, Christine W
2016-07-01
Adverse conditions may be the norm rather than the exception in natural populations. Many populations experience poor nutrition on a seasonal basis. Further, brief interludes of inbreeding can be common as population density fluctuates and because of habitat fragmentation. Here, we investigated the effects of poor nutrition and inbreeding on traits that can be very important to reproductive success and fitness in males: testes mass, sperm concentration, and sperm viability. Our study species was Narnia femorata, a species introduced to north-central Florida in the 1950s. This species encounters regular, seasonal changes in diet that can have profound phenotypic effects on morphology and behavior. We generated inbred and outbred individuals through a single generation of full-sibling mating or outcrossing, respectively. All juveniles were provided a natural, high-quality diet of Opuntia humifusa cactus cladode with fruit until they reached adulthood. New adult males were put on a high- or low-quality diet for at least 21 days before measurements were taken. As expected, the low-quality diet led to significantly decreased testes mass in both inbred and outbred males, although there were surprisingly no detectable effects on sperm traits. We did not find evidence that inbreeding affected testes mass, sperm concentration, and sperm viability. Our results highlight the immediate and overwhelming effects of nutrition on testes mass, while suggesting that a single generation of inbreeding might not be detrimental for primary sexual traits in this particular population.
Modulated Modularity Clustering as an Exploratory Tool for Functional Genomic Inference
Stone, Eric A.; Ayroles, Julien F.
2009-01-01
In recent years, the advent of high-throughput assays, coupled with their diminishing cost, has facilitated a systems approach to biology. As a consequence, massive amounts of data are currently being generated, requiring efficient methodology aimed at the reduction of scale. Whole-genome transcriptional profiling is a standard component of systems-level analyses, and to reduce scale and improve inference clustering genes is common. Since clustering is often the first step toward generating hypotheses, cluster quality is critical. Conversely, because the validation of cluster-driven hypotheses is indirect, it is critical that quality clusters not be obtained by subjective means. In this paper, we present a new objective-based clustering method and demonstrate that it yields high-quality results. Our method, modulated modularity clustering (MMC), seeks community structure in graphical data. MMC modulates the connection strengths of edges in a weighted graph to maximize an objective function (called modularity) that quantifies community structure. The result of this maximization is a clustering through which tightly-connected groups of vertices emerge. Our application is to systems genetics, and we quantitatively compare MMC both to the hierarchical clustering method most commonly employed and to three popular spectral clustering approaches. We further validate MMC through analyses of human and Drosophila melanogaster expression data, demonstrating that the clusters we obtain are biologically meaningful. We show MMC to be effective and suitable to applications of large scale. In light of these features, we advocate MMC as a standard tool for exploration and hypothesis generation. PMID:19424432
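For context, the modularity objective that MMC maximizes is the standard Newman form Q = (1/2m) * sum_ij [A_ij - k_i k_j / (2m)] for all pairs i, j in the same community. The sketch below only evaluates Q for a fixed partition of a weighted graph; it is not the MMC optimization itself, nor its modulation of edge weights:

```python
def modularity(adj, communities):
    """Newman modularity Q of a weighted undirected graph.
    `adj` is a symmetric weight matrix (list of lists); `communities[i]` is
    the community label of vertex i."""
    n = len(adj)
    two_m = sum(sum(row) for row in adj)        # 2m: total weight counted both ways
    degree = [sum(row) for row in adj]          # weighted degree k_i
    q = 0.0
    for i in range(n):
        for j in range(n):
            if communities[i] == communities[j]:
                # observed weight minus the null-model expectation
                q += adj[i][j] - degree[i] * degree[j] / two_m
    return q / two_m
```

High Q means communities are more densely connected internally than a degree-preserving random graph would predict; two disconnected equal cliques, for example, score Q = 0.5, while lumping everything into one community scores 0.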
Defect detection on hardwood logs using high resolution three-dimensional laser scan data
Liya Thomas; Lamine Mili; Clifford A. Shaffer; Ed Thomas
2004-01-01
The location, type, and severity of external defects on hardwood logs and stems are the primary indicators of overall log quality and value. External defects provide hints about the internal log characteristics. Defect data would improve the sawyer's ability to process logs such that a higher-valued product (lumber) is generated. Using a high-resolution laser log...
Pros and Cons of Internet2 Videoconferencing as a New Generation Distance Education Tool
ERIC Educational Resources Information Center
Ozkan, Betul C.
2005-01-01
Internet2 is one of the newer ways of videoconferencing in American universities. Over 200 universities in the United States collaborate with each other through these high-quality Internet lines. K-12 schools and libraries nationwide have also started taking advantage of this high-capacity, high-speed fiber-optic network. However, the term Internet2,…
A statistical model for forecasting hourly ozone levels during fire season
Haiganoush K. Preisler; Shiyuan (Sharon) Zhong; Annie Esperanza; Leland Tarnay; Julide Kahyaoglu-Koracin
2009-01-01
Concerns about smoke from large high-intensity and managed low-intensity fires have been increasing during the past decade. Because smoke from large high-intensity fires is known to contain and generate secondary fine particles (PM2.5) and ozone precursors, the effect of fires on air quality in the southern Sierra Nevada is a serious management...
ERIC Educational Resources Information Center
Ward, Tony J.; Delaloye, Naomi; Adams, Earle Raymond; Ware, Desirae; Vanek, Diana; Knuth, Randy; Hester, Carolyn Laurie; Marra, Nancy Noel; Holian, Andrij
2016-01-01
"Air Toxics Under the Big Sky" is an environmental science outreach/education program that incorporates the Next Generation Science Standards (NGSS) 8 Practices with the goal of promoting knowledge and understanding of authentic scientific research in high school classrooms through air quality research. This research explored: (1)…
Physicochemical assessment criteria for high-voltage pulse capacitors
NASA Astrophysics Data System (ADS)
Darian, L. A.; Lam, L. Kh.
2016-12-01
In the paper, the applicability of the decomposition products of the internal insulation of high-voltage pulse capacitors, produced as the insulation ages, is considered. These decomposition products can be used to evaluate capacitor quality in operation and in service. As with power transformers, three generations of insulation-aging markers can be distinguished, and the area of applicability of such markers, well studied for power transformers, can be extended to high-voltage pulse capacitors. The research reveals a correlation between the composition and quantities of first-generation markers (gaseous decomposition products of insulation dissolved in the insulating liquid) and the remaining life of high-voltage pulse capacitors. Applying aging markers to evaluate the remaining service life of such capacitors is a promising direction of research, because the capacitor design keeps the insulation-aging markers stable within the device. It is necessary to continue gathering statistical data on the development of first-generation markers, and to carry out research aimed at estimating the remaining life of capacitors using markers of the second and third generations.
Real-time monitoring of laser welding of galvanized high strength steel in lap joint configuration
NASA Astrophysics Data System (ADS)
Kong, Fanrong; Ma, Junjie; Carlson, Blair; Kovacevic, Radovan
2012-10-01
Two different cases regarding the zinc coating at the lap-joint faying surface are selected for studying the influence of zinc vapor on the keyhole dynamics of the weld pool and the final welding quality. One case has the zinc coating fully removed at the faying surface, while the other retains the zinc coating on the faying surface. It is found that removal of the zinc coating at the faying surface produces a significantly better weld quality, as exemplified by a lack of spatter, whereas intense spatter is present when the zinc coating remains at the faying surface. Spectroscopy is used to detect the optical spectra emitted from a laser-generated plasma plume during the laser welding of galvanized high strength DP980 steel in a lap-joint configuration. A correlation between the electron temperature and defects within the weld bead is identified by using the Boltzmann plot method. The laser weld pool keyhole dynamic behavior, affected by the high-pressure zinc vapor generated at the faying surface of the galvanized steel lap joint, is monitored in real time by a high-speed charge-coupled device (CCD) camera assisted with a green laser as an illumination source.
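The Boltzmann plot method mentioned above estimates electron temperature from the relative intensities of emission lines: plotting ln(Iλ/gA) against upper-level energy gives a line whose slope is −1/(k_B·T). The sketch below illustrates the arithmetic only; the line data are invented placeholders, not spectra from the paper.

```python
# Boltzmann plot: fit ln(I * lambda / (g * A)) vs upper-level energy E;
# the slope equals -1 / (k_B * T). Line data below are illustrative.
import numpy as np

K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K

# Per line: intensity I (arb.), wavelength lam (nm), statistical weight g,
# transition probability A (1/s), upper-level energy E (eV). All assumed.
lines = [
    (1000.0, 500.0, 5, 1.0e8, 3.0),
    (400.0,  520.0, 7, 0.8e8, 4.0),
    (150.0,  540.0, 9, 0.6e8, 5.0),
]

I, lam, g, A, E = (np.array(c, dtype=float) for c in zip(*lines))
y = np.log(I * lam / (g * A))   # Boltzmann plot ordinate
slope, _ = np.polyfit(E, y, 1)  # linear fit; slope = -1 / (k_B * T)
T = -1.0 / (K_B_EV * slope)
print(f"Estimated electron temperature: {T:.0f} K")
```

A rising electron temperature along the weld is one of the signals the authors correlate with weld-bead defects.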
Novel Serial Positive Enrichment Technology Enables Clinical Multiparameter Cell Sorting
Tschulik, Claudia; Piossek, Christine; Bet, Jeannette; Yamamoto, Tori N.; Schiemann, Matthias; Neuenhahn, Michael; Martin, Klaus; Schlapschy, Martin; Skerra, Arne; Schmidt, Thomas; Edinger, Matthias; Riddell, Stanley R.; Germeroth, Lothar; Busch, Dirk H.
2012-01-01
A general obstacle for clinical cell preparations is limited purity, which causes variability in the quality and potency of cell products and might be responsible for negative side effects due to unwanted contaminants. Highly pure populations can be obtained best using positive selection techniques. However, in many cases target cell populations need to be segregated from other cells by combinations of multiple markers, which is still difficult to achieve – especially for clinical cell products. Therefore, we have generated low-affinity antibody-derived Fab-fragments, which stain like parental antibodies when multimerized via Strep-tag and Strep-Tactin, but can subsequently be removed entirely from the target cell population. Such reagents can be generated for virtually any antigen and can be used for sequential positive enrichment steps via paramagnetic beads. First protocols for multiparameter enrichment of two clinically relevant cell populations, CD4high/CD25high/CD45RAhigh ‘regulatory T cells’ and CD8high/CD62Lhigh/CD45RAneg ‘central memory T cells’, have been established to determine quality and efficacy parameters of this novel technology, which should have broad applicability for clinical cell sorting as well as basic research. PMID:22545138
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Y; Tan, J; Jiang, S
Purpose: High dose rate (HDR) brachytherapy treatment planning is conventionally performed in a manual fashion. Yet it is highly desirable to perform computerized automated planning to improve treatment planning efficiency, eliminate human errors, and reduce plan quality variation. The goal of this research is to develop an automatic treatment planning tool for HDR brachytherapy with a cylinder applicator for vaginal cancer. Methods: After inserting the cylinder applicator into the patient, a CT scan was acquired and loaded into in-house developed treatment planning software. The cylinder applicator was automatically segmented using image-processing techniques. The CTV was generated based on user-specified treatment depth and length. Locations of relevant points (apex point, prescription point, and vaginal surface point), central applicator channel coordinates, and dwell positions were determined according to their geometric relations with the applicator. Dwell times were computed through an inverse optimization process. The planning information was written into DICOM-RT plan and structure files to transfer the automatically generated plan to a commercial treatment planning system for plan verification and delivery. Results: We tested the system retrospectively in nine patients treated with a vaginal cylinder applicator. These cases were selected with different treatment prescriptions, lengths, depths, and cylinder diameters to represent a large patient population. Our system was able to generate treatment plans for these cases with clinically acceptable quality. Computation time varied from 3 to 6 min. Conclusion: We have developed a system to perform automated treatment planning for HDR brachytherapy with a cylinder applicator. This novel system has greatly improved treatment planning efficiency and reduced plan quality variation. It also serves as a testbed to demonstrate the feasibility of automatic HDR treatment planning for more complicated cases.
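The inverse optimization step described in the Methods can be sketched as a nonnegative least-squares problem: choose dwell times t ≥ 0 so that the dose-rate matrix times t matches the prescription at reference points. This is a generic sketch with a synthetic random matrix, not the authors' optimizer or TG-43 dose data.

```python
# Dwell-time inverse optimization sketch: solve min ||D @ t - d||_2
# subject to t >= 0, where D maps unit dwell times to dose at
# reference points. D here is a synthetic placeholder.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_points, n_dwells = 8, 5
D = rng.uniform(0.5, 2.0, size=(n_points, n_dwells))  # dose rate per unit dwell time
prescription = np.full(n_points, 5.0)                 # target dose (Gy) at each point

t, residual = nnls(D, prescription)                   # nonnegativity enforced by NNLS
achieved = D @ t
print("dwell times (s):", np.round(t, 3))
print("residual norm:", round(residual, 3))
```

Clinical optimizers add many more terms (organ-at-risk penalties, dwell-time smoothness), but the nonnegativity constraint and least-squares core are the same shape of problem.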
Blick, Rachel N; Litz, Katherine S; Thornhill, Monica G; Goreczny, Anthony J
2016-01-01
More individuals with an intellectual disability now possess prerequisite skills and supports necessary for successful work force integration than did previous generations. The current study compared quality of life of community-integrated workers with those participating in sheltered vocational workshops and adult day care programs. We considered numerous indices of quality of life, including inclusion and community participation; satisfaction within professional services, home life, and day activities; dignity, rights, and respect received from others; fear; choice and control; and family satisfaction. Our data revealed several important differences in quality of life across daytime activities; participants involved in community-integrated employment tended to be younger, indicated a greater sense of community integration, and reported more financial autonomy than did those who participated in adult day care programs and sheltered workshops. However, individuals reported no differences in overall satisfaction across daytime activities. We discuss generational differences across employment status as well as possible explanations to account for high levels of satisfaction across daytime activities. Copyright © 2016 Elsevier Ltd. All rights reserved.
Development of forensic-quality full mtGenome haplotypes: success rates with low template specimens.
Just, Rebecca S; Scheible, Melissa K; Fast, Spence A; Sturk-Andreaggi, Kimberly; Higginbotham, Jennifer L; Lyons, Elizabeth A; Bush, Jocelyn M; Peck, Michelle A; Ring, Joseph D; Diegoli, Toni M; Röck, Alexander W; Huber, Gabriela E; Nagl, Simone; Strobl, Christina; Zimmermann, Bettina; Parson, Walther; Irwin, Jodi A
2014-05-01
Forensic mitochondrial DNA (mtDNA) testing requires appropriate, high quality reference population data for estimating the rarity of questioned haplotypes and, in turn, the strength of the mtDNA evidence. Available reference databases (SWGDAM, EMPOP) currently include information from the mtDNA control region; however, novel methods that quickly and easily recover mtDNA coding region data are becoming increasingly available. Though these assays promise to both facilitate the acquisition of mitochondrial genome (mtGenome) data and maximize the general utility of mtDNA testing in forensics, the appropriate reference data and database tools required for their routine application in forensic casework are lacking. To address this deficiency, we have undertaken an effort to: (1) increase the large-scale availability of high-quality entire mtGenome reference population data, and (2) improve the information technology infrastructure required to access/search mtGenome data and employ them in forensic casework. Here, we describe the application of a data generation and analysis workflow to the development of more than 400 complete, forensic-quality mtGenomes from low DNA quantity blood serum specimens as part of a U.S. National Institute of Justice funded reference population databasing initiative. We discuss the minor modifications made to a published mtGenome Sanger sequencing protocol to maintain a high rate of throughput while minimizing manual reprocessing with these low template samples. The successful use of this semi-automated strategy on forensic-like samples provides practical insight into the feasibility of producing complete mtGenome data in a routine casework environment, and demonstrates that large (>2kb) mtDNA fragments can regularly be recovered from high quality but very low DNA quantity specimens. 
Further, the detailed empirical data we provide on amplification success rates across a range of DNA input quantities will be useful moving forward as PCR-based strategies for mtDNA enrichment are considered for targeted next-generation sequencing workflows. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
ARIA: Delivering state-of-the-art InSAR products to end users
NASA Astrophysics Data System (ADS)
Agram, P. S.; Owen, S. E.; Hua, H.; Manipon, G.; Sacco, G. F.; Bue, B. D.; Fielding, E. J.; Yun, S. H.; Simons, M.; Webb, F.; Rosen, P. A.; Lundgren, P.; Liu, Z.
2016-12-01
The Advanced Rapid Imaging and Analysis (ARIA) Center for Natural Hazards aims to bring state-of-the-art geodetic imaging capabilities to an operational level in support of local, national, and international hazard response communities. The ARIA project's first foray into operational generation of InSAR products was the Calimap Project, in collaboration with ASI-CIDOT, using X-band data from the Cosmo-SkyMed constellation. Over the last year, ARIA's processing infrastructure has been significantly upgraded to exploit the free stream of high-quality C-band SAR data from ESA's Sentinel-1 mission and related algorithmic improvements to the ISCE software. ARIA's data system can now operationally generate geocoded unwrapped phase and coherence products in GIS-friendly formats from Sentinel-1 TOPS mode data in an automated fashion, and this capability is currently being exercised at various study sites including Hawaii, Central California, Iceland and South America. The ARIA team, building on the experience gained from handling X-band and C-band data, has also built an automated machine-learning-based classifier to label the auto-generated interferograms by phase-unwrapping quality. These high-quality, "time-series ready" InSAR products, generated using state-of-the-art processing algorithms, can be accessed by end users through two mechanisms: (1) a faceted-search interface that includes browse imagery for quick visualization, and (2) an ElasticSearch-based API to enable bulk automated download, post-processing and time-series analysis. In this talk, we will present InSAR results from various global events to which the ARIA system has responded. We will also discuss the set of geospatial big-data tools, including GIS libraries and API tools, that end users will need to familiarize themselves with in order to maximize the utilization of the continuous stream of InSAR products that the ARIA project will generate from the Sentinel-1 and NISAR missions.
Teaching adolescents with learning disabilities to generate and use task-specific strategies.
Ellis, E S; Deshler, D D; Schumaker, J B
1989-02-01
The effects of an intervention designed to enhance students' roles as control agents for strategic functioning were investigated. The goal was to increase the ability of students labeled learning disabled to generate new strategies or adapt existing task-specific strategies for meeting varying demands of the regular classroom. Measures were taken in three areas: (a) metacognitive knowledge related to generating or adapting strategies, (b) ability to generate problem-solving strategies for novel problems, and (c) the effects of the intervention on students' regular classroom grades and teachers' perceptions of the students' self-reliance and work quality. A multiple baseline across subjects design was used. The intervention resulted in dramatic increases in the subjects' verbal expression of metacognitive knowledge and ability to generate task-specific strategies. Students' regular class grades increased; for those students who did not spontaneously generalize use of the strategy to problems encountered in these classes, providing instruction to target specific classes resulted in improved grades. Teacher perceptions of students' self-reliance and work quality did not change, probably because baseline measures were already high in both areas. Implications for instruction and future research are discussed.
NASA Astrophysics Data System (ADS)
Friedrich, Axel; Raabe, Helmut; Schiefele, Jens; Doerr, Kai Uwe
1999-07-01
In future aircraft cockpit designs, SVS (Synthetic Vision System) databases will be used to display 3D physical and virtual information to pilots. In contrast to pure warning systems (TAWS, MSAW, EGPWS), an SVS serves to enhance pilot spatial awareness through 3-dimensional perspective views of the objects in the environment. Therefore all kinds of aeronautically relevant data have to be integrated into the SVS database: navigation data, terrain data, obstacles and airport data. For the integration of all these data, the concept of a GIS (Geographical Information System) based HQDB (High-Quality Database) has been created at the TUD (Technical University Darmstadt). To enable database certification, quality-assessment procedures according to ICAO Annex 4, 11, 14 and 15 and RTCA DO-200A/EUROCAE ED-76 were established in the concept. They can be differentiated into object-related quality-assessment methods, following the keywords accuracy, resolution, timeliness, traceability, assurance level, completeness and format, and GIS-related quality-assessment methods, with the keywords system tolerances, logical consistency and visual quality assessment. An airport database is integrated in the concept as part of the High-Quality Database. The contents of the HQDB are chosen so that they support both flight-guidance SVS and other aeronautical applications such as SMGCS (Surface Movement Guidance and Control Systems) and flight simulation. Most airport data are not yet available. Even though data for runways, thresholds, taxilines and parking positions were to be generated by the end of 1997 (ICAO Annex 11 and 15), only a few countries fulfilled these requirements. For that reason, methods of creating and certifying airport data have to be found. Remote sensing and digital photogrammetry serve as means to acquire large amounts of airport objects with high spatial resolution and accuracy in much shorter time than with classical surveying methods.
Remotely sensed images can be acquired from satellite or aircraft platforms. To achieve the most demanding horizontal accuracy requirement stated in ICAO Annex 14, 0.50 meters for runway centerlines, at present only images acquired from aircraft-based sensors can be used as source data. Still, ground reference by GCPs (ground control points) is obligatory. A DEM (Digital Elevation Model) can be created automatically in the photogrammetric process and used as a highly accurate elevation model for the airport area. The final verification of airport data is accomplished with independently surveyed runway and taxiway control points. The concept of generating airport data by means of remote sensing and photogrammetry was tested with the Stuttgart/Germany airport. The results proved that the final accuracy was within the accuracy specification defined by ICAO Annex 14.
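The verification step above, comparing database coordinates against independently surveyed control points, amounts to a horizontal RMSE check against the stated ICAO Annex 14 figure of 0.50 m for runway centerlines. The sketch below uses invented coordinates, not the Stuttgart survey data.

```python
# Horizontal accuracy check: per-point planimetric error between
# surveyed control points and database coordinates, then RMSE
# compared with the 0.50 m runway-centerline requirement.
# Coordinates (meters, local grid) are illustrative placeholders.
import numpy as np

surveyed = np.array([[100.00, 200.00], [150.00, 260.00], [210.00, 330.00]])
database = np.array([[100.21, 199.88], [149.85, 260.19], [210.10, 329.77]])

residuals = np.linalg.norm(database - surveyed, axis=1)  # per-point error, m
rmse = float(np.sqrt(np.mean(residuals ** 2)))
print(f"horizontal RMSE: {rmse:.2f} m, meets 0.50 m spec: {rmse <= 0.50}")
```

In practice the check would run over many control points per runway and taxiway, but the per-point Euclidean residual and RMSE aggregation are the standard form.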
High Spectral Resolution LIDAR as a Tool for Air Quality Research
NASA Astrophysics Data System (ADS)
Eloranta, E. W.; Spuler, S.; Hayman, M. M.
2017-12-01
Many aspects of air quality research require information on the vertical distribution of pollution. Traditional measurements, obtained from surface-based samplers or passive satellite remote sensing, do not provide vertical profiles. Lidar can provide profiles of aerosol properties; however, traditional backscatter lidar suffers from uncertain calibrations and poorly constrained algorithms. These problems are avoided by High Spectral Resolution Lidar (HSRL), which provides absolutely calibrated vertical profiles of aerosol properties. The University of Wisconsin HSRL systems measure 532 nm wavelength aerosol backscatter cross-sections, extinction cross-sections, depolarization, and attenuated 1064 nm backscatter. These instruments are designed for long-term deployment at remote sites with minimal local support. Processed data are provided for public viewing and download in real time on our web site "http://hsrl.ssec.wisc.edu". Air pollution applications of HSRL data will be illustrated with examples acquired during air quality field programs including KORUS-AQ, DISCOVER-AQ, LAMOS and FRAPPE. Observations include (1) long-range transport of dust, air pollution and smoke; (2) fumigation episodes in which elevated pollution is mixed down to the surface; (3) visibility restrictions by aerosols; and (4) diurnal variations in atmospheric optical depth. While HSRL is a powerful air quality research tool, its application in routine measurement networks is hindered by the high cost of current systems. Recent technical advances promise a next-generation HSRL using telecom components to greatly reduce system cost. This paper will present data generated by a prototype low-cost system constructed at NCAR. In addition to lower cost, operation at a non-visible near-infrared wavelength (780 nm) removes all FAA restrictions on operation.
Playing Chemical Plant Environmental Protection Games with Historical Monitoring Data.
Zhu, Zhengqiu; Chen, Bin; Reniers, Genserik; Zhang, Laobing; Qiu, Sihang; Qiu, Xiaogang
2017-09-29
The chemical industry is very important for the world economy, and this industrial sector represents a substantial income source for developing countries. However, existing regulations on controlling atmospheric pollutants, and the enforcement of these regulations, are often insufficient in such countries. As a result, the deterioration of surrounding ecosystems and a quality decrease of the atmospheric environment can be observed. Previous works in this domain fail to generate executable and pragmatic solutions for inspection agencies due to practical challenges. In addressing these challenges, we introduce a so-called Chemical Plant Environment Protection Game (CPEP) to generate reasonable schedules for high-accuracy air quality monitoring stations (i.e., daily management plans) for inspection agencies. First, so-called Stackelberg Security Games (SSGs) in conjunction with source estimation methods are applied in this research. Second, high-accuracy air quality monitoring stations as well as gas sensor modules are modeled in the CPEP game. Third, simplified data analysis of the regular discharge behavior of chemical plants is utilized to construct the CPEP game. Finally, an illustrative case study is used to investigate the effectiveness of the CPEP game, and a realistic case study is conducted to illustrate how the models and algorithms proposed in this paper work in daily practice. Results show that playing a CPEP game can reduce the operational costs of high-accuracy air quality monitoring stations. Moreover, evidence suggests that playing the game leads to more compliance from the chemical plants towards the inspection agencies. Therefore, the CPEP game is able to assist the environmental protection authorities in daily management work and reduce the potential risks of gaseous-pollutant dispersion incidents.
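The Stackelberg flavor of such scheduling can be illustrated with a maximin linear program: the inspection agency (leader) randomizes where it monitors, anticipating that a non-compliant plant (follower) will exploit the least-covered opportunity. This is a generic security-game sketch, not the paper's CPEP model; the per-plant detection values are assumed.

```python
# Maximin randomized monitoring schedule: choose coverage probabilities
# x_i (summing to 1) to maximize the worst-case detection value z,
# i.e. max z subject to u_i * x_i >= z. Solved as an LP.
import numpy as np
from scipy.optimize import linprog

u = np.array([0.9, 0.6, 0.3])  # detection effectiveness per plant (assumed)
n = len(u)

# Decision variables: [x_1, ..., x_n, z]; maximize z => minimize -z.
c = np.zeros(n + 1); c[-1] = -1.0
A_ub = np.hstack([-np.diag(u), np.ones((n, 1))])  # z - u_i * x_i <= 0
b_ub = np.zeros(n)
A_eq = np.ones((1, n + 1)); A_eq[0, -1] = 0.0     # sum of x_i = 1
b_eq = [1.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * n + [(0, None)])
x, z = res.x[:n], res.x[-1]
print("coverage:", np.round(x, 3), "maximin detection:", round(z, 3))
```

Note the qualitative behavior: the harder-to-detect plant (u = 0.3) receives the largest share of monitoring effort, which is the equalizing logic behind randomized inspection schedules.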
Zhang, Chenchu; Hu, Yanlei; Du, Wenqiang; Wu, Peichao; Rao, Shenglong; Cai, Ze; Lao, Zhaoxin; Xu, Bing; Ni, Jincheng; Li, Jiawen; Zhao, Gang; Wu, Dong; Chu, Jiaru; Sugioka, Koji
2016-09-13
Rapid integration of high-quality functional devices in microchannels is in high demand for miniature lab-on-a-chip applications. This paper demonstrates the embellishment of existing microfluidic devices with integrated micropatterns via femtosecond-laser MRAF-based holographic patterning (MHP) microfabrication, which proves two-photon polymerization (TPP) based on a spatial light modulator (SLM) to be a rapid and powerful technology for chip functionalization. An optimized mixed-region amplitude freedom (MRAF) algorithm has been used to generate a high-quality shaped focal field. Based on the optimized parameters, a single-exposure approach is developed to fabricate 200 × 200 μm microstructure arrays in less than 240 ms. Moreover, microtraps, a QR code and letters are integrated into a microdevice by the advanced method for particle capture and device identification. These results indicate that such holographic laser embellishment of microfluidic devices is simple, flexible and easy to access, and has great potential in lab-on-a-chip applications of biological culture, chemical analyses and optofluidic devices.
Enhancement of digital radiography image quality using a convolutional neural network.
Sun, Yuewen; Li, Litao; Cong, Peng; Wang, Zhentao; Guo, Xiaojing
2017-01-01
Digital radiography systems are widely used for noninvasive security checks and medical imaging examinations. However, such systems are limited by lower image quality in terms of spatial resolution and signal-to-noise ratio. In this study, we explored whether the image quality acquired by a digital radiography system can be improved with a modified convolutional neural network that generates high-resolution images with reduced noise from the original low-quality images. An experiment on a test dataset of 5 X-ray images showed that the proposed method outperformed traditional methods (i.e., bicubic interpolation and the 3D block-matching approach) by about 1.3 dB in peak signal-to-noise ratio (PSNR), while keeping processing time within one second. Experimental results demonstrated that a residual-to-residual (RTR) convolutional neural network remarkably improved the image quality of object structural details by increasing image resolution and reducing image noise. Thus, this study indicates that applying this RTR convolutional neural network is useful for improving the image quality acquired by digital radiography systems.
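The PSNR figure of merit used above is a simple function of mean squared error against a reference image. A minimal sketch follows; the synthetic arrays stand in for the paper's X-ray test set, and the noise levels are arbitrary.

```python
# Peak signal-to-noise ratio (PSNR) between a reference image and a
# reconstruction: PSNR = 10 * log10(MAX^2 / MSE), in dB.
import numpy as np

def psnr(reference: np.ndarray, test: np.ndarray, max_val: float = 255.0) -> float:
    """PSNR in dB between two same-shaped images; inf for identical images."""
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / mse)

rng = np.random.default_rng(1)
ref = rng.integers(0, 256, size=(64, 64)).astype(float)
noisy = ref + rng.normal(0, 10, size=ref.shape)     # degraded input
denoised = ref + rng.normal(0, 5, size=ref.shape)   # smaller residual error

print(f"noisy:    {psnr(ref, noisy):.2f} dB")
print(f"denoised: {psnr(ref, denoised):.2f} dB")
```

A gain of about 1.3 dB, as reported, corresponds to roughly a 26% reduction in mean squared error.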
Shao, Wenguang; Pedrioli, Patrick G A; Wolski, Witold; Scurtescu, Cristian; Schmid, Emanuel; Courcelles, Mathieu; Schuster, Heiko; Kowalewski, Daniel; Marino, Fabio; Arlehamn, Cecilia S L; Vaughan, Kerrie; Peters, Bjoern; Sette, Alessandro; Ottenhoff, Tom H M; Meijgaarden, Krista E; Nieuwenhuizen, Natalie; Kaufmann, Stefan H E; Schlapbach, Ralph; Castle, John C; Nesvizhskii, Alexey I; Nielsen, Morten; Deutsch, Eric W; Campbell, David S; Moritz, Robert L; Zubarev, Roman A; Ytterberg, Anders Jimmy; Purcell, Anthony W; Marcilla, Miguel; Paradela, Alberto; Wang, Qi; Costello, Catherine E; Ternette, Nicola; van Veelen, Peter A; van Els, Cécile A C M; de Souza, Gustavo A; Sollid, Ludvig M; Admon, Arie; Stevanovic, Stefan; Rammensee, Hans-Georg; Thibault, Pierre; Perreault, Claude; Bassani-Sternberg, Michal
2018-01-01
Mass spectrometry (MS)-based immunopeptidomics investigates the repertoire of peptides presented at the cell surface by major histocompatibility complex (MHC) molecules. The broad clinical relevance of MHC-associated peptides, e.g. in precision medicine, provides a strong rationale for the large-scale generation of immunopeptidomic datasets and recent developments in MS-based peptide analysis technologies now support the generation of the required data. Importantly, the availability of diverse immunopeptidomic datasets has resulted in an increasing need to standardize, store and exchange this type of data to enable better collaborations among researchers, to advance the field more efficiently and to establish quality measures required for the meaningful comparison of datasets. Here we present the SysteMHC Atlas (https://systemhcatlas.org), a public database that aims at collecting, organizing, sharing, visualizing and exploring immunopeptidomic data generated by MS. The Atlas includes raw mass spectrometer output files collected from several laboratories around the globe, a catalog of context-specific datasets of MHC class I and class II peptides, standardized MHC allele-specific peptide spectral libraries consisting of consensus spectra calculated from repeat measurements of the same peptide sequence, and links to other proteomics and immunology databases. The SysteMHC Atlas project was created and will be further expanded using a uniform and open computational pipeline that controls the quality of peptide identifications and peptide annotations. Thus, the SysteMHC Atlas disseminates quality controlled immunopeptidomic information to the public domain and serves as a community resource toward the generation of a high-quality comprehensive map of the human immunopeptidome and the support of consistent measurement of immunopeptidomic sample cohorts. PMID:28985418
Microstructured fibres: a positive impact on defence technology?
NASA Astrophysics Data System (ADS)
O'Driscoll, E. J.; Watson, M. A.; Delmonte, T.; Petrovich, M. N.; Feng, X.; Flanagan, J. C.; Hayes, J. R.; Richardson, D. J.
2006-09-01
In this paper we seek to assess the potential impact of microstructured fibres for security and defence applications. Recent literature has presented results on using microstructured fibre for the delivery of high-power, high-quality radiation and also on the use of microstructured fibre for broadband source generation. Whilst these two applications may appear contradictory to one another, the inherent design flexibility of microstructured fibres allows fibres to be fabricated for the specific application requirements, either minimising (for delivery) or maximising (for broadband source generation) the nonlinear effects. In platform-based laser applications such as infrared countermeasures, remote sensing and laser directed-energy weapons, a suitable delivery fibre providing high-power, high-quality light delivery would allow a laser to be sited remotely from the sensor/device head. This opens up the possibility of several sensor/device types sharing the same multi-functional laser, thus reducing the complexity and hence the cost of such systems. For applications requiring broadband source characteristics, microstructured fibres can also offer advantages over conventional sources. By exploiting the nonlinear effects it is possible to realise a multifunctional source for applications such as active hyperspectral imaging, countermeasures, and biochemical sensing. These recent results suggest enormous potential for these novel fibre types to influence the next generation of photonic systems for security and defence applications. However, it is important to establish where the fibres can offer the greatest advantages and what research still needs to be done to drive the technology towards real platform solutions.
Integrity of high-velocity water slug generated by an impacting technique
NASA Astrophysics Data System (ADS)
Dehkhoda, Sevda; Bourne, Neil
2013-06-01
A pulsed water jet is a series of discrete water slugs travelling at high velocity. Immediately after striking a target, these slugs apply a high-intensity, short-duration transient stress known as the water hammer pressure, followed by a low-intensity, long-duration stationary stress at the stagnation pressure. The magnitude and duration of the water hammer and stagnation pressures are controlled by the size and quality of the water slugs. The use of water jets for rock cutting in mining operations is a centuries-old technology; however, practical methods for repeatedly producing high-energy water slugs have proven difficult, partly because the geometrical properties of a jet, and so its effectiveness in creating damage, are controlled and influenced by the method employed to generate the water slugs. This paper investigates the integrity of a single water slug produced using an impacting technique in which a hammer strikes a piston resting on top of a water-filled chamber. The coherence of the generated water pulse was of concern in this study: if repeated shock reflections within the chamber were transmitted or carried into the internal geometry of the nozzle, the emerging jet could pulsate. The impact impulse of the formed water jet was measured in a Kel-F target material using an embedded PVDF (polyvinylidene fluoride) shock gauge. The recorded stress waveform was then used to study the quality and endurance of the water pulse stream as it travelled through air.
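The two pressure regimes described above follow standard relations: the transient water hammer pressure is given by the Joukowsky relation p = ρcv, and the subsequent stagnation pressure by p = ½ρv². A quick numeric sketch, with the slug impact velocity an assumed example value:

```python
# Water hammer (Joukowsky) vs. stagnation pressure for a water slug.
RHO = 1000.0   # water density, kg/m^3
C = 1480.0     # speed of sound in water, m/s
v = 200.0      # slug impact velocity, m/s (assumed example)

p_hammer = RHO * C * v        # transient water-hammer pressure, Pa
p_stag = 0.5 * RHO * v ** 2   # steady stagnation pressure, Pa

print(f"water hammer: {p_hammer/1e6:.0f} MPa, stagnation: {p_stag/1e6:.0f} MPa")
```

The order-of-magnitude gap between the two (here 296 MPa vs. 20 MPa) is why the short water-hammer transient dominates damage initiation in the target.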
Constrained CVT meshes and a comparison of triangular mesh generators
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Hoa; Burkardt, John; Gunzburger, Max
2009-01-01
Mesh generation in regions in Euclidean space is a central task in computational science, and especially for commonly used numerical methods for the solution of partial differential equations, e.g., finite element and finite volume methods. We focus on the uniform Delaunay triangulation of planar regions and, in particular, on how one selects the positions of the vertices of the triangulation. We discuss a recently developed method, based on the centroidal Voronoi tessellation (CVT) concept, for effecting such triangulations and present two algorithms, including one new one, for CVT-based grid generation. We also compare several methods, including CVT-based methods, for triangulating planar domains. To this end, we define several quantitative measures of the quality of uniform grids. We then generate triangulations of several planar regions, including some having complexities that are representative of what one may encounter in practice. We subject the resulting grids to visual and quantitative comparisons and conclude that all the methods considered produce high-quality uniform grids and that the CVT-based grids are at least as good as any of the others.
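The CVT concept mentioned above is commonly computed by Lloyd-style iteration; a minimal probabilistic sketch for the unit square (an illustration of the idea, not the paper's own algorithms) might look like:

```python
import numpy as np

def lloyd_cvt(generators, n_samples=20000, iters=50, seed=0):
    """Approximate a centroidal Voronoi tessellation of the unit square by
    probabilistic Lloyd iteration: assign dense random samples to their
    nearest generator, then move each generator to its cluster centroid.
    A toy sketch of the CVT idea, not the paper's algorithms."""
    rng = np.random.default_rng(seed)
    g = np.array(generators, dtype=float)
    for _ in range(iters):
        s = rng.random((n_samples, 2))                      # dense samples
        d = ((s[:, None, :] - g[None, :, :]) ** 2).sum(-1)  # squared distances
        owner = d.argmin(axis=1)                            # Voronoi assignment
        for k in range(len(g)):
            mask = owner == k
            if mask.any():
                g[k] = s[mask].mean(axis=0)                 # centroid update
    return g

pts = lloyd_cvt(np.random.default_rng(1).random((10, 2)))
```

Vertices of a uniform Delaunay triangulation can then be taken at the converged generator positions, whose Voronoi cells are close to congruent.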
Stable generation of GeV-class electron beams from self-guided laser-plasma channels
NASA Astrophysics Data System (ADS)
Hafz, Nasr A. M.; Jeong, Tae Moon; Choi, Il Woo; Lee, Seong Ku; Pae, Ki Hong; Kulagin, Victor V.; Sung, Jae Hee; Yu, Tae Jun; Hong, Kyung-Han; Hosokai, Tomonao; Cary, John R.; Ko, Do-Kyeong; Lee, Jongmin
2008-09-01
Table-top laser-driven plasma accelerators are gaining attention for their potential use in miniaturizing future high-energy accelerators. By irradiating gas jet targets with ultrashort intense laser pulses, the generation of quasimonoenergetic electron beams was recently observed. Currently, the stability of beam generation and the ability to scale to higher electron beam energies are critical issues for practical laser acceleration. Here, we demonstrate the first generation of stable GeV-class electron beams from stable few-millimetre-long plasma channels in a self-guided wakefield acceleration process. As primary evidence of the laser wakefield acceleration in a bubble regime, we observed a boost of both the electron beam energy and quality by reducing the plasma density and increasing the plasma length in a 1-cm-long gas jet. Subsequent three-dimensional simulations show the possibility of achieving even higher electron beam energies by minimizing plasma bubble elongation, and we anticipate dramatic increases in beam energy and quality in the near future. This will pave the way towards ultracompact, all-optical electron beam accelerators and their applications in science, technology and medicine.
NASA Technical Reports Server (NTRS)
2000-01-01
Video Pics is a software program that generates high-quality photos from video. The software was developed under an SBIR contract with Marshall Space Flight Center by Redhawk Vision, Inc.--a subsidiary of Irvine Sensors Corporation. Video Pics takes information content from multiple frames of video and enhances the resolution of a selected frame. The resulting image has enhanced sharpness and clarity like that of a 35 mm photo. The images are generated as digital files and are compatible with image editing software.
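The multi-frame idea described above can be illustrated with a much-simplified sketch: averaging co-registered frames suppresses temporal noise before upsampling. This is a toy stand-in for illustration, not Redhawk Vision's proprietary algorithm:

```python
import numpy as np

def multiframe_enhance(frames, scale=2):
    """Toy multi-frame enhancement: average co-registered video frames to
    suppress noise, then naively upsample. A simplified illustration of
    combining information from several frames, not the Video Pics method."""
    stack = np.stack([np.asarray(f, float) for f in frames])
    mean = stack.mean(axis=0)                      # temporal noise reduction
    return np.kron(mean, np.ones((scale, scale)))  # naive pixel replication

out = multiframe_enhance([np.full((4, 4), 1.0), np.full((4, 4), 3.0)])
```

Real multi-frame super-resolution additionally exploits sub-pixel shifts between frames, which is what allows genuine resolution gain rather than mere denoising.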
NASA Astrophysics Data System (ADS)
Leidinger, Martin; Schultealbert, Caroline; Neu, Julian; Schütze, Andreas; Sauerwald, Tilman
2018-01-01
This article presents a test gas generation system designed to generate ppb-level gas concentrations from gas cylinders. The focus is on permanent gases and volatile organic compounds (VOCs) for applications like indoor and outdoor air quality monitoring or breath analysis. In the design and the setup of the system, several issues regarding handling of trace gas concentrations have been considered, addressed and tested. This concerns not only the active fluidic components (flow controllers, valves), which have been chosen specifically for the task, but also the design of the fluidic tubing regarding dead volumes and delay times, which have been simulated for the chosen setup. Different tubing materials have been tested for their adsorption/desorption characteristics regarding naphthalene, a highly relevant gas for indoor air quality monitoring, which had caused long gas exchange times in a previous gas mixing system due to long-term adsorption/desorption effects. Residual gas contaminations of the system and the selected carrier air supply have been detected and quantified using both an analytical method (GC-MS analysis according to ISO 16000-6) and a metal oxide semiconductor gas sensor, which detected a maximum contamination equivalent to 28 ppb of carbon monoxide. A measurement strategy for suppressing even this contamination has been devised, which allows the system to be used for gas sensor and gas sensor system characterization and calibration in the low ppb concentration range.
NASA Astrophysics Data System (ADS)
Zhang, Xiaojie; Zeng, Qiming; Jiao, Jian; Zhang, Jingfa
2016-01-01
Repeat-pass Interferometric Synthetic Aperture Radar (InSAR) is a technique that can be used to generate DEMs. But the accuracy of InSAR is greatly limited by geometric distortions, atmospheric effects, and decorrelation, particularly in mountainous areas such as western China, where no high-quality DEM has so far been produced. Since InSAR DEMs generated from data of different frequencies and baselines each have their own advantages and disadvantages, there is great potential to overcome some of the limitations of InSAR by fusing Multi-baseline and Multi-frequency Interferometric Results (MMIRs). This paper proposes a fusion method based on the Extended Kalman Filter (EKF), which takes the InSAR-derived DEMs as states in the prediction step and the flattened interferograms as observations in the update step to generate the final fused DEM. Before the fusion, layover and shadow regions, low-coherence regions, and regions with large height errors are detected, because MMIRs in these regions are believed to be unreliable and are therefore excluded. The whole processing flow is tested with TerraSAR-X and Envisat ASAR datasets. Finally, the fused DEM is validated against the ASTER GDEM and the national standard DEM of China. The results demonstrate that the proposed method is effective even in low-coherence areas.
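The prediction/update structure of Kalman-style fusion can be sketched per pixel in a much-simplified scalar form. This is an illustration of the fusion idea under assumed per-DEM error variances, not the paper's full EKF over interferogram observations:

```python
import numpy as np

def fuse_heights(h1, var1, h2, var2):
    """Per-pixel scalar Kalman-style fusion of two DEM height estimates,
    a simplified stand-in for the EKF scheme described above. h1/h2 are
    height arrays, var1/var2 their assumed error variances; pixels that
    are unreliable (NaN) in one DEM fall back to the other."""
    K = var1 / (var1 + var2)            # Kalman gain
    fused = h1 + K * (h2 - h1)          # update step
    fused_var = (1 - K) * var1          # reduced posterior variance
    fused = np.where(np.isnan(h1), h2, np.where(np.isnan(h2), h1, fused))
    return fused, fused_var

fused, var = fuse_heights(np.array([10.0, np.nan]), 1.0,
                          np.array([12.0, 5.0]), 1.0)
```

With equal variances the gain is 0.5 and the fused height is the mean; a more accurate DEM (smaller variance) is weighted more heavily.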
Accelerating Pseudo-Random Number Generator for MCNP on GPU
NASA Astrophysics Data System (ADS)
Gong, Chunye; Liu, Jie; Chi, Lihua; Hu, Qingfeng; Deng, Li; Gong, Zhenghu
2010-09-01
Pseudo-random number generators (PRNGs) are used intensively in many stochastic algorithms in particle simulations, artificial neural networks and other scientific computation. The PRNG in the Monte Carlo N-Particle Transport Code (MCNP) requires a long period, high quality, flexible jump-ahead, and high speed. In this paper, we implement such a PRNG for MCNP on NVIDIA's GTX200 Graphics Processing Units (GPUs) using the CUDA programming model. Results show that speedups of 3.80 to 8.10 times are achieved compared with 4- to 6-core CPUs, and more than 679.18 million double-precision random numbers can be generated per second on the GPU.
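The "flexible jump" requirement above is what lets thousands of GPU threads draw disjoint substreams of a single linear congruential generator. A minimal jump-ahead sketch (the LCG constants below are Knuth's illustrative MMIX parameters, not MCNP's own) composes the affine map in O(log n):

```python
def lcg_jump(a, c, m, n):
    """Compose the affine LCG map x -> a*x + c (mod m) with itself n times
    in O(log n) steps, returning (A, C) such that x_n = A*x_0 + C (mod m).
    This jump-ahead property lets each parallel thread start its own
    disjoint substream. Constants used below are illustrative, not MCNP's."""
    A, C = 1, 0                                   # identity transform
    while n:
        if n & 1:
            A, C = (a * A) % m, (a * C + c) % m   # fold in current power
        a, c = (a * a) % m, ((a + 1) * c) % m     # square the map
        n >>= 1
    return A, C

# e.g. thread k could seed its substream at position k * stride:
a, c, m = 6364136223846793005, 1442695040888963407, 2**64
A, C = lcg_jump(a, c, m, 10**12)   # jump a trillion steps without iterating
```

Squaring the map uses the identity a(ax + c) + c = a²x + (a + 1)c, so each doubling costs only a few modular multiplications.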
NASA Astrophysics Data System (ADS)
Vodenicarevic, D.; Locatelli, N.; Mizrahi, A.; Friedman, J. S.; Vincent, A. F.; Romera, M.; Fukushima, A.; Yakushiji, K.; Kubota, H.; Yuasa, S.; Tiwari, S.; Grollier, J.; Querlioz, D.
2017-11-01
Low-energy random number generation is critical for many emerging computing schemes proposed to complement or replace von Neumann architectures. However, current random number generators are always associated with an energy cost that is prohibitive for these computing schemes. We introduce random number bit generation based on specific nanodevices: superparamagnetic tunnel junctions. We experimentally demonstrate high-quality random bit generation that represents an orders-of-magnitude improvement in energy efficiency over current solutions. We show that the random generation speed improves with nanodevice scaling, and we investigate the impact of temperature, magnetic field, and cross talk. Finally, we show how alternative computing schemes can be implemented using superparamagnetic tunnel junctions as random number generators. These results open the way for fabricating efficient hardware computing devices leveraging stochasticity, and they highlight an alternative use for emerging nanodevices.
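Physical bit sources such as the junctions described above are typically followed by a post-processing (whitening) stage. The classic von Neumann extractor, shown below as one standard option (not necessarily the authors' exact scheme), removes bias from independent raw bits at the cost of throughput:

```python
def von_neumann_debias(bits):
    """Von Neumann extractor: map the pair (0,1) -> 0 and (1,0) -> 1,
    discarding equal pairs. For independent raw bits with any fixed bias,
    the output bits are unbiased. Shown as one common whitening option
    for physical entropy sources, not the paper's specific method."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)       # keep the first bit of an unequal pair
    return out

print(von_neumann_debias([0, 1, 1, 1, 1, 0, 0, 0]))  # [0, 1]
```

For a source with bias p, the expected yield is p(1-p) output bits per input pair, which is part of why low raw-bit energy cost matters so much.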
Evaluating Computer-Generated Domain-Oriented Vocabularies.
ERIC Educational Resources Information Center
Damerau, Fred J.
1990-01-01
Discusses methods for automatically compiling domain-oriented vocabularies in natural language systems and describes techniques for evaluating the quality of the resulting word lists. A study is described that used subject headings from Grolier's Encyclopedia and the United Press International newswire, and filters for removing high frequency…
Urban and regional land use analysis: CARETS and census cities experiment package
NASA Technical Reports Server (NTRS)
Alexander, R. (Principal Investigator); Lins, H. F., Jr.
1974-01-01
The author has identified the following significant results. The most significant finding has been the ability of the S-190B data to produce land use maps that approach the quality of maps generated from high-altitude aircraft photography.
Phanerochaete chrysosporium genomics
Luis F. Larrondo; Rafael Vicuna; Dan Cullen
2005-01-01
A high quality draft genome sequence has been generated for the lignocellulose-degrading basidiomycete Phanerochaete chrysosporium (Martinez et al. 2004). Analysis of the genome in the context of previously established genetics and physiology is presented. Transposable elements and their potential relationship to genes involved in lignin degradation are systematically...
ERIC Educational Resources Information Center
DiRanna, Kathy; Gomez-Zwiep, Susan
2013-01-01
The confluence of "Common Core State Standards" ("CCSS"), "Next Generation Science Standards" ("NGSS"), and 21st century skills provides an unparalleled opportunity to improve science, technology, engineering, and mathematics (STEM) education for all students, particularly English learners. "CCSS,"…
ERIC Educational Resources Information Center
Baldwin, Kathryn; Wilson, Allison
2017-01-01
Having high-quality early childhood education programs that prepare children for success in school and later years continues to be an ever increasing national priority. While the "Next Generation Science Standards" ("NGSS") do not provide standards for preschool, there are ample opportunities to use the Standards as a guide to…
Next-generation pushbroom filter radiometers for remote sensing
NASA Astrophysics Data System (ADS)
Tarde, Richard W.; Dittman, Michael G.; Kvaran, Geir E.
2012-09-01
Individual focal plane size, yield, and quality continue to improve, as does the technology required to combine these into large tiled formats. As a result, next-generation pushbroom imagers are replacing traditional scanning technologies in remote sensing applications. Pushbroom architecture has inherently better radiometric sensitivity than previous-generation scanning technologies, with significantly reduced payload mass, power, and volume. However, the architecture creates challenges in achieving the required radiometric accuracy. Achieving good radiometric accuracy, including image spectral and spatial uniformity, requires creative optical design, high-quality focal planes and filters, careful consideration of on-board calibration sources, and state-of-the-art ground test facilities. Ball Aerospace built the next-generation Operational Land Imager (OLI) payload for the Landsat Data Continuity Mission (LDCM). Scheduled to launch in 2013, OLI provides imagery consistent with the historical Landsat spectral, spatial, radiometric, and geometric data record and completes the generational technology upgrade from the Enhanced Thematic Mapper (ETM+) whiskbroom technology to modern pushbroom technology afforded by advanced focal planes. We explain how Ball's capabilities enabled production of the innovative next-generation OLI pushbroom filter radiometer, which meets challenging radiometric accuracy and calibration requirements. OLI will extend the multi-decadal land surface observation dataset dating back to the 1972 launch of ERTS-1 (Landsat 1).
Waste heat generation: A comprehensive review.
Yeşiller, Nazli; Hanson, James L; Yee, Emma H
2015-08-01
A comprehensive review of heat generation in various types of wastes and of the thermal regime of waste containment facilities is provided in this paper. Municipal solid waste (MSW), MSW incineration ash, and mining wastes were included in the analysis. Spatial and temporal variations of waste temperatures, thermal gradients, thermal properties of wastes, average temperature differentials, and heat generation values are provided. Heat generation was influenced by climatic conditions, mean annual earth temperatures, waste temperatures at the time of placement, cover conditions, and the inherent heat generation potential of the specific wastes. Time to onset of heat generation varied between months and years, whereas timelines for the overall duration of heat generation varied between years and decades. For MSW, measured waste temperatures were as high as 60-90°C and as low as -6°C. MSW incinerator ash temperatures varied between 5 and 87°C. Mining waste temperatures were in the range of -25 to 65°C. In the wastes analyzed, upward heat flow toward the surface was more prominent than downward heat flow toward the subsurface. Thermal gradients generally were higher for MSW and incinerator ash and lower for mining waste. Based on thermal properties, MSW had insulative qualities (low thermal conductivity), while mining wastes typically were relatively conductive (high thermal conductivity), with ash having intermediate qualities. Heat generation values ranged from -8.6 to 83.1 MJ/m³ for MSW and from 0.6 to 72.6 MJ/m³ for mining waste, and the value for ash waste was 72.6 MJ/m³. Conductive thermal losses were determined to range from 13 to 1111 MJ/m³·yr. The data and analysis provided in this review paper can be used in the investigation of heat generation and the thermal regime of a wide range of wastes and waste containment facilities located in different climatic regions.
NASA Astrophysics Data System (ADS)
Liu, Ruihua; Wang, Rong; Liu, Qunying; Yang, Li; Xi, Chuan; Wang, Wei; Li, Lingzhou; Zhao, Zhoufang; Zhou, Ying
2018-02-01
With China's new energy grid-connected generation capacity at the forefront of the world, and given the uncertainty of new energy sources such as wind and solar energy, a scientific and comprehensive assessment of power quality is of great significance. Building on a systematic and objective analysis of current power quality indices, power quality analysis methods for new energy grids, and comprehensive evaluation methods, this paper tentatively explores trends in comprehensive power quality evaluation for the new generation of energy systems.
NASA Astrophysics Data System (ADS)
Alidoost, F.; Arefi, H.
2017-11-01
Nowadays, Unmanned Aerial System (UAS)-based photogrammetry offers an affordable, fast and effective approach to real-time acquisition of high resolution geospatial information and automatic 3D modelling of objects for numerous applications such as topography mapping, 3D city modelling, orthophoto generation, and cultural heritage preservation. In this paper, the capability of four different state-of-the-art software packages, namely 3DSurvey, Agisoft Photoscan, Pix4Dmapper Pro and SURE, is examined to generate high-density point clouds as well as a Digital Surface Model (DSM) over a historical site. The main steps of this study are image acquisition, point cloud generation, and accuracy assessment. The overlapping images are first captured using a quadcopter and are then processed by the different software packages to generate point clouds and DSMs. In order to evaluate the accuracy and quality of the point clouds and DSMs, both visual and geometric assessments are carried out and the comparison results are reported.
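Geometric assessment of a DSM of the kind described above is commonly summarised by a few standard error statistics against reference check-point heights; a minimal sketch (generic metrics, not the paper's exact evaluation protocol):

```python
import numpy as np

def dsm_accuracy(dsm_z, ref_z):
    """Standard geometric accuracy metrics for DSM evaluation: root mean
    square error, mean error (bias), and standard deviation of height
    differences against reference check points. Generic metrics used for
    illustration, not the paper's specific protocol."""
    diff = np.asarray(dsm_z, float) - np.asarray(ref_z, float)
    return {"rmse": float(np.sqrt(np.mean(diff ** 2))),
            "bias": float(np.mean(diff)),
            "std": float(np.std(diff))}

metrics = dsm_accuracy([10.2, 9.8, 10.1], [10.0, 10.0, 10.0])
```

Reporting bias separately from RMSE helps distinguish a systematic vertical datum offset (common between photogrammetric packages) from random surface noise.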
Sequence Data for Clostridium autoethanogenum using Three Generations of Sequencing Technologies
Utturkar, Sagar M.; Klingeman, Dawn Marie; Bruno-Barcena, José M.; ...
2015-04-14
During the past decade, DNA sequencing output has been mostly dominated by second-generation sequencing platforms, which are characterized by low cost, high throughput and shorter read lengths (for example, Illumina). The emergence and development of so-called third-generation sequencing platforms such as PacBio has permitted exceptionally long reads (over 20 kb) to be generated. Due to read length increases, algorithm improvements and hybrid assembly approaches, the concept of one chromosome, one contig and automated finishing of microbial genomes is now a realistic and achievable task for many microbial laboratories. In this paper, we describe high-quality sequence datasets which span three generations of sequencing technologies, containing six types of data from four NGS platforms and originating from a single microorganism, Clostridium autoethanogenum. The dataset reported here will be useful for the scientific community to evaluate upcoming NGS platforms, enabling comparison of existing and novel bioinformatics approaches, and will encourage interest in the development of innovative experimental and computational methods for NGS data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yonggang, E-mail: wangyg@ustc.edu.cn; Hui, Cong; Liu, Chong
The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.
Wang, Yonggang; Hui, Cong; Liu, Chong; Xu, Chao
2016-04-01
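The scaling claim in the FPGA work above is easy to sanity-check: 64 parallel channels delivering 7.68 Gbps in total implies a fixed per-channel rate, so throughput scales linearly with the number of instantiated units.

```python
# Sanity arithmetic for the quoted throughput figures: 64 parallel
# channels yielding 7.68 Gbps total implies 120 Mbps per channel.
channels = 64
total_bps = 7.68e9
per_channel = total_bps / channels
print(f"{per_channel / 1e6:.0f} Mbps per channel")  # 120 Mbps
```

The per-channel rate is bounded by the sampling clock, so adding channels (not raising the clock) is how the design scales to QKD-class throughput.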
Chen, Yue; Fang, Zhao-Xiang; Ren, Yu-Xuan; Gong, Lei; Lu, Rong-De
2015-09-20
Optical vortices are associated with a spatial phase singularity. Such a beam with a vortex is valuable in optical microscopy, hyper-entanglement, and optical levitation. In these applications, vortex beams with a perfect circle shape and a large topological charge are highly desirable. But the generation of perfect vortices with high topological charges is challenging. We present a novel method to create perfect vortex beams with large topological charges using a digital micromirror device (DMD) through binary amplitude modulation and a narrow Gaussian approximation. The DMD with binary holograms encoding both the spatial amplitude and the phase could generate fast switchable, reconfigurable optical vortex beams with significantly high quality and fidelity. With either the binary Lee hologram or the superpixel binary encoding technique, we were able to generate the corresponding hologram with high fidelity and create a perfect vortex with topological charge as large as 90. The physical properties of the perfect vortex beam produced were characterized through measurements of propagation dynamics and the focusing fields. The measurements show good consistency with the theoretical simulation. The perfect vortex beam produced satisfies high-demand utilization in optical manipulation and control, momentum transfer, quantum computing, and biophotonics.
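The binary Lee-type encoding mentioned above can be sketched by binarizing a carrier grating whose phase is offset by the target vortex phase l·theta. This is a minimal illustration of the encoding idea with illustrative parameters, not the authors' exact hologram-generation code:

```python
import numpy as np

def lee_vortex_hologram(N=512, charge=30, carrier_period=8):
    """Binary Lee-type amplitude hologram encoding a vortex phase l*theta
    for display on a DMD: a linear carrier grating is phase-offset by the
    target phase and then binarized (1 = mirror on). A minimal sketch;
    grid size and carrier period are illustrative assumptions."""
    y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
    theta = np.arctan2(y, x)                    # azimuthal angle
    phase = charge * theta                      # vortex phase l*theta
    carrier = 2 * np.pi * x / carrier_period    # linear carrier grating
    return (np.cos(carrier - phase) > 0).astype(np.uint8)

holo = lee_vortex_hologram()
```

The desired vortex beam appears in the first diffraction order of the binarized grating; higher charges fork the grating lines l times around the singularity.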
Quantitative evaluation of 3D images produced from computer-generated holograms
NASA Astrophysics Data System (ADS)
Sheerin, David T.; Mason, Ian R.; Cameron, Colin D.; Payne, Douglas A.; Slinger, Christopher W.
1999-08-01
Advances in computing and optical modulation techniques now make it possible to anticipate the generation of near real-time, reconfigurable, high quality, three-dimensional images using holographic methods. Computer generated holography (CGH) is the only technique which holds promise of producing synthetic images having the full range of visual depth cues. These realistic images will be viewable by several users simultaneously, without the need for headtracking or special glasses. Such a data visualization tool will be key to speeding up the manufacture of new commercial and military equipment by negating the need for the production of physical 3D models in the design phase. DERA Malvern has been involved in designing and testing fixed CGH in order to understand the connection between the complexity of the CGH, the algorithms used to design them, the processes employed in their implementation and the quality of the images produced. This poster describes results from CGH containing up to 10^8 pixels. The methods used to evaluate the reconstructed images are discussed and quantitative measures of image fidelity made. An understanding of the effect of the various system parameters upon final image quality enables a study of the possible system trade-offs to be carried out. Such an understanding of CGH production and resulting image quality is key to effective implementation of a reconfigurable CGH system currently under development at DERA.
Single image super-resolution reconstruction algorithm based on edge selection
NASA Astrophysics Data System (ADS)
Zhang, Yaolan; Liu, Yijun
2017-05-01
Super-resolution (SR) has become more important because it can generate high-quality high-resolution (HR) images from low-resolution (LR) input images. At present, much work is concentrated on developing sophisticated image priors to improve image quality, while much less attention is paid to estimating and incorporating the blur model, which can also affect the reconstruction results. We present a new reconstruction method based on edge selection. This method takes full account of the factors that affect blur kernel estimation and accurately estimates the blur process. When compared with state-of-the-art methods, our method has comparable performance.
Li, Yaohang; Liu, Hui; Rata, Ionel; Jakobsson, Eric
2013-02-25
The rapidly increasing number of protein crystal structures available in the Protein Data Bank (PDB) has naturally made statistical analyses feasible in studying complex high-order inter-residue correlations. In this paper, we report a context-based secondary structure potential (CSSP) for assessing the quality of predicted protein secondary structures generated by various prediction servers. CSSP is a sequence-position-specific knowledge-based potential generated based on the potentials of mean force approach, where high-order inter-residue interactions are taken into consideration. The CSSP potential is effective in identifying secondary structure predictions with good quality. In 56% of the targets in the CB513 benchmark, the optimal CSSP potential is able to recognize the native secondary structure or a prediction with Q3 accuracy higher than 90% as best scored in the predicted secondary structures generated by 10 popularly used secondary structure prediction servers. In more than 80% of the CB513 targets, the predicted secondary structures with the lowest CSSP potential values yield higher than 80% Q3 accuracy. Similar performance of CSSP is found on the CASP9 targets as well. Moreover, our computational results also show that the CSSP potential using triplets outperforms the CSSP potential using doublets and is currently better than the CSSP potential using quartets.
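The Q3 score used above to rank predictions is simply the per-residue agreement between the predicted and native three-state (helix/strand/coil) assignments:

```python
def q3_accuracy(predicted, native):
    """Q3 score for protein secondary structure prediction: the fraction
    of residues whose three-state label (H = helix, E = strand, C = coil)
    matches the native assignment."""
    assert len(predicted) == len(native)
    return sum(p == n for p, n in zip(predicted, native)) / len(native)

print(q3_accuracy("HHHEECCC", "HHHEECCH"))  # 7 of 8 residues match: 0.875
```

The CSSP potential then acts as a selector: among several servers' predictions, the one with the lowest potential value is expected to have a high Q3 against the (unknown) native structure.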
Description of the F-16XL Geometry and Computational Grids Used in CAWAPI
NASA Technical Reports Server (NTRS)
Boelens, O. J.; Badcock, K. J.; Gortz, S.; Morton, S.; Fritz, W.; Karman, S. L., Jr.; Michal, T.; Lamar, J. E.
2009-01-01
The objective of the Cranked-Arrow Wing Aerodynamics Project International (CAWAPI) was to allow a comprehensive validation of Computational Fluid Dynamics methods against the CAWAP flight database. A major part of this work involved the generation of high-quality computational grids. Prior to the grid generation, an IGES file containing the air-tight geometry of the F-16XL aircraft was generated by a cooperation of the CAWAPI partners. Based on this geometry description both structured and unstructured grids have been generated. The baseline structured (multi-block) grid (and a family of derived grids) has been generated by the National Aerospace Laboratory NLR. Although the algorithms used by NLR had become available just before CAWAPI and thus only limited experience with their application to such a complex configuration had been gained, a grid of good quality was generated well within four weeks. This time compared favourably with that required to produce the unstructured grids in CAWAPI. The baseline all-tetrahedral and hybrid unstructured grids have been generated at NASA Langley Research Center and the USAFA, respectively. To provide more geometrical resolution, trimmed unstructured grids have been generated at EADS-MAS, the UTSimCenter, Boeing Phantom Works and KTH/FOI. All grids generated within the framework of CAWAPI will be discussed in the article. Results obtained on both the structured and the unstructured grids showed a significant improvement in agreement with flight test data in comparison with those obtained on the structured multi-block grid used during CAWAP.
NASA Astrophysics Data System (ADS)
Miranda, Rommel Joseph
By employing qualitative methods, this study sought to determine the perceptions that urban stakeholders hold about what characteristics should distinguish a high school science teacher whom they would consider to demonstrate high quality in science teaching. A maximum variation sample of six science teachers, three school administrators, six parents and six students from a large urban public school district were interviewed using semi-structured, in-depth interview techniques. From these data, a list of observable characteristics which urban stakeholders hold as evidence of high quality in science teaching was generated. Observational techniques were utilized to determine the extent to which six urban high school science teachers, who meet the NCLB Act criteria for being "highly qualified", actually possessed the characteristics which these stakeholders hold as evidence of high quality in science teaching. Constant comparative analysis was used to analyze the data set. The findings suggest that urban stakeholders perceive that a high school science teacher who demonstrates high quality in science teaching should be knowledgeable about their subject matter, their student population, and should be resourceful; should possess an academic background in science and professional experience in science teaching; should exhibit professionalism, a passion for science and teaching, and a dedication to teaching and student learning; should be skillful in planning and preparing science lessons and in organizing the classroom, in presenting the subject matter to students, in conducting a variety of hands-on activities, and in managing a classroom; and should assess whether students complete class goals and objectives, and provide feedback about grades for students promptly. 
The findings further reveal that some of the urban high school science teachers who were deemed to be "highly qualified", as defined by the NCLB Act, engaged in practices that threatened quality in science teaching and often failed to display the characteristics which urban stakeholders hold as evidence of high quality in science teaching. Thus, the criteria for "highly qualified" prescribed by policy makers and politicians do not necessarily translate into effective science teaching in urban settings. These findings emphasize the importance of stakeholder involvement in the design of educational reform initiatives.
Cohen, Deborah J; Dorr, David A; Knierim, Kyle; DuBard, C Annette; Hemler, Jennifer R; Hall, Jennifer D; Marino, Miguel; Solberg, Leif I; McConnell, K John; Nichols, Len M; Nease, Donald E; Edwards, Samuel T; Wu, Winfred Y; Pham-Singer, Hang; Kho, Abel N; Phillips, Robert L; Rasmussen, Luke V; Duffy, F Daniel; Balasubramanian, Bijal A
2018-04-01
Federal value-based payment programs require primary care practices to conduct quality improvement activities, informed by the electronic reports on clinical quality measures that their electronic health records (EHRs) generate. To determine whether EHRs produce reports adequate to the task, we examined survey responses from 1,492 practices across twelve states, supplemented with qualitative data. Meaningful-use participation, which requires the use of a federally certified EHR, was associated with the ability to generate reports-but the reports did not necessarily support quality improvement initiatives. Practices reported numerous challenges in generating adequate reports, such as difficulty manipulating and aligning measurement time frames with quality improvement needs, lack of functionality for generating reports on electronic clinical quality measures at different levels, discordance between clinical guidelines and measures available in reports, questionable data quality, and vendors that were unreceptive to changing EHR configuration beyond federal requirements. The current state of EHR measurement functionality may be insufficient to support federal initiatives that tie payment to clinical quality measures.
Decision Environment and Heuristics in Individual and Collective Hypothesis Generation
2017-11-01
number of images viewed: When the high-value cue appeared late in the trial, time pressure yielded hypothesis changes sooner than did no time pressure...However, when the high-value cue appeared early in the trial, time pressure had no effect on images viewed. Overall, participants' timing was...scenarios. Cue order influenced the quality of timing. Participants were more likely to change their hypotheses at an optimal time when the high
Quality Assessment of Collection 6 MODIS Atmospheric Science Products
NASA Astrophysics Data System (ADS)
Manoharan, V. S.; Ridgway, B.; Platnick, S. E.; Devadiga, S.; Mauoka, E.
2015-12-01
Since the launch of the NASA Terra and Aqua satellites in December 1999 and May 2002, respectively, atmosphere and land data acquired by the MODIS (Moderate Resolution Imaging Spectroradiometer) sensor on-board these satellites have been reprocessed five times at the MODAPS (MODIS Adaptive Processing System) located at NASA GSFC. The global land and atmosphere products use science algorithms developed by the NASA MODIS science team investigators. MODAPS completed Collection 6 reprocessing of MODIS Atmosphere science data products in April 2015 and is currently generating the Collection 6 products using the latest version of the science algorithms. This reprocessing has generated one of the longest time series of consistent data records for understanding cloud, aerosol, and other constituents in the Earth's atmosphere. It is important to carefully evaluate and assess the quality of these data and remove any artifacts to maintain a useful climate data record. Quality Assessment (QA) is an integral part of the processing chain at MODAPS. This presentation will describe the QA approaches and tools adopted by the MODIS Land/Atmosphere Operational Product Evaluation (LDOPE) team to assess the quality of MODIS operational Atmospheric products produced at MODAPS. Some of the tools include global high resolution images, time series analysis and statistical QA metrics. The new high resolution global browse images with pan and zoom have provided the ability to perform QA of products in real time through synoptic QA on the web. This global browse generation has been useful in identifying production errors, data loss, and data-quality issues arising from calibration errors, geolocation errors, and algorithm performance. A time series analysis for various science datasets in the Level-3 monthly product was recently developed for assessing any long-term drifts in the data arising from instrument errors or other artifacts.
This presentation will describe and discuss some test cases from the recently processed C6 products. We will also describe the various tools and approaches developed to verify and assess the algorithm changes implemented by the science team to address known issues in the products and improve the quality of the products.
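The long-term drift check described above can be sketched as a simple trend test on a monthly Level-3 time series. This is a minimal illustration, assuming a linear-fit approach and an arbitrary slope threshold; it is not LDOPE's actual QA method.

```python
import numpy as np

def detect_drift(monthly_means, slope_threshold):
    # Fit a linear trend to a monthly Level-3 series and flag a possible
    # long-term drift when the fitted slope exceeds the threshold.
    t = np.arange(len(monthly_means))
    slope = np.polyfit(t, monthly_means, 1)[0]
    return abs(slope) > slope_threshold

rng = np.random.default_rng(0)
stable = 0.15 + 0.001 * rng.standard_normal(120)   # 10 years of monthly means
drifting = 0.15 - 0.0005 * np.arange(120)          # slow downward drift

flag_stable = detect_drift(stable, slope_threshold=1e-4)
flag_drifting = detect_drift(drifting, slope_threshold=1e-4)
print(flag_stable, flag_drifting)  # False True
```

A production QA pipeline would use more robust statistics (e.g. seasonal decomposition before trend fitting), but the flagging logic is the same.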
An improved multi-exposure approach for high quality holographic femtosecond laser patterning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Chenchu; Hu, Yanlei (E-mail: huyl@ustc.edu.cn); Li, Jiawen (E-mail: jwl@ustc.edu.cn)
High-efficiency two-photon polymerization through single exposure via a spatial light modulator (SLM) has been used to decrease fabrication time and rapidly realize various micro/nanostructures, but surface quality remains a big problem due to the speckle noise of the optical intensity distribution at the defocused plane. Here, a multi-exposure approach in which tens of computer-generated holograms are successively loaded on the SLM is presented to significantly improve optical uniformity without losing efficiency. By applying multi-exposure, we found that the uniformity at the defocused plane was increased from ∼0.02 to ∼0.6 according to our simulation. Two series of letters, "HELLO" and "USTC", fabricated under single and multi-exposure in our experiment also verified that the surface quality was greatly improved. Moreover, by this method, several kinds of beam splitters, e.g., 2 × 2 and 5 × 5 Dammann gratings and complex non-separable 5 × 5 gratings, were fabricated with both high quality and short fabrication time (<1 min, 95% time-saving). This multi-exposure SLM two-photon polymerization method shows promising prospects for rapidly fabricating and integrating various binary optical devices and their systems.
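The uniformity gain from multi-exposure averaging can be illustrated with a toy simulation. The exponential intensity statistics (fully developed speckle) and the min/max uniformity metric below are assumptions for illustration, not the paper's model.

```python
import numpy as np

def uniformity(intensity):
    # One common uniformity metric: 1 - (Imax - Imin) / (Imax + Imin).
    return 1.0 - (intensity.max() - intensity.min()) / (intensity.max() + intensity.min())

rng = np.random.default_rng(1)
# Single exposure: fully developed speckle has exponential intensity statistics.
single = rng.exponential(1.0, size=(64, 64))
# Multi-exposure: averaging N independent speckle patterns washes out the noise.
N = 30
multi = np.mean(rng.exponential(1.0, size=(N, 64, 64)), axis=0)

print(uniformity(single) < uniformity(multi))  # averaging improves uniformity
```

Averaging N independent patterns reduces the relative intensity fluctuation roughly by a factor of sqrt(N), which is the mechanism behind the ∼0.02 to ∼0.6 improvement reported above.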
Lell, M M; May, M S; Brand, M; Eller, A; Buder, T; Hofmann, E; Uder, M; Wuest, W
2015-07-01
CT is the imaging technique of choice in the evaluation of midface trauma or inflammatory disease. We performed a systematic evaluation of scan protocols to optimize image quality and radiation exposure on third-generation dual-source CT. CT protocols with different tube voltage (70-150 kV), current (25-300 reference mAs), prefiltration, pitch value, and rotation time were systematically evaluated. All images were reconstructed with iterative reconstruction (Advanced Modeled Iterative Reconstruction, level 2). To individually compare results with otherwise identical factors, we obtained all scans on a frozen human head. Cone-beam CT was performed for image quality and dose comparison with multidetector row CT. Delineation of important anatomic structures and incidental pathologic conditions in the cadaver head was evaluated. One hundred kilovolts with tin prefiltration demonstrated the best compromise between dose and image quality. The most dose-effective combination for trauma imaging was Sn100 kV/250 mAs (volume CT dose index, 2.02 mGy), and for preoperative sinus surgery planning, Sn100 kV/150 mAs (volume CT dose index, 1.22 mGy). "Sn" indicates an additional prefiltration of the x-ray beam with a tin filter to constrict the energy spectrum. Exclusion of sinonasal disease was possible at an even lower dose by using Sn100 kV/25 mAs (volume CT dose index, 0.2 mGy). High image quality at very low dose levels can be achieved by using a Sn100-kV protocol with iterative reconstruction. The effective dose is comparable with that of conventional radiography, and the high image quality at even lower radiation exposure favors multidetector row CT over cone-beam CT. © 2015 by American Journal of Neuroradiology.
Redfern, S; Norman, I
1999-07-01
The aims of the study were to identify indicators of quality of nursing care from the perceptions of patients and nurses, and to determine the congruence between patients' and nurses' perceptions. The paper is presented in two parts. Part 1 included the background and methods of the study and the findings from the comparison of patients' and nurses' perceptions. Part 2 describes the perceptions of patients and nurses, and draws conclusions from the study as a whole. Patients and nurses in hospital wards were interviewed using the critical incident technique. We grouped 4546 indicators of high- and low-quality nursing care generated from the interview transcripts into 316 subcategories, 68 categories and 31 themes. The themes were grouped into eight clusters: therapeutic context for care, attitudes and sensitivity, teaching and leadership, motivation to nurse, monitoring and informing, high-dependency care, efficiency and thoroughness, and reflection and anticipation. As shown in Part 1 of the paper, congruence between patients' and nurses' perceptions of quality was high and significant, although there was some difference of emphasis. The findings support an emerging theory of interpersonal competence and quality in nursing care.
Dynamic power scheduling system for JPEG2000 delivery over wireless networks
NASA Astrophysics Data System (ADS)
Martina, Maurizio; Vacca, Fabrizio
2003-06-01
The diffusion of third-generation mobile terminals is encouraging the development of new multimedia-based applications. Reliable transmission of audiovisual content will gain major interest as one of the most valuable services. Nevertheless, the mobile scenario is severely power-constrained: high compression ratios and refined energy-management strategies are highly advisable. JPEG2000 as the source-encoding stage assures excellent performance with extremely good visual quality. However, the limited power budget imposes a limit on the computational effort in order to save as much power as possible. Starting from an error-prone environment such as the wireless one, strong error-resilience features need to be employed. This paper investigates the trade-off between quality and power in such a challenging environment.
Visualization for genomics: the Microbial Genome Viewer.
Kerkhoven, Robert; van Enckevort, Frank H J; Boekhorst, Jos; Molenaar, Douwe; Siezen, Roland J
2004-07-22
A Web-based visualization tool, the Microbial Genome Viewer, is presented that allows the user to combine complex genomic data in a highly interactive way. This Web tool enables the interactive generation of chromosome wheels and linear genome maps from genome annotation data stored in a MySQL database. The generated images are in scalable vector graphics (SVG) format, which is suitable for creating high-quality scalable images and dynamic Web representations. Gene-related data such as transcriptome and time-course microarray experiments can be superimposed on the maps for visual inspection. The Microbial Genome Viewer 1.0 is freely available at http://www.cmbi.kun.nl/MGV
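The idea behind chromosome-wheel rendering in SVG can be sketched as generating one arc path per gene. This is a minimal illustration of SVG arc generation, not the Microbial Genome Viewer's actual code; the coordinates and gene angles are made up.

```python
import math

def gene_arc(start_deg, end_deg, r=100, cx=150, cy=150, color="steelblue"):
    # One gene drawn as a circular arc segment of the chromosome wheel.
    x1 = cx + r * math.cos(math.radians(start_deg))
    y1 = cy + r * math.sin(math.radians(start_deg))
    x2 = cx + r * math.cos(math.radians(end_deg))
    y2 = cy + r * math.sin(math.radians(end_deg))
    large = 1 if (end_deg - start_deg) % 360 > 180 else 0
    return (f'<path d="M {x1:.1f} {y1:.1f} A {r} {r} 0 {large} 1 '
            f'{x2:.1f} {y2:.1f}" stroke="{color}" fill="none" stroke-width="8"/>')

genes = [(0, 40), (55, 120), (130, 200)]  # toy (start, end) angles per gene
parts = ['<svg xmlns="http://www.w3.org/2000/svg" width="300" height="300">']
parts += [gene_arc(a, b) for a, b in genes]
parts.append('</svg>')
svg = "\n".join(parts)
print(svg.count("<path"))  # 3
```

Because SVG is vector-based, the same document scales to any resolution, which is why it suits both high-quality print output and dynamic Web display as described above.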
Intergenerational Relationship Quality Across Three Generations
Tighe, Lauren A.; Fingerman, Karen L.; Zarit, Steven H.
2012-01-01
Objectives. Studies of intergenerational relationship quality often include one or two generations. This study examined within-family differences and similarities or transmission of positive and negative relationship quality across three generations. Method. Participants included 633 middle-aged individuals (G2; 52% women, ages 40–60 years), 592 of their offspring (G3; 53% daughters; ages 18–41 years), and 337 of their parents (i.e., grandparents; G1; 69% women; ages 59–96 years). Results. Multilevel models revealed differences and similarities in relationship quality across generations. The oldest generation (G1) reported greater positive and less negative quality relationships than the middle (G2) and the younger (G3) generations. There was limited evidence of transmission. Middle-aged respondents who reported more positive and less negative ties with their parents (G1) reported more positive and less negative ties with their own children (G3). Grandmother (G1) reports of more positive relationship quality were associated with G3 reports of more positive relationship quality with G2. Discussion. Findings are consistent with the intergenerational stake hypothesis and only partially consistent with the theory of intergenerational transmission. Overall, this study suggests that there is greater within-family variability than similarities in how family members feel about one another. PMID:22628478
Ogunbekun, I; Adeyi, O; Wouters, A; Morrow, R H
1996-12-01
This paper reports on a study to assess the quality of maternal health care in public health facilities in Nigeria and to identify the resource implications of making the necessary quality improvements. Drawing upon unifying themes from quality assurance, basic microeconomics and the Bamako Initiative, locally defined norms were used to estimate resource requirements for improving the quality of maternal health care. Wide gaps existed between what is required (the norm) and what was available in terms of fixed and variable resources required for the delivery of maternal health services in public facilities implementing the Bamako Initiative in the Local Government Areas studied. Given such constraints, it was highly unlikely that technically acceptable standards of care could be met without additional resource inputs to meet the norm. This is part of the cost of doing business and merits serious policy dialogue. Revenue generation from health services was poor and appeared to be more related to inadequate supply of essential drugs and consumables than to the use of uneconomic fee scales. It is likely that user fees will be necessary to supplement scarce government budgets, especially to fund the most critical variable inputs associated with quality improvements. However, any user fee system, especially one that raises fees to patients, will have to be accompanied by immediate and visible quality improvements. Without such quality improvements, cost recovery will result in even lower utilization and attempts to generate new revenues are unlikely to succeed.
NASA Technical Reports Server (NTRS)
Robinson, E. A.
1973-01-01
Quality, reliability, and design standards for microwave hybrid microcircuits were established. The MSFC Standard 85M03926 for hybrid microcircuits was reviewed and modifications were generated for use with microwave hybrid microcircuits. The results for reliability tests of microwave thin film capacitors, transistors, and microwave circuits are presented. Twenty-two microwave receivers were tested for 13,500 unit hours. The result of 111,121 module burn-in and operating hours for an integrated solid state transceiver module is reported.
Nirea, K G; Meuwissen, T H E
2017-04-01
We simulated genomic selection pig breeding schemes containing nucleus and production herds to improve the feed efficiency of crossbred production pigs. Elite nucleus herds had access to high-quality feed, while production herds were fed low-quality feed. Feed efficiency had a heritability of 0.3 in the nucleus herds and 0.25 in the production herds. The genetic correlation between feed efficiency in the nucleus and production environments was assumed to be low (r g = 0.2), medium (r g = 0.5) or high (r g = 0.8). In our alternative breeding schemes, different proportions of production animals were recorded for feed efficiency and genotyped with a high-density panel of genetic markers. Genomic breeding values of the selection candidates for feed efficiency were estimated using three different approaches: in the first, the reference population consisted of nucleus animals; in the second, it contained a mixture of nucleus and production animals; in the third, it consisted only of production herds. Using a mixture reference population, we generated 40-115% more genetic gain in the production environment, compared with using only a nucleus reference population fed high-quality feed, when the production animals were offspring of the nucleus animals; when the production animals were grand-offspring of the nucleus animals, 43-104% more genetic gain was generated. Similarly, the mixed reference population generated higher genetic gain in the production environment than using production animals alone: up to 19% and 14% when the production animals were offspring and grand-offspring of nucleus animals, respectively. Therefore, in genomic selection pig breeding programmes, feed efficiency traits could be improved by properly designing the reference population. © 2016 Blackwell Verlag GmbH.
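Why a mixed reference population helps can be illustrated with a toy SNP-BLUP (ridge regression) simulation: when the trait in the two environments is only partially correlated genetically, adding production records to the reference improves prediction of the production trait. The population sizes, noise level and genetic correlation (0.5) below are illustrative assumptions, not the paper's simulation design.

```python
import numpy as np

rng = np.random.default_rng(2)
n_snp, n_nuc, n_prod, n_test = 100, 500, 500, 500

# SNP effects in the two environments share a genetic correlation of ~0.5.
b_nuc = rng.standard_normal(n_snp)
b_prod = 0.5 * b_nuc + np.sqrt(1 - 0.5 ** 2) * rng.standard_normal(n_snp)

def simulate(n, b):
    X = rng.binomial(2, 0.5, size=(n, n_snp)).astype(float)  # genotypes 0/1/2
    return X, X @ b + 5.0 * rng.standard_normal(n)           # genetics + noise

def snp_blup(X, y, lam):
    # Ridge / SNP-BLUP solution: (X'X + lam*I) b_hat = X'y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

Xn, yn = simulate(n_nuc, b_nuc)    # nucleus herd, high-quality feed
Xp, yp = simulate(n_prod, b_prod)  # production herd, low-quality feed
Xt, _ = simulate(n_test, b_prod)   # selection candidates

def accuracy(b_hat):
    # Correlation between predicted and true production breeding values.
    return np.corrcoef(Xt @ b_hat, Xt @ b_prod)[0, 1]

acc_nucleus = accuracy(snp_blup(Xn, yn, lam=n_snp))
acc_mixed = accuracy(snp_blup(np.vstack([Xn, Xp]),
                              np.concatenate([yn, yp]), lam=n_snp))
print(acc_mixed > acc_nucleus)  # mixed reference predicts the production trait better
```

With a nucleus-only reference, prediction accuracy for the production trait is capped by the genetic correlation between environments; mixing in production records removes that cap, which is the mechanism behind the gains reported above.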
USDA-ARS?s Scientific Manuscript database
In this study, we generated a linkage map containing 1,151,856 high-quality SNPs between Mo17 and B73, which were verified in the maize intermated B73 × Mo17 (IBM) Syn10 population. This resource is an excellent complement to existing maize genetic maps available in an online database (iPlant, http:...
ERIC Educational Resources Information Center
O'Connor, Karen V.; Stichter, Janine P.
2011-01-01
Students with high-functioning autism and/or Asperger Syndrome (HFA/AS) are characterized by difficulties with communication as well as impairments in social interaction skills. Students with HFA/AS have been shown to generate solutions that are lower quality (e.g., less socially appropriate) than those of their typical peers and also to have…
Code of Federal Regulations, 2011 CFR
2011-07-01
..., entertainment, and residential opportunities, as well as high quality office uses. (b) That portion of the... shall be directly related to creating a lively atmosphere and to promoting an active street life... as important to the commercial life of any inner city, uses that do not generate lively activities...
Code of Federal Regulations, 2010 CFR
2010-07-01
..., entertainment, and residential opportunities, as well as high quality office uses. (b) That portion of the... shall be directly related to creating a lively atmosphere and to promoting an active street life... as important to the commercial life of any inner city, uses that do not generate lively activities...
Code of Federal Regulations, 2012 CFR
2012-07-01
..., entertainment, and residential opportunities, as well as high quality office uses. (b) That portion of the... shall be directly related to creating a lively atmosphere and to promoting an active street life... as important to the commercial life of any inner city, uses that do not generate lively activities...
Code of Federal Regulations, 2014 CFR
2014-07-01
..., entertainment, and residential opportunities, as well as high quality office uses. (b) That portion of the... shall be directly related to creating a lively atmosphere and to promoting an active street life... as important to the commercial life of any inner city, uses that do not generate lively activities...
21 CFR 884.6170 - Assisted reproduction water and water purification systems.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Assisted reproduction water and water purification... Devices § 884.6170 Assisted reproduction water and water purification systems. (a) Identification. Assisted reproduction water purification systems are devices specifically intended to generate high quality...
21 CFR 884.6170 - Assisted reproduction water and water purification systems.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Assisted reproduction water and water purification... Devices § 884.6170 Assisted reproduction water and water purification systems. (a) Identification. Assisted reproduction water purification systems are devices specifically intended to generate high quality...
21 CFR 884.6170 - Assisted reproduction water and water purification systems.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Assisted reproduction water and water purification... Devices § 884.6170 Assisted reproduction water and water purification systems. (a) Identification. Assisted reproduction water purification systems are devices specifically intended to generate high quality...
21 CFR 884.6170 - Assisted reproduction water and water purification systems.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Assisted reproduction water and water purification... Devices § 884.6170 Assisted reproduction water and water purification systems. (a) Identification. Assisted reproduction water purification systems are devices specifically intended to generate high quality...
21 CFR 884.6170 - Assisted reproduction water and water purification systems.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Assisted reproduction water and water purification... Devices § 884.6170 Assisted reproduction water and water purification systems. (a) Identification. Assisted reproduction water purification systems are devices specifically intended to generate high quality...
A Distributed Middleware Architecture for Attack-Resilient Communications in Smart Grids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hodge, Brian S; Wu, Yifu; Wei, Jin
Distributed Energy Resources (DERs) are being increasingly accepted as an excellent complement to traditional energy sources in smart grids. As most of these generators are geographically dispersed, dedicated communications investments for every generator are capital-cost prohibitive. Real-time distributed communications middleware, which supervises, organizes and schedules tremendous amounts of data traffic in smart grids with high penetrations of DERs, allows for the use of existing network infrastructure. In this paper, we propose a distributed attack-resilient middleware architecture that detects and mitigates congestion attacks by exploiting Quality of Experience (QoE) measures to complement the conventional Quality of Service (QoS) information. The simulation results illustrate the efficiency of our proposed communications middleware architecture.
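The core detection idea, corroborating a QoS signal with a QoE signal before declaring a congestion attack, can be sketched as follows. The sliding-window averaging, thresholds, and the scalar QoE score are illustrative assumptions, not the paper's design.

```python
from collections import deque

class CongestionDetector:
    """Toy detector combining QoS (latency, ms) with a QoE score in [0, 1]."""

    def __init__(self, window=5, qos_ms=200.0, qoe_min=0.6):
        self.lat = deque(maxlen=window)
        self.qoe = deque(maxlen=window)
        self.qos_ms, self.qoe_min = qos_ms, qoe_min

    def update(self, latency_ms, qoe_score):
        self.lat.append(latency_ms)
        self.qoe.append(qoe_score)
        avg_lat = sum(self.lat) / len(self.lat)
        avg_qoe = sum(self.qoe) / len(self.qoe)
        # Flag only when both views agree: QoS degraded AND QoE degraded.
        return avg_lat > self.qos_ms and avg_qoe < self.qoe_min

det = CongestionDetector()
normal = [det.update(50, 0.9) for _ in range(5)]    # healthy traffic
attack = [det.update(400, 0.3) for _ in range(5)]   # congested traffic
print(any(normal), attack[-1])  # False True
```

Requiring agreement between the two measures reduces false positives: a transient latency spike alone, or a momentary QoE dip alone, does not trigger mitigation.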
A Distributed Middleware Architecture for Attack-Resilient Communications in Smart Grids: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Yifu; Wei, Jin; Hodge, Bri-Mathias
Distributed energy resources (DERs) are being increasingly accepted as an excellent complement to traditional energy sources in smart grids. Because most of these generators are geographically dispersed, dedicated communications investments for every generator are capital-cost prohibitive. Real-time distributed communications middleware - which supervises, organizes, and schedules tremendous amounts of data traffic in smart grids with high penetrations of DERs - allows for the use of existing network infrastructure. In this paper, we propose a distributed attack-resilient middleware architecture that detects and mitigates congestion attacks by exploiting quality-of-experience measures to complement the conventional quality-of-service information. The simulation results illustrate the efficiency of our proposed communications middleware architecture.
Haytowitz, David B; Pehrsson, Pamela R
2018-01-01
For nearly 20 years, the National Food and Nutrient Analysis Program (NFNAP) has expanded and improved the quantity and quality of data in the US Department of Agriculture's (USDA) food composition databases (FCDB) through the collection and analysis of nationally representative food samples. NFNAP employs statistically valid sampling plans, the Key Foods approach to identify and prioritize foods and nutrients, comprehensive quality control protocols, and analytical oversight to generate new and updated analytical data for food components. NFNAP has allowed the Nutrient Data Laboratory to keep up with the dynamic US food supply and emerging scientific research. Recently generated results for nationally representative food samples show marked changes compared to previous database values for selected nutrients. Monitoring changes in the composition of foods is critical in keeping FCDB up-to-date, so that they remain a vital tool in assessing the nutrient intake of national populations, as well as for providing dietary advice. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Ishigaki, Mika; Hashimoto, Kosuke; Sato, Hidetoshi; Ozaki, Yukihiro
2017-03-01
Current research focuses on embryonic development and quality not only by considering fundamental biology, but also by aiming to improve assisted reproduction technologies, such as in vitro fertilization. In this study, we explored the development of mouse embryo and its quality based on molecular information, obtained nondestructively using Raman spectroscopy. The detailed analysis of Raman spectra measured in situ during embryonic development revealed a temporary increase in protein content after fertilization. Proteins with a β-sheet structure—present in the early stages of embryonic development—are derived from maternal oocytes, while α-helical proteins are additionally generated by switching on a gene after fertilization. The transition from maternal to embryonic control during development can be non-destructively profiled, thus facilitating the in situ assessment of structural changes and component variation in proteins generated by metabolic activity. Furthermore, it was indicated that embryos with low-grade morphology had high concentrations of lipids and hydroxyapatite. This technique could be used for embryo quality testing in the future.
Thermal Neutron Radiography using a High-flux Compact Neutron Generator
NASA Astrophysics Data System (ADS)
Taylor, Michael; Sengbusch, Evan; Seyfert, Chris; Moll, Eli; Radel, Ross
A novel neutron imaging system has been designed and constructed by Phoenix Nuclear Labs to investigate specimens when conventional X-ray imaging will not suffice. A first-generation electronic neutron generator is actively being used by the United States Army and is coupled with activation films for neutron radiography to inspect munitions and other critical defence and aerospace components. A second-generation system has been designed to increase the total neutron output from an upgraded gaseous deuterium target to 5×10^11 DD n/s, generating higher neutron flux at the imaging plane and dramatically reducing interrogation time, while maintaining high spatial resolution and low geometric unsharpness. A description of the neutron generator and imaging system, including the beamline, target and detector platform, is given in this paper. State-of-the-art neutron moderators, collimators and imaging detector components are also discussed in the context of increasing specimen throughput and optimizing image quality. Neutron radiographs captured with the neutron radiography system will be further compared against simulated images using the MCNP nuclear simulation code.
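The geometric unsharpness mentioned above follows the standard radiographic relation Ug = f·b/a, where f is the effective source (focal spot) size, a the source-to-object distance, and b the object-to-detector distance. The numbers below are illustrative, not the Phoenix system's actual geometry.

```python
def geometric_unsharpness(source_size_mm, src_to_obj_mm, obj_to_det_mm):
    # Standard radiographic penumbra: Ug = f * b / a.
    return source_size_mm * obj_to_det_mm / src_to_obj_mm

# Halving the effective source size halves the unsharpness.
print(geometric_unsharpness(2.0, 1000.0, 50.0))  # 0.1 (mm)
print(geometric_unsharpness(1.0, 1000.0, 50.0))  # 0.05 (mm)
```

The relation shows the design trade-off: a higher-output source can afford a longer source-to-object distance (larger a) or tighter collimation, both of which reduce Ug without sacrificing exposure time.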
Tjin A Tsoi, Sharon L N M; de Boer, Anthonius; Croiset, Gerda; Koster, Andries S; Kusurkar, Rashmi A
2016-01-01
Continuing education (CE) can support health care professionals in maintaining and developing their knowledge and competencies. Although lack of motivation is one of the most important barriers of pharmacists' participation in CE, we know little about the quality or the quantity of motivation. We used the self-determination theory, which describes autonomous motivation (AM) as originating from within an individual and controlled motivation (CM) as originating from external factors, as a framework for this study. Our aim was to obtain insight into the quality and quantity of pharmacists' motivation for CE. The scores of 425 pharmacists on Academic Motivation Scale were subjected to K-means cluster analysis to generate motivational profiles. We unraveled four motivational profiles: (1) good quality with high AM/low CM, (2) high quantity with high AM/high CM, (3) poor quality with low AM/high CM, and (4) low quantity with low AM/low CM. Female pharmacists, pharmacists working in a hospital pharmacy, pharmacists working for more than 10 years, and pharmacists not in training were highly represented in the good-quality profile. Pharmacists working in a community pharmacy, pharmacists working for less than 10 years, and pharmacists in training were highly represented in the high-quantity profile. Male pharmacists were more or less equally distributed over the four profiles. The highest percentage of pharmacy owners was shown in the low-quantity profile, and the highest percentage of the nonowners was shown in the good-quality profile. Pharmacists exhibit different motivational profiles, which are associated with their background characteristics, such as gender, ownership of business, practice setting, and current training. Motivational profiles could be used to tailor CE courses for pharmacists.
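The K-means clustering step that produced the four motivational profiles can be sketched with toy data. The (AM, CM) scores and cluster centres below are made-up illustrations, not actual Academic Motivation Scale data; in practice scikit-learn's KMeans would be used rather than this minimal implementation.

```python
import numpy as np

def kmeans(X, k, iters=50, init_idx=None, seed=0):
    # Minimal Lloyd's algorithm; init_idx fixes the starting centres.
    rng = np.random.default_rng(seed)
    idx = init_idx if init_idx is not None else rng.choice(len(X), k, replace=False)
    centers = X[idx].copy()
    for _ in range(iters):
        labels = ((X[:, None, :] - centers[None]) ** 2).sum(-1).argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Four toy (AM, CM) profiles: good quality, high quantity, poor quality, low quantity.
rng = np.random.default_rng(1)
true_centres = [(6, 2), (6, 6), (2, 6), (2, 2)]
X = np.vstack([c + 0.3 * rng.standard_normal((50, 2)) for c in true_centres])
labels, centres = kmeans(X, k=4, init_idx=[0, 50, 100, 150])
print(sorted(np.round(centres).astype(int).tolist()))  # [[2, 2], [2, 6], [6, 2], [6, 6]]
```

Each recovered centre corresponds to one profile: high AM/low CM, high AM/high CM, low AM/high CM, and low AM/low CM, mirroring the four profiles described above.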
Radiation Hardened, Modulator ASIC for High Data Rate Communications
NASA Technical Reports Server (NTRS)
McCallister, Ron; Putnam, Robert; Andro, Monty; Fujikawa, Gene
2000-01-01
Satellite-based telecommunication services are challenged by the need to generate down-link power levels adequate to support the high-quality (BER ≈ 10^-12) links required for modern broadband data services. Bandwidth-efficient Nyquist signaling, using low values of excess bandwidth (alpha), can exhibit large peak-to-average-power ratio (PAPR) values. High PAPR values necessitate high-power amplifier (HPA) backoff greater than the PAPR, resulting in unacceptably low HPA efficiency. Given the high cost of on-board prime power, this inefficiency represents both an economic burden and a constraint on the rates and quality of data services supportable from satellite platforms. Constant-envelope signals offer improved power efficiency, but only by imposing a severe bandwidth-efficiency penalty. This paper describes a radiation-hardened modulator which can improve satellite-based broadband data services by combining the bandwidth efficiency of low-alpha Nyquist signals with high power efficiency (negligible HPA backoff).
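The alpha/PAPR trade-off behind this design can be demonstrated numerically: shaping QPSK symbols with a raised-cosine Nyquist pulse of low excess bandwidth produces larger envelope peaks than a high-alpha pulse. The pulse span, oversampling factor, and symbol count below are illustrative choices.

```python
import numpy as np

def rc_pulse(t, alpha):
    # Raised-cosine Nyquist pulse, t in symbol periods (singularities handled).
    t = np.asarray(t, dtype=float)
    den = 1.0 - (2.0 * alpha * t) ** 2
    safe = np.where(np.abs(den) < 1e-10, 1.0, den)
    p = np.sinc(t) * np.cos(np.pi * alpha * t) / safe
    return np.where(np.abs(den) < 1e-10, np.pi / 4 * np.sinc(t), p)

def papr_db(alpha, n_sym=4000, osf=8, seed=0):
    # PAPR of a random QPSK stream shaped by the raised-cosine pulse.
    rng = np.random.default_rng(seed)
    sym = (rng.choice([-1, 1], n_sym) + 1j * rng.choice([-1, 1], n_sym)) / np.sqrt(2)
    x = np.zeros(n_sym * osf, dtype=complex)
    x[::osf] = sym                                  # impulses at symbol instants
    h = rc_pulse(np.arange(-16, 16, 1 / osf), alpha)
    p = np.abs(np.convolve(x, h, mode="same")) ** 2
    return 10 * np.log10(p.max() / p.mean())

papr_low_alpha = papr_db(0.1)
papr_high_alpha = papr_db(0.9)
print(papr_low_alpha > papr_high_alpha)  # lower excess bandwidth -> higher PAPR
```

The slowly decaying sidelobes of a low-alpha pulse let many symbols add constructively at isolated instants, producing the large peaks that force HPA backoff.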
Computing Aerodynamic Performance of a 2D Iced Airfoil: Blocking Topology and Grid Generation
NASA Technical Reports Server (NTRS)
Chi, X.; Zhu, B.; Shih, T. I.-P.; Slater, J. W.; Addy, H. E.; Choo, Yung K.; Lee, Chi-Ming (Technical Monitor)
2002-01-01
The ice accreted on airfoils can have enormously complicated shapes with multiple protruded horns and feathers. In this paper, several blocking topologies are proposed and evaluated on their ability to produce high-quality structured multi-block grid systems. A transition layer grid is introduced to ensure that jaggedness in the ice-surface geometry does not propagate into the domain. This is important for grid-generation methods based on hyperbolic PDEs (Partial Differential Equations) and algebraic transfinite interpolation. A 'thick' wrap-around grid is introduced to ensure that grid lines clustered next to solid walls do not propagate as streaks of tightly packed grid lines into the interior of the domain along block boundaries. For ice shapes that are not too complicated, a method is presented for generating high-quality single-block grids. To demonstrate the usefulness of the methods developed, grids and CFD solutions were generated for two iced airfoils: the NLF0414 airfoil with and without the 623-ice shape and the B575/767 airfoil with and without the 145m-ice shape. To validate the computations, the computed lift coefficients as a function of angle of attack were compared with available experimental data. The ice shapes and the blocking topologies were prepared by NASA Glenn's SmaggIce software. The grid systems were generated by using a four-boundary method based on Hermite interpolation with controls on clustering, orthogonality next to walls, and C continuity across block boundaries. The flow was modeled by the ensemble-averaged compressible Navier-Stokes equations, closed by the shear-stress transport turbulence model in which the integration is to the wall. All solutions were generated by using the NPARC WIND code.
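The four-boundary idea can be sketched with linear transfinite interpolation (a Coons patch), which fills a block's interior from its four boundary curves. This is a simpler stand-in for the Hermite-based method described above, without its clustering and orthogonality controls; the bumpy bottom wall is a toy "iced surface".

```python
import numpy as np

def tfi(bottom, top, left, right):
    # Linear transfinite interpolation over a block whose four boundary
    # curves share consistent corner points. Returns grid of shape (ni, nj, 2).
    ni, nj = len(bottom), len(left)
    u = np.linspace(0, 1, ni)[:, None, None]
    v = np.linspace(0, 1, nj)[None, :, None]
    B, T = bottom[:, None, :], top[:, None, :]
    L, R = left[None, :, :], right[None, :, :]
    return ((1 - v) * B + v * T + (1 - u) * L + u * R
            - (1 - u) * (1 - v) * bottom[0] - u * (1 - v) * bottom[-1]
            - (1 - u) * v * top[0] - u * v * top[-1])

ni, nj = 21, 11
xs = np.linspace(0, 1, ni)
bottom = np.stack([xs, 0.1 * np.sin(2 * np.pi * xs)], axis=1)  # toy iced wall
top = np.stack([xs, np.ones(ni)], axis=1)
eta = np.linspace(0, 1, nj)
left = np.stack([np.zeros(nj), (1 - eta) * bottom[0, 1] + eta * 1.0], axis=1)
right = np.stack([np.ones(nj), (1 - eta) * bottom[-1, 1] + eta * 1.0], axis=1)
grid = tfi(bottom, top, left, right)
print(grid.shape)  # (21, 11, 2)
```

By construction the generated grid matches all four boundary curves exactly; Hermite variants additionally prescribe grid-line slopes at the walls, which is how orthogonality next to solid surfaces is enforced.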
Precision Crystal Calorimeters in High Energy Physics
Ren-Yuan Zhu
2017-12-09
Precision crystal calorimeters traditionally play an important role in high energy physics experiments. In the last two decades, they have faced the challenge of maintaining their precision in a hostile radiation environment. This paper reviews the performance of crystal calorimeters constructed for high energy physics experiments and the progress achieved in understanding crystals' radiation damage as well as in developing high quality scintillating crystals for particle physics. Potential applications of new-generation scintillating crystals of high density and high light yield, such as LSO and LYSO, in particle physics experiments are also discussed.
DSM Generation from ALOS/PRISM Images Using SAT-PP
NASA Astrophysics Data System (ADS)
Wolff, Kirsten; Gruen, Armin
2008-11-01
One of the most important products of ALOS/PRISM image data are accurate DSMs. To exploit the full resolution of PRISM for DSM generation, a highly developed image matcher is needed. As a member of the validation and calibration team for PRISM, we published earlier results of DSM generation using PRISM image triplets in combination with our software package SAT-PP. The overall accuracy across all object and image features for all tests lies between 1 and 5 pixels in matching, depending primarily on surface roughness, vegetation, image texture and image quality. Here we discuss some new results. We focus on four different topics: the use of two different evaluation methods, the difference between a 5 m and a 10 m GSD for the final PRISM DSM, the influence of the level of initial information, and the comparison of the quality of different combinations of the three different views (forward, nadir and backward). All tests have been conducted with our test field Bern/Thun, Switzerland.
NASA Astrophysics Data System (ADS)
Lanusse, Francois; Ravanbakhsh, Siamak; Mandelbaum, Rachel; Schneider, Jeff; Poczos, Barnabas
2017-01-01
Weak gravitational lensing has long been identified as one of the most powerful probes of the nature of dark energy. As such, weak lensing is at the heart of the next generation of cosmological surveys such as LSST, Euclid and WFIRST. One particularly critical source of systematic errors in these surveys comes from the shape measurement algorithms tasked with estimating galaxy shapes. GREAT3, the last community challenge to assess the quality of state-of-the-art shape measurement algorithms, demonstrated in particular that all current methods are biased to various degrees and, more importantly, that these biases depend on the details of the galaxy morphologies. These biases can be measured and calibrated by generating mock observations in which a known lensing signal has been introduced and comparing the resulting measurements to the ground truth. Producing these mock observations, however, requires input galaxy images of higher resolution and S/N than the simulated survey, which typically implies acquiring extremely expensive space-based observations. The goal of this work is to train a deep generative model on already available Hubble Space Telescope data which can then be used to sample new galaxy images conditioned on parameters such as magnitude, size or redshift and exhibiting complex morphologies. Such a model allows us to inexpensively produce large sets of realistic images for calibration purposes. We implement a conditional generative model based on state-of-the-art deep learning methods and fit it to deep galaxy images from the COSMOS survey. The quality of the model is assessed by computing an extensive set of galaxy morphology statistics on the generated images. Beyond simple second-moment statistics such as size and ellipticity, we apply more complex statistics specifically designed to be sensitive to disturbed galaxy morphologies.
We find excellent agreement between the morphologies of real and model-generated galaxies. Our results suggest that such deep generative models represent a reliable alternative to the acquisition of expensive high-quality observations for generating the calibration data needed by the next generation of weak lensing surveys.
Optical components of adaptive systems for improving laser beam quality
NASA Astrophysics Data System (ADS)
Malakhov, Yuri I.; Atuchin, Victor V.; Kudryashov, Aleksis V.; Starikov, Fedor A.
2008-10-01
A short overview is given of optical equipment developed within the ISTC activity for a new generation of adaptive systems that correct high-power laser beams carrying optical vortices on the phase surface. These include new-generation kinoform multi-level optical elements, namely special spiral phase plates and ordered rasters of microlenses (lenslet arrays), as well as wide-aperture Hartmann-Shack sensors and bimorph deformable piezoceramics-based mirrors with various grids of control elements.
Design and Scheduling of Microgrids using Benders Decomposition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagarajan, Adarsh; Ayyanar, Raja
2016-11-21
Laterals in a distribution feeder with relatively high PV generation compared to the load can be operated as microgrids to achieve reliability, power quality and economic benefits. However, renewable resources are intermittent and stochastic in nature. A novel approach for sizing and scheduling an energy storage system and microturbine for reliable operation of microgrids is proposed. The size and schedule of the energy storage system and microturbine are determined using Benders' decomposition, considering PV generation as a stochastic resource.
Hydrolysis of aluminum dross material to achieve zero hazardous waste.
David, E; Kopac, J
2012-03-30
A simple method with high efficiency for generating high-purity hydrogen by hydrolysis of highly activated aluminum dross in tap water is established. Aluminum dross is activated by mechanical milling to particles of about 45 μm. This removes the surface layer of the aluminum particles and creates a fresh, chemically active metal surface. In contact with water, the hydrolysis reaction takes place and hydrogen is released. In this process a zero-waste concept is achieved because the other reaction product is aluminum oxide hydroxide (AlOOH), which is environmentally benign and can be used to make high-quality refractory or calcium aluminate cement. For comparison we also used pure aluminum powder and alkaline tap water solutions (NaOH, KOH) at a ratio similar to that of the aluminum dross content. The rates of hydrogen generated in the hydrolysis reaction of pure aluminum and aluminum dross were found to be similar. Based on the experimental setup, a hydrogen generator was designed and assembled, and the hydrogen volume generated by the hydrolysis reaction was measured. The experimental results reveal that aluminum dross could be economically recycled by the hydrolysis process, achieving zero hazardous aluminum dross waste while generating hydrogen. Copyright © 2012 Elsevier B.V. All rights reserved.
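The theoretical hydrogen yield implied by the AlOOH route can be checked with simple stoichiometry. This back-of-the-envelope sketch is not from the paper; the reaction Al + 2 H2O → AlOOH + 3/2 H2 and ideal-gas behavior at room conditions are the only assumptions:

```python
M_AL = 26.98       # g/mol, molar mass of aluminium
R = 0.082057       # L·atm/(mol·K), ideal-gas constant
T, P = 298.15, 1.0 # K and atm: room temperature, atmospheric pressure (assumed)

def h2_volume_per_gram_al(grams=1.0):
    """Ideal-gas volume of H2 (litres) released per `grams` of reacted Al."""
    mol_al = grams / M_AL
    mol_h2 = 1.5 * mol_al          # stoichiometry of the AlOOH route
    return mol_h2 * R * T / P

print(round(h2_volume_per_gram_al(), 2))   # ≈ 1.36 L of H2 per gram of Al
```

About 1.36 L of hydrogen per gram of aluminium at room conditions is the ceiling against which measured generator yields can be compared.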
Trevisan, Maíra; De Bortoli, Sergio A; Vacari, Alessandra M; Laurentis, Valéria L; Ramalho, Dagmara G
2016-01-01
Although the parasitoid Cotesia flavipes (Cameron) has proven effective in controlling the sugarcane borer Diatraea saccharalis (Fabricius) for many years, concern has arisen over the quality of individuals produced at large scales. The parasitoid has been reared in laboratories in Brazil for more than 40 years, with no introductions of new populations during that period. Since the quality of the parasitoids was not verified at the time of the species' introduction in Brazil, we do not know whether there has been any reduction in quality so far. However, it is possible to determine whether the parasitoid could decline in quality in future generations. Thus, the objective of this research was to assess the quality of these insects over 10 generations and look for evidence of any loss in quality. We used two populations: one from a biofactory that has been maintained in the laboratory for over 40 years, and an inbred laboratory population. Both were bred and compared for 10 generations. We wanted to determine what happened to the quality of the parasitoid after 10 generations in an extreme inbreeding situation. To ensure inbreeding, newly emerged females were forced to mate with a sibling. Individual females were then allowed to parasitize larvae of D. saccharalis. We performed evaluations for each generation until the tenth generation, recording the sex ratio, percentage emergence, number of offspring per female, and longevity of both males and females. Measurements of the biological characteristics showed only sporadic significant differences between populations; the best results were obtained intermittently by both the biofactory population and the inbred population. No significant differences across generations within the same population were observed. Thus, rearing a C. flavipes population subjected to inbreeding for 10 generations was not sufficient to reveal any deleterious effects of inbreeding.
NASA Astrophysics Data System (ADS)
Liu, Jiansheng; Wang, Wentao; Li, Wentao; Qi, Rong; Zhang, Zhijun; Yu, Changhai; Wang, Cheng; Liu, Jiaqi; Qing, Zhiyong; Ming, Fang; Xu, Yi; Leng, Yuxin; Li, Ruxin; Xu, Zhizhan
2017-05-01
One of the major goals of developing laser wakefield accelerators (LWFAs) is to produce compact high-energy electron beam (e-beam) sources, which are expected to be applied in developing compact X-ray free-electron lasers and monoenergetic gamma-ray sources. Although LWFAs have been demonstrated to generate multi-GeV e-beams, to date they have still failed to produce high-quality e-beams with several essential properties (narrow energy spread, small transverse emittance and high beam charge) achieved simultaneously. Here we report on the experimental demonstration of a high-quality cascaded LWFA via manipulating electron injection, seeding in different periods of the wakefield, and controlling the energy chirp to compress the energy spread. The cascaded LWFA was powered by a 1-Hz 200-TW femtosecond laser facility at SIOM. High-brightness e-beams with peak energies in the range of 200-600 MeV, 0.4-1.2% rms energy spread, 10-80 pC charge, and 0.2 mrad rms divergence were experimentally obtained. An unprecedentedly high 6-dimensional (6-D) brightness B6D,n in units of A/m^2/0.1% was estimated at the level of 10^15-10^16, which is very close to the typical brightness of e-beams from state-of-the-art linac drivers and several-fold higher than those of previously reported LWFAs. Furthermore, we propose a scheme to minimize the energy spread of an e-beam in a cascaded LWFA to the one-thousandth level by inserting a stage to compress its longitudinal spatial distribution via velocity bunching. In this scheme, three plasma stages are designed for electron injection, e-beam length compression, and e-beam acceleration, respectively. A one-dimensional theory and two-dimensional particle-in-cell simulations have demonstrated this scheme, and an e-beam with 0.2% rms energy spread and low transverse emittance could be generated without loss of charge.
Based on the high-quality e-beams generated in the LWFA, we have experimentally realized a new scheme to enhance betatron radiation via manipulating the e-beam transverse oscillation in the wakefield. Very brilliant quasi-monochromatic betatron X-rays in the tens-of-keV range, with significant enhancement in both photon yield and peak energy, have been generated. In addition, by employing a self-synchronized all-optical Compton scattering scheme, in which the electron beam collided with the intense driving laser pulse via reflection off a plasma mirror, we produced tunable quasi-monochromatic MeV γ-rays (~33% full-width at half-maximum) with a peak brilliance of 3.1×10^22 photons s^-1 mm^-2 mrad^-2 0.1% BW at 1 MeV, which is, to the best of our knowledge, one order of magnitude higher than any previously reported value in the MeV regime.
DOE Office of Scientific and Technical Information (OSTI.GOV)
KNUPP,PATRICK; MITCHELL,SCOTT A.
1999-11-01
In an attempt to automatically produce high-quality all-hex meshes, we investigated a mesh improvement strategy: given an initial poor-quality all-hex mesh, we iteratively changed the element connectivity, adding and deleting elements and nodes, and optimized the node positions. We found a set of hex reconnection primitives. We improved the optimization algorithms so they can untangle a negative-Jacobian mesh, even considering Jacobians on the boundary, and subsequently optimize the condition number of elements in an untangled mesh. However, even after applying both the primitives and optimization we were unable to produce high-quality meshes in certain regions. Our experiences suggest that many boundary configurations of quadrilaterals admit no hexahedral mesh with positive Jacobians, although we have no proof of this.
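The negative-Jacobian test referred to above reduces, at each hex corner, to the sign of a 3×3 determinant built from the edge vectors leaving that node. A minimal sketch (function names and node layout are illustrative, not Knupp and Mitchell's code):

```python
def det3(a, b, c):
    """Determinant of the 3x3 matrix whose rows are vectors a, b, c."""
    return (a[0]*(b[1]*c[2] - b[2]*c[1])
            - a[1]*(b[0]*c[2] - b[2]*c[0])
            + a[2]*(b[0]*c[1] - b[1]*c[0]))

def corner_jacobian(node, n_i, n_j, n_k):
    """Jacobian at `node` from its three edge-connected hex neighbours.
    A negative value means the element is inverted (tangled) at this corner."""
    edges = [tuple(n[d] - node[d] for d in range(3)) for n in (n_i, n_j, n_k)]
    return det3(*edges)

# Unit-cube corner: edges along +x, +y, +z give Jacobian +1 (the ideal shape).
assert corner_jacobian((0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)) == 1
# Swapping two neighbours inverts the orientation: Jacobian -1 (tangled).
assert corner_jacobian((0, 0, 0), (0, 1, 0), (1, 0, 0), (0, 0, 1)) == -1
```

Untangling drives all such corner values positive; condition-number optimization then improves the shape of the already-valid elements.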
Demand response, behind-the-meter generation and air quality.
Zhang, Xiyue; Zhang, K Max
2015-02-03
We investigated the implications of behind-the-meter (BTM) generation participating in demand response (DR) programs. Specifically, we evaluated the impacts of NOx emissions from BTM generators enrolled in the New York Independent System Operator (NYISO)'s reliability-based DR programs. Through analyzing the DR program enrollment data, DR event records, ozone air quality monitoring data, and emission characteristics of the generators, we found that the emissions from BTM generators very likely contribute to exceedingly high ozone concentrations in the Northeast Corridor region, and very likely account for a substantial fraction of total NOx emissions from electricity generation. In addition, a companion study showed that the emissions from BTM generators could also form near-source particulate matter (PM) hotspots. The important policy implications are that the absence of up-to-date regulations on BTM generators may offset the current efforts to reduce the emissions from peaking power plants, and that there is a need to quantify the environmental impacts of DR programs in designing sound policies related to demand-side resources. Furthermore, we proposed the concept of "Green" DR resources, referring to those that not only provide power systems reliability services, but also have verifiable environmental benefits or minimal negative environmental impacts. We argue that Green DR resources that are able to maintain resource adequacy and reduce emissions at the same time are key to achieving the cobenefits of power system reliability and protecting public health during periods with peak electricity demand.
Monks, K; Molnár, I; Rieger, H-J; Bogáti, B; Szabó, E
2012-04-06
Robust HPLC separations lead to fewer analysis failures and better method transfer as well as providing an assurance of quality. This work presents the systematic development of an optimal, robust, fast UHPLC method for the simultaneous assay of two APIs of an eye drop sample and their impurities, in accordance with Quality by Design principles. Chromatography software is employed to effectively generate design spaces (Method Operable Design Regions), which are subsequently employed to determine the final method conditions and to evaluate robustness prior to validation. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Ashford, Gregory A.; Powell, Kenneth G.
1995-01-01
A method for generating high quality unstructured triangular grids for high Reynolds number Navier-Stokes calculations about complex geometries is described. Careful attention is paid in the mesh generation process to resolving efficiently the disparate length scales which arise in these flows. First the surface mesh is constructed in a way which ensures that the geometry is faithfully represented. The volume mesh generation then proceeds in two phases thus allowing the viscous and inviscid regions of the flow to be meshed optimally. A solution-adaptive remeshing procedure which allows the mesh to adapt itself to flow features is also described. The procedure for tracking wakes and refinement criteria appropriate for shock detection are described. Although at present it has only been implemented in two dimensions, the grid generation process has been designed with the extension to three dimensions in mind. An implicit, higher-order, upwind method is also presented for computing compressible turbulent flows on these meshes. Two recently developed one-equation turbulence models have been implemented to simulate the effects of the fluid turbulence. Results for flow about a RAE 2822 airfoil and a Douglas three-element airfoil are presented which clearly show the improved resolution obtainable.
NASA Astrophysics Data System (ADS)
Servilla, M. S.; Brunt, J.; Costa, D.; Gries, C.; Grossman-Clarke, S.; Hanson, P. C.; O'Brien, M.; Smith, C.; Vanderbilt, K.; Waide, R.
2017-12-01
In the world of data repositories, there seems to be a never-ending struggle between the generation of high-quality data documentation and the ease of archiving a data product in a repository: the higher the documentation standards, the greater the effort required of the scientist, and the less likely the data will be archived. The Environmental Data Initiative (EDI) attempts to balance the rigor of data documentation against the amount of effort required of a scientist to upload and archive data. An outgrowth of the LTER Network Information System, the EDI is funded by the US NSF Division of Environmental Biology to support the LTER, LTREB, OBFS, and MSB programs, in addition to providing an open data archive for environmental scientists without a viable archive. EDI uses the PASTA repository software, developed originally by the LTER. PASTA is metadata driven and documents data with the Ecological Metadata Language (EML), a high-fidelity standard that can describe all types of data in great detail. PASTA incorporates a series of data quality tests to ensure that data are correctly documented with EML, in a process termed "metadata and data congruence"; incongruent data packages are rejected by the repository. EDI reduces the burden of data documentation on scientists in two ways. First, EDI provides hands-on assistance and best-practice tooling for generating EML, written in R and now being developed in Python. These tools hide the details of EML generation and syntax by providing a more natural and contextual setting for describing data. Second, EDI works closely with community information managers in defining the rules used in PASTA quality tests. Rules deemed too strict can be turned off completely or made to issue only a warning while the community learns how best to handle the situation and improve its documentation practices. Rules can also be added or refined over time to improve the overall quality of archived data.
The outcome of quality tests are stored as part of the data archive in PASTA and are accessible to all users of the EDI data repository. In summary, EDI's metadata support to scientists and the comprehensive set of data quality tests for metadata and data congruency provide an ideal archive for environmental and ecological data.
NASA Astrophysics Data System (ADS)
Zhang, Zhen; Chen, Siqing; Zheng, Huadong; Sun, Tao; Yu, Yingjie; Gao, Hongyue; Asundi, Anand K.
2017-06-01
Computer holography has made notable progress in recent years. The point-based method and the slice-based method are the chief algorithms for generating holograms in holographic display. Although both methods have been validated numerically and optically, the differences in their imaging quality have not been specifically analyzed. In this paper, we analyze the imaging quality of computer-generated phase holograms generated by point-based Fresnel zone plates (PB-FZP), the point-based Fresnel diffraction algorithm (PB-FDA) and the slice-based Fresnel diffraction algorithm (SB-FDA). The calculation formulas and hologram generation with the three methods are demonstrated. In order to suppress speckle noise, sequential phase-only holograms are generated in our work. The numerically and experimentally reconstructed images are also exhibited. By comparing the imaging quality, the merits and drawbacks of the three methods are analyzed, and conclusions are drawn.
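The point-based Fresnel zone plate construction can be sketched as follows, under the usual paraxial approximation: each object point contributes a quadratic phase π/(λz)·r² on the hologram plane, and a phase-only hologram keeps the argument of the summed field. The wavelength, pixel pitch and grid size below are illustrative assumptions, not the paper's parameters:

```python
import cmath, math

WAVELENGTH = 532e-9   # m, green laser (assumed)
PITCH = 8e-6          # m, hologram pixel pitch (assumed)

def phase_hologram(points, nx, ny):
    """points: list of (x0, y0, z, amplitude) object points.
    Returns nx*ny phase values in [0, 2*pi), row by row."""
    holo = []
    for iy in range(ny):
        for ix in range(nx):
            x, y = (ix - nx/2)*PITCH, (iy - ny/2)*PITCH
            # Sum the paraxial spherical-wave contributions of all points.
            field = sum(a * cmath.exp(1j*math.pi/(WAVELENGTH*z)
                                      * ((x-x0)**2 + (y-y0)**2))
                        for x0, y0, z, a in points)
            holo.append(cmath.phase(field) % (2*math.pi))
    return holo

# A single on-axis point 0.1 m behind the hologram gives the classic zone plate.
holo = phase_hologram([(0.0, 0.0, 0.1, 1.0)], 64, 64)
```

A slice-based variant would instead propagate whole depth slices with a Fresnel diffraction transform; the per-point loop above is what makes the point-based method simple but costly for dense objects.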
Quality of Care Measures for the Management of Unhealthy Alcohol Use
Hepner, Kimberly A.; Watkins, Katherine E.; Farmer, Carrie M.; Rubenstein, Lisa; Pedersen, Eric R.; Pincus, Harold Alan
2017-01-01
There is a paucity of quality measures to assess the care for the range of unhealthy alcohol use, ranging from risky drinking to alcohol use disorders. Using a two-phase expert panel review process, we sought to develop an expanded set of quality of care measures for unhealthy alcohol use, focusing on outpatient care delivered in both primary care and specialty care settings. This process generated 25 candidate measures. Eight measures address screening and assessment, 11 address aspects of treatment, and six address follow-up. These quality measures represent high priority targets for future development, including creating detailed technical specifications and pilot testing them to evaluate their utility in terms of feasibility, reliability, and validity. PMID:28340902
Interplay of the Quality of Ciprofloxacin and Antibiotic Resistance in Developing Countries
Sharma, Deepali; Patel, Rahul P.; Zaidi, Syed Tabish R.; Sarker, Md. Moklesur Rahman; Lean, Qi Ying; Ming, Long C.
2017-01-01
Ciprofloxacin, a second generation broad spectrum fluoroquinolone, is active against both Gram-positive and Gram-negative bacteria. Ciprofloxacin has a high oral bioavailability and a large volume of distribution. It is used for the treatment of a wide range of infections including urinary tract infections caused by susceptible bacteria. However, the availability and use of substandard and spurious quality of oral ciprofloxacin formulations in the developing countries has been thought to have contributed toward increased risk of treatment failure and bacterial resistance. Therefore, quality control and bioequivalence studies of the commercially available oral ciprofloxacin formulations should be monitored. Appropriate actions should be taken against offending manufacturers in order to prevent the sale of substandard and spurious quality of ciprofloxacin formulations. PMID:28871228
NASA Astrophysics Data System (ADS)
Lukowski, Michal L.
Optically pumped semiconductor vertical external cavity surface emitting lasers (VECSELs) were first demonstrated in the mid-1990s. Due to the unique design properties of extended-cavity lasers, VECSELs have been able to provide tunable, high output powers while maintaining excellent beam quality. These features offer a wide range of possible applications in areas such as medicine, spectroscopy, defense, imaging, communications and entertainment. Newly developed VECSELs now cover the spectral regions from red (600 nm) to around 5 µm. By taking advantage of the open cavity design, the emission can be further extended to the UV or THz regions by means of intracavity nonlinear frequency generation. The objective of this dissertation is to investigate and extend the capabilities of high-power VECSELs by utilizing novel nonlinear conversion techniques. Optically pumped VECSELs based on GaAs semiconductor heterostructures have been demonstrated to provide exceptionally high output powers covering the 900 to 1200 nm spectral region with diffraction-limited beam quality. The free-space cavity design allows access to high intracavity circulating powers where high-efficiency nonlinear frequency conversion and wavelength tuning can be obtained. As an introduction, this dissertation presents a brief history of the development of VECSELs as well as wafer design, chip fabrication and resonator cavity design for optimal frequency conversion. Specifically, the different types of laser cavities, such as the linear cavity, the V-shaped cavity and the patented T-shaped cavity, are described, since their optimization is crucial for transverse mode quality, stability, tunability and efficient frequency conversion. All types of nonlinear conversion, namely second harmonic, sum frequency and difference frequency generation, are discussed in extensive detail.
The theoretical simulation and the development of the high-power, tunable blue and green VECSEL by means of type I second harmonic generation in a V-cavity is presented. Tens of watts of output power at both blue and green wavelengths demonstrate the viability of VECSELs to replace the other types of lasers currently used for applications in laser light shows, for Ti:Sapphire pumping, and for medical applications such as laser skin resurfacing. The novel, recently patented two-chip T-cavity configuration, which allows spatial overlap of two separate VECSEL cavities, is described in detail. This setup is further used to demonstrate type II sum frequency generation to green with multi-watt output, and the full potential of the T-cavity is realized by achieving type II difference frequency generation to the mid-IR spectral region. A tunable output around 5.4 µm with over 10 mW of power is showcased, and first attempts to generate THz radiation are likewise discussed. Finally, a slightly modified T-cavity VECSEL is used to reach the UV spectral region via type I fourth harmonic generation: over 100 mW at around 265 nm is obtained in a setup that employs no stabilization techniques. The dissertation demonstrates the flexibility of the VECSEL in achieving broad spectral coverage and thus its potential for a wide range of applications.
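The energy-conservation arithmetic behind these conversions is simple: photon frequencies (proportional to 1/λ) add for second harmonic and sum frequency generation and subtract for difference frequency generation. A generic sketch, with the caveat that the 980/1180 nm chip wavelengths below are invented for illustration and are not the dissertation's actual design values:

```python
# Wavelengths in nm throughout.
def shg(lam):
    """Second harmonic: output frequency is twice the input frequency."""
    return lam / 2.0

def sfg(lam1, lam2):
    """Sum frequency: 1/lam_out = 1/lam1 + 1/lam2."""
    return 1.0 / (1.0/lam1 + 1.0/lam2)

def dfg(lam1, lam2):
    """Difference frequency: 1/lam_out = 1/lam1 - 1/lam2 (requires lam1 < lam2)."""
    return 1.0 / (1.0/lam1 - 1.0/lam2)

# SHG of a 1064 nm intracavity field gives green at 532 nm.
assert shg(1064.0) == 532.0
# Two SHG stages in series give the fourth harmonic (UV).
assert shg(shg(1060.0)) == 265.0
# DFG between two chips, e.g. at 980 nm and 1180 nm, lands in the mid-IR.
print(round(dfg(980.0, 1180.0)))   # ≈ 5782 nm, i.e. ~5.8 µm
```

Because the DFG output frequency is a small difference of two large frequencies, tiny shifts of either chip wavelength tune the mid-IR output over a wide range, which is the practical appeal of the two-chip T-cavity.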
Couple Resilience to Economic Pressure Over Time and Across Generations
Masarik, April S.; Martin, Monica J.; Ferrer, Emilio; Lorenz, Frederick O.; Conger, Katherine J.; Conger, Rand D.
2016-01-01
Research suggests that economic stress disrupts perceived romantic relationship quality; yet less is known regarding the direct influence of economic stress on negative behavioral exchanges between partners over time. Another intriguing question concerns the degree to which effective problem-solving might protect against this hypothesized association. To address these issues, the authors studied two generations of couples who were assessed approximately 13 years apart (Generation 1: N = 367, Generation 2: N = 311). On average and for both generations, economic pressure predicted relative increases in couples’ hostile, contemptuous, and angry behaviors; however, couples who were highly effective problem solvers experienced no increases in these behaviors in response to economic pressure. Less effective problem solvers experienced the steepest increases in hostile behaviors in response to economic pressure. Because these predictive pathways were replicated in both generations of couples it appears that these stress and resilience processes unfold over time and across generations. PMID:27019520
Quantum random number generation for loophole-free Bell tests
NASA Astrophysics Data System (ADS)
Mitchell, Morgan; Abellan, Carlos; Amaya, Waldimar
2015-05-01
We describe the generation of quantum random numbers at multi-Gbps rates, combined with real-time randomness extraction, to give very high-purity random numbers based on quantum events at most tens of ns in the past. The system satisfies the stringent requirements of quantum non-locality tests that aim to close the timing loophole. We describe the generation mechanism using spontaneous-emission-driven phase diffusion in a semiconductor laser, digitization, and extraction by parity calculation using multi-GHz logic chips. We pay special attention to experimental proof of the quality of the random numbers and to analysis of the randomness extraction. In contrast to widely used models of randomness generators in the computer science literature, we argue that the randomness produced by spontaneous emission can be extracted from a single source.
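The parity-extraction step can be sketched in toy form; the real system implements this in multi-GHz logic, and the 8-bit sample width here is an assumption for illustration. XORing all bits of each raw digitizer sample yields one output bit whose bias is strongly suppressed relative to the bias of any individual bit:

```python
def parity(sample, bits=8):
    """XOR of the low `bits` bits of an integer sample: 0 or 1."""
    p = 0
    for _ in range(bits):
        p ^= sample & 1
        sample >>= 1
    return p

def extract(samples, bits=8):
    """One extracted random bit per raw digitizer sample."""
    return [parity(s, bits) for s in samples]

assert parity(0b10110010) == 0   # four set bits -> even parity
assert parity(0b10110011) == 1   # five set bits -> odd parity
```

If each raw bit independently had bias e (probability 1/2 + e of being 1), the parity of n bits has bias of order (2e)^n / 2, which is why the hardware can afford such a cheap extractor at line rate.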
Low-latency fiber-millimeter-wave system for future mobile fronthauling
NASA Astrophysics Data System (ADS)
Tien Dat, Pham; Kanno, Atsushi; Yamamoto, Naokatsu; Kawanishi, Tetsuya
2016-02-01
A seamless combination of fiber and millimeter-wave (MMW) systems can be very attractive for future heterogeneous mobile networks such as 5G because of its flexibility and high bandwidth. Analog mobile signal transmission over seamless fiber-MMW systems is very promising to reduce the latency and the required bandwidth, and to simplify the systems. However, stable and high-performance seamless systems are indispensable to preserve the quality of the analog signal transmission. In this paper, we present several technologies for developing such seamless fiber-MMW systems. In the downlink direction, a high-performance system can be realized using a high-quality optical MMW signal generator and a self-homodyne MMW signal detector. In the uplink direction, a cascade of radio-on-radio and radio-over-fiber systems using a burst-mode optical amplifier can support bursty radio signal transmission. Full-duplex transmission with negligible interference effects can be realized using frequency multiplexing in the radio link and wavelength-division multiplexing in the optical link. A high-spectral-efficiency MMW-over-fiber system using an intermediate-frequency-over-fiber system and high-quality remote delivery of a local oscillator signal is highly desirable to reduce costs.
Monitoring urban air quality using a high-density network of low-cost sensor nodes in Oslo, Norway.
NASA Astrophysics Data System (ADS)
Castell, Nuria; Schneider, Philipp; Vogt, Matthias; Dauge, Franck R.; Lahoz, William; Bartonova, Alena
2017-04-01
Urban air quality represents a major public health burden and is a long-standing concern to citizens. Air pollution is associated with a range of diseases, symptoms and conditions that impair health and quality of life. In Oslo, traffic, especially exhaust from heavy-duty and private diesel vehicles and dust resuspension from studded tyres, together with wood burning in winter, are the main sources of pollution. Norway, as part of the European Economic Area, is obliged to comply with the European air quality regulations and ensure clean air. Despite this, Oslo has exceeded both the NO2 and PM10 thresholds for health protection defined in the Directive 2008/50/EC. The air quality in the Oslo area is continuously monitored in 12 compliance monitoring stations. These stations provide reliable and accurate data but their density is too low to provide a detailed spatial distribution of air quality. The emergence of low-cost nodes enables observations at high spatial resolution, providing the opportunity to enhance existing monitoring systems. However, the data generated by these nodes is significantly less accurate and precise than the data provided by reference equipment. We have conducted an evaluation of low-cost nodes to monitor NO2 and PM10, comparing the data collected with low-cost nodes against CEN (European Standardization Organization) reference analysers. During January and March 2016, a network of 24 nodes was deployed in Oslo. During January, high NO2 levels were observed for several days in a row coinciding with the formation of a thermal inversion. During March, we observed an episode with high PM10 levels due to road dust resuspension. Our results show that there is a major technical challenge associated with current commercial low-cost sensors, regarding the sensor robustness and measurement repeatability. Despite this, low-cost sensor nodes are able to reproduce the NO2 and PM10 variability. 
The data from the sensors was employed to generate detailed NO2 and PM10 air quality maps using a data fusion technique. This way we were able to offer localized air quality information for the city of Oslo. The outlook for commercial low-cost sensors is promising, and our results show that currently some sensors are already capable of providing coarse information about air quality, indicating if the air quality is good, moderate or if the air is heavily polluted. This type of information could be suitable for applications that aim to raise awareness, or engage the community by monitoring local air quality, as such applications do not require the same accuracy as scientific or regulatory monitoring.
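The abstract does not specify the data fusion technique used for the maps; as a minimal, illustrative sketch (not the authors' method), inverse-variance weighting is one common way to merge a model estimate with a co-located low-cost sensor reading, down-weighting the noisier source. All numbers here are made up:

```python
def fuse(model_value, model_var, sensor_value, sensor_var):
    """Inverse-variance (precision-weighted) fusion of a model estimate
    and a co-located low-cost sensor reading for one grid cell."""
    w_model = 1.0 / model_var
    w_sensor = 1.0 / sensor_var
    fused = (w_model * model_value + w_sensor * sensor_value) / (w_model + w_sensor)
    fused_var = 1.0 / (w_model + w_sensor)  # fused estimate is less uncertain than either input
    return fused, fused_var

# A noisy sensor (large variance) pulls the model estimate only slightly
value, var = fuse(model_value=40.0, model_var=25.0,
                  sensor_value=60.0, sensor_var=100.0)
```

Because the fused variance is always below both input variances, even coarse sensor data can tighten a model field, which matches the paper's finding that low-cost nodes reproduce variability without matching reference accuracy.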
Code of Federal Regulations, 2013 CFR
2013-07-01
..., entertainment, and residential opportunities, as well as high quality office uses. (b) That portion of the... shall be directly related to creating a lively atmosphere and to promoting an active street life... as important to the commercial life of any inner city, uses that do not generate lively activities...
Fertilizer intensification and its impacts in China's HHH Plains
USDA-ARS?s Scientific Manuscript database
The accomplishment of China’s food security by application of high rates of fertilizers has generated several controversies regarding the quality of soil and water resources. Thus, the objective of this article is to assess the effects and causes of the fertilizer intensification in the Huang Huai ...
Toward a Quality-of-Life Paradigm for Sustainable Communities.
ERIC Educational Resources Information Center
Hyman, Drew
This paper suggests that current paradigms and world views guiding research for social action are inadequate for directing rural community change in a high-tech, global community. For several generations, the agrarian and industrial paradigms have been accepted as appropriate for guiding social change and development. However, there are problems…
USDA-ARS?s Scientific Manuscript database
Land data assimilations are typically based on highly uncertain assumptions regarding the statistical structure of observation and modeling errors. Left uncorrected, poor assumptions can degrade the quality of analysis products generated by land data assimilation systems. Recently, Crow and van de...
Many water utilities in the US using chloramine as disinfectant treatment in their distribution systems have experienced nitrification episodes, which detrimentally impact the water quality. Here, we used 16S rRNA sequencing data to generate high-resolution taxonomic profiles of...
ERIC Educational Resources Information Center
Balcazar, Fabricio E.
2012-01-01
Now spanning 31 years, the "Journal of Organizational Behavior Management" ("JOBM") continues to generate new conceptual ideas and high-quality research in the field of Organizational Behavior Management (OBM). However, it is a bit disheartening to realize that the growth (in terms of number of practitioners and researchers) and expected impact of…
USDA-ARS?s Scientific Manuscript database
Beta vulgaris crop types represent highly diverged populations with distinct phenotypes resulting from long-term selection. Differential end use in the crop types includes: leaf quality (chard/leaf beet), root enlargement and biomass (table beet, fodder beet, sugar beet), and secondary metabolite a...
Radioactive Decay: Audio Data Collection
ERIC Educational Resources Information Center
Struthers, Allan
2009-01-01
Many phenomena generate interesting audible time series. This data can be collected and processed using audio software. The free software package "Audacity" is used to demonstrate the process by recording, processing, and extracting click times from an inexpensive radiation detector. The high quality of the data is demonstrated with a simple…
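The click-time extraction step can also be sketched outside Audacity; assuming the recording has been exported as a mono sample array (names and numbers below are illustrative), a simple threshold detector with a dead time recovers the click instants:

```python
def click_times(samples, rate, threshold=0.5, dead_time=0.01):
    """Return times (s) where the signal first exceeds the threshold,
    ignoring re-triggers within `dead_time` seconds of the last click."""
    times = []
    last = -dead_time
    for i, s in enumerate(samples):
        t = i / rate
        if abs(s) >= threshold and t - last >= dead_time:
            times.append(t)
            last = t
    return times

# Synthetic recording: two detector clicks, the second sample of the
# first click is suppressed by the dead time
rate = 1000
samples = [0.0] * rate
samples[100] = 0.9
samples[101] = 0.8
samples[600] = 0.7
times = click_times(samples, rate)
```

Successive differences of the recovered times give the inter-click intervals whose statistics reflect the decay process.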
Brain-Wide Maps of "Fos" Expression during Fear Learning and Recall
ERIC Educational Resources Information Center
Cho, Jin-Hyung; Rendall, Sam D.; Gray, Jesse M.
2017-01-01
"Fos" induction during learning labels neuronal ensembles in the hippocampus that encode a specific physical environment, revealing a memory trace. In the cortex and other regions, the extent to which "Fos" induction during learning reveals specific sensory representations is unknown. Here we generate high-quality brain-wide…
Creative Thinking: Processes, Strategies, and Knowledge
ERIC Educational Resources Information Center
Mumford, Michael D.; Medeiros, Kelsey E.; Partlow, Paul J.
2012-01-01
Creative achievements are the basis for progress in our world. Although creative achievement is influenced by many variables, the basis for creativity is held to lie in the generation of high-quality, original, and elegant solutions to complex, novel, ill-defined problems. In the present effort, we examine the cognitive capacities that make…
Assessing nutritional status in cancer: role of the Patient-Generated Subjective Global Assessment.
Jager-Wittenaar, Harriët; Ottery, Faith D
2017-09-01
The Scored Patient-Generated Subjective Global Assessment (PG-SGA) is used internationally as the reference method for proactive risk assessment (screening), assessment, monitoring and triaging for interventions in patients with cancer. This review aims to explain the rationale behind and data supporting the PG-SGA, and to provide an overview of recent developments in the utilization of the PG-SGA and the PG-SGA Short Form. The PG-SGA was designed in the context of a paradigm known as 'anabolic competence'. Uniquely, the PG-SGA evaluates the patient's status as a dynamic rather than static process. The PG-SGA has received new attention, particularly as a screening instrument for nutritional risk or deficit, identifying treatable impediments and guiding patients and professionals in triaging for interdisciplinary interventions. The international use of the PG-SGA indicates a critical need for high-quality and linguistically validated translations of the PG-SGA. As a 4-in-1 instrument, the PG-SGA can streamline clinic work flow and improve the quality of interaction between the clinician and the patient. The availability of multiple high-quality language versions of the PG-SGA enables the inclusion of the PG-SGA in international multicenter studies, facilitating meta-analysis and benchmarking across countries.
Influence of modified atmosphere packaging on 'Star Ruby' grapefruit phytochemicals.
Chaudhary, Priyanka R; Jayaprakasha, G K; Porat, Ron; Patil, Bhimanagouda S
2015-01-28
Modified atmosphere packaging (MAP) can extend the shelf life of salads, vegetables, and fruits by generating a storage environment with low O2, high CO2, and high humidity. The current study investigates the effect of modified atmosphere and humidity generated by two plastic films, microperforated bags (MIPBs) and macroperforated bags (MAPBs), on the levels of phytochemicals present in 'Star Ruby' grapefruits (Citrus paradisi, Macf.) stored for 16 weeks at 10 °C. Control fruits were stored without any packaging film. Juice samples were analyzed every 4 weeks for ascorbic acid, carotenoids, limonoids, flavonoids, and furocoumarins and assessed for quality parameters. MAP significantly reduced weight loss compared to control grapefruits. Control fruits had more β-carotene, lycopene, and furocoumarin compared with the fruits in MAP. Flavonoid content was highest in fruits stored in MAPB (P < 0.05), while fruits stored in MIPB showed no significant difference in flavonoid content compared to control (P > 0.05). The MAP treatments did not significantly affect ascorbic acid, limonoids, or fruit quality parameters, including total soluble solids, acidity, ripening ratio, decay and disorders, fruit taste, and off-flavors after 16 weeks of storage. These results suggest that MAP can be used to maintain the quality of 'Star Ruby' grapefruit with no detrimental effect on health-promoting phytochemicals.
Diet in pregnancy-more than food.
Danielewicz, H; Myszczyszyn, G; Dębińska, A; Myszkal, A; Boznański, A; Hirnle, L
2017-12-01
High food quality, together with adequate macro- and micronutrient intake in pregnancy, is crucial for the health status of the mother and child. Recent findings suggest that it could also be beneficial or harmful in the context of the well-being of the whole future population. According to the developmental origins of health and disease hypothesis, most conditions that occur in adulthood originate in foetal life. Moreover, some epigenetic events, modified inter alia by diet, impact more than one generation. Still, the recommendations in most countries are neither popularised nor very detailed. While it seems to be important to direct diet trends towards a healthier lifestyle, the methods of preventing specific disorders like diabetes or asthma are not yet established and require further investigation. In this review, we will summarise the recommendations for diet composition in pregnancy, focusing on both diet quality and quantity. What is Known • High food quality, together with adequate macro- and micronutrient intake in pregnancy, is crucial for the health status of the mother and child. What is New • Recent findings suggest that the diet could be beneficial or harmful in the context of the well-being of the whole future population. Most conditions that occur in adulthood originate in foetal life. • Moreover, some epigenetic events, modified by diet, impact more than one generation.
Why do I always have the best ideas? The role of idea quality in unconscious plagiarism.
Perfect, Timothy J; Stark, Louisa-Jayne
2008-05-01
Groups of individuals often work together to generate solutions to a problem. Subsequently, one member of the group can plagiarise another either by recalling that person's idea as their own (recall-own plagiarism), or by generating a novel solution that duplicates a previous idea (generate-new plagiarism). The current study examines the extent to which these forms of plagiarism are influenced by the quality of the ideas. Groups of participants initially generated ideas, prior to an elaboration phase in which idea quality was manipulated in two ways: participants received feedback on the quality of the ideas as rated by independent judges, and they generated improvements to a subset of the ideas. Unconscious plagiarism was measured in recall-own and generate-new tasks. For recall, idea improvement led to increased plagiarism, while for the generate-new task, the independent ratings influenced plagiarism. These data indicate that different source-judgement processes underlie the two forms of plagiarism, neither of which can be reduced simply to memory strength.
Physicochemical assessment criteria for high-voltage pulse capacitors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darian, L. A., E-mail: LDarian@rambler.ru; Lam, L. Kh.
In the paper, the applicability of decomposition products of the internal insulation of high-voltage pulse capacitors is considered; these products, generated as the insulation ages, can be used to evaluate capacitor quality in operation and in service. As with power transformers, there have been three generations of insulation-aging markers. The area of applicability of these markers, established for power transformers, can be extended to high-voltage pulse capacitors. The research reveals a correlation between the components and quantities of first-generation markers (gaseous decomposition products of the insulation) dissolved in the insulating liquid and the remaining life of high-voltage pulse capacitors. Applying aging markers to evaluate the remaining service life of high-voltage pulse capacitors is a promising direction of research, because the design of these capacitors keeps the insulation-aging markers stable. It is necessary to continue gathering statistical data on the development of first-generation markers, and to carry out research aimed at estimating the remaining life of capacitors using markers of the second and third generations.
Object Detection from MMS Imagery Using Deep Learning for Generation of Road Orthophotos
NASA Astrophysics Data System (ADS)
Li, Y.; Sakamoto, M.; Shinohara, T.; Satoh, T.
2018-05-01
In recent years, extensive research has been conducted to automatically generate high-accuracy and high-precision road orthophotos using images and laser point cloud data acquired from a mobile mapping system (MMS). However, it is necessary to mask out non-road objects such as vehicles, bicycles, pedestrians and their shadows in MMS images in order to eliminate erroneous textures from the road orthophoto. Hence, we proposed a novel vehicle and its shadow detection model based on Faster R-CNN for automatically and accurately detecting the regions of vehicles and their shadows from MMS images. The experimental results show that the maximum recall of the proposed model was high - 0.963 (intersection-over-union > 0.7) - and the model could identify the regions of vehicles and their shadows accurately and robustly from MMS images, even when they contain varied vehicles, different shadow directions, and partial occlusions. Furthermore, it was confirmed that the quality of road orthophoto generated using vehicle and its shadow masks was significantly improved as compared to those generated using no masks or using vehicle masks only.
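The recall figure above is conditioned on intersection-over-union (IoU) exceeding 0.7. A minimal reference implementation of IoU for axis-aligned boxes in (x1, y1, x2, y2) form (the usual convention, assumed here) is:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)   # zero if the boxes do not overlap
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

score = iou((0, 0, 10, 10), (2, 2, 12, 12))
```

A predicted vehicle (or shadow) box is counted as a correct detection only when its IoU with a ground-truth box exceeds the 0.7 threshold.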
Farrell, Timothy W; Supiano, Katherine P; Wong, Bob; Luptak, Marilyn K; Luther, Brenda; Andersen, Troy C; Wilson, Rebecca; Wilby, Frances; Yang, Rumei; Pepper, Ginette A; Brunker, Cherie P
2018-05-01
Health professions trainees' performance in teams is rarely evaluated, but increasingly important as the healthcare delivery systems in which they will practice move towards team-based care. Effective management of care transitions is an important aspect of interprofessional teamwork. This mixed-methods study used a crossover design to randomise health professions trainees to work as individuals and as teams to formulate written care transition plans. Experienced external raters assessed the quality of the written care transition plans as well as both the quality of team process and overall team performance. Written care transition plan quality did not vary between individuals and teams (21.8 vs. 24.4, respectively, p = 0.42). The quality of team process did not correlate with the quality of the team-generated written care transition plans (r = -0.172, p = 0.659). However, there was a significant correlation between the quality of team process and overall team performance (r = 0.692, p = 0.039). Teams with highly engaged recorders, performing an internal team debrief, had higher-quality care transition plans. These results suggest that high-quality interprofessional care transition plans may require advance instruction as well as teamwork in finalising the plan.
Dryout occurrence in a helically coiled steam generator for nuclear power application
NASA Astrophysics Data System (ADS)
Santini, L.; Cioncolini, A.; Lombardi, C.; Ricotti, M.
2014-03-01
Dryout phenomena have been experimentally investigated in a helically coiled steam generator tube. The experiments carried out in the present work are part of a wide experimental program devoted to the study of a GEN III+ innovative nuclear power plant [1]. The experimental facility consists of an electrically heated AISI 316L stainless steel coiled tube. The tube is 32 meters long with an inner diameter of 12.53 mm, a coil diameter of 1 m and a pitch of 0.79 m, resulting in a total steam generator height of 8 meters. The thermo-hydraulic conditions for the dryout investigations covered mass fluxes between 199 and 810 kg/m2s, pressures from 10.7 to 60.7 bar, and heat fluxes from 43.6 to 209.3 kW/m2. Very high dryout qualities, between 0.72 and 0.92, were found in the range of explored conditions; comparison of our results with correlations available in the literature shows the difficulty of predicting high dryout qualities in helical coils.
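The reported geometry is internally consistent: a 32 m tube wound on a 1 m coil diameter with a 0.79 m pitch makes roughly ten turns and rises to about 8 m, matching the stated steam generator height. A quick sketch of that arithmetic:

```python
import math

# Reported geometry of the helically coiled test section
tube_length = 32.0    # m
coil_diameter = 1.0   # m
pitch = 0.79          # m, vertical rise per turn

# Tube length consumed per turn: helix arc length sqrt((pi*D)^2 + pitch^2)
length_per_turn = math.sqrt((math.pi * coil_diameter) ** 2 + pitch ** 2)
turns = tube_length / length_per_turn
height = turns * pitch  # total vertical rise, ~8 m as stated
```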
Universal route to optimal few- to single-cycle pulse generation in hollow-core fiber compressors.
Conejero Jarque, E; San Roman, J; Silva, F; Romero, R; Holgado, W; Gonzalez-Galicia, M A; Alonso, B; Sola, I J; Crespo, H
2018-02-02
Gas-filled hollow-core fiber (HCF) pulse post-compressors generating few- to single-cycle pulses are a key enabling tool for attosecond science and ultrafast spectroscopy. Achieving optimum performance in this regime can be extremely challenging due to the ultra-broad bandwidth of the pulses and the need for an adequate temporal diagnostic. These difficulties have hindered the full exploitation of HCF post-compressors, namely the generation of stable and high-quality near-Fourier-transform-limited pulses. Here we show that, independently of conditions such as the type of gas or the laser system used, there is a universal route to obtaining the shortest stable output pulse down to the single-cycle regime. Numerical simulations and experimental measurements performed with the dispersion-scan technique reveal that, under quite general conditions, post-compressed pulses exhibit a residual third-order dispersion intrinsic to optimum nonlinear propagation within the fiber, in agreement with measurements independently performed in several laboratories around the world. Understanding this effect and correcting it adequately, e.g. using simple transparent optical media, enables high-quality post-compressed pulses with only minor changes in existing setups. These optimized sources have an impact on many fields of science and technology and should enable new and exciting applications in the few- to single-cycle pulse regime.
Exploitation of Digital Surface Models Generated from WORLDVIEW-2 Data for SAR Simulation Techniques
NASA Astrophysics Data System (ADS)
Ilehag, R.; Auer, S.; d'Angelo, P.
2017-05-01
GeoRaySAR, an automated SAR simulator developed at DLR, identifies buildings in high resolution SAR data by utilizing geometric knowledge extracted from digital surface models (DSMs). Hitherto, the simulator has utilized DSMs generated from LiDAR data from airborne sensors with pre-filtered vegetation. Discarding the need for pre-optimized model input, DSMs generated from high resolution optical data (acquired with WorldView-2) are used for the extraction of building-related SAR image parts in this work. An automatic preprocessing of the DSMs has been developed for separating buildings from elevated vegetation (trees, bushes) and reducing the noise level. Based on that, automated simulations are triggered considering the properties of real SAR images. Locations in three cities, Munich, London and Istanbul, were chosen as study areas to determine advantages and limitations related to WorldView-2 DSMs as input for GeoRaySAR. In addition, the impact of DSM quality on building extraction is evaluated, along with a building DSM, i.e. a DSM containing only buildings. The results indicate that building extents can be detected with DSMs from optical satellite data with varying success, depending on the quality of the DSM as well as on the SAR imaging perspective.
A succinct rating scale for radiology report quality
Yang, Chengwu; Ouyang, Tao; Peterson, Christine M; Sarwani, Nabeel I; Tappouni, Rafel; Bruno, Michael
2014-01-01
Context: Poorly written radiology reports are common among residents and are a significant challenge for radiology education. While training may improve report quality, a professionally developed, reliable and valid scale to measure report quality does not exist. Objectives: To develop a measurement tool for report quality, the quality of report scale, with rigorous validation through empirical data. Methods: A research team of an experienced psychometrician and six senior radiologists conducted qualitative and quantitative studies. Five items were identified for the quality of report scale, each measuring a distinct aspect of report quality. Two dedicated training sessions were designed and implemented to help residents generate high-quality reports. In a blinded fashion, the quality of report scale was applied to 804 randomly selected reports issued before (n = 403) and after (n = 401) training. Full psychometric assessments were performed on the quality of report scale’s item and scale scores from the reports. The quality of report scale scores were correlated with report professionalism and attendings’ preference and were compared pre-/post-training. Results: The quality of report scale showed sound psychometric properties, with high validity and reliability. Reports with higher quality of report scale scores were more professional and preferred by attendings. Training improved the quality of report scale score, further validating the scale empirically. Conclusion: While succinct and practitioner friendly, the quality of report scale is a reliable and valid measure of radiology report quality and has the potential to be easily adapted to other fields such as pathology, where similar training would be beneficial. PMID:26770756
Patel, Prinesh N; Karakam, Vijaya Saradhi; Samanthula, Gananadhamu; Ragampeta, Srinivas
2015-10-01
Quality-by-design-based methods hold a greater level of confidence against variations and greater success in method transfer. A quality-by-design-based ultra-high performance liquid chromatography method was developed for the simultaneous assay of sumatriptan and naproxen along with their related substances. The first screening was performed by a fractional factorial design comprising 44 experiments covering reversed-phase stationary phases, pH, and organic modifiers. The results of the screening design experiments suggested that a phenyl hexyl column and acetonitrile were the best combination. The method was further optimized for flow rate, temperature, and gradient time by an experimental design of 20 experiments, and the knowledge space was generated for the effect of the variables on the response (number of peaks with resolution ≥ 1.50). A proficient design space was generated from the knowledge space by applying Monte Carlo simulation, successfully integrating quantitative robustness metrics at the optimization stage itself. The final method provided robust performance, which was verified and validated. Final conditions comprised a Waters® Acquity phenyl hexyl column with gradient elution using ammonium acetate (pH 4.12, 0.02 M) buffer and acetonitrile at a 0.355 mL/min flow rate and 30°C. The developed method separates all 13 analytes within a 15 min run time with fewer experiments compared to the traditional quality-by-testing approach. ©2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
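As an illustration of the Monte Carlo step in such quality-by-design work (the response model below is a made-up stand-in, not the fitted chromatographic model), one can perturb the three factors around their set points and estimate the probability of meeting the resolution criterion:

```python
import random

random.seed(0)

def resolution_ok(flow, temp, gradient_time):
    """Hypothetical response surface standing in for the model fitted
    from the 20-run optimization design; returns True when the
    worst-case peak resolution meets the 1.50 criterion."""
    r = 2.0 - 4.0 * abs(flow - 0.355) - 0.01 * abs(temp - 30.0) \
        - 0.02 * abs(gradient_time - 15.0)
    return r >= 1.50

# Monte Carlo robustness check: sample the factors around their set
# points (assumed spreads) and count the fraction of passing runs.
n, ok = 10000, 0
for _ in range(n):
    flow = random.gauss(0.355, 0.01)   # mL/min
    temp = random.gauss(30.0, 1.0)     # deg C
    gt = random.gauss(15.0, 0.5)       # min
    ok += resolution_ok(flow, temp, gt)
success_rate = ok / n
```

Set points whose success rate stays high under such perturbation form the "proficient" design space, which is how robustness gets quantified during optimization rather than in a separate ruggedness study.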
Shao, Wenguang; Pedrioli, Patrick G A; Wolski, Witold; Scurtescu, Cristian; Schmid, Emanuel; Vizcaíno, Juan A; Courcelles, Mathieu; Schuster, Heiko; Kowalewski, Daniel; Marino, Fabio; Arlehamn, Cecilia S L; Vaughan, Kerrie; Peters, Bjoern; Sette, Alessandro; Ottenhoff, Tom H M; Meijgaarden, Krista E; Nieuwenhuizen, Natalie; Kaufmann, Stefan H E; Schlapbach, Ralph; Castle, John C; Nesvizhskii, Alexey I; Nielsen, Morten; Deutsch, Eric W; Campbell, David S; Moritz, Robert L; Zubarev, Roman A; Ytterberg, Anders Jimmy; Purcell, Anthony W; Marcilla, Miguel; Paradela, Alberto; Wang, Qi; Costello, Catherine E; Ternette, Nicola; van Veelen, Peter A; van Els, Cécile A C M; Heck, Albert J R; de Souza, Gustavo A; Sollid, Ludvig M; Admon, Arie; Stevanovic, Stefan; Rammensee, Hans-Georg; Thibault, Pierre; Perreault, Claude; Bassani-Sternberg, Michal; Aebersold, Ruedi; Caron, Etienne
2018-01-04
Mass spectrometry (MS)-based immunopeptidomics investigates the repertoire of peptides presented at the cell surface by major histocompatibility complex (MHC) molecules. The broad clinical relevance of MHC-associated peptides, e.g. in precision medicine, provides a strong rationale for the large-scale generation of immunopeptidomic datasets and recent developments in MS-based peptide analysis technologies now support the generation of the required data. Importantly, the availability of diverse immunopeptidomic datasets has resulted in an increasing need to standardize, store and exchange this type of data to enable better collaborations among researchers, to advance the field more efficiently and to establish quality measures required for the meaningful comparison of datasets. Here we present the SysteMHC Atlas (https://systemhcatlas.org), a public database that aims at collecting, organizing, sharing, visualizing and exploring immunopeptidomic data generated by MS. The Atlas includes raw mass spectrometer output files collected from several laboratories around the globe, a catalog of context-specific datasets of MHC class I and class II peptides, standardized MHC allele-specific peptide spectral libraries consisting of consensus spectra calculated from repeat measurements of the same peptide sequence, and links to other proteomics and immunology databases. The SysteMHC Atlas project was created and will be further expanded using a uniform and open computational pipeline that controls the quality of peptide identifications and peptide annotations. Thus, the SysteMHC Atlas disseminates quality controlled immunopeptidomic information to the public domain and serves as a community resource toward the generation of a high-quality comprehensive map of the human immunopeptidome and the support of consistent measurement of immunopeptidomic sample cohorts. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Hiding Techniques for Dynamic Encryption Text based on Corner Point
NASA Astrophysics Data System (ADS)
Abdullatif, Firas A.; Abdullatif, Alaa A.; al-Saffar, Amna
2018-05-01
A hiding technique for dynamic text encryption using an encoding table and a symmetric encryption method (the AES algorithm) is presented in this paper. The encoding table is generated dynamically from the MSBs of the cover image points and used as the first phase of encryption. The Harris corner point algorithm is applied to the cover image to generate the corner points, which are used to generate a dynamic AES key for the second phase of text encryption. The embedding process uses the LSBs of the image pixels, excluding the Harris corner points, for greater robustness. Experimental results have demonstrated that the proposed scheme has good embedding quality, error-free text recovery, and a high PSNR value.
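A toy sketch of the embedding step (not the authors' implementation; the grayscale cover, corner set, and bit layout are assumptions): ciphertext bits go into pixel LSBs while Harris corner pixels are skipped, since those are reserved for key generation and must survive unchanged:

```python
def embed_lsb(cover, bits, corner_points):
    """Embed a bit sequence into pixel LSBs in raster order, skipping
    the Harris corner points (a set of (row, col) tuples) so the
    dynamic AES key derived from them remains recoverable."""
    stego = [row[:] for row in cover]
    it = iter(bits)
    for r, row in enumerate(stego):
        for c in range(len(row)):
            if (r, c) in corner_points:
                continue  # corner pixel: leave untouched
            try:
                bit = next(it)
            except StopIteration:
                return stego  # all message bits embedded
            stego[r][c] = (stego[r][c] & 0xFE) | bit
    return stego

# 4x4 all-zero cover, two hypothetical corner points, four message bits
cover = [[0] * 4 for _ in range(4)]
stego = embed_lsb(cover, [1, 0, 1, 1], {(0, 0), (1, 1)})
```

Extraction reverses the walk: read LSBs in the same raster order, skipping the same corner set, then decrypt with the corner-derived AES key.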
Li, Jianping; Yu, Changyuan; Li, Zhaohui
2014-03-15
A novel polarization-modulator-based complementary frequency shifter (PCFS) has been proposed and then used to implement the generation of a frequency-locked multicarrier with single- and dual-recirculating frequency shifting loops, respectively. The transfer functions and output properties of the PCFS and the PCFS-based multicarrier generator have been studied theoretically. Based on our simulation results using VPItransmissionMaker software, 100 stable carriers have been obtained with acceptable flatness while no DC bias control is required. The results show that the proposed PCFS has the potential to become a commercial product and be used in various scenarios.
Vortex Airy beams directly generated via liquid crystal q-Airy-plates
NASA Astrophysics Data System (ADS)
Wei, Bing-Yan; Liu, Sheng; Chen, Peng; Qi, Shu-Xia; Zhang, Yi; Hu, Wei; Lu, Yan-Qing; Zhao, Jian-Lin
2018-03-01
Liquid crystal q-Airy-plates with director distributions integrated by q-plates and polarization Airy masks are proposed and demonstrated via the photoalignment technique. Single/dual vortex Airy beams of opposite topological charges and orthogonal circular polarizations are directly generated with polarization-controllable characteristic. The singular phase of the vortex part is verified by both astigmatic transformation and digital holography. The trajectory of vortex Airy beams is investigated, manifesting separate propagation dynamics of optical vortices and Airy beams. Meanwhile, Airy beams still keep their intrinsic transverse acceleration, self-healing, and nondiffraction features. This work provides a versatile candidate for generating high-quality vortex Airy beams.
NASA Astrophysics Data System (ADS)
Zhao, Xiaoyun; Tuo, Xianguo; Ge, Qing; Peng, Ying
2017-12-01
We employ a high-quality linear axis-encircling electron beam generated by a Cuccia coupler to drive a Ka-band third-harmonic peniotron and develop a self-consistent nonlinear calculation code to numerically analyze the characteristics of the designed peniotron. It is demonstrated that, through a Cuccia coupler, a 6 kV, 0.5 A pencil beam and an input microwave power of 16 kW at 10 GHz can generate a 37 kV, 0.5 A linear axis-encircling beam characterized by a very low velocity spread. Moreover, the electron beam guiding center deviation can be adjusted easily. Driven by such a beam, a 30 GHz, Ka-band third-harmonic peniotron is predicted to achieve a conversion efficiency of 51.0% and a microwave output power of 9.44 kW; the results are in good agreement with the Magic3D simulation. Using this code, we studied the factors influencing peniotron performance; this work can provide guidelines for the design of a Ka-band third-harmonic peniotron driven by a linear electron beam and promote the practical application of high-harmonic peniotrons.
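The quoted conversion efficiency is consistent with the beam and output figures: a 37 kV, 0.5 A beam carries 18.5 kW, and 9.44 kW out of 18.5 kW is about 51%. The arithmetic:

```python
# Beam and output figures quoted for the Ka-band third-harmonic peniotron
beam_voltage = 37e3    # V, axis-encircling beam after the Cuccia coupler
beam_current = 0.5     # A
output_power = 9.44e3  # W, microwave output at 30 GHz

beam_power = beam_voltage * beam_current  # DC beam power, 18.5 kW
efficiency = output_power / beam_power    # beam-to-RF conversion, ~0.51
```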
Solution-Grown Rubrene Crystals as Radiation Detecting Devices
Carman, Leslie; Martinez, H. Paul; Voss, Lars; ...
2017-01-11
There has been increased interest in organic semiconductors over the last decade because of their unique properties. Of these, 5, 6, 11, 12-tetraphenylnaphthacene (rubrene) has generated the most interest because of its high charge carrier mobility. In this paper, large single crystals with a volume of ~1 cm3 were grown from solution by a temperature reduction technique. The faceted crystals had flat surfaces and cm-scale, visually defect-free areas suitable for physical characterization. X-ray diffraction analysis indicates that solvent does not incorporate into the crystals and photoluminescence spectra are consistent with pristine, high-crystallinity rubrene. Furthermore, the response curve to pulsed optical illumination indicates that the solution grown crystals are of similar quality to those grown by physical vapor transport, albeit larger. The good quality of these crystals in combination with the improvement of electrical contacts by application of conductive polymer on the graphite electrodes have led to the clear observation of alpha particles with these rubrene detectors. Finally, preliminary results with a 252Cf source generate a small signal with the rubrene detector and may demonstrate that rubrene can also be used for detecting high-energy neutrons.
Muthukumarasamy, S; Osmani, Z; Sharpe, A; England, R J A
2012-02-01
This study aimed to assess the quality of information available on the World Wide Web for patients undergoing thyroidectomy. The first 50 web-links generated by internet searches using the five most popular search engines and the key word 'thyroidectomy' were evaluated using the Lida website validation instrument (assessing accessibility, usability and reliability) and the Flesch Reading Ease Score. We evaluated 103 of a possible 250 websites. Mean scores (ranges) were: Lida accessibility, 48/63 (27-59); Lida usability, 36/54 (21-50); Lida reliability, 21/51 (4-38); and Flesch Reading Ease, 43.9 (2.6-77.6). The quality of internet health information regarding thyroidectomy is variable. High ranking and popularity are not good indicators of website quality. Overall, none of the websites assessed achieved high Lida scores. In order to prevent the dissemination of inaccurate or commercially motivated information, we recommend independent labelling of medical information available on the World Wide Web.
Evans, Steven T; Stewart, Kevin D; Afdahl, Chris; Patel, Rohan; Newell, Kelcy J
2017-07-14
In this paper, we discuss the optimization and implementation of a high throughput process development (HTPD) tool that utilizes commercially available micro-liter sized column technology for the purification of multiple clinically significant monoclonal antibodies. Chromatographic profiles generated using this optimized tool are shown to overlay with comparable profiles from the conventional bench-scale and clinical manufacturing scale. Further, all product quality attributes measured are comparable across scales for the mAb purifications. In addition to supporting chromatography process development efforts (e.g., optimization screening), comparable product quality results at all scales make this tool an appropriate scale model to enable purification and product quality comparisons of HTPD bioreactor conditions. The ability to perform up to 8 chromatography purifications in parallel with reduced material requirements per run creates opportunities for gathering more process knowledge in less time. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Playing Chemical Plant Environmental Protection Games with Historical Monitoring Data
Reniers, Genserik; Zhang, Laobing; Qiu, Xiaogang
2017-01-01
The chemical industry is very important for the world economy and this industrial sector represents a substantial income source for developing countries. However, existing regulations on controlling atmospheric pollutants, and the enforcement of these regulations, are often insufficient in such countries. As a result, the deterioration of surrounding ecosystems and a quality decrease of the atmospheric environment can be observed. Previous works in this domain fail to generate executable and pragmatic solutions for inspection agencies due to practical challenges. In addressing these challenges, we introduce a so-called Chemical Plant Environment Protection Game (CPEP) to generate reasonable schedules of high-accuracy air quality monitoring stations (i.e., daily management plans) for inspection agencies. First, so-called Stackelberg Security Games (SSGs) in conjunction with source estimation methods are applied in this research. Second, high-accuracy air quality monitoring stations as well as gas sensor modules are modeled in the CPEP game. Third, simplified data analysis on the regular discharges of chemical plants is utilized to construct the CPEP game. Finally, an illustrative case study is used to investigate the effectiveness of the CPEP game, and a realistic case study is conducted to illustrate how the models and algorithms proposed in this paper work in daily practice. Results show that playing a CPEP game can reduce operational costs of high-accuracy air quality monitoring stations. Moreover, evidence suggests that playing the game leads to more compliance from the chemical plants towards the inspection agencies. Therefore, the CPEP game is able to assist the environmental protection authorities in daily management work and reduce the potential risks of gaseous pollutants dispersion incidents. PMID:28961188
Image resolution enhancement via image restoration using neural network
NASA Astrophysics Data System (ADS)
Zhang, Shuangteng; Lu, Yihong
2011-04-01
Image super-resolution aims to obtain a high-quality image at a resolution higher than that of the original coarse one. This paper presents a new neural network-based method for image super-resolution. In this technique, super-resolution is considered as an inverse problem. An observation model that closely follows the physical image acquisition process is established to solve the problem. Based on this model, a cost function is created and minimized by a Hopfield neural network to produce high-resolution images from the corresponding low-resolution ones. Unlike some other single-frame super-resolution techniques, this technique takes into consideration point spread function blurring as well as additive noise, and therefore generates high-resolution images with more preserved or restored image details. Experimental results demonstrate that the high-resolution images obtained by this technique have a very high quality in terms of PSNR and are visually more pleasing.
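The observation model and cost minimization described above can be sketched in miniature. The example below is a hedged illustration only: it uses a 1D circular signal, a toy PSF, and plain gradient descent in place of the paper's Hopfield-network minimiser; none of these specifics come from the paper itself.

```python
def degrade(x, psf, factor):
    """Forward observation model y = DHx: circular convolution with a
    point spread function (H) followed by downsampling (D). Additive
    noise is omitted in this sketch."""
    n, c = len(x), len(psf) // 2
    blurred = [sum(p * x[(i + k - c) % n] for k, p in enumerate(psf))
               for i in range(n)]
    return blurred[::factor]

def super_resolve(y, psf, factor, n_hr, steps=500, lr=0.1):
    """Minimise the cost ||y - DHx||^2 by plain gradient descent,
    standing in for the paper's Hopfield-network minimiser."""
    c = len(psf) // 2
    x = [0.0] * n_hr
    for _ in range(steps):
        # Residual between the observed low-res signal and the model output.
        r = [yj - dj for yj, dj in zip(y, degrade(x, psf, factor))]
        # Gradient step x += lr * (DH)^T r, applied by scattering each
        # residual back through the adjoint of the forward model.
        for j, rj in enumerate(r):
            for k, p in enumerate(psf):
                x[(j * factor + k - c) % n_hr] += lr * p * rj
    return x
```

On a toy 8-sample signal downsampled by 2, the recovered signal reproduces the observations exactly; since the problem is underdetermined, gradient descent converges to a minimum-norm fit rather than a unique inverse, which is why the full method also needs the regularizing structure of the observation model.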
Liquid Metal Engineering by Application of Intensive Melt Shearing
NASA Astrophysics Data System (ADS)
Patel, Jayesh; Zuo, Yubo; Fan, Zhongyun
In all casting processes, liquid metal treatment is an essential step in order to produce high quality cast products. A new liquid metal treatment technology has been developed which comprises a rotor/stator set-up that delivers a high shear rate to the liquid melt. It generates macro-flow in a volume of melt for distributive mixing and intensive shearing for dispersive mixing. The high shear device exhibits significantly enhanced kinetics for phase transformations, uniform dispersion, distribution and size reduction of solid particles and gas bubbles, improved homogenisation of chemical composition and temperature fields and also forced wetting of usually difficult-to-wet solid particles in the liquid metal. Hence, it can benefit various casting processes to produce high quality cast products with refined microstructure and enhanced mechanical properties. Here, we report an overview on the application of the new high shear technology to the processing of light metal alloys.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abel, David; Holloway, Tracey; Harkey, Monica
We evaluate how fine particulate matter (PM2.5) and precursor emissions could be reduced if 17% of electricity generation was replaced with solar photovoltaics (PV) in the Eastern United States. Electricity generation is simulated using GridView, then used to scale electricity-sector emissions of sulfur dioxide (SO2) and nitrogen oxides (NOX) from an existing gridded inventory of air emissions. This approach offers a novel method to leverage advanced electricity simulations with state-of-the-art emissions inventories, without necessitating recalculation of emissions for each facility. The baseline and perturbed emissions are input to the Community Multiscale Air Quality Model (CMAQ version 4.7.1) for a full accounting of time- and space-varying air quality changes associated with the 17% PV scenario. These results offer a high-value opportunity to evaluate the reduced-form AVoided Emissions and geneRation Tool (AVERT), while using AVERT to test the sensitivity of results to changing base-years and levels of solar integration. We find that average NOX and SO2 emissions across the region decrease 20% and 15%, respectively. PM2.5 concentrations decreased on average 4.7% across the Eastern U.S., with nitrate (NO3-) PM2.5 decreasing 3.7% and sulfate (SO42-) PM2.5 decreasing 9.1%. In the five largest cities in the region, we find that the most polluted days show the most significant PM2.5 decrease under the 17% PV generation scenario, and that the greatest benefits accrue to cities in or near the Ohio River Valley. We find summer health benefits from reduced PM2.5 exposure estimated as 1424 avoided premature deaths (95% Confidence Interval (CI): 284 deaths, 2732 deaths) or a health savings of $13.1 billion (95% CI: $0.6 billion, $43.9 billion). These results highlight the potential for renewable energy as a tool for air quality managers to support current and future health-based air quality regulations.
Cornelissen, Frans; Cik, Miroslav; Gustin, Emmanuel
2012-04-01
High-content screening has brought new dimensions to cellular assays by generating rich data sets that characterize cell populations in great detail and detect subtle phenotypes. To derive relevant, reliable conclusions from these complex data, it is crucial to have informatics tools supporting quality control, data reduction, and data mining. These tools must reconcile the complexity of advanced analysis methods with the user-friendliness demanded by the user community. After review of existing applications, we realized the possibility of adding innovative new analysis options. Phaedra was developed to support workflows for drug screening and target discovery, interact with several laboratory information management systems, and process data generated by a range of techniques including high-content imaging, multicolor flow cytometry, and traditional high-throughput screening assays. The application is modular and flexible, with an interface that can be tuned to specific user roles. It offers user-friendly data visualization and reduction tools for HCS but also integrates Matlab for custom image analysis and the Konstanz Information Miner (KNIME) framework for data mining. Phaedra features efficient JPEG2000 compression and full drill-down functionality from dose-response curves down to individual cells, with exclusion and annotation options, cell classification, statistical quality controls, and reporting.
Critical maternal health knowledge gaps in low- and middle-income countries for the post-2015 era.
Kendall, Tamil; Langer, Ana
2015-06-05
Effective interventions to promote maternal health and address obstetric complications exist; however, 800 women die every day during pregnancy and childbirth from largely preventable causes, and more than 90% of these deaths occur in low- and middle-income countries (LMIC). In 2014, the Maternal Health Task Force consulted 26 global maternal health researchers to identify persistent and critical knowledge gaps to be filled to reduce maternal morbidity and mortality and improve maternal health. The vision of maternal health articulated was comprehensive and priorities for knowledge generation encompassed improving the availability, accessibility, acceptability, and quality of institutional labor and delivery services and other effective interventions, such as contraception and safe abortion services. Respondents emphasized the need for health systems research to identify models that can deliver what is known to be effective to prevent and treat the main causes of maternal death at scale in different contexts and to sustain coverage and quality over time. Researchers also emphasized the development of tools to measure quality of care and promote ongoing quality improvement at the facility, district, and national level. Knowledge generation to improve distribution and retention of healthcare workers, facilitate task shifting, develop and evaluate training models to improve "hands-on" skills and promote evidence-based practice, and increase managerial capacity at different levels of the health system were also prioritized. Interviewees noted that attitudes, behavior, and power relationships between health professionals and within institutions must be transformed to achieve coverage of high-quality maternal health services in LMIC.
The increasing burden of non-communicable diseases, urbanization, and the persistence of social and economic inequality were identified as emerging challenges that require knowledge generation to improve health system responses and evaluate progress. Respondents emphasized evaluating effectiveness, feasibility, and equity impacts of health system interventions. A prominent role for implementation science, evidence for policy advocacy, and interdisciplinary collaboration were identified as critical areas for knowledge generation to improve maternal health in the post-2015 era.
Yu, Yang; Wei, Jiankai; Zhang, Xiaojun; Liu, Jingwen; Liu, Chengzhang; Li, Fuhua; Xiang, Jianhai
2014-01-01
The application of next generation sequencing technology has greatly facilitated high throughput single nucleotide polymorphism (SNP) discovery and genotyping in genetic research. In the present study, SNPs were discovered based on two transcriptomes of Litopenaeus vannamei (L. vannamei) generated from Illumina sequencing platform HiSeq 2000. One transcriptome of L. vannamei was obtained through sequencing on the RNA from larvae at mysis stage and its reference sequence was de novo assembled. The data from another transcriptome were downloaded from NCBI and the reads of the two transcriptomes were mapped separately to the assembled reference by BWA. SNP calling was performed using SAMtools. A total of 58,717 and 36,277 SNPs with high quality were predicted from the two transcriptomes, respectively. SNP calling was also performed using the reads of two transcriptomes together, and a total of 96,040 SNPs with high quality were predicted. Among these 96,040 SNPs, 5,242 and 29,129 were predicted as non-synonymous and synonymous SNPs respectively. Characterization analysis of the predicted SNPs in L. vannamei showed that the estimated SNP frequency was 0.21% (one SNP per 476 bp) and the estimated ratio for transition to transversion was 2.0. Fifty SNPs were randomly selected for validation by Sanger sequencing after PCR amplification and 76% of SNPs were confirmed, which indicated that the SNPs predicted in this study were reliable. These SNPs will be very useful for genetic study in L. vannamei, especially for the high density linkage map construction and genome-wide association studies. PMID:24498047
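The transition-to-transversion ratio of 2.0 estimated above reduces to a simple classification over allele pairs. The sketch below is illustrative only and is independent of the BWA/SAMtools pipeline the authors used; the toy allele pairs are invented for the example.

```python
# Purines are A and G; pyrimidines are C and T. A substitution within
# one class (A<->G or C<->T) is a transition; across classes, a transversion.
PURINES = {"A", "G"}
PYRIMIDINES = {"C", "T"}

def classify_snp(ref, alt):
    """Classify a biallelic SNP as 'transition' or 'transversion'."""
    pair = {ref.upper(), alt.upper()}
    if pair == PURINES or pair == PYRIMIDINES:
        return "transition"
    return "transversion"

def ts_tv_ratio(snps):
    """Transition/transversion ratio over an iterable of (ref, alt) pairs."""
    ts = sum(1 for r, a in snps if classify_snp(r, a) == "transition")
    tv = sum(1 for r, a in snps if classify_snp(r, a) == "transversion")
    return ts / tv
```

For example, `ts_tv_ratio([("A", "G"), ("C", "T"), ("A", "C")])` gives 2.0, matching the genome-wide estimate reported above.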
Spatiotemporal database of US congressional elections, 1896–2014
Wolf, Levi John
2017-01-01
High-quality historical data about US Congressional elections has long provided common ground for electoral studies. However, advances in geographic information science have recently made it efficient to compile, distribute, and analyze large spatio-temporal data sets on the structure of US Congressional districts. A single spatio-temporal data set that relates US Congressional election results to the spatial extent of the constituencies has not yet been developed. To address this, existing high-quality data sets of election returns were combined with a spatiotemporal data set on Congressional district boundaries to generate a new spatio-temporal database of US Congressional election results that are explicitly linked to the geospatial data about the districts themselves. PMID:28809849
Assessment of Reference Height Models on Quality of TanDEM-X DEM
NASA Astrophysics Data System (ADS)
Mirzaee, S.; Motagh, M.; Arefi, H.
2015-12-01
The aim of this study is to investigate the effect of various Global Digital Elevation Models (GDEMs) in producing a high-resolution topography model using TanDEM-X (TDX) Coregistered Single Look Slant Range Complex (CoSSC) images. We selected an image acquired on June 12, 2012 over the Doroud region in Lorestan, west of Iran, and used four external digital elevation models in our processing: DLR/ASI X-SAR DEM (SRTM-X, 30 m resolution), ASTER GDEM Version 2 (ASTER-GDEMV2, 30 m resolution), NASA SRTM Version 4 (SRTM-V4, 90 m resolution), and a local photogrammetry-based DEM prepared by the National Cartographic Center (NCC DEM, 10 m resolution) of Iran. The InSAR procedure for DEM generation was repeated four times, once with each of the four external height references. The quality of each external DEM was initially assessed using ICESat filtered points. Then, the quality of each TDX-based DEM was assessed using the more precise external DEM selected in the previous step. Results showed that both the local (NCC) DEM and SRTM X-band performed best (RMSE < 9 m) for TDX-DEM generation. In contrast, ASTER GDEM v2 and SRTM C-band v4 showed poorer quality.
Neural network Hilbert transform based filtered backprojection for fast inline x-ray inspection
NASA Astrophysics Data System (ADS)
Janssens, Eline; De Beenhouwer, Jan; Van Dael, Mattias; De Schryver, Thomas; Van Hoorebeke, Luc; Verboven, Pieter; Nicolai, Bart; Sijbers, Jan
2018-03-01
X-ray imaging is an important tool for quality control since it allows inspection of the interior of products in a non-destructive way. Conventional x-ray imaging, however, is slow and expensive. Inline x-ray inspection, on the other hand, can pave the way towards fast and individual quality control, provided that a sufficiently high throughput can be achieved at a minimal cost. To meet these criteria, an inline inspection acquisition geometry is proposed where the object moves and rotates on a conveyor belt while it passes a fixed source and detector. Moreover, for this acquisition geometry, a new neural-network-based reconstruction algorithm is introduced: the neural network Hilbert transform based filtered backprojection. The proposed algorithm is evaluated on both simulated and real inline x-ray data and has been shown to generate high-quality reconstructions of 400 × 400 reconstruction pixels within 200 ms, thereby meeting the high-throughput criteria.
High-quality electron beam generation in a proton-driven hollow plasma wakefield accelerator
NASA Astrophysics Data System (ADS)
Li, Y.; Xia, G.; Lotov, K. V.; Sosedkin, A. P.; Hanahoe, K.; Mete-Apsimon, O.
2017-10-01
Simulations of proton-driven plasma wakefield accelerators have demonstrated substantially higher accelerating gradients compared to conventional accelerators and the viability of accelerating electrons to the energy frontier in a single plasma stage. However, due to the strong intrinsic transverse fields varying both radially and in time, the witness beam quality is still far from suitable for practical application in future colliders. Here we demonstrate the efficient acceleration of electrons in proton-driven wakefields in a hollow plasma channel. In this regime, the witness bunch is positioned in a region with a strong accelerating field, free from plasma electrons and ions. We show that an electron beam carrying about 10% of the charge of the 1 TeV proton driver can be accelerated to 0.6 TeV with preserved normalized emittance in a single channel of 700 m. This high-quality and high-charge beam may pave the way for the development of future plasma-based energy frontier colliders.
SU-E-T-43: A Methodology for Quality Control of IMPT Treatment Plan Based On VMAT Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, S; Tianjin Medical University Cancer Institute and Hospital; Yang, Y
Purpose: IMPT plan design is highly dependent on the planner's experience. VMAT plan design is relatively mature and can even be automated. The quality of an IMPT plan designed by an inexperienced planner could be inferior to that of a VMAT plan designed by an experienced planner or by automatic planning software. Here we introduce a method for designing IMPT plans based on VMAT plans, to ensure that the IMPT plan is superior to the IMRT/VMAT plan in the majority of clinical scenarios. Methods: To design a new IMPT plan, a VMAT plan is first generated either by an experienced planner or by an in-house developed automatic planning system. An in-house developed tool is used to generate the dose volume constraints for the IMPT plan as a plan template for the Eclipse TPS. The beam angles for the IMPT plan are selected based on the preferred angles in the VMAT plan. The IMPT plan is designed by importing the plan objectives generated from the VMAT plan. The majority of thoracic IMPT plans in our center are designed using this approach. In this work, a thoracic IMPT plan under the RTOG 1308 protocol is selected to demonstrate the effectiveness and efficiency of this approach. The dosimetric indices of the IMPT plan are compared with the VMAT plan. Results: The PTV D95, lung V20, MLD, mean heart dose, esophagus D1, and cord D1 are 70 Gy, 31%, 17.8 Gy, 25.5 Gy, 73 Gy, and 45 Gy for the IMPT plan, and 65.3 Gy, 34%, 21.6 Gy, 35 Gy, 74 Gy, and 48 Gy for the VMAT plan. For the majority of cases, the high-dose region of the normal tissue in proximity to the PTV is comparable between the IMPT and VMAT plans. The low-dose region of the IMPT plan is significantly better than that of the VMAT plan. Conclusion: Using the knowledge gained in VMAT plan design can help efficiently and effectively design high-quality IMPT plans. The quality of the IMPT plan can be controlled to ensure its superiority over the VMAT/IMRT plan.
Improving the baking quality of bread wheat by genomic selection in early generations.
Michel, Sebastian; Kummer, Christian; Gallee, Martin; Hellinger, Jakob; Ametz, Christian; Akgöl, Batuhan; Epure, Doru; Güngör, Huseyin; Löschenberger, Franziska; Buerstmayr, Hermann
2018-02-01
Genomic selection shows great promise for pre-selecting lines with superior bread baking quality in early generations, 3 years ahead of labour-intensive, time-consuming, and costly quality analysis. The genetic improvement of baking quality is one of the grand challenges in wheat breeding, as the assessment of the associated traits often involves time-consuming, labour-intensive, and costly testing, forcing breeders to postpone sophisticated quality tests to the very last phases of variety development. The prospect of genomic selection for complex traits like grain yield has been shown in numerous studies, and it might thus also be an interesting method for selecting baking quality traits. Hence, we focused in this study on the accuracy of genomic selection for laborious and expensive-to-phenotype quality traits, as well as its selection response in comparison with phenotypic selection. More than 400 genotyped wheat lines were, therefore, phenotyped for protein content and for dough viscoelastic and mixing properties related to baking quality in multi-environment trials in 2009-2016. The average prediction accuracy across three independent validation populations was r = 0.39 and could be increased to r = 0.47 by modelling major QTL as fixed effects as well as employing multi-trait prediction models, which resulted in an acceptable prediction accuracy for all dough rheological traits (r = 0.38-0.63). Genomic selection can furthermore be applied 2-3 years earlier than direct phenotypic selection, and the estimated selection response was nearly twice as high in comparison with indirect selection by protein content for baking quality related traits. This considerable advantage of genomic selection could accordingly support breeders in their selection decisions and aid in efficiently combining superior baking quality with grain yield in newly developed wheat varieties.
Textbook Outcome: A Composite Measure for Quality of Elective Aneurysm Surgery.
Karthaus, Eleonora G; Lijftogt, Niki; Busweiler, Linde A D; Elsman, Bernard H P; Wouters, Michel W J M; Vahl, Anco C; Hamming, Jaap F
2017-11-01
To investigate a new composite quality measurement, which comprises a desirable outcome for elective aneurysm surgery, called "Textbook Outcome" (TO). Single quality indicators in vascular surgery are often not distinctive and insufficiently reflect the quality of care. All patients undergoing elective abdominal aortic aneurysm repair, registered in the Dutch Surgical Aneurysm Audit between 2014 and 2015, were included. TO was defined as the percentage of patients who had abdominal aortic aneurysm repair without intraoperative complications, postoperative surgical complications, reinterventions, prolonged hospital stay [endovascular aneurysm repair (EVAR) ≤4 d, open surgical repair (OSR) ≤10 d], readmissions, and postoperative mortality (≤30 d after surgery/at discharge). Case-mix adjusted TO rates were used to compare hospitals and to compare individual hospital results for different procedures. Five thousand one hundred seventy patients were included, of whom 4039 were treated with EVAR and 1131 with OSR. TO was achieved in 71% of EVAR and 53% of OSR. Important obstacles to achieving TO were a prolonged hospital stay, postoperative complications, and readmissions. Adjusted TO rates varied from 38% to 89% (EVAR) and from 0% to 97% (OSR) between individual hospitals. Hospitals with a high TO for OSR also had a high TO for EVAR; however, a high TO for EVAR did not imply a high TO for OSR. TO generates additional information for evaluating the overall quality of care in elective aneurysm surgery, which can subsequently be used by hospitals to improve the quality of their care.
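TO as defined above is an all-or-nothing conjunction of six indicators with a procedure-dependent length-of-stay threshold. A minimal sketch follows; the field names are illustrative inventions, not the Dutch Surgical Aneurysm Audit schema, and no case-mix adjustment is included.

```python
def textbook_outcome(patient, procedure):
    """True only if every undesirable event was avoided. The
    length-of-stay threshold depends on procedure (EVAR <= 4 days,
    OSR <= 10 days), per the definition above. Field names are
    illustrative only."""
    max_stay = 4 if procedure == "EVAR" else 10
    return (not patient["intraop_complication"]
            and not patient["postop_surgical_complication"]
            and not patient["reintervention"]
            and patient["hospital_stay_days"] <= max_stay
            and not patient["readmission"]
            and not patient["postop_mortality"])

def to_rate(patients, procedure):
    """Fraction of patients achieving Textbook Outcome (unadjusted)."""
    return sum(textbook_outcome(p, procedure) for p in patients) / len(patients)
```

The all-or-nothing design is what makes TO stricter than any single indicator: one failed component, such as a six-day stay after EVAR, forfeits the outcome even if everything else went well.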
A comprehensive evaluation of assembly scaffolding tools
2014-01-01
Background Genome assembly is typically a two-stage process: contig assembly followed by the use of paired sequencing reads to join contigs into scaffolds. Scaffolds are usually the focus of reported assembly statistics; longer scaffolds greatly facilitate the use of genome sequences in downstream analyses, and it is appealing to present larger numbers as metrics of assembly performance. However, scaffolds are highly prone to errors, especially when generated using short reads, which can directly result in inflated assembly statistics. Results Here we provide the first independent evaluation of scaffolding tools for second-generation sequencing data. We find large variations in the quality of results depending on the tool and dataset used. Even extremely simple test cases of perfect input, constructed to elucidate the behaviour of each algorithm, produced some surprising results. We further dissect the performance of the scaffolders using real and simulated sequencing data derived from the genomes of Staphylococcus aureus, Rhodobacter sphaeroides, Plasmodium falciparum and Homo sapiens. The results from simulated data are of high quality, with several of the tools producing perfect output. However, at least 10% of joins remain unidentified when using real data. Conclusions The scaffolders vary in their usability, speed and number of correct and missed joins made between contigs. Results from real data highlight opportunities for further improvements of the tools. Overall, SGA, SOPRA and SSPACE generally outperform the other tools on our datasets. However, the quality of the results is highly dependent on the read mapper and genome complexity. PMID:24581555
Wang, Qian; Zhang, Qionghua; Dzakpasu, Mawuli; Lian, Bin; Wu, Yaketon; Wang, Xiaochang C
2018-03-01
Stormwater particles washed from road-deposited sediments (RDS) are traditionally characterized as either turbidity or total suspended solids (TSS). Although these parameters are influenced by particle sizes, neither of them characterizes the particle size distribution (PSD), which is of great importance in pollutant entrainment and treatment performance. Therefore, the ratio of turbidity to TSS (Tur/TSS) is proposed and validated as a potential surrogate for the bulk PSD and quality of stormwater runoff. The results show an increasing trend of Tur/TSS with finer sizes of both RDS and stormwater runoff. Taking heavy metals (HMs, including Cu, Pb, Zn, Cr, and Ni) as typical pollutants in stormwater runoff, the concentrations (mg/kg) were found to vary significantly during rainfall events and tended to increase significantly with Tur/TSS. Therefore, Tur/TSS is a valid parameter to characterize the PSD and quality of stormwater. The high negative correlations between Tur/TSS and rainfall intensity demonstrate that stormwater with higher Tur/TSS is generated under low intensity and is thus characterized by small volume, finer sizes, weak settleability, and greater mobility and bioavailability. Conversely, stormwater with lower Tur/TSS is generated under high intensity and is thus characterized by large volume, coarser sizes, good settleability, and low mobility and bioavailability. These results highlight the need to control stormwater with high Tur/TSS. Moreover, Tur/TSS can aid the selection of stormwater control measures with appropriate detention storage, pollution loading, and removal effectiveness of particles.
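The Tur/TSS surrogate and its reported correlation with rainfall intensity reduce to a ratio and a Pearson coefficient. The sketch below uses illustrative values, not the study's data.

```python
import math

def tur_tss(turbidity_ntu, tss_mg_l):
    """The proposed surrogate: ratio of turbidity (NTU) to TSS (mg/L)."""
    return turbidity_ntu / tss_mg_l

def pearson_r(xs, ys):
    """Pearson correlation coefficient, as used to relate Tur/TSS to
    rainfall intensity. A strong negative r indicates that higher
    Tur/TSS (finer particles) occurs under lower-intensity rainfall."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

A perfectly monotone decreasing pairing of Tur/TSS against intensity yields r = -1, the limiting case of the "high negative correlations" described above.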
High-throughput genotyping of hop (Humulus lupulus L.) utilising diversity arrays technology (DArT).
Howard, E L; Whittock, S P; Jakše, J; Carling, J; Matthews, P D; Probasco, G; Henning, J A; Darby, P; Cerenak, A; Javornik, B; Kilian, A; Koutoulis, A
2011-05-01
Implementation of molecular methods in hop (Humulus lupulus L.) breeding is dependent on the availability of sizeable numbers of polymorphic markers and a comprehensive understanding of genetic variation. However, use of molecular marker technology is limited due to expense, time inefficiency, laborious methodology and dependence on DNA sequence information. Diversity arrays technology (DArT) is a high-throughput cost-effective method for the discovery of large numbers of quality polymorphic markers without reliance on DNA sequence information. This study is the first to utilise DArT for hop genotyping, identifying 730 polymorphic markers from 92 hop accessions. The marker quality was high and similar to the quality of DArT markers previously generated for other species; although percentage polymorphism and polymorphism information content (PIC) were lower than in previous studies deploying other marker systems in hop. Genetic relationships in hop illustrated by DArT in this study coincide with knowledge generated using alternate methods. Several statistical analyses separated the hop accessions into genetically differentiated North American and European groupings, with hybrids between the two groups clearly distinguishable. Levels of genetic diversity were similar in the North American and European groups, but higher in the hybrid group. The markers produced from this time and cost-efficient genotyping tool will be a valuable resource for numerous applications in hop breeding and genetics studies, such as mapping, marker-assisted selection, genetic identity testing, guidance in the maintenance of genetic diversity and the directed breeding of superior cultivars.
Role of H2O in Generating Subduction Zone Earthquakes
NASA Astrophysics Data System (ADS)
Hasegawa, A.
2017-03-01
A dense nationwide seismic network and high seismic activity in Japan have provided a large volume of high-quality data, enabling high-resolution imaging of the seismic structures defining the Japanese subduction zones. Here, the role of H2O in generating earthquakes in subduction zones is discussed based mainly on recent seismic studies in Japan using these high-quality data. Locations of intermediate-depth intraslab earthquakes and seismic velocity and attenuation structures within the subducted slab provide evidence that strongly supports a role for H2O in generating intermediate-depth intraslab earthquakes, although the details leading to earthquake rupture are still poorly understood. Coseismic rotations of the principal stress axes observed after great megathrust earthquakes demonstrate that the plate interface is very weak, which is probably caused by overpressured fluids. Detailed tomographic imaging of the seismic velocity structure in and around plate boundary zones suggests that interplate coupling is affected by local fluid overpressure. Seismic tomography studies also show the presence of inclined sheet-like seismic low-velocity, high-attenuation zones in the mantle wedge. These may correspond to the upwelling flow portion of subduction-induced secondary convection in the mantle wedge. The upwelling flows reach the arc Moho directly beneath the volcanic areas, suggesting a direct relationship. H2O originally liberated from the subducted slab is transported by this upwelling flow to the arc crust. The H2O that reaches the crust is overpressured above hydrostatic values, weakening the surrounding crustal rocks and decreasing the shear strength of faults, thereby inducing shallow inland earthquakes. These observations suggest that H2O expelled from the subducting slab plays an important role in generating subduction zone earthquakes, both within the subduction zone itself and within the magmatic arc occupying its hanging wall.
NASA Astrophysics Data System (ADS)
Papapostolou, Vasileios; Zhang, Hang; Feenstra, Brandon J.; Polidori, Andrea
2017-12-01
A state-of-the-art integrated chamber system has been developed for evaluating the performance of low-cost air quality sensors. The system contains two professional grade chamber enclosures. A 1.3 m3 stainless-steel outer chamber and a 0.11 m3 Teflon-coated stainless-steel inner chamber are used to create controlled aerosol and gaseous atmospheres, respectively. Both chambers are temperature and relative humidity controlled with capability to generate a wide range of environmental conditions. The system is equipped with an integrated zero-air system, an ozone and two aerosol generation systems, a dynamic dilution calibrator, certified gas cylinders, an array of Federal Reference Method (FRM), Federal Equivalent Method (FEM), and Best Available Technology (BAT) reference instruments and an automated control and sequencing software. Our experiments have demonstrated that the chamber system is capable of generating stable and reproducible aerosol and gas concentrations at low, medium, and high levels. This paper discusses the development of the chamber system along with the methods used to quantitatively evaluate sensor performance. Considering that a significant number of academic and research institutions, government agencies, public and private institutions, and individuals are becoming interested in developing and using low-cost air quality sensors, it is important to standardize the procedures used to evaluate their performance. The information discussed herein provides a roadmap for entities who are interested in characterizing air quality sensors in a rigorous, systematic and reproducible manner.
A next generation air quality modeling system is being developed at the U.S. EPA to enable seamless modeling of air quality from global to regional to (eventually) local scales. State of the science chemistry and aerosol modules from the Community Multiscale Air Quality (CMAQ) mo...
Assessment of shrimp farming impact on groundwater quality using analytical hierarchy process
NASA Astrophysics Data System (ADS)
Anggie, Bernadietta; Subiyanto, Arief, Ulfah Mediaty; Djuniadi
2018-03-01
The growth of shrimp farming affects groundwater quality. Conventional assessment of the impact of shrimp farming on groundwater quality has limited accuracy. This paper presents an implementation of the Analytical Hierarchy Process (AHP) for assessing the impact of shrimp farming on groundwater quality. The data used are impact data for shrimp farming in one region of Indonesia from 2006-2016. Eight criteria, divided into 49 sub-criteria, were used in this study. AHP weighting was performed to determine the importance level of each criterion and sub-criterion. The final priority class of shrimp-farming impact was obtained from the calculation of the criteria and sub-criteria weights. Validation was done by comparing the priority classes of shrimp-farming impact with water quality conditions. The results show that 50% of the total area fell into the moderate priority class, 37% into the low priority class and 13% into the high priority class. The validation shows that the impact assessment agrees closely with the measured groundwater quality conditions. This study demonstrates that AHP-based assessment captures shrimp-farming impact with higher accuracy and can be used as a basis for fisheries planning to deal with the impacts that have been generated.
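The AHP weighting step can be sketched as follows. The 3×3 pairwise comparison matrix is hypothetical (the study's actual 8-criterion and 49-sub-criterion matrices are not reproduced in the abstract); the priority weights are the normalized principal eigenvector, and the consistency ratio checks that the judgments are acceptably coherent.

```python
import numpy as np

# Hypothetical 3-criterion pairwise comparison matrix on Saaty's 1-9 scale;
# a stand-in for the paper's (unpublished here) 8-criterion matrices.
A = np.array([
    [1.0,   3.0, 5.0],
    [1/3.0, 1.0, 2.0],
    [1/5.0, 1/2.0, 1.0],
])

# Saaty's random consistency index by matrix size (standard values).
RANDOM_INDEX = {3: 0.58, 4: 0.90, 5: 1.12, 8: 1.41}

def ahp_weights(A):
    """Priority weights = normalized principal eigenvector of A."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)           # index of the principal eigenvalue
    w = np.abs(vecs[:, k].real)
    return w / w.sum(), vals[k].real

def consistency_ratio(A, lam_max):
    """CR = CI / RI; CR < 0.1 is the usual acceptability threshold."""
    n = A.shape[0]
    ci = (lam_max - n) / (n - 1)
    return ci / RANDOM_INDEX[n]

w, lam = ahp_weights(A)
cr = consistency_ratio(A, lam)
```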
Albion: the UK 3rd generation high-performance thermal imaging programme
NASA Astrophysics Data System (ADS)
McEwen, R. K.; Lupton, M.; Lawrence, M.; Knowles, P.; Wilson, M.; Dennis, P. N. J.; Gordon, N. T.; Lees, D. J.; Parsons, J. F.
2007-04-01
The first generation of high performance thermal imaging sensors in the UK was based on two-axis opto-mechanical scanning systems and small (4-16 element) arrays of the SPRITE detector, developed during the 1970s. Almost two decades later, a 2nd Generation system, STAIRS C, was introduced, based on single-axis scanning and a long linear array of approximately 3000 elements. The UK has now begun the industrialisation of 3rd Generation High Performance Thermal Imaging under a programme known as "Albion". Three new high performance cadmium mercury telluride arrays are being manufactured. The CMT material is grown by MOVPE on low cost substrates and bump-bonded to the silicon read-out circuit (ROIC). To maintain low production costs, all three detectors are designed to fit with existing standard Integrated Detector Cooling Assemblies (IDCAs). The two largest focal planes are conventional devices operating in the MWIR and LWIR spectral bands. A smaller-format LWIR device is also described which has a smart ROIC, enabling much longer stare times than are feasible with conventional pixel circuits, thus achieving very high sensitivity. A new reference surface technology for thermal imaging sensors is described, based on Negative Luminescence (NL), which offers several advantages over conventional Peltier references, improving the quality of the Non-Uniformity Correction (NUC) algorithms.
Fungal-to-bacterial dominance of soil detrital food-webs: Consequences for biogeochemistry
NASA Astrophysics Data System (ADS)
Rousk, Johannes; Frey, Serita
2015-04-01
Resolving fungal and bacterial groups within the microbial decomposer community is thought to capture disparate microbial life strategies, associating bacteria with an r-selected strategy for carbon (C) and nutrient use, and fungi with a K-selected strategy. Additionally, food-web models have established a widely held belief that the bacterial decomposer pathway in soil supports high turnover rates of easily available substrates, while the slower fungal pathway supports the decomposition of more complex organic material, thus characterising the biogeochemistry of the ecosystem. Three field experiments that generated gradients of SOC quality were assessed: (1) the Detritus Input, Removal, and Trenching (DIRT) experiment in mixed hardwood stands of a temperate forest at Harvard Forest LTER, US, where 23 years of experimentally adjusted litter and root inputs have affected SOC quality; (2) field application of 14C-labelled glucose to grassland soils, sampled over the course of 13 months to generate an age gradient of SOM (1 day to 13 months); and (3) the Park Grass Experiment at Rothamsted, UK, where 150 years of continuous N fertilisation (0, 50, 100, 150 kg N ha-1 y-1) have affected the quality of SOM in grassland soils. A combination of carbon stable and radio isotope studies, fungal and bacterial growth and biomass measurements, and C and N mineralisation (15N pool dilution) assays were used to investigate how SOC quality influenced fungal and bacterial food-web pathways and the implications this had for C and nutrient turnover. There was no support for the view that decomposer food-webs dominated by bacteria support high turnover rates of easily available substrates, while slower fungal-dominated decomposition pathways support the decomposition of more complex organic material. Rather, an association between high-quality SOC and fungi emerges from the results.
This suggests that we need to revise our basic understanding for soil microbial communities and the processes they regulate in soil.
Local environmental quality positively predicts breastfeeding in the UK’s Millennium Cohort Study
Brown, Laura J; Sear, Rebecca
2017-01-01
Background and Objectives: Breastfeeding is an important form of parental investment with clear health benefits. Despite this, rates remain low in the UK; understanding variation can therefore help improve interventions. Life history theory suggests that environmental quality may pattern maternal investment, including breastfeeding. We analyse a nationally representative dataset to test two predictions: (i) higher local environmental quality predicts higher likelihood of breastfeeding initiation and longer duration; (ii) higher socioeconomic status (SES) provides a buffer against the adverse influences of low local environmental quality. Methodology: We ran factor analysis on a wide range of local-level environmental variables. Two summary measures of local environmental quality were generated by this analysis—one ‘objective’ (based on an independent assessor’s neighbourhood scores) and one ‘subjective’ (based on respondent’s scores). We used mixed-effects regression techniques to test our hypotheses. Results: Higher objective, but not subjective, local environmental quality predicts higher likelihood of starting and maintaining breastfeeding over and above individual SES and area-level measures of environmental quality. Higher individual SES is protective, with women from high-income households having relatively high breastfeeding initiation rates and those with high status jobs being more likely to maintain breastfeeding, even in poor environmental conditions. Conclusions and Implications: Environmental quality is often vaguely measured; here we present a thorough investigation of environmental quality at the local level, controlling for individual- and area-level measures. Our findings support a shift in focus away from individual factors and towards altering the landscape of women’s decision making contexts when considering behaviours relevant to public health. PMID:29354262
Assembly and diploid architecture of an individual human genome via single-molecule technologies
Pendleton, Matthew; Sebra, Robert; Pang, Andy Wing Chun; Ummat, Ajay; Franzen, Oscar; Rausch, Tobias; Stütz, Adrian M; Stedman, William; Anantharaman, Thomas; Hastie, Alex; Dai, Heng; Fritz, Markus Hsi-Yang; Cao, Han; Cohain, Ariella; Deikus, Gintaras; Durrett, Russell E; Blanchard, Scott C; Altman, Roger; Chin, Chen-Shan; Guo, Yan; Paxinos, Ellen E; Korbel, Jan O; Darnell, Robert B; McCombie, W Richard; Kwok, Pui-Yan; Mason, Christopher E; Schadt, Eric E; Bashir, Ali
2015-01-01
We present the first comprehensive analysis of a diploid human genome that combines single-molecule sequencing with single-molecule genome maps. Our hybrid assembly markedly improves upon the contiguity observed from traditional shotgun sequencing approaches, with scaffold N50 values approaching 30 Mb, and we identified complex structural variants (SVs) missed by other high-throughput approaches. Furthermore, by combining Illumina short-read data with long reads, we phased both single-nucleotide variants and SVs, generating haplotypes with over 99% consistency with previous trio-based studies. Our work shows that it is now possible to integrate single-molecule and high-throughput sequence data to generate de novo assembled genomes that approach reference quality. PMID:26121404
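The scaffold N50 statistic quoted above can be computed with a short sketch; the scaffold lengths below are toy values in Mb, not the paper's assembly.

```python
def n50(lengths):
    """N50: the length L of the scaffold at which the cumulative sum of
    scaffolds sorted from longest to shortest first reaches half of the
    total assembly size."""
    total = sum(lengths)
    acc = 0
    for length in sorted(lengths, reverse=True):
        acc += length
        if acc * 2 >= total:
            return length
    return 0  # empty input

# Toy scaffold lengths in Mb (illustrative only)
scaffolds = [30, 25, 10, 5, 5, 3, 2]
assert n50(scaffolds) == 25
```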
Tablet—next generation sequence assembly visualization
Milne, Iain; Bayer, Micha; Cardle, Linda; Shaw, Paul; Stephen, Gordon; Wright, Frank; Marshall, David
2010-01-01
Summary: Tablet is a lightweight, high-performance graphical viewer for next-generation sequence assemblies and alignments. Supporting a range of input assembly formats, Tablet provides high-quality visualizations showing data in packed or stacked views, allowing instant access and navigation to any region of interest, and whole contig overviews and data summaries. Tablet is both multi-core aware and memory efficient, allowing it to handle assemblies containing millions of reads, even on a 32-bit desktop machine. Availability: Tablet is freely available for Microsoft Windows, Apple Mac OS X, Linux and Solaris. Fully bundled installers can be downloaded from http://bioinf.scri.ac.uk/tablet in 32- and 64-bit versions. Contact: tablet@scri.ac.uk PMID:19965881
Construction of CRISPR Libraries for Functional Screening.
Carstens, Carsten P; Felts, Katherine A; Johns, Sarah E
2018-01-01
Identification of gene function has been aided by the ability to generate targeted gene knockouts or transcriptional repression using the CRISPR/Cas9 system. Using pooled libraries of guide RNA expression vectors that direct Cas9 to a specific genomic site allows identification of genes that are either enriched or depleted in response to a selection scheme, thus linking the affected gene to the chosen phenotype. The quality of the data generated by the screening depends on the quality of the guide RNA delivery library with regard to error rates and especially the evenness of the distribution of the guides. Here, we describe a method for constructing complex plasmid libraries based on pooled designed oligomers with high representation and tight distributions. The procedure allows construction of plasmid libraries of >60,000 members with a 95th/5th percentile ratio of less than 3.5.
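The 95th/5th percentile ratio used above as an evenness metric can be sketched on simulated guide counts. The log-normal model and its parameters are assumptions for illustration, chosen so the skew lands under the quoted 3.5 threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated read counts per guide for a tight library, modelled (as an
# assumption) as log-normal; sigma=0.35 gives a 95th/5th ratio near 3.2.
counts = rng.lognormal(mean=5.0, sigma=0.35, size=60_000)

p95, p5 = np.percentile(counts, [95, 5])
skew_ratio = p95 / p5   # evenness metric for pooled guide libraries
```

A wider distribution (larger sigma) pushes this ratio above the threshold, flagging an unevenly represented library.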
Vandelanotte, Corneel; Kirwan, Morwenna; Rebar, Amanda; Alley, Stephanie; Short, Camille; Fallon, Luke; Buzza, Gavin; Schoeppe, Stephanie; Maher, Carol; Duncan, Mitch J
2014-08-17
It has been shown that physical activity is more likely to increase if web-based interventions apply evidence-based components (e.g. self-monitoring) and incorporate interactive social media applications (e.g. social networking), but it is unclear to what extent these are being utilized in publicly available web-based physical activity interventions. The purpose of this study was to evaluate whether freely accessible websites delivering physical activity interventions use evidence-based behavior change techniques and provide social media applications. In 2013, a systematic search strategy examined 750 websites. Data were extracted on a wide range of variables (e.g. self-monitoring, goal setting, and social media applications). To evaluate website quality, a new tool, comprising three sub-scores (Behavioral Components, Interactivity and User Generated Content), was developed to assess the implementation of behavior change techniques and social media applications. An overall website quality score was obtained by summing the three sub-scores. Forty-six publicly available websites were included in the study. The use of self-monitoring (54.3%), goal setting (41.3%) and provision of feedback (46%) was relatively low given the amount of evidence supporting these features, whereas the presence of features allowing users to generate content (73.9%) and of social media components (Facebook (65.2%), Twitter (47.8%), YouTube (48.7%), smartphone applications (34.8%)) was relatively high considering their innovative and untested nature. Nearly all websites applied some behavioral and social media applications. The average Behavioral Components score was 3.45 (±2.53) out of 10. The average Interactivity score was 3.57 (±2.16) out of 10. The average User Generated Content score was 4.02 (±2.77) out of 10. The average overall website quality score was 11.04 (±6.92) out of 30.
Four websites (8.7%) were classified as high quality, 12 websites (26.1%) were classified as moderate quality, and 30 websites (65.2%) were classified as low quality. Despite large developments in Internet technology and growth in the knowledge of how to develop more effective web-based interventions, overall website quality was low and the majority of freely available physical activity websites lack the components associated with behavior change. However, the results show that website quality can be improved by taking a number of simple steps, and the presence of social media applications in most websites is encouraging.
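The three-sub-score scheme above can be sketched as follows. The low/moderate/high cut-offs are hypothetical, since the class boundaries are not stated in the abstract; only the 0-10 sub-score ranges and the 0-30 total are given.

```python
# Sketch of the scoring tool: three sub-scores (each 0-10) summed to an
# overall 0-30 quality score. The class cut-offs below are assumptions
# for illustration, not the paper's published boundaries.
def website_quality(behavioral, interactivity, user_content):
    for s in (behavioral, interactivity, user_content):
        if not 0 <= s <= 10:
            raise ValueError("sub-scores must lie in 0-10")
    total = behavioral + interactivity + user_content
    if total >= 20:
        label = "high"
    elif total >= 10:
        label = "moderate"
    else:
        label = "low"
    return total, label

# The reported average sub-scores sum to the reported 11.04/30.
score, label = website_quality(3.45, 3.57, 4.02)
```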
Comparative Genomics as a Foundation for Evo-Devo Studies in Birds.
Grayson, Phil; Sin, Simon Y W; Sackton, Timothy B; Edwards, Scott V
2017-01-01
Developmental genomics is a rapidly growing field, and high-quality genomes are a useful foundation for comparative developmental studies. A high-quality genome forms an essential reference onto which the data from numerous assays and experiments, including ChIP-seq, ATAC-seq, and RNA-seq, can be mapped. A genome also streamlines and simplifies the development of primers used to amplify putative regulatory regions for enhancer screens, cDNA probes for in situ hybridization, microRNAs (miRNAs) or short hairpin RNAs (shRNA) for RNA interference (RNAi) knockdowns, mRNAs for misexpression studies, and even guide RNAs (gRNAs) for CRISPR knockouts. Finally, much can be gleaned from comparative genomics alone, including the identification of highly conserved putative regulatory regions. This chapter provides an overview of laboratory and bioinformatics protocols for DNA extraction, library preparation, library quantification, and genome assembly, from fresh or frozen tissue to a draft avian genome. Generating a high-quality draft genome can provide a developmental research group with excellent resources for their study organism, opening the doors to many additional assays and experiments.
Zhang, Chenchu; Hu, Yanlei; Du, Wenqiang; Wu, Peichao; Rao, Shenglong; Cai, Ze; Lao, Zhaoxin; Xu, Bing; Ni, Jincheng; Li, Jiawen; Zhao, Gang; Wu, Dong; Chu, Jiaru; Sugioka, Koji
2016-01-01
Rapid integration of high-quality functional devices in microchannels is in high demand for miniature lab-on-a-chip applications. This paper demonstrates the embellishment of existing microfluidic devices with integrated micropatterns via femtosecond laser MRAF-based holographic patterning (MHP) microfabrication, which proves two-photon polymerization (TPP) based on a spatial light modulator (SLM) to be a rapid and powerful technology for chip functionalization. An optimized mixed-region amplitude freedom (MRAF) algorithm has been used to generate a high-quality shaped focus field. Based on the optimized parameters, a single-exposure approach is developed to fabricate 200 × 200 μm microstructure arrays in less than 240 ms. Moreover, microtraps, a QR code and letters are integrated into a microdevice by the advanced method for particle capture and device identification. These results indicate that such holographic laser embellishment of microfluidic devices is simple, flexible and easy to access, and has great potential in lab-on-a-chip applications of biological culture, chemical analyses and optofluidic devices. PMID:27619690
Roy, Alexis T; Carver, Courtney; Jiradejvong, Patpong; Limb, Charles J
2015-01-01
Med-El cochlear implant (CI) patients are typically programmed with either the fine structure processing (FSP) or high-definition continuous interleaved sampling (HDCIS) strategy. FSP is the newer-generation strategy and aims to provide more direct encoding of fine structure information compared with HDCIS. Since fine structure information is extremely important in music listening, FSP may offer improvements in musical sound quality for CI users. Despite widespread clinical use of both strategies, few studies have assessed the possible benefits in music perception for the FSP strategy. The objective of this study was to measure the differences in musical sound quality discrimination between the FSP and HDCIS strategies. Musical sound quality discrimination was measured using a previously designed evaluation, called Cochlear Implant-MUltiple Stimulus with Hidden Reference and Anchor (CI-MUSHRA). In this evaluation, participants were required to detect sound quality differences between an unaltered real-world musical stimulus and versions of the stimulus in which varying amounts of bass (low-frequency) information were removed via a high-pass filter. Eight CI users, currently using the FSP strategy, were enrolled in this study. In the first session, participants completed the CI-MUSHRA evaluation with their FSP strategy. Patients were then programmed with the clinical-default HDCIS strategy, which they used for 2 months to allow for acclimatization. After acclimatization, each participant returned for the second session, during which they were retested with HDCIS, and then switched back to their original FSP strategy and tested acutely. Sixteen normal-hearing (NH) controls completed a CI-MUSHRA evaluation for comparison, in which they listened to music samples under normal acoustic conditions, without CI stimulation.
Sensitivity to high-pass filtering more closely resembled that of NH controls when CI users were programmed with the clinical-default FSP strategy compared with performance when programmed with HDCIS (mixed-design analysis of variance, p < 0.05). The clinical-default FSP strategy offers improvements in musical sound quality discrimination for CI users with respect to bass frequency perception. This improved bass frequency discrimination may in turn support enhanced musical sound quality. This is the first study that has demonstrated objective improvements in musical sound quality discrimination with the newer-generation FSP strategy. These positive results may help guide the selection of processing strategies for Med-El CI patients. In addition, CI-MUSHRA may also provide a novel method for assessing the benefits of newer processing strategies in the future.
Gyroharmonic conversion experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirshfield, J.L.; LaPointe, M.A.; Ganguly, A.K.
1999-05-01
Generation of high power microwaves has been observed in experiments where a 250–350 kV, 20–30 A electron beam accelerated in a cyclotron autoresonance accelerator (CARA) passes through a cavity tuned to a gyroharmonic. Experiments at 8.6 GHz (3rd harmonic) will be described. Theory indicates that high conversion efficiency can be obtained for a high quality beam injected into CARA, and when mode competition can be controlled. Comparisons will be made between the experiments and theory. Planned 7th harmonic experiments will also be described, in which phase matching between the TE-72 mode at 20 GHz and the TE-11 mode at 2.86 GHz allows efficient 20 GHz co-generation within the CARA waveguide itself. © 1999 American Institute of Physics.
Highly efficient 400 W near-fundamental-mode green thin-disk laser.
Piehler, Stefan; Dietrich, Tom; Rumpel, Martin; Graf, Thomas; Ahmed, Marwan Abdou
2016-01-01
We report on the efficient generation of continuous-wave, high-brightness green laser radiation. Green lasers are particularly interesting for reliable and reproducible deep-penetration welding of copper or for pumping Ti:Sa oscillators. By intracavity second-harmonic generation in a thin-disk laser resonator designed for fundamental-mode operation, an output power of up to 403 W is demonstrated at a wavelength of 515 nm with almost diffraction-limited beam quality. The unprecedented optical efficiency of 40.7% of green output power with respect to the pump power of the thin-disk laser is enabled by the intracavity use of a highly efficient grating waveguide mirror, which combines the functions of wavelength stabilization and spectral narrowing, as well as polarization selection in a single element.
Effect of fat content on aroma generation during processing of dry fermented sausages.
Olivares, Alicia; Navarro, José Luis; Flores, Mónica
2011-03-01
Dry fermented sausages with different fat contents (10%, 20% and 30%) were produced. The effect of fat content and ripening time on sensory characteristics, lipolysis, lipid oxidation and the generation of volatile compounds was studied. The key aroma components were also identified using gas chromatography (GC) and olfactometry. High-fat sausages showed the highest lipolysis and lipid oxidation, determined by free fatty acid content and thiobarbituric acid reactive substances (TBARS), respectively. A total of 95 volatile compounds were identified using SPME, GC and mass spectrometry (MS). Fat reduction decreased the generation of lipid-derived volatile compounds during processing, while those generated from bacterial metabolism increased, although only at the first stages of processing. Consumers' preference for the aroma and overall quality of the high- and medium-fat sausages was related to the aroma compounds hexanal, 2-nonenal, 2,4-nonadienal, ethyl butanoate and 1-octen-3-ol, which contributed green, medicinal, tallowy, fruity and mushroom notes. Copyright © 2010 The American Meat Science Association. Published by Elsevier Ltd. All rights reserved.
Deleger, Louise; Li, Qi; Kaiser, Megan; Stoutenborough, Laura
2013-01-01
Background A high-quality gold standard is vital for supervised, machine learning-based, clinical natural language processing (NLP) systems. In clinical NLP projects, expert annotators traditionally create the gold standard. However, traditional annotation is expensive and time-consuming. To reduce the cost of annotation, general NLP projects have turned to crowdsourcing based on Web 2.0 technology, which involves submitting smaller subtasks to a coordinated marketplace of workers on the Internet. Many studies have been conducted in the area of crowdsourcing, but only a few have focused on tasks in the general NLP field and only a handful in the biomedical domain, usually based upon very small pilot sample sizes. In addition, the quality of the crowdsourced biomedical NLP corpora was never exceptional when compared to traditionally-developed gold standards. The previously reported results on a medical named entity annotation task showed a 0.68 F-measure-based agreement between crowdsourced and traditionally-developed corpora. Objective Building upon previous work from the general crowdsourcing research, this study investigated the usability of crowdsourcing in the clinical NLP domain with special emphasis on achieving high agreement between crowdsourced and traditionally-developed corpora. Methods To build the gold standard for evaluating the crowdsourcing workers’ performance, 1042 clinical trial announcements (CTAs) from the ClinicalTrials.gov website were randomly selected and double annotated for medication names, medication types, and linked attributes. For the experiments, we used CrowdFlower, an Amazon Mechanical Turk-based crowdsourcing platform. We calculated sensitivity, precision, and F-measure to evaluate the quality of the crowd’s work and tested the statistical significance (P<.001, chi-square test) to detect differences between the crowdsourced and traditionally-developed annotations.
Results The agreement between the crowd’s annotations and the traditionally-generated corpora was high for: (1) annotations (0.87, F-measure for medication names; 0.73, medication types), (2) correction of previous annotations (0.90, medication names; 0.76, medication types), and excellent for (3) linking medications with their attributes (0.96). Simple voting provided the best judgment aggregation approach. There was no statistically significant difference between the crowd and traditionally-generated corpora. Our results showed a 27.9% improvement over previously reported results on medication named entity annotation task. Conclusions This study offers three contributions. First, we proved that crowdsourcing is a feasible, inexpensive, fast, and practical approach to collect high-quality annotations for clinical text (when protected health information was excluded). We believe that well-designed user interfaces and rigorous quality control strategy for entity annotation and linking were critical to the success of this work. Second, as a further contribution to the Internet-based crowdsourcing field, we will publicly release the JavaScript and CrowdFlower Markup Language infrastructure code that is necessary to utilize CrowdFlower’s quality control and crowdsourcing interfaces for named entity annotations. Finally, to spur future research, we will release the CTA annotations that were generated by traditional and crowdsourced approaches. PMID:23548263
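The precision, sensitivity, and F-measure used to score the crowd can be computed from entity-level counts. The tallies below are illustrative assumptions only, chosen to land near the reported 0.87 F-measure for medication names; the paper reports the scores, not the underlying counts.

```python
def prf(tp, fp, fn):
    """Precision, sensitivity (recall) and F-measure from true-positive,
    false-positive and false-negative entity counts, as used to compare
    crowd annotations against a gold standard."""
    precision = tp / (tp + fp)
    sensitivity = tp / (tp + fn)
    f_measure = 2 * precision * sensitivity / (precision + sensitivity)
    return precision, sensitivity, f_measure

# Hypothetical counts (not from the paper), giving F near 0.87
p, r, f = prf(tp=870, fp=120, fn=140)
```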
Mateescu, R G; Oltenacu, P A; Garmyn, A J; Mafi, G G; VanOverbeke, D L
2016-05-01
Product quality is a high priority for the beef industry because it is a major driver of consumer demand for beef and because the industry has the ability to improve it. A 2-prong approach is outlined, based on implementation of a genetic program to improve eating quality and a system to communicate eating quality and increase the probability that consumers' eating quality expectations are met. The objectives of this study were 1) to identify the best carcass and meat composition traits to be used in a selection program to improve eating quality and 2) to develop a relatively small number of classes that reflect real and perceptible differences in eating quality that can be communicated to consumers, and to identify a subset of carcass and meat composition traits with the highest predictive accuracy across all eating quality classes. Carcass and meat composition traits, including Warner-Bratzler shear force (WBSF), intramuscular fat content (IMFC), trained sensory panel scores, and mineral composition, of 1,666 Angus cattle were used in this study. Three eating quality indexes, EATQ1, EATQ2, and EATQ3, were generated by using different weights for the sensory traits (emphasis on tenderness, flavor, and juiciness, respectively). The best model for predicting eating quality explained 37%, 9%, and 19% of the variability of EATQ1, EATQ2, and EATQ3, respectively, and 2 traits, WBSF and IMFC, accounted for most of the variability explained by the best models. EATQ1, which combines tenderness, juiciness, and flavor assessed by trained panels with weights of 0.60, 0.15, and 0.25, best describes North American consumers and has a moderate heritability (0.18 ± 0.06). A selection index (I = -0.5[WBSF] + 0.3[IMFC]) based on phenotypic and genetic variances and covariances can be used to improve eating quality as a correlated trait.
The 3 indexes (EATQ1, EATQ2, and EATQ3) were used to generate 3 equal (33.3%) low, medium, and high eating quality classes, and linear combinations of traits that best predict class membership were estimated using a predictive discriminant analysis. The best predictive model to classify new observations into low, medium, and high eating quality classes defined by the EATQ1 index included WBSF, IMFC, HCW, and marbling score and resulted in a total error rate of 47.06%, much lower than the 60.74% error rate when the prediction of class membership was based on the USDA grading system. The 2 best predictors were WBSF and IMFC, and they accounted for 97.2% of the variability explained by the best model.
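The two indexes described above can be written out directly from their stated weights; the animal values plugged in below are hypothetical.

```python
# EATQ1 weights the trained-panel scores for tenderness, juiciness and
# flavor (0.60 / 0.15 / 0.25, as stated in the text).
def eatq1(tenderness, juiciness, flavor):
    return 0.60 * tenderness + 0.15 * juiciness + 0.25 * flavor

# Selection index I = -0.5[WBSF] + 0.3[IMFC]: shear force (WBSF) is
# penalized because lower force means more tender meat, while more
# intramuscular fat (IMFC) raises the index.
def selection_index(wbsf, imfc):
    return -0.5 * wbsf + 0.3 * imfc

# Hypothetical animals: the more tender, fattier one ranks higher.
better = selection_index(wbsf=3.0, imfc=6.0)
worse = selection_index(wbsf=4.5, imfc=3.0)
```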
Reviews of a Diode-Pumped Alkali Laser (DPAL): a potential high powered light source
NASA Astrophysics Data System (ADS)
Cai, He; Wang, You; Han, Juhong; An, Guofei; Zhang, Wei; Xue, Liangping; Wang, Hongyuan; Zhou, Jie; Gao, Ming; Jiang, Zhigang
2015-03-01
Diode-pumped alkali vapor lasers (DPALs) were first proposed by W. F. Krupke at the beginning of the 21st century. In recent years, DPALs have developed rapidly because of their high Stokes efficiency, good beam quality, compact size, and near-infrared emission wavelengths. The Stokes efficiency of a DPAL can reach levels as high as 95.3% for cesium (Cs), 98.1% for rubidium (Rb), and 99.6% for potassium (K). The thermal effect in a DPAL is theoretically smaller than in a comparable diode-pumped solid-state laser (DPSSL). Additionally, the heat generated in a DPAL can be removed by circulating the gases inside a sealed system, so thermal management is relatively simple for the realization of a high-powered DPAL. DPALs thus combine the advantages of both DPSSLs and conventional gas lasers while avoiding the disadvantages of each. Generally, the collisionally broadened cross sections of both the D1 and D2 lines in a DPAL are much larger than those of most conventional solid-state, fiber, and gas lasers, so DPALs offer outstanding potential for the realization of high-powered laser systems. It has been shown that the DPAL is now one of the most promising candidates for simultaneously achieving good beam quality and high output power, making it one of the most promising high-powered laser sources of the next generation.
NASA Astrophysics Data System (ADS)
Salleh, M. R. M.; Ismail, Z.; Rahman, M. Z. A.
2015-10-01
Airborne Light Detection and Ranging (LiDAR) technology has been widely used in recent years, especially for generating high-accuracy Digital Terrain Models (DTMs). High-density, good-quality airborne LiDAR data promise a high-quality DTM. This study focuses on analysing the error in a LiDAR-derived DTM associated with vegetation canopy cover density and terrain slope in a tropical forest environment in Bentong, State of Pahang, Malaysia. The airborne LiDAR data, captured by a Riegl system mounted on an aircraft, can be considered low density. The ground filtering procedure used an adaptive triangulated irregular network (ATIN) algorithm to produce ground points. Ground control points (GCPs) were then used to generate the reference DTM, which was used for slope classification, while the non-ground point clouds were used to determine the relative percentage of canopy cover. The results show that terrain slope is highly correlated with the RMSE of the LiDAR-derived DTM for both study areas (0.993 and 0.870). Canopy cover shows a similarly high correlation (0.989 and 0.924). This indicates that the accuracy of an airborne LiDAR-derived DTM is significantly affected by the terrain slope and canopy cover of the study area.
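The correlation analysis reported above can be sketched with a plain Pearson correlation between per-class terrain slope and DTM error. The slope and RMSE values below are hypothetical placeholders; only the analysis idea follows the abstract.

```python
# Minimal sketch of correlating terrain slope with LiDAR-DTM error.
# The slope/RMSE values are hypothetical; the abstract reports only the
# resulting correlation coefficients, not the underlying data.
from math import sqrt

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-class values: mean terrain slope (deg) vs DTM RMSE (m)
slope_deg = [5, 10, 15, 20, 25, 30]
dtm_rmse_m = [0.12, 0.18, 0.25, 0.33, 0.46, 0.55]

r = pearson_r(slope_deg, dtm_rmse_m)
print(round(r, 3))  # a strong positive correlation, close to 1
```

A correlation near 1, as in the study's reported values (0.993 and 0.870), indicates that DTM error grows almost linearly with slope class.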
Disorganizing experiences in second- and third-generation holocaust survivors.
Scharf, Miri; Mayseless, Ofra
2011-11-01
Second-generation Holocaust survivors might not show direct symptoms of posttraumatic stress disorder or attachment disorganization, but are at risk for developing high levels of psychological distress. We present themes of difficult experiences of second-generation Holocaust survivors, arguing that some of these aversive experiences might have disorganizing qualities even though they do not qualify as traumatic. Based on in-depth interviews with 196 second-generation parents and their adolescent children, three themes of disorganizing experiences carried across generations were identified: focus on survival issues, lack of emotional resources, and coercion to please the parents and satisfy their needs. These themes reflect the frustration of three basic needs: competence, relatedness, and autonomy, and this frustration becomes disorganizing when it involves stability, potency, incomprehensibility, and helplessness. The findings shed light on the effect of trauma over the generations and, as such, equip therapists with a greater understanding of the mechanisms involved.
Wide band continuous all-fiber comb generator at 1.5 micron
NASA Astrophysics Data System (ADS)
Lemaître, François; Mondin, Linda; Orlik, X.
2017-11-01
We present an all-fiber continuous optical frequency comb generator (OFCG) able to generate over 6 nm (750 GHz) at 1560 nm using a combination of electro-optic and acousto-optic modulation. As opposed to the numerous experimental setups that use the longitudinal modes of an optical cavity to generate continuous optical frequency combs, our setup does not need any active stabilization of the cavity length, since we use the intrinsically high stability of radiofrequency sources to generate the multiple lines of the comb. Moreover, compared to the work of ref [1], the hybrid optical modulation we use suppresses the instability caused by interference between the generated lines. These lines benefit from the spectral quality of the seed laser, because the spectral widths of the synthesized hyperfrequency and radiofrequency signals are generally narrower than those of laser sources.
Enhanced Second-Harmonic Generation Using Broken Symmetry III–V Semiconductor Fano Metasurfaces
Vabishchevich, Polina P.; Liu, Sheng; Sinclair, Michael B.; ...
2018-01-27
All-dielectric metasurfaces, two-dimensional arrays of subwavelength low-loss dielectric inclusions, can be used not only to control the amplitude and phase of optical beams, but also to generate new wavelengths through enhanced nonlinear optical processes that are free from some of the constraints dictated by the use of bulk materials. Recently, high quality factor (Q) resonances in these metasurfaces have been revealed and utilized for applications such as sensing and lasing. The origin of these resonances stems from the interference of two nanoresonator modes with vastly different Q. Here we show that nonlinear optical processes can be further enhanced by utilizing these high-Q resonances in broken symmetry all-dielectric metasurfaces. As a result, we study second-harmonic generation from broken symmetry metasurfaces made from III–V semiconductors and observe nontrivial spectral shaping of the second harmonic and a multifold efficiency enhancement induced by high field localization and enhancement inside the nanoresonators.
Shen, Kai; Lu, Hui; Baig, Sarfaraz; Wang, Michael R
2017-11-01
The multi-frame superresolution technique is introduced to significantly improve the lateral resolution and image quality of spectral domain optical coherence tomography (SD-OCT). Using several sets of low-resolution C-scan 3D images with lateral sub-spot-spacing shifts between sets, multi-frame superresolution processing at each depth layer reconstructs a higher-resolution, higher-quality lateral image. Layer-by-layer processing yields an overall 3D image with high lateral resolution and quality. In theory, superresolution processing including deconvolution can address the diffraction limit, lateral scan density, and background noise problems together. In experiment, an improvement in lateral resolution by a factor of ~3, reaching 7.81 µm and 2.19 µm with sample arm optics of 0.015 and 0.05 numerical aperture, respectively, as well as a doubling of image quality, was confirmed by imaging a known resolution test target. Improved lateral resolution on in vitro skin C-scan images has also been demonstrated. For in vivo 3D SD-OCT imaging of human skin, fingerprints, and retinal layers, we used a multi-modal volume registration method to effectively estimate the lateral image shifts among different C-scans caused by random, minor, unintended body motion. Further processing of these images generated high-lateral-resolution 3D images as well as high-quality B-scan images of these in vivo tissues.
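The core idea of combining sub-spot-spacing-shifted scans can be illustrated with a toy 1-D shift-and-add sketch. This is a simplification under stated assumptions: real multi-frame SD-OCT processing also involves registration and deconvolution, which are omitted here, and the frame values are hypothetical.

```python
# Toy 1-D sketch of multi-frame superresolution by shift-and-add:
# several low-resolution scans, each offset by a known sub-pixel
# (sub-spot-spacing) shift, are interleaved onto a finer grid.
# Registration and deconvolution steps are deliberately omitted.

def shift_and_add(frames):
    """Interleave len(frames) low-res frames, where frame k is assumed
    shifted by k/len(frames) of a pixel, onto a grid len(frames)x finer."""
    n = len(frames)
    highres = [0.0] * (n * len(frames[0]))
    for k, frame in enumerate(frames):
        for i, value in enumerate(frame):
            highres[i * n + k] = value
    return highres

# Two hypothetical low-res scans of the same scene, offset by half a pixel
frame_a = [1.0, 3.0, 5.0]   # samples at lateral positions 0.0, 1.0, 2.0
frame_b = [2.0, 4.0, 6.0]   # samples at lateral positions 0.5, 1.5, 2.5

print(shift_and_add([frame_a, frame_b]))
# -> [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
```

The interleaved result has twice the sampling density of either input, which is why accurate sub-pixel shift estimation (here assumed known, in the paper estimated by multi-modal volume registration) is essential.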
Critical appraisal of clinical trials in multiple system atrophy: Toward better quality.
Castro Caldas, Ana; Levin, Johannes; Djaldetti, Ruth; Rascol, Olivier; Wenning, Gregor; Ferreira, Joaquim J
2017-10-01
Multiple system atrophy (MSA) is a rare neurodegenerative disease of undetermined cause. Although many clinical trials have been conducted, there is still no treatment that cures the disease or slows its progression. We sought to assess the methodology and quality of reporting of clinical trials conducted in MSA patients. We conducted a systematic review of all trials with at least 1 MSA patient subject to any pharmacological/nonpharmacological interventions. Two independent reviewers evaluated the methodological characteristics and quality of reporting of trials. A total of 60 clinical trials were identified, including 1375 MSA patients. Of the trials, 51% (n = 31) were single-arm studies. A total of 28% (n = 17) had a parallel design, half of which (n = 13) were placebo controlled. Of the studies, 8 (13.3%) were conducted in a multicenter setting, 3 of which were responsible for 49.3% (n = 678) of the total included MSA patients. The description of primary outcomes was unclear in 60% (n = 40) of trials. Only 10 (16.7%) clinical trials clearly described the randomization process. Blinding of the participants, personnel, and outcome assessments was at high risk of bias in the majority of studies. The number of dropouts/withdrawals was high (n = 326, 23.4% among the included patients). Overall, the design and quality of reporting of the reviewed studies is unsatisfactory. The most frequent clinical trials were small and single centered. Inadequate reporting was related to information on the randomization process, sequence generation, allocation concealment, blinding of participants, and sample size calculations. Although improved during recent years, methodological quality and trial design need to be optimized to generate more informative results. © 2017 International Parkinson and Movement Disorder Society.
Memory preservation made prestigious but easy
NASA Astrophysics Data System (ADS)
Fageth, Reiner; Debus, Christina; Sandhaus, Philipp
2011-01-01
Preserving memories combined with story-telling, using either photo books for multiple images or high-quality products such as one or a few images printed on canvas or mounted on acrylic to create high-quality wall decorations, is gradually becoming more popular than classical 4*6 prints and classical silver halide posters. Digital printing via electrophotography and inkjet is increasingly replacing classical silver halide technology as the dominant production technology for these kinds of products. Maintaining consistent and comparable output quality is more challenging with these technologies than with silver halide paper, for both prints and posters. This paper describes a unique approach combining desktop-based software to initiate a compelling project with online capabilities to finalize and optimize that project in an online environment in a community process. A comparison of consumer behavior between online and desktop-based solutions for generating photo books is presented.
Effects of Anode Arc Root Fluctuation on Coating Quality During Plasma Spraying
NASA Astrophysics Data System (ADS)
An, Lian-Tong; Gao, Yang; Sun, Chengqi
2011-06-01
To obtain a coating of high quality, a new type of plasma torch was designed and constructed to increase the stability of the plasma arc and reduce the air entrainment into the plasma jet. The torch, called a bi-anode torch, generates an elongated arc with comparatively high arc voltage and low arc fluctuation. Spraying experiments were carried out to compare the quality of coatings deposited by a conventional torch and a bi-anode torch. Alumina coatings and tungsten carbide coatings were prepared to appraise the heating of the sprayed particles in the plasma jets and the entrainment of the surrounding air into the plasma jets, respectively. The results show that anode arc root fluctuation has only a small effect on the melting rate of alumina particles. On the other hand, the reduced air entrainment into the plasma jet of the bi-anode torch drastically reduces the decarburization of tungsten carbide coatings.
Catalytic conversion of Chlorella pyrenoidosa to biofuels in supercritical alcohols over zeolites.
Yang, Le; Ma, Rui; Ma, Zewei; Li, Yongdan
2016-06-01
Microalgae have been considered as a feedstock for third-generation biofuel production, given their high lipid content and fast productivity. Herein, a catalytic approach for microalgae liquefaction to biocrude is examined in a temperature range of 250-300°C in methanol and ethanol over zeolites. Higher biocrude yield was achieved in ethanol and at lower temperatures, while better quality biocrude, with a higher light-biocrude ratio and lower average molecular weight (Mw), was favored in methanol and at higher temperatures. Application of zeolites improves the biocrude quality significantly. Among the catalysts, HY shows the strongest acidity and performs best in producing high-quality biocrude. Solid residues were extensively explored with thermal gravimetric analysis and elemental analysis. It is reported for the first time that up to 99 wt.% of sulfur is deposited in the solid residue at 250°C for both solvents. Copyright © 2016 Elsevier Ltd. All rights reserved.
Formalization of the engineering science discipline - knowledge engineering
NASA Astrophysics Data System (ADS)
Peng, Xiao
Knowledge is the most precious ingredient facilitating aerospace engineering research and product development activities. Currently, the most common knowledge retention methods are paper-based documents, such as reports, books, and journals. However, these media have innate weaknesses. For example, four generations of flying-wing aircraft (Horten, Northrop XB-35/YB-49, Boeing BWB, and many others) were mostly developed in isolation; the subsequent engineers were not aware of the previous developments, because these projects were documented in a way that prevented the next generation of engineers from benefiting from the lessons learned. In this manner, inefficient knowledge retention methods have become a primary obstacle to knowledge transfer from experienced engineers to the next generation. In addition, the quality of knowledge itself is a vital criterion; thus, an accurate measure of the quality of knowledge is required. Although qualitative knowledge evaluation criteria have been researched in other disciplines, such as the AAA criterion by Ernest Sosa stemming from the field of philosophy, a quantitative knowledge evaluation criterion capable of numerically determining the quality of knowledge for aerospace engineering research and product development activities still needs to be developed. To provide engineers with a high-quality knowledge management tool, the engineering science discipline Knowledge Engineering has been formalized to systematically address knowledge retention issues. This research undertaking formalizes Knowledge Engineering as follows: 1. Categorize knowledge according to its formats and representations for the first time, which serves as the foundation for the subsequent knowledge management function development. 2. Develop an efficiency evaluation criterion for knowledge management by analyzing the characteristics of both knowledge and the parties involved in the knowledge management processes. 3. 
Propose and develop an innovative Knowledge-Based System (KBS), the AVD KBS, forming a systematic approach to facilitating knowledge management. 4. Demonstrate the efficiency advantages of the AVD KBS over traditional knowledge management methods via selected design case studies. This research formalizes, for the first time, Knowledge Engineering as a distinct discipline by delivering a robust, high-quality knowledge management and process tool, the AVD KBS. Formalizing knowledge is shown to significantly improve the effectiveness of aerospace knowledge retention and utilization.
Bowie, Paul; McNab, Duncan; Ferguson, Julie; de Wet, Carl; Smith, Gregor; MacLeod, Marion; McKay, John; White, Craig
2015-04-28
(1) To ascertain from patients what really matters to them on a personal level of such high importance that it should 'always happen' when they interact with healthcare professionals and staff groups. (2) To critically review existing criteria for selecting 'always events' (AEs) and generate a candidate list of AE examples based on the patient feedback data. Mixed methods study informed by participatory design principles. Convenience samples of patients with a long-term clinical condition in Scottish general practices. 195 patients from 13 general practices were interviewed (n=65) or completed questionnaires (n=130). 4 themes of high importance to patients were identified from which examples of potential 'AEs' (n=8) were generated: (1) emotional support, respect and kindness (eg, "I want all practice team members to show genuine concern for me at all times"); (2) clinical care management (eg, "I want the correct treatment for my problem"); (3) communication and information (eg, "I want the clinician who sees me to know my medical history") and (4) access to, and continuity of, healthcare (eg, "I want to arrange appointments around my family and work commitments"). Each 'AE' was linked to a system process or professional behaviour that could be measured to facilitate improvements in the quality of patient care. This study is the first known attempt to develop the AE concept as a person-centred approach to quality improvement in primary care. Practice managers were able to collect data from patients on what they 'always want' in terms of expectations related to care quality from which a list of AE examples was generated that could potentially be used as patient-driven quality improvement (QI) measures. There is strong implementation potential in the Scottish health service. However, further evaluation of the utility of the method is also necessary. Published by the BMJ Publishing Group Limited. 
NASA Astrophysics Data System (ADS)
Hagemann, M.; Jeznach, L. C.; Park, M. H.; Tobiason, J. E.
2016-12-01
Extreme precipitation events such as tropical storms and hurricanes are by their nature rare, yet have disproportionate and adverse effects on surface water quality. In the context of drinking water reservoirs, common concerns during such events include increased erosion and sediment transport and an influx of natural organic matter and nutrients. As part of an effort to model the effects of an extreme precipitation event on water quality at the reservoir intake of a major municipal water system, this study sought to estimate extreme-event watershed responses, including streamflow and exports of nutrients and organic matter, for use as inputs to a 2-D hydrodynamic and water quality reservoir model. Since extreme-event watershed exports are highly uncertain, we characterized and propagated predictive uncertainty using a quasi-Monte Carlo approach to generate reservoir model inputs. Three storm precipitation depths, corresponding to recurrence intervals of 5, 50, and 100 years, were converted to streamflow in each of 9 tributaries by volumetrically scaling 2 storm hydrographs from the historical record. Rating-curve models for concentration, calibrated using 10 years of data for each of 5 constituents, were then used to estimate the parameters of a multivariate lognormal probability model of constituent concentrations, conditional on each scenario's storm date and streamflow. A quasi-random Halton sequence (n = 100) was drawn from the conditional distribution for each event scenario and used to generate input files for a calibrated CE-QUAL-W2 reservoir model. The resulting simulated concentrations at the reservoir's drinking water intake constitute a low-discrepancy sample from the estimated uncertainty space of extreme-event source water quality. 
Limiting factors to the suitability of this approach include poorly constrained relationships between hydrology and constituent concentrations, a high-dimensional space from which to generate inputs, and relatively long run-time for the reservoir model. This approach proved useful in probing a water supply's resilience to extreme events, and to inform management responses, particularly in a region such as the American Northeast where climate change is expected to bring such events with higher frequency and intensity than have occurred in the past.
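The quasi-Monte Carlo step described above can be sketched in a few lines: a Halton sequence supplies low-discrepancy uniform points, which are mapped through the inverse normal CDF and exponentiated to draw from a lognormal concentration model. The `mu`/`sigma` parameters and the restriction to two constituents below are illustrative assumptions, not values from the study.

```python
# Sketch of low-discrepancy sampling from a lognormal concentration
# model, as in the quasi-Monte Carlo input generation described above.
# mus/sigmas and the two-constituent setup are hypothetical.
from math import exp
from statistics import NormalDist

def halton(index, base):
    """Radical-inverse (van der Corput) value of `index` in `base`."""
    result, f = 0.0, 1.0 / base
    while index > 0:
        result += f * (index % base)
        index //= base
        f /= base
    return result

def lognormal_halton_sample(n, mus, sigmas, bases=(2, 3)):
    """Draw n low-discrepancy samples from independent lognormals via
    the inverse-CDF transform: uniform -> standard normal -> lognormal."""
    inv_cdf = NormalDist().inv_cdf
    samples = []
    for i in range(1, n + 1):          # start at 1: halton(0, b) == 0.0
        row = [exp(mu + sigma * inv_cdf(halton(i, b)))
               for b, mu, sigma in zip(bases, mus, sigmas)]
        samples.append(row)
    return samples

# 100 scenario draws for two hypothetical constituents
draws = lognormal_halton_sample(100, mus=[1.0, -2.0], sigmas=[0.4, 0.6])
print(len(draws), all(v > 0 for row in draws for v in row))
```

Compared with pseudo-random sampling, the Halton points fill the unit square more evenly, so n = 100 scenarios cover the conditional concentration distribution with less clustering, which matters given the reservoir model's long run time.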
USDA-ARS?s Scientific Manuscript database
Modern apples [Malus x domestica (Borkh.)] are thought to have originated in western China from the progenitor species, Malus sieversii (Ledeb.) M. Roem. Due to many generations of selection for traits associated with high fruit quality, our current breeding germplasm has become dangerously narrow....