Science.gov

Sample records for accurate detailed information

  1. An accurate fuzzy edge detection method using wavelet details subimages

    NASA Astrophysics Data System (ADS)

    Sedaghat, Nafiseh; Pourreza, Hamidreza

    2010-02-01

    Edge detection is a basic and important subject in computer vision and image processing. An edge detector is defined as a mathematical operator of small spatial extent that responds in some way to intensity discontinuities, usually classifying every image pixel as either belonging to an edge or not. Much research effort has been spent attempting to develop effective edge detection algorithms. Despite this extensive research, the task of finding the edges that correspond to true physical boundaries remains a difficult problem. Edge detection algorithms based on the application of human knowledge show their flexibility and suggest that the use of human knowledge is a reasonable alternative. In this paper we propose a fuzzy inference system with two inputs: gradient and wavelet details. The first input is calculated by the Sobel operator; the second is calculated by taking the wavelet transform of the input image and then reconstructing the image from only the detail subimages via the inverse wavelet transform. There are many fuzzy edge detection methods, but none of them utilize the wavelet transform as it is used in this paper. To evaluate our method, we detect edges of images with different brightness characteristics and compare the results with the Canny edge detector. The results show the high performance of our method in finding true edges.
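
    The Sobel gradient input described above can be sketched in pure Python (a minimal illustration of the first fuzzy input only; the wavelet-details input and the fuzzy inference system itself are omitted):

```python
import math

# Standard 3x3 Sobel kernels for horizontal and vertical gradients.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_magnitude(img):
    """Gradient magnitude of a 2-D list-of-lists grayscale image
    (interior pixels only; the 1-pixel border is left at zero)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(SOBEL_X[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SOBEL_Y[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = math.hypot(gx, gy)
    return out

# A vertical step edge: the gradient is strong near the boundary columns.
img = [[0, 0, 255, 255] for _ in range(4)]
mag = sobel_magnitude(img)
```

    In a fuzzy scheme like the one described, this magnitude map would then be normalized and fed to the inference system alongside the wavelet-detail reconstruction.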

  2. Detailed behavioral assessment promotes accurate diagnosis in patients with disorders of consciousness

    PubMed Central

    Gilutz, Yael; Lazary, Avraham; Karpin, Hana; Vatine, Jean-Jacques; Misha, Tamar; Fortinsky, Hadassah; Sharon, Haggai

    2015-01-01

    Introduction: Assessing the level of awareness in patients with disorders of consciousness (DOC) is done on the basis of exhibited behaviors. However, since motor signs of awareness (i.e., non-reflex motor responses) can be very subtle, differentiating the vegetative from the minimally conscious state (a distinction that is itself not clear-cut) is often challenging. Even a careful clinician relying on standardized scales may arrive at a wrong diagnosis. Aim: To report our experience in tackling this problem with two in-house assessment procedures developed at Reuth Rehabilitation Hospital, and to demonstrate their clinical significance by reviewing two cases. Methods: (1) Reuth DOC Response Assessment (RDOC-RA) – administered in addition to the standardized tools, it emphasizes the importance of assessing a wide range of motor responses. In our experience, in some patients the only evidence of awareness may be a private, specific movement that is not assessed by standard assessment tools. (2) Reuth DOC Periodic Intervention Model (RDOC-PIM) – the current literature on assessment and diagnosis in DOC refers mostly to the acute phase of up to one year post-injury. However, we have found major changes in responsiveness occurring one year or more post-injury in many patients. We therefore conduct periodic assessments at predetermined time points to ensure that patients are not misdiagnosed and that neurological changes are not overlooked. Results: In the first case, the RDOC-RA promoted a more accurate diagnosis than that based on standardized scales alone. The second case shows how the RDOC-PIM allowed us to recognize late recovery and prompted reinstatement of treatment, with good results. Conclusion: Adding a detailed periodic assessment of DOC patients to existing scales can yield critical information, promoting better diagnosis, treatment, and clinical outcomes. We discuss the implications of this observation for the future development and validation of assessment tools in DOC patients.

  3. Approaching system equilibrium with accurate or not accurate feedback information in a two-route system

    NASA Astrophysics Data System (ADS)

    Zhao, Xiao-mei; Xie, Dong-fan; Li, Qi

    2015-02-01

    With the development of intelligent transport systems, advanced information feedback strategies have been developed to reduce traffic congestion and enhance capacity. However, previous strategies provide accurate information to travelers, and our simulation results show that accurate information brings negative effects, especially in the delayed-information case. Because travelers prefer the route reported to be in the best condition, and delayed information reflects past rather than current traffic conditions, travelers make wrong routing decisions, decreasing the capacity, increasing oscillations, and driving the system away from equilibrium. To avoid these negative effects, bounded rationality is taken into account by introducing a boundedly rational threshold BR: when the difference between the two routes is less than BR, the routes are chosen with equal probability. Bounded rationality helps improve efficiency in terms of capacity, oscillation, and the gap from system equilibrium.
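
    The boundedly rational choice rule described above can be sketched as follows (a minimal illustration; the function and parameter names are assumptions, not the paper's notation):

```python
import random

def choose_route(time_a, time_b, br_threshold, rng=random):
    """Boundedly rational route choice for a two-route system:
    if the reported travel-time difference is within the threshold BR,
    travelers treat the routes as equivalent and pick either with equal
    probability; otherwise they take the route reported as faster."""
    if abs(time_a - time_b) <= br_threshold:
        return rng.choice(["A", "B"])
    return "A" if time_a < time_b else "B"

# Outside the threshold, the reportedly faster route is always chosen;
# inside it, choice is random, which damps herding on stale information.
clear_winner = choose_route(10.0, 20.0, 5.0)
close_call = choose_route(10.0, 12.0, 5.0, rng=random.Random(0))
```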

  4. 17 CFR 240.17Ad-10 - Prompt posting of certificate detail to master securityholder files, maintenance of accurate...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... between co-transfer agents and recordkeeping transfer agents, maintenance of current control book... master securityholder files, maintenance of accurate securityholder files, communications between co... certificate detail from transfer journals received by the recordkeeping transfer agent from a...

  5. 17 CFR 240.17Ad-10 - Prompt posting of certificate detail to master securityholder files, maintenance of accurate...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... between co-transfer agents and recordkeeping transfer agents, maintenance of current control book... securityholder files, maintenance of accurate securityholder files, communications between co-transfer agents and... certificate detail from transfer journals received by the recordkeeping transfer agent from a...

  6. 17 CFR 240.17Ad-10 - Prompt posting of certificate detail to master securityholder files, maintenance of accurate...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... between co-transfer agents and recordkeeping transfer agents, maintenance of current control book... securityholder files, maintenance of accurate securityholder files, communications between co-transfer agents and... certificate detail from transfer journals received by the recordkeeping transfer agent from a...

  7. 17 CFR 240.17Ad-10 - Prompt posting of certificate detail to master securityholder files, maintenance of accurate...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... between co-transfer agents and recordkeeping transfer agents, maintenance of current control book... securityholder files, maintenance of accurate securityholder files, communications between co-transfer agents and... certificate detail from transfer journals received by the recordkeeping transfer agent from a...

  8. 17 CFR 240.17Ad-10 - Prompt posting of certificate detail to master securityholder files, maintenance of accurate...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... between co-transfer agents and recordkeeping transfer agents, maintenance of current control book... securityholder files, maintenance of accurate securityholder files, communications between co-transfer agents and... certificate detail from transfer journals received by the recordkeeping transfer agent from a...

  9. Detailed and Highly Accurate 3d Models of High Mountain Areas by the Macs-Himalaya Aerial Camera Platform

    NASA Astrophysics Data System (ADS)

    Brauchle, J.; Hein, D.; Berger, R.

    2015-04-01

    Remote sensing in areas with extreme altitude differences is particularly challenging. In high mountain areas specifically, steep slopes result in reduced ground pixel resolution and degraded quality in the DEM. Exceptionally high brightness differences can in part no longer be imaged by the sensors. Nevertheless, detailed information about mountainous regions is highly relevant: time and again glacier lake outburst floods (GLOFs) and debris avalanches claim dozens of victims. Glaciers are sensitive to climate change and must be carefully monitored. Very detailed and accurate 3D maps provide a basic tool for the analysis of natural hazards and the monitoring of glacier surfaces in high mountain areas. There is a gap here, because the desired accuracies are often not achieved. It is for this reason that the DLR Institute of Optical Sensor Systems has developed a new aerial camera, the MACS-Himalaya. The measuring unit comprises four camera modules with an overall aperture angle of 116° perpendicular to the direction of flight. A High Dynamic Range (HDR) mode was introduced so that within a scene, bright areas such as sun-flooded snow and dark areas such as shaded stone can be imaged. In 2014, a measuring survey was performed on the Nepalese side of the Himalayas. The remote sensing system was carried by a Stemme S10 motor glider. Amongst other targets, the Seti Valley, Kali-Gandaki Valley and the Mt. Everest/Khumbu Region were imaged at heights up to 9,200 m. Products such as dense point clouds, DSMs and true orthomosaics with a ground pixel resolution of up to 15 cm were produced. Special challenges and gaps in the investigation of high mountain areas, approaches for resolution of these problems, the camera system and the state of evaluation are presented with examples.

  10. 19. DETAIL OF AIR FORCE WEATHER INFORMATION TERMINAL AND CHART ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    19. DETAIL OF AIR FORCE WEATHER INFORMATION TERMINAL AND CHART RECORDER LOCATED IMMEDIATELY NORTH OF CONSOLE IN PHOTOS A-15 THROUGH A-18. - Vandenberg Air Force Base, Space Launch Complex 3, Launch Operations Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  11. Ultra-accurate collaborative information filtering via directed user similarity

    NASA Astrophysics Data System (ADS)

    Guo, Q.; Song, W.-J.; Liu, J.-G.

    2014-07-01

    A key challenge of collaborative filtering (CF) information filtering is how to obtain reliable and accurate results with the help of peers' recommendations. Since the similarities from small-degree users to large-degree users are larger than the ones in the opposite direction, large-degree users' selections are recommended extensively by traditional second-order CF algorithms. By considering the users' similarity direction and the second-order correlations to depress the influence of mainstream preferences, we present the directed second-order CF (HDCF) algorithm specifically to address the challenge of accuracy and diversity of the CF algorithm. The numerical results for two benchmark data sets, MovieLens and Netflix, show that the accuracy of the new algorithm outperforms the state-of-the-art CF algorithms. Compared with the random-walk-based CF algorithm proposed by Liu et al. (Int. J. Mod. Phys. C, 20 (2009) 285), the average ranking score reaches 0.0767 and 0.0402, an improvement of 27.3% and 19.1% for MovieLens and Netflix, respectively. In addition, the diversity, precision and recall are also enhanced greatly. Without relying on any context-specific information, tuning the similarity direction of CF algorithms can yield accurate and diverse recommendations. This work suggests that the user similarity direction is an important factor in improving personalized recommendation performance.
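
    The degree-dependent asymmetry described above can be illustrated with a simple directed overlap measure (a hypothetical normalization chosen for illustration; the paper's exact similarity definition may differ):

```python
def directed_similarity(items_u, items_v):
    """Asymmetric similarity from user u to user v: the item overlap
    normalized by u's own degree, so s(u->v) != s(v->u) whenever the
    two users have collected different numbers of items."""
    if not items_u:
        return 0.0
    return len(set(items_u) & set(items_v)) / len(items_u)

# A small-degree user is highly similar to a large-degree user,
# but not vice versa -- the asymmetry the abstract describes.
small_to_large = directed_similarity([1, 2], [1, 2, 3, 4])
large_to_small = directed_similarity([1, 2, 3, 4], [1, 2])
```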

  12. Aggregate versus individual-level sexual behavior assessment: how much detail is needed to accurately estimate HIV/STI risk?

    PubMed

    Pinkerton, Steven D; Galletly, Carol L; McAuliffe, Timothy L; DiFranceisco, Wayne; Raymond, H Fisher; Chesson, Harrell W

    2010-02-01

    The sexual behaviors of HIV/sexually transmitted infection (STI) prevention intervention participants can be assessed on a partner-by-partner basis; in aggregate (i.e., total numbers of sex acts, collapsed across partners); or using a combination of these two methods (e.g., assessing five partners in detail and any remaining partners in aggregate). There is a natural trade-off between the level of sexual behavior detail and the precision of HIV/STI acquisition risk estimates. The results of this study indicate that relatively simple aggregate data collection techniques suffice to adequately estimate HIV risk. For highly infectious STIs, in contrast, accurate risk assessment requires more intensive partner-by-partner methods.
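
    The trade-off described above can be illustrated with a standard Bernoulli-process risk model (a sketch under simplifying assumptions; the parameter names and the even-split aggregation rule are illustrative, not the authors' exact model):

```python
def per_partner_risk(partners, p_act):
    """P(acquisition) with partner-level detail: each (prevalence, n_acts)
    pair is one partner; infection escapes only if no partner transmits."""
    no_infection = 1.0
    for prevalence, n_acts in partners:
        no_infection *= 1.0 - prevalence * (1.0 - (1.0 - p_act) ** n_acts)
    return 1.0 - no_infection

def aggregate_risk(n_partners, total_acts, mean_prevalence, p_act):
    """Aggregate approximation: total acts split evenly across partners
    who all share the mean prevalence."""
    acts_each = total_acts / n_partners
    escape_one = 1.0 - mean_prevalence * (1.0 - (1.0 - p_act) ** acts_each)
    return 1.0 - escape_one ** n_partners

# When partners really are homogeneous, the aggregate estimate matches the
# detailed one; heterogeneity across partners is what opens a gap.
detailed = per_partner_risk([(0.1, 10), (0.1, 10)], p_act=0.01)
aggregate = aggregate_risk(2, 20, mean_prevalence=0.1, p_act=0.01)
```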

  13. The level of detail required in a deformable phantom to accurately perform quality assurance of deformable image registration

    NASA Astrophysics Data System (ADS)

    Saenz, Daniel L.; Kim, Hojin; Chen, Josephine; Stathakis, Sotirios; Kirby, Neil

    2016-09-01

    The primary purpose of the study was to determine how detailed deformable image registration (DIR) phantoms need to be to adequately simulate human anatomy and accurately assess the quality of DIR algorithms. In particular, how many distinct tissues are required in a phantom to simulate complex human anatomy? Pelvis and head-and-neck patient CT images were used for this study as virtual phantoms. Two data sets from each site were analyzed. The virtual phantoms were warped to create two pairs consisting of undeformed and deformed images. Otsu’s method was employed to create additional segmented image pairs of n distinct soft tissue CT number ranges (fat, muscle, etc.). A realistic noise image was added to each image. Deformations were applied in MIM Software (MIM) and Velocity deformable multi-pass (DMP) and compared with the known warping. Images with more simulated tissue levels exhibit more contrast, enabling more accurate results. Deformation error (the magnitude of the vector difference between the known and predicted deformations) was used as a metric to evaluate how many CT number gray levels are needed for a phantom to serve as a realistic patient proxy. Stabilization of the mean deformation error was reached by three soft tissue levels for Velocity DMP and MIM, though MIM exhibited a persisting difference in accuracy between the discrete images and the unprocessed image pair. A minimum detail of three levels allows a realistic patient proxy for use with the Velocity and MIM deformation algorithms.
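
    The deformation-error metric used above is straightforward to sketch (a minimal illustration over 3-D displacement vectors; the data layout is an assumption):

```python
import math

def deformation_error(known, predicted):
    """Mean magnitude of the vector difference between the known (applied)
    and predicted deformation fields, each given as a list of (dx, dy, dz)
    displacement vectors at corresponding voxels."""
    errors = [math.dist(k, p) for k, p in zip(known, predicted)]
    return sum(errors) / len(errors)

# Two voxels: the algorithm predicted no motion, so the error is simply
# the mean magnitude of the known displacements.
known = [(1.0, 0.0, 0.0), (0.0, 2.0, 0.0)]
predicted = [(0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
mean_err = deformation_error(known, predicted)
```

    Stabilization of this mean error as gray levels are added is what identifies the minimum phantom detail.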

  14. A new accurate pill recognition system using imprint information

    NASA Astrophysics Data System (ADS)

    Chen, Zhiyuan; Kamata, Sei-ichiro

    2013-12-01

    Great achievements in modern medicine benefit human beings, but they have also brought about explosive growth in the pharmaceuticals currently on the market. In daily life, pharmaceuticals can confuse people when they are found unlabeled. In this paper, we propose an automatic pill recognition technique to solve this problem. It functions mainly on the basis of the imprint feature of the pills, which is extracted by the proposed MSWT (modified stroke width transform) and described by WSC (weighted shape context). Experiments show that our proposed pill recognition method achieves an accuracy of up to 92.03% within the top 5 ranks when classifying more than 10,000 query pill images into around 2,000 categories.

  15. Building accurate geometric models from abundant range imaging information

    SciTech Connect

    Diegert, C.; Sackos, J.; Nellums, R.

    1997-05-01

    The authors define two simple metrics for accuracy of models built from range imaging information. They apply the metric to a model built from a recent range image taken at the Laser Radar Development and Evaluation Facility (LDERF), Eglin AFB, using a Scannerless Range Imager (SRI) from Sandia National Laboratories. They also present graphical displays of the residual information produced as a byproduct of this measurement, and discuss mechanisms that these data suggest for further improvement in the performance of this already impressive SRI.

  16. The Good, the Strong, and the Accurate: Preschoolers' Evaluations of Informant Attributes

    ERIC Educational Resources Information Center

    Fusaro, Maria; Corriveau, Kathleen H.; Harris, Paul L.

    2011-01-01

    Much recent evidence shows that preschoolers are sensitive to the accuracy of an informant. Faced with two informants, one of whom names familiar objects accurately and the other inaccurately, preschoolers subsequently prefer to learn the names and functions of unfamiliar objects from the more accurate informant. This study examined the inference…

  17. Aggregate versus Individual-Level Sexual Behavior Assessment: How Much Detail Is Needed to Accurately Estimate HIV/STI Risk?

    ERIC Educational Resources Information Center

    Pinkerton, Steven D.; Galletly, Carol L.; McAuliffe, Timothy L.; DiFranceisco, Wayne; Raymond, H. Fisher; Chesson, Harrell W.

    2010-01-01

    The sexual behaviors of HIV/sexually transmitted infection (STI) prevention intervention participants can be assessed on a partner-by-partner basis: in aggregate (i.e., total numbers of sex acts, collapsed across partners) or using a combination of these two methods (e.g., assessing five partners in detail and any remaining partners in aggregate).…

  18. Accurate first-principles detailed-balance determination of auger recombination and impact ionization rates in semiconductors.

    PubMed

    Picozzi, S; Asahi, R; Geller, C B; Freeman, A J

    2002-11-04

    The technologically important prediction of Auger recombination lifetimes in semiconductors is addressed by means of a fully first-principles formalism, based on precise energy bands and wave functions provided by the full-potential linearized augmented plane wave code. The minority-carrier Auger lifetime is determined by two related approaches: (i) a direct evaluation within Fermi's golden rule, and (ii) an indirect evaluation, based on a detailed-balance formulation combining Auger recombination and its inverse process, impact ionization, in a unified framework. Lifetimes determined with the direct and indirect methods show excellent consistency with each other for n-doped GaAs, and with measured values for GaAs and InGaAs. This establishes the computational formalism as a sensitive new tool for materials performance optimization.

  19. Tomato Analyzer: a useful software application to collect accurate and detailed morphological and colorimetric data from two-dimensional objects.

    PubMed

    Rodríguez, Gustavo R; Moyseenko, Jennifer B; Robbins, Matthew D; Morejón, Nancy Huarachi; Francis, David M; van der Knaap, Esther

    2010-03-16

    Measuring fruit morphology and color traits of vegetable and fruit crops in an objective and reproducible way is important for detailed phenotypic analyses of these traits. Tomato Analyzer (TA) is a software program that measures 37 attributes related to two-dimensional shape in a semi-automatic and reproducible manner. Many of these attributes, such as angles at the distal and proximal ends of the fruit and areas of indentation, are difficult to quantify manually. The attributes are organized in ten categories within the software: Basic Measurement, Fruit Shape Index, Blockiness, Homogeneity, Proximal Fruit End Shape, Distal Fruit End Shape, Asymmetry, Internal Eccentricity, Latitudinal Section and Morphometrics. The last category requires neither prior knowledge nor predetermined notions of the shape attributes, so morphometric analysis offers an unbiased option that may be better adapted to high-throughput analyses than attribute analysis. TA also offers the Color Test application, which was designed to collect color measurements from scanned images and allows scanning devices to be calibrated using color standards. TA provides several options to export and analyze shape attribute, morphometric, and color data. The data may be exported to an Excel file in batch mode (more than 100 images at one time) or exported as individual images. The user can choose between output that displays the average for each attribute for the objects in each image (including standard deviation), or output that displays the attribute values for each object on the image. TA has been a valuable and effective tool for identifying and confirming tomato fruit shape Quantitative Trait Loci (QTL), as well as performing in-depth analyses of the effect of key fruit shape genes on plant morphology. Also, TA can be used to objectively classify fruit into various shape categories. Lastly, fruit shape and color traits in other plant species as well as other plant organs such as leaves and seeds

  20. Single-sideband modulator accurately reproduces phase information in 2-Mc signals

    NASA Technical Reports Server (NTRS)

    Strenglein, H. F.

    1966-01-01

    Phase-locked oscillator system employing solid state components acts as a single-sideband modulator to accurately reproduce phase information in 2-Mc signals. This system is useful in telemetry, aircraft communications and position-finding stations, and VHF test circuitry.

  1. PC-based Multiple Information System Interface (PC/MISI) detailed design and implementation plan

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Hall, Philip P.

    1985-01-01

    The design plan for the personal computer multiple information system interface (PC/MISI) project is discussed. The document is intended to be used as a blueprint for the implementation of the system. Each component is described in the detail necessary to allow programmers to implement the system. A description of the system data flow and system file structures is given.

  2. Accurate protein structure modeling using sparse NMR data and homologous structure information.

    PubMed

    Thompson, James M; Sgourakis, Nikolaos G; Liu, Gaohua; Rossi, Paolo; Tang, Yuefeng; Mills, Jeffrey L; Szyperski, Thomas; Montelione, Gaetano T; Baker, David

    2012-06-19

    While information from homologous structures plays a central role in X-ray structure determination by molecular replacement, such information is rarely used in NMR structure determination because it can be incorrect, both locally and globally, when evolutionary relationships are inferred incorrectly or there has been considerable evolutionary structural divergence. Here we describe a method that allows robust modeling of protein structures of up to 225 residues by combining (1)H(N), (13)C, and (15)N backbone and (13)Cβ chemical shift data, distance restraints derived from homologous structures, and a physically realistic all-atom energy function. Accurate models are distinguished from inaccurate models generated using incorrect sequence alignments by requiring that (i) the all-atom energies of models generated using the restraints are lower than those of models generated in unrestrained calculations and (ii) the low-energy structures converge to within 2.0 Å backbone rmsd over 75% of the protein. Benchmark calculations on known structures and blind targets show that the method can accurately model protein structures, even with very remote homology information, to a backbone rmsd of 1.2-1.9 Å relative to the conventionally determined NMR ensembles and of 0.9-1.6 Å relative to X-ray structures for well-defined regions of the protein structures. This approach facilitates the accurate modeling of protein structures using backbone chemical shift data without the need for side-chain resonance assignments and extensive analysis of NOESY cross-peak assignments.

  3. Quantitatively Mapping Cellular Viscosity with Detailed Organelle Information via a Designed PET Fluorescent Probe

    PubMed Central

    Liu, Tianyu; Liu, Xiaogang; Spring, David R.; Qian, Xuhong; Cui, Jingnan; Xu, Zhaochao

    2014-01-01

    Viscosity is a fundamental physical parameter that influences diffusion in biological processes. The distribution of intracellular viscosity is highly heterogeneous, and it is challenging to obtain a full map of cellular viscosity with detailed organelle information. In this work, we report 1 as the first fluorescent viscosity probe which is able to quantitatively map cellular viscosity with detailed organelle information based on the PET mechanism. This probe exhibited a significant ratiometric fluorescence intensity enhancement as solvent viscosity increases. The emission intensity increase was attributed to combined effects of the inhibition of PET due to restricted conformational access (favorable for FRET, but not for PET), and the decreased PET efficiency caused by viscosity-dependent twisted intramolecular charge transfer (TICT). A full map of subcellular viscosity was successfully constructed via fluorescent ratiometric detection and fluorescence lifetime imaging; it was found that lysosomal regions in a cell possess the highest viscosity, followed by mitochondrial regions. PMID:24957323

  4. Quantitatively Mapping Cellular Viscosity with Detailed Organelle Information via a Designed PET Fluorescent Probe

    NASA Astrophysics Data System (ADS)

    Liu, Tianyu; Liu, Xiaogang; Spring, David R.; Qian, Xuhong; Cui, Jingnan; Xu, Zhaochao

    2014-06-01

    Viscosity is a fundamental physical parameter that influences diffusion in biological processes. The distribution of intracellular viscosity is highly heterogeneous, and it is challenging to obtain a full map of cellular viscosity with detailed organelle information. In this work, we report 1 as the first fluorescent viscosity probe which is able to quantitatively map cellular viscosity with detailed organelle information based on the PET mechanism. This probe exhibited a significant ratiometric fluorescence intensity enhancement as solvent viscosity increases. The emission intensity increase was attributed to combined effects of the inhibition of PET due to restricted conformational access (favorable for FRET, but not for PET), and the decreased PET efficiency caused by viscosity-dependent twisted intramolecular charge transfer (TICT). A full map of subcellular viscosity was successfully constructed via fluorescent ratiometric detection and fluorescence lifetime imaging; it was found that lysosomal regions in a cell possess the highest viscosity, followed by mitochondrial regions.

  5. Quantitatively mapping cellular viscosity with detailed organelle information via a designed PET fluorescent probe.

    PubMed

    Liu, Tianyu; Liu, Xiaogang; Spring, David R; Qian, Xuhong; Cui, Jingnan; Xu, Zhaochao

    2014-06-24

    Viscosity is a fundamental physical parameter that influences diffusion in biological processes. The distribution of intracellular viscosity is highly heterogeneous, and it is challenging to obtain a full map of cellular viscosity with detailed organelle information. In this work, we report 1 as the first fluorescent viscosity probe which is able to quantitatively map cellular viscosity with detailed organelle information based on the PET mechanism. This probe exhibited a significant ratiometric fluorescence intensity enhancement as solvent viscosity increases. The emission intensity increase was attributed to combined effects of the inhibition of PET due to restricted conformational access (favorable for FRET, but not for PET), and the decreased PET efficiency caused by viscosity-dependent twisted intramolecular charge transfer (TICT). A full map of subcellular viscosity was successfully constructed via fluorescent ratiometric detection and fluorescence lifetime imaging; it was found that lysosomal regions in a cell possess the highest viscosity, followed by mitochondrial regions.

  6. Cas9-chromatin binding information enables more accurate CRISPR off-target prediction

    PubMed Central

    Singh, Ritambhara; Kuscu, Cem; Quinlan, Aaron; Qi, Yanjun; Adli, Mazhar

    2015-01-01

    The CRISPR system has become a powerful biological tool with a wide range of applications. However, improving targeting specificity and accurately predicting potential off-targets remains a significant goal. Here, we introduce a web-based CRISPR/Cas9 Off-target Prediction and Identification Tool (CROP-IT) that performs improved off-target binding and cleavage site predictions. Unlike existing prediction programs that solely use DNA sequence information, CROP-IT integrates whole-genome-level biological information from existing Cas9 binding and cleavage data sets. Utilizing whole-genome chromatin state information from 125 human cell types further enhances its computational prediction power. Comparative analyses on experimentally validated datasets show that CROP-IT outperforms existing computational algorithms in predicting both Cas9 binding and cleavage sites. With a user-friendly web interface, CROP-IT outputs a scored and ranked list of potential off-targets that enables improved guide RNA design and more accurate prediction of Cas9 binding or cleavage sites. PMID:26032770

  7. The utility of accurate mass and LC elution time information in the analysis of complex proteomes

    SciTech Connect

    Norbeck, Angela D.; Monroe, Matthew E.; Adkins, Joshua N.; Anderson, Kevin K.; Daly, Don S.; Smith, Richard D.

    2005-08-01

    Theoretical tryptic digests of all predicted proteins from the genomes of three organisms of varying complexity were evaluated for specificity and possible utility of combined peptide accurate mass and predicted LC normalized elution time (NET) information. The uniqueness of each peptide was evaluated using its combined mass (+/- 5 ppm and 1 ppm) and NET value (no constraint, +/- 0.05 and 0.01 on a 0-1 NET scale). The set of peptides both underestimates actual biological complexity due to the lack of specific modifications, and overestimates the expected complexity since many proteins will not be present in the sample or observable on the mass spectrometer because of dynamic range limitations. Once a peptide is identified from an LC-MS/MS experiment, its mass and elution time is representative of a unique fingerprint for that peptide. The uniqueness of that fingerprint in comparison to that for the other peptides present is indicative of the ability to confidently identify that peptide based on accurate mass and NET measurements. These measurements can be made using HPLC coupled with high resolution MS in a high-throughput manner. Results show that for organisms with comparatively small proteomes, such as Deinococcus radiodurans, modest mass and elution time accuracies are generally adequate for peptide identifications. For more complex proteomes, increasingly accurate measurements are required. However, the majority of proteins should be uniquely identifiable by using LC-MS with mass accuracies within +/- 1 ppm and elution time measurements within +/- 0.01 NET.

  8. The utility of accurate mass and LC elution time information in the analysis of complex proteomes

    PubMed Central

    Norbeck, Angela D.; Monroe, Matthew E.; Adkins, Joshua N.; Smith, Richard D.

    2007-01-01

    Theoretical tryptic digests of all predicted proteins from the genomes of three organisms of varying complexity were evaluated for specificity and possible utility of combined peptide accurate mass and predicted LC normalized elution time (NET) information. The uniqueness of each peptide was evaluated using its combined mass (+/− 5 ppm and 1 ppm) and NET value (no constraint, +/− 0.05 and 0.01 on a 0–1 NET scale). The set of peptides both underestimates actual biological complexity due to the lack of specific modifications, and overestimates the expected complexity since many proteins will not be present in the sample or observable on the mass spectrometer because of dynamic range limitations. Once a peptide is identified from an LC-MS/MS experiment, its mass and elution time is representative of a unique fingerprint for that peptide. The uniqueness of that fingerprint in comparison to that for the other peptides present is indicative of the ability to confidently identify that peptide based on accurate mass and NET measurements. These measurements can be made using HPLC coupled with high resolution MS in a high-throughput manner. Results show that for organisms with comparatively small proteomes, such as Deinococcus radiodurans, modest mass and elution time accuracies are generally adequate for peptide identifications. For more complex proteomes, increasingly accurate measurements are required. However, the majority of proteins should be uniquely identifiable by using LC-MS with mass accuracies within +/− 1 ppm and elution time measurements within +/− 0.01 NET. PMID:15979333
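
    The fingerprint-uniqueness test described in the two records above can be sketched as follows (a minimal illustration; the tolerance defaults mirror the tightest windows quoted in the abstracts, and the data layout is an assumption):

```python
def is_unique_fingerprint(peptides, index, ppm_tol=1.0, net_tol=0.01):
    """True if peptide `index` is the only entry whose mass (within
    +/- ppm_tol parts-per-million) and normalized elution time (within
    +/- net_tol on the 0-1 NET scale) both match its own fingerprint.
    `peptides` is a list of (monoisotopic_mass_da, net) tuples."""
    mass, net = peptides[index]
    for i, (m, n) in enumerate(peptides):
        if i == index:
            continue
        if abs(m - mass) <= mass * ppm_tol * 1e-6 and abs(n - net) <= net_tol:
            return False
    return True

# Peptides 0 and 1 collide in both mass (0.5 mDa apart at 1000 Da,
# i.e. 0.5 ppm) and NET; peptide 2 shares a mass but elutes far away.
peptides = [(1000.0, 0.30), (1000.0005, 0.30), (1000.0, 0.50)]
```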

  9. Fracture Network Characteristics Informed by Detailed Studies of Chlorinated Solvent Plumes in Sedimentary Rock Aquifers

    NASA Astrophysics Data System (ADS)

    Parker, B. L.; Chapman, S.

    2015-12-01

    Various numerical approaches have been used to simulate contaminant plumes in fractured porous rock, but the one that allows field and laboratory measurements to be most directly used as inputs to these models is the Discrete Fracture Network (DFN) Approach. To effectively account for fracture-matrix interactions, emphasis must be placed on identifying and parameterizing all of the fractures that participate substantially in groundwater flow and contaminant transport. High resolution plume studies at four primary research sites, where chlorinated solvent plumes serve as long-term (several decades) tracer tests, provide insight concerning the density of the fracture network unattainable by conventional methods. Datasets include contaminant profiles from detailed VOC subsampling informed by continuous core logs, hydraulic head and transmissivity profiles, packer testing and sensitive temperature logging methods in FLUTe™ lined holes. These show the presence of many more transmissive fractures, in contrast to observations of only a few flow zones per borehole obtained from conventional hydraulic tests including flow metering in open boreholes. Incorporating many more fractures with a wider range of transmissivities is key to predicting contaminant migration. This new understanding of dense fracture networks, combined with matrix property measurements, has informed 2-D DFN flow and transport modelling using Fractran and HydroGeosphere to simulate plume characteristics ground-truthed by detailed field site plume characterization. These process-based simulations corroborate field findings that plumes in sedimentary rock after decades of transport show limited plume front distances and strong internal plume attenuation by diffusion, transverse dispersion and slow degradation. This successful application of DFN modeling informed by field-derived parameters demonstrates how the DFN Approach can be applied to other sites to inform plume migration rates and remedial efficacy.

  10. Detailed Clinical Models: Representing Knowledge, Data and Semantics in Healthcare Information Technology

    PubMed Central

    2014-01-01

    Objectives This paper will present an overview of the developmental effort in harmonizing clinical knowledge modeling using the Detailed Clinical Models (DCMs), and will explain how it can contribute to the preservation of Electronic Health Records (EHR) data. Methods Clinical knowledge modeling is vital for the management and preservation of EHR and data. Such modeling provides common data elements and terminology binding with the intention of capturing and managing clinical information over time and location independent from technology. Any EHR data exchange without an agreed clinical knowledge modeling will potentially result in loss of information. Results Many attempts exist from the past to model clinical knowledge for the benefits of semantic interoperability using standardized data representation and common terminologies. The objective of each project is similar with respect to consistent representation of clinical data, using standardized terminologies, and an overall logical approach. However, the conceptual, logical, and the technical expressions are quite different in one clinical knowledge modeling approach versus another. There are currently synergies under the Clinical Information Modeling Initiative (CIMI) to create a harmonized reference model for clinical knowledge models. Conclusions The goal for the CIMI is to create a reference model and formalisms based on, for instance, the DCM (ISO/TS 13972), among other work. A global repository of DCMs may potentially be established in the future. PMID:25152829

  11. Advanced information processing system: The Army Fault-Tolerant Architecture detailed design overview

    NASA Technical Reports Server (NTRS)

    Harper, Richard E.; Babikyan, Carol A.; Butler, Bryan P.; Clasen, Robert J.; Harris, Chris H.; Lala, Jaynarayan H.; Masotto, Thomas K.; Nagle, Gail A.; Prizant, Mark J.; Treadwell, Steven

    1994-01-01

    The Army Avionics Research and Development Activity (AVRADA) is pursuing programs that would enable effective and efficient management of the large amounts of situational data that occur during tactical rotorcraft missions. The Computer Aided Low Altitude Night Helicopter Flight Program has identified automated Terrain Following/Terrain Avoidance, Nap of the Earth (TF/TA, NOE) operation as a key enabling technology for advanced tactical rotorcraft to enhance mission survivability and mission effectiveness. The processing of critical information at low altitudes with short reaction times is life-critical and mission-critical, necessitating an ultra-reliable, high-throughput computing platform for dependable service for flight control, fusion of sensor data, route planning, near-field/far-field navigation, and obstacle avoidance operations. To address these needs the Army Fault Tolerant Architecture (AFTA) is being designed and developed. This computer system is based upon the Fault Tolerant Parallel Processor (FTPP) developed by Charles Stark Draper Labs (CSDL). AFTA is a hard real-time, Byzantine fault-tolerant parallel processor programmed in the Ada language. This document describes the results of the Detailed Design (Phases 2 and 3 of a 3-year project) of the AFTA development. This document contains detailed descriptions of the program objectives, the TF/TA NOE application requirements, architecture, hardware design, operating systems design, systems performance measurements and analytical models.

  12. Using geometrical, textural, and contextual information of land parcels for classification of detailed urban land use

    USGS Publications Warehouse

    Wu, S.-S.; Qiu, X.; Usery, E.L.; Wang, L.

    2009-01-01

    Detailed urban land use data are important to government officials, researchers, and businesspeople for a variety of purposes. This article presents an approach to classifying detailed urban land use based on geometrical, textural, and contextual information of land parcels. An area of 6 by 14 km in Austin, Texas, with land parcel boundaries delineated by the Travis Central Appraisal District of Travis County, Texas, is tested for the approach. We derive fifty parcel attributes from relevant geographic information system (GIS) and remote sensing data and use them to discriminate among nine urban land uses: single family, multifamily, commercial, office, industrial, civic, open space, transportation, and undeveloped. Half of the 33,025 parcels in the study area are used as training data for land use classification and the other half are used as testing data for accuracy assessment. The best result with a decision tree classification algorithm has an overall accuracy of 96 percent and a kappa coefficient of 0.78, and two naive, baseline models based on the majority rule and the spatial autocorrelation rule have overall accuracy of 89 percent and 79 percent, respectively. The algorithm is relatively good at classifying single-family, multifamily, commercial, open space, and undeveloped land uses and relatively poor at classifying office, industrial, civic, and transportation land uses. The most important attributes for land use classification are the geometrical attributes, particularly those related to building areas. Next are the contextual attributes, particularly those relevant to the spatial relationship between buildings, then the textural attributes, particularly the semivariance texture statistic from 0.61-m resolution images.

  13. Highly Accurate Prediction of Protein-Protein Interactions via Incorporating Evolutionary Information and Physicochemical Characteristics

    PubMed Central

    Li, Zheng-Wei; You, Zhu-Hong; Chen, Xing; Gui, Jie; Nie, Ru

    2016-01-01

    Protein-protein interactions (PPIs) occur at almost all levels of cell functions and play crucial roles in various cellular processes. Thus, identification of PPIs is critical for deciphering the molecular mechanisms and further providing insight into biological processes. Although a variety of high-throughput experimental techniques have been developed to identify PPIs, existing PPI pairs by experimental approaches only cover a small fraction of the whole PPI networks, and further, those approaches hold inherent disadvantages, such as being time-consuming, expensive, and having high false positive rate. Therefore, it is urgent and imperative to develop automatic in silico approaches to predict PPIs efficiently and accurately. In this article, we propose a novel mixture of physicochemical and evolutionary-based feature extraction method for predicting PPIs using our newly developed discriminative vector machine (DVM) classifier. The improvements of the proposed method mainly consist in introducing an effective feature extraction method that can capture discriminative features from the evolutionary-based information and physicochemical characteristics, and then a powerful and robust DVM classifier is employed. To the best of our knowledge, it is the first time that DVM model is applied to the field of bioinformatics. When applying the proposed method to the Yeast and Helicobacter pylori (H. pylori) datasets, we obtain excellent prediction accuracies of 94.35% and 90.61%, respectively. The computational results indicate that our method is effective and robust for predicting PPIs, and can be taken as a useful supplementary tool to the traditional experimental methods for future proteomics research. PMID:27571061

  14. Arthroscopic optical coherence tomography provides detailed information on articular cartilage lesions in horses.

    PubMed

    te Moller, N C R; Brommer, H; Liukkonen, J; Virén, T; Timonen, M; Puhakka, P H; Jurvelin, J S; van Weeren, P R; Töyräs, J

    2013-09-01

    Arthroscopy enables direct inspection of the articular surface, but provides no information on deeper cartilage layers. Optical coherence tomography (OCT), based on measurement of reflection and backscattering of light, is a diagnostic technique used in cardiovascular surgery and ophthalmology. It provides cross-sectional images at resolutions comparable to that of low-power microscopy. The aim of this study was to determine if OCT is feasible for advanced clinical assessment of lesions in equine articular cartilage during diagnostic arthroscopy. Diagnostic arthroscopy of 36 metacarpophalangeal joints was carried out ex vivo. Of these, 18 joints with varying degrees of cartilage damage were selected, wherein OCT arthroscopy was conducted using an OCT catheter (diameter 0.9 mm) inserted through standard instrument portals. Five sites of interest, occasionally supplemented with other locations where defects were encountered, were arthroscopically graded according to the International Cartilage Repair Society (ICRS) classification system. The same sites were evaluated qualitatively (ICRS classification and morphological description of the lesions) and quantitatively (measurement of cartilage thickness) on OCT images. OCT provided high resolution images of cartilage enabling determination of cartilage thickness. Comparing ICRS grades determined by both arthroscopy and OCT revealed poor agreement. Furthermore, OCT visualised a spectrum of lesions, including cavitation, fibrillation, superficial and deep clefts, erosion, ulceration and fragmentation. In addition, with OCT the arthroscopically inaccessible area between the dorsal MC3 and P1 was reachable in some cases. Arthroscopically-guided OCT provided more detailed and quantitative information on the morphology of articular cartilage lesions than conventional arthroscopy. OCT could therefore improve the diagnostic value of arthroscopy in equine orthopaedic surgery.

  15. Detailed Soil Information for Hydrologic Modeling in the Conterminous United States

    NASA Astrophysics Data System (ADS)

    Bliss, N. B.; Waltman, S. W.; Neale, A. C.

    2010-12-01

    Detailed soil data for the Conterminous United States are being made available to hydrologic modelers and others in a new gridded format. The Soil Survey Geographic (SSURGO) Database is now 86 percent complete for the Conterminous United States. The soil properties of interest to hydrologists include available water capacity, bulk density, saturated hydraulic conductivity, field capacity, porosity, average soil thickness, soil organic matter or carbon content, and percentages of sand, silt, clay, and rocks. The methods for creating the gridded format data summarize the attributes across soil horizons and soil components to create a value for each attribute at the mapunit level. Separate gridded products can be developed for specific depth zones, as required. The SSURGO data are being continuously improved by the National Cooperative Soil Survey under the leadership of the U.S. Department of Agriculture Natural Resources Conservation Service (NRCS). Readily accessible gridded soils data have several advantages over vector data, such as easier integration with other land surface datasets. Currently, the data are available at a 30-meter resolution in the Albers Equal Area projection. The compilation of the new database has been made possible as part of a National Atlas of Ecosystem Services being developed under the leadership of the US Environmental Protection Agency (EPA), along with many partner organizations including the NRCS and the United States Geological Survey. When complete, the atlas information will include many ecosystem features and will be used in a wide variety of ecosystem service assessments.
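
The horizon-to-component-to-mapunit roll-up described above amounts to two weighted averages: a depth-weighted average over horizons within a chosen depth zone, then a component-percentage-weighted average over components. The field names and values in this sketch are illustrative, not SSURGO's actual schema:

```python
def depth_weighted(horizons, top_cm=0, bottom_cm=100):
    """Average an attribute over horizons, weighted by overlap with a depth zone."""
    num = den = 0.0
    for h in horizons:
        overlap = max(0.0, min(h["bottom"], bottom_cm) - max(h["top"], top_cm))
        num += h["value"] * overlap
        den += overlap
    return num / den if den else None

def mapunit_value(components, **zone):
    """Weight each component's depth-aggregated value by its percent of the mapunit."""
    num = den = 0.0
    for comp in components:
        v = depth_weighted(comp["horizons"], **zone)
        if v is not None:
            num += v * comp["pct"]
            den += comp["pct"]
    return num / den if den else None

# Hypothetical mapunit: two components with available-water-capacity-like values
mu = [
    {"pct": 60, "horizons": [{"top": 0, "bottom": 30, "value": 0.18},
                             {"top": 30, "bottom": 100, "value": 0.12}]},
    {"pct": 40, "horizons": [{"top": 0, "bottom": 100, "value": 0.10}]},
]
print(round(mapunit_value(mu, top_cm=0, bottom_cm=100), 4))  # 0.1228
```

The depth-zone arguments mirror the abstract's point that separate gridded products can be produced for specific depth zones by re-running the same roll-up with different bounds.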

  16. The Sunspot Number and beyond : reconstructing detailed solar information over centuries

    NASA Astrophysics Data System (ADS)

    Lefevre, L.

    2014-12-01

    With four centuries of solar evolution, the International Sunspot Number (SSN) forms the longest solar time series currently available. It provides an essential reference for understanding and quantifying how the solar output has varied over decades and centuries and thus for assessing the variations of the main natural forcing on the Earth climate. Because of its importance, this unique time-series must be closely monitored for any possible biases and drifts. Here, we report about recent disagreements between solar indices, for example the sunspot number and the 10.7 cm radio flux. Recent analyses indicate that while part of this divergence may be due to a calibration drift in the SSN, it also results from an intrinsic change in the global magnetic parameters of sunspots and solar active regions, suggesting a possible transition to a new activity regime. Going beyond the SSN series, in the framework of the TOSCA (www.cost-tosca.eu/) and SOLID (projects.pmodwrc.ch/solid/) projects, we produced a survey of all existing catalogs providing detailed sunspot information (Lefevre & Clette, 2014:10.1007/s11207-012-0184-5) and we also located different primary solar images and drawing collections that can be exploitable to complement the existing catalogs. These are first steps towards the construction of a multi-parametric time series of multiple sunspot and sunspot group properties over more than a century, allowing reconstruction and extension of the current 1-D SSN series. By bringing new spatial, morphological and evolutionary information, such a data set should bring major advances for the modeling of the solar dynamo and solar irradiance. We will present here the current status of this work. The preliminary version of the catalog now extends over the last 150 years. It makes use of data from DPD (http://fenyi.solarobs.unideb.hu/DPD/index.html), from the Uccle Solar Equatorial Table (USET:http://sidc.oma.be/uset/) operated by the Royal Observatory of Belgium, the Greenwich

  17. Towards a first detailed reconstruction of sunspot information over the last 150 years

    NASA Astrophysics Data System (ADS)

    Lefevre, Laure; Clette, Frédéric

    2013-04-01

    With four centuries of solar evolution, the International Sunspot Number (SSN) forms the longest solar time series currently available. It provides an essential reference for understanding and quantifying how the solar output has varied over decades and centuries and thus for assessing the variations of the main natural forcing on the Earth climate. For such a quantitative use, this unique time-series must be closely monitored for any possible biases and drifts. This is the main objective of the Sunspot Workshops organized jointly by the National Solar Observatory (NSO) and the Royal Observatory of Belgium (ROB) since 2010. Here, we will report about some recent outcomes of past workshops, like diagnostics of scaling errors and their proposed corrections, or the recent disagreement between the sunspot number and other solar indices like the 10.7 cm radio flux. Our most recent analyses indicate that while part of this divergence may be due to a calibration drift in the SSN, it also results from an intrinsic change in the global magnetic parameters of sunspots and solar active regions, suggesting a possible transition to a new activity regime. Going beyond the SSN series, in the framework of the SOTERIA, TOSCA and SOLID projects, we produced a survey of all existing catalogs providing detailed sunspot information and we also located different primary solar images and drawing collections that can be exploitable to complement the existing catalogs (COMESEP project). These are first steps towards the construction of a multi-parametric time series of multiple sunspot group properties over at least the last 150 years, allowing reconstruction and extension of the current 1-D SSN series. By bringing new spatial, morphological and evolutionary information, such a data set should bring major advances for the modeling of the solar dynamo and solar irradiance. We will present here the current status of this work. The catalog now extends over the last 3 cycles (Lefevre & Clette 2011

  18. Coupling geostatistics to detailed reservoir description allows better visualization and more accurate characterization/simulation of turbidite reservoirs: Elk Hills oil field, California

    SciTech Connect

    Allan, M.E.; Wilson, M.L.; Wightman, J.

    1996-01-01

    The Elk Hills giant oilfield, located in the southern San Joaquin Valley of California, has produced 1.1 billion barrels of oil from Miocene and shallow Pliocene reservoirs. 65% of the current 64,000 BOPD production is from the pressure-supported, deeper Miocene turbidite sands. In the turbidite sands of the 31 S structure, large porosity and permeability variations in the Main Body B and Western 31 S sands cause problems with the efficiency of the waterflooding. These variations have now been quantified and visualized using geostatistics. The end result is a more detailed reservoir characterization for simulation. Traditional reservoir descriptions based on marker correlations, cross-sections and mapping do not provide enough detail to capture the short-scale stratigraphic heterogeneity needed for adequate reservoir simulation. These deterministic descriptions are inadequate to tie with production data as the thinly bedded sand/shale sequences blur into a falsely homogeneous picture. By studying the variability of the geologic and petrophysical data vertically within each wellbore and spatially from well to well, a geostatistical reservoir description has been developed. It captures the natural variability of the sands and shales that was lacking from earlier work. These geostatistical studies allow the geologic and petrophysical characteristics to be considered in a probabilistic model. The end-product is a reservoir description that captures the variability of the reservoir sequences and can be used as a more realistic starting point for history matching and reservoir simulation.
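
The vertical and well-to-well variability analysis described above is typically summarized with an experimental semivariogram: half the mean squared difference between sample values, binned by separation distance. A minimal 1-D sketch, with hypothetical porosity-like values along a transect:

```python
def semivariogram(positions, values, lags, tol):
    """gamma(h) = 0.5 * mean[(z_i - z_j)^2] over pairs separated by roughly h."""
    gammas = []
    for h in lags:
        sq, n = 0.0, 0
        for i in range(len(values)):
            for j in range(i + 1, len(values)):
                if abs(abs(positions[i] - positions[j]) - h) <= tol:
                    sq += (values[i] - values[j]) ** 2
                    n += 1
        gammas.append(0.5 * sq / n if n else None)
    return gammas

# Hypothetical porosity fractions sampled every 10 m along a transect
x = [0, 10, 20, 30, 40, 50]
z = [0.21, 0.19, 0.24, 0.18, 0.22, 0.20]
print(semivariogram(x, z, lags=[10, 20], tol=1.0))
```

Fitting a model to such a curve gives the correlation ranges used to condition the probabilistic realizations that replace a single deterministic description.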


  20. Accurate determination of the reaction course in HY2 ↔ Y + YH (Y = O, S): detailed analysis of the covalent- to hydrogen-bonding transition.

    PubMed

    Varandas, A J C

    2013-08-15

    The accurate prediction of a bond-breaking/bond-forming reaction course is useful but very difficult. Toward this goal, a cost-effective multireference scheme (A. J. C. Varandas, J. Chem. Theory Comput. 2012, 8, 428) is tested that provides a generalization of the Hartree-Fock plus dispersion model for closed-shell interactions, and hence is based on the popular but largely untested idea of performing single point calculations with a high-level method at stationary points or along paths located using a lower level method. The energetics so calculated for the reaction HO2 ↔ O + OH are predicted in excellent agreement with the experimental data, whereas the reaction path shows a scar at the onset of hydrogen-bonding: a weak van der Waals type minimum separated from the deep covalent well by a small barrier, all below the O + OH asymptote. The O-OH long-range interaction potential is also examined and possible implications in reaction dynamics discussed. Corresponding attributes for the reaction HS2 ↔ S + SH are predicted, in good agreement with the best theoretical and experimental results. A perspective on the general utility of the approach is presented.

  1. Assessment of Preferences for Classification Detail in Medical Information: Is Uniformity Better?

    ERIC Educational Resources Information Center

    Lorence, Daniel P.; Spink, Amanda

    2003-01-01

    Reports results from a national study into the perceived variation reported by health information managers related to the relevance-efficiency trade-offs of information classification across regions and practice settings. Findings suggest that due to major regional variation, stringent national information standards may be counterproductive for…

  2. Accurate refinement of docked protein complexes using evolutionary information and deep learning.

    PubMed

    Akbal-Delibas, Bahar; Farhoodi, Roshanak; Pomplun, Marc; Haspel, Nurit

    2016-06-01

    One of the major challenges for protein docking methods is to accurately discriminate native-like structures from false positives. Docking methods are often inaccurate and the results have to be refined and re-ranked to obtain native-like complexes and remove outliers. In a previous work, we introduced AccuRefiner, a machine learning based tool for refining protein-protein complexes. Given a docked complex, the refinement tool produces a small set of refined versions of the input complex, with lower root-mean-square-deviation (RMSD) of atomic positions with respect to the native structure. The method employs a unique ranking tool that accurately predicts the RMSD of docked complexes with respect to the native structure. In this work, we use a deep learning network with a similar set of features and five layers. We show that a properly trained deep learning network can accurately predict the RMSD of a docked complex with a 1.40 Å error margin on average, by approximating the complex relationship between a wide set of scoring function terms and the RMSD of a docked structure. The network was trained on 35000 unbound docking complexes generated by RosettaDock. We tested our method on 25 different putative docked complexes, also produced by RosettaDock, for five proteins that were not included in the training data. The results demonstrate that the high accuracy of the ranking tool enables AccuRefiner to consistently choose the refinement candidates with lower RMSD values compared to the coarsely docked input structures.
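
The RMSD that the ranking tool predicts is a standard quantity: the root-mean-square deviation between corresponding atom positions of a docked pose and the native structure. A minimal sketch with illustrative coordinates (and no superposition or alignment step, which real pipelines apply first):

```python
import math

def rmsd(coords_a, coords_b):
    """RMSD over paired (x, y, z) atom positions, in matching units (e.g. Å)."""
    assert len(coords_a) == len(coords_b)
    sq = sum(
        (ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
        for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b)
    )
    return math.sqrt(sq / len(coords_a))

# Hypothetical 3-atom fragments of a native structure and a docked pose
native = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (3.0, 0.0, 0.0)]
docked = [(0.0, 0.0, 0.0), (1.5, 1.0, 0.0), (3.0, 0.0, 1.0)]
print(round(rmsd(native, docked), 3))  # 0.816
```

Predicting this value directly from scoring-function terms, as the abstract describes, avoids needing the native structure at ranking time.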

  3. 78 FR 42796 - 30-Day Notice of Proposed Information Collection: HUD Standard Grant Application Forms: Detailed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-17

    ...: Detailed Budget Form (HUD-424-CB), Budget Worksheet (HUD-424CBW), Application for Federal Assistance (SF...-424-CB), Budget Worksheet (HUD- 424CBW), Application for Federal Assistance (SF-424), and the Third... the Paperwork Reduction Act of 1995, 44 U.S.C. Chapter 35. Dated: July 9, 2013. Colette...

  4. Obtaining detailed structural information about supramolecular systems on surfaces by combining high-resolution force microscopy with ab initio calculations.

    PubMed

    Kawai, Shigeki; Sadeghi, Ali; Xu, Feng; Peng, Lifen; Pawlak, Rémy; Glatzel, Thilo; Willand, Alexander; Orita, Akihiro; Otera, Junzo; Goedecker, Stefan; Meyer, Ernst

    2013-10-22

    State-of-the art experimental techniques such as scanning tunneling microscopy have great difficulties in extracting detailed structural information about molecules adsorbed on surfaces. By combining atomic force microscopy and Kelvin probe force microscopy with ab initio calculations, we demonstrate that we can obtain a wealth of detailed structural information about the molecule itself and its environment. Studying an FFPB molecule on a gold surface, we are able to determine its exact location on the surface, the nature of its bonding properties with neighboring molecules that lead to the growth of one-dimensional strips, and the internal torsions and bendings of the molecule.

  5. Information Systems Security and Computer Crime in the IS Curriculum: A Detailed Examination

    ERIC Educational Resources Information Center

    Foltz, C. Bryan; Renwick, Janet S.

    2011-01-01

    The authors examined the extent to which information systems (IS) security and computer crime are covered in information systems programs. Results suggest that IS faculty believe security coverage should be increased in required, elective, and non-IS courses. However, respondent faculty members are concerned that existing curricula leave little…

  6. Polyallelic structural variants can provide accurate, highly informative genetic markers focused on diagnosis and therapeutic targets: Accuracy vs. Precision.

    PubMed

    Roses, A D

    2016-02-01

    Structural variants (SVs) include all insertions, deletions, and rearrangements in the genome, with several common types of nucleotide repeats including single sequence repeats, short tandem repeats, and insertion-deletion length variants. Polyallelic SVs provide highly informative markers for association studies with well-phenotyped cohorts. SVs can influence gene regulation by affecting epigenetics, transcription, splicing, and/or translation. Accurate assays of polyallelic SV loci are required to define the range and allele frequency of variable length alleles.

  7. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 16 Commercial Practices 2 2012-01-01 2012-01-01 false Reasonable steps to assure information is... to the best of the submitter's knowledge and belief, provided that: (i) The confirmation is made by... physician; or (v) The confirmation is made by a parent or guardian of a child involved in an...

  8. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Reasonable steps to assure information is... to the best of the submitter's knowledge and belief, provided that: (i) The confirmation is made by... physician; or (v) The confirmation is made by a parent or guardian of a child involved in an...

  9. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 2 2014-01-01 2014-01-01 false Reasonable steps to assure information is... to the best of the submitter's knowledge and belief, provided that: (i) The confirmation is made by... physician; or (v) The confirmation is made by a parent or guardian of a child involved in an...

  10. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 2 2013-01-01 2013-01-01 false Reasonable steps to assure information is... to the best of the submitter's knowledge and belief, provided that: (i) The confirmation is made by... physician; or (v) The confirmation is made by a parent or guardian of a child involved in an...

  11. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 2 2011-01-01 2011-01-01 false Reasonable steps to assure information is... to the best of the submitter's knowledge and belief, provided that: (i) The confirmation is made by... physician; or (v) The confirmation is made by a parent or guardian of a child involved in an...

  12. Detailed requirements document for common software of shuttle program information management system

    NASA Technical Reports Server (NTRS)

    Everette, J. M.; Bradfield, L. D.; Horton, C. L.

    1975-01-01

    Common software was investigated as a method for minimizing development and maintenance cost of the shuttle program information management system (SPIMS) applications while reducing the time-frame of their development. Those requirements satisfying these criteria are presented along with the stand-alone modules which may be used directly by applications. The SPIMS applications operating on the CYBER 74 computer, are specialized information management systems which use System 2000 as a data base manager. Common software provides the features to support user interactions on a CRT terminal using form input and command response capabilities. These features are available as subroutines to the applications.

  13. Exploratory Movement Generates Higher-Order Information That Is Sufficient for Accurate Perception of Scaled Egocentric Distance

    PubMed Central

    Mantel, Bruno; Stoffregen, Thomas A.; Campbell, Alain; Bardy, Benoît G.

    2015-01-01

    Body movement influences the structure of multiple forms of ambient energy, including optics and gravito-inertial force. Some researchers have argued that egocentric distance is derived from inferential integration of visual and non-visual stimulation. We suggest that accurate information about egocentric distance exists in perceptual stimulation as higher-order patterns that extend across optics and inertia. We formalize a pattern that specifies the egocentric distance of a stationary object across higher-order relations between optics and inertia. This higher-order parameter is created by self-generated movement of the perceiver in inertial space relative to the illuminated environment. For this reason, we placed minimal restrictions on the exploratory movements of our participants. We asked whether humans can detect and use the information available in this higher-order pattern. Participants judged whether a virtual object was within reach. We manipulated relations between body movement and the ambient structure of optics and inertia. Judgments were precise and accurate when the higher-order optical-inertial parameter was available. When only optic flow was available, judgments were poor. Our results reveal that participants perceived egocentric distance from the higher-order, optical-inertial consequences of their own exploratory activity. Analysis of participants’ movement trajectories revealed that self-selected movements were complex, and tended to optimize availability of the optical-inertial pattern that specifies egocentric distance. We argue that accurate information about egocentric distance exists in higher-order patterns of ambient energy, that self-generated movement can generate these higher-order patterns, and that these patterns can be detected and used to support perception of egocentric distance that is precise and accurate. PMID:25856410

  14. A Tale of Two Course Guides: Providing Students with Detailed Course Information

    ERIC Educational Resources Information Center

    Hanson, Karen; Williamson, Kasi

    2010-01-01

    Where do students find out about courses they might take? Potentially, from just about anywhere: friends, bulletin boards, department Web sites, advisors, e-mails, or flyers posted in the halls. Of course, some of these sources are more trustworthy than others. Where should students go to get reliable information that can help them make wise…

  15. Detailed design specification for the ALT Shuttle Information Extraction Subsystem (SIES)

    NASA Technical Reports Server (NTRS)

    Clouette, G. L.; Fitzpatrick, W. N.

    1976-01-01

The approach and landing test (ALT) shuttle information extraction system (SIES) is described in terms of general requirements and system characteristics, output products and processing options, output products and data sources, and system data flow. The ALT SIES is a data reduction system designed to satisfy certain data processing requirements for the ALT phase of the space shuttle program. The specific ALT SIES data processing requirements are stated in the data reduction complex approach and landing test data processing requirements. In general, ALT SIES must produce time-correlated data products as a result of standardized data reduction or special purpose analytical processes. The main characteristics of ALT SIES are: (1) the system operates in a batch (non-interactive) mode; (2) the processing is table driven; (3) it is data base oriented; (4) it has simple operating procedures; and (5) it requires a minimum of run time information.

  16. An Information Management Study for Headquarters Department of the Army, Phase 1 Detailed Report.

    DTIC Science & Technology

    1979-06-12

education, (4) information-resource technology assessment, (5) metadata management, (6) data base administration guidance, (7) data standardization, (8)…such a unified system, even with today's technology. Instead what will be recommended is the establishment of a managerial approach which can…

  17. Informed Consent for Interventional Radiology Procedures: A Survey Detailing Current European Practice

    SciTech Connect

    O'Dwyer, H.M.; Lyon, S.M.; Fotheringham, T.; Lee, M.J.

    2003-09-15

Purpose: Official recommendations for obtaining informed consent for interventional radiology procedures are that the patient gives their consent to the operator more than 24 hr prior to the procedure. This has significant implications for interventional radiology practice. The purpose of this study was to identify the proportion of European interventional radiologists who conform to these guidelines. Methods: A questionnaire was designed consisting of 12 questions on current working practice and opinions regarding informed consent. These questions related to where, when and by whom consent was obtained from the patient. Questions also related to the use of formal consent forms and written patient information leaflets. Respondents were asked whether they felt patients received adequate explanation regarding indications for intervention, the procedure, alternative treatment options and complications. The questionnaire was distributed to 786 European interventional radiologists who were members of interventional societies. The anonymous replies were then entered into a database and analyzed. Results: Two hundred and fifty-four (32.3%) questionnaires were returned. Institutions were classified as academic (56.7%), non-academic (40.5%) or private (2.8%). Depending on the procedure, in a significant proportion of patients consent was obtained in the outpatient department (22%), on the ward (65%) and in the radiology day case ward (25%), but in over half (56%) of patients consent or re-consent was obtained in the interventional suite. Fifty percent of respondents indicated that they obtain consent more than 24 hr before some procedures, in 42.9% consent is obtained on the morning of the procedure and 48.8% indicated that in some patients consent is obtained immediately before the procedure. We found that junior medical staff obtained consent in 58% of cases. Eighty-two percent of respondents do not use specific consent forms and 61% have patient information leaflets. The

  18. Honey bees can perform accurately directed waggle dances based solely on information from a homeward trip.

    PubMed

    Edrich, Wolfgang

    2015-10-01

Honey bees were displaced several hundred metres from their hive to an unfamiliar site and provisioned with honey. After feeding, almost two-thirds of the bees flew home to their hive within a 50 min observation time. About half of these returning bees signalled the direction of the release site in waggle dances, thus demonstrating that the dance can be guided entirely by information gathered on a single homeward trip. The likely reason for the bees' enthusiastic dancing on their initial return from this new site was the highly rewarding honeycomb that they were given there. The attractive nature of the site is confirmed by many of these bees revisiting the site and continuing to forage there.

  19. Accurately decoding visual information from fMRI data obtained in a realistic virtual environment

    PubMed Central

    Floren, Andrew; Naylor, Bruce; Miikkulainen, Risto; Ress, David

    2015-01-01

    Three-dimensional interactive virtual environments (VEs) are a powerful tool for brain-imaging based cognitive neuroscience that are presently under-utilized. This paper presents machine-learning based methods for identifying brain states induced by realistic VEs with improved accuracy as well as the capability for mapping their spatial topography on the neocortex. VEs provide the ability to study the brain under conditions closer to the environment in which humans evolved, and thus to probe deeper into the complexities of human cognition. As a test case, we designed a stimulus to reflect a military combat situation in the Middle East, motivated by the potential of using real-time functional magnetic resonance imaging (fMRI) in the treatment of post-traumatic stress disorder. Each subject experienced moving through the virtual town where they encountered 1–6 animated combatants at different locations, while fMRI data was collected. To analyze the data from what is, compared to most studies, more complex and less controlled stimuli, we employed statistical machine learning in the form of Multi-Voxel Pattern Analysis (MVPA) with special attention given to artificial Neural Networks (NN). Extensions to NN that exploit the block structure of the stimulus were developed to improve the accuracy of the classification, achieving performances from 58 to 93% (chance was 16.7%) with six subjects. This demonstrates that MVPA can decode a complex cognitive state, viewing a number of characters, in a dynamic virtual environment. To better understand the source of this information in the brain, a novel form of sensitivity analysis was developed to use NN to quantify the degree to which each voxel contributed to classification. Compared with maps produced by general linear models and the searchlight approach, these sensitivity maps revealed a more diverse pattern of information relevant to the classification of cognitive state. PMID:26106315
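The "novel form of sensitivity analysis" described above quantifies how much each voxel contributes to the network's classification. A common way to do this is to differentiate the class score with respect to the input; the minimal finite-difference sketch below illustrates the idea under that assumption and is not the authors' implementation (the toy classifier and all values are invented):

```python
def voxel_sensitivity(classifier, x, eps=1e-4):
    """Approximate d(score)/d(voxel_i) by central finite differences.

    `classifier` maps a list of voxel values to a scalar class score;
    the returned list has one sensitivity value per voxel.
    """
    sens = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += eps
        xm[i] -= eps
        sens.append((classifier(xp) - classifier(xm)) / (2 * eps))
    return sens

# Toy linear "classifier" standing in for a trained network:
# score = 2*v0 - 1*v1 + 0*v2, so the sensitivities are known exactly.
score = lambda v: 2.0 * v[0] - 1.0 * v[1] + 0.0 * v[2]
s = voxel_sensitivity(score, [0.3, 0.7, 0.1])
```

Collecting these per-voxel values over the cortex yields a sensitivity map analogous to the ones the abstract compares against general linear model and searchlight maps.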

  20. Preferential access to genetic information from endogenous hominin ancient DNA and accurate quantitative SNP-typing via SPEX

    PubMed Central

    Brotherton, Paul; Sanchez, Juan J.; Cooper, Alan; Endicott, Phillip

    2010-01-01

    The analysis of targeted genetic loci from ancient, forensic and clinical samples is usually built upon polymerase chain reaction (PCR)-generated sequence data. However, many studies have shown that PCR amplification from poor-quality DNA templates can create sequence artefacts at significant levels. With hominin (human and other hominid) samples, the pervasive presence of highly PCR-amplifiable human DNA contaminants in the vast majority of samples can lead to the creation of recombinant hybrids and other non-authentic artefacts. The resulting PCR-generated sequences can then be difficult, if not impossible, to authenticate. In contrast, single primer extension (SPEX)-based approaches can genotype single nucleotide polymorphisms from ancient fragments of DNA as accurately as modern DNA. A single SPEX-type assay can amplify just one of the duplex DNA strands at target loci and generate a multi-fold depth-of-coverage, with non-authentic recombinant hybrids reduced to undetectable levels. Crucially, SPEX-type approaches can preferentially access genetic information from damaged and degraded endogenous ancient DNA templates over modern human DNA contaminants. The development of SPEX-type assays offers the potential for highly accurate, quantitative genotyping from ancient hominin samples. PMID:19864251

  1. Sewerage Mapping and Information System of the Metropolis of Tokyo (SEMIS) : Details of the Development and Outline of the System

    NASA Astrophysics Data System (ADS)

    Kawakami, Kouichi; Sekita, Mitsunobu

It is essential to manage sewerage ledgers as information when maintaining and controlling sewerage, one of the infrastructures of cities. The Bureau of Sewerage developed the full-scale Sewerage Mapping and Information System (SEMIS), the first such system developed by a local government in Japan, and has operated it since 1985. Before development, questionnaires were used to survey how staff engaged in sewage works used the sewerage ledgers, and means of improving the preparation of sewerage plans were considered on that basis. Employing these means, the Bureau built a database of the plans and descriptions that comprise the sewerage ledgers, and then constructed a computer system that manages it comprehensively. The details of the development and an outline of the system are described.

  2. SnowyOwl: accurate prediction of fungal genes by using RNA-Seq and homology information to select among ab initio models

    PubMed Central

    2014-01-01

    Background Locating the protein-coding genes in novel genomes is essential to understanding and exploiting the genomic information but it is still difficult to accurately predict all the genes. The recent availability of detailed information about transcript structure from high-throughput sequencing of messenger RNA (RNA-Seq) delineates many expressed genes and promises increased accuracy in gene prediction. Computational gene predictors have been intensively developed for and tested in well-studied animal genomes. Hundreds of fungal genomes are now or will soon be sequenced. The differences of fungal genomes from animal genomes and the phylogenetic sparsity of well-studied fungi call for gene-prediction tools tailored to them. Results SnowyOwl is a new gene prediction pipeline that uses RNA-Seq data to train and provide hints for the generation of Hidden Markov Model (HMM)-based gene predictions and to evaluate the resulting models. The pipeline has been developed and streamlined by comparing its predictions to manually curated gene models in three fungal genomes and validated against the high-quality gene annotation of Neurospora crassa; SnowyOwl predicted N. crassa genes with 83% sensitivity and 65% specificity. SnowyOwl gains sensitivity by repeatedly running the HMM gene predictor Augustus with varied input parameters and selectivity by choosing the models with best homology to known proteins and best agreement with the RNA-Seq data. Conclusions SnowyOwl efficiently uses RNA-Seq data to produce accurate gene models in both well-studied and novel fungal genomes. The source code for the SnowyOwl pipeline (in Python) and a web interface (in PHP) is freely available from http://sourceforge.net/projects/snowyowl/. PMID:24980894
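The 83% sensitivity and 65% specificity figures quoted for SnowyOwl are standard benchmarking quantities. Note that in gene-prediction benchmarking "specificity" conventionally means TP/(TP+FP), i.e. what other fields call precision. A brief sketch of the calculation, with made-up counts (not the paper's data):

```python
def sensitivity(tp, fn):
    """Fraction of real genes (or exons/nucleotides) the predictor found."""
    return tp / (tp + fn)

def gene_prediction_specificity(tp, fp):
    """Gene-prediction 'specificity': TP / (TP + FP), i.e. the fraction of
    predicted genes that are real (precision in other fields)."""
    return tp / (tp + fp)

# Illustrative counts only, chosen to reproduce ratios like the quoted ones:
tp, fp, fn = 830, 447, 170
sn = sensitivity(tp, fn)                   # 0.83
sp = gene_prediction_specificity(tp, fp)   # ~0.65
```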

  3. When the Details Matter – Sensitivities in PRA Calculations That Could Affect Risk-Informed Decision-Making

    SciTech Connect

    Dana L. Kelly; Nathan O. Siu

    2010-06-01

As the U.S. Nuclear Regulatory Commission (NRC) continues its efforts to increase its use of risk information in decision making, the detailed, quantitative results of probabilistic risk assessment (PRA) calculations are coming under increased scrutiny. Where once analysts and users were not overly concerned with figure-of-merit variations of less than an order of magnitude, now factors of two or even less can spark heated debate regarding modeling approaches and assumptions. The philosophical and policy-related aspects of this situation are well recognized by the PRA community. On the other hand, the technical implications for PRA methods and modeling have not been as widely discussed. This paper illustrates the potential numerical effects of choices as to the details of models and methods for parameter estimation with three examples: 1) the selection of the time period of data used for parameter estimation, and issues related to component boundary and failure mode definitions; 2) the selection of alternative diffuse prior distributions, including the constrained noninformative prior distribution, in Bayesian parameter estimation; and 3) the impact of uncertainty in calculations for recovery of offsite power.
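The prior-sensitivity point in example 2 can be made concrete with a small conjugate gamma-Poisson sketch: with sparse failure data, two priors that both look "diffuse" give noticeably different posterior means for a failure rate. The priors and data below are illustrative, not taken from the paper:

```python
def posterior_mean_rate(prior_shape, prior_rate, failures, exposure_hours):
    """Conjugate gamma-Poisson update: the posterior for a Poisson failure
    rate under a Gamma(shape, rate) prior is Gamma(shape + failures,
    rate + exposure), whose mean is returned here."""
    return (prior_shape + failures) / (prior_rate + exposure_hours)

# One failure in 500 hours of operation -- sparse data, so the prior matters.
# Jeffreys-like Gamma(0.5, ~0) prior vs. a Gamma(1, 1) "diffuse" prior:
jeffreys = posterior_mean_rate(0.5, 1e-9, failures=1, exposure_hours=500.0)
diffuse = posterior_mean_rate(1.0, 1.0, failures=1, exposure_hours=500.0)
ratio = jeffreys / diffuse   # the two estimates disagree by roughly 25%
```

A ~25% spread from the prior choice alone is exactly the kind of sub-factor-of-two variation the paper notes can now drive debate in risk-informed decision making.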

  4. Academic detailing.

    PubMed

    Shankar, P R; Jha, N; Piryani, R M; Bajracharya, O; Shrestha, R; Thapa, H S

    2010-01-01

There are a number of sources available to prescribers to stay up to date about medicines. Prescribers in rural areas in developing countries, however, may not be able to access some of them. Interventions to improve prescribing can be educational, managerial, or regulatory, or use a mix of strategies. Detailing by the pharmaceutical industry is widespread. Academic detailing (AD) has been classically seen as a form of continuing medical education in which a trained health professional such as a physician or pharmacist visits physicians in their offices to provide evidence-based information. Face-to-face sessions, preferably on an individual basis, clear educational and behavioural objectives, establishing credibility with respect to objectivity, stimulating physician interaction, use of concise graphic educational materials, highlighting key messages, and, when possible, providing positive reinforcement of improved practices in follow-up visits can increase the success of AD initiatives. AD is common in developed countries and certain examples have been cited in this review. In developing countries the authors have come across reports of AD in Pakistan, Sudan, Argentina and Uruguay, Bihar state in India, Zambia, Cuba, Indonesia and Mexico. AD had a consistent, small but potentially significant impact on prescribing practices. AD has far fewer resources at its command compared to the efforts of the industry. Steps have to be taken to formally start AD in Nepal, and there may be specific hindering factors similar to those in other developing nations.

  5. Center for Information Services, Phase II: Detailed System Design and Programming, Part 7 - Text Processing, Phase IIA Final Report.

    ERIC Educational Resources Information Center

    Silva, Georgette M.

    Libraries, as well as larger information networks, are necessarily based upon the storage of information files consisting in many cases of written materials and texts such as books, serials, abstracts, manuscripts and archives. At the present stage of the "information explosion" no librarian can afford to ignore the contribution of…

  6. Transient Auditory Storage of Acoustic Details Is Associated with Release of Speech from Informational Masking in Reverberant Conditions

    ERIC Educational Resources Information Center

    Huang, Ying; Huang, Qiang; Chen, Xun; Wu, Xihong; Li, Liang

    2009-01-01

    Perceptual integration of the sound directly emanating from the source with reflections needs both temporal storage and correlation computation of acoustic details. We examined whether the temporal storage is frequency dependent and associated with speech unmasking. In Experiment 1, a break in correlation (BIC) between interaurally correlated…

  7. Robust fundamental frequency estimation in sustained vowels: detailed algorithmic comparisons and information fusion with adaptive Kalman filtering.

    PubMed

    Tsanas, Athanasios; Zañartu, Matías; Little, Max A; Fox, Cynthia; Ramig, Lorraine O; Clifford, Gari D

    2014-05-01

There has been consistent interest among speech signal processing researchers in the accurate estimation of the fundamental frequency (F0) of speech signals. This study examines ten F0 estimation algorithms (some well-established and some proposed more recently) to determine which of these algorithms is, on average, better able to estimate F0 in the sustained vowel /a/. Moreover, a robust method for adaptively weighting the estimates of individual F0 estimation algorithms based on quality and performance measures is proposed, using an adaptive Kalman filter (KF) framework. The accuracy of the algorithms is validated using (a) a database of 117 synthetic realistic phonations obtained using a sophisticated physiological model of speech production and (b) a database of 65 recordings of human phonations where the glottal cycles are calculated from electroglottograph signals. On average, the sawtooth waveform inspired pitch estimator and the nearly defect-free algorithms provided the best individual F0 estimates, and the proposed KF approach resulted in a ∼16% improvement in accuracy over the best single F0 estimation algorithm. These findings may be useful in speech signal processing applications where sustained vowels are used to assess vocal quality, when very accurate F0 estimation is required.

  8. Robust fundamental frequency estimation in sustained vowels: Detailed algorithmic comparisons and information fusion with adaptive Kalman filtering

    PubMed Central

    Tsanas, Athanasios; Zañartu, Matías; Little, Max A.; Fox, Cynthia; Ramig, Lorraine O.; Clifford, Gari D.

    2014-01-01

    There has been consistent interest among speech signal processing researchers in the accurate estimation of the fundamental frequency (F0) of speech signals. This study examines ten F0 estimation algorithms (some well-established and some proposed more recently) to determine which of these algorithms is, on average, better able to estimate F0 in the sustained vowel /a/. Moreover, a robust method for adaptively weighting the estimates of individual F0 estimation algorithms based on quality and performance measures is proposed, using an adaptive Kalman filter (KF) framework. The accuracy of the algorithms is validated using (a) a database of 117 synthetic realistic phonations obtained using a sophisticated physiological model of speech production and (b) a database of 65 recordings of human phonations where the glottal cycles are calculated from electroglottograph signals. On average, the sawtooth waveform inspired pitch estimator and the nearly defect-free algorithms provided the best individual F0 estimates, and the proposed KF approach resulted in a ∼16% improvement in accuracy over the best single F0 estimation algorithm. These findings may be useful in speech signal processing applications where sustained vowels are used to assess vocal quality, when very accurate F0 estimation is required. PMID:24815269
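The adaptive weighting idea in the abstract above can be illustrated with a minimal sketch: several noisy F0 estimates fused by inverse-variance weighting, the scalar building block of a Kalman filter measurement update. This is not the authors' implementation (it omits their quality and performance measures), and all numbers are made up:

```python
def fuse_f0(estimates, variances):
    """Precision-weighted (inverse-variance) fusion of several F0 estimates.

    Each algorithm's estimate is weighted by the reciprocal of its error
    variance, so more reliable algorithms dominate the fused value; this is
    the scalar analogue of a Kalman filter update step.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * e for w, e in zip(weights, estimates)) / total
    fused_variance = 1.0 / total  # fused estimate is tighter than any input
    return fused, fused_variance

# Three hypothetical F0 estimators tracking a 120 Hz sustained vowel;
# the third is far noisier, so it barely moves the fused estimate.
est = [119.0, 121.0, 130.0]   # Hz
var = [1.0, 1.0, 25.0]
f0, f0_var = fuse_f0(est, var)
```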

  9. The role of cognitive switching in head-up displays. [to determine pilot ability to accurately extract information from either of two sources

    NASA Technical Reports Server (NTRS)

    Fischer, E.

    1979-01-01

    The pilot's ability to accurately extract information from either one or both of two superimposed sources of information was determined. Static, aerial, color 35 mm slides of external runway environments and slides of corresponding static head-up display (HUD) symbology were used as the sources. A three channel tachistoscope was utilized to show either the HUD alone, the scene alone, or the two slides superimposed. Cognitive performance of the pilots was assessed by determining the percentage of correct answers given to two HUD related questions, two scene related questions, or one HUD and one scene related question.

  10. Dynamism & Detail

    ERIC Educational Resources Information Center

    Flannery, Maura C.

    2004-01-01

    New material discovered in the study of cell research is presented for the benefit of biology teachers. Huge amounts of data are being generated in fields like cellular dynamics, and it is felt that people's understanding of the cell is becoming much more complex and detailed.

  11. The Lunar Laser Ranging Experiment: Accurate ranges have given a large improvement in the lunar orbit and new selenophysical information.

    PubMed

    Bender, P L; Currie, D G; Poultney, S K; Alley, C O; Dicke, R H; Wilkinson, D T; Eckhardt, D H; Faller, J E; Kaula, W M; Mulholland, J D; Plotkin, H H; Silverberg, E C; Williams, J G

    1973-10-19

    The lunar ranging measurements now being made at the McDonald Observatory have an accuracy of 1 nsec in round-trip travel time. This corresponds to 15 cm in the one-way distance. The use of lasers with pulse-lengths of less than 1 nsec is expected to give an accuracy of 2 to 3 cm in the next few years. A new station is under construction in Hawaii, and additional stations in other countries are either in operation or under development. It is hoped that these stations will form the basis for a worldwide network to determine polar motion and earth rotation on a regular basis, and will assist in providing information about movement of the tectonic plates making up the earth's surface. Several mobile lunar ranging stations with telescopes having diameters of 1.0 m or less could, in the future, greatly extend the information obtainable about motions within and between the tectonic plates. The data obtained so far by the McDonald Observatory have been used to generate a new lunar ephemeris based on direct numerical integration of the equations of motion for the moon and planets. With this ephemeris, the range to the three Apollo retro-reflectors can be fit to an accuracy of 5 m by adjusting the differences in moments of inertia of the moon about its principal axes, the selenocentric coordinates of the reflectors, and the McDonald longitude. The accuracy of fitting the results is limited currently by errors of the order of an arc second in the angular orientation of the moon, as derived from the best available theory of how the moon rotates in response to the torques acting on it. Both a new calculation of the moon's orientation as a function of time based on direct numerical integration of the torque equations and a new analytic theory of the moon's orientation are expected to be available soon, and to improve considerably the accuracy of fitting the data. The accuracy already achieved routinely in lunar laser ranging represents a hundredfold improvement over any
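The precision figures quoted in this abstract (1 nsec round-trip travel time corresponding to 15 cm one-way distance, and sub-nanosecond pulses giving 2 to 3 cm) follow directly from the speed of light; a short check of the arithmetic:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def one_way_range_error(round_trip_timing_error_s):
    """Convert a round-trip timing error into a one-way range error.
    The factor of 2 accounts for the laser pulse travelling out and back."""
    return C * round_trip_timing_error_s / 2.0

err_1ns = one_way_range_error(1e-9)      # ~0.15 m, as the abstract states
err_sub_ns = one_way_range_error(2e-10)  # a 0.2 ns error gives ~3 cm
```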

  12. Crowdsourcing detailed flood data

    NASA Astrophysics Data System (ADS)

Walliman, Nicholas; Ogden, Ray; Amouzad, Shahrzhad

    2015-04-01

    Over the last decade the average annual loss across the European Union due to flooding has been 4.5bn Euros, but increasingly intense rainfall, as well as population growth, urbanisation and the rising costs of asset replacements, may see this rise to 23bn Euros a year by 2050. Equally disturbing are the profound social costs to individuals, families and communities which in addition to loss of lives include: loss of livelihoods, decreased purchasing and production power, relocation and migration, adverse psychosocial effects, and hindrance of economic growth and development. Flood prediction, management and defence strategies rely on the availability of accurate information and flood modelling. Whilst automated data gathering (by measurement and satellite) of the extent of flooding is already advanced it is least reliable in urban and physically complex geographies where often the need for precise estimation is most acute. Crowdsourced data of actual flood events is a potentially critical component of this allowing improved accuracy in situations and identifying the effects of local landscape and topography where the height of a simple kerb, or discontinuity in a boundary wall can have profound importance. Mobile 'App' based data acquisition using crowdsourcing in critical areas can combine camera records with GPS positional data and time, as well as descriptive data relating to the event. This will automatically produce a dataset, managed in ArcView GIS, with the potential for follow up calls to get more information through structured scripts for each strand. Through this local residents can provide highly detailed information that can be reflected in sophisticated flood protection models and be core to framing urban resilience strategies and optimising the effectiveness of investment. 
This paper will describe this pioneering approach, which will develop flood event data in support of systems that advance existing approaches such as those developed in the UK.
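The record structure implied by the paragraph above (camera record, GPS position, time, and descriptive data) might be sketched as follows; the field names and values are invented for illustration and are not from any real app or the ArcView GIS schema:

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class FloodReport:
    """One crowdsourced flood observation: a photo tagged with position,
    time, and a free-text description. All field names are hypothetical."""
    latitude: float
    longitude: float
    timestamp_utc: str                     # ISO 8601, from the phone clock
    photo_filename: str
    water_depth_cm: Optional[float] = None # from a follow-up scripted call
    description: str = ""

report = FloodReport(51.752, -1.258, "2014-02-07T09:30:00Z",
                     "kerb_overtopped.jpg", water_depth_cm=12.0,
                     description="Water overtopping kerb on north side")
record = asdict(report)  # plain dict, ready for upload / GIS ingestion
```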

  13. Argon Cluster Sputtering Source for ToF-SIMS Depth Profiling of Insulating Materials: High Sputter Rate and Accurate Interfacial Information.

    PubMed

    Wang, Zhaoying; Liu, Bingwen; Zhao, Evan W; Jin, Ke; Du, Yingge; Neeway, James J; Ryan, Joseph V; Hu, Dehong; Zhang, Kelvin H L; Hong, Mina; Le Guernic, Solenne; Thevuthasan, Suntharampilai; Wang, Fuyi; Zhu, Zihua

    2015-08-01

    The use of an argon cluster ion sputtering source has been demonstrated to perform superiorly relative to traditional oxygen and cesium ion sputtering sources for ToF-SIMS depth profiling of insulating materials. The superior performance has been attributed to effective alleviation of surface charging. A simulated nuclear waste glass (SON68) and layered hole-perovskite oxide thin films were selected as model systems because of their fundamental and practical significance. Our results show that high sputter rates and accurate interfacial information can be achieved simultaneously for argon cluster sputtering, whereas this is not the case for cesium and oxygen sputtering. Therefore, the implementation of an argon cluster sputtering source can significantly improve the analysis efficiency of insulating materials and, thus, can expand its applications to the study of glass corrosion, perovskite oxide thin film characterization, and many other systems of interest.

  14. IrisPlex: a sensitive DNA tool for accurate prediction of blue and brown eye colour in the absence of ancestry information.

    PubMed

    Walsh, Susan; Liu, Fan; Ballantyne, Kaye N; van Oven, Mannis; Lao, Oscar; Kayser, Manfred

    2011-06-01

    A new era of 'DNA intelligence' is arriving in forensic biology, due to the impending ability to predict externally visible characteristics (EVCs) from biological material such as those found at crime scenes. EVC prediction from forensic samples, or from body parts, is expected to help concentrate police investigations towards finding unknown individuals, at times when conventional DNA profiling fails to provide informative leads. Here we present a robust and sensitive tool, termed IrisPlex, for the accurate prediction of blue and brown eye colour from DNA in future forensic applications. We used the six currently most eye colour-informative single nucleotide polymorphisms (SNPs) that previously revealed prevalence-adjusted prediction accuracies of over 90% for blue and brown eye colour in 6168 Dutch Europeans. The single multiplex assay, based on SNaPshot chemistry and capillary electrophoresis, both widely used in forensic laboratories, displays high levels of genotyping sensitivity with complete profiles generated from as little as 31pg of DNA, approximately six human diploid cell equivalents. We also present a prediction model to correctly classify an individual's eye colour, via probability estimation solely based on DNA data, and illustrate the accuracy of the developed prediction test on 40 individuals from various geographic origins. Moreover, we obtained insights into the worldwide allele distribution of these six SNPs using the HGDP-CEPH samples of 51 populations. Eye colour prediction analyses from HGDP-CEPH samples provide evidence that the test and model presented here perform reliably without prior ancestry information, although future worldwide genotype and phenotype data shall confirm this notion. 
As our IrisPlex eye colour prediction test is capable of immediate implementation in forensic casework, it represents one of the first steps forward in the creation of a fully individualised EVC prediction system for future use in forensic DNA intelligence.
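The "probability estimation solely based on DNA data" in the IrisPlex work is a multinomial logistic model over the six SNP genotypes. The schematic below shows the shape of such a three-class model with placeholder coefficients (the published IrisPlex parameters are not reproduced here; genotype vector and betas are invented):

```python
import math

def eye_colour_probabilities(minor_allele_counts, betas_blue, betas_other,
                             intercept_blue, intercept_other):
    """Three-class multinomial logistic model (blue / intermediate / brown),
    with brown as the reference class. Inputs are minor-allele counts
    (0, 1, or 2) at each of six SNPs; coefficients are placeholders."""
    z_blue = intercept_blue + sum(
        b * g for b, g in zip(betas_blue, minor_allele_counts))
    z_other = intercept_other + sum(
        b * g for b, g in zip(betas_other, minor_allele_counts))
    denom = 1.0 + math.exp(z_blue) + math.exp(z_other)
    p_blue = math.exp(z_blue) / denom
    p_other = math.exp(z_other) / denom
    return {"blue": p_blue, "intermediate": p_other,
            "brown": 1.0 - p_blue - p_other}

# Hypothetical coefficients for six SNPs and one genotype vector:
genotype = [2, 0, 1, 0, 0, 1]   # minor-allele counts at six loci
p = eye_colour_probabilities(genotype,
                             [1.8, 0.4, 0.3, 0.2, 0.1, 0.3],
                             [0.9, 0.2, 0.1, 0.1, 0.0, 0.1],
                             -2.0, -1.5)
```

In practice the class with the highest probability (optionally above a calling threshold) is reported as the predicted eye colour.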

  15. 10. DETAIL, CAB SIDE. DETAIL, END OF BOOM. DETAIL, LOWER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. DETAIL, CAB SIDE. DETAIL, END OF BOOM. DETAIL, LOWER PART OF TOWER, SHOWING METAL WHEELS AND CABLE SPOOLS. DETAIL, LOOKING UP AT THE UNDERSIDE OF THE REVOLVING PLATFORM ATOP THE TOWER. - United Engineering Company Shipyard, Crane, 2900 Main Street, Alameda, Alameda County, CA

  16. Establishing an Appropriate Level of Detail (LoD) for a Building Information Model (BIM) - West Block, Parliament Hill, Ottawa, Canada

    NASA Astrophysics Data System (ADS)

    Fai, S.; Rafeiro, J.

    2014-05-01

In 2011, Public Works and Government Services Canada (PWGSC) embarked on a comprehensive rehabilitation of the historically significant West Block of Canada's Parliament Hill. With over 17 thousand square meters of floor space, the West Block is one of the largest projects of its kind in the world. As part of the rehabilitation, PWGSC is working with the Carleton Immersive Media Studio (CIMS) to develop a building information model (BIM) that can serve as a maintenance and life-cycle management tool once construction is completed. The scale and complexity of the model have presented many challenges. One of these challenges is determining appropriate levels of detail (LoD). While still a matter of debate in the development of international BIM standards, LoD is further complicated in the context of heritage buildings because we must reconcile the LoD of the BIM with that used in the documentation process (terrestrial laser scan and photogrammetric survey data). In this paper, we will discuss our work to date on establishing appropriate LoD within the West Block BIM that will best serve the end use. To facilitate this, we have developed a single parametric model for gothic pointed arches that can be used for over seventy-five unique window types present in the West Block. Using the AEC (CAN) BIM as a reference, we have developed a workflow to test each of these window types at three distinct levels of detail. We have found that the parametric Gothic arch significantly reduces the amount of time necessary to develop scenarios to test appropriate LoD.

  17. Accurate quantum chemical calculations

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.

    1989-01-01

    An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics are discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.

  18. A Homegrown Design for Data Warehousing: A District Customizes Its Own Process for Generating Detailed Information about Students in Real Time

    ERIC Educational Resources Information Center

    Thompson, Terry J.; Gould, Karen J.

    2005-01-01

    In recent years the Metropolitan School District of Wayne Township in Indianapolis has been awash in data. In attempts to improve levels of student achievement, the authors collected all manner of statistical details about students and schools and attempted to perform data analysis as part of the school improvement process. The authors were never…

  19. Detailed cross sections of the Eocene Green River Formation along the north and east margins of the Piceance Basin, western Colorado, using measured sections and drill hole information

    USGS Publications Warehouse

    Johnson, Ronald C.

    2014-01-01

This report presents two detailed cross sections of the Eocene Green River Formation in the Piceance Basin, northwestern Colorado, constructed from eight detailed measured sections, fourteen core holes, and two rotary holes. The Eocene Green River Formation in the Piceance Basin contains the world’s largest known oil shale deposit, with more than 1.5 trillion barrels of oil in place. It was deposited in Lake Uinta, a long-lived saline lake that once covered much of the Piceance Basin and the Uinta Basin to the west. The cross sections extend across the northern and eastern margins of the Piceance Basin and are intended to aid in correlating surface sections with the subsurface of the basin.

  20. Detailed design package for design of a video system providing optimal visual information for controlling payload and experiment operations with television

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A detailed description of a video system for controlling space shuttle payloads and experiments is presented in the preliminary design review and critical design review, first and second engineering design reports respectively, and in the final report submitted jointly with the design package. The material contained in the four subsequent sections of the package contains system descriptions, design data, and specifications for the recommended 2-view system. Section 2 contains diagrams relating to the simulation test configuration of the 2-view system. Section 3 contains descriptions and drawings of the deliverable breadboard equipment. A description of the recommended system is contained in Section 4 with equipment specifications in Section 5.

  1. Lost in translation: preclinical studies on 3,4-methylenedioxymethamphetamine provide information on mechanisms of action, but do not allow accurate prediction of adverse events in humans

    PubMed Central

    Green, AR; King, MV; Shortall, SE; Fone, KCF

    2012-01-01

3,4-Methylenedioxymethamphetamine (MDMA) induces both acute adverse effects and long-term neurotoxic loss of brain 5-HT neurones in laboratory animals. However, when choosing doses, most preclinical studies have paid little attention to the pharmacokinetics of the drug in humans or animals. The recreational use of MDMA and current clinical investigations of the drug for therapeutic purposes demand better translational pharmacology to allow accurate risk assessment of its ability to induce adverse events. Recent pharmacokinetic studies on MDMA in animals and humans are reviewed and indicate that the risks following MDMA ingestion should be re-evaluated. Acute behavioural and body temperature changes result from rapid MDMA-induced monoamine release, whereas long-term neurotoxicity is primarily caused by metabolites of the drug. Therefore, acute physiological changes in humans are fairly accurately mimicked in animals by appropriate dosing, although allometric dosing calculations have little value. Long-term changes require MDMA to be metabolized in a similar manner in experimental animals and humans. However, the rate of metabolism of MDMA and its major metabolites is slower in humans than in rats or monkeys, potentially allowing endogenous neuroprotective mechanisms to function in a species-specific manner. Furthermore, acute hyperthermia in humans probably limits the chance of recreational users ingesting sufficient MDMA to produce neurotoxicity, unlike in the rat. MDMA also inhibits the major enzyme responsible for its metabolism in humans, thereby also helping to prevent neurotoxicity. These observations question whether MDMA alone produces long-term 5-HT neurotoxicity in human brain, although when taken in combination with other recreational drugs it may induce neurotoxicity. LINKED ARTICLES This article is commented on by Parrott, pp. 1518–1520 of this issue. To view this commentary visit http://dx.doi.org/10.1111/j.1476-5381.2012.01941.x and to view the

  2. Student Information Systems Demystified: The Increasing Demand for Accurate, Timely Data Means Schools and Districts Are Relying Heavily on SIS Technologies

    ERIC Educational Resources Information Center

    McIntire, Todd

    2004-01-01

    Student information systems, one of the first applications of computer technology in education, are undergoing a significant transition yet again. The first major shift in SIS technologies occurred about 15 years ago when they evolved from mainframe programs to client-server solutions. Now, vendors across the board are offering centralized…

  3. Effect of detailed information in the minority game: optimality of 2-day memory and enhanced efficiency due to random exogenous data

    NASA Astrophysics Data System (ADS)

    Sasidevan, V.

    2016-07-01

In the minority game (MG), an odd number of heterogeneous and adaptive agents choose between two alternatives, and those who end up on the minority side win. When the information available to the agents to make their choice is the identity of the minority side for the past m days, it is well known that the emergent coordination among the agents is maximal when m ∼ log₂(N). The optimal memory length thus increases with the system size. In this work we show that, in MG, when the information available to the agents is the strength of the minority side for the past m days, the optimal memory length for the agents is always two (m = 2) for large enough system sizes. The system is inefficient for m = 1 and converges to random-choice behaviour for m > 2 for large N. Surprisingly, providing the agents with uniformly and randomly sampled m = 1 exogenous information results in an increase in coordination between them compared to the case of endogenous information with any value of m. This is in stark contrast to the conventional MG, where the agents' coordination is invariant or gets worse with respect to such random exogenous information.
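For readers unfamiliar with the model, a minimal simulation of the conventional MG (minority-identity information, not the minority-strength variant studied in this paper; all parameter values below are arbitrary) can be sketched as:

```python
import random

def minority_game(n_agents=101, memory=2, n_strategies=2, rounds=500, seed=1):
    """Conventional minority game: each agent owns a few fixed random
    strategies (lookup tables from the last `memory` outcomes to a choice)
    and always plays the one with the best virtual score so far.
    Returns the variance of attendance; lower variance = better coordination."""
    rng = random.Random(seed)
    n_hist = 2 ** memory
    strategies = [[tuple(rng.randint(0, 1) for _ in range(n_hist))
                   for _ in range(n_strategies)] for _ in range(n_agents)]
    scores = [[0] * n_strategies for _ in range(n_agents)]
    history = rng.randrange(n_hist)
    attendance = []
    for _ in range(rounds):
        choices = [strategies[a][max(range(n_strategies),
                                     key=lambda s: scores[a][s])][history]
                   for a in range(n_agents)]
        ones = sum(choices)
        minority = 1 if ones < n_agents - ones else 0
        for a in range(n_agents):
            for s in range(n_strategies):
                scores[a][s] += strategies[a][s][history] == minority
        attendance.append(ones)
        history = ((history << 1) | minority) & (n_hist - 1)  # keep last m bits
    mean = sum(attendance) / rounds
    return sum((x - mean) ** 2 for x in attendance) / rounds
```

For purely random choices the attendance variance would be N/4; adaptive agents can coordinate better (or worse) depending on how m compares with log₂(N).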

  4. Detailed requirements document for Stowage List and Hardware Tracking System (SLAHTS). [computer based information management system in support of space shuttle orbiter stowage configuration

    NASA Technical Reports Server (NTRS)

    Keltner, D. J.

    1975-01-01

The Stowage List and Hardware Tracking System, a computer-based information management system used in support of the space shuttle orbiter stowage configuration and Johnson Space Center hardware tracking, is described. The input, processing, and output requirements that serve as a baseline for system development are defined.

  5. Systematic assessment of coordinated activity cliffs formed by kinase inhibitors and detailed characterization of activity cliff clusters and associated SAR information.

    PubMed

    Dimova, Dilyana; Stumpfe, Dagmar; Bajorath, Jürgen

    2015-01-27

    From currently available kinase inhibitors and their activity data, clusters of coordinated activity cliffs were systematically derived and subjected to cluster index and index map analysis. Type I-like inhibitors with well-defined IC50 measurements were found to provide a large knowledge base of activity cliff clusters for 266 targets from nine kinase groups. On the basis of index map analysis, these clusters were systematically organized according to structural similarity of inhibitors and activity cliff diversity and prioritized for structure-activity relationship (SAR) analysis. From prioritized clusters, interpretable SAR information can be extracted. It is also shown that activity cliff clusters formed by ATP site-directed inhibitors often represent local SAR environments of rather different complexity and interpretability. In addition, activity cliff clusters including promiscuous kinase inhibitors have been determined. Only a small subset of inhibitors was found to change activity cliff roles in different clusters. The activity cliff clusters described herein and their index map organization substantially enrich SAR information associated with kinase inhibitors in compound subsets of limited size. The cluster and index map information is made available upon request to provide opportunities for further SAR exploration. On the basis of our analysis and the data provided, activity cliff clusters and corresponding inhibitor series for kinase targets of interest can be readily selected.
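As background, an activity cliff is commonly operationalized as a pair of structurally similar compounds with a large potency difference. The toy sketch below uses that definition; the feature-set fingerprints, names, and cutoff values are invented for illustration (real analyses typically use hashed structural fingerprints and curated IC50 data):

```python
from itertools import combinations

def tanimoto(fp_a: frozenset, fp_b: frozenset) -> float:
    """Tanimoto coefficient on fingerprints stored as feature sets."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

def activity_cliffs(compounds, sim_cutoff=0.7, pot_diff=2.0):
    """Pairs that are structurally similar (Tanimoto >= sim_cutoff) yet
    differ by >= pot_diff log units in potency (pIC50)."""
    return [(na, nb)
            for (na, fa, pa), (nb, fb, pb) in combinations(compounds, 2)
            if tanimoto(fa, fb) >= sim_cutoff and abs(pa - pb) >= pot_diff]

# toy data: (name, fingerprint-as-feature-set, pIC50)
data = [("A", frozenset({1, 2, 3, 4, 5}), 8.5),
        ("B", frozenset({1, 2, 3, 4, 5, 6}), 5.9),  # similar to A, 2.6 log units apart
        ("C", frozenset({7, 8, 9}), 8.4)]           # potent but dissimilar to A
print(activity_cliffs(data))  # [('A', 'B')]
```

Clusters of coordinated cliffs, as analyzed in the paper, then group such pairs that share compounds.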

  6. We Built This House; It's Time to Move in: Leveraging Existing DICOM Structure to More Completely Utilize Readily Available Detailed Contrast Administration Information.

    PubMed

    Hirsch, Jeffrey D; Siegel, Eliot L; Balasubramanian, Sridhar; Wang, Kenneth C

    2015-08-01

    The Digital Imaging and Communications in Medicine (DICOM) standard is the universal format for interoperability in medical imaging. In addition to imaging data, DICOM has evolved to support a wide range of imaging metadata including contrast administration data that is readily available from many modern contrast injectors. Contrast agent, route of administration, start and stop time, volume, flow rate, and duration can be recorded using DICOM attributes [1]. While this information is sparsely and inconsistently recorded in routine clinical practice, it could potentially be of significant diagnostic value. This work will describe parameters recorded by automatic contrast injectors, summarize the DICOM mechanisms available for tracking contrast injection data, and discuss the role of such data in clinical radiology.

  7. Detailed mechanism of benzene oxidation

    NASA Technical Reports Server (NTRS)

    Bittker, David A.

    1987-01-01

A detailed quantitative mechanism for the oxidation of benzene in both argon- and nitrogen-diluted systems is presented. Computed ignition delay times for argon-diluted mixtures are in satisfactory agreement with experimental results for a wide range of initial conditions. An experimental temperature-versus-time profile for a nitrogen-diluted oxidation was accurately matched, and several concentration profiles were matched qualitatively. Application of sensitivity analysis has given approximate rate-constant expressions for the two dominant heat-release reactions, the oxidation of C6H5 and C5H5 radicals by molecular oxygen.

  8. Horizontal Cross Bracing Detail, Vertical Cross Bracing Detail, Horizontal Cross ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Horizontal Cross Bracing Detail, Vertical Cross Bracing Detail, Horizontal Cross Bracing Detail, Vertical Cross Bracing-End Detail - Cumberland Covered Bridge, Spanning Mississinewa River, Matthews, Grant County, IN

  9. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single-step explicit methods; they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high-order and high-resolution algorithms can produce accurate results after O(10^6) periods of propagation with eight grid points per wavelength.
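The paper's space-time coupled schemes are not reproduced here, but the notion of formal order of accuracy they share can be illustrated with a textbook fourth-order central difference (not one of the paper's algorithms): halving the step size should shrink the error by about 2⁴ = 16.

```python
import math

def d1_central_4th(f, x, h):
    """Fourth-order-accurate central difference approximation to f'(x)."""
    return (-f(x + 2*h) + 8*f(x + h) - 8*f(x - h) + f(x - 2*h)) / (12*h)

# halving h shrinks the error by about 2**4 = 16, confirming O(h**4)
e1 = abs(d1_central_4th(math.sin, 1.0, 1e-2) - math.cos(1.0))
e2 = abs(d1_central_4th(math.sin, 1.0, 5e-3) - math.cos(1.0))
print(e1 / e2)  # close to 16
```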

  10. Accurate monotone cubic interpolation

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1991-01-01

Monotone piecewise cubic interpolants are simple and effective. They are generally third-order accurate, except near strict local extrema, where accuracy degenerates to second order because of the monotonicity constraint. Algorithms for piecewise cubic interpolants that preserve monotonicity as well as uniform third- and fourth-order accuracy are presented. The gain in accuracy is obtained by relaxing the monotonicity constraint in a geometric framework in which the median function plays a crucial role.
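The paper's median-based algorithms are more elaborate, but the baseline they improve on can be sketched with a standard Fritsch-Carlson-style slope limiter for a cubic Hermite interpolant; this simple limiter is exactly the kind that drops to second order near extrema (the data points below are invented):

```python
def monotone_slopes(x, y):
    """Limited derivative estimates for a monotone cubic Hermite interpolant
    (Fritsch-Carlson style: zero the slope wherever the secants change sign)."""
    n = len(x)
    s = [(y[i+1] - y[i]) / (x[i+1] - x[i]) for i in range(n - 1)]
    d = [0.0] * n
    d[0], d[-1] = s[0], s[-1]
    for i in range(1, n - 1):
        if s[i-1] * s[i] > 0:
            # harmonic mean keeps the slope bounded by the adjacent secants
            d[i] = 2.0 * s[i-1] * s[i] / (s[i-1] + s[i])
    return d

def hermite_eval(x, y, d, t):
    """Evaluate the piecewise cubic Hermite interpolant at t."""
    i = 0
    while i < len(x) - 2 and t > x[i+1]:
        i += 1
    h = x[i+1] - x[i]
    u = (t - x[i]) / h
    h00 = (1 + 2*u) * (1 - u)**2
    h10 = u * (1 - u)**2
    h01 = u*u * (3 - 2*u)
    h11 = u*u * (u - 1)
    return h00*y[i] + h10*h*d[i] + h01*y[i+1] + h11*h*d[i+1]

xs = [0.0, 1.0, 2.0, 3.0]
ys = [0.0, 1.0, 1.0, 2.0]          # data with a flat stretch
ds = monotone_slopes(xs, ys)
vals = [hermite_eval(xs, ys, ds, k / 10) for k in range(31)]
print(all(vals[k+1] >= vals[k] - 1e-12 for k in range(30)))  # True: no overshoot
```

Note how the limiter zeroes the slopes at both ends of the flat segment, which preserves monotonicity but costs accuracy there; relaxing that constraint geometrically is the paper's contribution.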

  11. Chord Splicing & Joining Detail; Chord & CrossBracing Joint Details; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Chord Splicing & Joining Detail; Chord & Cross-Bracing Joint Details; Cross Bracing Center Joint Detail; Chord & Diagonal Joint Detail - Vermont Covered Bridge, Highland Park, spanning Kokomo Creek at West end of Deffenbaugh Street (moved to), Kokomo, Howard County, IN

  12. Arch & Chord Joint Detail; Crossbracing Center Joint Detail; Chord, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Arch & Chord Joint Detail; Crossbracing Center Joint Detail; Chord, Panel Post, Tie & Diagonal Brace Joint Detail; Chord, Panel Post, Tie & Crossbracing Joint Detail - Dunlapsville Covered Bridge, Spanning East Fork Whitewater River, Dunlapsville, Union County, IN

  13. LF460 detail design

    NASA Technical Reports Server (NTRS)

    1971-01-01

This is the final technical report documenting the detail design of the LF460, an advanced turbotip lift fan intended for application with the YJ97-GE-100 turbojet gas generator to a V/STOL transport research aircraft. The primary objective of the design was to achieve a low noise level while maintaining the high thrust/weight ratio capability of a high-pressure-ratio lift fan. The report covers design requirements and summarizes activities and final results in the areas of aerodynamic and mechanical design, component and system performance, acoustic features, and final noise predictions.

  14. Details of meiosis

    SciTech Connect

    1993-12-31

Chapter 18 discusses the details of meiosis, beginning with the structure and number of chiasmata, the cytological term for the points at which two homologous chromosomes forming a bivalent, having begun to repel each other, remain held together by crossing-over. The synaptonemal complex, which consists of two lateral elements containing protein and RNA, is also discussed. The chapter concludes with a description of meiosis in polyploids, human meiosis, and the behavior of X and Y chromosomes. 28 refs., 8 figs.

  15. BIOACCESSIBILITY TESTS ACCURATELY ESTIMATE ...

    EPA Pesticide Factsheets

Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from five Pb-contaminated Superfund sites had relative bioavailabilities from 33% to 63%, with a mean of about 50%. Treatment of two of the soils with P significantly reduced the bioavailability of Pb. The bioaccessibility of the Pb in the test soils was then measured in six in vitro tests and regressed on bioavailability. The six tests were: the “Relative Bioavailability Leaching Procedure” (RBALP) at pH 1.5, the same test conducted at pH 2.5, the “Ohio State University In vitro Gastrointestinal” method (OSU IVG), the “Urban Soil Bioaccessible Lead Test”, the modified “Physiologically Based Extraction Test”, and the “Waterfowl Physiologically Based Extraction Test”. All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the RBALP pH 2.5 and OSU IVG tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Ad
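The in vitro/in vivo comparisons described above are ordinary least-squares regressions of bioavailability on bioaccessibility; a minimal sketch (the paired values below are invented, not the study's data):

```python
def linear_fit(x, y):
    """Ordinary least-squares intercept and slope for y ~ a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# hypothetical in vitro bioaccessibility (%) vs in vivo relative bioavailability (%)
bioaccessible = [20.0, 35.0, 50.0, 65.0, 80.0]
bioavailable = [18.0, 31.5, 45.0, 58.5, 72.0]
a, b = linear_fit(bioaccessible, bioavailable)
print(b > 0)  # True: a positive slope, as all six regressions above showed
```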

  16. Fabricating an Accurate Implant Master Cast: A Technique Report.

    PubMed

    Balshi, Thomas J; Wolfinger, Glenn J; Alfano, Stephen G; Cacovean, Jeannine N; Balshi, Stephen F

    2015-12-01

    The technique for fabricating an accurate implant master cast following the 12-week healing period after Teeth in a Day® dental implant surgery is detailed. The clinical, functional, and esthetic details captured during the final master impression are vital to creating an accurate master cast. This technique uses the properties of the all-acrylic resin interim prosthesis to capture these details. This impression captures the relationship between the remodeled soft tissue and the interim prosthesis. This provides the laboratory technician with an accurate orientation of the implant replicas in the master cast with which a passive fitting restoration can be fabricated.

  17. Accurate spectral color measurements

    NASA Astrophysics Data System (ADS)

    Hiltunen, Jouni; Jaeaeskelaeinen, Timo; Parkkinen, Jussi P. S.

    1999-08-01

Surface color measurement is of importance in a very wide range of industrial applications including paint, paper, printing, photography, textiles, plastics and so on. For demanding color measurements a spectral approach is often needed. One can measure a color spectrum with a spectrophotometer using calibrated standard samples as a reference. Because it is impossible to define absolute color values of a sample, we always work with approximations. The human eye can perceive color differences as small as 0.5 CIELAB units and thus distinguish millions of colors. This 0.5-unit difference should be the goal for precise color measurements. This limit is not a problem if we only want to measure the color difference between two samples, but if we also want exact color coordinate values, accuracy problems arise. The values given by two instruments can be astonishingly different. The accuracy of an instrument used in color measurement may depend on various errors such as photometric non-linearity, wavelength error, integrating sphere dark level error, and integrating sphere error in both specular-included and specular-excluded modes. Correction formulas should therefore be used to get more accurate results. Another question is how many channels, i.e. wavelengths, we use to measure a spectrum. It is obvious that the sampling interval should be short to get more precise results. Furthermore, the result we get is always a compromise of measuring time, conditions and cost. Sometimes we have to use a portable system, or the shape and size of the samples makes it impossible to use sensitive equipment. In this study a small set of calibrated color tiles measured with the Perkin Elmer Lambda 18 and the Minolta CM-2002 spectrophotometers are compared. In the paper we explain the typical error sources of spectral color measurements and show what accuracy demands a good colorimeter should meet.
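The 0.5-unit CIELAB goal mentioned above refers to the CIE76 color difference, which is simply Euclidean distance in L*a*b* space; a minimal sketch (the tile readings are invented):

```python
from math import sqrt

def delta_e_ab(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB coordinates."""
    return sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

# two instruments reading the same tile (made-up L*, a*, b* values)
inst_a = (52.1, 14.3, -8.0)
inst_b = (52.4, 14.7, -8.3)
print(round(delta_e_ab(inst_a, inst_b), 2))  # 0.58, near the perceptibility limit
```

Later refinements (CIE94, CIEDE2000) weight the lightness and chroma axes differently, but CIE76 is the distance the 0.5-unit rule of thumb refers to.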

  18. Detail of Triton

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This color photo of Neptune's large satellite Triton was obtained on Aug. 24 1989 at a range of 530,000 kilometers (330,000 miles). The resolution is about 10 kilometers (6.2 miles), sufficient to begin to show topographic detail. The image was made from pictures taken through the green, violet and ultraviolet filters. In this technique, regions that are highly reflective in the ultraviolet appear blue in color. In reality, there is no part of Triton that would appear blue to the eye. The bright southern hemisphere of Triton, which fills most of this frame, is generally pink in tone as is the even brighter equatorial band. The darker regions north of the equator also tend to be pink or reddish in color. JPL manages the Voyager project for NASA's Office of Space Science, Washington, DC.

  19. Detail of Triton's Surface

    NASA Technical Reports Server (NTRS)

    1989-01-01

    This color photo of Neptune's large satellite Triton was obtained on Aug. 24 1989 at a range of 530,000 kilometers(330,000 miles). The resolution is about 10 kilometers (6.2 miles), sufficient to begin to show topographic detail. The image was made from pictures taken through the green, violet and ultraviolet filters. In this technique, regions that are highly reflective in the ultraviolet appear blue in color. In reality, there is no part of Triton that would appear blue to the eye. The bright southern hemisphere of Triton, which fills most of this frame, is generally pink in tone as is the even brighter equatorial band. The darker regions north of the equator also tend to be pink or reddish in color.

    JPL manages the Voyager project for NASA's Office of Space Science.

  20. Horizontal Cross Bracing Detail, Vertical Cross Bracing Detail, Horizontal Cross ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Horizontal Cross Bracing Detail, Vertical Cross Bracing Detail, Horizontal Cross Bracing Joint, Vertical Cross Bracing End Detail - Ceylon Covered Bridge, Limberlost Park, spanning Wabash River at County Road 900 South, Geneva, Adams County, IN

  1. Southeast Elevation; Dome Rafter Detail; Piazza Rafter Detail; Main Block ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Southeast Elevation; Dome Rafter Detail; Piazza Rafter Detail; Main Block Bracket Detail - National Home for Disabled Volunteer Soldiers - Battle Mountain Sanitarium, Administration Building, 500 North Fifth Street, Hot Springs, Fall River County, SD

  2. double hung window details, hall window details, entrance door profiles ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    double hung window details, hall window details, entrance door profiles - Chopawamsic Recreational Demonstration Area - Cabin Camp 1, Help's Quarters, Prince William Forest Park, Triangle, Prince William County, VA

  3. On accurate determination of contact angle

    NASA Technical Reports Server (NTRS)

    Concus, P.; Finn, R.

    1992-01-01

    Methods are proposed that exploit a microgravity environment to obtain highly accurate measurement of contact angle. These methods, which are based on our earlier mathematical results, do not require detailed measurement of a liquid free-surface, as they incorporate discontinuous or nearly-discontinuous behavior of the liquid bulk in certain container geometries. Physical testing is planned in the forthcoming IML-2 space flight and in related preparatory ground-based experiments.

  4. Morphological details in bloodstain particles.

    PubMed

    De Wael, K; Lepot, L

    2015-01-01

During the commission of crimes, blood can be transferred to the clothing of the offender or to other crime-related objects. Bloodstain particles are sub-millimetre-sized flakes that are lost from dried bloodstains. The nature of these red particles is easily confirmed using spectroscopic methods. In casework, bloodstain particles showing highly detailed morphological features were observed. These provided the rationale for a series of experiments described in this work. It was found that the "largest" particles are shed from blood deposited on polyester and polyamide woven fabrics. No particles are lost from stains made on absorbent fabrics or from those made on knitted fabrics. The morphological features observed in bloodstain particles can provide important information on the substrates from which they were lost.

  5. Detailed modelling of the 21-cm forest

    NASA Astrophysics Data System (ADS)

    Semelin, B.

    2016-01-01

The 21-cm forest is a promising probe of the Epoch of Reionization. The local state of the intergalactic medium (IGM) is encoded in the spectrum of a background source (radio-loud quasars or gamma-ray burst afterglow) by absorption at the local 21-cm wavelength, resulting in a continuous and fluctuating absorption level. Small-scale structures (filaments and minihaloes) in the IGM are responsible for the strongest absorption features. The absorption can also be modulated on large scales by inhomogeneous heating and Wouthuysen-Field coupling. We present the results from a simulation that attempts to preserve the cosmological environment while resolving some of the small-scale structures (a few kpc resolution in a 50 h⁻¹ Mpc box). The simulation couples the dynamics and the ionizing radiative transfer and includes X-ray and Lyman lines radiative transfer for a detailed physical modelling. As a result we find that soft X-ray self-shielding, Lyα self-shielding and shock heating all have an impact on the predicted values of the 21-cm optical depth of moderately overdense structures like filaments. A correct treatment of the peculiar velocities is also critical. Modelling these processes seems necessary for accurate predictions and can be done only at high enough resolution. As a result, based on our fiducial model, we estimate that LOFAR should be able to detect a few (strong) absorption features in a frequency range of a few tens of MHz for a 20 mJy source located at z = 10, while the SKA would extract a large fraction of the absorption information for the same source.

  6. Fast and accurate automated cell boundary determination for fluorescence microscopy

    NASA Astrophysics Data System (ADS)

    Arce, Stephen Hugo; Wu, Pei-Hsun; Tseng, Yiider

    2013-07-01

    Detailed measurement of cell phenotype information from digital fluorescence images has the potential to greatly advance biomedicine in various disciplines such as patient diagnostics or drug screening. Yet, the complexity of cell conformations presents a major barrier preventing effective determination of cell boundaries, and introduces measurement error that propagates throughout subsequent assessment of cellular parameters and statistical analysis. State-of-the-art image segmentation techniques that require user-interaction, prolonged computation time and specialized training cannot adequately provide the support for high content platforms, which often sacrifice resolution to foster the speedy collection of massive amounts of cellular data. This work introduces a strategy that allows us to rapidly obtain accurate cell boundaries from digital fluorescent images in an automated format. Hence, this new method has broad applicability to promote biotechnology.

  7. The Finer Details: Climate Modeling

    NASA Technical Reports Server (NTRS)

    2000-01-01

If you want to know whether you will need sunscreen or an umbrella for tomorrow's picnic, you can simply read the local weather report. However, if you are calculating the impact of gas combustion on global temperatures, or anticipating next year's rainfall levels to set water conservation policy, you must conduct a more comprehensive investigation. Such complex matters require long-range modeling techniques that predict broad trends in climate development rather than day-to-day details. Climate models are built from equations that calculate the progression of weather-related conditions over time. Based on the laws of physics, climate model equations have been developed to predict a number of environmental factors, for example: 1. Amount of solar radiation that hits the Earth. 2. Varying proportions of gases that make up the air. 3. Temperature at the Earth's surface. 4. Circulation of ocean and wind currents. 5. Development of cloud cover. Numerical modeling of the climate can improve our understanding of both the past and the future. A model can confirm the accuracy of environmental measurements taken in the past and can even fill in gaps in those records. In addition, by quantifying the relationship between different aspects of climate, scientists can estimate how a future change in one aspect may alter the rest of the world. For example, could an increase in the temperature of the Pacific Ocean somehow set off a drought on the other side of the world? A computer simulation could lead to an answer for this and other questions. Quantifying the chaotic, nonlinear activities that shape our climate is no easy matter. You cannot run these simulations on your desktop computer and expect results by the time you have finished checking your morning e-mail. Efficient and accurate climate modeling requires powerful computers that can process billions of mathematical calculations in a single second. The NCCS exists to provide this degree of vast computing capability.

  8. Detail view of ornamental lighting detail of southwest corner of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail view of ornamental lighting detail of southwest corner of Sixth Street Bridge. Looking northeast - Sixth Street Bridge, Spanning 101 Freeway at Sixth Street, Los Angeles, Los Angeles County, CA

  9. Detail, Scandia Hotel, view to southwest showing details of balloon ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail, Scandia Hotel, view to southwest showing details of balloon framing, including full two-story studs notched to carry girts supporting second story floor joists (210mm lens) - Scandia Hotel, 225 First Street, Eureka, Humboldt County, CA

  10. 58. DETAIL OF PINION AND BULL GEARS: Detail view towards ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    58. DETAIL OF PINION AND BULL GEARS: Detail view towards northeast of the pinion and bull gears of the winding machinery. - San Francisco Cable Railway, Washington & Mason Streets, San Francisco, San Francisco County, CA

  11. Detailed signal model of coherent wind measurement lidar

    NASA Astrophysics Data System (ADS)

    Ma, Yuechao; Li, Sining; Lu, Wei

    2016-11-01

Lidar (light detection and ranging) is a tool for measuring useful information about the atmosphere. In recent years, more and more attention has been paid to wind measurement by lidar, because accurate wind information is useful not only in weather reporting but also for ensuring the safety of aircraft. In this paper, a more detailed signal model of wind-measurement lidar is proposed. It covers the transmitted laser spectrum and its broadening, laser attenuation in the atmosphere, the backscattered signal, and the detected signal. A Voigt profile is used to describe the broadening of the transmitted laser spectrum, the most common situation, in which the line shape is the convolution of different broadening line shapes. Laser attenuation includes scattering and absorption: a Rayleigh scattering model and the partially-Correlated quadratic-Velocity-Dependent Hard-Collision (pCqSDHC) model describe molecular scattering and absorption, while a Gaussian particle model describes the shape of particles when calculating particle scattering and absorption. Because of the Doppler effect between the laser and the atmosphere, the wind velocity can be calculated from the backscattered signal. A two-parameter Weibull distribution is then used to describe the wind field for use in future work. Together these components define the signal model of coherent wind-measurement lidar, and some simulations are given in MATLAB. This signal model describes the system more accurately and in more detail, making subsequent work easier and more efficient.
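For instance, the two-parameter Weibull wind model mentioned at the end can be sampled by inverse-transform sampling. The sketch below is in Python rather than the authors' MATLAB, and the shape and scale values are made up:

```python
import math
import random

def weibull_wind(k: float, c: float, rng: random.Random) -> float:
    """Draw one wind speed from a two-parameter Weibull distribution
    (shape k, scale c) via inverse-transform sampling of the CDF."""
    return c * (-math.log(1.0 - rng.random())) ** (1.0 / k)

rng = random.Random(42)
speeds = [weibull_wind(2.0, 8.0, rng) for _ in range(50_000)]
mean = sum(speeds) / len(speeds)
print(round(mean, 1))  # theoretical mean is c * Gamma(1 + 1/k) ~ 7.09 m/s
```

The standard library's `random.Random.weibullvariate(alpha, beta)` implements the same draw; the explicit formula is shown here to make the inverse transform visible.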

  12. Making detailed predictions makes (some) predictions worse

    NASA Astrophysics Data System (ADS)

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  13. Influences on physicians' adoption of electronic detailing (e-detailing).

    PubMed

    Alkhateeb, Fadi M; Doucette, William R

    2009-01-01

    E-detailing means using digital technology for detailing: the internet, video conferencing, and interactive voice response. There are two types of e-detailing: interactive (virtual) and video. Currently, little is known about what factors influence physicians' adoption of e-detailing. The objectives of this study were to test a model of physicians' adoption of e-detailing and to describe physicians using e-detailing. A mail survey was sent to a random sample of 2000 physicians practicing in Iowa. Binomial logistic regression was used to test the model of influences on physician adoption of e-detailing. On the basis of Rogers' model of adoption, the independent variables included relative advantage, compatibility, complexity, peer influence, attitudes, years in practice, presence of restrictive access to traditional detailing, type of specialty, academic affiliation, type of practice setting, and control variables. A total of 671 responses were received, giving a response rate of 34.7%. A total of 141 physicians (21.0%) reported using e-detailing. The overall adoption model for using either type of e-detailing was found to be significant. Relative advantage, peer influence, attitudes, type of specialty, presence of restrictive access, and years of practice had significant influences on physician adoption of e-detailing. The model of adoption of innovation is useful to explain physicians' adoption of e-detailing.

  14. Accurate three-dimensional documentation of distinct sites

    NASA Astrophysics Data System (ADS)

    Singh, Mahesh K.; Dutta, Ashish; Subramanian, Venkatesh K.

    2017-01-01

    One of the most critical aspects of documenting distinct sites is acquiring detailed and accurate range information. Several three-dimensional (3-D) acquisition techniques are available, but each has its own limitations. This paper presents a range data fusion method with the aim of enhancing the descriptive content of the entire reconstructed 3-D model. A kernel function is introduced for supervised classification of the range data using a kernelized support vector machine. The classification method is based on local saliency features of the acquired range data. The range data acquired from heterogeneous range sensors are transformed into a defined common reference frame. Based on the segmentation criterion, fusion is performed by integrating the finer regions of range data acquired from a laser range scanner with the coarser regions of the Kinect's range data. After fusion, the Delaunay triangulation algorithm is applied to generate a highly accurate, realistic 3-D model of the scene. Finally, experimental results show the robustness of the proposed approach.
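
    A toy sketch of the final fusion-and-meshing step described above (hypothetical 2-D data standing in for registered range scans; the SVM classification stage is omitted): finer samples replace coarser ones inside a region of interest, and the fused set is Delaunay-triangulated.

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)

# Coarse range samples over the whole scene (e.g., from a depth camera).
coarse = rng.uniform(0.0, 10.0, size=(40, 2))
# Finer samples over a region of interest (e.g., from a laser scanner).
fine = rng.uniform(4.0, 6.0, size=(200, 2))

# Fuse: keep coarse points outside the finely scanned region,
# and use the dense points inside it.
outside = ~((coarse[:, 0] > 4) & (coarse[:, 0] < 6) &
            (coarse[:, 1] > 4) & (coarse[:, 1] < 6))
fused = np.vstack([coarse[outside], fine])

# Mesh the fused point set.
tri = Delaunay(fused)
```

    The real pipeline works on 3-D surface patches, but the principle is the same: the mesh density follows whichever sensor contributed data in each region.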

  15. Memory for details with self-referencing.

    PubMed

    Serbun, Sarah J; Shih, Joanne Y; Gutchess, Angela H

    2011-11-01

    Self-referencing benefits item memory, but little is known about the ways in which referencing the self affects memory for details. Experiment 1 assessed whether the effects of self-referencing operate only at the item, or general, level or whether they also enhance memory for specific visual details of objects. Participants incidentally encoded objects by making judgements in reference to the self, a close other (one's mother), or a familiar other (Bill Clinton). Results indicate that referencing the self or a close other enhances both specific and general memory. Experiments 2 and 3 assessed verbal memory for source in a task that relied on distinguishing between different mental operations (internal sources). The results indicate that self-referencing disproportionately enhances source memory, relative to conditions referencing other people, semantic, or perceptual information. We conclude that self-referencing not only enhances specific memory for both visual and verbal information, but can also disproportionately improve memory for specific internal source details.

  16. Accurate Evaluation of Quantum Integrals

    NASA Technical Reports Server (NTRS)

    Galant, D. C.; Goorvitch, D.; Witteborn, Fred C. (Technical Monitor)

    1995-01-01

    Combining an appropriate finite difference method with Richardson's extrapolation results in a simple, highly accurate numerical method for solving the Schrödinger equation. Important results are that error estimates are provided and that one can extrapolate expectation values, rather than the wavefunctions, to obtain highly accurate expectation values. We discuss the eigenvalues and the error growth in repeated Richardson's extrapolation, and show that expectation values calculated on a crude mesh can be extrapolated to obtain expectation values of high accuracy.
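
    A minimal sketch of the idea (not the authors' code): a second-order finite-difference discretisation of the harmonic-oscillator equation -ψ'' + x²ψ = Eψ has eigenvalue error O(h²), so one Richardson step combining meshes h and h/2 lands much closer to the exact ground-state energy E₀ = 1.

```python
import numpy as np

def ground_energy(n):
    """Lowest eigenvalue of -psi'' + x^2 psi = E psi on [-8, 8],
    discretised with an n-point second-order finite-difference grid."""
    x, h = np.linspace(-8.0, 8.0, n, retstep=True)
    H = (np.diag(2.0 / h**2 + x**2)
         - np.diag(np.ones(n - 1) / h**2, 1)
         - np.diag(np.ones(n - 1) / h**2, -1))
    return np.linalg.eigvalsh(H)[0]

e_h = ground_energy(101)              # mesh spacing h
e_h2 = ground_energy(201)             # mesh spacing h/2
e_rich = (4.0 * e_h2 - e_h) / 3.0     # Richardson step: cancels the h^2 term
```

    Because the leading error term is c·h², the combination (4E(h/2) - E(h))/3 cancels it, leaving an O(h⁴) residual; the same trick extends to extrapolating expectation values, as the abstract notes.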

  17. Occupation Competency Profile: Steel Detailer Program.

    ERIC Educational Resources Information Center

    Alberta Learning, Edmonton. Apprenticeship and Industry Training.

    This document presents information about the apprenticeship training program of Alberta, Canada, in general and the steel detailer program in particular. The first part of the document discusses the following items: Alberta's apprenticeship and industry training system; the apprenticeship and industry training committee structure; local…

  18. Automatic classification and accurate size measurement of blank mask defects

    NASA Astrophysics Data System (ADS)

    Bhamidipati, Samir; Paninjath, Sankaranarayanan; Pereira, Mark; Buck, Peter

    2015-07-01

    A blank mask and its preparation stages, such as cleaning or resist coating, play an important role in the eventual yield obtained by using it. Analysis of the impact of blank mask defects depends directly on the amount of available information, such as the number of defects observed and their accurate locations and sizes. Mask usability qualification at the start of the preparation process is crudely based on the number of defects. Similarly, defect information such as size is sought to estimate eventual defect printability on the wafer. Tracking defect characteristics, specifically size and shape, across multiple stages can further be indicative of process-related information such as cleaning or coating efficiency. At the first level, inspection machines address the requirement of defect characterization by detecting and reporting relevant defect information. The analysis of this information, though, is still largely a manual process. With advancing technology nodes and shrinking half-pitch sizes, a large number of defects are observed, and the detailed knowledge required makes manual defect review an arduous task that is also sensitive to human error. In cases where the defect information reported by the inspection machine is not sufficient, mask shops rely on other tools; use of CDSEM tools is one such option. However, these additional steps translate into increased costs. The Calibre NxDAT based MDPAutoClassify tool provides an automated software alternative to the manual defect review process. Working on defect images generated by inspection machines, the tool extracts and reports additional information such as defect location, useful for defect avoidance[4][5]; defect size, useful in estimating defect printability; and defect nature, e.g., particle, scratch, resist void, etc., useful for process monitoring. The tool makes use of smart and elaborate post-processing algorithms to achieve this. Their elaborateness is a consequence of the variety and

  19. An attempt to obtain a detailed declination chart from the United States magnetic anomaly map

    USGS Publications Warehouse

    Alldredge, L.R.

    1989-01-01

    Modern declination charts of the United States show almost no details. It was hoped that declination details could be derived from the information contained in the existing magnetic anomaly map of the United States. This could be realized only if all of the survey data were corrected to a common epoch, at which time a main-field vector model was known, before the anomaly values were computed. Because this was not done, accurate declination values cannot be determined. In spite of this conclusion, declination values were computed using a common main-field model for the entire United States to see how well they compared with observed values. The computed detailed declination values were found to compare less favourably with observed values of declination than declination values computed from the IGRF 1985 model itself. -from Author

  20. 15. CYLINDER DETAILS; DETAILS OF STEEL FOR CYLINDERS NO. 50 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. CYLINDER DETAILS; DETAILS OF STEEL FOR CYLINDERS NO. 50 (PIER 5) AND NO. 66 (PIER 6), DWG. 83, CH BY AF, ECL, APPROVED BY O.F. LACKEY, MAY 18, 1908 - Baltimore Inner Harbor, Pier 5, South of Pratt Street between Market Place & Concord Street, Baltimore, Independent City, MD

  1. The high cost of accurate knowledge.

    PubMed

    Sutcliffe, Kathleen M; Weber, Klaus

    2003-05-01

    Many business thinkers believe it's the role of senior managers to scan the external environment to monitor contingencies and constraints, and to use that precise knowledge to modify the company's strategy and design. As these thinkers see it, managers need accurate and abundant information to carry out that role. According to that logic, it makes sense to invest heavily in systems for collecting and organizing competitive information. Another school of pundits contends that, since today's complex information often isn't precise anyway, it's not worth going overboard with such investments. In other words, it's not the accuracy and abundance of information that should matter most to top executives--rather, it's how that information is interpreted. After all, the role of senior managers isn't just to make decisions; it's to set direction and motivate others in the face of ambiguities and conflicting demands. Top executives must interpret information and communicate those interpretations--they must manage meaning more than they must manage information. So which of these competing views is the right one? Research conducted by academics Sutcliffe and Weber found that how accurate senior executives are about their competitive environments is indeed less important for strategy and corresponding organizational changes than the way in which they interpret information about their environments. Investments in shaping those interpretations, therefore, may create a more durable competitive advantage than investments in obtaining and organizing more information. And what kinds of interpretations are most closely linked with high performance? Their research suggests that high performers respond positively to opportunities, yet they aren't overconfident in their abilities to take advantage of those opportunities.

  2. Computed tomography: the details.

    SciTech Connect

    Doerry, Armin Walter

    2007-07-01

    Computed Tomography (CT) is a well established technique, particularly in medical imaging, but also applied in Synthetic Aperture Radar (SAR) imaging. Basic CT imaging via back-projection is treated in many texts, but often with insufficient detail to appreciate subtleties such as the role of non-uniform sampling densities. Herein are given some details often neglected in many texts.
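
    As a generic textbook illustration of back-projection (not code from the report): each 1-D projection is smeared across the image plane, rotated back to its acquisition angle, and the smears are summed.

```python
import numpy as np
from scipy.ndimage import rotate

N = 65
phantom = np.zeros((N, N))
phantom[N // 2, N // 2] = 1.0          # point object at the centre

angles = np.arange(0, 180, 5)          # projection angles in degrees

# Forward projection: rotate the image, then sum down the columns.
sinogram = [rotate(phantom, a, reshape=False, order=1).sum(axis=0)
            for a in angles]

# Unfiltered back-projection: smear each projection across the image
# and rotate the smear back to its acquisition angle.
recon = np.zeros((N, N))
for proj, a in zip(sinogram, angles):
    recon += rotate(np.tile(proj, (N, 1)), -a, reshape=False, order=1)
```

    The reconstruction peaks at the original point but is blurred by a 1/r halo, which is exactly the artefact that ramp filtering (filtered back-projection) removes.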

  3. 13. DETAIL VIEW OF BRIDGE, LOOKING SOUTH FROM ROADWAY. DETAIL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. DETAIL VIEW OF BRIDGE, LOOKING SOUTH FROM ROADWAY. DETAIL VIEW OF THE PIERS AND LIGHTING FIXTURES ON THE COLORADO STREET BRIDGE. THIS VIEW SHOWS A PORTION OF THE BRIDGE ALONG THE SOUTH SIDE OF THE ROADWAY. EACH FIXTURE ALSO ORIGINALLY HAD FOUR ADDITIONAL GLOBES, WHICH EXTENDED FROM THE COLUMN BELOW THE MAIN GLOBE. THE 'REFUGE' SEATING AREAS ARE ORIGINAL, WHILE THE RAILING IS A LATER ADDITION. - Colorado Street Bridge, Spanning Arroyo Seco at Colorado Boulevard, Pasadena, Los Angeles County, CA

  4. 5 CFR 370.104 - Length of details.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Length of details. 370.104 Section 370.104 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.104 Length of details. (a) Details may be for a period of between 3...

  5. 5 CFR 370.104 - Length of details.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 1 2012-01-01 2012-01-01 false Length of details. 370.104 Section 370.104 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.104 Length of details. (a) Details may be for a period of between 3...

  6. 5 CFR 370.104 - Length of details.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Length of details. 370.104 Section 370.104 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.104 Length of details. (a) Details may be for a period of between 3...

  7. 5 CFR 370.104 - Length of details.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Length of details. 370.104 Section 370.104 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.104 Length of details. (a) Details may be for a period of between 3...

  8. 5 CFR 370.104 - Length of details.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false Length of details. 370.104 Section 370.104 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.104 Length of details. (a) Details may be for a period of between 3...

  9. Micron Accurate Absolute Ranging System: Range Extension

    NASA Technical Reports Server (NTRS)

    Smalley, Larry L.; Smith, Kely L.

    1999-01-01

    The purpose of this research is to investigate Fresnel diffraction as a means of obtaining absolute distance measurements with micron or greater accuracy. It is believed that such a system would prove useful to the Next Generation Space Telescope (NGST) as a non-intrusive, non-contact measuring system for use with secondary concentrator station-keeping systems. The present research attempts to validate past experiments and develop ways to apply the phenomenon of Fresnel diffraction to micron-accurate measurement. This report discusses past research on the phenomenon and the basis of the use of Fresnel diffraction for distance metrology. The apparatus used in the recent investigations, the experimental procedures, and preliminary results are discussed in detail. Continued research on extending the effective range of Fresnel diffraction systems, and the equipment it requires, is also described.
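
    The knife-edge Fresnel pattern underlying such a ranging scheme can be sketched directly from the Fresnel integrals (a generic textbook expression, not the experimental configuration of this report):

```python
import numpy as np
from scipy.special import fresnel

def edge_intensity(v):
    """Relative intensity I/I0 behind a straight edge as a function of
    the dimensionless Fresnel coordinate v (v = 0 at the geometric
    shadow edge, v > 0 on the illuminated side)."""
    S, C = fresnel(v)                      # scipy returns (S(v), C(v))
    return 0.5 * ((C + 0.5) ** 2 + (S + 0.5) ** 2)

v = np.linspace(-3.0, 5.0, 801)
intensity = edge_intensity(v)
```

    The intensity is 1/4 of the unobstructed value exactly at the geometric shadow edge, decays into the shadow, and oscillates about 1 on the bright side; the fringe spacing depends on the source-edge-detector geometry, which is what makes a diffraction pattern usable for absolute ranging.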

  10. New law requires 'medically accurate' lesson plans.

    PubMed

    1999-09-17

    The California Legislature has passed a bill requiring all textbooks and materials used to teach about AIDS be medically accurate and objective. Statements made within the curriculum must be supported by research conducted in compliance with scientific methods, and published in peer-reviewed journals. Some of the current lesson plans were found to contain scientifically unsupported and biased information. In addition, the bill requires material to be "free of racial, ethnic, or gender biases." The legislation is supported by a wide range of interests, but opposed by the California Right to Life Education Fund, because they believe it discredits abstinence-only material.

  11. NOSS Altimeter Detailed Algorithm specifications

    NASA Technical Reports Server (NTRS)

    Hancock, D. W.; Mcmillan, J. D.

    1982-01-01

    The details of the algorithms and data sets required for satellite radar altimeter data processing are documented in a form suitable for (1) development of the benchmark software and (2) coding the operational software. The algorithms reported in detail are those established for altimeter processing. The algorithms which required some additional development before documenting for production were only scoped. The algorithms are divided into two levels of processing. The first level converts the data to engineering units and applies corrections for instrument variations. The second level provides geophysical measurements derived from altimeter parameters for oceanographic users.

  12. Review of Ship Structural Details

    DTIC Science & Technology

    1977-01-01

    4.3 Knee and Beam Brackets; 4.3.1 Brackets for Girders and Deep Webs; 4.3.2 Brackets Connecting Rolled Sections; 4.4 Tripping... Examples are shell stringers penetrating deep web frames and longitudinal girders penetrating deep transverses. This is not a common detail. If double... Detail Type: STANCHION END

  13. DAGAL: Detailed Anatomy of Galaxies

    NASA Astrophysics Data System (ADS)

    Knapen, Johan H.

    2017-03-01

    The current IAU Symposium is closely connected to the EU-funded network DAGAL (Detailed Anatomy of Galaxies), with the final annual network meeting of DAGAL being at the core of this international symposium. In this short paper, we give an overview of DAGAL, its training activities, and some of the scientific advances that have been made under its umbrella.

  14. Details for Manuscript Number SSM-D-06-00377R1 “Targeted Ethnography as a Critical Step to Inform Cultural Adaptations of HIV Prevention Interventions for Adults with Severe Mental Illness.”

    PubMed Central

    Gonzalez, M. Alfredo; McKinnon, Karen; Elkington, Katherine S; Pinto, Diana; Mann, Claudio Gruber; Mattos, Paulo E

    2007-01-01

    As in other countries worldwide, adults with severe mental illness (SMI) in Brazil are disproportionately infected with HIV relative to the general population. Brazilian psychiatric facilities lack tested HIV prevention interventions. To adapt existing interventions, developed only in the U.S., we conducted targeted ethnography with adults with SMI and staff from two psychiatric institutions in Brazil. We sought to characterize individual, institutional, and interpersonal factors that may affect HIV risk behavior in this population. We conducted 350 hours of ethnographic field observations in two mental health service settings in Rio de Janeiro, and 9 focus groups (n = 72) and 16 key-informant interviews with patients and staff in these settings. Data comprised field notes and audiotapes of all exchanges, which were transcribed, coded, and systematically analyzed. The ethnography characterized the institutional culture and identified: 1) patients’ risk behaviors; 2) the institutional setting; 3) intervention content; and 4) intervention format and delivery strategies. Targeted ethnography also illuminated broader contextual issues for development and implementation of HIV prevention interventions for adults with SMI in Brazil, including an institutional culture that did not systematically address patients’ sexual behavior, sexual health, or HIV sexual risk, yet strongly impacted the structure of patients’ sexual networks. Further, ethnography identified the Brazilian concept of “social responsibility” as important to prevention work with psychiatric patients. Targeted ethnography with adults with SMI and institutional staff provided information critical to the adaptation of tested U.S. HIV prevention interventions for Brazilians with SMI. PMID:17475382

  15. Non-targeted analysis of electronics waste by comprehensive two-dimensional gas chromatography combined with high-resolution mass spectrometry: Using accurate mass information and mass defect analysis to explore the data.

    PubMed

    Ubukata, Masaaki; Jobst, Karl J; Reiner, Eric J; Reichenbach, Stephen E; Tao, Qingping; Hang, Jiliang; Wu, Zhanpin; Dane, A John; Cody, Robert B

    2015-05-22

    Comprehensive two-dimensional gas chromatography (GC×GC) and high-resolution mass spectrometry (HRMS) offer the best possible separation within their respective domains. Recent commercialization of combined GC×GC-HRMS systems offers new possibilities for the analysis of complex mixtures. However, such experiments yield enormous data sets that require new informatics tools to facilitate the interpretation of the rich information content. This study reports on the analysis of dust obtained from an electronics recycling facility by using GC×GC in combination with a new high-resolution time-of-flight (TOF) mass spectrometer. New software tools for (non-traditional) Kendrick mass defect analysis were developed in this research and greatly aided in the identification of compounds containing chlorine and bromine, elements that feature in most persistent organic pollutants (POPs). In essence, the mass defect plot serves as a visual aid from which halogenated compounds are recognizable on the basis of their mass defect and isotope patterns. Mass chromatograms were generated based on specific ions identified in the plots as well as regions of the plot predominantly occupied by halogenated contaminants. Tentative identification was aided by database searches, complementary electron-capture negative ionization experiments, and elemental composition determinations from the exact mass data. These included known and emerging flame retardants, such as polybrominated diphenyl ethers (PBDEs), hexabromobenzene, tetrabromobisphenol A and tris(1-chloro-2-propyl) phosphate (TCPP), as well as other legacy contaminants such as polychlorinated biphenyls (PCBs) and polychlorinated terphenyls (PCTs).
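
    The Kendrick mass defect calculation that drives these plots is compact enough to sketch (standard CH2-based Kendrick scaling; the two alkane masses below are illustrative exact masses, not compounds from the dust sample):

```python
# Kendrick mass analysis: rescale IUPAC masses so that one CH2 repeat
# unit weighs exactly 14, making homologous series share one mass defect.
CH2 = 14.01565  # exact mass of a CH2 unit

def kendrick_mass_defect(m):
    km = m * 14.0 / CH2      # Kendrick mass
    return round(km) - km    # Kendrick mass defect (one common sign convention)

# Two members of the alkane CH2 homologous series (exact masses).
kmd_c10 = kendrick_mass_defect(142.17215)   # C10H22
kmd_c11 = kendrick_mass_defect(156.18780)   # C11H24
```

    Members of the same homologous series line up horizontally on a KMD-versus-nominal-mass plot, while Cl- and Br-containing compounds fall in distinct regions because the halogens shift the mass defect strongly, which is the visual separation the study exploits.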

  16. On numerically accurate finite element

    NASA Technical Reports Server (NTRS)

    Nagtegaal, J. C.; Parks, D. M.; Rice, J. R.

    1974-01-01

    A general criterion for testing a mesh with topologically similar repeat units is given, and the analysis shows that only a few conventional element types and arrangements are, or can be made, suitable for computations in the fully plastic range. Further, a new variational principle, which can easily be incorporated into an existing finite element program, is presented. This allows accurate computations to be made even for element designs that would not normally be suitable. Numerical results are given for three plane strain problems: pure bending of a beam, a thick-walled tube under pressure, and a deep double-edge-cracked tensile specimen. The effects of various element designs and of the new variational procedure are illustrated. Elastic-plastic computations at finite strain are discussed.

  17. Role of spectral detail in sound-source localization.

    PubMed

    Kulkarni, A; Colburn, H S

    Sounds heard over headphones are typically perceived inside the head (internalized), unlike real sound sources which are perceived outside the head (externalized). If the acoustical waveforms from a real sound source are reproduced precisely using headphones, auditory images are appropriately externalized and localized. The filtering (relative boosting, attenuation and delaying of component frequencies) of a sound by the head and outer ear provides information about the location of a sound source by means of the differences in the frequency spectra between the ears as well as the overall spectral shape. This location-dependent filtering is explicitly described by the head-related transfer function (HRTF) from sound source to ear canal. Here we present sounds to subjects through open-canal tube-phones and investigate how accurately the HRTFs must be reproduced to achieve true three-dimensional perception of auditory signals in anechoic space. Listeners attempted to discriminate between 'real' sounds presented from a loudspeaker and 'virtual' sounds presented over tube-phones. Our results show that the HRTFs can be smoothed significantly in frequency without affecting the perceived location of a sound. Listeners cannot distinguish real from virtual sources until the HRTF has lost most of its detailed variation in frequency, at which time the perceived elevation of the image is the reported cue.

  18. Recovering and preventing loss of detailed memory: differential rates of forgetting for detail types in episodic memory

    PubMed Central

    Bonasia, Kyra; St-Laurent, Marie; Pishdadian, Sara; Winocur, Gordon; Grady, Cheryl; Moscovitch, Morris

    2016-01-01

    Episodic memories undergo qualitative changes with time, but little is known about how different aspects of memory are affected. Different types of information in a memory, such as perceptual detail, and central themes, may be lost at different rates. In patients with medial temporal lobe damage, memory for perceptual details is severely impaired, while memory for central details is relatively spared. Given the sensitivity of memory to loss of details, the present study sought to investigate factors that mediate the forgetting of different types of information from naturalistic episodic memories in young healthy adults. The study investigated (1) time-dependent loss of “central” and “peripheral” details from episodic memories, (2) the effectiveness of cuing with reminders to reinstate memory details, and (3) the role of retrieval in preventing forgetting. Over the course of 7 d, memory for naturalistic events (film clips) underwent a time-dependent loss of peripheral details, while memory for central details (the core or gist of events) showed significantly less loss. Giving brief reminders of the clips just before retrieval reinstated memory for peripheral details, suggesting that loss of details is not always permanent, and may reflect both a storage and retrieval deficit. Furthermore, retrieving a memory shortly after it was encoded prevented loss of both central and peripheral details, thereby promoting retention over time. We consider the implications of these results for behavioral and neurobiological models of retention and forgetting. PMID:26773100

  19. Route visualization using detail lenses.

    PubMed

    Karnick, Pushpak; Cline, David; Jeschke, Stefan; Razdan, Anshuman; Wonka, Peter

    2010-01-01

    We present a method designed to address some limitations of typical route map displays of driving directions. The main goal of our system is to generate a printable version of a route map that shows the overview and detail views of the route within a single, consistent visual frame. Our proposed visualization provides a more intuitive spatial context than a simple list of turns. We present a novel multifocus technique to achieve this goal, where the foci are defined by points of interest (POI) along the route. A detail lens that encapsulates the POI at a finer geospatial scale is created for each focus. The lenses are laid out on the map to avoid occlusion with the route and each other, and to optimally utilize the free space around the route. We define a set of layout metrics to evaluate the quality of a lens layout for a given route map visualization. We compare standard lens layout methods to our proposed method and demonstrate the effectiveness of our method in generating aesthetically pleasing layouts. Finally, we perform a user study to evaluate the effectiveness of our layout choices.

  20. Accurate ab Initio Spin Densities.

    PubMed

    Boguslawski, Katharina; Marti, Konrad H; Legeza, Ors; Reiher, Markus

    2012-06-12

    We present an approach for the calculation of spin density distributions for molecules that require very large active spaces for a qualitatively correct description of their electronic structure. Our approach is based on the density-matrix renormalization group (DMRG) algorithm to calculate the spin density matrix elements as a basic quantity for the spatially resolved spin density distribution. The spin density matrix elements are directly determined from the second-quantized elementary operators optimized by the DMRG algorithm. As an analytic convergence criterion for the spin density distribution, we employ our recently developed sampling-reconstruction scheme [J. Chem. Phys.2011, 134, 224101] to build an accurate complete-active-space configuration-interaction (CASCI) wave function from the optimized matrix product states. The spin density matrix elements can then also be determined as an expectation value employing the reconstructed wave function expansion. Furthermore, the explicit reconstruction of a CASCI-type wave function provides insight into chemically interesting features of the molecule under study such as the distribution of α and β electrons in terms of Slater determinants, CI coefficients, and natural orbitals. The methodology is applied to an iron nitrosyl complex which we have identified as a challenging system for standard approaches [J. Chem. Theory Comput.2011, 7, 2740].
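
    DMRG itself is far beyond a short example, but the quantity being extracted, the per-site spin density ⟨S_z,i⟩ of a correlated ground state, can be illustrated by exact diagonalization of a tiny spin chain (a generic toy model, unrelated to the iron nitrosyl complex studied in the paper):

```python
import numpy as np

# Spin-1/2 operators.
sz = np.array([[0.5, 0.0], [0.0, -0.5]])
sp = np.array([[0.0, 1.0], [0.0, 0.0]])   # S+
sm = sp.T                                  # S-
I2 = np.eye(2)

def site_op(op, i, n):
    """Embed a single-site operator at site i of an n-site chain."""
    out = np.array([[1.0]])
    for j in range(n):
        out = np.kron(out, op if j == i else I2)
    return out

n = 3
H = np.zeros((2**n, 2**n))
for i in range(n - 1):  # antiferromagnetic Heisenberg coupling, J = 1
    H += site_op(sz, i, n) @ site_op(sz, i + 1, n)
    H += 0.5 * (site_op(sp, i, n) @ site_op(sm, i + 1, n)
                + site_op(sm, i, n) @ site_op(sp, i + 1, n))
Sz_tot = sum(site_op(sz, i, n) for i in range(n))
H -= 1e-6 * Sz_tot   # tiny field to select the Sz = +1/2 ground state

vals, vecs = np.linalg.eigh(H)
gs = vecs[:, 0]
spin_density = [gs @ site_op(sz, i, n) @ gs for i in range(n)]
```

    The per-site densities sum to the total Sz of the state and are non-uniform along the chain, the same kind of spatially resolved information the DMRG approach computes for active spaces far too large for exact diagonalization.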

  1. Formation of the prebiotic molecule NH2CHO on astronomical amorphous solid water surfaces: accurate tunneling rate calculations

    PubMed Central

    Song, Lei

    2016-01-01

    Investigating how formamide forms in the interstellar medium is a hot topic in astrochemistry, one that can contribute to our understanding of the origin of life on Earth. We have constructed a QM/MM model to simulate the hydrogenation of isocyanic acid on amorphous solid water (ASW) surfaces to form formamide. The binding energy of HNCO on the ASW surface varies significantly between binding sites; we found values between ∼0 and 100 kJ mol–1. The barrier for the hydrogenation reaction, though, is almost independent of the binding energy. We calculated tunneling rate constants of H + HNCO → NH2CO at temperatures down to 103 K, combining QM/MM with instanton theory. Tunneling dominates the reaction at such low temperatures. For this system, the tunneling reaction is hardly accelerated by the amorphous solid water surface compared to the gas phase, even though the activation energy of the surface reaction is lower than that of the gas-phase reaction. In practice, both the height and the width of the barrier affect the tunneling rate. Strong kinetic isotope effects were observed by comparison with rate constants of D + HNCO → NHDCO: at 103 K we found a KIE of 231 on the surface and 146 in the gas phase. Furthermore, we investigated the gas-phase reaction NH2 + H2CO → NH2CHO + H and found it unlikely to occur at cryogenic temperatures. Our tunneling rate constants are expected to significantly influence astrochemical models. PMID:27731439
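
    Instanton theory is well beyond a short sketch, but the simplest tunneling correction, Wigner's, already shows the qualitative trends reported here: rate enhancement at low temperature and a strong H/D kinetic isotope effect. The barrier frequency below is an assumed illustrative value, not one from the paper.

```python
import math

HBAR = 1.054571817e-34   # J s
KB = 1.380649e-23        # J / K

def wigner_kappa(nu_cm, T):
    """Wigner tunnelling correction 1 + (hbar*omega / kT)^2 / 24 for a
    barrier with imaginary frequency nu_cm (cm^-1) at temperature T (K)."""
    omega = 2.0 * math.pi * 2.99792458e10 * nu_cm   # rad/s
    u = HBAR * omega / (KB * T)
    return 1.0 + u * u / 24.0

# Assumed barrier frequency; deuteration scales an H-dominated mode
# roughly by 1/sqrt(2), which seeds the kinetic isotope effect.
kappa_h = wigner_kappa(1000.0, 103.0)
kappa_d = wigner_kappa(1000.0 / math.sqrt(2.0), 103.0)
```

    The H correction exceeds the D correction, mirroring the KIE direction reported above; at 103 K this low-order formula is only qualitative, which is why the paper resorts to the full instanton treatment.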

  2. Space Telecommunications Radio System (STRS) Architecture, Tutorial Part 2 - Detailed

    NASA Technical Reports Server (NTRS)

    Handler, Louis

    2014-01-01

    This detailed STRS architecture presentation covers each requirement in the STRS Architecture Standard, with examples and supporting information. The purpose is to give a platform provider, application provider, or application integrator a better, more detailed understanding of the STRS Architecture Standard and its use.

  3. 5 CFR 370.107 - Details to small business concerns.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Details to small business concerns. 370.107 Section 370.107 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.107 Details to small business concerns. (a) The head of...

  4. 5 CFR 370.107 - Details to small business concerns.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Details to small business concerns. 370.107 Section 370.107 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.107 Details to small business concerns. (a) The head of...

  5. 5 CFR 370.107 - Details to small business concerns.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Details to small business concerns. 370.107 Section 370.107 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.107 Details to small business concerns. (a) The head of...

  6. 5 CFR 370.107 - Details to small business concerns.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 1 2012-01-01 2012-01-01 false Details to small business concerns. 370.107 Section 370.107 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.107 Details to small business concerns. (a) The head of...

  7. 5 CFR 370.107 - Details to small business concerns.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false Details to small business concerns. 370.107 Section 370.107 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.107 Details to small business concerns. (a) The head of...

  8. 32 CFR 241.6 - Length of details.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 2 2014-07-01 2014-07-01 false Length of details. 241.6 Section 241.6 National Defense Department of Defense (Continued) OFFICE OF THE SECRETARY OF DEFENSE (CONTINUED) MISCELLANEOUS PILOT PROGRAM FOR TEMPORARY EXCHANGE OF INFORMATION TECHNOLOGY PERSONNEL § 241.6 Length of details....

  9. Must Kohn-Sham oscillator strengths be accurate at threshold?

    SciTech Connect

    Yang Zenghui; Burke, Kieron; Faassen, Meta van

    2009-09-21

    The exact ground-state Kohn-Sham (KS) potential for the helium atom is known from accurate wave function calculations of the ground-state density. The threshold for photoabsorption from this potential matches the physical system exactly. By carefully studying its absorption spectrum, we show the answer to the title question is no. To address this problem in detail, we generate a highly accurate simple fit of a two-electron spectrum near the threshold, and apply the method to both the experimental spectrum and that of the exact ground-state Kohn-Sham potential.

  10. Behind Scenes in Medicine: Training the Pharmaceutical Detail Man

    ERIC Educational Resources Information Center

    Seltz, Judith S.

    1974-01-01

    The pharmaceutical detail man, in bringing information about new drugs to physicians, hospitals and pharmacists, performs a complex and sensitive task. Does his training fit him to meet the challenge? (Author)

  11. Derivation of a quantitative minimal model from a detailed elementary-step mechanism supported by mathematical coupling analysis

    NASA Astrophysics Data System (ADS)

    Shaik, O. S.; Kammerer, J.; Gorecki, J.; Lebiedz, D.

    2005-12-01

    Accurate experimental data increasingly allow the development of detailed elementary-step mechanisms for complex chemical and biochemical reaction systems. Model reduction techniques are widely applied to obtain representations in lower-dimensional phase space which are more suitable for mathematical analysis, efficient numerical simulation, and model-based control tasks. Here, we exploit a recently implemented numerical algorithm for error-controlled computation of the minimum dimension required for a still-accurate reduced mechanism, based on automatic time-scale decomposition and relaxation of fast modes. We determine species contributions to the active (slow) dynamical modes of the reaction system and exploit this information, in combination with quasi-steady-state and partial-equilibrium approximations, for explicit model reduction of a novel detailed chemical mechanism for the Ru-catalyzed light-sensitive Belousov-Zhabotinsky reaction. A minimum dimension of seven is shown to be mandatory for the reduced model to show good quantitative consistency with the full model in numerical simulations. We derive such a maximally reduced seven-variable model from the detailed elementary-step mechanism and demonstrate that it reproduces the dynamical features of the full model quantitatively within a given accuracy tolerance.
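The quasi-steady-state approximation (QSSA) mentioned above can be illustrated on the textbook Michaelis-Menten system rather than the Belousov-Zhabotinsky mechanism itself; the rate constants below are illustrative, not from the paper. Setting d[ES]/dt ≈ 0 eliminates the fast enzyme-substrate complex and yields a one-variable reduced model that tracks the full model closely when enzyme is scarce:

```python
def full_model(k1, km1, k2, e0, s0, dt, steps):
    """Euler integration of the full E + S <-> ES -> E + P mechanism."""
    s, es = s0, 0.0
    for _ in range(steps):
        e = e0 - es  # enzyme conservation
        ds = -k1 * e * s + km1 * es
        des = k1 * e * s - (km1 + k2) * es
        s += dt * ds
        es += dt * des
    return s

def reduced_model(k1, km1, k2, e0, s0, dt, steps):
    """QSSA-reduced model: d[S]/dt = -k2*e0*S/(Km + S), Km = (km1 + k2)/k1."""
    km = (km1 + k2) / k1
    s = s0
    for _ in range(steps):
        s += dt * (-k2 * e0 * s / (km + s))
    return s

# Illustrative parameters where the complex dynamics are fast (e0 << s0):
params = dict(k1=100.0, km1=100.0, k2=10.0, e0=0.01, s0=1.0, dt=1e-4, steps=20000)
print(full_model(**params), reduced_model(**params))
```

The two trajectories differ only by the small transient consumed in forming the complex, which is the kind of controlled error a dimension-reduction algorithm has to bound.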

  12. Academic detailing can play a key role in assessing and implementing comparative effectiveness research findings.

    PubMed

    Fischer, Michael A; Avorn, Jerry

    2012-10-01

    Comparative effectiveness research evaluates the relative effectiveness, safety, and value of competing treatment options in clinically realistic settings. Such evaluations can be methodologically complex and difficult to interpret. There will be a growing need for critical evaluation of comparative effectiveness studies to assess the adequacy of their design and to put new information into a broader context. Equally important, this knowledge will have to be communicated to clinicians in a way that will actually change practice. We identify three challenges to effective dissemination of comparative effectiveness research findings: the difficulty of interpreting comparative effectiveness research data, the need for trusted sources of information, and the challenge of turning research results into clinical action. We suggest that academic detailing (direct outreach education that gives clinicians an accurate and unbiased synthesis of the best evidence for practice in a given clinical area) can translate comparative effectiveness research findings into actions that improve health care decision making and patient outcomes.

  13. A Generalized Detailed Balance Relation

    NASA Astrophysics Data System (ADS)

    Ruelle, David

    2016-08-01

    Given a system M in a thermal bath we obtain a generalized detailed balance relation for the ratio r = π_τ(K→J)/π_τ(J→K) of the transition probabilities M: J→K and M: K→J in time τ. We assume an active bath, containing solute molecules in metastable states. These molecules may react with M, and the transition J→K occurs through different channels α involving different reactions with the bath. We find that r = Σ_α p^α r^α, where p^α is the probability that channel α occurs, and r^α depends on the amount of heat (more precisely, enthalpy) released to the bath in channel α.
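As a toy numerical illustration of the form of this relation (with hypothetical enthalpies, units where kB = 1, and a sign convention chosen purely for the example, none of which are taken from the paper), the multi-channel ratio is simply the p^α-weighted average of per-channel detailed-balance factors:

```python
import math

def channel_ratio(dH, T, kB=1.0):
    """Single-channel detailed-balance factor exp(-dH / kB T); the sign
    convention for the released enthalpy dH is illustrative only."""
    return math.exp(-dH / (kB * T))

def generalized_ratio(probs, enthalpies, T):
    """r = sum over alpha of p^alpha * r^alpha, as quoted in the abstract."""
    assert abs(sum(probs) - 1.0) < 1e-12, "channel probabilities must sum to 1"
    return sum(p * channel_ratio(dH, T) for p, dH in zip(probs, enthalpies))

# Two hypothetical channels releasing different amounts of heat:
r = generalized_ratio([0.3, 0.7], [1.0, -0.5], T=1.0)
print(r)
```

With a single channel (p = 1) the expression collapses back to ordinary detailed balance, which is the consistency check worth keeping in mind when reading the abstract.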

  14. Recovering and Preventing Loss of Detailed Memory: Differential Rates of Forgetting for Detail Types in Episodic Memory

    ERIC Educational Resources Information Center

    Sekeres, Melanie J.; Bonasia, Kyra; St-Laurent, Marie; Pishdadian, Sara; Winocur, Gordon; Grady, Cheryl; Moscovitch, Morris

    2016-01-01

    Episodic memories undergo qualitative changes with time, but little is known about how different aspects of memory are affected. Different types of information in a memory, such as perceptual detail, and central themes, may be lost at different rates. In patients with medial temporal lobe damage, memory for perceptual details is severely impaired,…

  15. 78 FR 34604 - Submitting Complete and Accurate Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-10

    .... Nuclear Regulatory Commission (NRC) is publishing for comment a petition for rulemaking (PRM) filed with... into ADAMS. The Petition The NRC has received a PRM (ADAMS Accession No. ML13113A443) requesting the... been docketed as PRM-50-107. The full text of the incoming petition is available at...

  16. Quantification of the Information Limit of Transmission Electron Microscopes

    SciTech Connect

    Barthel, J.; Thust, A.

    2008-11-14

    The resolving power of high-resolution transmission electron microscopes is characterized by the information limit, which reflects the size of the smallest object detail observable with a particular instrument. We introduce a highly accurate measurement method for the information limit, which is suitable for modern aberration-corrected electron microscopes. An experimental comparison with the traditionally applied Young's fringe method yields severe discrepancies and confirms theoretical considerations according to which the Young's fringe method does not reveal the information limit.

  17. Accurate, reliable prototype earth horizon sensor head

    NASA Technical Reports Server (NTRS)

    Schwarz, F.; Cohen, H.

    1973-01-01

    The design and performance are described of an accurate and reliable prototype earth sensor head (ARPESH). The ARPESH employs a detection logic 'locator' concept and horizon sensor mechanization which should lead to high-accuracy horizon sensing that is minimally degraded by spatial or temporal variations in sensing attitude from a satellite in orbit around the earth at altitudes around 500 km. An accuracy of horizon location to within 0.7 km has been predicted, independent of meteorological conditions; this corresponds to an error of 0.015 deg at 500 km altitude. Laboratory evaluation of the sensor indicates that this accuracy is achieved. First, the basic operating principles of ARPESH are described; next, detailed design and construction data are presented; and then the performance of the sensor is reported under laboratory conditions, with the sensor installed in a simulator that permits it to scan over a blackbody source against a background representing the earth-space interface for various equivalent planet temperatures.

  18. A new debate for Turkish physicians: e-detailing.

    PubMed

    Ventura, Keti; Baybars, Miray; Dedeoglu, Ayla Ozhan

    2012-01-01

    The study presents an empirical analysis of the attitudes of Turkish physicians towards e-detailing practices compared to face-to-face detailing. The findings reveal that although physicians have positive attitudes toward e-detailing, on some points they are still undecided and/or have doubts. The structural model revealed that affect, convenience, and informative content influence their attitude in a positive manner, whereas personal interaction was found to be a negative factor. Physicians' age and the frequency of calls received from representatives act as moderators. The present study can be seen as a contribution to pharmaceutical marketing, an under-researched field in Turkey, and to e-detailing in particular.

  19. Accurate stochastic reconstruction of heterogeneous microstructures by limited x-ray tomographic projections.

    PubMed

    Li, Hechao; Kaira, Shashank; Mertens, James; Chawla, Nikhilesh; Jiao, Yang

    2016-12-01

    An accurate knowledge of the complex microstructure of a heterogeneous material is crucial for its performance prediction, prognosis and optimization. X-ray tomography has provided a nondestructive means for microstructure characterization in 3D and 4D (i.e. structural evolution over time), in which a material is typically reconstructed from a large number of tomographic projections using the filtered-back-projection (FBP) method or algebraic reconstruction techniques (ART). Here, we present in detail a stochastic optimization procedure that enables one to accurately reconstruct material microstructure from a small number of absorption-contrast x-ray tomographic projections. This discrete tomography reconstruction procedure is in contrast to the commonly used FBP and ART, which usually require thousands of projections for accurate microstructure rendition. The utility of our stochastic procedure is first demonstrated by reconstructing a wide class of two-phase heterogeneous materials, including sandstone and hard-particle packing, from simulated limited-angle projections in both cone-beam and parallel-beam projection geometry. It is then applied to reconstruct tailored Sn-sphere-clay-matrix systems from limited-angle cone-beam data obtained via a lab-scale tomography facility at Arizona State University and parallel-beam synchrotron data obtained at the Advanced Photon Source, Argonne National Laboratory. In addition, we examine the information content of tomography data by successively incorporating a larger number of projections and quantifying the accuracy of the reconstructions. We show that only a small number of projections (e.g. 20-40, depending on the complexity of the microstructure of interest and the desired resolution) are necessary for accurate material reconstructions via our stochastic procedure, which indicates its high efficiency in using limited structural information.
    The ramifications of the stochastic reconstruction procedure in 4D materials science are also discussed.
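A heavily simplified sketch of the flavor of such a stochastic reconstruction is given below: binary pixels, two orthogonal parallel-beam "projections" (row and column sums), and greedy pixel swaps that never worsen the projection mismatch. The authors' actual procedure, beam geometry, and objective function are far more sophisticated; this only shows the optimization loop's shape, with a hypothetical 6x6 ground truth:

```python
import random

def projections(img):
    """Row and column sums: two orthogonal parallel-beam 'projections'."""
    rows = [sum(r) for r in img]
    cols = [sum(c) for c in zip(*img)]
    return rows, cols

def mismatch(img, target):
    """L1 distance between the image's projections and the target projections."""
    rows, cols = projections(img)
    trows, tcols = target
    return (sum(abs(a - b) for a, b in zip(rows, trows))
            + sum(abs(a - b) for a, b in zip(cols, tcols)))

def reconstruct(target, n, steps=20000, seed=0):
    """Greedy stochastic reconstruction: swap a 0-pixel with a 1-pixel and
    keep the move whenever the projection mismatch does not increase."""
    rng = random.Random(seed)
    total = sum(target[0])  # total number of solid pixels is known
    cells = [(i, j) for i in range(n) for j in range(n)]
    img = [[0] * n for _ in range(n)]
    for (i, j) in rng.sample(cells, total):  # random initial configuration
        img[i][j] = 1
    err = mismatch(img, target)
    for _ in range(steps):
        (i1, j1), (i2, j2) = rng.sample(cells, 2)
        if img[i1][j1] == img[i2][j2]:
            continue
        img[i1][j1], img[i2][j2] = img[i2][j2], img[i1][j1]
        new_err = mismatch(img, target)
        if new_err <= err:
            err = new_err
        else:  # reject the move: undo the swap
            img[i1][j1], img[i2][j2] = img[i2][j2], img[i1][j1]
        if err == 0:
            break
    return img, err

truth = [[1 if (i + j) % 3 == 0 else 0 for j in range(6)] for i in range(6)]
img, err = reconstruct(projections(truth), 6)
print(err)
```

With only two projection directions many configurations share the same projections, which is exactly why the paper stresses how many projections (20-40) are needed before the reconstruction becomes unambiguous.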

  20. Accurate Method for Determining Adhesion of Cantilever Beams

    SciTech Connect

    Michalske, T.A.; de Boer, M.P.

    1999-01-08

    Using surface micromachined samples, we demonstrate the accurate measurement of cantilever beam adhesion by using test structures which are adhered over long attachment lengths. We show that this configuration has a deep energy well, such that a fracture equilibrium is easily reached. When compared to the commonly used method of determining the shortest attached beam, the present method is much less sensitive to variations in surface topography or to details of capillary drying.

  1. Detailed Aerosol Characterization using Polarimetric Measurements

    NASA Astrophysics Data System (ADS)

    Hasekamp, Otto; di Noia, Antonio; Stap, Arjen; Rietjens, Jeroen; Smit, Martijn; van Harten, Gerard; Snik, Frans

    2016-04-01

    Anthropogenic aerosols are believed to cause the second most important anthropogenic forcing of climate change after greenhouse gases. In contrast to the climate effect of greenhouse gases, which is understood relatively well, the negative forcing (cooling effect) caused by aerosols represents the largest reported uncertainty in the most recent assessment of the Intergovernmental Panel on Climate Change (IPCC). To reduce the large uncertainty on the aerosol effects on cloud formation and climate, accurate satellite measurements of aerosol optical properties (optical thickness, single scattering albedo, phase function) and microphysical properties (size distribution, refractive index, shape) are essential. There is growing consensus in the aerosol remote sensing community that multi-angle measurements of intensity and polarization are essential to unambiguously determine all relevant aerosol properties. This presentation addresses the different aspects of polarimetric remote sensing of atmospheric aerosols, including retrieval algorithm development, validation, and data needs for climate and air quality applications. In past years, retrieval algorithms have been developed at SRON Netherlands Institute for Space Research that make full use of the capabilities of polarimetric measurements. We will show results of detailed aerosol properties from ground-based (groundSPEX), airborne (NASA Research Scanning Polarimeter), and satellite (POLDER) measurements. We will also discuss observational needs for future instrumentation in order to improve our understanding of the role of aerosols in climate change and air quality.

  2. What input data are needed to accurately model electromagnetic fields from mobile phone base stations?

    PubMed

    Beekhuizen, Johan; Kromhout, Hans; Bürgi, Alfred; Huss, Anke; Vermeulen, Roel

    2015-01-01

    The increase in mobile communication technology has led to concern about potential health effects of radio frequency electromagnetic fields (RF-EMFs) from mobile phone base stations. Different RF-EMF prediction models have been applied to assess population exposure to RF-EMF. Our study examines what input data are needed to accurately model RF-EMF, as detailed data are not always available for epidemiological studies. We used NISMap, a 3D radio wave propagation model, to test models with various levels of detail in building and antenna input data. The model outcomes were compared with outdoor measurements taken in Amsterdam, the Netherlands. Results showed good agreement between modelled and measured RF-EMF when 3D building data and basic antenna information (location, height, frequency and direction) were used: Spearman correlations were >0.6. Model performance was not sensitive to changes in building damping parameters. Antenna-specific information about down-tilt, type and output power did not significantly improve model performance compared with using average down-tilt and power values, or assuming one standard antenna type. We conclude that 3D radio wave propagation modelling is a feasible approach to predict outdoor RF-EMF levels for ranking exposure levels in epidemiological studies, when 3D building data and information on the antenna height, frequency, location and direction are available.
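The agreement metric quoted above (Spearman correlations > 0.6 between modelled and measured RF-EMF) is a rank correlation, which rewards correct exposure *ranking* rather than exact field values. A minimal pure-Python implementation follows; the field-strength numbers are hypothetical, not the study's data:

```python
def rankdata(values):
    """Average ranks (1-based); tied values share their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx) ** 0.5
    vy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (vx * vy)

# Hypothetical modelled vs measured field strengths (V/m):
modelled = [0.10, 0.25, 0.18, 0.40, 0.05, 0.33]
measured = [0.12, 0.22, 0.20, 0.45, 0.07, 0.30]
print(spearman(modelled, measured))
```

Because only ranks enter the statistic, a model with a systematic bias can still score highly, which is why the authors frame their model as suitable for *ranking* exposure in epidemiological studies.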

  3. Accurate On-Line Intervention Practices for Efficient Improvement of Reading Skills in Africa

    ERIC Educational Resources Information Center

    Marshall, Minda B.

    2016-01-01

    Lifelong learning is the only way to sustain proficient learning in a rapidly changing world. Knowledge and information are exploding across the globe. We need accurate ways to facilitate the process of drawing external factual information into an internal perceptive advantage from which to interpret and argue new information. Accurate and…

  4. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2013-07-01 2013-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  5. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2012-07-01 2012-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  6. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  7. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2014-07-01 2014-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  8. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2011-07-01 2011-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  9. Identification of potential surgical site infections leveraging an enterprise clinical information warehouse.

    PubMed

    Santangelo, Jennifer; Erdal, Selnur; Wellington, Linda; Mekhjian, Hagop; Kamal, Jyoti

    2008-11-06

    At The Ohio State University Medical Center (OSUMC), infection control practitioners (ICPs) need an accurate list of patients undergoing defined operative procedures to track surgical site infections. Using data from the OSUMC Information Warehouse (IW), we have created an automated report detailing the required data. This report also displays associated surgical and pathology text and dictated reports, providing additional information to the ICPs.

  10. Accurate fundamental parameters for 23 bright solar-type stars

    NASA Astrophysics Data System (ADS)

    Bruntt, H.; Bedding, T. R.; Quirion, P.-O.; Lo Curto, G.; Carrier, F.; Smalley, B.; Dall, T. H.; Arentoft, T.; Bazot, M.; Butler, R. P.

    2010-07-01

    We combine results from interferometry, asteroseismology and spectroscopy to determine accurate fundamental parameters of 23 bright solar-type stars, from spectral type F5 to K2 and luminosity classes III-V. For some stars we can use direct techniques to determine the mass, radius, luminosity and effective temperature, and we compare with indirect methods that rely on photometric calibrations or spectroscopic analyses. We use the asteroseismic information available in the literature to infer an indirect mass with an accuracy of 4-15 per cent. From indirect methods we determine luminosity and radius to 3 per cent. We find evidence that the luminosity from the indirect method is slightly overestimated (~ 5 per cent) for the coolest stars, indicating that their bolometric corrections (BCs) are too negative. For Teff we find a slight offset of -40 +/- 20 K between the spectroscopic method and the direct method, meaning the spectroscopic temperatures are too high. From the spectroscopic analysis we determine the detailed chemical composition for 13 elements, including Li, C and O. The metallicity ranges from [Fe/H] = -1.7 to +0.4, and there is clear evidence for α-element enhancement in the metal-poor stars. We find no significant offset between the spectroscopic surface gravity and the value from combining asteroseismology with radius estimates. From the spectroscopy we also determine v sin i and we present a new calibration of macroturbulence and microturbulence. From the comparison between the results from the direct and spectroscopic methods we claim that we can determine Teff, log g and [Fe/H] with absolute accuracies of 80 K, 0.08 dex and 0.07 dex. Photometric calibrations of Strömgren indices provide accurate results for Teff and [Fe/H] but will be more uncertain for distant stars when interstellar reddening becomes important. The indirect methods are important to obtain reliable estimates of the fundamental parameters of relatively faint stars when interferometry is not available.
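One widely used route to an asteroseismic mass and radius (not necessarily the exact calibration used in this paper) is the standard scaling relations in ν_max, Δν and Teff. The solar reference values below are common literature choices, quoted here as assumptions:

```python
# Commonly used solar reference values (assumed, conventions vary slightly):
NU_MAX_SUN = 3090.0   # frequency of maximum oscillation power, muHz
DELTA_NU_SUN = 135.1  # large frequency separation, muHz
TEFF_SUN = 5777.0     # effective temperature, K

def scaling_mass(nu_max, delta_nu, teff):
    """M/Msun from the standard asteroseismic scaling relations."""
    return ((nu_max / NU_MAX_SUN) ** 3
            * (delta_nu / DELTA_NU_SUN) ** -4
            * (teff / TEFF_SUN) ** 1.5)

def scaling_radius(nu_max, delta_nu, teff):
    """R/Rsun from the standard asteroseismic scaling relations."""
    return ((nu_max / NU_MAX_SUN)
            * (delta_nu / DELTA_NU_SUN) ** -2
            * (teff / TEFF_SUN) ** 0.5)

print(scaling_mass(NU_MAX_SUN, DELTA_NU_SUN, TEFF_SUN))  # 1.0 by construction
```

Because the mass scales with the cube of ν_max and the inverse fourth power of Δν, modest observational errors propagate strongly, consistent with the 4-15 per cent mass accuracies quoted in the abstract.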

  11. A Detailed Chemical Kinetic Model for TNT

    SciTech Connect

    Pitz, W J; Westbrook, C K

    2005-01-13

    A detailed chemical kinetic mechanism for 2,4,6-trinitrotoluene (TNT) has been developed to explore problems of explosive performance and soot formation during the destruction of munitions. The TNT mechanism treats only gas-phase reactions. Reactions for the decomposition of TNT and for the consumption of intermediate products formed from TNT are assembled based on information from the literature and on current understanding of aromatic chemistry. Thermodynamic properties of intermediate and radical species are estimated by group additivity. Reaction paths are developed based on similar paths for aromatic hydrocarbons. Reaction-rate constant expressions are estimated from the literature and from analogous reactions where the rate constants are available. The detailed reaction mechanism for TNT is added to existing reaction mechanisms for RDX and for hydrocarbons. Computed results show the effect of oxygen concentration on the amount of soot precursors formed in the combustion of RDX and TNT mixtures in N2/O2 atmospheres.
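Group additivity, mentioned above for estimating thermodynamic properties, amounts to summing tabulated per-group contributions. The sketch below uses two illustrative Benson-style group values for a simple alkane (not a nitroaromatic, and not fitted data from this mechanism) just to show the bookkeeping:

```python
# Illustrative Benson-style group contributions to the enthalpy of
# formation (kJ/mol); hypothetical table entries for demonstration only.
GROUP_VALUES = {
    "C-(H)3(C)": -42.2,   # methyl group bonded to one carbon
    "C-(H)2(C)2": -20.6,  # methylene group bonded to two carbons
}

def estimate_enthalpy(groups):
    """Group-additivity estimate: sum of group contributions times counts."""
    return sum(GROUP_VALUES[g] * count for g, count in groups.items())

# n-butane decomposes into 2 methyl + 2 methylene groups:
print(estimate_enthalpy({"C-(H)3(C)": 2, "C-(H)2(C)2": 2}))
```

For the radical and ring-substituted species in a TNT mechanism, the real tables carry many more group types plus ring-strain and interaction corrections, but the additive structure is the same.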

  12. Towards Accurate Application Characterization for Exascale (APEX)

    SciTech Connect

    Hammond, Simon David

    2015-09-01

    Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyping of clusters as far back as the CPLANT machines, and many large capability resources including ASCI Red and RedStorm, were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. Primarily, the research was intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role: evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and directly engaging with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia’s production users/developers.

  13. The MATPHOT Algorithm for Accurate and Precise Stellar Photometry and Astrometry Using Discrete Point Spread Functions

    NASA Astrophysics Data System (ADS)

    Mighell, K. J.

    2004-12-01

    I describe the key features of my MATPHOT algorithm for accurate and precise stellar photometry and astrometry using discrete Point Spread Functions. A discrete Point Spread Function (PSF) is a sampled version of a continuous two-dimensional PSF. The shape information about the photon scattering pattern of a discrete PSF is typically encoded using a numerical table (matrix) or a FITS image file. The MATPHOT algorithm shifts discrete PSFs within an observational model using a 21-pixel-wide damped sinc function and position partial derivatives are computed using a five-point numerical differentiation formula. The MATPHOT algorithm achieves accurate and precise stellar photometry and astrometry of undersampled CCD observations by using supersampled discrete PSFs that are sampled 2, 3, or more times more finely than the observational data. I have written a C-language computer program called MPD which is based on the current implementation of the MATPHOT algorithm; all source code and documentation for MPD and support software is freely available at the following website: http://www.noao.edu/staff/mighell/matphot . I demonstrate the use of MPD and present a detailed MATPHOT analysis of simulated James Webb Space Telescope observations which demonstrates that millipixel relative astrometry and millimag photometric accuracy is achievable with very complicated space-based discrete PSFs. This work was supported by a grant from the National Aeronautics and Space Administration (NASA), Interagency Order No. S-13811-G, which was awarded by the Applied Information Systems Research (AISR) Program of NASA's Science Mission Directorate.
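The abstract mentions that position partial derivatives are computed with a five-point numerical differentiation formula. A common five-point central-difference formula (assumed here; the MATPHOT implementation may use a variant) is f'(x) ≈ [-f(x+2h) + 8f(x+h) - 8f(x-h) + f(x-2h)]/(12h), which is fourth-order accurate:

```python
import math

def five_point_derivative(f, x, h=1e-2):
    """Five-point central-difference approximation to f'(x):
    f'(x) ~ (-f(x+2h) + 8f(x+h) - 8f(x-h) + f(x-2h)) / (12h),
    with truncation error of order h**4."""
    return (-f(x + 2 * h) + 8 * f(x + h) - 8 * f(x - h) + f(x - 2 * h)) / (12 * h)

# Sanity check against a known derivative: d/dx sin(x) = cos(x)
approx = five_point_derivative(math.sin, 0.3)
print(approx, math.cos(0.3))
```

The same idea applies to a sampled PSF: the stencil is evaluated on (shifted) pixel values rather than an analytic function, which is why accurate sub-pixel shifting (the damped sinc interpolation in the abstract) matters so much.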

  14. Enabling Detailed Energy Analyses via the Technology Performance Exchange: Preprint

    SciTech Connect

    Studer, D.; Fleming, K.; Lee, E.; Livingood, W.

    2014-08-01

    One of the key tenets to increasing adoption of energy efficiency solutions in the built environment is improving confidence in energy performance. Current industry practices make extensive use of predictive modeling, often via the use of sophisticated hourly or sub-hourly energy simulation programs, to account for site-specific parameters (e.g., climate zone, hours of operation, and space type) and arrive at a performance estimate. While such methods are highly precise, they invariably provide less than ideal accuracy due to a lack of high-quality, foundational energy performance input data. The Technology Performance Exchange was constructed to allow the transparent sharing of foundational, product-specific energy performance data, and leverages significant, external engineering efforts and a modular architecture to efficiently identify and codify the minimum information necessary to accurately predict product energy performance. This strongly-typed database resource represents a novel solution to a difficult and established problem. One of the most exciting benefits is the way in which the Technology Performance Exchange's application programming interface has been leveraged to integrate contributed foundational data into the Building Component Library. Via a series of scripts, data is automatically translated and parsed into the Building Component Library in a format that is immediately usable to the energy modeling community. This paper (1) presents a high-level overview of the project drivers and the structure of the Technology Performance Exchange; (2) offers a detailed examination of how technologies are incorporated and translated into powerful energy modeling code snippets; and (3) examines several benefits of this robust workflow.

  15. Numerical Methodology for Coupled Time-Accurate Simulations of Primary and Secondary Flowpaths in Gas Turbines

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Athavale, M. M.; Hendricks, R. C.; Steinetz, B. M.

    2006-01-01

    Detailed information on the flow fields in the secondary flowpaths, and on their interaction with the primary flows in gas turbine engines, is necessary for successful designs with optimized secondary flow streams. The present work is focused on the development of a simulation methodology for coupled time-accurate solutions of the two flowpaths. The secondary flowstream is treated using SCISEAL, an unstructured adaptive Cartesian grid code developed for secondary flows and seals, while the main-path flow is solved using TURBO, a density-based code with the capability of resolving rotor-stator interaction in multi-stage machines. An interface is being tested that links the two codes at the rim seal to allow data exchange between the two codes for parallel, coupled execution. A description of the coupling methodology and the current status of the interface development are presented. Representative steady-state solutions of the secondary flow in the UTRC HP Rig disc cavity are also presented.

  16. [Guidelines for Accurate and Transparent Health Estimates Reporting: the GATHER Statement].

    PubMed

    Stevens, Gretchen A; Alkema, Leontine; Black, Robert E; Boerma, J Ties; Collins, Gary S; Ezzati, Majid; Grove, John T; Hogan, Daniel R; Hogan, Margaret C; Horton, Richard; Lawn, Joy E; Marušic, Ana; Mathers, Colin D; Murray, Christopher J L; Rudan, Igor; Salomon, Joshua A; Simpson, Paul J; Vos, Theo; Welch, Vivian

    2017-01-01

    Measurements of health indicators are rarely available for every population and period of interest, and available data may not be comparable. The Guidelines for Accurate and Transparent Health Estimates Reporting (GATHER) define best reporting practices for studies that calculate health estimates for multiple populations (in time or space) using multiple information sources. Health estimates that fall within the scope of GATHER include all quantitative population-level estimates (including global, regional, national, or subnational estimates) of health indicators, including indicators of health status, incidence and prevalence of diseases, injuries, and disability and functioning; and indicators of health determinants, including health behaviours and health exposures. GATHER comprises a checklist of 18 items that are essential for best reporting practice. A more detailed explanation and elaboration document, describing the interpretation and rationale of each reporting item along with examples of good reporting, is available on the GATHER website (http://gather-statement.org).

  17. Accurate oscillator strengths for ultraviolet lines of Ar I - Implications for interstellar material

    NASA Technical Reports Server (NTRS)

    Federman, S. R.; Beideck, D. J.; Schectman, R. M.; York, D. G.

    1992-01-01

    Analysis of absorption from interstellar Ar I in lightly reddened lines of sight provides information on the warm and hot components of the interstellar medium near the sun. The details of the analysis are limited by the quality of the atomic data. Accurate oscillator strengths for the Ar I lines at 1048 and 1067 A and the astrophysical implications are presented. From lifetimes measured with beam-foil spectroscopy, an f-value for 1048 A of 0.257 +/- 0.013 is obtained. Through the use of a semiempirical formalism for treating singlet-triplet mixing, an oscillator strength of 0.064 +/- 0.003 is derived for 1067 A. Because of the accuracy of the results, the conclusions of York and colleagues from spectra taken with the Copernicus satellite are strengthened. In particular, for interstellar gas in the solar neighborhood, argon has a solar abundance, and the warm, neutral material is not pervasive.
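    For context, the standard relation used to turn a measured radiative lifetime into an absorption oscillator strength (stated here as background, not quoted from the paper) is, with the wavelength in angstroms:

```latex
\tau_k = \frac{1}{\sum_i A_{ki}}, \qquad
f_{ik} = 1.499 \times 10^{-16}\, \lambda^2[\text{\AA}]\,
         \frac{g_k}{g_i}\, A_{ki}[\mathrm{s^{-1}}]
```

    Summing the Einstein A coefficients over all decay channels gives the inverse lifetime, so a beam-foil lifetime measurement fixes A_ki and hence, via the statistical weights g_k and g_i, the f-value.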

  18. Cornice Detail of Rake, Cornice Detail of Eave, Wood DoubleHung ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Cornice Detail of Rake, Cornice Detail of Eave, Wood Double-Hung Window Details, Wood Door Details - Boxley Grist Mill, Boxley vicinity on State Route 43, Buffalo National River, Ponca, Newton County, AR

  19. Study on detailed geological modelling for fluvial sandstone reservoir in Daqing oil field

    SciTech Connect

    Zhao Hanqing; Fu Zhiguo; Lu Xiaoguang

    1997-08-01

    Guided by sedimentation theory and knowledge of modern and ancient fluvial deposition, and utilizing the abundant information on sedimentary series, microfacies types, and petrophysical parameters from the well-logging curves of thousands of closely spaced wells over a large area, a new method for establishing detailed sedimentation and permeability distribution models for fluvial reservoirs has been developed. This study aimed at the geometry and internal architecture of sandbodies, in accordance with their hierarchical levels of heterogeneity, building up sedimentation and permeability distribution models of fluvial reservoirs and describing reservoir heterogeneity in light of fluvial sedimentary rules. The results and methods obtained in outcrop and modern-sedimentation studies successfully supported this work. Taking advantage of this method, the major producing layers (PI{sub 1-2}), thick fluvial reservoirs that have been considered heterogeneous and laterally extensive, are examined in detail. These layers are subdivided into single sedimentary units vertically, and the microfacies are identified horizontally. Furthermore, a complex system is recognized according to hierarchical levels from large to small: meander belt, single channel sandbody, meander scroll, point bar, and lateral accretion bodies of the point bar. The results improve the description of the areal distribution of point-bar sandbodies and provide an accurate and detailed framework for establishing a high-resolution predictive model. Combined with geostatistical techniques, the method also plays an important role in searching for enriched zones of residual oil.

  20. A Detailed Modeling Study of Propane Oxidation

    SciTech Connect

    Westbrook, C K; Jayaweera, T M; Pitz, W J; Curran, H J

    2004-03-19

    A detailed chemical kinetic mechanism has been used to simulate ignition delay times recorded by a number of experimental shock tube studies over the temperature range 900 ≤ T ≤ 1800 K, in the pressure range 0.75-40 atm and in the equivalence ratio range 0.5 ≤ φ ≤ 2.0. Flame speed measurements at 1 atm in the equivalence ratio range 0.4 ≤ φ ≤ 1.8 have also been simulated. Both of these data sets, particularly those recorded at high pressure, are of particular importance in validating a kinetic mechanism, as internal combustion engines operate at elevated pressures and temperatures and rates of fuel oxidation are critical to efficient system operation. Experiments in which reactant, intermediate and product species were quantitatively recorded, versus temperature in a jet-stirred reactor (JSR) and versus time in a flow reactor, are also simulated. These data provide a stringent test of the kinetic mechanism, as it must reproduce accurate quantitative profiles for all reactant, intermediate and product species. The JSR experiments were performed in the temperature range 1000-1110 K, in the equivalence ratio range 0.5 ≤ φ ≤ 4.0, at a pressure of 5 atm. These experiments are complemented by those carried out in a flow reactor in the temperature range 660-820 K, at 10 atm and at an equivalence ratio of 0.4. In addition, burner-stabilized flames were simulated, where chemical species profiles were measured at atmospheric pressure for two propane-air flat flames. Overall, reasonably good agreement is observed between the model simulations and the experimental results.

  1. Detailed Globes Enhance Education and Recreation

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Orbis World Globes creates inflatable globes, EarthBalls, in many sizes that depict Earth as it is seen from space, complete with atmospheric cloud cover. Orbis designs and produces the most visually authentic replicas of Earth ever created, and NASA took notice, employing a 16-inch diameter EarthBall for an educational film it made aboard the STS-45 shuttle mission. Orbis later collaborated with NASA to create two 16-foot diameter world globes for display at the 2002 Olympic Winter Games in Salt Lake City, using more detailed satellite imagery. The satellite image now printed on all Orbis globes displays 1-kilometer resolution and is 21,600 by 43,200 pixels in size, and Orbis globes are otherwise meteorologically accurate, though the cloud cover has been slightly reduced in order for most of the landforms to be visible. Orbis also developed the exclusive NightGlow Cities feature, enabling EarthBalls to display the world's cities as they appear as the Earth revolves from daylight into night. Orbis inflatable globes are available in sizes from 1 to 100 feet in diameter, with the most common being the standard 16-inch and 1-meter diameter EarthBalls. Applications include educational uses from preschools to universities, games, and a variety of display purposes at conferences, trade shows, festivals, concerts, and parades. A 16-foot diameter Orbis globe was exhibited at the United Nations' World Urban Forum, in Vancouver, Canada; the Space 2006 conference, in San Jose, California; and the X-Prize Cup Personal Spaceflight Exposition in Las Cruces, New Mexico.

  2. Accurate lineshape spectroscopy and the Boltzmann constant

    PubMed Central

    Truong, G.-W.; Anstie, J. D.; May, E. F.; Stace, T. M.; Luiten, A. N.

    2015-01-01

    Spectroscopy has an illustrious history delivering serendipitous discoveries and providing a stringent testbed for new physical predictions, including applications from trace materials detection, to understanding the atmospheres of stars and planets, and even constraining cosmological models. Reaching fundamental-noise limits permits optimal extraction of spectroscopic information from an absorption measurement. Here, we demonstrate a quantum-limited spectrometer that delivers high-precision measurements of the absorption lineshape. These measurements yield a very accurate determination of the excited-state (6P1/2) hyperfine splitting in Cs and reveal a breakdown in the well-known Voigt spectral profile. We develop a theoretical model that accounts for this breakdown, explaining the observations to within the shot-noise limit. Our model enables us to infer the thermal velocity dispersion of the Cs vapour with an uncertainty of 35 p.p.m. within an hour. This allows us to determine a value for Boltzmann's constant with a precision of 6 p.p.m., and an uncertainty of 71 p.p.m. PMID:26465085

  3. Accurate upper body rehabilitation system using kinect.

    PubMed

    Sinha, Sanjana; Bhowmick, Brojeshwar; Chakravarty, Kingshuk; Sinha, Aniruddha; Das, Abhijit

    2016-08-01

    The growing importance of Kinect as a tool for clinical assessment and rehabilitation is due to its portability, low cost and markerless system for human motion capture. However, the accuracy of Kinect in measuring three-dimensional body joint center locations often fails to meet clinical standards when compared to marker-based motion capture systems such as Vicon. The length of the body segment connecting any two joints, measured as the distance between three-dimensional Kinect skeleton joint coordinates, has been observed to vary with time. The orientation of the line connecting adjoining Kinect skeletal coordinates has also been seen to differ from the actual orientation of the physical body segment. Hence we have proposed an optimization method that utilizes Kinect depth and RGB information to search for the joint center location that satisfies constraints on body segment length as well as orientation. An experimental study has been carried out on ten healthy participants performing upper body range of motion exercises. The results report a 72% reduction in body segment length variance and a 2° improvement in Range of Motion (ROM) angle, enabling more accurate measurements for upper limb exercises.
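    A minimal sketch of the segment-length constraint idea, assuming a pre-calibrated segment length and using only the observed joint direction (this is not the authors' full depth-and-RGB optimization):

```python
# Minimal sketch, not the authors' actual method: one simple way to impose
# a fixed body-segment length on noisy Kinect joint estimates is to project
# the child joint onto a sphere of the calibrated segment length centred on
# the parent joint, keeping the observed direction.

import math

def enforce_segment_length(parent, child, target_length):
    """Return the child joint repositioned at exactly target_length from parent."""
    delta = [c - p for p, c in zip(parent, child)]
    norm = math.sqrt(sum(d * d for d in delta))
    if norm == 0.0:
        raise ValueError("parent and child coincide; direction undefined")
    return [p + d / norm * target_length for p, d in zip(parent, delta)]

# Noisy shoulder -> elbow estimate whose apparent length drifts per frame
# (coordinates in metres, all values illustrative):
shoulder = [0.0, 1.4, 2.0]
elbow_noisy = [0.05, 1.1, 2.1]
elbow_fixed = enforce_segment_length(shoulder, elbow_noisy, target_length=0.30)
length = math.sqrt(sum((e - s) ** 2 for e, s in zip(elbow_fixed, shoulder)))
print(round(length, 6))
```

The full method additionally constrains segment orientation using the depth and RGB streams; this sketch only shows why fixing the bone length removes the frame-to-frame length variance.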

  4. Accurate, reproducible measurement of blood pressure.

    PubMed Central

    Campbell, N R; Chockalingam, A; Fodor, J G; McKay, D W

    1990-01-01

    The diagnosis of mild hypertension and the treatment of hypertension require accurate measurement of blood pressure. Blood pressure readings are altered by various factors that influence the patient, the techniques used and the accuracy of the sphygmomanometer. The variability of readings can be reduced if informed patients prepare in advance by emptying their bladder and bowel, by avoiding over-the-counter vasoactive drugs the day of measurement and by avoiding exposure to cold, caffeine consumption, smoking and physical exertion within half an hour before measurement. The use of standardized techniques to measure blood pressure will help to avoid large systematic errors. Poor technique can account for differences in readings of more than 15 mm Hg and ultimately misdiagnosis. Most of the recommended procedures are simple and, when routinely incorporated into clinical practice, require little additional time. The equipment must be appropriate and in good condition. Physicians should have a suitable selection of cuff sizes readily available; the use of the correct cuff size is essential to minimize systematic errors in blood pressure measurement. Semiannual calibration of aneroid sphygmomanometers and annual inspection of mercury sphygmomanometers and blood pressure cuffs are recommended. We review the methods recommended for measuring blood pressure and discuss the factors known to produce large differences in blood pressure readings. PMID:2192791

  5. Acoustic emission monitoring for assessment of steel bridge details

    SciTech Connect

    Kosnik, D. E.; Corr, D. J.; Hopwood, T.

    2011-06-23

    Acoustic emission (AE) testing was deployed on details of two large steel Interstate Highway bridges: one cantilever through-truss and one trapezoidal box girder bridge. Quantitative measurements of activity levels at known and suspected crack locations were made by monitoring AE under normal service loads (e.g., live traffic and wind). AE indications were used to direct application of radiography, resulting in identification of a previously unknown flaw, and to inform selection of a retrofit detail.

  6. Spectroscopically Accurate Line Lists for Application in Sulphur Chemistry

    NASA Astrophysics Data System (ADS)

    Underwood, D. S.; Azzam, A. A. A.; Yurchenko, S. N.; Tennyson, J.

    2013-09-01

    Monitoring sulphur chemistry is thought to be of great importance for exoplanets. Doing this requires detailed knowledge of the spectroscopic properties of sulphur containing molecules such as hydrogen sulphide (H2S) [1], sulphur dioxide (SO2), and sulphur trioxide (SO3). Each of these molecules can be found in terrestrial environments, produced in volcano emissions on Earth, and analysis of their spectroscopic data can prove useful to the characterisation of exoplanets, as well as the study of planets in our own solar system, with both having a possible presence on Venus. A complete, high-temperature list of line positions and intensities for H₂³²S is presented. The DVR3D program suite is used to calculate the bound ro-vibration energy levels, wavefunctions, and dipole transition intensities using Radau coordinates. The calculations are based on a newly determined, spectroscopically refined potential energy surface (PES) and a new, high accuracy, ab initio dipole moment surface (DMS). Tests show that the PES enables us to calculate the line positions accurately and the DMS gives satisfactory results for line intensities. Comparisons with experiment as well as with previous theoretical spectra will be presented. The results of this study will form an important addition to the databases which are considered as sources of information for space applications; especially, in analysing the spectra of extrasolar planets, and remote sensing studies for Venus and Earth, as well as laboratory investigations and pollution studies. An ab initio line list for SO3 was previously computed using the variational nuclear motion program TROVE [2], and was suitable for modelling room temperature SO3 spectra. The calculations considered transitions in the region of 0-4000 cm⁻¹ with rotational states up to J = 85, and includes 174,674,257 transitions. A list of 10,878 experimental transitions had relative intensities placed on an absolute scale, and were provided in a form suitable

  7. [Academic detailing for best practice and pharmacists' role].

    PubMed

    Yamamoto, Michiko

    2014-01-01

    It is necessary to offer proper information about prescription drugs for their appropriate use in clinical practice. However, comprehensively collecting the information necessary for clinical application requires a great deal of time and labor and can be extremely difficult. If clinical experience and other information are derived solely from commercial sources, improper prescribing practices may result. "Academic detailing" is a form of interactive educational outreach to physicians to provide unbiased, non-commercial, evidence-based information about medications and other therapeutic decisions, with the goal of improving patient care. In Western countries, public funds are used to support universities and other research institution programs, and the experience from such programs spreads to a broader scientific community. In the US, academic detailing was pioneered 30 years ago. The National Resource Center for Academic Detailing (NaRCAD) is an initiative supported by an Agency for Healthcare Research and Quality (AHRQ) grant. Clinical pharmacists act as detailers in Europe and America, and this improves medical quality. The importance of academic detailing should also be recognized in Japan, where fully trained pharmacists (with six years of specialized training) with evaluative and communication skills can be expected to act as such specialists.

  8. Air Force Geophysics Laboratory Management Information System Study.

    DTIC Science & Technology

    1985-11-01

    management information system (MIS) at AFGL. The study summarizes current management and administrative practices at AFGL. Requirements have been identified for automating several currently manual functions to compile accurate and timely information to better manage and plan AFGL programs. This document describes the functions and relative priorities of five MIS subsystems and provides suggestions for implementation solutions. Creation of a detailed Development Plan is recommended as the follow-on task.

  9. Validation of a fast and accurate chromatographic method for detailed quantification of vitamin E in green leafy vegetables.

    PubMed

    Cruz, Rebeca; Casal, Susana

    2013-11-15

    Vitamin E analysis in green vegetables is performed by an array of different methods, making it difficult to compare published data or choosing the adequate one for a particular sample. Aiming to achieve a consistent method with wide applicability, the current study reports the development and validation of a fast micro-method for quantification of vitamin E in green leafy vegetables. The methodology uses solid-liquid extraction based on the Folch method, with tocol as internal standard, and normal-phase HPLC with fluorescence detection. A large linear working range was confirmed, being highly reproducible, with inter-day precisions below 5% (RSD). Method sensitivity was established (below 0.02 μg/g fresh weight), and accuracy was assessed by recovery tests (>96%). The method was tested in different green leafy vegetables, evidencing diverse tocochromanol profiles, with variable ratios and amounts of α- and γ-tocopherol, and other minor compounds. The methodology is adequate for routine analyses, with a reduced chromatographic run (<7 min) and organic solvent consumption, and requires only standard chromatographic equipment available in most laboratories.
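    Internal-standard quantification of the kind used in such HPLC methods can be sketched as follows; the peak areas, tocol concentration, and response factor below are hypothetical numbers, not values from the study:

```python
# Hedged sketch of internal-standard quantification: analyte concentration
# is derived from the analyte/internal-standard peak area ratio and a
# response factor obtained from calibration. All numbers are illustrative.

def quantify(area_analyte, area_is, conc_is, response_factor):
    """Concentration = (A_analyte / A_IS) * C_IS / RF."""
    return (area_analyte / area_is) * conc_is / response_factor

# alpha-tocopherol peak vs. tocol internal standard (hypothetical values):
conc = quantify(area_analyte=15400.0, area_is=8200.0,
                conc_is=5.0,           # microgram/mL of tocol spiked in
                response_factor=1.1)   # slope from a calibration curve
print(round(conc, 2))  # 8.54
```

The internal standard corrects for losses during the Folch-type extraction and injection variability, which is what makes the inter-day precision quoted above achievable.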

  10. Detailed 3D representations for object recognition and modeling.

    PubMed

    Zia, M Zeeshan; Stark, Michael; Schiele, Bernt; Schindler, Konrad

    2013-11-01

    Geometric 3D reasoning at the level of objects has received renewed attention recently in the context of visual scene understanding. The level of geometric detail, however, is typically limited to qualitative representations or coarse boxes. This is linked to the fact that today's object class detectors are tuned toward robust 2D matching rather than accurate 3D geometry, encouraged by bounding-box-based benchmarks such as Pascal VOC. In this paper, we revisit ideas from the early days of computer vision, namely, detailed, 3D geometric object class representations for recognition. These representations can recover geometrically far more accurate object hypotheses than just bounding boxes, including continuous estimates of object pose and 3D wireframes with relative 3D positions of object parts. In combination with robust techniques for shape description and inference, we outperform state-of-the-art results in monocular 3D pose estimation. In a series of experiments, we analyze our approach in detail and demonstrate novel applications enabled by such an object class representation, such as fine-grained categorization of cars and bicycles, according to their 3D geometry, and ultrawide baseline matching.

  11. Recording surface detail on moist surfaces with elastomeric impression materials.

    PubMed

    McCabe, J F; Carrick, T E

    2006-03-01

    The objective was to assess the ability of three elastomeric impression materials derived from different polymers to accurately record detail on moist surfaces. One polyvinylsiloxane, one polyether and one hybrid material containing a copolymer of siloxane and polyether polymers were used. Impressions were recorded of moist gypsum casts having both a shallow (approximately 20 microm) and deep (approximately 180 microm) groove reproduced on their surface. The grooves in the casts and in the impressions were profiled using a non-contacting laser profilometer. Comparisons were made between the groove depths in the casts and impressions (paired t-test). The results indicated that all of the tested materials accurately recorded dimensions in the x-y plane. However, there was evidence that the polyether and hybrid materials were more accurate than the polyvinylsiloxane in recording the true depths of the deep grooves (z plane) under moist conditions. It was concluded that the more hydrophilic nature of the polyether and hybrid materials enabled them to record more accurate impressions of moist surfaces, particularly in areas of difficult access as modelled by the deep grooves.

  12. Academic detailing among psychiatrists - feasibility and acceptability.

    PubMed

    Vasudev, Kamini; Lamoure, Joel; Beyaert, Michael; Dua, Varinder; Dixon, David; Eadie, Jason; Husarewych, Larissa; Dhir, Ragu; Takhar, Jatinder

    2017-02-13

    Purpose: Research has shown that academic detailing (AD), which includes repeated in-person educational messages in an interactive format in a physician's office, is among the most effective continuing medical education (CME) forms for improving prescribing practices and reducing drug costs. The purpose of this paper is to investigate AD's feasibility and acceptability as an educational tool among psychiatrists and its ability to facilitate positive changes in antipsychotic prescribing. Design/methodology/approach: All psychiatrists practicing in Southwestern Ontario, Canada were invited to participate. Participants (32/299 (10.7 percent)) were provided with two educational sessions by a healthcare professional. Participants evaluated their AD visits and completed a pre- and post-AD questionnaire measuring various prescribing practice aspects. Findings: A total of 26 out of 32 (81.3 percent) participants completed the post-AD evaluation; most of them (61.5 percent, n=16) felt that AD gave noteworthy information on tools for monitoring side-effects and 50.0 percent (n=13) endorsed using these in practice. In total, 13 participants (50.0 percent) felt that the AD sessions gave them helpful information on tools for documenting polypharmacy use, which 46.2 percent (n=12) indicated they would implement in their practice. No significant differences were found between participants' pre- and post-assessment prescribing behaviors. Practical implications: There is great need for raising the AD program's awareness and improving physician engagement in this process locally, provincially and nationally. Originality/value: To the authors' knowledge, this is the first AD program in Canada to target specialists solely. Participant psychiatrists accepted the AD intervention and perceived it as a feasible CME method.

  13. Nonexposure Accurate Location K-Anonymity Algorithm in LBS

    PubMed Central

    2014-01-01

    This paper tackles location privacy protection in current location-based services (LBS) where mobile users have to report their exact location information to an LBS provider in order to obtain their desired services. Location cloaking has been proposed and well studied to protect user privacy. It blurs the user's accurate coordinate and replaces it with a well-shaped cloaked region. However, to obtain such an anonymous spatial region (ASR), nearly all existent cloaking algorithms require knowing the accurate locations of all users. Therefore, location cloaking without exposing the user's accurate location to any party is urgently needed. In this paper, we present two such nonexposure accurate location cloaking algorithms. They are designed for K-anonymity, and cloaking is performed based on the identifications (IDs) of the grid areas which were reported by all the users, instead of directly on their accurate coordinates. Experimental results show that our algorithms are more secure than the existent cloaking algorithms, need not have all the users reporting their locations all the time, and can generate smaller ASR. PMID:24605060

  14. Nonexposure accurate location K-anonymity algorithm in LBS.

    PubMed

    Jia, Jinying; Zhang, Fengli

    2014-01-01

    This paper tackles location privacy protection in current location-based services (LBS) where mobile users have to report their exact location information to an LBS provider in order to obtain their desired services. Location cloaking has been proposed and well studied to protect user privacy. It blurs the user's accurate coordinate and replaces it with a well-shaped cloaked region. However, to obtain such an anonymous spatial region (ASR), nearly all existent cloaking algorithms require knowing the accurate locations of all users. Therefore, location cloaking without exposing the user's accurate location to any party is urgently needed. In this paper, we present two such nonexposure accurate location cloaking algorithms. They are designed for K-anonymity, and cloaking is performed based on the identifications (IDs) of the grid areas which were reported by all the users, instead of directly on their accurate coordinates. Experimental results show that our algorithms are more secure than the existent cloaking algorithms, need not have all the users reporting their locations all the time, and can generate smaller ASR.
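    A toy version of grid-ID-based cloaking, a sketch of the general idea rather than the paper's exact algorithms:

```python
# Illustrative sketch (not the paper's algorithms): each user reports only
# the ID of the grid cell containing them. The anonymiser grows a square
# block of cells around the querier's cell until the block covers at least
# K reported IDs, without ever seeing exact coordinates.

from collections import Counter

def cloak(user_cells, querier_cell, k, grid_size):
    """Return the smallest centred square block of cell IDs covering >= k reports."""
    counts = Counter(user_cells)
    qx, qy = querier_cell
    for radius in range(grid_size):
        block = {(x, y)
                 for x in range(max(0, qx - radius), min(grid_size, qx + radius + 1))
                 for y in range(max(0, qy - radius), min(grid_size, qy + radius + 1))}
        if sum(counts[c] for c in block) >= k:
            return block
    raise ValueError("fewer than k users on the grid")

# Users report only grid-cell IDs, never coordinates:
reports = [(2, 2), (2, 3), (3, 2), (5, 5), (2, 2)]
asr = cloak(reports, querier_cell=(2, 2), k=3, grid_size=8)
print(sorted(asr))
```

Because the input is already discretised into cell IDs, no party in this scheme learns a coordinate more precise than one grid cell, which is the nonexposure property the paper is after.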

  15. Developments in remote sensing technology enable more detailed urban flood risk analysis.

    NASA Astrophysics Data System (ADS)

    Denniss, A.; Tewkesbury, A.

    2009-04-01

    Spaceborne remote sensors have been allowing us to build up a profile of planet Earth for many years. With each new satellite launched we see the capabilities improve: new bands of data, higher-resolution imagery, the ability to derive better elevation information. The combination of these geospatial data to create land-cover and land-use maps helps inform catastrophe modelling systems. From Landsat 30 m resolution to 2.44 m QuickBird multispectral imagery, and 1 m radar data collected by TerraSAR-X, which enables rapid tracking of the rise and fall of a flood event and will shortly have a twin satellite launched enabling elevation data creation, we are spoilt for choice in available data. However, just what is cost effective? It is always a question of choosing the appropriate level of input data detail for modelling, depending on the value of the risk. In the summer of 2007, the cost of the flooding in the UK was approximately £3bn, affecting over 58,000 homes and businesses. When it comes to flood risk, we have traditionally considered rising river levels and surge tides, but with climate change and variations in our own construction behaviour, there are other factors to be taken into account. During those summer 2007 events, the Environment Agency suggested that around 70% of the properties damaged were the result of pluvial flooding, where high localised rainfall events overload localised drainage infrastructure, causing widespread flooding of properties and infrastructure. Creating a risk model able to simulate such an event requires much more accurate source data than satellite or radar can provide. As these flood events cause considerable damage within relatively small, complex urban environments, new high-resolution remote sensing techniques have to be applied to better model them. Detailed terrain data of England and Wales, plus cities in Scotland, have been produced by combining terrain measurements from the latest

  16. Optoelectronic pH Meter: Further Details

    NASA Technical Reports Server (NTRS)

    Jeevarajan, Antony S.; Anderson, Mejody M.; Macatangay, Ariel V.

    2009-01-01

    A collection of documents provides further detailed information about an optoelectronic instrument that measures the pH of an aqueous cell-culture medium to within 0.1 unit in the range from 6.5 to 7.5. The instrument at an earlier stage of development was reported in "Optoelectronic Instrument Monitors pH in a Culture Medium" (MSC-23107), NASA Tech Briefs, Vol. 28, No. 9 (September 2004), page 4a. To recapitulate: the instrument includes a quartz cuvette through which the medium flows as it is circulated through a bioreactor. The medium contains some phenol red, which is an organic pH-indicator dye. The cuvette sits between a light source and a photodetector. (The light source in the earlier version comprised red (625 nm) and green (558 nm) light-emitting diodes (LEDs); the light source in the present version comprises a single green (560 nm) or red (623 nm) LED.) The red and green are repeatedly flashed in alternation. The responses of the photodiode to the green and red are processed electronically to obtain the ratio between the amounts of green and red light transmitted through the medium. The optical absorbance of the phenol red in the green light varies as a known function of pH. Hence, the pH of the medium can be calculated from the aforesaid ratio.
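    A hedged sketch of the final computation: Beer-Lambert absorbance from the green/red transmission ratio, then a standard indicator equation for phenol red. The pKa and calibration endpoints below are assumed values for illustration, not the instrument's actual calibration:

```python
# Hedged sketch, not the instrument's actual algorithm. We assume the red
# channel acts as a reference so that -log10(green/red) approximates the
# phenol-red absorbance at the green wavelength, and map absorbance to pH
# with a standard sigmoidal indicator equation. PKA, A_MIN and A_MAX are
# hypothetical calibration constants.

import math

PKA = 7.6                    # phenol red pKa near physiological conditions (assumed)
A_MIN, A_MAX = 0.05, 0.90    # hypothetical calibration endpoint absorbances

def ph_from_ratio(green_over_red):
    """Green/red transmitted-light ratio -> absorbance -> pH."""
    absorbance = -math.log10(green_over_red)
    frac = (absorbance - A_MIN) / (A_MAX - absorbance)
    return PKA + math.log10(frac)

print(round(ph_from_ratio(0.35), 2))  # 7.56
```

Lower green transmission means higher absorbance, hence more of the dye in its basic form and a higher computed pH, matching the monotonic relation the documents describe.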

  17. Detailed observations of the source of terrestrial narrowband electromagnetic radiation

    NASA Technical Reports Server (NTRS)

    Kurth, W. S.

    1982-01-01

    Detailed observations are presented of a region near the terrestrial plasmapause where narrowband electromagnetic radiation (previously called escaping nonthermal continuum radiation) is being generated. These observations show a direct correspondence between the narrowband radio emissions and electron cyclotron harmonic waves near the upper hybrid resonance frequency. In addition, electromagnetic radiation propagating in the Z-mode is observed in the source region which provides an extremely accurate determination of the electron plasma frequency and, hence, density profile of the source region. The data strongly suggest that electrostatic waves and not Cerenkov radiation are the source of the banded radio emissions and define the coupling which must be described by any viable theory.

  18. Mill profiler machines soft materials accurately

    NASA Technical Reports Server (NTRS)

    Rauschl, J. A.

    1966-01-01

    Mill profiler machines bevels, slots, and grooves in soft materials, such as styrofoam phenolic-filled cores, to any desired thickness. A single operator can accurately control cutting depths in contour or straight line work.

  19. 44 CFR 5.27 - Deletion of identifying details.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    44 CFR 5.27 - Deletion of identifying details. Section 5.27, Emergency Management and Assistance, FEDERAL EMERGENCY MANAGEMENT AGENCY, DEPARTMENT OF HOMELAND SECURITY, GENERAL, PRODUCTION OR DISCLOSURE OF INFORMATION, Publication of...

  20. 21. DETAIL OF AREA WHERE FIRST FLOOR PASSAGEWAY USED TO ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    21. DETAIL OF AREA WHERE FIRST FLOOR PASSAGEWAY USED TO BE SHOWING VERTICAL WOOD MOLDING COVERING JOINT WHERE PARTITION USED TO BE (LEFT), TELLER'S WINDOW LINKING PASSAGEWAY WITH INFORMATION BOOTH (CENTER), AND TYPICAL FURNITURE. VIEW TO EAST. - Boise Project, Boise Project Office, 214 Broadway, Boise, Ada County, ID

  1. Site Plan, Brief History, Site Elevation, Main Gate Detail, Southern ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Site Plan, Brief History, Site Elevation, Main Gate Detail, Southern Live Oak (Quercus Virginiana) Information - Main Gate and Auburn Oaks at Toomer's Corner, Entrance to Auburn University's Campus, Intersection of West Magnolia Avenue and South College Street, Auburn, Lee County, AL

  2. Towards cleaner combustion engines through groundbreaking detailed chemical kinetic models

    PubMed Central

    Battin-Leclerc, Frédérique; Blurock, Edward; Bounaceur, Roda; Fournet, René; Glaude, Pierre-Alexandre; Herbinet, Olivier; Sirjean, Baptiste; Warth, V.

    2013-01-01

    In the context of limiting the environmental impact of transportation, this paper reviews new directions being followed in the development of more predictive and more accurate detailed chemical kinetic models for the combustion of fuels. In the first part, the performance of current models, especially in terms of the prediction of pollutant formation, is evaluated. In the following parts, recent methods and ways to improve these models are described. Emphasis is placed on the development of detailed models based on elementary reactions, on the production of the related thermochemical and kinetic parameters, and on the experimental techniques available to produce the data necessary to evaluate model predictions under well-defined conditions. PMID:21597604

  3. Computational Time-Accurate Body Movement: Methodology, Validation, and Application

    DTIC Science & Technology

    1995-10-01

    Excerpts: a wing with a 45-deg leading-edge sweep angle and a NACA 64A010 symmetrical airfoil section was used; a cross section of the pylon is symmetrical. Listed figures include the information flow for the time-accurate store trajectory prediction process and pitch rates for the NACA 0012 airfoil. Computational results are compared with data for a NACA 0012 airfoil following a predefined pitching motion. Validation of the ...

  4. ACCURATE SIMULATIONS OF BINARY BLACK HOLE MERGERS IN FORCE-FREE ELECTRODYNAMICS

    SciTech Connect

    Alic, Daniela; Moesta, Philipp; Rezzolla, Luciano; Jaramillo, Jose Luis; Zanotti, Olindo

    2012-07-20

    We provide additional information on our recent study of the electromagnetic emission produced during the inspiral and merger of supermassive black holes when these are immersed in a force-free plasma threaded by a uniform magnetic field. As anticipated in a recent letter, our results show that although a dual-jet structure is present, the associated luminosity is ~100 times smaller than the total one, which is predominantly quadrupolar. Here we discuss the details of our implementation of the equations in which the force-free condition is not implemented at a discrete level, but rather obtained via a damping scheme which drives the solution to satisfy the correct condition. We show that this is important for a correct and accurate description of the current sheets that can develop in the course of the simulation. We also study in greater detail the three-dimensional charge distribution produced as a consequence of the inspiral and show that during the inspiral it possesses a complex but ordered structure which traces the motion of the two black holes. Finally, we provide quantitative estimates of the scaling of the electromagnetic emission with frequency, with the diffused part having a dependence that is the same as the gravitational-wave one and that scales as L_EM^non-coll ≈ Ω^(10/3-8/3), while the collimated one scales as L_EM^coll ≈ Ω^(5/3-6/3), thus with a steeper dependence than previously estimated. We discuss the impact of these results on the potential detectability of dual jets from supermassive black holes and the steps necessary for more accurate estimates.

  5. Processing of airborne lidar bathymetry data for detailed sea floor mapping

    NASA Astrophysics Data System (ADS)

    Tulldahl, H. Michael

    2014-10-01

    Airborne bathymetric lidar has proven to be a valuable sensor for rapid and accurate sounding of shallow water areas. With advanced processing of the lidar data, detailed mapping of the sea floor with various objects and vegetation is possible. This mapping capability has a wide range of applications including detection of mine-like objects, mapping marine natural resources, and fish spawning areas, as well as supporting the fulfillment of national and international environmental monitoring directives. Although data sets collected by subsea systems give a high degree of credibility they can benefit from a combination with lidar for surveying and monitoring larger areas. With lidar-based sea floor maps containing information of substrate and attached vegetation, the field investigations become more efficient. Field data collection can be directed into selected areas and even focused to identification of specific targets detected in the lidar map. The purpose of this work is to describe the performance for detection and classification of sea floor objects and vegetation, for the lidar seeing through the water column. With both experimental and simulated data we examine the lidar signal characteristics depending on bottom depth, substrate type, and vegetation. The experimental evaluation is based on lidar data from field documented sites, where field data were taken from underwater video recordings. To be able to accurately extract the information from the received lidar signal, it is necessary to account for the air-water interface and the water medium. The information content is hidden in the lidar depth data, also referred to as point data, and also in the shape of the received lidar waveform. The returned lidar signal is affected by environmental factors such as bottom depth and water turbidity, as well as lidar system factors such as laser beam footprint size and sounding density.

  6. Assessing Team Detailing End-user Satisfaction

    DTIC Science & Technology

    2004-10-01

    Assessing Team Detailing End-user Satisfaction. Kimberly P. Whittam, Ph.D.; Zannette A. Uriell, M.S.; Rorie N. Harris, Ph.D. NPRST-AB-05-1, October 2004. Approved for public release; distribution is unlimited.

  7. Toward accurate prediction of pKa values for internal protein residues: the importance of conformational relaxation and desolvation energy.

    PubMed

    Wallace, Jason A; Wang, Yuhang; Shi, Chuanyin; Pastoor, Kevin J; Nguyen, Bao-Linh; Xia, Kai; Shen, Jana K

    2011-12-01

    Proton uptake or release controls many important biological processes, such as energy transduction, virus replication, and catalysis. Accurate pKa prediction informs about proton pathways, thereby revealing detailed acid-base mechanisms. Physics-based methods in the framework of molecular dynamics simulations not only offer pKa predictions but also inform about the physical origins of pKa shifts and provide details of ionization-induced conformational relaxation and large-scale transitions. One such method is the recently developed continuous constant pH molecular dynamics (CPHMD) method, which has been shown to be an accurate and robust pKa prediction tool for naturally occurring titratable residues. To further examine the accuracy and limitations of CPHMD, we blindly predicted the pKa values for 87 titratable residues introduced in various hydrophobic regions of staphylococcal nuclease and variants. The predictions gave a root-mean-square deviation of 1.69 pK units from experiment, and there were only two pKa's with errors greater than 3.5 pK units. Analysis of the conformational fluctuation of titrating side-chains in the context of the errors of calculated pKa values indicates that explicit treatment of conformational flexibility and the associated dielectric relaxation gives CPHMD a distinct advantage. Analysis of the sources of errors suggests that more accurate pKa predictions can be obtained for the most deeply buried residues by improving the accuracy in calculating desolvation energies. Furthermore, it is found that the generalized Born implicit-solvent model underlying the current CPHMD implementation slightly distorts the local conformational environment such that the inclusion of an explicit-solvent representation may offer improvement of accuracy.
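    The headline accuracy figure above is a root-mean-square deviation between predicted and experimental pKa values. As a reminder of how that metric is computed, here is a minimal sketch with made-up numbers (not data from the study):

    ```python
    import numpy as np

    # Toy predicted vs experimental pKa values (illustrative only,
    # not data from the CPHMD study).
    predicted = np.array([4.1, 6.8, 9.3, 5.5])
    experimental = np.array([4.5, 6.0, 9.9, 5.2])

    # Root-mean-square deviation in pK units.
    rmsd = np.sqrt(np.mean((predicted - experimental) ** 2))
    print(round(rmsd, 3))  # → 0.559
    ```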

  8. The potential of more accurate InSAR covariance matrix estimation for land cover mapping

    NASA Astrophysics Data System (ADS)

    Jiang, Mi; Yong, Bin; Tian, Xin; Malhotra, Rakesh; Hu, Rui; Li, Zhiwei; Yu, Zhongbo; Zhang, Xinxin

    2017-04-01

    Synthetic aperture radar (SAR) and Interferometric SAR (InSAR) provide both structural and electromagnetic information for the ground surface and therefore have been widely used for land cover classification. However, relatively few studies have developed analyses that investigate SAR datasets over richly textured areas where heterogeneous land covers exist and intermingle over short distances. One of the main difficulties is that the shapes of the structures in a SAR image cannot be represented in detail, as mixed pixels are likely to occur when conventional InSAR parameter estimation methods are used. To solve this problem and further extend previous research into remote monitoring of urban environments, we address the use of accurate InSAR covariance matrix estimation to improve the accuracy of land cover mapping. The standard and updated methods were tested using the HH-polarization TerraSAR-X dataset and compared with each other using the random forest classifier. A detailed accuracy assessment compiled for six types of surfaces shows that the updated method outperforms the standard approach by around 9%, with an overall accuracy of 82.46% over areas with rich texture in Zhuhai, China. This paper demonstrates that the accuracy of land cover mapping can benefit from the enhancement of the quality of the observations, in addition to classifier selection and multi-source data integration reported in previous studies.

  9. To be precise, the details don't matter: On predictive processing, precision, and level of detail of predictions.

    PubMed

    Kwisthout, Johan; Bekkering, Harold; van Rooij, Iris

    2017-03-01

    Many theoretical and empirical contributions to the Predictive Processing account emphasize the important role of precision modulation of prediction errors. Recently it has been proposed that the causal models used in human predictive processing are best formally modeled by categorical probability distributions. Crucially, such distributions assume a well-defined, discrete state space. In this paper we explore the consequences of this formalization. In particular we argue that the level of detail of generative models and predictions modulates prediction error. We show that both increasing the level of detail of the generative models and decreasing the level of detail of the predictions can be suitable mechanisms for lowering prediction errors. Both increase precision, yet come at the price of lowering the amount of information that can be gained by correct predictions. Our theoretical result establishes a key open empirical question to address: How does the brain optimize the trade-off between high precision and information gain when making its predictions?
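    The trade-off the authors describe can be made concrete with a toy calculation (our illustration, not from the paper): merging states of a categorical prediction lowers its entropy, i.e. lowers expected surprise, but also caps the information a correct prediction can carry.

    ```python
    import math

    def entropy(p):
        """Shannon entropy in bits of a categorical distribution."""
        return -sum(q * math.log2(q) for q in p if q > 0)

    # Fine-grained prediction over 4 states vs a coarse prediction over 2
    # merged states (hypothetical numbers, for illustration only).
    fine = [0.4, 0.3, 0.2, 0.1]
    coarse = [fine[0] + fine[1], fine[2] + fine[3]]  # [0.7, 0.3]

    # Coarser predictions have lower expected surprise (prediction error)...
    assert entropy(coarse) < entropy(fine)

    # ...but also a lower ceiling on information gain: log2(#states)
    # bounds the bits gained by a correct prediction under a uniform prior.
    print(entropy(fine), entropy(coarse))  # fine carries more bits
    print(math.log2(4), math.log2(2))      # max gain: 2.0 vs 1.0 bits
    ```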

  10. Novel serologic biomarkers provide accurate estimates of recent Plasmodium falciparum exposure for individuals and communities.

    PubMed

    Helb, Danica A; Tetteh, Kevin K A; Felgner, Philip L; Skinner, Jeff; Hubbard, Alan; Arinaitwe, Emmanuel; Mayanja-Kizza, Harriet; Ssewanyana, Isaac; Kamya, Moses R; Beeson, James G; Tappero, Jordan; Smith, David L; Crompton, Peter D; Rosenthal, Philip J; Dorsey, Grant; Drakeley, Christopher J; Greenhouse, Bryan

    2015-08-11

    Tools to reliably measure Plasmodium falciparum (Pf) exposure in individuals and communities are needed to guide and evaluate malaria control interventions. Serologic assays can potentially produce precise exposure estimates at low cost; however, current approaches based on responses to a few characterized antigens are not designed to estimate exposure in individuals. Pf-specific antibody responses differ by antigen, suggesting that selection of antigens with defined kinetic profiles will improve estimates of Pf exposure. To identify novel serologic biomarkers of malaria exposure, we evaluated responses to 856 Pf antigens by protein microarray in 186 Ugandan children, for whom detailed Pf exposure data were available. Using data-adaptive statistical methods, we identified combinations of antibody responses that maximized information on an individual's recent exposure. Responses to three novel Pf antigens accurately classified whether an individual had been infected within the last 30, 90, or 365 d (cross-validated area under the curve = 0.86-0.93), whereas responses to six antigens accurately estimated an individual's malaria incidence in the prior year. Cross-validated incidence predictions for individuals in different communities provided accurate stratification of exposure between populations and suggest that precise estimates of community exposure can be obtained from sampling a small subset of that community. In addition, serologic incidence predictions from cross-sectional samples characterized heterogeneity within a community similarly to 1 y of continuous passive surveillance. Development of simple ELISA-based assays derived from the successful selection strategy outlined here offers the potential to generate rich epidemiologic surveillance data that will be widely accessible to malaria control programs.

  11. Cloud Imagers Offer New Details on Earth's Health

    NASA Technical Reports Server (NTRS)

    2009-01-01

    … limited scientists' ability to acquire detailed information about individual particles. Now, experiments with specialized equipment can be flown on standard jets, making it possible for researchers to monitor and more accurately anticipate changes in Earth's atmosphere and weather patterns.

  12. Direct mapping of chemical oxidation of individual graphene sheets through dynamic force measurements at the nanoscale

    PubMed Central

    Froning, Jens P.; Lazar, Petr; Pykal, Martin; Li, Qiang

    2017-01-01

    Graphene oxide is one of the most studied nanomaterials owing to its huge application potential in many fields, including biomedicine, sensing, drug delivery, optical and optoelectronic technologies. However, a detailed description of the chemical composition and the extent of oxidation in graphene oxide remains a key challenge affecting its applicability and further development of new applications. Here, we report direct monitoring of the chemical oxidation of an individual graphene flake during ultraviolet/ozone treatment through in situ atomic force microscopy based on dynamic force mapping. The results showed that graphene oxidation expanded from the graphene edges to the entire graphene surface. The interaction force mapping results correlated well with X-ray photoelectron spectroscopy data quantifying the degree of chemical oxidation. Density functional theory calculations confirmed the specific interaction forces measured between a silicon tip and graphene oxide. The developed methodology can be used as a simple protocol for evaluating the chemical functionalization of other two-dimensional materials with covalently attached functional groups. PMID:27735008

  13. High dynamic range compression and detail enhancement of infrared images in the gradient domain

    NASA Astrophysics Data System (ADS)

    Zhang, Feifei; Xie, Wei; Ma, Guorui; Qin, Qianqing

    2014-11-01

    To find the trade-off between providing an accurate perception of the global scene and improving the visibility of details without excessively distorting radiometric infrared information, a novel gradient-domain-based visualization method for high dynamic range infrared images is proposed in this study. The proposed method adopts an energy function which includes a data constraint term and a gradient constraint term. In the data constraint term, the classical histogram projection method is used to perform the initial dynamic range compression to obtain the desired pixel values and preserve the global contrast. In the gradient constraint term, the moment matching method is adopted to obtain the normalized image; then a gradient gain factor function is designed to adjust the magnitudes of the normalized image gradients and obtain the desired gradient field. Lastly, the low dynamic range image is solved from the proposed energy function. The final image is obtained by linearly mapping the low dynamic range image to the 8-bit display range. The effectiveness and robustness of the proposed method are analyzed using infrared images obtained under different operating conditions. Compared with other well-established methods, our method shows significant performance in terms of dynamic range compression, while enhancing the details and avoiding the common artifacts, such as halo, gradient reversal, hazy or saturation.
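    The general gradient-domain idea (attenuate large gradients with a gain factor, then reintegrate) can be sketched in a drastically simplified 1D form. This is an illustration under our own assumptions, not the paper's energy-minimization method; `alpha` and `beta` are hypothetical parameters.

    ```python
    import numpy as np

    def compress_gradients_1d(signal, alpha=0.1, beta=0.8):
        """Attenuate large gradients and reintegrate: a 1D toy version of
        gradient-domain dynamic range compression (not the paper's method)."""
        g = np.diff(signal)
        mag = np.abs(g) + 1e-8
        # Gain factor: boosts gradients below alpha * mean magnitude and
        # attenuates larger ones (beta < 1 compresses).
        gain = (mag / (alpha * mag.mean())) ** (beta - 1.0)
        g_new = g * gain
        # Reintegrate by cumulative summation from the first sample.
        return np.concatenate([[signal[0]], signal[0] + np.cumsum(g_new)])

    # A step edge with a huge jump is compressed, while small-detail
    # gradients are relatively preserved (even amplified).
    x = np.array([0., 1., 2., 3., 1000., 1001., 1002.])
    y = compress_gradients_1d(x)
    assert y.max() - y.min() < x.max() - x.min()  # dynamic range reduced
    ```

    A 2D version would reintegrate by solving a Poisson equation instead of a cumulative sum, which is where the paper's energy function comes in.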

  14. Informational and Normative Influences in Conformity from a Neurocomputational Perspective.

    PubMed

    Toelch, Ulf; Dolan, Raymond J

    2015-10-01

    We consider two distinct influences that drive conformity behaviour. Whereas informational influences facilitate adaptive and accurate responses, normative influences bias decisions to enhance social acceptance. We explore these influences from a perspective of perceptual and value-based decision-making models and apply these models to classical works on conformity. We argue that an informational account predicts a surprising tendency to conform. Moreover, we detail how normative influences fit into this framework and interact with social influences. Finally, we explore potential neuronal substrates for informational and normative influences based on a consideration of the neurobiological literature, highlighting conceptual shortcomings particularly with regard to a failure to segregate informational and normative influences.

  15. Measuring the IMF and Detailed Abundance Patterns from the Integrated Light of Old Stellar Systems

    NASA Astrophysics Data System (ADS)

    Conroy, Charlie

    The spectral energy distributions (SEDs) of unresolved stellar systems holds key information regarding the detailed abundance pattern, star formation history, dust properties, and initial mass function (IMF) of the underlying stellar population(s). This information can only be extracted with the aid of stellar population synthesis (SPS) models. Such models have been employed to estimate basic properties such as the star formation rate, metallicity (Z, and in certain contexts, alpha-enhancement), and total stellar mass (assuming an IMF). However, much more information is available in the SED than can be extracted by the current generation of SPS models because existing models are plagued by incomplete and poorly calibrated ingredients. The proposers request funds to develop a next generation SPS model capable of measuring the IMF and detailed abundance patterns from the SEDs of composite stellar systems. In particular, we intend to develop an SPS model that makes accurate predictions for the SEDs (from 0.1-3mu m at a resolving power of ~5,000) of composite systems as a function of the IMF, stellar age, metallicity, and individual elemental abundances (including C, N, O, Na, Mg, Ca, Ti, Cr, Mn, Fe, Sr, and Ba). This will require the construction of a new synthetic stellar spectral library and a new isochrone library. This new model will be the first to make predictions for the full SED shape as a function of individual abundance ratios, age, and the IMF. We will extensively calibrate the model predictions against data on individual stars and globular clusters. The new model will be essential for interpreting optical-NIR spectra obtained from the James Webb Space Telescope as well as both present and future ground-based facilities.

  16. 24. 'HANGAR SHEDS ELEVATIONS DETAILS; ARCHITECTURAL PLANS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    24. 'HANGAR SHEDS - ELEVATIONS - DETAILS; ARCHITECTURAL PLANS - PLANT AREA; MODIFICATION CENTER NO. 1, DAGGETT, CALIFORNIA.' Partial elevations, and details of sliding doors and ventilator flaps, as built. Contract no. W509 Eng. 2743; File no. 555/81, revision B, dated April 6, 1943. No sheet number. - Barstow-Daggett Airport, Hangar Shed No. 4, 39500 National Trails Highway, Daggett, San Bernardino County, CA

  17. 18 CFR 401.122 - Supplementary details.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 2 2011-04-01 2011-04-01 false Supplementary details. 401.122 Section 401.122 Conservation of Power and Water Resources DELAWARE RIVER BASIN COMMISSION ADMINISTRATIVE MANUAL RULES OF PRACTICE AND PROCEDURE General Provisions § 401.122 Supplementary details....

  18. 18 CFR 401.122 - Supplementary details.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 2 2014-04-01 2014-04-01 false Supplementary details. 401.122 Section 401.122 Conservation of Power and Water Resources DELAWARE RIVER BASIN COMMISSION ADMINISTRATIVE MANUAL RULES OF PRACTICE AND PROCEDURE General Provisions § 401.122 Supplementary details....

  19. 18 CFR 401.122 - Supplementary details.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 2 2013-04-01 2012-04-01 true Supplementary details. 401.122 Section 401.122 Conservation of Power and Water Resources DELAWARE RIVER BASIN COMMISSION ADMINISTRATIVE MANUAL RULES OF PRACTICE AND PROCEDURE General Provisions § 401.122 Supplementary details....

  20. 18 CFR 401.122 - Supplementary details.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 2 2012-04-01 2012-04-01 false Supplementary details. 401.122 Section 401.122 Conservation of Power and Water Resources DELAWARE RIVER BASIN COMMISSION ADMINISTRATIVE MANUAL RULES OF PRACTICE AND PROCEDURE General Provisions § 401.122 Supplementary details....

  1. Local Mode Analysis: Decoding IR Spectra by Visualizing Molecular Details.

    PubMed

    Massarczyk, M; Rudack, T; Schlitter, J; Kuhne, J; Kötting, C; Gerwert, K

    2017-02-08

    Integration of experimental and computational approaches to investigate chemical reactions in proteins has proven to be very successful. Experimentally, time-resolved FTIR difference-spectroscopy monitors chemical reactions at atomic detail. To decode detailed structural information encoded in IR spectra, QM/MM calculations are performed. Here, we present a novel method which we call local mode analysis (LMA) for calculating IR spectra and assigning spectral IR-bands on the basis of movements of nuclei and partial charges from just a single QM/MM trajectory. Through LMA the decoding of IR spectra no longer requires several simulations or optimizations. The novel approach correlates the motions of atoms of a single simulation with the corresponding IR bands and provides direct access to the structural information encoded in IR spectra. Either the contributions of a particular atom or atom group to the complete IR spectrum of the molecule are visualized, or an IR-band is selected to visualize the corresponding structural motions. Thus, LMA decodes the detailed information contained in IR spectra and provides an intuitive approach for structural biologists and biochemists. The unique feature of LMA is the bidirectional analysis connecting structural details to spectral features and vice versa spectral features to molecular motions.

  2. Remembering the Specific Visual Details of Presented Objects: Neuroimaging Evidence for Effects of Emotion

    ERIC Educational Resources Information Center

    Kensinger, Elizabeth A.; Schacter, Daniel L.

    2007-01-01

    Memories can be retrieved with varied amounts of visual detail, and the emotional content of information can influence the likelihood that visual detail is remembered. In the present fMRI experiment (conducted with 19 adults scanned using a 3T magnet), we examined the neural processes that correspond with recognition of the visual details of…

  3. A Review of Research and a Meta-Analysis of the Seductive Detail Effect

    ERIC Educational Resources Information Center

    Rey, Gunter Daniel

    2012-01-01

    Seductive details constitute interesting but irrelevant information that are not necessary to achieve the instructional objective. The seductive detail effect occurs when people learn more deeply from instructional messages that exclude rather than include these details. This effect is mainly explained by assuming an overloading of the working…

  4. 25. 'HANGAR SHEDS TRUSSES DETAILS; ARCHITECTURAL PLANS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    25. 'HANGAR SHEDS - TRUSSES - DETAILS; ARCHITECTURAL PLANS - PLANT AREA; MODIFICATION CENTER NO. 1, DAGGETT, CALIFORNIA.' Sections and details of trusses, ironwork, and joints, as modified to show ridge joint detail. As built. This blueline also shows the fire suppression system, added in orange pencil for 'Project 13: Bldgs. T-30, T-50, T-70, T-90' at a later, unspecified date. Contract no. W509 Eng. 2743; File no. 555/84, revision B, dated August 24, 1942. No sheet number. - Barstow-Daggett Airport, Hangar Shed No. 4, 39500 National Trails Highway, Daggett, San Bernardino County, CA

  5. Robust high-resolution cloth using parallelism, history-based collisions, and accurate friction.

    PubMed

    Selle, Andrew; Su, Jonathan; Irving, Geoffrey; Fedkiw, Ronald

    2009-01-01

    In this paper we simulate high resolution cloth consisting of up to 2 million triangles which allows us to achieve highly detailed folds and wrinkles. Since the level of detail is also influenced by object collision and self collision, we propose a more accurate model for cloth-object friction. We also propose a robust history-based repulsion/collision framework where repulsions are treated accurately and efficiently on a per time step basis. Distributed memory parallelism is used for both time evolution and collisions and we specifically address Gauss-Seidel ordering of repulsion/collision response. This algorithm is demonstrated by several high resolution and high-fidelity simulations.

  6. Accurate pointing of tungsten welding electrodes

    NASA Technical Reports Server (NTRS)

    Ziegelmeier, P.

    1971-01-01

    Thoriated-tungsten is pointed accurately and quickly by using sodium nitrite. Point produced is smooth and no effort is necessary to hold the tungsten rod concentric. The chemically produced point can be used several times longer than ground points. This method reduces time and cost of preparing tungsten electrodes.

  7. Global detailed geoid computation and model analysis

    NASA Technical Reports Server (NTRS)

    Marsh, J. G.; Vincent, S.

    1974-01-01

    Comparisons and analyses were carried out through the use of detailed gravimetric geoids which we have computed by combining models with a set of 26,000 1 deg x 1 deg mean free air gravity anomalies. The accuracy of the detailed gravimetric geoid computed using the most recent Goddard earth model (GEM-6) in conjunction with the set of 1 deg x 1 deg mean free air gravity anomalies is assessed at + or - 2 meters on the continents of North America, Europe, and Australia, 2 to 5 meters in the Northeast Pacific and North Atlantic areas, and 5 to 10 meters in other areas where surface gravity data are sparse. The R.M.S. differences between this detailed geoid and the detailed geoids computed using the other satellite gravity fields in conjunction with the same set of surface data range from 3 to 7 meters.

  8. Light Field Imaging Based Accurate Image Specular Highlight Removal

    PubMed Central

    Wang, Haoqian; Xu, Chenxue; Wang, Xingzheng; Zhang, Yongbing; Peng, Bo

    2016-01-01

    Specular reflection removal is indispensable to many computer vision tasks. However, most existing methods fail or degrade in complex real scenarios because of their individual drawbacks. Benefiting from light field imaging technology, this paper proposes a novel and accurate approach to remove specularity and improve image quality. We first capture images with specularity by the light field camera (Lytro ILLUM). After accurately estimating the image depth, a simple and concise threshold strategy is adopted to cluster the specular pixels into “unsaturated” and “saturated” categories. Finally, a color variance analysis of multiple views and a local color refinement are individually conducted on the two categories to recover diffuse color information. Experimental evaluation by comparison with existing methods based on our light field dataset together with the Stanford light field archive verifies the effectiveness of our proposed algorithm. PMID:27253083
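    The threshold clustering step can be sketched as follows. The thresholds and image values here are hypothetical, and the real pipeline operates on depth-corrected light field data rather than a raw intensity array:

    ```python
    import numpy as np

    # Toy intensity image (0-255); thresholds are illustrative guesses,
    # not values from the paper.
    img = np.array([[ 40,  90, 250],
                    [200, 255, 120],
                    [ 60, 240,  30]], dtype=np.uint8)

    SPECULAR_T, SATURATED_T = 180, 250  # hypothetical thresholds

    specular = img >= SPECULAR_T        # candidate specular pixels
    saturated = img >= SATURATED_T      # fully clipped highlights
    unsaturated = specular & ~saturated # recoverable by color analysis

    # Saturated pixels would go to local color refinement; unsaturated
    # ones to multi-view color variance analysis, as the paper describes.
    print(np.count_nonzero(saturated), np.count_nonzero(unsaturated))
    ```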

  9. Accurate Development of Thermal Neutron Scattering Cross Section Libraries

    SciTech Connect

    Hawari, Ayman; Dunn, Michael

    2014-06-10

    The objective of this project is to develop a holistic (fundamental and accurate) approach for generating thermal neutron scattering cross section libraries for a collection of important neutron moderators and reflectors. The primary components of this approach are the physical accuracy and completeness of the generated data libraries. Consequently, for the first time, thermal neutron scattering cross section data libraries will be generated that are based on accurate theoretical models, that are carefully benchmarked against experimental and computational data, and that contain complete covariance information that can be used in propagating the data uncertainties through the various components of the nuclear design and execution process. To achieve this objective, computational and experimental investigations will be performed on a carefully selected subset of materials that play a key role in all stages of the nuclear fuel cycle.

  10. Accurate, practical simulation of satellite infrared radiometer spectral data

    SciTech Connect

    Sullivan, T.J.

    1982-09-01

    This study's purpose is to determine whether a relatively simple random band model formulation of atmospheric radiation transfer in the infrared region can provide valid simulations of narrow interval satellite-borne infrared sounder system data. Detailed ozonesondes provide the pertinent atmospheric information and sets of calibrated satellite measurements provide the validation. High resolution line-by-line model calculations are included to complete the evaluation.

  11. Efficient and Accurate Indoor Localization Using Landmark Graphs

    NASA Astrophysics Data System (ADS)

    Gu, F.; Kealy, A.; Khoshelham, K.; Shang, J.

    2016-06-01

    Indoor localization is important for a variety of applications such as location-based services, mobile social networks, and emergency response. Fusing spatial information is an effective way to achieve accurate indoor localization with little or no need for extra hardware. However, existing indoor localization methods that make use of spatial information are either too computationally expensive or too sensitive to the completeness of landmark detection. In this paper, we solve this problem by using the proposed landmark graph. The landmark graph is a directed graph where nodes are landmarks (e.g., doors, staircases, and turns) and edges are accessible paths with heading information. We compared the proposed method with two common Dead Reckoning (DR)-based methods (namely, Compass + Accelerometer + Landmarks and Gyroscope + Accelerometer + Landmarks) through a series of experiments. Experimental results show that the proposed method can achieve 73% accuracy with a positioning error less than 2.5 meters, which outperforms the other two DR-based methods.
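    A minimal sketch of what such a landmark graph might look like, assuming hypothetical landmark names, headings, and a matching tolerance (the authors' actual data structure may differ):

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class LandmarkGraph:
        """Directed graph: nodes are landmarks, edges are accessible paths
        annotated with heading (degrees) and distance (meters)."""
        edges: dict = field(default_factory=dict)

        def add_path(self, u, v, heading, distance):
            self.edges.setdefault(u, []).append((v, heading, distance))

        def next_landmark(self, current, measured_heading, tol=30.0):
            """Pick the outgoing edge whose stored heading best matches a
            heading measured by dead reckoning, within a tolerance."""
            candidates = [(abs((h - measured_heading + 180) % 360 - 180), v)
                          for v, h, _ in self.edges.get(current, [])]
            best = min(candidates, default=None)
            return best[1] if best and best[0] <= tol else None

    # Hypothetical landmarks: walking east (~90 deg) from a door should
    # match the path toward the staircase, not the southbound turn.
    g = LandmarkGraph()
    g.add_path("door_A", "stairs_1", heading=90.0, distance=12.5)
    g.add_path("door_A", "turn_B", heading=180.0, distance=6.0)
    assert g.next_landmark("door_A", 95.0) == "stairs_1"
    ```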

  12. Interior building details of Building C, Room C203: detail decorative ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior building details of Building C, Room C-203: detail decorative radiator and four-over-four windows; southwesterly view - San Quentin State Prison, Building 22, Point San Quentin, San Quentin, Marin County, CA

  13. Plan, Detail of Lower Chord, Section at U8L8, Detail of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Plan, Detail of Lower Chord, Section at U8L8, Detail of Upper Chord - Springfield-Des Arc Bridge, Spanning North Branch of Cadron Creek at Old Springfield-Des Arc Road (County Road 222), Springfield, Conway County, AR

  14. 36 CFR 1202.26 - Who will make sure that my record is accurate?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... RECORDS ADMINISTRATION GENERAL RULES REGULATIONS IMPLEMENTING THE PRIVACY ACT OF 1974 Collecting Information § 1202.26 Who will make sure that my record is accurate? The system manager ensures that...

  15. Feedback about More Accurate versus Less Accurate Trials: Differential Effects on Self-Confidence and Activation

    ERIC Educational Resources Information Center

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-01-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On Day 1, participants performed a golf putting task under one of…

  16. Balancing detail and scale in assessing transparency to improve the governance of agricultural commodity supply chains

    NASA Astrophysics Data System (ADS)

    Godar, Javier; Suavet, Clément; Gardner, Toby A.; Dawkins, Elena; Meyfroidt, Patrick

    2016-03-01

    To date, assessments of the sustainability of agricultural commodity supply chains have largely relied on some combination of macro-scale footprint accounts, detailed life-cycle analyses and fine-scale traceability systems. Yet these approaches are limited in their ability to support the sustainability governance of agricultural supply chains, whether because they are intended for coarser-grained analyses, do not identify individual actors, or are too costly to be implemented in a consistent manner for an entire region of production. Here we illustrate some of the advantages of a complementary middle-ground approach that balances detail and scale of supply chain transparency information by combining consistent country-wide data on commodity production at the sub-national (e.g. municipal) level with per shipment customs data to describe trade flows of a given commodity covering all companies and production regions within that country. This approach can support supply chain governance in two key ways. First, enhanced spatial resolution of the production regions that connect to individual supply chains allows for a more accurate consideration of geographic variability in measures of risk and performance that are associated with different production practices. Second, identification of key actors that operate within a specific supply chain, including producers, traders, shippers and consumers can help discriminate coalitions of actors that have shared stake in a particular region, and that together are capable of delivering more cost-effective and coordinated interventions. We illustrate the potential of this approach with examples from Brazil, Indonesia and Colombia. We discuss how transparency information can deepen understanding of the environmental and social impacts of commodity production systems, how benefits are distributed among actors, and some of the trade-offs involved in efforts to improve supply chain sustainability. We then discuss the challenges and

  17. [Teacher Referral Information and Statistical Information Forms.

    ERIC Educational Resources Information Center

    Short, N. J.

    This rating information form, used to refer children to the PIC program, elicits information concerning the child's emotional, cognitive, and personality development. See TM 001 111 for details of the program in which it is used. (DLG)

  18. Accurate Guitar Tuning by Cochlear Implant Musicians

    PubMed Central

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081

  19. New model accurately predicts reformate composition

    SciTech Connect

    Ancheyta-Juarez, J.; Aguilar-Rodriguez, E.

    1994-01-31

    Although naphtha reforming is a well-known process, the evolution of catalyst formulation, as well as new trends in gasoline specifications, have led to rapid evolution of the process, including: reactor design, regeneration mode, and operating conditions. Mathematical modeling of the reforming process is an increasingly important tool. It is fundamental to the proper design of new reactors and revamp of existing ones. Modeling can be used to optimize operating conditions, analyze the effects of process variables, and enhance unit performance. Instituto Mexicano del Petroleo has developed a model of the catalytic reforming process that accurately predicts reformate composition at the higher-severity conditions at which new reformers are being designed. The new AA model is more accurate than previous proposals because it takes into account the effects of temperature and pressure on the rate constants of each chemical reaction.

  20. Accurate colorimetric feedback for RGB LED clusters

    NASA Astrophysics Data System (ADS)

    Man, Kwong; Ashdown, Ian

    2006-08-01

    We present an empirical model of LED emission spectra that is applicable to both InGaN and AlInGaP high-flux LEDs, and which accurately predicts their relative spectral power distributions over a wide range of LED junction temperatures. We further demonstrate with laboratory measurements that changes in LED spectral power distribution with temperature can be accurately predicted with first- or second-order equations. This provides the basis for a real-time colorimetric feedback system for RGB LED clusters that can maintain the chromaticity of white light at constant intensity to within ±0.003 Δuv over a range of 45 degrees Celsius, and to within 0.01 Δuv when dimmed over an intensity range of 10:1.
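    As a rough illustration of the first-/second-order temperature modeling described above, the sketch below rescales per-channel drive levels using a quadratic flux-vs-junction-temperature model. The coefficients, function names, and 25 °C reference point are invented for illustration, not measured values from the paper.

    ```python
    # Hypothetical quadratic flux-vs-temperature model for RGB channels.
    # Coefficients are illustrative placeholders, not measured values.
    COEFFS = {  # flux_rel(T) = a + b*(T - 25) + c*(T - 25)**2
        "red":   (1.0, -8.0e-3, -2.0e-5),
        "green": (1.0, -3.0e-3, -1.0e-5),
        "blue":  (1.0, -1.0e-3, -5.0e-6),
    }

    def relative_flux(channel, t_junction_c):
        """Relative luminous flux of a channel at the given junction temperature."""
        a, b, c = COEFFS[channel]
        dt = t_junction_c - 25.0
        return a + b * dt + c * dt * dt

    def corrected_drive(channel, target_drive, t_junction_c):
        """Scale the commanded drive level up to offset thermal flux droop."""
        return target_drive / relative_flux(channel, t_junction_c)
    ```

    A feedback controller of the kind the abstract describes would re-evaluate these corrections in real time as the measured junction temperature changes, keeping the mixed chromaticity fixed.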

  1. Accurate guitar tuning by cochlear implant musicians.

    PubMed

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼ 30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task.

  2. Vadose zone transport field study: Detailed test plan for simulated leak tests

    SciTech Connect

    AL Ward; GW Gee

    2000-06-23

    The US Department of Energy (DOE) Groundwater/Vadose Zone Integration Project Science and Technology initiative was created in FY 1999 to reduce the uncertainty associated with vadose zone transport processes beneath waste sites at DOE's Hanford Site near Richland, Washington. This information is needed not only to evaluate the risks from transport, but also to support the adoption of measures for minimizing impacts to the groundwater and surrounding environment. The principal uncertainties in vadose zone transport are the current distribution of source contaminants and the natural heterogeneity of the soil in which the contaminants reside. Oversimplified conceptual models resulting from these uncertainties and limited use of hydrologic characterization and monitoring technologies have hampered the understanding of contaminant migration through Hanford's vadose zone. Essential prerequisites for reducing vadose zone transport uncertainty include the development of accurate conceptual models and the development or adoption of monitoring techniques capable of delineating the current distributions of source contaminants and characterizing natural site heterogeneity. The Vadose Zone Transport Field Study (VZTFS) was conceived as part of the initiative to address the major uncertainties confronting vadose zone fate and transport predictions at the Hanford Site and to overcome the limitations of previous characterization attempts. Pacific Northwest National Laboratory (PNNL) is managing the VZTFS for DOE. The VZTFS will conduct field investigations that will improve the understanding of field-scale transport and lead to the development or identification of efficient and cost-effective characterization methods. Ideally, these methods will capture the extent of contaminant plumes using existing infrastructure (i.e., more than 1,300 steel-cased boreholes). The objectives of the VZTFS are to conduct controlled transport experiments at well-instrumented field sites at Hanford to

  3. Detailed seafloor habitat mapping to enhance marine-resource management

    USGS Publications Warehouse

    Zawada, David G.; Hart, Kristen M.

    2010-01-01

    Pictures of the seafloor capture important information about the sediments, exposed geologic features, submerged aquatic vegetation, and animals found in a given habitat. With the emergence of marine protected areas (MPAs) as a favored tactic for preserving coral reef resources, knowledge of essential habitat components is paramount to designing effective management strategies. Surprisingly, detailed information on seafloor habitat components is not available in many areas that are being considered for MPA designation or that are already designated as MPAs. A task of the U.S. Geological Survey Coral Reef Ecosystem STudies (USGS CREST) project is addressing this issue.

  4. An Accurate, Simplified Model of Intrabeam Scattering

    SciTech Connect

    Bane, Karl LF

    2002-05-23

    Beginning with the general Bjorken-Mtingwa solution for intrabeam scattering (IBS) we derive an accurate, greatly simplified model of IBS, valid for high energy beams in normal storage ring lattices. In addition, we show that, under the same conditions, a modified version of Piwinski's IBS formulation (where η_{x,y}²/β_{x,y} has been replaced by ℋ_{x,y}) asymptotically approaches the result of Bjorken-Mtingwa.

  5. An accurate registration technique for distorted images

    NASA Technical Reports Server (NTRS)

    Delapena, Michele; Shaw, Richard A.; Linde, Peter; Dravins, Dainis

    1990-01-01

    Accurate registration of International Ultraviolet Explorer (IUE) images is crucial because the variability of the geometrical distortions that are introduced by the SEC-Vidicon cameras ensures that raw science images are never perfectly aligned with the Intensity Transfer Functions (ITFs) (i.e., graded floodlamp exposures that are used to linearize and normalize the camera response). A technique for precisely registering IUE images which uses a cross correlation of the fixed pattern that exists in all raw IUE images is described.
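    The fixed-pattern cross-correlation idea can be illustrated with a generic FFT-based shift estimator. This is a minimal sketch of cross-correlation registration in general, under the assumption of a pure integer translation; it is not the IUE pipeline itself, which must also handle camera-dependent geometric distortion.

    ```python
    import numpy as np

    def register_shift(ref, img):
        """Estimate the integer (row, col) shift of img relative to ref from
        the peak of their circular, FFT-based cross-correlation."""
        corr = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # Peaks past the midpoint of the circular correlation correspond
        # to negative shifts.
        return tuple(int(p) - s if p > s // 2 else int(p)
                     for p, s in zip(peak, corr.shape))

    rng = np.random.default_rng(0)
    ref = rng.random((64, 64))           # stands in for the fixed pattern
    shifted = np.roll(ref, (3, -5), axis=(0, 1))
    print(register_shift(ref, shifted))  # → (3, -5)
    ```

    Because the fixed pattern is present in every raw image, correlating it rather than the science content makes the alignment independent of what was observed.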

  6. Accurate determination of heteroclinic orbits in chaotic dynamical systems

    NASA Astrophysics Data System (ADS)

    Li, Jizhou; Tomsovic, Steven

    2017-03-01

    Accurate calculation of heteroclinic and homoclinic orbits can be of significant importance in some classes of dynamical system problems. Yet for very strongly chaotic systems initial deviations from a true orbit will be magnified by a large exponential rate making direct computational methods fail quickly. In this paper, a method is developed that avoids direct calculation of the orbit by making use of the well-known stability property of the invariant unstable and stable manifolds. Under an area-preserving map, this property assures that any initial deviation from the stable (unstable) manifold collapses onto them under inverse (forward) iterations of the map. Using a set of judiciously chosen auxiliary points on the manifolds, long orbit segments can be calculated using the stable and unstable manifold intersections of the heteroclinic (homoclinic) tangle. Detailed calculations using the example of the kicked rotor are provided along with verification of the relation between action differences and certain areas bounded by the manifolds.
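    For context, the kicked rotor used in the paper's examples reduces stroboscopically to the Chirikov standard map. The sketch below iterates only the forward map to show the exponential sensitivity that defeats direct orbit computation; the manifold-based construction the paper develops is not reproduced here, and the kicking strength chosen is an arbitrary strongly chaotic value.

    ```python
    import math

    def standard_map(theta, p, K):
        """One iteration of the Chirikov standard map (kicked rotor), mod 2*pi."""
        p_new = (p + K * math.sin(theta)) % (2 * math.pi)
        theta_new = (theta + p_new) % (2 * math.pi)
        return theta_new, p_new

    # Two nearby initial conditions: in the strongly chaotic regime their
    # separation is magnified exponentially, which is why direct long-orbit
    # computation fails and manifold-based methods are needed.
    K = 9.0  # strongly chaotic regime (illustrative choice)
    a, b = (1.0, 0.5), (1.0 + 1e-9, 0.5)
    max_sep = 0.0
    for _ in range(50):
        a = standard_map(*a, K)
        b = standard_map(*b, K)
        max_sep = max(max_sep, abs(a[0] - b[0]))
    # max_sep grows from 1e-9 to order one within a few dozen kicks.
    ```

    The stability property exploited in the paper runs this amplification in reverse: deviations transverse to the stable (unstable) manifold collapse under inverse (forward) iteration, so orbit segments can be pinned to the manifolds instead of integrated directly.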

  7. Memory for Specific Visual Details can be Enhanced by Negative Arousing Content

    ERIC Educational Resources Information Center

    Kensinger, Elizabeth A.; Garoff-Eaton, Rachel J.; Schacter, Daniel L.

    2006-01-01

    Individuals often claim that they vividly remember information with negative emotional content. At least two types of information could lead to this sense of enhanced vividness: Information about the emotional item itself (e.g., the exact visual details of a snake) and information about the context in which the emotional item was encountered…

  8. Accurate documentation, correct coding, and compliance: it's your best defense!

    PubMed

    Coles, T S; Babb, E F

    1999-07-01

    This article focuses on the need for physicians to maintain an awareness of regulatory policy and the law impacting the federal government's medical insurance programs, and to internalize and apply this knowledge in their practices. Basic information concerning selected fraud and abuse statutes and the civil monetary penalties and sanctions for noncompliance is discussed. The application of accurate documentation and correct coding principles, as well as the rationale for implementing an effective compliance plan in order to prevent fraud and abuse and/or minimize disciplinary action from government regulatory agencies, are emphasized.

  9. High Frequency QRS ECG Accurately Detects Cardiomyopathy

    NASA Technical Reports Server (NTRS)

    Schlegel, Todd T.; Arenare, Brian; Poulin, Gregory; Moser, Daniel R.; Delgado, Reynolds

    2005-01-01

    High frequency (HF, 150-250 Hz) analysis over the entire QRS interval of the ECG is more sensitive than conventional ECG for detecting myocardial ischemia. However, the accuracy of HF QRS ECG for detecting cardiomyopathy is unknown. We obtained simultaneous resting conventional and HF QRS 12-lead ECGs in 66 patients with cardiomyopathy (EF = 23.2 ± 6.1%, mean ± SD) and in 66 age- and gender-matched healthy controls using PC-based ECG software recently developed at NASA. The single most accurate ECG parameter for detecting cardiomyopathy was an HF QRS morphological score that takes into consideration the total number and severity of reduced amplitude zones (RAZs) present plus the clustering of RAZs together in contiguous leads. This RAZ score had an area under the receiver operator curve (ROC) of 0.91, and was 88% sensitive, 82% specific and 85% accurate for identifying cardiomyopathy at an optimum score cut-off of 140 points. Although conventional ECG parameters such as the QRS and QTc intervals were also significantly longer in patients than controls (P < 0.001, BBBs excluded), these conventional parameters were less accurate (area under the ROC = 0.77 and 0.77, respectively) than HF QRS morphological parameters for identifying underlying cardiomyopathy. The total amplitude of the HF QRS complexes, as measured by summed root mean square voltages (RMSVs), also differed between patients and controls (33.8 ± 11.5 vs. 41.5 ± 13.6 mV, respectively, P < 0.003), but this parameter was even less accurate in distinguishing the two groups (area under ROC = 0.67) than the HF QRS morphologic and conventional ECG parameters. Diagnostic accuracy was optimal (86%) when the RAZ score from the HF QRS ECG and the QTc interval from the conventional ECG were used simultaneously with cut-offs of ≥40 points and ≥445 ms, respectively. In conclusion 12-lead HF QRS ECG employing
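    The dual cut-off rule described at the end of the abstract amounts to a simple threshold conjunction. The sketch below assumes the two cut-offs are combined with a logical AND, which the abstract's "used simultaneously" suggests but does not state explicitly; the function name is invented.

    ```python
    # Schematic of the combined HF QRS + conventional ECG screen. The
    # conjunctive (AND) combination is an assumption, not the authors'
    # published algorithm; thresholds are the abstract's reported cut-offs.
    def flags_cardiomyopathy(raz_score_points, qtc_ms):
        """Positive when both the HF QRS RAZ score and the conventional
        QTc interval meet their respective cut-offs (>=40 points, >=445 ms)."""
        return raz_score_points >= 40 and qtc_ms >= 445
    ```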

  10. Detail in architecture: Between arts & crafts

    NASA Astrophysics Data System (ADS)

    Dulencin, Juraj

    2016-06-01

    Architectural detail represents an important part of architecture. Not only can it be used as an identifier of a specific building, but it also enhances the experience of the realized project. Within it lie the signs of a great architect and clues to understanding his or her way of thinking. It is therefore the central topic of a seminar offered to architecture students at the Brno University of Technology. During the course of the semester-long class the students acquaint themselves with atypical architectural details of domestic and international architects by learning to read them, understand them and subsequently draw them by creating architectural blueprints. In other words, by general analysis of a detail the students learn the theoretical thinking of its architect who, depending on the nature of the design, had to incorporate a variety of techniques and crafts. Students apply this analytical part to their own architectural detail design. The methodology of the seminar consists of experiential learning by project management and is complemented by a series of lectures discussing a diversity of details as well as the materials and technologies required to implement them. The architectural detail design is also part of the students' bachelor's theses; therefore, the realistic nature of their blueprints can be verified in the production of their physical counterparts. Based on their own documentation the students choose the most suitable manufacturing process, whether it is supplied by a specific technology or by a craftsman. Students actively participate in the production and correct their design proposals in real scale with the actual material. A student, as a future architect, stands somewhere between a client and an artisan, materializes his or her idea and adjusts the manufacturing process so that the final detail achieves aesthetic consistency and is in harmony with its initial concept. One of the very important aspects of the design is its economic cost, an

  11. Accurate measurement of streamwise vortices using dual-plane PIV

    NASA Astrophysics Data System (ADS)

    Waldman, Rye M.; Breuer, Kenneth S.

    2012-11-01

    Low Reynolds number aerodynamic experiments with flapping animals (such as bats and small birds) are of particular interest due to their application to micro air vehicles, which operate in a similar parameter space. Previous PIV wake measurements described the structures left by bats and birds and provided insight into the time history of their aerodynamic force generation; however, these studies have faced difficulty drawing quantitative conclusions based on these measurements. The highly three-dimensional and unsteady nature of the flows associated with flapping flight is a major challenge for accurate measurement. The challenge of animal flight measurements is finding small flow features in a large field of view at high speed with limited laser energy and camera resolution. Cross-stream measurement is further complicated by the predominantly out-of-plane flow that requires thick laser sheets and short inter-frame times, which increase noise and measurement uncertainty. Choosing appropriate experimental parameters requires compromise between the spatial and temporal resolution and the dynamic range of the measurement. To explore these challenges, we present a case study on the wake of a fixed wing. The fixed model simplifies the experiment and allows direct measurements of the aerodynamic forces via a load cell. We present a detailed analysis of the wake measurements, discuss the criteria for making accurate measurements, and present a solution for making quantitative aerodynamic load measurements behind free-flyers.

  12. Accurate thermoelastic tensor and acoustic velocities of NaCl

    NASA Astrophysics Data System (ADS)

    Marcondes, Michel L.; Shukla, Gaurav; da Silveira, Pedro; Wentzcovitch, Renata M.

    2015-12-01

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  13. Accurate thermoelastic tensor and acoustic velocities of NaCl

    SciTech Connect

    Marcondes, Michel L.; Shukla, Gaurav; Silveira, Pedro da; Wentzcovitch, Renata M.

    2015-12-15

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  14. Somerset County Flood Information System

    USGS Publications Warehouse

    Summer, William M.

    1998-01-01

    IntroductionThe timely warning of a flood is crucial to the protection of lives and property. One has only to recall the flood of August 2, 1973, in Somerset County, New Jersey, in which six lives were lost and major property damage occurred, to realize how unexpected and costly, especially in terms of human life, a flood can be. Accurate forecasts and warnings cannot be made, however, without detailed information about precipitation and streamflow in the drainage basin.Recognizing the need for detailed hydrologic information for Somerset County, the U.S. Geological Survey (USGS), in cooperation with Somerset County, installed the Somerset County Flood Information System (SCFIS) in 1990. The availability of data provided by this system will improve the flood forecasting ability of the National Weather Service (NWS), and has assisted Somerset County and municipal agencies in planning and execution of flood-preparation and emergency evacuation procedures in the county.This fact sheet describes the Somerset County Flood Information System and identifies its benefits.

  15. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy

    PubMed Central

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T.; Cerutti, Francesco; Chin, Mary P. W.; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G.; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R.; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both ⁴He and ¹²C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions lead to the excellent agreement of calculated depth–dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is capable of importing also radiotherapy treatment data described in DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features. PMID:27242956

  16. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy.

    PubMed

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T; Cerutti, Francesco; Chin, Mary P W; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both ⁴He and ¹²C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions lead to the excellent agreement of calculated depth-dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is capable of importing also radiotherapy treatment data described in DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features.

  17. D-BRAIN: Anatomically Accurate Simulated Diffusion MRI Brain Data.

    PubMed

    Perrone, Daniele; Jeurissen, Ben; Aelterman, Jan; Roine, Timo; Sijbers, Jan; Pizurica, Aleksandra; Leemans, Alexander; Philips, Wilfried

    2016-01-01

    Diffusion Weighted (DW) MRI allows for the non-invasive study of water diffusion inside living tissues. As such, it is useful for the investigation of human brain white matter (WM) connectivity in vivo through fiber tractography (FT) algorithms. Many DW-MRI tailored restoration techniques and FT algorithms have been developed. However, it is not clear how accurately these methods reproduce the WM bundle characteristics in real-world conditions, such as in the presence of noise, partial volume effect, and a limited spatial and angular resolution. The difficulty lies in the lack of a realistic brain phantom on the one hand, and a sufficiently accurate way of modeling the acquisition-related degradation on the other. This paper proposes a software phantom that approximates a human brain to a high degree of realism and that can incorporate complex brain-like structural features. We refer to it as a Diffusion BRAIN (D-BRAIN) phantom. Also, we propose an accurate model of a (DW) MRI acquisition protocol to allow for validation of methods in realistic conditions with data imperfections. The phantom model simulates anatomical and diffusion properties for multiple brain tissue components, and can serve as a ground-truth to evaluate FT algorithms, among others. The simulation of the acquisition process allows one to include noise, partial volume effects, and limited spatial and angular resolution in the images. In this way, the effect of image artifacts on, for instance, fiber tractography can be investigated with great detail. The proposed framework enables reliable and quantitative evaluation of DW-MR image processing and FT algorithms at the level of large-scale WM structures. The effect of noise levels and other data characteristics on cortico-cortical connectivity and tractography-based grey matter parcellation can be investigated as well.

  18. Educational Outreach to Opioid Prescribers: The Case for Academic Detailing.

    PubMed

    Trotter Davis, Margot; Bateman, Brian; Avorn, Jerry

    2017-02-01

    Nonmedical use of opioid medications constitutes a serious health threat, as rates of addiction, overdoses, and deaths have risen in recent years. Inappropriate and excessively liberal prescribing of opioids by physicians is increasingly understood to be a central part of the crisis. Public health officials, hospital systems, and legislators are developing programs and regulations to address the problem in sustained and systematic ways that both ensure effective treatment of pain and place appropriate limits on the availability of opioids. Three approaches have gained prominence as means of avoiding excessive and inappropriate prescribing: providing financial incentives to physicians to change their clinical decisions through pay-for-performance contracts, monitoring patient medications through Prescription Drug Monitoring Programs, and educational outreach to physicians. A promising form of educational outreach is an intervention known as "academic detailing." It was developed in the 1980s to provide one-on-one educational outreach to physicians, using methods similar to those of the pharmaceutical industry, which sends "detailers" to market its products to physician practices. Core to academic detailing, however, is the idea that medical decisions should be based on evidence-based information, including managing conditions with updated assessment measures and behavioral and nonpharmacological interventions. With the pharmaceutical industry spending billions of dollars to advertise its products, individual practitioners can have difficulty gathering unbiased information, especially as the number of approved medications grows each year. Academic detailing has successfully affected the management of health conditions such as atrial fibrillation and chronic obstructive pulmonary disease, and has recently targeted physicians who prescribe opioids. This article discusses the approach as a potentially effective preventive intervention to address the opioid crisis.

  19. States Anxious to Get Details about Stimulus

    ERIC Educational Resources Information Center

    Hoff, David J.

    2009-01-01

    As Congress began debate last week over the size and scope of more than $120 billion in proposed emergency education aid, state leaders were anxiously awaiting the details so they could make specific plans to spend the economic-stimulus money. Governors, state legislators, and state schools chiefs have yet to learn what rules Congress will attach…

  20. Third Sound Amplification and Detailed Balance

    SciTech Connect

    Eddinger, J. D.; Ellis, F. M.

    2006-09-07

    Condensation of atoms from the vapor into a third sound resonance is expected to be capable of acoustic amplification. This results from normal to superfluid conversion that coherently accommodates atoms into the third sound velocity field. Consideration of third sound in light of the equilibrium detailed balance between vapor particles and the superfluid film provides further evidence that acoustic amplification is attainable.

  1. Details on the biography of Jerzy Neyman

    NASA Astrophysics Data System (ADS)

    Gaina, Alex

    2003-04-01

    Details on the biography of Jerzy Neyman (1894-1981) are given, together with a short outline of Tighina in Basarabia (the Republic of Moldova), the native town of this outstanding mathematician and statistician, astronomer, meteorologist, biologist, philosopher, sociologist, and founder of the mathematical theory of selection.

  2. Big Heads, Small Details and Autism

    ERIC Educational Resources Information Center

    White, Sarah; O'Reilly, Helen; Frith, Uta

    2009-01-01

    Autism is thought to be associated with a bias towards detail-focussed processing. While the cognitive basis remains controversial, one strong hypothesis is that there are high processing costs associated with changing from local into global processing. A possible neural mechanism underlying this processing style is abnormal neural connectivity;…

  3. Flexibility Detailed for Testing Students with Disabilities

    ERIC Educational Resources Information Center

    Samuels, Christina A.

    2006-01-01

    A proposed federal regulation on testing students with disabilities provides details on the flexibility available to states and schools for meeting the requirements of the No Child Left Behind Act. While state education officials have generally welcomed the flexibility, representatives from advocacy groups for people with disabilities say they are…

  4. The rich detail of cultural symbol systems.

    PubMed

    Read, Dwight W

    2014-08-01

    The goal of forming a science of intentional behavior requires a more richly detailed account of symbolic systems than is assumed by the authors. Cultural systems are not simply the equivalent in the ideational domain of culture of the purported Baldwin Effect in the genetic domain.

  5. Detailed modeling of cluster galaxies in free-form lenses

    NASA Astrophysics Data System (ADS)

    Lam, Daniel

    2015-08-01

    The main goal of the Frontier Fields is to characterize the population of high-redshift galaxies that are gravitationally lensed and magnified by foreground massive galaxy clusters. The magnification received by lensed images has to be accurately quantified in order to derive correct science results. The magnification is in turn computed from lens models, which are constructed from various constraints, most commonly the positions and redshifts of multiply-lensed galaxies. The locations and magnifications of multiple images that appear near cluster galaxies are very sensitive to the mass distribution of those individual galaxies. In current free-form lens models, they are at best crudely approximated by arbitrary mass halos and are usually neglected completely. Given sufficient free parameters and iterations, such models may be highly consistent, but their predictive power would be rather limited. This shortcoming is particularly pronounced in light of the recent discovery of the first multiply-lensed supernova in the Frontier Fields cluster MACSJ1149. The proximity of its images to cluster galaxies mandates detailed modeling on galaxy scales, where free-form methods based solely on grid solutions simply fail. We present a hybrid free-form lens model of Abell 2744, which for the first time incorporates a detailed mass component modeled by GALFIT that accurately captures the stellar light distribution of the hundred brightest cluster galaxies. The model reproduces the image positions better than a previous version, which modeled cluster galaxies with simplistic NFW halos. Curiously, this improvement is found in all but system 2, which has two radial images appearing around the BCG. Although its complex light profile is captured by GALFIT, the persistent discrepancies suggest considering mass distributions that may be largely offset from the stellar light distribution.

  6. Can radiation therapy treatment planning system accurately predict surface doses in postmastectomy radiation therapy patients?

    SciTech Connect

    Wong, Sharon; Back, Michael; Tan, Poh Wee; Lee, Khai Mun; Baggarley, Shaun; Lu, Jaide Jay

    2012-07-01

    Skin doses have been an important factor in the dose prescription for breast radiotherapy. Recent advances in radiotherapy treatment techniques, such as intensity-modulated radiation therapy (IMRT), and new treatment schemes, such as hypofractionated breast therapy, have made the precise determination of the surface dose necessary. Detailed information on the dose at various depths of the skin is also critical in designing new treatment strategies. The purpose of this work was to compare surface doses calculated by a clinically used treatment planning system with those measured by thermoluminescence dosimeters (TLDs) in a customized chest wall phantom. This study involved the construction of a chest wall phantom for skin dose assessment. Seven TLDs were distributed throughout each right chest wall phantom to give an adequate representation of measured radiation doses. Point doses from the CMS XiO treatment planning system (TPS) were calculated for each relevant TLD position and the results correlated. There was no significant difference between the absorbed doses measured by TLD and those calculated by the TPS (p > 0.05, one-tailed). Dose accuracy of up to 2.21% was found. The deviations from the calculated absorbed doses were overall larger (3.4%) when wedges and bolus were used. A 3D radiotherapy TPS is a useful and accurate tool to assess surface dose. Our studies have shown that radiation treatment accuracy, expressed as a comparison between calculated doses (by TPS) and measured doses (by TLD dosimetry), can be accurately predicted for tangential treatment of the chest wall after mastectomy.

  7. Accurate upwind methods for the Euler equations

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1993-01-01

    A new class of piecewise linear methods for the numerical solution of the one-dimensional Euler equations of gas dynamics is presented. These methods are uniformly second-order accurate, and can be considered as extensions of Godunov's scheme. With an appropriate definition of monotonicity preservation for the case of linear convection, it can be shown that they preserve monotonicity. Similar to Van Leer's MUSCL scheme, they consist of two key steps: a reconstruction step followed by an upwind step. For the reconstruction step, a monotonicity constraint that preserves uniform second-order accuracy is introduced. Computational efficiency is enhanced by devising a criterion that detects the 'smooth' part of the data where the constraint is redundant. The concept and coding of the constraint are simplified by the use of the median function. A slope steepening technique, which has no effect in smooth regions and can resolve a contact discontinuity in four cells, is described. As for the upwind step, existing and new methods are applied in a manner slightly different from those in the literature. These methods are derived by approximating the Euler equations via linearization and diagonalization. At a 'smooth' interface, Harten, Lax, and Van Leer's one-intermediate-state model is employed. A modification of this model that can resolve contact discontinuities is presented. Near a discontinuity, either this modified model or a more accurate one, namely Roe's flux-difference splitting, is used. The current presentation of Roe's method, via the conceptually simple flux-vector splitting, not only establishes a connection between the two splittings, but also leads to an admissibility correction with no conditional statement, and an efficient approximation to Osher's approximate Riemann solver. These reconstruction and upwind steps result in schemes that are uniformly second-order accurate and economical in smooth regions, and yield high resolution at discontinuities.
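    The median function mentioned in the abstract is commonly expressed through the minmod limiter used in slope reconstruction. The following is a generic sketch of that relationship and of a minmod-limited slope, not the paper's exact monotonicity constraint:

    ```python
    def minmod(x, y):
        """Return the argument of smaller magnitude if the signs agree,
        otherwise zero (the classic slope limiter)."""
        if x * y <= 0.0:
            return 0.0
        return x if abs(x) < abs(y) else y

    def median(a, b, c):
        """Median of three values, expressed via minmod as is common in
        limiter coding: median(a, b, c) = a + minmod(b - a, c - a)."""
        return a + minmod(b - a, c - a)

    def limited_slope(u, i):
        """Monotonicity-limited slope for cell i from neighbouring cell
        averages u (a minimal second-order reconstruction sketch)."""
        return minmod(u[i + 1] - u[i], u[i] - u[i - 1])
    ```

    Near an extremum the two one-sided differences have opposite signs, so the limited slope collapses to zero, which is precisely what prevents new oscillations from being created.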

  8. Accurate measurement of unsteady state fluid temperature

    NASA Astrophysics Data System (ADS)

    Jaremkiewicz, Magdalena

    2017-03-01

    In this paper, two accurate methods for determining the transient fluid temperature are presented. Measurements were conducted for boiling water, since its temperature is known. Initially the thermometers are at ambient temperature; they are then immersed immediately into saturated water. The measurements were carried out with two thermometers of different construction but with the same housing outer diameter, equal to 15 mm. One of them is a K-type industrial thermometer widely available commercially. The temperature indicated by the thermometer was corrected by treating the thermometer as a first- or second-order inertia device. A new thermometer design was proposed and also used to measure the temperature of boiling water. Its characteristic feature is a cylinder-shaped housing with the sheath thermocouple located in its center. The temperature of the fluid was determined based on measurements taken in the axis of the solid cylindrical element (housing) using the inverse space marching method. Measurements of the transient temperature of the air flowing through a wind tunnel using the same thermometers were also carried out. The proposed measurement technique provides more accurate results than measurements using industrial thermometers in conjunction with a simple temperature correction based on a first- or second-order inertia model of the thermometer. By comparing the results, it was demonstrated that the new thermometer yields the fluid temperature much faster and with higher accuracy than the industrial thermometer. Accurate measurements of a fast-changing fluid temperature are possible thanks to the low-inertia thermometer and the fast space marching method applied for solving the inverse heat conduction problem.
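    For the first-order inertia model mentioned above, the indicated temperature T_i obeys tau * dT_i/dt + T_i = T_fluid, so the fluid temperature can be recovered by adding tau times the time derivative of the reading. A minimal sketch, assuming the time constant tau is known and using central differences (this illustrates only the simple inertia correction, not the paper's inverse space marching method):

    ```python
    import math

    def correct_first_order(t, T_ind, tau):
        """Recover the fluid temperature from indicated readings via
        T_fluid = T_ind + tau * dT_ind/dt (central differences, so the
        first and last samples are dropped)."""
        T_fluid = []
        for k in range(1, len(t) - 1):
            dTdt = (T_ind[k + 1] - T_ind[k - 1]) / (t[k + 1] - t[k - 1])
            T_fluid.append(T_ind[k] + tau * dTdt)
        return T_fluid

    # Synthetic check: a step from 20 to 100 degrees C seen through a
    # tau = 5 s first-order sensor sampled every 0.1 s.
    tau, T0, T1 = 5.0, 20.0, 100.0
    t = [0.1 * k for k in range(200)]
    T_ind = [T1 + (T0 - T1) * math.exp(-tk / tau) for tk in t]
    T_rec = correct_first_order(t, T_ind, tau)
    ```

    On the synthetic exponential response the correction recovers the true 100-degree fluid temperature almost exactly; with real, noisy readings the differentiation step amplifies noise, which is one motivation for the more elaborate methods the paper develops.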

  9. The first accurate description of an aurora

    NASA Astrophysics Data System (ADS)

    Schröder, Wilfried

    2006-12-01

    As technology has advanced, the scientific study of auroral phenomena has increased by leaps and bounds. A look back at the earliest descriptions of aurorae offers an interesting look into how medieval scholars viewed the subjects that we study. Although there are earlier fragmentary references in the literature, the first accurate description of the aurora borealis appears to be that published by the German Catholic scholar Konrad von Megenberg (1309-1374) in his book Das Buch der Natur (The Book of Nature). The book was written between 1349 and 1350.

  10. Determining accurate distances to nearby galaxies

    NASA Astrophysics Data System (ADS)

    Bonanos, Alceste Zoe

    2005-11-01

    Determining accurate distances to nearby or distant galaxies is a conceptually very simple, yet practically complicated, task. Presently, distances to nearby galaxies are known only to an accuracy of 10-15%. The current anchor galaxy of the extragalactic distance scale is the Large Magellanic Cloud, which has large (10-15%) systematic uncertainties associated with it because of its morphology, its non-uniform reddening, and the unknown metallicity dependence of the Cepheid period-luminosity relation. This work aims to determine accurate distances to some nearby galaxies, and subsequently help reduce the error in the extragalactic distance scale and the Hubble constant H0. In particular, this work presents the first distance determination of the DIRECT Project to M33 with detached eclipsing binaries. DIRECT aims to obtain a new anchor galaxy for the extragalactic distance scale by measuring direct, accurate (to 5%) distances to two Local Group galaxies, M31 and M33, with detached eclipsing binaries. It involves a massive variability survey of these galaxies and subsequent photometric and spectroscopic follow-up of the detached binaries discovered. In this work, I also present a catalog of variable stars discovered in one of the DIRECT fields, M31Y, which includes 41 eclipsing binaries. Additionally, we derive the distance to the Draco Dwarf Spheroidal galaxy, with ~100 RR Lyrae found in our first CCD variability study of this galaxy. A "hybrid" method of discovering Cepheids with ground-based telescopes is described next. It involves applying the image subtraction technique to images obtained from ground-based telescopes and then following them up with the Hubble Space Telescope to derive Cepheid period-luminosity distances. By re-analyzing ESO Very Large Telescope data on M83 (NGC 5236), we demonstrate that this method is much more powerful for detecting variability, especially in crowded fields. I finally present photometry for the Wolf-Rayet binary WR 20a.

  11. Iterative feature refinement for accurate undersampled MR image reconstruction

    NASA Astrophysics Data System (ADS)

    Wang, Shanshan; Liu, Jianbo; Liu, Qiegen; Ying, Leslie; Liu, Xin; Zheng, Hairong; Liang, Dong

    2016-05-01

    Accelerating MR scanning is of great significance for clinical, research, and advanced applications, and one main effort to achieve this is the utilization of compressed sensing (CS) theory. Nevertheless, existing CSMRI approaches still have limitations, such as fine-structure loss or high computational complexity. This paper proposes a novel iterative feature refinement (IFR) module for accurate MR image reconstruction from undersampled k-space data. Integrating IFR with CSMRI equipped with fixed transforms, we develop an IFR-CS method to restore meaningful structures and details that would otherwise be discarded, without introducing too much additional complexity. Specifically, the proposed IFR-CS is realized with three iterative steps, namely sparsity-promoting denoising, feature refinement, and Tikhonov regularization. Experimental results on both simulated and in vivo MR datasets have shown that the proposed module has a strong capability to capture image details, and that IFR-CS is comparable and even superior to other state-of-the-art reconstruction approaches.

  12. Higher Education in France: A Handbook of Information Concerning Fields of Study in Each Institution. Bulletin, 1952, No. 6

    ERIC Educational Resources Information Center

    Kahler, Edith

    1952-01-01

    Advising students who wish to study in other countries is often difficult because accurate, up-to-date, and detailed information about the offerings in their higher institutions is frequently unavailable. A student wishing to study in a given country needs to know what the several institutions offer not only in his own subject area but also in…

  13. Detailed Facility Report | ECHO | US EPA

    EPA Pesticide Factsheets

    ECHO, Enforcement and Compliance History Online, provides compliance and enforcement information for approximately 800,000 EPA-regulated facilities nationwide. ECHO includes permit, inspection, violation, enforcement action, and penalty information about facilities regulated under the Clean Air Act (CAA) Stationary Source Program, Clean Water Act (CWA) National Pollutant Discharge Elimination System (NPDES), and/or Resource Conservation and Recovery Act (RCRA). Information also is provided on surrounding demographics when available.

  14. Accurate colon residue detection algorithm with partial volume segmentation

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Liang, Zhengrong; Zhang, PengPeng; Kutcher, Gerald J.

    2004-05-01

    Colon cancer is the second leading cause of cancer-related death in the United States. Earlier detection and removal of polyps can dramatically reduce the chance of developing a malignant tumor. Due to some limitations of optical colonoscopy used in the clinic, many researchers have developed virtual colonoscopy as an alternative technique, in which accurate colon segmentation is crucial. However, the partial volume effect and the existence of residue make this very challenging. The electronic colon cleaning technique proposed by Chen et al. is a very attractive method, which is also a kind of hard segmentation method. As mentioned in their paper, some artifacts were produced, which might affect accurate colon reconstruction. In our paper, instead of labeling each voxel with a unique label or tissue type, the percentage of different tissues within each voxel, which we call a mixture, was considered in establishing a maximum a posteriori probability (MAP) image-segmentation framework. A Markov random field (MRF) model was developed to reflect the spatial information for the tissue mixtures. The spatial information based on hard segmentation was used to determine which tissue types are present in a specific voxel. Parameters of each tissue class were estimated by the expectation-maximization (EM) algorithm during the MAP tissue-mixture segmentation. Real CT experimental results demonstrated that the partial volume effects between four tissue types have been precisely detected. Meanwhile, the residue has been electronically removed and a very smooth and clean interface along the colon wall has been obtained.

  15. Accurate taxonomic assignment of short pyrosequencing reads.

    PubMed

    Clemente, José C; Jansson, Jesper; Valiente, Gabriel

    2010-01-01

    Ambiguities in the taxonomy-dependent assignment of pyrosequencing reads are usually resolved by mapping each read to the lowest common ancestor, in a reference taxonomy, of all those sequences that match the read. This conservative approach has the drawback of mapping a read to a possibly large clade that may also contain many sequences not matching the read. A more accurate taxonomic assignment of short reads can be made by mapping each read to the node in the reference taxonomy that provides the best precision and recall. We show that, given a suffix array for the sequences in the reference taxonomy, a short read can be mapped to the node of the reference taxonomy with the best combined value of precision and recall in time linear in the size of the taxonomy subtree rooted at the lowest common ancestor of the matching sequences. An accurate taxonomic assignment of short reads can thus be made with about the same efficiency as when mapping each read to the lowest common ancestor of all matching sequences in a reference taxonomy. We demonstrate the effectiveness of our approach on several metagenomic datasets of marine and gut microbiota.
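    The precision/recall objective can be sketched on a hypothetical mini-taxonomy: for a node v with leaf set L(v) and a read matching the sequence set M, precision is |L(v) ∩ M| / |L(v)| and recall is |L(v) ∩ M| / |M|. The brute-force search below illustrates only the objective, not the paper's linear-time suffix-array method; all node and sequence names are invented:

    ```python
    # Hypothetical mini-taxonomy: each node maps to the set of reference
    # sequences (leaves) beneath it.
    taxonomy = {
        "root":   {"s1", "s2", "s3", "s4", "s5", "s6"},
        "cladeA": {"s1", "s2", "s3"},
        "genusX": {"s1", "s2"},
        "s1": {"s1"}, "s2": {"s2"}, "s3": {"s3"},
        "s4": {"s4"}, "s5": {"s5"}, "s6": {"s6"},
    }

    def best_node(matches, taxonomy):
        """Return the node maximizing the F-measure (harmonic mean of
        precision and recall) with respect to the matching sequences."""
        best, best_f = None, -1.0
        for node, leaves in taxonomy.items():
            tp = len(leaves & matches)
            if tp == 0:
                continue
            prec, rec = tp / len(leaves), tp / len(matches)
            f = 2 * prec * rec / (prec + rec)
            if f > best_f:
                best_f, best = f, node
        return best
    ```

    For a read matching {"s1", "s2", "s4"} the lowest common ancestor is "root" (recall 1 but precision only 1/2), whereas the F-measure is maximized at "genusX" (precision 1, recall 2/3), illustrating why the best-precision/recall node can be a much tighter assignment than the LCA.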

  16. Accurate shear measurement with faint sources

    SciTech Connect

    Zhang, Jun; Foucaud, Sebastien; Luo, Wentao E-mail: walt@shao.ac.cn

    2015-01-01

    For cosmic shear to become an accurate cosmological probe, systematic errors in the shear measurement method must be unambiguously identified and corrected for. Previous work of this series has demonstrated that cosmic shears can be measured accurately in Fourier space in the presence of background noise and finite pixel size, without assumptions on the morphologies of galaxy and PSF. The remaining major source of error is source Poisson noise, due to the finiteness of source photon number. This problem is particularly important for faint galaxies in space-based weak lensing measurements, and for ground-based images of short exposure times. In this work, we propose a simple and rigorous way of removing the shear bias from the source Poisson noise. Our noise treatment can be generalized for images made of multiple exposures through MultiDrizzle. This is demonstrated with the SDSS and COSMOS/ACS data. With a large ensemble of mock galaxy images of unrestricted morphologies, we show that our shear measurement method can achieve sub-percent level accuracy even for images of signal-to-noise ratio less than 5 in general, making it the most promising technique for cosmic shear measurement in the ongoing and upcoming large scale galaxy surveys.

  17. Accurate pose estimation for forensic identification

    NASA Astrophysics Data System (ADS)

    Merckx, Gert; Hermans, Jeroen; Vandermeulen, Dirk

    2010-04-01

    In forensic authentication, one aims to identify the perpetrator among a series of suspects or distractors. A fundamental problem in any recognition system that aims for identification of subjects in a natural scene is the lack of constraints on viewing and imaging conditions. In forensic applications, identification proves even more challenging, since most surveillance footage is of abysmal quality. In this context, robust methods for pose estimation are paramount. In this paper we therefore present a new pose estimation strategy for very low quality footage. Our approach uses 3D-2D registration of a textured 3D face model with the surveillance image to obtain accurate far-field pose alignment. Starting from an inaccurate initial estimate, the technique uses novel similarity measures based on the monogenic signal to guide a pose optimization process. We illustrate the descriptive strength of the introduced similarity measures by using them directly as a recognition metric. Through validation, using both real and synthetic surveillance footage, our pose estimation method is shown to be accurate, and robust to lighting changes and image degradation.

  18. Sparse and accurate high resolution SAR imaging

    NASA Astrophysics Data System (ADS)

    Vu, Duc; Zhao, Kexin; Rowe, William; Li, Jian

    2012-05-01

    We investigate the usage of an adaptive method, the Iterative Adaptive Approach (IAA), in combination with a maximum a posteriori (MAP) estimate to reconstruct high resolution SAR images that are both sparse and accurate. IAA is a nonparametric weighted least squares algorithm that is robust and user parameter-free. IAA has been shown to reconstruct SAR images with excellent side lobe suppression and high resolution enhancement. We first reconstruct the SAR images using IAA, and then we enforce sparsity by using MAP with a sparsity-inducing prior. By coupling these two methods, we can produce sparse and accurate high resolution images that are conducive to feature extraction and target classification applications. In addition, we show how IAA can be made computationally efficient without sacrificing accuracy, a desirable property for SAR applications where the size of the problems is quite large. We demonstrate the success of our approach using the Air Force Research Lab's "Gotcha Volumetric SAR Data Set Version 1.0" challenge dataset. Via the widely used FFT, individual vehicles contained in the scene are barely recognizable due to the poor resolution and high side lobe nature of the FFT. However, with our approach, clear edges, boundaries, and textures of the vehicles are obtained.

  19. Accurate basis set truncation for wavefunction embedding

    NASA Astrophysics Data System (ADS)

    Barnes, Taylor A.; Goodpaster, Jason D.; Manby, Frederick R.; Miller, Thomas F.

    2013-07-01

    Density functional theory (DFT) provides a formally exact framework for performing embedded subsystem electronic structure calculations, including DFT-in-DFT and wavefunction theory-in-DFT descriptions. In the interest of efficiency, it is desirable to truncate the atomic orbital basis set in which the subsystem calculation is performed, thus avoiding high-order scaling with respect to the size of the MO virtual space. In this study, we extend a recently introduced projection-based embedding method [F. R. Manby, M. Stella, J. D. Goodpaster, and T. F. Miller III, J. Chem. Theory Comput. 8, 2564 (2012); doi:10.1021/ct300544e] to allow for the systematic and accurate truncation of the embedded subsystem basis set. The approach is applied to both covalently and non-covalently bound test cases, including water clusters and polypeptide chains, and it is demonstrated that errors associated with basis set truncation are controllable to well within chemical accuracy. Furthermore, we show that this approach allows for switching between accurate projection-based embedding and DFT embedding with approximate kinetic energy (KE) functionals; in this sense, the approach provides a means of systematically improving upon the use of approximate KE functionals in DFT embedding.

  20. Derivative particles for simulating detailed movements of fluids.

    PubMed

    Song, Oh-young; Kim, Doyub; Ko, Hyeong-Seok

    2007-01-01

    We present a new fluid simulation technique that significantly reduces the nonphysical dissipation of velocity. The proposed method is based on an apt use of particles and derivative information. We note that a major source of numerical dissipation in the conventional Navier-Stokes equations solver lies in the advection step. Hence, starting with the conventional grid-based simulator, when the details of fluid movements need to be simulated, we replace the advection part with a particle simulator. When swapping between the grid-based and particle-based simulators, the physical quantities such as the level set and velocity must be converted. For this purpose, we develop a novel dissipation-suppressing conversion procedure that utilizes the derivative information stored in the particles, as well as in the grid points. For the fluid regions where such details are not needed, the advection is simulated using an octree-based constrained interpolation profile (CIP) solver, which we develop in this work. Through several experiments, we show that the proposed technique can reproduce the detailed movements of high-Reynolds-number fluids such as droplets/bubbles, thin water sheets, and whirlpools. The increased accuracy in the advection, which forms the basis of the proposed technique, can also be used to produce better results in larger scale fluid simulations.

  1. Detail enhancement of blurred infrared images based on frequency extrapolation

    NASA Astrophysics Data System (ADS)

    Xu, Fuyuan; Zeng, Deguo; Zhang, Jun; Zheng, Ziyang; Wei, Fei; Wang, Tiedan

    2016-05-01

    A novel algorithm for enhancing the details of blurred infrared images based on frequency extrapolation is presented in this paper. Unlike other work, this algorithm focuses on predicting higher-frequency information from the Laplacian pyramid decomposition of the blurred image. It uses the first level of the high-frequency component of the pyramid to generate a higher, previously non-existing frequency component, which is added back to the histogram-equalized input image. A simple nonlinear operator is used to analyze the extracted first-level high-frequency component of the pyramid. Two critical parameters participate in the calculation: the clipping parameter C and the scaling parameter S. A detailed analysis of how these two parameters act during the procedure is demonstrated with figures in this paper. The blurred image becomes clearer and its detail is enhanced by the added higher-frequency information. The algorithm is computationally simple and performs well, so it can be deployed in real-time industrial applications. Extensive experiments and illustrations of the algorithm's performance are presented to demonstrate its effectiveness.
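    A one-dimensional sketch of the clip-and-scale idea, with the clipping parameter C and scaling parameter S as named above, can illustrate the mechanism; the paper's actual operator works on a 2-D Laplacian pyramid level, so this simplification (a 3-tap blur standing in for the pyramid separation) is an assumption for illustration only:

    ```python
    def enhance_detail(signal, C=0.05, S=2.0):
        """Extract the finest band (signal minus a 3-tap blur), clip it
        at +/-C, scale it by S, and add it back as a prediction of
        higher-frequency detail."""
        n = len(signal)
        blur = [(signal[max(k - 1, 0)] + 2 * signal[k] + signal[min(k + 1, n - 1)]) / 4.0
                for k in range(n)]
        high = [s - b for s, b in zip(signal, blur)]           # finest band
        boost = [S * max(-C, min(C, h)) for h in high]         # clip then scale
        return [s + b for s, b in zip(signal, boost)]

    # A soft edge gains overshoot and undershoot, i.e. sharper apparent detail:
    edge = [0.0, 0.0, 1.0, 1.0]
    sharpened = enhance_detail(edge)
    ```

    The clipping parameter C limits how strongly noise spikes are amplified, while S controls how much synthetic high-frequency energy is injected around genuine edges.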

  2. Characteristics of physicians targeted by the pharmaceutical industry to participate in e-detailing.

    PubMed

    Alkhateeb, Fadi M; Khanfar, Nile M; Doucette, William R; Loudon, David

    2009-01-01

    Electronic detailing (e-detailing) has been introduced in the last few years by the pharmaceutical industry as a new communication channel through which to promote pharmaceutical products to physicians. E-detailing involves the use of digital technology, such as the Internet, video conferencing, and interactive voice response, by which drug companies target their marketing efforts toward specific physicians with pinpoint accuracy. A mail survey of 671 Iowa physicians was used to gather information about the physician and practice-setting characteristics of those who are usually targeted by pharmaceutical companies to participate in e-detailing. A model is developed and tested to explain firms' strategies for targeting physicians for e-detailing.

  3. On detailed 3D reconstruction of large indoor environments

    NASA Astrophysics Data System (ADS)

    Bondarev, Egor

    2015-03-01

    In this paper we present techniques for highly detailed 3D reconstruction of extra-large indoor environments. We discuss the benefits and drawbacks of low-range, far-range, and hybrid sensing and reconstruction approaches. The proposed techniques for low-range and hybrid reconstruction, enabling a reconstruction density of 125 points/cm3 on large 100,000 m3 models, are presented in detail. The techniques tackle the core challenges posed by these requirements, such as multi-modal data fusion (fusion of LIDAR data with Kinect data), accurate sensor pose estimation, high-density scanning, and depth-data noise filtering. Other important aspects of extra-large 3D indoor reconstruction are point cloud decimation and real-time rendering. In this paper, we present a method for planar-based point cloud decimation, allowing the size of a point cloud to be reduced by 80-95%. Besides this, we introduce a method for online rendering of extra-large point clouds, enabling real-time visualization of huge cloud spaces in conventional web browsers.

  4. Detailed Jet Dynamics in a Collapsing Bubble

    NASA Astrophysics Data System (ADS)

    Supponen, Outi; Obreschkow, Danail; Kobel, Philippe; Farhat, Mohamed

    2015-12-01

    We present detailed visualizations of the micro-jet forming inside an aspherically collapsing cavitation bubble near a free surface. The high-quality visualizations of large and strongly deformed bubbles disclose so far unseen features of the dynamics inside the bubble, such as a mushroom-like flattened jet-tip, crown formation and micro-droplets. We also find that jetting near a free surface reduces the collapse time relative to the Rayleigh time.

  5. How accurate is the Pearson r-from-Z approximation? A Monte Carlo simulation study.

    PubMed

    Hittner, James B; May, Kim

    2012-01-01

    The Pearson r-from-Z approximation estimates the sample correlation (as an effect size measure) from the ratio of two quantities: the standard normal deviate equivalent (Z-score) corresponding to a one-tailed p-value divided by the square root of the total (pooled) sample size. The formula has utility in meta-analytic work when reports of research contain minimal statistical information. Although simple to implement, the accuracy of the Pearson r-from-Z approximation has not been empirically evaluated. To address this omission, we performed a series of Monte Carlo simulations. Results indicated that in some cases the formula did accurately estimate the sample correlation. However, when sample size was very small (N = 10) and effect sizes were small to small-moderate (ds of 0.1 and 0.3), the Pearson r-from-Z approximation was very inaccurate. Detailed figures that provide guidance as to when the Pearson r-from-Z formula will likely yield valid inferences are presented.
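
    The formula described above is simple enough to state in a few lines. A minimal sketch in Python (the language choice and example values are ours, not the paper's):

```python
from math import sqrt
from statistics import NormalDist

def r_from_z(p_one_tailed, n_total):
    """Pearson r-from-Z approximation: the standard normal deviate (Z)
    corresponding to a one-tailed p-value, divided by the square root
    of the total (pooled) sample size."""
    z = NormalDist().inv_cdf(1.0 - p_one_tailed)  # Z-score for the p-value
    return z / sqrt(n_total)

# A one-tailed p of .05 with a pooled N of 40 gives r of roughly 0.26
r = r_from_z(0.05, 40)
```

    In meta-analytic use, this lets an effect size be recovered from a report that states only a p-value and a sample size.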

  6. Flexible, Fast and Accurate Sequence Alignment Profiling on GPGPU with PaSWAS

    PubMed Central

    Warris, Sven; Yalcin, Feyruz; Jackson, Katherine J. L.; Nap, Jan Peter

    2015-01-01

    Motivation: Obtaining large-scale sequence alignments in a fast and flexible way is an important step in the analyses of next generation sequencing data. Applications based on the Smith-Waterman (SW) algorithm are often either not fast enough, limited to dedicated tasks or not sufficiently accurate due to statistical issues. Current SW implementations that run on graphics hardware do not report the alignment details necessary for further analysis. Results: With the Parallel SW Alignment Software (PaSWAS) it is possible (a) to have easy access to the computational power of NVIDIA-based general purpose graphics processing units (GPGPUs) to perform high-speed sequence alignments, and (b) to retrieve relevant information such as score, number of gaps and mismatches. The software reports multiple hits per alignment. The added value of the new SW implementation is demonstrated with two test cases: (1) tag recovery in next generation sequence data and (2) isotype assignment within an immunoglobulin 454 sequence data set. Both cases show the usability and versatility of the new parallel Smith-Waterman implementation. PMID:25830241
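
    For readers unfamiliar with the underlying recurrence, a minimal single-threaded Smith-Waterman scoring pass can be sketched as follows; the scoring parameters are illustrative, and PaSWAS itself evaluates this recurrence in parallel on the GPU with full traceback:

```python
def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-1):
    """Fill the Smith-Waterman matrix H and return the best local
    alignment score (traceback for gaps/mismatches omitted for brevity)."""
    H = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    best = 0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            # local alignment: scores never drop below zero
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            best = max(best, H[i][j])
    return best
```

    The zero floor in the recurrence is what makes the alignment local, and it is also what makes the row-by-row computation amenable to the massively parallel evaluation the paper exploits.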

  7. Structural concepts and details for seismic design

    SciTech Connect

    Not Available

    1991-09-01

    This manual discusses building and building component behavior during earthquakes, and provides suggested details for seismic resistance which experience has shown to provide adequate performance during earthquakes. Special design and construction practices are also described which, although they might be common in some high-seismic regions, may not be common in low and moderate seismic-hazard regions of the United States. Special attention is given to describing the level of detailing appropriate for each seismic region. The UBC seismic criteria for all seismic zones are carefully examined, and many examples of connection details are given. The general scope of discussion is limited to materials and construction types common to Department of Energy (DOE) sites. Although the manual is primarily written for professional engineers engaged in performing seismic-resistant design for DOE facilities, the first two chapters, plus the introductory sections of succeeding chapters, contain descriptions which are also directed toward project engineers who authorize, review, or supervise the design and construction of DOE facilities. 88 refs., 188 figs.

  8. Using ecological zones to increase the detail of Landsat classifications

    NASA Technical Reports Server (NTRS)

    Fox, L., III; Mayer, K. E.

    1981-01-01

    Changes in classification detail of forest species descriptions were made for Landsat data on 2.2 million acres in northwestern California. Because basic forest canopy structures may exhibit very similar E-M energy reflectance patterns in different environmental regions, classification labels based on Landsat spectral signatures alone become very generalized when mapping large heterogeneous ecological regions. By adding a seven ecological zone stratification, a 167% improvement in classification detail was made over the results achieved without it. The seven zone stratification is a less costly alternative to the inclusion of complex collateral information, such as terrain data and soil type, into the Landsat data base when making inventories of areas greater than 500,000 acres.

  9. Exploring Architectural Details Through a Wearable Egocentric Vision Device

    PubMed Central

    Alletto, Stefano; Abati, Davide; Serra, Giuseppe; Cucchiara, Rita

    2016-01-01

    Augmented user experiences in the cultural heritage domain are in increasing demand by the new digital-native tourists of the 21st century. In this paper, we propose a novel solution that aims at assisting the visitor during an outdoor tour of a cultural site using the unique first-person perspective of wearable cameras. In particular, the approach exploits computer vision techniques to retrieve architectural details by proposing a robust descriptor based on the covariance of local features. Using a lightweight wearable board, the solution can localize the user with respect to the 3D point cloud of the historical landmark and provide information about the details at which the user is currently looking. Experimental results validate the method both in terms of accuracy and computational effort. Furthermore, user evaluation based on real-world experiments shows that the proposal is deemed effective in enriching a cultural experience. PMID:26901197

  10. Exploring Architectural Details Through a Wearable Egocentric Vision Device.

    PubMed

    Alletto, Stefano; Abati, Davide; Serra, Giuseppe; Cucchiara, Rita

    2016-02-17

    Augmented user experiences in the cultural heritage domain are in increasing demand by the new digital-native tourists of the 21st century. In this paper, we propose a novel solution that aims at assisting the visitor during an outdoor tour of a cultural site using the unique first-person perspective of wearable cameras. In particular, the approach exploits computer vision techniques to retrieve architectural details by proposing a robust descriptor based on the covariance of local features. Using a lightweight wearable board, the solution can localize the user with respect to the 3D point cloud of the historical landmark and provide information about the details at which the user is currently looking. Experimental results validate the method both in terms of accuracy and computational effort. Furthermore, user evaluation based on real-world experiments shows that the proposal is deemed effective in enriching a cultural experience.

  11. Downscaling NASA Climatological Data to Produce Detailed Climate Zone Maps

    NASA Technical Reports Server (NTRS)

    Chandler, William S.; Hoell, James M.; Westberg, David J.; Whitlock, Charles H.; Zhang, Taiping; Stackhouse, P. W.

    2011-01-01

    The design of energy efficient sustainable buildings is heavily dependent on accurate long-term and near real-time local weather data. To varying degrees the current meteorological networks over the globe have been used to provide these data, albeit often from sites far removed from the desired location. The national need is for access to weather and solar resource data accurate enough to develop preliminary building designs within a short proposal time limit, usually within 60 days. The NASA Prediction Of Worldwide Energy Resource (POWER) project was established by NASA to provide industry-friendly access to globally distributed solar and meteorological data. As a result, the POWER web site (power.larc.nasa.gov) now provides global information on many renewable energy parameters and several buildings-related items, but at a relatively coarse resolution. This paper describes a method of downscaling NASA atmospheric assimilation model results to higher resolution and maps those parameters to produce building climate zone maps using estimates of temperature and precipitation. The distribution of climate zones for North America, with an emphasis on the Pacific Northwest, for just one year shows very good correspondence to the currently defined distribution. The method has the potential to provide a consistent procedure for deriving climate zone information on a global basis that can be assessed for variability and updated more regularly.

  12. Revisiting the Seductive Details Effect in Multimedia Learning: Context-Dependency of Seductive Details

    ERIC Educational Resources Information Center

    Ozdemir, Devrim; Doolittle, Peter

    2015-01-01

    The purpose of this study was to investigate the effects of context-dependency of seductive details on recall and transfer in multimedia learning environments. Seductive details were interesting yet irrelevant sentences in the instructional text. Two experiments were conducted. The purpose of Experiment 1 was to identify context-dependent and…

  13. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, D.D.

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube, and a chamber for containing a charge of high pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high pressure gas to purge the sight tube of air-borne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  14. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, Douglas D.

    1985-01-01

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube, and a chamber for containing a charge of high pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high pressure gas to purge the sight tube of air-borne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  15. LSM: perceptually accurate line segment merging

    NASA Astrophysics Data System (ADS)

    Hamid, Naila; Khan, Nazar

    2016-11-01

    Existing line segment detectors tend to break up perceptually distinct line segments into multiple segments. We propose an algorithm for merging such broken segments to recover the original perceptually accurate line segments. The algorithm proceeds by grouping line segments on the basis of angular and spatial proximity. Then those line segment pairs within each group that satisfy unique, adaptive mergeability criteria are successively merged to form a single line segment. This process is repeated until no more line segments can be merged. We also propose a method for quantitative comparison of line segment detection algorithms. Results on the York Urban dataset show that our merged line segments are closer to human-marked ground-truth line segments compared to state-of-the-art line segment detection algorithms.
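
    The grouping step rests on angular and spatial proximity. A hypothetical mergeability test in that spirit might look like the following; the fixed thresholds and the segment representation are our assumptions, not the authors' adaptive criteria:

```python
import math

def mergeable(s1, s2, angle_tol_deg=5.0, gap_tol=10.0):
    """Candidate check: two segments are near-parallel and spatially close.
    Each segment is ((x1, y1), (x2, y2)); thresholds are illustrative."""
    def orientation(seg):
        (x1, y1), (x2, y2) = seg
        return math.atan2(y2 - y1, x2 - x1) % math.pi  # undirected angle

    d = abs(orientation(s1) - orientation(s2))
    d = min(d, math.pi - d)  # angular distance between orientations
    gap = min(math.dist(p, q) for p in s1 for q in s2)  # closest endpoint pair
    return math.degrees(d) <= angle_tol_deg and gap <= gap_tol
```

    In the paper's algorithm, pairs passing such a check within a group are merged into a single segment and the process repeats until no merge remains possible.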

  16. Highly accurate articulated coordinate measuring machine

    DOEpatents

    Bieg, Lothar F.; Jokiel, Jr., Bernhard; Ensz, Mark T.; Watson, Robert D.

    2003-12-30

    Disclosed is a highly accurate articulated coordinate measuring machine, comprising a revolute joint, comprising a circular encoder wheel, having an axis of rotation; a plurality of marks disposed around at least a portion of the circumference of the encoder wheel; bearing means for supporting the encoder wheel, while permitting free rotation of the encoder wheel about the wheel's axis of rotation; and a sensor, rigidly attached to the bearing means, for detecting the motion of at least some of the marks as the encoder wheel rotates; a probe arm, having a proximal end rigidly attached to the encoder wheel, and having a distal end with a probe tip attached thereto; and coordinate processing means, operatively connected to the sensor, for converting the output of the sensor into a set of cylindrical coordinates representing the position of the probe tip relative to a reference cylindrical coordinate system.

  17. Practical aspects of spatially high accurate methods

    NASA Technical Reports Server (NTRS)

    Godfrey, Andrew G.; Mitchell, Curtis R.; Walters, Robert W.

    1992-01-01

    The computational qualities of high order spatially accurate methods for the finite volume solution of the Euler equations are presented. Two dimensional essentially non-oscillatory (ENO), k-exact, and 'dimension by dimension' ENO reconstruction operators are discussed and compared in terms of reconstruction and solution accuracy, computational cost and oscillatory behavior in supersonic flows with shocks. Inherent steady state convergence difficulties are demonstrated for adaptive stencil algorithms. An exact solution to the heat equation is used to determine reconstruction error, and the computational intensity is reflected in operation counts. Standard MUSCL differencing is included for comparison. Numerical experiments presented include the Ringleb flow for numerical accuracy and a shock reflection problem. A vortex-shock interaction demonstrates the ability of the ENO scheme to excel in simulating unsteady high-frequency flow physics.

  18. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  19. Obtaining accurate translations from expressed sequence tags.

    PubMed

    Wasmuth, James; Blaxter, Mark

    2009-01-01

    The genomes of an increasing number of species are being investigated through the generation of expressed sequence tags (ESTs). However, ESTs are prone to sequencing errors and typically define incomplete transcripts, making downstream annotation difficult. Annotation would be greatly improved with robust polypeptide translations. Many current solutions for EST translation require a large number of full-length gene sequences for training purposes, a resource that is not available for the majority of EST projects. As part of our ongoing EST programs investigating these "neglected" genomes, we have developed a polypeptide prediction pipeline, prot4EST. It incorporates freely available software to produce final translations that are more accurate than those derived from any single method. We describe how this integrated approach goes a long way to overcoming the deficit in training data.

  20. Accurate radio positions with the Tidbinbilla interferometer

    NASA Technical Reports Server (NTRS)

    Batty, M. J.; Gulkis, S.; Jauncey, D. L.; Rayner, P. T.

    1979-01-01

    The Tidbinbilla interferometer (Batty et al., 1977) is designed specifically to provide accurate radio position measurements of compact radio sources in the Southern Hemisphere with high sensitivity. The interferometer uses the 26-m and 64-m antennas of the Deep Space Network at Tidbinbilla, near Canberra. The two antennas are separated by 200 m on a north-south baseline. By utilizing the existing antennas and the low-noise traveling-wave masers at 2.29 GHz, it has been possible to produce a high-sensitivity instrument with a minimum of capital expenditure. The north-south baseline ensures that a good range of UV coverage is obtained, so that sources lying in the declination range between about -80 and +30 deg may be observed with nearly orthogonal projected baselines of no less than about 1000 lambda. The instrument also provides high-accuracy flux density measurements for compact radio sources.

  1. Magnetic ranging tool accurately guides replacement well

    SciTech Connect

    Lane, J.B.; Wesson, J.P.

    1992-12-21

    This paper reports on magnetic ranging surveys and directional drilling technology which accurately guided a replacement well bore to intersect a leaking gas storage well with casing damage. The second well bore was then used to pump cement into the original leaking casing shoe. The repair well bore kicked off from the surface hole, bypassed casing damage in the middle of the well, and intersected the damaged well near the casing shoe. The repair well was subsequently completed in the gas storage zone near the original well bore, salvaging the valuable bottom hole location in the reservoir. This method would prevent the loss of storage gas, and it would prevent a potential underground blowout that could permanently damage the integrity of the storage field.

  2. The Nigerian national blindness and visual impairment survey: Rationale, objectives and detailed methodology

    PubMed Central

    Dineen, Brendan; Gilbert, Clare E; Rabiu, Mansur; Kyari, Fatima; Mahdi, Abdull M; Abubakar, Tafida; Ezelum, Christian C; Gabriel, Entekume; Elhassan, Elizabeth; Abiose, Adenike; Faal, Hannah; Jiya, Jonathan Y; Ozemela, Chinenyem P; Lee, Pak Sang; Gudlavalleti, Murthy VS

    2008-01-01

    Background: Despite having the largest population in Africa, Nigeria has no accurate population based data to plan and evaluate eye care services. A national survey was undertaken to estimate the prevalence and determine the major causes of blindness and low vision. This paper presents the detailed methodology used during the survey. Methods: A nationally representative sample of persons aged 40 years and above was selected. Children aged 10–15 years and individuals aged <10 or 16–39 years with visual impairment were also included if they lived in households with an eligible adult. All participants had their height, weight, and blood pressure measured, followed by assessment of presenting visual acuity, refractokeratometry, A-scan ultrasonography, visual fields and best corrected visual acuity. Anterior and posterior segments of each eye were examined with a torch and direct ophthalmoscope. Participants with visual acuity of ≤6/12 in one or both eyes underwent detailed examination including applanation tonometry, dilated slit lamp biomicroscopy, lens grading and fundus photography. All those who had undergone cataract surgery were refracted and best corrected vision recorded. Causes of visual impairment by eye and for the individual were determined using a clinical algorithm recommended by the World Health Organization. In addition, 1 in 7 adults also underwent a complete work-up as described for those with vision ≤6/12, for constructing a normative data base for Nigerians. Discussion: The field work for the study was completed in 30 months over the period 2005–2007 and covered 305 clusters across the entire country. Concurrently, persons 40+ years were examined to form a normative data base. Analysis of the data is currently underway. Conclusion: The methodology used was robust and adequate to provide estimates on the prevalence and causes of blindness in Nigeria. The survey would also provide information on barriers to accessing services, quality of life of

  3. Radio Astronomers Set New Standard for Accurate Cosmic Distance Measurement

    NASA Astrophysics Data System (ADS)

    1999-06-01

    the distance to NGC 4258 as either 27 or 29 million light-years, depending on assumptions about the characteristics of this type of star in that galaxy. Other Cepheid-based galaxy distances were used to calculate the expansion rate of the universe, called the Hubble Constant, announced by a team of HST observers last week. "This difference could mean that there may be more uncertainty in Cepheid-determined distances than people have realized," said Moran. "Providing this directly-determined distance to one galaxy -- a distance that can serve as a milestone -- should be helpful in determining distances to other galaxies, and thus the Hubble Constant and the size and age of the universe." The VLBA is a system of ten radio-telescope antennas, each 25 meters (82 feet) in diameter, stretching some 5,000 miles from Mauna Kea in Hawaii to St. Croix in the U.S. Virgin Islands. Operated from NRAO's Array Operations Center in Socorro, NM, the VLBA offers astronomers the greatest resolving power of any telescope anywhere. The NRAO is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc. Background information: Determining Cosmic Distances. Determining cosmic distances obviously is vital to understanding the size of the universe. In turn, knowing the size of the universe is an important step in determining its age. "The size puts a limit on how much expansion could have occurred since the Big Bang, and thus tells us something about the age," said Moran. However, determining cosmic distances has proven to be a particularly thorny problem for astronomers. In the third century B.C., the Greek astronomer Aristarchus devised a method of using trigonometry to determine the relative distances of the Moon and Sun, but in practice his method was difficult to use. Though a great first step, he missed the mark by a factor of 20.
It wasn't until 1761 that trigonometric methods produced a relatively accurate distance to Venus, thus

  4. Information services and information processing

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Attempts made to design and extend space system capabilities are reported. Special attention was given to establishing user needs for information or services which might be provided by space systems. Data given do not attempt to detail scientific, technical, or economic bases for the needs expressed by the users.

  5. Influence of pansharpening techniques in obtaining accurate vegetation thematic maps

    NASA Astrophysics Data System (ADS)

    Ibarrola-Ulzurrun, Edurne; Gonzalo-Martin, Consuelo; Marcello-Ruiz, Javier

    2016-10-01

    In recent decades there has been a decline in natural resources, making it important to develop reliable methodologies for their management. The appearance of very high resolution sensors has offered a practical and cost-effective means for good environmental management. In this context, improvements are needed to obtain higher quality from the information available in order to get reliable classified images. Thus, pansharpening enhances the spatial resolution of the multispectral bands by incorporating information from the panchromatic image. The main goal of the study is to implement pixel-based and object-based classification techniques applied to imagery fused with different pansharpening algorithms, and to evaluate the thematic maps generated, which serve to obtain accurate information for the conservation of natural resources. A vulnerable, heterogeneous ecosystem from the Canary Islands (Spain) was chosen, Teide National Park, and Worldview-2 high resolution imagery was employed. The classes considered of interest were set by the National Park conservation managers. Seven pansharpening techniques (GS, FIHS, HCS, MTF-based, Wavelet 'à trous' and Weighted Wavelet 'à trous' through Fractal Dimension Maps) were chosen in order to improve the data quality with the goal of analyzing the vegetation classes. Next, different classification algorithms were applied at the pixel-based and object-based levels, and an accuracy assessment of the different thematic maps obtained was performed. The highest classification accuracy was obtained by applying the Support Vector Machine classifier at the object-based level to the Weighted Wavelet 'à trous' through Fractal Dimension Maps fused image. Finally, we highlight the difficulty of classification in the Teide ecosystem due to the heterogeneity and the small size of the species. It is thus important to obtain accurate thematic maps for further studies in the management and conservation of natural resources.
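
    As a concrete (if much simpler) illustration of what pansharpening does, the classic Brovey transform rescales each upsampled multispectral band by the ratio of the panchromatic band to the multispectral intensity; note this is a textbook method, not one of the seven algorithms evaluated in the paper:

```python
import numpy as np

def brovey_pansharpen(ms, pan, eps=1e-9):
    """ms: (bands, H, W) multispectral array already resampled to the
    panchromatic grid; pan: (H, W) panchromatic band. Each band is
    rescaled so band ratios are preserved while the overall intensity
    follows the pan image."""
    intensity = ms.mean(axis=0) + eps  # avoid division by zero
    return ms * (pan / intensity)
```

    Where the pan band agrees with the multispectral intensity, the output is unchanged; where the pan band carries finer spatial detail, that detail is injected into every band.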

  6. Super Resolution Reconstruction Based on Adaptive Detail Enhancement for ZY-3 Satellite Images

    NASA Astrophysics Data System (ADS)

    Zhu, Hong; Song, Weidong; Tan, Hai; Wang, Jingxue; Jia, Di

    2016-06-01

    Super-resolution reconstruction of sequential remote sensing images is a technology which handles multiple low-resolution satellite remote sensing images with complementary information and obtains one or more high resolution images. The cores of the technology are high precision matching between images and the extraction and fusion of fine detail information. This paper puts forward a new image super resolution model framework which can adaptively enhance the details of the reconstructed image at multiple scales. First, the sequence images were decomposed by a bilateral filter into a detail layer containing the detail information and a smooth layer containing the large scale edge information. Then, a texture detail enhancement function was constructed to promote the magnitude of the medium and small details. Next, the non-redundant information of the super reconstruction was obtained by differential processing of the detail layer, and the initial super resolution construction result was achieved by interpolating fusion of the non-redundant information and the smooth layer. At last, the final reconstruction image was acquired by executing a local optimization model on the initially constructed image. Experiments on ZY-3 satellite images of the same and different phases show that the proposed method improves both the information entropy and the image detail evaluation standard compared with the interpolation method, the traditional TV algorithm and the MAP algorithm, which indicates that our method can obviously highlight image details and contains more ground texture information. A large number of experimental results reveal that the proposed method is robust and universal for different kinds of ZY-3 satellite images.
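
    The layer-split idea in this abstract can be sketched compactly. In the sketch below a plain box filter stands in for the paper's bilateral filter, and the enhancement function is a single constant gain rather than the paper's adaptive multi-scale function; both are simplifying assumptions:

```python
import numpy as np

def enhance_details(image, gain=1.5, k=5):
    """Decompose an image into a smooth layer plus a detail layer,
    then boost the detail layer before recombining."""
    pad = k // 2
    padded = np.pad(image.astype(float), pad, mode="edge")
    h, w = image.shape
    smooth = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            smooth[i, j] = padded[i:i + k, j:j + k].mean()  # box filter
    detail = image - smooth          # detail layer
    return smooth + gain * detail    # recombine with amplified detail
```

    A flat region has an empty detail layer and passes through unchanged, while edges and texture are amplified by the gain.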

  7. A Case for Detailed Surface Analysis.

    NASA Astrophysics Data System (ADS)

    Sanders, Frederick; Doswell, Charles A., III

    1995-04-01

    Detailed analysis of the temperature and moisture fields based on routine hourly surface observations in North America can provide a rational basis for surface feature analysis, thus clarifying the present confusion. Recognition of surface features is an important part of weather forecasting and is especially needed in a careful diagnosis for the prospects of deep convection.Surface temperature gradients are advocated as the primary basis for identifying fronts; examples are given of gross discrepancies in current operational practice between the surface temperature fields and the associated frontal analyses. Surface potential temperature, selected as a means of compensating for elevation differences, is analyzed in the western United States for a period in which a strong, damaging cold front develops and dissipates over a period of less than 24 h. Frontogenesis-related calculations, based on detailed surface temperature analyses, help to explain a case of focusing of heavy precipitation in northern Kentucky that produced a flash flood.Conditions for the initiation of intense convection are illustrated by detailed analyses of the surface moisture and temperature fields. These are used to estimate the buoyancy of surface air lifted to midtroposphere and show the relationship of this buoyancy to ensuing convection. The analyses aid in recognition of the surface dryline (a feature commonly misanalyzed as a cold front) and those convectively produced pools of cold air at the surface that often play a major role in the subsequent redevelopment of convection.The proposed analyses might be difficult to achieve manually in operational practice during busy weather situations, but this could be facilitated by using objective methods with present and prospective workstations. Once surface features are identified, their temporal and spatial evolution must be followed carefully since they can change rapidly.
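
    Surface potential temperature, used above to compensate for elevation differences between stations, follows from Poisson's equation. A small helper (the example values are illustrative):

```python
def potential_temperature(temp_k, pressure_hpa, p0_hpa=1000.0):
    """theta = T * (p0 / p)**(R/cp), with R/cp ~= 0.286 for dry air.
    Reduces station temperatures to a common reference pressure so
    that stations at different elevations become comparable."""
    return temp_k * (p0_hpa / pressure_hpa) ** 0.286

# A 288 K reading at 850 hPa corresponds to roughly 301.7 K at 1000 hPa
theta = potential_temperature(288.0, 850.0)
```

    Gradients of this field, rather than of raw temperature, are what the frontal analyses above are based on in mountainous terrain.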

  8. Chord, Tie Bar & Crossbracing Joint Detail in Plan; Crossbracing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Chord, Tie Bar & Crossbracing Joint Detail in Plan; Crossbracing Center Joint Detail in Plan; Chord Joining Detail in Plan & Elevation; Chord, Panel Post, Tie Bar, & Diagonal Brace Joint Detail; Crossbracing Center Joint Detail in Section; Chord, Panel Post, Tie Bar & Horizontal Brace Joint Detail - Narrows Bridge, Spanning Sugar Creek at Old County Road 280 East, Marshall, Parke County, IN

  9. Higher order accurate partial implicitization: An unconditionally stable fourth-order-accurate explicit numerical technique

    NASA Technical Reports Server (NTRS)

    Graves, R. A., Jr.

    1975-01-01

    The previously obtained second-order-accurate partial implicitization numerical technique used in the solution of fluid dynamic problems was modified with little complication to achieve fourth-order accuracy. The von Neumann stability analysis demonstrated the unconditional linear stability of the technique. The order of the truncation error was deduced from the Taylor series expansions of the linearized difference equations and was verified by numerical solutions to Burgers' equation. For comparison, results were also obtained for Burgers' equation using a second-order-accurate partial-implicitization scheme, as well as the fourth-order scheme of Kreiss.
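
    The notion of fourth-order accuracy verified above can be illustrated with a standard five-point stencil; this is a generic central difference, not the partial-implicitization scheme itself. For an O(h⁴) method, halving the step size should shrink the error by roughly 2⁴ = 16:

```python
import math

def dfdx_4th(f, x, h):
    """Fourth-order central difference for f'(x):
    (-f(x+2h) + 8f(x+h) - 8f(x-h) + f(x-2h)) / (12h)."""
    return (-f(x + 2 * h) + 8 * f(x + h) - 8 * f(x - h) + f(x - 2 * h)) / (12 * h)

# Error at two step sizes; the ratio should be close to 16 for O(h^4)
e1 = abs(dfdx_4th(math.sin, 1.0, 0.10) - math.cos(1.0))
e2 = abs(dfdx_4th(math.sin, 1.0, 0.05) - math.cos(1.0))
ratio = e1 / e2
```

    Measuring this error-halving ratio numerically is the same style of verification the paper performs against exact solutions.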

  10. Academic detailing to teach aging and geriatrics.

    PubMed

    Duckett, Ashley; Cuoco, Theresa; Pride, Pamela; Wiley, Kathy; Iverson, Patty J; Marsden, Justin; Moran, William; Caton, Cathryn

    2015-01-01

    Geriatric education is a required component of internal medicine training. Work hour rules and hectic schedules have challenged residency training programs to develop and utilize innovative teaching methods. In this study, the authors examined the use of academic detailing as a teaching intervention in their residents' clinic and on the general medicine inpatient wards to improve clinical knowledge and skills in geriatric care. The authors found that this teaching method enables efficient, directed education without disrupting patient care. They were also able to show improvements in medical knowledge as well as self-efficacy across multiple geriatric topics.

  11. Implementation details of the coupled QMR algorithm

    NASA Technical Reports Server (NTRS)

    Freund, Roland W.; Nachtigal, Noel M.

    1992-01-01

    The original quasi-minimal residual method (QMR) relies on the three-term look-ahead Lanczos process, to generate basis vectors for the underlying Krylov subspaces. However, empirical observations indicate that, in finite precision arithmetic, three-term vector recurrences are less robust than mathematically equivalent coupled two-term recurrences. Therefore, we recently proposed a new implementation of the QMR method based on a coupled two-term look-ahead Lanczos procedure. In this paper, we describe implementation details of this coupled QMR algorithm, and we present results of numerical experiments.

  12. Instrumentation for detailed bridge-scour measurements

    USGS Publications Warehouse

    Landers, Mark N.; Mueller, David S.; Trent, Roy E.

    1993-01-01

    A portable instrumentation system is being developed to obtain channel bathymetry during floods for detailed bridge-scour measurements. Portable scour measuring systems have four components: sounding instrument, horizontal positioning instrument, deployment mechanisms, and data storage device. The sounding instrument will be a digital fathometer. Horizontal position will be measured using a range-azimuth based hydrographic survey system. The deployment mechanism designed for this system is a remote-controlled boat using a small waterplane area, twin-hull design. An on-board computer and radio will monitor the vessel instrumentation, record measured data, and telemeter data to shore.

  13. A detailed phylogeny for the Methanomicrobiales

    NASA Technical Reports Server (NTRS)

    Rouviere, P.; Mandelco, L.; Winker, S.; Woese, C. R.

    1992-01-01

    The small subunit rRNA sequence of twenty archaea, members of the Methanomicrobiales, permits a detailed phylogenetic tree to be inferred for the group. The tree confirms earlier studies, based on far fewer sequences, in showing the group to be divided into two major clusters, temporarily designated the "methanosarcina" group and the "methanogenium" group. The tree also defines phylogenetic relationships within these two groups, which in some cases do not agree with the phylogenetic relationships implied by current taxonomic names--a problem most acute for the genus Methanogenium and its relatives. The present phylogenetic characterization provides the basis for a consistent taxonomic restructuring of this major methanogenic taxon.

  14. Accurate spectral numerical schemes for kinetic equations with energy diffusion

    NASA Astrophysics Data System (ADS)

    Wilkening, Jon; Cerfon, Antoine J.; Landreman, Matt

    2015-08-01

    We examine the merits of using a family of polynomials that are orthogonal with respect to a non-classical weight function to discretize the speed variable in continuum kinetic calculations. We consider a model one-dimensional partial differential equation describing energy diffusion in velocity space due to Fokker-Planck collisions. This relatively simple case allows us to compare the results of the projected dynamics with an expensive but highly accurate spectral transform approach. It also allows us to integrate in time exactly, and to focus entirely on the effectiveness of the discretization of the speed variable. We show that for a fixed number of modes or grid points, the non-classical polynomials can be many orders of magnitude more accurate than classical Hermite polynomials or finite-difference solvers for kinetic equations in plasma physics. We provide a detailed analysis of the difference in behavior and accuracy of the two families of polynomials. For the non-classical polynomials, if the initial condition is not smooth at the origin when interpreted as a three-dimensional radial function, the exact solution leaves the polynomial subspace for a time, but returns (up to roundoff accuracy) to the same point evolved to by the projected dynamics in that time. By contrast, using classical polynomials, the exact solution differs significantly from the projected dynamics solution when it returns to the subspace. We also explore the connection between eigenfunctions of the projected evolution operator and (non-normalizable) eigenfunctions of the full evolution operator, as well as the effect of truncating the computational domain.
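
    As an illustration of constructing polynomials orthonormal under a non-classical speed-space weight, the sketch below runs Gram-Schmidt against w(v) = v^2 * exp(-v^2) on [0, inf). This Maxwellian-type weight is chosen here purely for illustration; the paper's actual weight function may differ:

```python
import math

def inner(f, g, a=0.0, b=12.0, n=4000):
    """Trapezoid quadrature of <f, g> = integral of f(v) g(v) w(v) dv with
    w(v) = v^2 * exp(-v^2); the tail beyond b = 12 is negligible."""
    h = (b - a) / n
    total = 0.0
    for i in range(n + 1):
        v = a + i * h
        wq = 0.5 if i in (0, n) else 1.0
        total += wq * f(v) * g(v) * v * v * math.exp(-v * v)
    return total * h

def gram_schmidt(functions):
    """Orthonormalize a list of functions under inner(); returns the basis."""
    basis = []
    for fn in functions:
        p = fn
        for q in basis:
            c = inner(p, q)
            p = (lambda v, p=p, q=q, c=c: p(v) - c * q(v))
        norm = math.sqrt(inner(p, p))
        basis.append(lambda v, p=p, norm=norm: p(v) / norm)
    return basis

# first three polynomials orthonormal under the non-classical weight
polys = gram_schmidt([lambda v: 1.0, lambda v: v, lambda v: v * v])
```

    In production spectral codes the three-term recurrence coefficients would be computed once (e.g. via the Stieltjes procedure) rather than by repeated Gram-Schmidt, but the orthogonality condition being enforced is the same.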

  15. Braking of fast and accurate elbow flexions in the monkey.

    PubMed Central

    Flament, D; Hore, J; Vilis, T

    1984-01-01

    The processes responsible for braking fast and accurate elbow movements were studied in the monkey. The movements studied were made over different amplitudes and against different inertias. All were made to the same end position. Only fast movements that showed the typical biphasic or triphasic pattern of activity in agonists and antagonists were analysed in detail. For movements made over different amplitudes and at different velocities there was symmetry between the acceleration and deceleration phases of the movements. For movements of the same amplitude performed at different velocities there was a direct linear relation between peak velocity and both the peak acceleration (and integrated agonist burst) and peak deceleration (and integrated antagonist burst). The slopes of these relations and their intercept with the peak velocity axis were a function of movement amplitude. This was such that for large and small movements of the same peak velocity and the same end position (i) peak acceleration and phasic agonist activity were larger for the small movements and (ii) peak deceleration and phasic antagonist activity were larger for the small movements. The slope of these relations and the symmetry between acceleration and deceleration were not affected by the addition of an inertial load to the handle held by the monkey. The results indicate that fast and accurate elbow movements in the monkey are braked by antagonist activity that is centrally programmed. As all movements were made to the same end position, the larger antagonist burst in small movements, made at the same peak velocity as large movements, cannot be due to differences in the viscoelastic contribution to braking (cf. Marsden, Obeso & Rothwell, 1983).(ABSTRACT TRUNCATED AT 250 WORDS) PMID:6737291

  16. Detailed Chemical Kinetic Modeling of Hydrazine Decomposition

    NASA Technical Reports Server (NTRS)

    Meagher, Nancy E.; Bates, Kami R.

    2000-01-01

    The purpose of this research project is to develop and validate a detailed chemical kinetic mechanism for gas-phase hydrazine decomposition. Hydrazine is used extensively in aerospace propulsion, and although liquid hydrazine is not considered detonable, many fuel handling systems create multiphase mixtures of fuels and fuel vapors during their operation. Therefore, a thorough knowledge of the decomposition chemistry of hydrazine under a variety of conditions can be of value in assessing potential operational hazards in hydrazine fuel systems. To gain such knowledge, a reasonable starting point is the development and validation of a detailed chemical kinetic mechanism for gas-phase hydrazine decomposition. A reasonably complete mechanism was published in 1996; however, many of the elementary steps included had outdated rate expressions, and a thorough investigation of the behavior of the mechanism under a variety of conditions was not presented. The current work has included substantial revision of the previously published mechanism, along with a more extensive examination of the decomposition behavior of hydrazine. An attempt to validate the mechanism against the limited experimental data available was moderately successful. Further computational and experimental research into the chemistry of this fuel needs to be completed.

  17. Detailed balance of the Feynman micromotor

    NASA Astrophysics Data System (ADS)

    Abbott, Derek; Davis, Bruce R.; Parrondo, Juan M. R.

    1999-09-01

    One exciting possibility for micromotors is that they can be powered by rectifying non-equilibrium thermal fluctuations or mechanical vibrations via the so-called Feynman micromotor. An example of mechanical rectification is found in the batteryless wristwatch. The original concept was described as early as 1912 by Smoluchowski and was later revisited in 1963 by Feynman, in the context of rectifying thermal fluctuations to obtain useful motion. It has been shown that, although rectification is impossible at equilibrium, it is possible for the Feynman micromotor to perform work under non-equilibrium conditions. These concepts can now be realized by MEMS technology and may have exciting implications in biomedicine, where the Feynman micromotor can be used to power a smart pill, for example. Feynman's analysis of the motor's efficiency was previously shown by Parrondo and Espanol to be flawed. We now show there are further problems in Feynman's treatment of detailed balance. In order to design and understand this device correctly, the equations of detailed balance must be found. Feynman's approach was to use probabilities based on energies, and we show that this is problematic. In this paper, we demonstrate corrected equations using level-crossing probabilities instead. A potential application of the Feynman micromotor is a batteryless nanopump that consists of a small MEMS chip that adheres to the skin of a patient and dispenses nanoliter quantities of medication. Either mechanical or thermal rectification via a Feynman micromotor, as the power source, is open for possible investigation.

  18. HUBBLE CAPTURES DETAILED IMAGE OF URANUS' ATMOSPHERE

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Hubble Space Telescope has peered deep into Uranus' atmosphere to see clear and hazy layers created by a mixture of gases. Using infrared filters, Hubble captured detailed features of three layers of Uranus' atmosphere. Hubble's images are different from the ones taken by the Voyager 2 spacecraft, which flew by Uranus 10 years ago. Those images - not taken in infrared light - showed a greenish-blue disk with very little detail. The infrared image allows astronomers to probe the structure of Uranus' atmosphere, which consists of mostly hydrogen with traces of methane. The red around the planet's edge represents a very thin haze at a high altitude. The haze is so thin that it can only be seen by looking at the edges of the disk, and is similar to looking at the edge of a soap bubble. The yellow near the bottom of Uranus is another hazy layer. The deepest layer, the blue near the top of Uranus, shows a clearer atmosphere. Image processing has been used to brighten the rings around Uranus so that astronomers can study their structure. In reality, the rings are as dark as black lava or charcoal. This false color picture was assembled from several exposures taken July 3, 1995 by the Wide Field Planetary Camera-2. CREDIT: Erich Karkoschka (University of Arizona Lunar and Planetary Lab) and NASA

  19. Thirty Meter Telescope Detailed Science Case: 2015

    NASA Astrophysics Data System (ADS)

    Skidmore, Warren; TMT International Science Development Teams; Science Advisory Committee, TMT

    2015-12-01

    The TMT Detailed Science Case describes the transformational science that the Thirty Meter Telescope will enable. Planned to begin science operations in 2024, TMT will open up opportunities for revolutionary discoveries in essentially every field of astronomy, astrophysics and cosmology, seeing much fainter objects much more clearly than existing telescopes. With this capability, TMT's science agenda spans all of space and time, from nearby comets and asteroids, to exoplanets, to the most distant galaxies, and all the way back to the very first sources of light in the universe. More than 150 astronomers from within the TMT partnership and beyond offered input in compiling the new 2015 Detailed Science Case. The contributing astronomers represent the entire TMT partnership, including the California Institute of Technology (Caltech), the Indian Institute of Astrophysics (IIA), the National Astronomical Observatories of the Chinese Academy of Sciences (NAOC), the National Astronomical Observatory of Japan (NAOJ), the University of California, the Association of Canadian Universities for Research in Astronomy (ACURA) and US associate partner, the Association of Universities for Research in Astronomy (AURA). Cover image: artist's rendition of the TMT International Observatory on Mauna Kea opening in the late evening before beginning operations.

  20. Effects of Trainer Expressiveness, Seductive Details, and Trainee Goal Orientation on Training Outcomes

    ERIC Educational Resources Information Center

    Towler, Annette

    2009-01-01

    This study focuses on trainer expressiveness and trainee mastery orientation within the context of the seductive details effect. The seductive details effect refers to inclusion of "highly interesting and entertaining information that is only tangentially related to the topic" (Harp & Mayer, 1998, p. 1). One hundred thirty-two participants…

  1. Does a pneumotach accurately characterize voice function?

    NASA Astrophysics Data System (ADS)

    Walters, Gage; Krane, Michael

    2016-11-01

    A study is presented which addresses how a pneumotach might adversely affect clinical measurements of voice function. A pneumotach is a device, typically a mask, worn over the mouth, in order to measure time-varying glottal volume flow. By measuring the time-varying difference in pressure across a known aerodynamic resistance element in the mask, the glottal volume flow waveform is estimated. Because it adds aerodynamic resistance to the vocal system, there is some concern that using a pneumotach may not accurately portray the behavior of the voice. To test this hypothesis, experiments were performed in a simplified airway model with the principal dimensions of an adult human upper airway. A compliant constriction, fabricated from silicone rubber, modeled the vocal folds. The variation of transglottal pressure, time-averaged volume flow, model vocal fold vibration amplitude, and radiated sound with subglottal pressure was measured, with and without the pneumotach in place, and differences were noted. The authors acknowledge support of NIH Grant 2R01DC005642-10A1.

  2. Accurate thermoplasmonic simulation of metallic nanoparticles

    NASA Astrophysics Data System (ADS)

    Yu, Da-Miao; Liu, Yan-Nan; Tian, Fa-Lin; Pan, Xiao-Min; Sheng, Xin-Qing

    2017-01-01

    Thermoplasmonics leads to enhanced heat generation due to localized surface plasmon resonances. The measurement of heat generation is fundamentally a complicated task, which necessitates the development of theoretical simulation techniques. In this paper, an efficient and accurate numerical scheme is proposed for applications with complex metallic nanostructures. Light absorption and temperature increase are, respectively, obtained by solving the volume integral equation (VIE) and the steady-state heat diffusion equation through the method of moments (MoM). Previously, methods based on surface integral equations (SIEs) were utilized to obtain light absorption. However, computing light absorption from the equivalent current is as expensive as O(NsNv), where Ns and Nv, respectively, denote the number of surface and volumetric unknowns. Our approach reduces the cost to O(Nv) by using VIE. The accuracy, efficiency and capability of the proposed scheme are validated by multiple simulations. The simulations show that our proposed method is more efficient than the approach based on SIEs under comparable accuracy, especially for the case where many incident waves are of interest. The simulations also indicate that the temperature profile can be tuned by several factors, such as the geometry configuration of array, beam direction, and light wavelength.

  3. Accurate method for computing correlated color temperature.

    PubMed

    Li, Changjun; Cui, Guihua; Melgosa, Manuel; Ruan, Xiukai; Zhang, Yaoju; Ma, Long; Xiao, Kaida; Luo, M Ronnier

    2016-06-27

    For the correlated color temperature (CCT) of a light source to be estimated, a nonlinear optimization problem must be solved. In all previous methods available to compute CCT, the objective function has only been approximated, and their predictions have achieved limited accuracy. For example, different unacceptable CCT values have been predicted for light sources located on the same isotemperature line. In this paper, we propose to compute CCT using the Newton method, which requires the first and second derivatives of the objective function. Following the current recommendation by the International Commission on Illumination (CIE) for the computation of tristimulus values (summations at 1 nm steps from 360 nm to 830 nm), the objective function and its first and second derivatives are explicitly given and used in our computations. Comprehensive tests demonstrate that the proposed method, together with an initial estimation of CCT using Robertson's method [J. Opt. Soc. Am. 58, 1528-1535 (1968)], gives highly accurate predictions below 0.0012 K for light sources with CCTs ranging from 500 K to 10^6 K.
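
    A Newton iteration of the kind described, using first and second derivatives of the objective, can be sketched on a toy problem. The real method minimizes the distance to the Planckian locus using CIE tristimulus summations, which are omitted here; the reciprocal-temperature (mired-like) toy objective and its 6500 K target are purely illustrative:

```python
def newton_minimize(f, t0, rel_h=1e-3, tol=1e-8, max_iter=100):
    """1-D Newton's method for minimization: t <- t - f'(t) / f''(t),
    with derivatives estimated by central differences (relative step rel_h)."""
    t = t0
    for _ in range(max_iter):
        dt = rel_h * abs(t)
        d1 = (f(t + dt) - f(t - dt)) / (2 * dt)           # first derivative
        d2 = (f(t + dt) - 2 * f(t) + f(t - dt)) / (dt * dt)  # second derivative
        step = d1 / d2
        t -= step
        if abs(step) < tol * abs(t):
            break
    return t

# toy objective: squared distance on a reciprocal-temperature (mired) scale
# to an illustrative 6500 K target
target_mired = 1e6 / 6500.0
objective = lambda t: (1e6 / t - target_mired) ** 2

cct = newton_minimize(objective, 5000.0)  # converges near 6500 K
```

    The paper's contribution is that the true objective and its two derivatives are given explicitly, so no finite-difference approximation is needed; the iteration structure, however, is the same.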

  4. Accurate Theoretical Thermochemistry for Fluoroethyl Radicals.

    PubMed

    Ganyecz, Ádám; Kállay, Mihály; Csontos, József

    2017-02-09

    An accurate coupled-cluster (CC) based model chemistry was applied to calculate reliable thermochemical quantities for hydrofluorocarbon derivatives including radicals 1-fluoroethyl (CH3-CHF), 1,1-difluoroethyl (CH3-CF2), 2-fluoroethyl (CH2F-CH2), 1,2-difluoroethyl (CH2F-CHF), 2,2-difluoroethyl (CHF2-CH2), 2,2,2-trifluoroethyl (CF3-CH2), 1,2,2,2-tetrafluoroethyl (CF3-CHF), and pentafluoroethyl (CF3-CF2). The model chemistry used contains iterative triple and perturbative quadruple excitations in CC theory, as well as scalar relativistic and diagonal Born-Oppenheimer corrections. To obtain heat of formation values with better than chemical accuracy perturbative quadruple excitations and scalar relativistic corrections were inevitable. Their contributions to the heats of formation steadily increase with the number of fluorine atoms in the radical reaching 10 kJ/mol for CF3-CF2. When discrepancies were found between the experimental and our values it was always possible to resolve the issue by recalculating the experimental result with currently recommended auxiliary data. For each radical studied here this study delivers the best heat of formation as well as entropy data.

  5. Accurate methods for large molecular systems.

    PubMed

    Gordon, Mark S; Mullin, Jonathan M; Pruitt, Spencer R; Roskop, Luke B; Slipchenko, Lyudmila V; Boatz, Jerry A

    2009-07-23

    Three exciting new methods that address the accurate prediction of processes and properties of large molecular systems are discussed. The systematic fragmentation method (SFM) and the fragment molecular orbital (FMO) method both decompose a large molecular system (e.g., protein, liquid, zeolite) into small subunits (fragments) in very different ways that are designed to both retain the high accuracy of the chosen quantum mechanical level of theory while greatly reducing the demands on computational time and resources. Each of these methods is inherently scalable and is therefore eminently capable of taking advantage of massively parallel computer hardware while retaining the accuracy of the corresponding electronic structure method from which it is derived. The effective fragment potential (EFP) method is a sophisticated approach for the prediction of nonbonded and intermolecular interactions. Therefore, the EFP method provides a way to further reduce the computational effort while retaining accuracy by treating the far-field interactions in place of the full electronic structure method. The performance of the methods is demonstrated using applications to several systems, including benzene dimer, small organic species, pieces of the alpha helix, water, and ionic liquids.

  6. Accurate equilibrium structures for piperidine and cyclohexane.

    PubMed

    Demaison, Jean; Craig, Norman C; Groner, Peter; Écija, Patricia; Cocinero, Emilio J; Lesarri, Alberto; Rudolph, Heinz Dieter

    2015-03-05

    Extended and improved microwave (MW) measurements are reported for the isotopologues of piperidine. New ground state (GS) rotational constants are fitted to MW transitions with quartic centrifugal distortion constants taken from ab initio calculations. Predicate values for the geometric parameters of piperidine and cyclohexane are found from a high level of ab initio theory including adjustments for basis set dependence and for correlation of the core electrons. Equilibrium rotational constants are obtained from GS rotational constants corrected for vibration-rotation interactions and electronic contributions. Equilibrium structures for piperidine and cyclohexane are fitted by the mixed estimation method. In this method, structural parameters are fitted concurrently to predicate parameters (with appropriate uncertainties) and moments of inertia (with uncertainties). The new structures are regarded as being accurate to 0.001 Å and 0.2°. Comparisons are made between bond parameters in equatorial piperidine and cyclohexane. Another interesting result of this study is that a structure determination is an effective way to check the accuracy of the ground state experimental rotational constants.

  7. Noninvasive hemoglobin monitoring: how accurate is enough?

    PubMed

    Rice, Mark J; Gravenstein, Nikolaus; Morey, Timothy E

    2013-10-01

    Evaluating the accuracy of medical devices has traditionally been a blend of statistical analyses, at times without contextualizing the clinical application. There have been a number of recent publications on the accuracy of a continuous noninvasive hemoglobin measurement device, the Masimo Radical-7 Pulse Co-oximeter, focusing on the traditional statistical metrics of bias and precision. In this review, which contains material presented at the Innovations and Applications of Monitoring Perfusion, Oxygenation, and Ventilation (IAMPOV) Symposium at Yale University in 2012, we critically investigated these metrics as applied to the new technology, exploring what is required of a noninvasive hemoglobin monitor and whether the conventional statistics adequately answer our questions about clinical accuracy. We discuss the glucose error grid, well known in the glucose monitoring literature, and describe an analogous version for hemoglobin monitoring. This hemoglobin error grid can be used to evaluate the required clinical accuracy (±g/dL) of a hemoglobin measurement device to provide more conclusive evidence on whether to transfuse an individual patient. The important decision to transfuse a patient usually requires both an accurate hemoglobin measurement and a physiologic reason to elect transfusion. It is our opinion that the published accuracy data of the Masimo Radical-7 is not good enough to make the transfusion decision.

  8. Fast and accurate exhaled breath ammonia measurement.

    PubMed

    Solga, Steven F; Mudalel, Matthew L; Spacek, Lisa A; Risby, Terence H

    2014-06-11

    This exhaled breath ammonia method uses a fast and highly sensitive spectroscopic method known as quartz enhanced photoacoustic spectroscopy (QEPAS) that uses a quantum cascade based laser. The monitor is coupled to a sampler that measures mouth pressure and carbon dioxide. The system is temperature controlled and specifically designed to address the reactivity of this compound. The sampler provides immediate feedback to the subject and the technician on the quality of the breath effort. Together with the quick response time of the monitor, this system is capable of accurately measuring exhaled breath ammonia representative of deep lung systemic levels. Because the system is easy to use and produces real time results, it has enabled experiments to identify factors that influence measurements. For example, mouth rinse and oral pH reproducibly and significantly affect results and therefore must be controlled. Temperature and mode of breathing are other examples. As our understanding of these factors evolves, error is reduced, and clinical studies become more meaningful. This system is very reliable and individual measurements are inexpensive. The sampler is relatively inexpensive and quite portable, but the monitor is neither. This limits options for some clinical studies and provides a rationale for future innovations.

  9. Accurate Fission Data for Nuclear Safety

    NASA Astrophysics Data System (ADS)

    Solders, A.; Gorelov, D.; Jokinen, A.; Kolhinen, V. S.; Lantz, M.; Mattera, A.; Penttilä, H.; Pomp, S.; Rakopoulos, V.; Rinta-Antila, S.

    2014-05-01

    The Accurate fission data for nuclear safety (AlFONS) project aims at high precision measurements of fission yields, using the renewed IGISOL mass separator facility in combination with a new high current light ion cyclotron at the University of Jyväskylä. The 30 MeV proton beam will be used to create fast and thermal neutron spectra for the study of neutron induced fission yields. Thanks to a series of mass separating elements, culminating with the JYFLTRAP Penning trap, it is possible to achieve a mass resolving power in the order of a few hundred thousands. In this paper we present the experimental setup and the design of a neutron converter target for IGISOL. The goal is to have a flexible design. For studies of exotic nuclei far from stability a high neutron flux (10^12 neutrons/s) at energies 1 - 30 MeV is desired, while for reactor applications neutron spectra that resemble those of thermal and fast nuclear reactors are preferred. It is also desirable to be able to produce (semi-)monoenergetic neutrons for benchmarking and to study the energy dependence of fission yields. The scientific program is extensive and is planned to start in 2013 with a measurement of isomeric yield ratios of proton-induced fission in uranium. This will be followed by studies of independent yields of thermal and fast neutron induced fission of various actinides.

  10. Provenance management in Swift with implementation details.

    SciTech Connect

    Gadelha, L. M. R; Clifford, B.; Mattoso, M.; Wilde, M.; Foster, I.

    2011-04-01

    The Swift parallel scripting language allows for the specification, execution and analysis of large-scale computations in parallel and distributed environments. It incorporates a data model for recording and querying provenance information. In this article we describe these capabilities and evaluate interoperability with other systems through the use of the Open Provenance Model. We describe Swift's provenance data model and compare it to the Open Provenance Model. We also describe and evaluate activities performed within the Third Provenance Challenge, which consisted of implementing a specific scientific workflow, capturing and recording provenance information of its execution, performing provenance queries, and exchanging provenance information with other systems. Finally, we propose improvements to both the Open Provenance Model and Swift's provenance system.

  11. A new and accurate continuum description of moving fronts

    NASA Astrophysics Data System (ADS)

    Johnston, S. T.; Baker, R. E.; Simpson, M. J.

    2017-03-01

    Processes that involve moving fronts of populations are prevalent in ecology and cell biology. A common approach to describe these processes is a lattice-based random walk model, which can include mechanisms such as crowding, birth, death, movement and agent–agent adhesion. However, these models are generally analytically intractable and it is computationally expensive to perform sufficiently many realisations of the model to obtain an estimate of average behaviour that is not dominated by random fluctuations. To avoid these issues, both mean-field (MF) and corrected mean-field (CMF) continuum descriptions of random walk models have been proposed. However, both continuum descriptions are inaccurate outside of limited parameter regimes, and CMF descriptions cannot be employed to describe moving fronts. Here we present an alternative description in terms of the dynamics of groups of contiguous occupied lattice sites and contiguous vacant lattice sites. Our description provides an accurate prediction of the average random walk behaviour in all parameter regimes. Critically, our description accurately predicts the persistence or extinction of the population in situations where previous continuum descriptions predict the opposite outcome. Furthermore, unlike traditional MF models, our approach provides information about the spatial clustering within the population and, subsequently, the moving front.
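
    A minimal version of the kind of lattice-based random walk described above, with crowding (exclusion) and birth, shows an advancing front; the 1-D geometry and all parameter values below are invented for illustration:

```python
import random

def simulate_front(length=200, init_occupied=10, p_move=1.0, p_birth=0.05,
                   steps=200, seed=0):
    """Random sequential update of a 1-D exclusion process with proliferation:
    each agent tries to hop to a random neighbouring site (blocked if that
    site is occupied) and then, with probability p_birth, tries to place a
    daughter on a random empty neighbouring site."""
    rng = random.Random(seed)
    occ = [i < init_occupied for i in range(length)]  # agents start on the left
    for _ in range(steps):
        agents = [i for i, o in enumerate(occ) if o]
        rng.shuffle(agents)  # random sequential update order
        for i in agents:
            if rng.random() < p_move:
                j = i + rng.choice((-1, 1))
                if 0 <= j < length and not occ[j]:
                    occ[i], occ[j] = False, True
                    i = j  # the birth attempt below uses the new position
            if rng.random() < p_birth:
                j = i + rng.choice((-1, 1))
                if 0 <= j < length and not occ[j]:
                    occ[j] = True
    return occ

occ = simulate_front()
front_position = max(i for i, o in enumerate(occ) if o)  # rightmost occupied site
population = sum(occ)
```

    Averaging many such realisations is what the mean-field and corrected mean-field continuum descriptions attempt to short-circuit; the grouped-site description in the paper targets exactly the front and clustering behaviour this kind of simulation exhibits.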

  12. Mouse models of human AML accurately predict chemotherapy response

    PubMed Central

    Zuber, Johannes; Radtke, Ina; Pardee, Timothy S.; Zhao, Zhen; Rappaport, Amy R.; Luo, Weijun; McCurrach, Mila E.; Yang, Miao-Miao; Dolan, M. Eileen; Kogan, Scott C.; Downing, James R.; Lowe, Scott W.

    2009-01-01

    The genetic heterogeneity of cancer influences the trajectory of tumor progression and may underlie clinical variation in therapy response. To model such heterogeneity, we produced genetically and pathologically accurate mouse models of common forms of human acute myeloid leukemia (AML) and developed methods to mimic standard induction chemotherapy and efficiently monitor therapy response. We see that murine AMLs harboring two common human AML genotypes show remarkably diverse responses to conventional therapy that mirror clinical experience. Specifically, murine leukemias expressing the AML1/ETO fusion oncoprotein, associated with a favorable prognosis in patients, show a dramatic response to induction chemotherapy owing to robust activation of the p53 tumor suppressor network. Conversely, murine leukemias expressing MLL fusion proteins, associated with a dismal prognosis in patients, are drug-resistant due to an attenuated p53 response. Our studies highlight the importance of genetic information in guiding the treatment of human AML, functionally establish the p53 network as a central determinant of chemotherapy response in AML, and demonstrate that genetically engineered mouse models of human cancer can accurately predict therapy response in patients. PMID:19339691

  14. Emplacement of Long Lava Flows: Detailed Topography of the Carrizozo Basalt Lava Flow, New Mexico

    NASA Technical Reports Server (NTRS)

    Zimbelman, J. R; Johnston, A. K.

    2000-01-01

    The Carrizozo flow in south-central New Mexico was examined to obtain detailed topography for a long basaltic lava flow. This information will be helpful in evaluating emplacement models for long lava flows.

  15. Structural details, pathways, and energetics of unfolding apomyoglobin.

    PubMed

    Onufriev, Alexey; Case, David A; Bashford, Donald

    2003-01-17

Protein folding is often difficult to characterize experimentally because of the transience of intermediate states and the complexity of the protein-solvent system. Atomistic simulations, which could provide more detailed information, have had to employ highly simplified models or high temperatures to cope with the long time scales of unfolding; direct simulation of folding is even more problematic. We report a fully atomistic simulation of the acid-induced unfolding of apomyoglobin in which the protonation of acidic side-chains to simulate low pH is sufficient to induce unfolding at room temperature with no added biasing forces or other unusual conditions; the trajectory is validated by comparison to experimental characterization of intermediate states. Novel insights provided by this analysis include: characterization of a dry swollen globule state forming a barrier to initial unfolding or final folding; observation of cooperativity in secondary and tertiary structure formation and its explanation in terms of dielectric environments; and structural details of the intermediate and the completely unfolded states. These insights involve time scales and levels of structural detail that are presently beyond the range of experiment, but come within reach through the simulation methods described here. An implicit solvation model is used to analyze the energetics of protein folding at various pH and ionic strength values, and a reasonable estimate of the folding free energy is obtained. Electrostatic interactions are found to disfavor folding.

  16. Method for Accurate Surface Temperature Measurements During Fast Induction Heating

    NASA Astrophysics Data System (ADS)

    Larregain, Benjamin; Vanderesse, Nicolas; Bridier, Florent; Bocher, Philippe; Arkinson, Patrick

    2013-07-01

    A robust method is proposed for the measurement of surface temperature fields during induction heating. It is based on the original coupling of temperature-indicating lacquers and a high-speed camera system. Image analysis tools have been implemented to automatically extract the temporal evolution of isotherms. This method was applied to the fast induction treatment of a 4340 steel spur gear, allowing the full history of surface isotherms to be accurately documented for a sequential heating, i.e., a medium frequency preheating followed by a high frequency final heating. Three isotherms, i.e., 704, 816, and 927°C, were acquired every 0.3 ms with a spatial resolution of 0.04 mm per pixel. The information provided by the method is described and discussed. Finally, the transformation temperature Ac1 is linked to the temperature on specific locations of the gear tooth.

  17. Photoacoustic computed tomography without accurate ultrasonic transducer responses

    NASA Astrophysics Data System (ADS)

    Sheng, Qiwei; Wang, Kun; Xia, Jun; Zhu, Liren; Wang, Lihong V.; Anastasio, Mark A.

    2015-03-01

Conventional photoacoustic computed tomography (PACT) image reconstruction methods assume that the object and surrounding medium are described by a constant speed-of-sound (SOS) value. In order to accurately recover fine structures, SOS heterogeneities should be quantified and compensated for during PACT reconstruction. To address this problem, several groups have proposed hybrid systems that combine PACT with ultrasound computed tomography (USCT). In such systems, an SOS map is reconstructed first via USCT. Subsequently, this SOS map is employed to inform the PACT reconstruction method. Additionally, the SOS map can provide structural information regarding tissue, which is complementary to the functional information from the PACT image. We propose a paradigm shift in the way that images are reconstructed in hybrid PACT-USCT imaging. Inspired by our observation that information about the SOS distribution is encoded in PACT measurements, we propose to jointly reconstruct the absorbed optical energy density and SOS distributions from a combined set of USCT and PACT measurements, thereby reducing the two reconstruction problems to one. This innovative approach has several advantages over conventional approaches in which PACT and USCT images are reconstructed independently: (1) Variations in the SOS will automatically be accounted for, optimizing PACT image quality; (2) The reconstructed PACT and USCT images will possess minimal systematic artifacts because errors in the imaging models will be optimally balanced during the joint reconstruction; (3) Due to the exploitation of information regarding the SOS distribution in the full-view PACT data, our approach will permit high-resolution reconstruction of the SOS distribution from sparse array data.

  18. [Finishing and detailing, stability and harmony].

    PubMed

    Fourquet, Lucile; Göttle, Magalie; Bounoure, Guy

    2014-03-01

The finishing and detailing phase, the last stage of active orthodontic treatment, makes it possible to perfect the occlusion according to criteria defined by various authors and to improve the esthetic result, while achieving the treatment objectives set during the planning phase. The reliability of end-of-treatment results cannot be ensured without an initial, individualized analysis of the risk factors for relapse specific to each patient. Only after this analysis can the orthodontist determine how to comply with these stability criteria, common to any treatment, and choose and implement reliable procedures for the individual patient. When stability is planned for as a treatment objective, patients are able to achieve stable alignment; this is the process needed to help ensure equilibrium and alignment. Eight different methods of alignment, already frequently discussed in the literature, are described and analyzed in this paper.

  19. Detailed mechanism for oxidation of benzene

    NASA Technical Reports Server (NTRS)

    Bittker, David A.

    1990-01-01

    A detailed mechanism for the oxidation of benzene is presented and used to compute experimentally obtained concentration profiles and ignition delay times over a wide range of equivalence ratio and temperature. The computed results agree qualitatively with all the experimental trends. Quantitative agreement is obtained with several of the composition profiles and for the temperature dependence of the ignition delay times. There are indications, however, that some important reactions are as yet undiscovered in this mechanism. Recent literature expressions have been used for the rate coefficients of most important reactions, except for some involving phenol. The discrepancy between the phenol pyrolysis rate coefficient used in this work and a recent literature expression remains to be explained.

  20. Capture barrier distributions: Some insights and details

    SciTech Connect

    Rowley, N.; Grar, N.; Trotta, M.

    2007-10-15

The 'experimental barrier distribution' provides a parameter-free representation of experimental heavy-ion capture cross sections that highlights the effects of entrance-channel couplings. Its relation to the s-wave transmission is discussed, and in particular it is shown how the full capture cross section can be generated from an l=0 coupled-channels calculation. Furthermore, it is shown how this transmission can be simply exploited in calculations of quasifission and evaporation-residue cross sections. The system ⁴⁸Ca+¹⁵⁴Sm is studied in detail. A calculation of the compound-nucleus spin distribution reveals a possible energy dependence of barrier weights due to polarization arising from target and projectile quadrupole phonon states; this effect also gives rise to an entrance-channel 'extra-push'.

  1. Quality and safety of detailed clinical models.

    PubMed

    Ritz, Derek

    2013-01-01

    This chapter describes quality and safety risks related to the development and use of Detailed Clinical Models (DCM) and mechanisms which may be employed to mitigate such risks. The chapter begins with a brief discussion of DCMs and the role they can play in mitigating patient safety risk. There is then a brief description of the risks which DCMs themselves may introduce, followed by the introduction of a standards-based risk assessment method and the ways this assessment method may be applied to DCMs in particular. A general description is then made of the ISO 9000-based approach to quality management systems (QMS) and, specifically, how such an approach may be applied to DCM development, maintenance, deployment and use. The chapter concludes with a discussion of specific DCM quality and safety challenges and governance approaches which may be employed to help address these.

  2. Picornavirus uncoating intermediate captured in atomic detail

    PubMed Central

    Ren, Jingshan; Wang, Xiangxi; Hu, Zhongyu; Gao, Qiang; Sun, Yao; Li, Xuemei; Porta, Claudine; Walter, Thomas S.; Gilbert, Robert J.; Zhao, Yuguang; Axford, Danny; Williams, Mark; McAuley, Katherine; Rowlands, David J.; Yin, Weidong; Wang, Junzhi; Stuart, David I.; Rao, Zihe; Fry, Elizabeth E.

    2013-01-01

    It remains largely mysterious how the genomes of non-enveloped eukaryotic viruses are transferred across a membrane into the host cell. Picornaviruses are simple models for such viruses, and initiate this uncoating process through particle expansion, which reveals channels through which internal capsid proteins and the viral genome presumably exit the particle, although this has not been clearly seen until now. Here we present the atomic structure of an uncoating intermediate for the major human picornavirus pathogen CAV16, which reveals VP1 partly extruded from the capsid, poised to embed in the host membrane. Together with previous low-resolution results, we are able to propose a detailed hypothesis for the ordered egress of the internal proteins, using two distinct sets of channels through the capsid, and suggest a structural link to the condensed RNA within the particle, which may be involved in triggering RNA release. PMID:23728514

  3. Most Detailed Image of the Crab Nebula

    NASA Technical Reports Server (NTRS)

    2005-01-01

This new Hubble image -- one among the largest ever produced with the Earth-orbiting observatory -- shows the most detailed view of the entire Crab Nebula yet made. The Crab is arguably the single most interesting object, as well as one of the most studied, in all of astronomy. It is the largest image ever taken with Hubble's WFPC2 workhorse camera.

    The Crab Nebula is one of the most intricately structured and highly dynamical objects ever observed. The new Hubble image of the Crab was assembled from 24 individual exposures taken with the NASA/ESA Hubble Space Telescope and is the highest resolution image of the entire Crab Nebula ever made.

  4. Detailed gravimetric geoid for the United States.

    NASA Technical Reports Server (NTRS)

    Strange, W. E.; Vincent, S. F.; Berry, R. H.; Marsh, J. G.

    1972-01-01

A detailed gravimetric geoid was computed for the United States using a combination of satellite-derived spherical harmonic coefficients and 1 by 1 deg mean gravity values from surface gravimetry. Comparisons of this geoid with astrogeodetic geoid data indicate that a precision of plus or minus 2 meters has been obtained. Only translations were used to convert the NAD astrogeodetic geoid heights to geocentric astrogeodetic heights. On the basis of the agreement between the geocentric astrogeodetic geoid heights and the gravimetric geoid heights, no evidence is found for rotation in the North American datum. The value of the zero-order undulation can vary by 10 to 20 meters, depending on which investigator's station positions are used to establish it.

  5. Generation and Memory for Contextual Detail

    ERIC Educational Resources Information Center

    Mulligan, Neil W.

    2004-01-01

    Generation enhances item memory but may not enhance other aspects of memory. In 12 experiments, the author investigated the effect of generation on context memory, motivated in part by the hypothesis that generation produces a trade-off in encoding item and contextual information. Participants generated some study words (e.g., hot-___) and read…

  6. A Look Inside: MRI Shows the Detail

    ERIC Educational Resources Information Center

    Gosman, Derek; Rose, Mary Annette

    2015-01-01

    Understanding the advantages, risks, and financial costs of medical technology is one way that technologically literate citizens can make better-informed decisions regarding their health and medical care. A cascade of advancements in medical imaging technologies (Ulmer & Jansen 2010) offers an exciting backdrop from which to help students…

  7. Hubble Captures Detailed Image of Uranus' Atmosphere

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Hubble Space Telescope has peered deep into Uranus' atmosphere to see clear and hazy layers created by a mixture of gases. Using infrared filters, Hubble captured detailed features of three layers of Uranus' atmosphere.

    Hubble's images are different from the ones taken by the Voyager 2 spacecraft, which flew by Uranus 10 years ago. Those images - not taken in infrared light - showed a greenish-blue disk with very little detail.

    The infrared image allows astronomers to probe the structure of Uranus' atmosphere, which consists of mostly hydrogen with traces of methane. The red around the planet's edge represents a very thin haze at a high altitude. The haze is so thin that it can only be seen by looking at the edges of the disk, and is similar to looking at the edge of a soap bubble. The yellow near the bottom of Uranus is another hazy layer. The deepest layer, the blue near the top of Uranus, shows a clearer atmosphere.

    Image processing has been used to brighten the rings around Uranus so that astronomers can study their structure. In reality, the rings are as dark as black lava or charcoal.

    This false color picture was assembled from several exposures taken July 3, 1995 by the Wide Field Planetary Camera-2.

    The Wide Field/Planetary Camera 2 was developed by the Jet Propulsion Laboratory and managed by the Goddard Spaced Flight Center for NASA's Office of Space Science.

    This image and other images and data received from the Hubble Space Telescope are posted on the World Wide Web on the Space Telescope Science Institute home page at URL http://oposite.stsci.edu/pubinfo/

  8. A Technique Using Calibrated Photography and Photoshop for Accurate Shade Analysis and Communication.

    PubMed

    McLaren, Edward A; Figueira, Johan; Goldstein, Ronald E

    2017-02-01

    This article reviews the critical aspects of controlling the shade-taking environment and discusses various modalities introduced throughout the years to acquire and communicate shade information. Demonstrating a highly calibrated digital photographic technique for capturing shade information, this article shows how to use Photoshop® to standardize images and extract color information from the tooth and shade tab for use by a ceramist for an accurate shade-matching restoration.

  9. Accurate orbit propagation with planetary close encounters

    NASA Astrophysics Data System (ADS)

    Baù, Giulio; Milani Comparetti, Andrea; Guerra, Francesca

    2015-08-01

We tackle the problem of accurately propagating the motion of those small bodies that undergo close approaches with a planet. The literature on this topic is scarce, and the reliability of the numerical results is not sufficiently discussed. The high-frequency components of the perturbation generated by a close encounter make the propagation particularly challenging, both from the point of view of the dynamical stability of the formulation and of the numerical stability of the integrator. In our approach a fixed step-size and order multistep integrator is combined with a regularized formulation of the perturbed two-body problem. When the propagated object enters the region of influence of a celestial body, the latter becomes the new primary body of attraction. Moreover, the formulation and the step-size will also be changed if necessary. We present: 1) the restarter procedure applied to the multistep integrator whenever the primary body is changed; 2) new analytical formulae for setting the step-size (given the order of the multistep, the formulation and the initial osculating orbit) in order to control the accumulation of the local truncation error and guarantee the numerical stability during the propagation; 3) a new definition of the region of influence in the phase space. We test the propagator with some real asteroids subject to the gravitational attraction of the planets, the Yarkovsky and relativistic perturbations. Our goal is to show that the proposed approach improves the performance of both the propagator implemented in the OrbFit software package (which is currently used by the NEODyS service) and a propagator based on a variable step-size and order multistep method combined with Cowell's formulation (i.e. direct integration of position and velocity in either the physical or a fictitious time).

  10. Important Nearby Galaxies without Accurate Distances

    NASA Astrophysics Data System (ADS)

    McQuinn, Kristen

    2014-10-01

The Spitzer Infrared Nearby Galaxies Survey (SINGS) and its offspring programs (e.g., THINGS, HERACLES, KINGFISH) have resulted in a fundamental change in our view of star formation and the ISM in galaxies, and together they represent the most complete multi-wavelength data set yet assembled for a large sample of nearby galaxies. These great investments of observing time have been dedicated to the goal of understanding the interstellar medium, the star formation process, and, more generally, galactic evolution at the present epoch. Nearby galaxies provide the basis on which we interpret the distant universe, and the SINGS sample represents the best-studied nearby galaxies. Accurate distances are fundamental to interpreting observations of galaxies. Surprisingly, many of the SINGS spiral galaxies have numerous, conflicting distance estimates. We can rectify this situation for 8 of the SINGS spiral galaxies within 10 Mpc at a very low cost through measurements of the tip of the red giant branch. The proposed observations will provide an accuracy of better than 0.1 in distance modulus. Our sample includes such well-known galaxies as M51 (the Whirlpool), M63 (the Sunflower), M104 (the Sombrero), and M74 (the archetypal grand-design spiral). We are also proposing coordinated parallel WFC3 UV observations of the central regions of the galaxies, rich with high-mass UV-bright stars. As a secondary science goal we will compare the resolved UV stellar populations with integrated UV emission measurements used in calibrating star formation rates. Our observations will complement the growing HST UV atlas of high-resolution images of nearby galaxies.

  11. Accurate paleointensities - the multi-method approach

    NASA Astrophysics Data System (ADS)

    de Groot, Lennart

    2016-04-01

The accuracy of models describing rapid changes in the geomagnetic field over the past millennia critically depends on the availability of reliable paleointensity estimates. Over the past decade methods to derive paleointensities from lavas (the only recorder of the geomagnetic field that is available all over the globe and through geologic times) have seen significant improvements and various alternative techniques were proposed. The 'classical' Thellier-style approach was optimized and selection criteria were defined in the 'Standard Paleointensity Definitions' (Paterson et al, 2014). The Multispecimen approach was validated and the importance of additional tests and criteria to assess Multispecimen results must be emphasized. Recently, a non-heating, relative paleointensity technique was proposed -the pseudo-Thellier protocol- which shows great potential in both accuracy and efficiency, but currently lacks a solid theoretical underpinning. Here I present work using all three of the aforementioned paleointensity methods on suites of young lavas taken from the volcanic islands of Hawaii, La Palma, Gran Canaria, Tenerife, and Terceira. Many of the sampled cooling units are <100 years old, the actual field strength at the time of cooling is therefore reasonably well known. Rather intuitively, flows that produce coherent results from two or more different paleointensity methods yield the most accurate estimates of the paleofield. Furthermore, the results for some flows pass the selection criteria for one method, but fail in other techniques. Scrutinizing and combining all acceptable results yielded reliable paleointensity estimates for 60-70% of all sampled cooling units - an exceptionally high success rate. This 'multi-method paleointensity approach' therefore has high potential to provide the much-needed paleointensities to improve geomagnetic field models for the Holocene.
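The coherence-across-methods criterion described above can be sketched as a simple filter. This is an illustrative reading of the approach, not the author's actual acceptance rules; the 10% tolerance, the function name, and the method keys are all assumptions:

```python
def coherent_paleointensity(estimates, rel_tol=0.10):
    """Hypothetical multi-method filter: keep a cooling unit only when at
    least two independent methods agree to within rel_tol (relative),
    and return the mean of the largest mutually coherent group of
    estimates; return None when no two methods are coherent.

    `estimates` maps method name -> paleointensity (microtesla), with
    None for methods whose result failed that method's own criteria."""
    vals = [v for v in estimates.values() if v is not None]
    best = None
    for a in vals:
        group = [b for b in vals if abs(b - a) <= rel_tol * a]
        if len(group) >= 2 and (best is None or len(group) > len(best)):
            best = group
    if best is None:
        return None
    return sum(best) / len(best)
```

A unit for which only one method passes, or whose methods disagree beyond the tolerance, yields None and would be set aside, mirroring the observation that flows with coherent results from two or more methods give the most accurate paleofield estimates.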

  12. Accurate glucose detection in a small etalon

    NASA Astrophysics Data System (ADS)

    Martini, Joerg; Kuebler, Sebastian; Recht, Michael; Torres, Francisco; Roe, Jeffrey; Kiesel, Peter; Bruce, Richard

    2010-02-01

We are developing a continuous glucose monitor for subcutaneous long-term implantation. This detector contains a double-chamber Fabry-Perot etalon that measures the differential refractive index (RI) between a reference and a measurement chamber at 850 nm. The etalon chambers have wavelength-dependent transmission maxima whose spectral positions depend linearly on the RI of their contents. An RI difference of Δn = 1.5×10⁻⁶ changes the spectral position of a transmission maximum by 1 pm in our measurement. By sweeping the wavelength of a single-mode Vertical-Cavity Surface-Emitting Laser (VCSEL) linearly in time and detecting the maximum transmission peaks of the etalon, we are able to measure the RI of a liquid. We have demonstrated an accuracy of Δn = ±3.5×10⁻⁶ over a Δn range of 0 to 1.75×10⁻⁴, and an accuracy of 2% over a Δn range of 1.75×10⁻⁴ to 9.8×10⁻⁴. The accuracy is primarily limited by the reference measurement. The RI difference between the etalon chambers is made specific to glucose by the competitive, reversible release of Concanavalin A (ConA) from an immobilized dextran matrix. The matrix, and the ConA bound to it, is positioned outside the optical detection path. ConA is released from the matrix by reacting with glucose and diffuses into the optical path, changing the RI in the etalon. Factors such as temperature affect the RI in the measurement and reference chambers equally and so do not affect the differential measurement; a typical standard deviation in RI is ±1.4×10⁻⁶ over the range 32°C to 42°C. The detector enables an accurate glucose-specific concentration measurement.
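The stated response is a simple linear conversion between differential RI and spectral peak shift. A minimal sketch of that conversion, using only the figure quoted in the abstract (1 pm of peak shift per Δn of 1.5×10⁻⁶); the constant and function names are ours:

```python
# Figure quoted in the abstract: a differential refractive index of
# 1.5e-6 shifts an etalon transmission maximum by 1 pm.
DELTA_N_PER_PM = 1.5e-6

def peak_shift_pm(delta_n):
    """Spectral shift (in picometres) of a transmission maximum for a
    given differential refractive index, assuming the stated linear
    response of the etalon."""
    return delta_n / DELTA_N_PER_PM
```

On this scale, the reported detection accuracy of Δn = ±3.5×10⁻⁶ corresponds to resolving peak positions to roughly ±2.3 pm during the wavelength sweep.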

  13. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Astrophysics Data System (ADS)

    Wheeler, K.; Knuth, K.; Castle, P.

    2005-12-01

and IKONOS imagery and the 3-D volume estimates. The combination of these then allows for a rapid and, hopefully, very accurate estimation of biomass.

  14. How flatbed scanners upset accurate film dosimetry

    NASA Astrophysics Data System (ADS)

    van Battum, L. J.; Huizenga, H.; Verdaasdonk, R. M.; Heukelom, S.

    2016-01-01

Film is an excellent dosimeter for verification of dose distributions due to its high spatial resolution. Irradiated film can be digitized with low-cost, transmission, flatbed scanners. However, a disadvantage is their lateral scan effect (LSE): a scanner readout change over its lateral scan axis. Although anisotropic light scattering was presented as the origin of the LSE, this paper presents an alternative cause. To this end, the LSE of two flatbed scanners (Epson 1680 Expression Pro and Epson 10000XL) with Gafchromic film (EBT, EBT2, EBT3) was investigated, focusing on three effects: cross talk, optical path length and polarization. Cross talk was examined using triangular sheets of various optical densities. The optical path length effect was studied using absorptive and reflective neutral density filters with well-defined optical characteristics (OD range 0.2-2.0). Linear polarizer sheets were used to investigate the effect of light polarization on the CCD signal in the absence and presence of (un)irradiated Gafchromic film. Film dose values ranged from 0.2 to 9 Gy, i.e. an optical density range of 0.25 to 1.1. Measurements were performed in the scanner’s transmission mode, with red-green-blue channels. The LSE was found to depend on scanner construction and film type. Its magnitude depends on dose, increasing up to 14% at the maximum lateral position for 9 Gy. Cross talk was only significant in high contrast regions, up to 2% for very small fields. The optical path length effect introduced by film on the scanner causes a 3% effect for pixels at the extreme lateral position. Light polarization due to the film and the scanner’s optical mirror system is the main contributor, different in magnitude for the red, green and blue channels. We conclude that any Gafchromic EBT-type film scanned with a flatbed scanner will be subject to these optical effects. Accurate dosimetry therefore requires correction of the LSE, i.e. determination of the LSE per color channel and per dose delivered to the film.
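A per-channel LSE correction of the kind the conclusion calls for can be sketched as follows. The quadratic lateral profile, the amplitude k, and the function name are illustrative assumptions only; the paper's actual correction would be derived from calibration scans per color channel and per dose level:

```python
def correct_lateral_effect(pixel_value, x_frac, k):
    """Hypothetical LSE correction sketch.  The abstract reports a
    dose-dependent readout change that grows toward the lateral edges
    (up to ~14% at 9 Gy).  Here the relative readout error is modeled
    as k * (2*x_frac - 1)**2, where x_frac is the pixel's fractional
    lateral position (0 = one edge, 0.5 = scan axis center, 1 = other
    edge) and k is the channel- and dose-dependent amplitude obtained
    from a calibration scan.  The measured value is divided out by the
    modeled error to recover the center-equivalent reading."""
    rel_err = k * (2.0 * x_frac - 1.0) ** 2
    return pixel_value / (1.0 + rel_err)
```

With k = 0.14 (the worst case quoted for 9 Gy), a pixel at the extreme lateral position is scaled down by 14% while a pixel on the central axis is left unchanged.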

  15. How accurate are Scottish cancer registration data?

    PubMed Central

    Brewster, D.; Crichton, J.; Muir, C.

    1994-01-01

    In order to assess the accuracy of Scottish cancer registration data, a random sample of 2,200 registrations, attributed to the year 1990, was generated. Relevant medical records were available for review in 2,021 (92%) cases. Registration details were reabstracted from available records and compared with data in the registry. Discrepancies in identifying items of data (surname, forename, sex and date of birth) were found in 3.5% of cases. Most were trivial and would not disturb record linkage. Discrepancy rates of 7.1% in post code of residence at the time of diagnosis (excluding differences arising through boundary changes), 11.0% in anniversary date (excluding differences of 6 weeks or less), 7.7% in histological verification status, 5.4% in ICD-9 site codes (the first three digits) and 14.5% in ICD-O morphology codes (excluding 'inferred' morphology codes) were recorded. Overall, serious discrepancies were judged to have occurred in 2.8% of cases. In many respects, therefore, Scottish cancer registration data show a high level of accuracy that compares favourably to the reported accuracy of the few other cancer registries undertaking such analyses. PMID:7947104

  16. Integrating complementary information for photorealistic representation of large-scale environments

    NASA Astrophysics Data System (ADS)

    Hu, Jinhui

    2007-12-01

    A wealth of datasets from different sensors exists for environment representation. The key observations of this thesis are that the different datasets are complementary and that fusing information from complementary datasets reduces errors in processing each dataset. In addition, a fusion method benefits from the merit of each dataset, hence helps us to represent large-scale environments in an efficient and accurate way. This thesis presents a hybrid approach fusing information from four complementary datasets, LiDAR data, aerial images, ground images and videos, to photorealistically represent large-scale environments. LiDAR data samples are dense in surface points and they directly measure model heights with accuracy up to centimeters. However, edges from LiDAR data are jaggy due to the relatively low sampling rate (usually one meter) of the sensor and reconstruction results from LiDAR lack color information. On the other hand, aerial images provide detailed texture and color information in high-resolution, making them necessary for texture data and appealing for extracting detailed model features. However, reconstruction from stereo aerial images often generates sparse points, making them unsuitable for reconstruction of complex surfaces, such as curved surfaces and roofs with slopes. Ground images offer high-resolution texture information and details of model facades, but they are local, static and lack the capability to provide information of the most recent changes in the environment. Live videos are real-time, making them ideal for updating the information of the environment, however, they are often low-resolution. A natural conclusion is to combine the geometry, photometry, and other sensing sources to compensate for the shortcomings of each sensing technology and obtain a more detailed and accurate representation of the environment. 
In this thesis, we first fuse information from both LiDAR and an aerial image to create urban models with accurate surfaces

  17. Accurate calculation of field and carrier distributions in doped semiconductors

    NASA Astrophysics Data System (ADS)

    Yang, Wenji; Tang, Jianping; Yu, Hongchun; Wang, Yanguo

    2012-06-01

We use the numerical squeezing algorithm (NSA) combined with the shooting method to accurately calculate the built-in fields and carrier distributions in doped silicon films (SFs) in the micron and sub-micron thickness range, and results are presented in graphical form for a variety of doping profiles under different boundary conditions. As a complementary approach, we also present the methods and results for the inverse problem (IVP): finding the doping profile in the SFs for a given field distribution. The solution of the IVP provides an approach to arbitrarily designing the field distribution in SFs, which is very important for low-dimensional (LD) systems and device design. Furthermore, the solution of the IVP is both direct and straightforward for all one-, two-, and three-dimensional semiconductor systems. With current efforts focused on LD physics, knowledge of the field and carrier distribution details in LD systems will facilitate further research on other aspects, and hence the current work provides a platform for such research.

  18. Accurate measurement of streamwise vortices in low speed aerodynamic flows

    NASA Astrophysics Data System (ADS)

    Waldman, Rye M.; Kudo, Jun; Breuer, Kenneth S.

    2010-11-01

Low Reynolds number experiments with flapping animals (such as bats and small birds) are of current interest both for understanding biological flight mechanics and for their application to Micro Air Vehicles (MAVs), which operate in a similar parameter space. Previous PIV wake measurements have described the structures left by bats and birds and provided insight into the time history of their aerodynamic force generation; however, these studies have faced difficulty drawing quantitative conclusions due to significant experimental challenges associated with the highly three-dimensional and unsteady nature of the flows and the low wake velocities associated with lifting bodies that only weigh a few grams. This requires the high-speed resolution of small flow features in a large field of view using limited laser energy and finite camera resolution. Cross-stream measurements are further complicated by the high out-of-plane flow, which requires thick laser sheets and short interframe times. To quantify and address these challenges, we present data from a model study on the wake behind a fixed wing at conditions comparable to those found in biological flight. We present a detailed analysis of the PIV wake measurements, discuss the criteria necessary for accurate measurements, and present a new dual-plane PIV configuration to resolve these issues.

  19. Ad hoc methods for accurate determination of Bader's atomic boundary

    NASA Astrophysics Data System (ADS)

    Polestshuk, Pavel M.

    2013-08-01

    In addition to the recently published triangulation method [P. M. Polestshuk, J. Comput. Chem. 34, 206 (2013)], 10.1002/jcc.23121, two new highly accurate approaches, ZFSX and SINTY, for the integration over an atomic region covered by a zero-flux surface (zfs) were developed and efficiently interfaced into the TWOE program. The ZFSX method was realized as three independent modules (ZFSX-1, ZFSX-3, and ZFSX-5) handling interatomic surfaces of differing complexity. Details of the algorithmic implementation of ZFSX and SINTY are discussed. Special attention is paid to an extended analysis of errors in calculations of atomic properties. It was shown that uncertainties in zfs determination caused by the ZFSX and SINTY approaches contribute negligibly (less than 10⁻⁶ a.u.) to the total atomic integration errors. Moreover, the new methods are able to evaluate atomic integrals in reasonable time and can be universally applied to systems of any complexity. It is suggested, therefore, that ZFSX and SINTY can be regarded as benchmark methods for the computation of any Quantum Theory of Atoms in Molecules atomic property.

  20. Ad hoc methods for accurate determination of Bader's atomic boundary.

    PubMed

    Polestshuk, Pavel M

    2013-08-07

    In addition to the recently published triangulation method [P. M. Polestshuk, J. Comput. Chem. 34, 206 (2013)], two new highly accurate approaches, ZFSX and SINTY, for the integration over an atomic region covered by a zero-flux surface (zfs) were developed and efficiently interfaced into the TWOE program. The ZFSX method was realized as three independent modules (ZFSX-1, ZFSX-3, and ZFSX-5) handling interatomic surfaces of differing complexity. Details of the algorithmic implementation of ZFSX and SINTY are discussed. Special attention is paid to an extended analysis of errors in calculations of atomic properties. It was shown that uncertainties in zfs determination caused by the ZFSX and SINTY approaches contribute negligibly (less than 10⁻⁶ a.u.) to the total atomic integration errors. Moreover, the new methods are able to evaluate atomic integrals in reasonable time and can be universally applied to systems of any complexity. It is suggested, therefore, that ZFSX and SINTY can be regarded as benchmark methods for the computation of any Quantum Theory of Atoms in Molecules atomic property.

  1. Generating Facial Expressions Using an Anatomically Accurate Biomechanical Model.

    PubMed

    Wu, Tim; Hung, Alice; Mithraratne, Kumar

    2014-11-01

    This paper presents a computational framework for modelling the biomechanics of human facial expressions. A detailed high-order (Cubic-Hermite) finite element model of the human head was constructed using anatomical data segmented from magnetic resonance images. The model includes a superficial soft-tissue continuum consisting of skin, the subcutaneous layer and the superficial Musculo-Aponeurotic system. Embedded within this continuum mesh are 20 pairs of facial muscles which drive facial expressions. These muscles were treated as transversely-isotropic and their anatomical geometries and fibre orientations were accurately depicted. In order to capture the relative composition of muscles and fat, material heterogeneity was also introduced into the model. Complex contact interactions between the lips and eyelids, and between the superficial soft-tissue continuum and the deep rigid skeletal bones, were also computed. In addition, this paper investigates the impact of incorporating material heterogeneity and contact interactions, which are often neglected in similar studies. Four facial expressions were simulated using the developed model and the results were compared with surface data obtained from a 3D structured-light scanner. Predicted expressions showed good agreement with the experimental data.

  2. Gigantic Cosmic Corkscrew Reveals New Details About Mysterious Microquasar

    NASA Astrophysics Data System (ADS)

    2004-10-01

    [Figure captions: Image of SS 433, with a red-and-blue line showing the path of constant-speed jets; note the poor match of the path to the image. The same image, with colored beads representing particle ejections at different speeds; the particle path now matches. Credit: Blundell & Bowler, NRAO/AUI/NSF.] The new VLA image shows two full turns of the jets' corkscrew on both sides of the core. Analyzing the image showed that if material came from the core at a constant speed, the jet paths would not accurately match the details of the image. "By simulating ejections at varying speeds, we were able to produce an exact match to the observed structure," Blundell explained. The scientists first did their match to one of the jets. "We then were stunned to see that the varying speeds that matched the structure of one jet also exactly reproduced the other jet's path," Blundell said. Matching the speeds in the two jets reproduced the observed structure even allowing for the fact that, because one jet is moving more nearly away from us than the other, it takes light longer to reach us from it, she added. The astrophysicists speculate that the changes in ejection speed may be caused by changes in the rate at which material is transferred from the companion star onto the accretion disk. The detailed new VLA image also allowed the astrophysicists to determine that SS 433 is nearly 18,000 light-years distant from Earth. Earlier estimates had the object, in the constellation Aquila, as near as 10,000 light-years. An accurate distance, the scientists said, now allows them to better determine the age of the shell of debris blown out by the supernova explosion that created the dense, compact object in the microquasar. Knowing the distance accurately also allows them to measure the actual brightness of the microquasar's components, and this, they said, improves their understanding of the physical processes at work in the system.
The breakthrough image
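
The varying-speed fit described in the article can be illustrated with a minimal kinematic model of a precessing ballistic jet; the precession period, cone angle, and speed below are placeholders, not the fitted SS 433 parameters:

```python
import numpy as np

def jet_locus(t_obs, v, P=162.0, theta=np.radians(20.0), n_blobs=400):
    # Ejection epochs of the blobs still visible (last two precession periods).
    t_ej = np.linspace(t_obs - 2.0*P, t_obs, n_blobs)
    phase = 2.0*np.pi*t_ej/P
    # Unit ejection directions on the precession cone (axis = x in the sky
    # plane, observer along z; light-travel-time delays are ignored here).
    ux = np.cos(theta)*np.ones_like(phase)
    uy = np.sin(theta)*np.cos(phase)
    r = v*(t_obs - t_ej)           # ballistic distance travelled by each blob
    return r*ux, r*uy              # projected corkscrew on the sky plane
```

With a scalar `v` the locus is the fixed constant-speed spiral that the article says fails to match; passing an array of per-blob speeds for `v` (one entry per ejection epoch) modulates the spiral pitch, which is the varying-speed effect the authors simulate.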

  3. Description of Axial Detail for ROK Fuel

    SciTech Connect

    Trellue, Holly R; Galloway, Jack D

    2012-04-20

    For the purpose of NDA simulations of the ROK fuel assemblies, we have developed an axial burnup distribution to represent the pins themselves based on gamma scans of rods in the G23 assembly. For the purpose of modeling the G23 assembly (both at ORNL and LANL), the pin-by-pin burnup map as simulated by ROK is being assumed to represent the radial burnup distribution. However, both DA and NDA results indicate that this simulated estimate is not 100% correct. In particular, the burnup obtained from the axial gamma scan of 7 pins does not represent exactly the same 'average' pin burnup as the ROK simulation. Correction for this discrepancy is a goal of the well-characterized assembly task but will take time. For now, I have come up with a correlation for 26 axial points of the burnup as obtained by gamma scans of 7 different rods (C13, G01, G02, J11, K10, L02, and M04, neglecting K02 at this time) to the average burnup given by the simulation for each of the rods individually. The resulting fraction in each axial zone is then averaged for the 7 different rods so that it can represent every fuel pin in the assembly. The burnup in each of the 26 axial zones of rods in all ROK assemblies will then be directly adjusted using this fraction, which is given in Table 1. Note that the gamma scan data given by ROK for assembly G23 included a length of ~3686 mm, so the first 12 mm and the last 14 mm were ignored to give an actual rod length of ~366 cm. To represent assembly F02 in which no pin-by-pin burnup distribution is given by ROK, we must model it using infinitely-reflected geometry but can look at the effects of measuring in different axial zones by using intermediate burnup files (i.e. smaller burnups than 28 GWd/MTU) and determining which axial zone(s) each burnup represents. Details for assembly F02 are then given in Tables 2 and 3, which is given in Table 1 and has 44 total axial zones to represent the top meter in explicit detail in addition to the
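
A hypothetical sketch of the fraction construction described above: each rod's gamma-scan profile is scaled so its axial mean matches that rod's simulated average burnup, and the per-zone multipliers are then averaged over the rods. Array shapes and names are assumptions, not the report's actual data layout:

```python
import numpy as np

def axial_fractions(scan_profiles, rod_avg_burnups):
    # scan_profiles: (n_rods, n_zones) gamma-scan values, one row per rod
    # rod_avg_burnups: (n_rods,) simulated average burnup of each rod
    scans = np.asarray(scan_profiles, dtype=float)
    avgs = np.asarray(rod_avg_burnups, dtype=float)
    # Scale each scan so its axial mean equals the rod's simulated average.
    scaled = scans * (avgs / scans.mean(axis=1))[:, None]
    # Per-zone fraction of each rod's average burnup, averaged over rods.
    # (Mathematically the averages cancel; they are kept to mirror the
    # workflow of correlating scans to the simulated rod averages.)
    return (scaled / avgs[:, None]).mean(axis=0)

# Multiplying any pin's average burnup by these per-zone fractions yields
# the adjusted axial burnup distribution for that pin.
```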

  4. [Expectations for academic detailing from the standpoint of evidence-based medicine (EBM)].

    PubMed

    Nakayama, Takeo

    2014-01-01

    Academic detailing, the provision of interactive information services by pharmacists to clinicians, has been attracting interest in the US and European countries. A systematic review of randomized controlled trials supported the effectiveness of academic detailing. Knowledge of evidence-based medicine and clinical practice guidelines is one of the essential bases for pharmacists to promote these activities. In addition, pharmacists need to understand the attitudes and ways of thinking of clinicians toward medicines. Through communication and information sharing between clinicians and pharmacists, collaborations to modify and improve the use of medicines should be facilitated. On these grounds, academic detailing will be able to play an important role in real healthcare circumstances.

  5. Robust High-Resolution Cloth Using Parallelism, History-Based Collisions and Accurate Friction

    PubMed Central

    Selle, Andrew; Su, Jonathan; Irving, Geoffrey; Fedkiw, Ronald

    2015-01-01

    In this paper we simulate high-resolution cloth consisting of up to 2 million triangles, which allows us to achieve highly detailed folds and wrinkles. Since the level of detail is also influenced by object collision and self-collision, we propose a more accurate model for cloth-object friction. We also propose a robust history-based repulsion/collision framework where repulsions are treated accurately and efficiently on a per-time-step basis. Distributed memory parallelism is used for both time evolution and collisions, and we specifically address Gauss-Seidel ordering of repulsion/collision response. This algorithm is demonstrated by several high-resolution and high-fidelity simulations. PMID:19147895

  6. Magnificent Details in a Dusty Spiral Galaxy

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In 1995, the majestic spiral galaxy NGC 4414 was imaged by the Hubble Space Telescope as part of the HST Key Project on the Extragalactic Distance Scale. An international team of astronomers, led by Dr. Wendy Freedman of the Observatories of the Carnegie Institution of Washington, observed this galaxy on 13 different occasions over the course of two months. Images were obtained with Hubble's Wide Field Planetary Camera 2 (WFPC2) through three different color filters. Based on their discovery and careful brightness measurements of variable stars in NGC 4414, the Key Project astronomers were able to make an accurate determination of the distance to the galaxy. The resulting distance to NGC 4414, 19.1 megaparsecs or about 60 million light-years, along with similarly determined distances to other nearby galaxies, contributes to astronomers' overall knowledge of the rate of expansion of the universe. In 1999, the Hubble Heritage Team revisited NGC 4414 and completed its portrait by observing the other half with the same filters as were used in 1995. The end result is a stunning full-color look at the entire dusty spiral galaxy. The new Hubble picture shows that the central regions of this galaxy, as is typical of most spirals, contain primarily older, yellow and red stars. The outer spiral arms are considerably bluer due to ongoing formation of young, blue stars, the brightest of which can be seen individually at the high resolution provided by the Hubble camera. The arms are also very rich in clouds of interstellar dust, seen as dark patches and streaks silhouetted against the starlight.

  7. Detailed anatomy of the capsulopalpebral fascia.

    PubMed

    Nam, Yong Seok; Han, Seung-Ho; Shin, Sun Young

    2012-09-01

    This study was designed to elucidate the detailed anatomy of the capsulopalpebral fascia (CPF) and capsulopalpebral head (CPH), and their relationships to the inferior rectus muscle (IRM). In this cohort study, 40 eyes from 20 cadavers were observed macroscopically. Dissection was carried out from the CPF origin to its insertion, and the CPF origin pattern was photographed in each specimen. The width, length, and tensile strength of the CPF were measured. The CPF originated 25.07 ± 1.07 mm laterally and 24.86 ± 1.10 mm medially from the origin of the IRM and extended to the lower border of the inferior oblique muscle, and it firmly adhered to the IRM surface and formed into the CPH. The CPH was 4.31 ± 0.86 mm laterally and 6.18 ± 1.94 mm medially in length and 7.47 ± 0.81 mm in width. The CPF originated from the total width or 3/4 temporal part of the IRM in 32 (80%) of 40 faces. There was asymmetry in the pattern of the CPF origin between the left and right eyes in 4 of 20 paired specimens (20%). The tensile strength of the posterior layer was 19.12 ± 11.22 N, which was significantly higher than that of the anterior layer (8.59 ± 3.88 N) (P = 0.001). This study provided a good understanding of the CPF structures conducive to performing IRM surgery.

  8. Some articulatory details of emotional speech

    NASA Astrophysics Data System (ADS)

    Lee, Sungbok; Yildirim, Serdar; Bulut, Murtaza; Kazemzadeh, Abe; Narayanan, Shrikanth

    2005-09-01

    Differences in speech articulation among four emotion types, neutral, anger, sadness, and happiness, are investigated by analyzing tongue tip, jaw, and lip movement data collected from one male and one female speaker of American English. The data were collected using an electromagnetic articulography (EMA) system while subjects produced simulated emotional speech. Pitch, root-mean-square (rms) energy and the first three formants were estimated for vowel segments. For both speakers, angry speech exhibited the largest rms energy and the largest articulatory activity in terms of displacement range and movement speed. Happy speech is characterized by the largest pitch variability. It has higher rms energy than neutral speech, but its articulatory activity is comparable to, or less than, that of neutral speech. That is, happy speech is more prominent in voicing activity than in articulation. Sad speech exhibits the longest sentence durations and lower rms energy. However, its articulatory activity is no less than that of neutral speech. Interestingly, for the male speaker, articulation for vowels in sad speech is consistently more peripheral (i.e., more forwarded displacements) when compared to other emotions. However, this does not hold for the female subject. These and other results will be discussed in detail with associated acoustics and perceived emotional qualities. [Work supported by NIH.]

  9. Details of tetrahedral anisotropic mesh adaptation

    NASA Astrophysics Data System (ADS)

    Jensen, Kristian Ejlebjerg; Gorman, Gerard

    2016-04-01

    We have implemented tetrahedral anisotropic mesh adaptation using the local operations of coarsening, swapping, refinement and smoothing in MATLAB without the use of any for-loops, i.e. the script is fully vectorised. In the process of doing so, we have made three observations related to details of the implementation: 1. restricting refinement to a single edge split per element not only simplifies the code, it also improves mesh quality, 2. face-to-edge swapping is unnecessary, and 3. optimising for the Vassilevski functional tends to give a slightly higher value for the mean condition number functional than optimising for the condition number functional directly. These observations have been made for a uniform and a radial shock metric field, both starting from a structured mesh in a cube. Finally, we compare two coarsening techniques and demonstrate the importance of applying smoothing in the mesh adaptation loop. The results pertain to a unit cube geometry, but we also show the effect of corners and edges by applying the implementation in a spherical geometry.
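
The fully vectorised style described above can be illustrated in NumPy (a stand-in for the MATLAB implementation): computing every anisotropic edge length L_i = sqrt(e_i^T M_i e_i) in one shot, with no loop over edges. Averaging the two endpoint metrics is one common convention and an assumption here:

```python
import numpy as np

def metric_edge_lengths(pts, edges, metric):
    # pts:    (n_pts, 3) vertex coordinates
    # edges:  (n_edges, 2) vertex indices of each edge
    # metric: (n_pts, 3, 3) symmetric positive definite per-vertex tensors
    e = pts[edges[:, 1]] - pts[edges[:, 0]]            # all edge vectors
    # Average the metrics of the two endpoints (one common convention).
    M = 0.5*(metric[edges[:, 0]] + metric[edges[:, 1]])
    # L_i = sqrt(e_i^T M_i e_i), evaluated for all edges at once.
    return np.sqrt(np.einsum('ij,ijk,ik->i', e, M, e))
```

An adaptation loop would then split edges with L > sqrt(2) and collapse edges with L < 1/sqrt(2), all driven by this vectorised length computation.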

  10. Detailed abundances in EMP dwarfs from SDSS

    NASA Astrophysics Data System (ADS)

    Sbordone, Luca; Caffau, Elisabetta; Bonifacio, Piercarlo

    2012-09-01

    We report on the current status of an ongoing survey to select extremely metal poor (EMP) turn-off (TO) stars from Sloan Digital Sky Survey (SDSS) spectra, and determine their detailed chemical composition through high resolution follow-up. So far, 26 stars have been observed with UVES at VLT and X-SHOOTER at VLT, all but two showing an iron content below [Fe/H] = -3. Among them we detected the current record holder for the lowest total metallicity (SDSS J102915+172927, Z = 10⁻⁵ Zsolar), four carbon-enhanced extremely metal poor objects (CEMP), as well as subsets with enhanced Ni and Mn. Lithium abundances or upper limits were derived, confirming the previously detected "meltdown" of the Spite plateau for metallicities below about [Fe/H] = -2.8. SDSS J102915+172927 in particular shows no detectable Li I 670.8 nm doublet, leading to an upper limit of A(Li) < 1.1, hinting at an even deeper Li depletion in TO stars below [Fe/H] = -4. Spectroscopic follow-up is currently being pursued by the recently started ESO large program TOPoS, which aims to observe about 80 more EMP candidates.

  11. Total lower lid reconstruction: technical details.

    PubMed Central

    Hughes, W L

    1976-01-01

    The main complications of this type of lower lid reconstruction are lash loss or malposition, entropion of the upper lid, upper lid retraction, undue laxity of the lower lid, and lid margin deformities. These can all be avoided by meticulous attention to surgical details and dressing techniques. I believe that this is the best and simplest method of providing a lid of acceptable function and appearance. The advantages of this type of operation are: (1) The new lower lid is constructed of lid tissue including the tarsus and conjunctiva from the upper lid. (2) The function and appearance of the new lower lid are acceptable with practically no tendency to late retraction. (3) The function and appearance of the upper lid need not be interfered with. (4) No external scars are produced except when a lash transplant is done. This transplant leaves a small, hardly noticeable scar in the lower part of the opposite brow. (5) The technique is relatively simple and well within the realm of any well-trained ophthalmic surgeon. The obvious disadvantages are the surgeon's inability to inspect the eye for two to four months and the inconvenience to the patient of having one eye closed for such a long period of time. PMID:867633

  12. Seismic Waves, 4th order accurate

    SciTech Connect

    2013-08-16

    SW4 is a program for simulating seismic wave propagation on parallel computers. SW4 solves the seismic wave equations in Cartesian coordinates. It is therefore appropriate for regional simulations, where the curvature of the earth can be neglected. SW4 implements a free surface boundary condition on a realistic topography, absorbing super-grid conditions on the far-field boundaries, and a kinematic source model consisting of point force and/or point moment tensor source terms. SW4 supports a fully 3-D heterogeneous material model that can be specified in several formats. SW4 can output synthetic seismograms in an ASCII text format, or in the SAC binary format. It can also present simulation information as GMT scripts, which can be used to create annotated maps. Furthermore, SW4 can output the solution as well as the material model along 2-D grid planes.
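
Not SW4 itself, but a toy 1-D constant-coefficient sketch of the kind of scheme it generalizes: the standard 4th-order-accurate central stencil for u_xx combined with second-order leapfrog time stepping. Grid, boundary handling, and parameters are illustrative only:

```python
import numpy as np

def step_wave(u_prev, u_curr, c, h, dt):
    # One leapfrog step of u_tt = c^2 u_xx using the 4th-order-accurate
    # central stencil (-1, 16, -30, 16, -1)/(12 h^2) for u_xx.
    uxx = np.zeros_like(u_curr)
    uxx[2:-2] = (-u_curr[:-4] + 16*u_curr[1:-3] - 30*u_curr[2:-2]
                 + 16*u_curr[3:-1] - u_curr[4:]) / (12.0*h*h)
    return 2.0*u_curr - u_prev + (c*dt)**2 * uxx

# Drive it: a Gaussian pulse released at rest splits into two traveling waves.
n, c = 101, 1.0
h = 1.0/(n - 1)
dt = 0.4*h/c                       # comfortably inside the CFL limit
x = np.linspace(0.0, 1.0, n)
u_prev = np.exp(-200.0*(x - 0.5)**2)
u_curr = u_prev.copy()             # zero initial velocity
for _ in range(50):
    u_prev, u_curr = u_curr, step_wave(u_prev, u_curr, c, h, dt)
```

SW4 extends this idea to the full 3-D elastic wave equations with heterogeneous material, topography, and super-grid absorbing boundaries.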

  13. Detailed map of a cis-regulatory input function

    NASA Astrophysics Data System (ADS)

    Setty, Y.; Mayo, A. E.; Surette, M. G.; Alon, U.

    2003-06-01

    Most genes are regulated by multiple transcription factors that bind specific sites in DNA regulatory regions. These cis-regulatory regions perform a computation: the rate of transcription is a function of the active concentrations of each of the input transcription factors. Here, we used accurate gene expression measurements from living cell cultures, bearing GFP reporters, to map in detail the input function of the classic lacZYA operon of Escherichia coli, as a function of about a hundred combinations of its two inducers, cAMP and isopropyl β-D-thiogalactoside (IPTG). We found an unexpectedly intricate function with four plateau levels and four thresholds. This result compares well with a mathematical model of the binding of the regulatory proteins cAMP receptor protein (CRP) and LacI to the lac regulatory region. The model is also used to demonstrate that with few mutations, the same region could encode much purer AND-like or even OR-like functions. This possibility means that the wild-type region is selected to perform an elaborate computation in setting the transcription rate. The present approach can be generally used to map the input functions of other genes.
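
A two-input Hill-type sketch of such a cis-regulatory input function, with cAMP acting through an activator (CRP) and IPTG neutralizing a repressor (LacI). The functional form and every parameter value here are made-up illustrations, not the fitted lac model from the paper:

```python
def lac_input_function(cAMP, IPTG,
                       K_c=2.0, n_c=2.0,    # CRP activation half-point / Hill
                       K_i=0.1, n_i=2.0,    # IPTG-LacI induction
                       R=50.0, K_r=1.0,     # repressor level / binding
                       basal=0.05, v_max=1.0):
    # Activator occupancy rises with cAMP (Hill function).
    crp = cAMP**n_c / (K_c**n_c + cAMP**n_c)
    # IPTG sequesters LacI, lowering the active repressor concentration.
    active_laci = R / (1.0 + (IPTG/K_i)**n_i)
    # Promoter availability falls as active repressor binds.
    repression = 1.0 / (1.0 + active_laci/K_r)
    return basal + v_max * crp * repression
```

Evaluating this on a grid of (cAMP, IPTG) combinations reproduces the qualitative AND-like character, high output only when both inducers are present, while mutations correspond to changing the K and n parameters.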

  14. Urban scale air quality modelling using detailed traffic emissions estimates

    NASA Astrophysics Data System (ADS)

    Borrego, C.; Amorim, J. H.; Tchepel, O.; Dias, D.; Rafael, S.; Sá, E.; Pimentel, C.; Fontes, T.; Fernandes, P.; Pereira, S. R.; Bandeira, J. M.; Coelho, M. C.

    2016-04-01

    The atmospheric dispersion of NOx and PM10 was simulated with a second generation Gaussian model over a medium-size south-European city. Microscopic traffic models calibrated with GPS data were used to derive typical driving cycles for each road link, while instantaneous emissions were estimated applying a combined Vehicle Specific Power/Co-operative Programme for Monitoring and Evaluation of the Long-range Transmission of Air Pollutants in Europe (VSP/EMEP) methodology. Site-specific background concentrations were estimated using time series analysis and a low-pass filter applied to local observations. Air quality modelling results are compared against measurements at two locations for a 1 week period. 78% of the results are within a factor of two of the observations for 1-h average concentrations, increasing to 94% for daily averages. Correlation significantly improves when background is added, with an average of 0.89 for the 24 h record. The results highlight the potential of detailed traffic and instantaneous exhaust emissions estimates, together with filtered urban background, to provide accurate input data to Gaussian models applied at the urban scale.
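
The "within a factor of two" score quoted above (commonly called FAC2) can be computed directly from paired model/observation concentration series; this sketch assumes strictly positive observations:

```python
import numpy as np

def fac2(model, obs):
    # Fraction of predictions within a factor of two of the observations,
    # i.e. 0.5 <= model/obs <= 2.0 (obs must be strictly positive).
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    ratio = model / obs
    return np.mean((ratio >= 0.5) & (ratio <= 2.0))
```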

  15. Detailed Uncertainty Analysis of the ZEM-3 Measurement System

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    The measurements of Seebeck coefficient and electrical resistivity are critical to the investigation of all thermoelectric systems. Therefore, the measurement uncertainty must be well understood to report ZT values which are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multi-meter uncertainty. The uncertainty on Seebeck coefficient includes probe wire correction factors, statistical error, multi-meter uncertainty, and most importantly the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon, and provides an estimate on the uncertainty of the Seebeck coefficient. The thermoelectric power factor has been found to have an uncertainty of 9-14% at high temperature and 9% near room temperature.
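
For the product/quotient part of such a budget, independent relative uncertainties combine in quadrature; a generic sketch for the four-probe resistivity rho = V·A/(I·L), where the component terms are placeholders rather than the paper's values:

```python
import math

def resistivity_uncertainty(rel_V, rel_I, rel_A, rel_L, rel_stat=0.0):
    # Because rho = V*A/(I*L) is a pure product/quotient of the measured
    # quantities, independent relative (1-sigma) uncertainties of each
    # factor add in quadrature to give the relative uncertainty of rho.
    return math.sqrt(rel_V**2 + rel_I**2 + rel_A**2 + rel_L**2 + rel_stat**2)
```

So, for example, 3% on voltage and 4% on current alone already give 5% on resistivity, which is why the geometry tolerances are tracked so carefully in the analysis above.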

  16. Detailed Evaluation of MODIS Fire Radiative Power Measurements

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles

    2010-01-01

    Satellite remote sensing is providing us tremendous opportunities to measure the fire radiative energy (FRE) release rate or power (FRP) from open biomass burning, which affects many vegetated regions of the world on a seasonal basis. Knowledge of the biomass burning characteristics and emission source strengths of different (particulate and gaseous) smoke constituents is one of the principal ingredients upon which the assessment, modeling, and forecasting of their distribution and impacts depend. This knowledge can be gained through accurate measurement of FRP, which has been shown to have a direct relationship with the rates of biomass consumption and emissions of major smoke constituents. Over the last decade or so, FRP has been routinely measured from space by both the MODIS sensors aboard the polar orbiting Terra and Aqua satellites, and the SEVIRI sensor aboard the Meteosat Second Generation (MSG) geostationary satellite. During the last few years, FRP has been gaining recognition as an important parameter for facilitating the development of various scientific studies relating to the quantitative characterization of biomass burning and their emissions. Therefore, we are conducting a detailed analysis of the FRP products from MODIS to characterize the uncertainties associated with them, such as those due to the MODIS bow-tie effects and other factors, in order to establish their error budget for use in scientific research and applications. In this presentation, we will show preliminary results of the MODIS FRP data analysis, including comparisons with airborne measurements.
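
The FRP-to-consumption link mentioned above can be sketched by integrating FRP over time to obtain the fire radiative energy (FRE) and applying an empirical consumption factor; the coefficient below is a placeholder for illustration, not an endorsed value:

```python
import numpy as np

def biomass_consumed(times_s, frp_mw, kg_per_mj=0.37):
    # Trapezoidal integration of FRP (MW) over time (s) gives FRE in MJ
    # (MW * s = MJ); an empirical factor converts FRE to biomass burned.
    t = np.asarray(times_s, dtype=float)
    f = np.asarray(frp_mw, dtype=float)
    fre_mj = np.sum(0.5*(f[1:] + f[:-1])*np.diff(t))
    return kg_per_mj * fre_mj
```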

  17. Detailed gravity anomalies from Geos 3 satellite altimetry data

    NASA Technical Reports Server (NTRS)

    Gopalapillai, G. S.; Mourad, A. G.

    1979-01-01

    Detailed gravity anomalies are computed from a combination of Geos 3 satellite altimeter and terrestrial gravity data using least-squares principles. The mathematical model used is based on the Stokes' equation modified for a nonglobal solution. Using Geos 3 data in the calibration area, the effects of several anomaly parameter configurations and data densities/distributions on the anomalies and their accuracy estimates are studied. The accuracy estimates for 1 deg x 1 deg mean anomalies from low density altimetry data are of the order of 4 mgal. Comparison of these anomalies with the terrestrial data and also with Rapp's data derived using collocation techniques shows rms differences of 7.2 and 4.9 mgal, respectively. Indications are that the anomaly accuracies can be improved to about 2 mgal with high density data. Estimation of 30' x 30' mean anomalies indicates accuracies of the order of 5 mgal. Proper verification of these results will be possible only when accurate ground truth data become available.

  18. Detailed Facility Report Data Dictionary | ECHO | US EPA

    EPA Pesticide Factsheets

    The Detailed Facility Report Data Dictionary provides users with a list of the variables and definitions that have been incorporated into the Detailed Facility Report. The Detailed Facility Report provides a concise enforcement and compliance history for a facility.

  19. Chord, Horizontal Tie Bar & Crossbracing Joint Details; Crossbracing Center ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Chord, Horizontal Tie Bar & Crossbracing Joint Details; Crossbracing Center Joint Detail; Chord, Panel Posts, Braces & Counterbrace Joint Detail - Brownsville Covered Bridge, Spanning East Fork Whitewater River (moved to Eagle Creek Park, Indianapolis), Brownsville, Union County, IN

  20. Reference module selection criteria for accurate testing of photovoltaic (PV) panels

    SciTech Connect

    Roy, J.N.; Gariki, Govardhan Rao; Nagalakhsmi, V.

    2010-01-15

    It is shown that for accurate testing of PV panels the correct selection of reference modules is important. A detailed description of the test methodology is given. Three different types of reference modules, having different I{sub SC} (short circuit current) and power (in Wp), have been used for this study. These reference modules have been calibrated by NREL. It has been found that for accurate testing, both the I{sub SC} and the power of the reference module must be similar to or exceed those of the modules under test. If the corresponding values of the test modules fall below a particular limit, the measurements may not be accurate. The experimental results obtained have been modeled by using a simple equivalent circuit model and the associated I-V equations. (author)
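
A minimal version of the "simple equivalent circuit model and associated I-V equations" mentioned above is the ideal single-diode model (no series or shunt resistance); the parameter values here are illustrative, not the authors':

```python
import math

def module_current(V, I_ph=8.0, I_0=1e-9, n=1.3, cells=60, T=298.15):
    # Ideal single-diode model: I = I_ph - I_0*(exp(V/Vt) - 1), where the
    # thermal voltage is scaled by the ideality factor and the number of
    # series-connected cells in the module.
    k, q = 1.380649e-23, 1.602176634e-19
    Vt = n * cells * k * T / q
    return I_ph - I_0*(math.exp(V/Vt) - 1.0)

# At V = 0 the diode term vanishes, so Isc = I_ph; this is why the
# reference module's Isc is the key matching quantity in the test setup.
```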

  1. 43 CFR 2.31 - What must a submitter include in a detailed Exemption 4 objection statement?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... include a specific and detailed discussion of why the information is a trade secret or, if the information is not a trade secret, the following three categories must be addressed (unless the bureau...

  2. MAGNIFICENT DETAILS IN A DUSTY SPIRAL GALAXY

    NASA Technical Reports Server (NTRS)

    2002-01-01

    In 1995, the majestic spiral galaxy NGC 4414 was imaged by the Hubble Space Telescope as part of the HST Key Project on the Extragalactic Distance Scale. An international team of astronomers, led by Dr. Wendy Freedman of the Observatories of the Carnegie Institution of Washington, observed this galaxy on 13 different occasions over the course of two months. Images were obtained with Hubble's Wide Field Planetary Camera 2 (WFPC2) through three different color filters. Based on their discovery and careful brightness measurements of variable stars in NGC 4414, the Key Project astronomers were able to make an accurate determination of the distance to the galaxy. The resulting distance to NGC 4414, 19.1 megaparsecs or about 60 million light-years, along with similarly determined distances to other nearby galaxies, contributes to astronomers' overall knowledge of the rate of expansion of the universe. The Hubble constant (H0) is the ratio of how fast galaxies are moving away from us to their distance from us. This astronomical value is used to determine distances, sizes, and the intrinsic luminosities for many objects in our universe, and the age of the universe itself. Due to the large size of the galaxy compared to the WFPC2 detectors, only half of the galaxy observed was visible in the datasets collected by the Key Project astronomers in 1995. In 1999, the Hubble Heritage Team revisited NGC 4414 and completed its portrait by observing the other half with the same filters as were used in 1995. The end result is a stunning full-color look at the entire dusty spiral galaxy. The new Hubble picture shows that the central regions of this galaxy, as is typical of most spirals, contain primarily older, yellow and red stars. The outer spiral arms are considerably bluer due to ongoing formation of young, blue stars, the brightest of which can be seen individually at the high resolution provided by the Hubble camera. The arms are also very rich in clouds of interstellar dust

  3. Petascale self-consistent electromagnetic computations using scalable and accurate algorithms for complex structures

    NASA Astrophysics Data System (ADS)

    Cary, John R.; Abell, D.; Amundson, J.; Bruhwiler, D. L.; Busby, R.; Carlsson, J. A.; Dimitrov, D. A.; Kashdan, E.; Messmer, P.; Nieter, C.; Smithe, D. N.; Spentzouris, P.; Stoltz, P.; Trines, R. M.; Wang, H.; Werner, G. R.

    2006-09-01

    As the size and cost of particle accelerators escalate, high-performance computing plays an increasingly important role; optimization through accurate, detailed computer modeling increases performance and reduces costs. Consequently, computer simulations face enormous challenges. Early approximation methods, such as expansions in distance from the design orbit, were unable to supply detailed accurate results, such as in the computation of wake fields in complex cavities. Since the advent of message-passing supercomputers with thousands of processors, earlier approximations are no longer necessary, and it is now possible to compute wake fields, the effects of dampers, and self-consistent dynamics in cavities accurately. In this environment, the focus has shifted towards the development and implementation of algorithms that scale to large numbers of processors. So-called charge-conserving algorithms evolve the electromagnetic fields without the need for any global solves (which are difficult to scale up to many processors). Using cut-cell (or embedded) boundaries, these algorithms can simulate the fields in complex accelerator cavities with curved walls. New implicit algorithms, which are stable for any time-step, conserve charge as well, allowing faster simulation of structures with details small compared to the characteristic wavelength. These algorithmic and computational advances have been implemented in the VORPAL7 Framework, a flexible, object-oriented, massively parallel computational application that allows run-time assembly of algorithms and objects, thus composing an application on the fly.

  4. Learning accurate very fast decision trees from uncertain data streams

    NASA Astrophysics Data System (ADS)

    Liang, Chunquan; Zhang, Yang; Shi, Peng; Hu, Zhengguo

    2015-12-01

Most existing work on data stream classification assumes that the streaming data are precise and definite. This assumption, however, does not always hold in practice, since data uncertainty is ubiquitous in data stream applications due to imprecise measurement, missing values, privacy protection, etc. The goal of this paper is to learn accurate decision tree models from uncertain data streams for classification analysis. On the basis of the very fast decision tree (VFDT) algorithm, we propose an algorithm for constructing an uncertain VFDT tree with classifiers at the tree leaves (uVFDTc). The uVFDTc algorithm can exploit uncertain information effectively and efficiently in both the learning and the classification phases. In the learning phase, it uses Hoeffding bound theory to learn from uncertain data streams and yield fast and reasonable decision trees. In the classification phase, it uses uncertain naive Bayes (UNB) classifiers at the tree leaves to improve classification performance. Experimental results on both synthetic and real-life datasets demonstrate the strong ability of uVFDTc to classify uncertain data streams. The use of UNB at tree leaves improves the performance of uVFDTc, especially the any-time property, the benefit of exploiting uncertain information, and the robustness against uncertainty.
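The split test at the heart of VFDT-family learners can be sketched as follows. This is a generic Hoeffding-bound illustration, not the authors' uVFDTc implementation; the function names and gain values are hypothetical:

```python
import math

def hoeffding_bound(value_range, delta, n):
    """Epsilon such that, with probability 1 - delta, the true mean of a
    random variable with range `value_range` lies within epsilon of the
    sample mean over n observations."""
    return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

def should_split(best_gain, second_gain, value_range, delta, n):
    """VFDT-style decision: split once the observed advantage of the best
    attribute over the runner-up exceeds the Hoeffding bound."""
    return (best_gain - second_gain) > hoeffding_bound(value_range, delta, n)
```

After 1000 examples at delta = 1e-7, the bound is about 0.09, so a 0.20 gain advantage triggers a split while a 0.02 advantage does not.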

  5. Accurate Satellite-Derived Estimates of Tropospheric Ozone Radiative Forcing

    NASA Technical Reports Server (NTRS)

    Joiner, Joanna; Schoeberl, Mark R.; Vasilkov, Alexander P.; Oreopoulos, Lazaros; Platnick, Steven; Livesey, Nathaniel J.; Levelt, Pieternel F.

    2008-01-01

Estimates of the radiative forcing due to anthropogenically-produced tropospheric O3 are derived primarily from models. Here, we use tropospheric ozone and cloud data from several instruments in the A-train constellation of satellites as well as information from the GEOS-5 Data Assimilation System to accurately estimate the instantaneous radiative forcing from tropospheric O3 for January and July 2005. We improve upon previous estimates of tropospheric ozone mixing ratios from a residual approach using the NASA Earth Observing System (EOS) Aura Ozone Monitoring Instrument (OMI) and Microwave Limb Sounder (MLS) by incorporating cloud pressure information from OMI. Since we cannot distinguish between natural and anthropogenic sources with the satellite data, our estimates reflect the total forcing due to tropospheric O3. We focus specifically on the magnitude and spatial structure of the cloud effect on both the short- and long-wave radiative forcing. The estimates presented here can be used to validate present day O3 radiative forcing produced by models.

  6. Personalized Orthodontic Accurate Tooth Arrangement System with Complete Teeth Model.

    PubMed

    Cheng, Cheng; Cheng, Xiaosheng; Dai, Ning; Liu, Yi; Fan, Qilei; Hou, Yulin; Jiang, Xiaotong

    2015-09-01

Accuracy, validity, and the lack of information relating dental root to jaw position are key problems in tooth arrangement technology. This paper describes a newly developed virtual, personalized, and accurate tooth arrangement system based on complete information about the dental root and skull. Firstly, a feature constraint database of a 3D teeth model is established. Secondly, for computed simulation of tooth movement, reference planes and lines are defined from anatomical reference points; the matching mathematical model of the teeth pattern and the pose transformations of rigid bodies are fully utilized, and the positional relationship between dental root and alveolar bone is considered during the design process. Finally, the relative pose relationships among the teeth are optimized using the object mover, and a personalized therapeutic schedule is formulated. Experimental results show that the virtual tooth arrangement system arranges abnormal teeth well and is sufficiently flexible, and that the resulting root-jaw positioning is favorable. The system is characterized by high-speed processing and quantitative evaluation of the amount of 3D movement of an individual tooth.

  7. Combining heterogeneous data sources for accurate functional annotation of proteins

    PubMed Central

    2013-01-01

    Combining heterogeneous sources of data is essential for accurate prediction of protein function. The task is complicated by the fact that while sequence-based features can be readily compared across species, most other data are species-specific. In this paper, we present a multi-view extension to GOstruct, a structured-output framework for function annotation of proteins. The extended framework can learn from disparate data sources, with each data source provided to the framework in the form of a kernel. Our empirical results demonstrate that the multi-view framework is able to utilize all available information, yielding better performance than sequence-based models trained across species and models trained from collections of data within a given species. This version of GOstruct participated in the recent Critical Assessment of Functional Annotations (CAFA) challenge; since then we have significantly improved the natural language processing component of the method, which now provides performance that is on par with that provided by sequence information. The GOstruct framework is available for download at http://strut.sourceforge.net. PMID:23514123

  8. Ancillary-service details: Dynamic scheduling

    SciTech Connect

    Hirst, E.; Kirby, B.

    1997-01-01

Dynamic scheduling (DS) is the electronic transfer from one control area to another of the time-varying electricity consumption associated with a load or the time-varying electricity production associated with a generator. Although electric utilities have been using this technique for at least two decades, its use is growing in popularity and importance. This growth is a consequence of the major changes under way in US bulk-power markets, in particular efforts to unbundle generation from transmission and to increase competition among generation providers. DS can promote competition and increase choices. It allows consumers to purchase certain services from entities outside their physical-host area and it allows generators to sell certain services to entities other than their physical host. These services include regulation (following minute-to-minute variations in load) and operating reserves, among others. Such an increase in the number of possible suppliers and customers should encourage innovation and reduce the costs and prices of providing electricity services. The purpose of the project reported here was to collect and analyze data on utility experiences with DS. Chapter 2 provides additional details and examples of the definitions of DS. Chapter 3 explains why DS might be an attractive service that customers and generators, as well as transmission providers, might want to use. Chapter 4 presents some of the many current DS examples the authors uncovered in their interviews. Chapter 5 discusses the costs and cost-effectiveness of DS. Chapter 6 explains what they believe can and cannot be electronically moved from one control area to another, primarily in terms of the six ancillary services that FERC defined in Order 888. Chapter 7 discusses the need for additional research on DS.

  9. 11. Exterior detail view of northeast corner, showing stucco finish ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. Exterior detail view of northeast corner, showing stucco finish and woodwork details - American Railway Express Company Freight Building, 1060 Northeast Division Street, Bend, Deschutes County, OR

  10. site plan, floor plan, southeast and east elevations, detail showing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    site plan, floor plan, southeast and east elevations, detail showing original front entrance, interior detail showing fireplace in elevation - Neiman House, 1930 Providence Road, Charlotte, Mecklenburg County, NC

  11. Analysis of information systems for hydropower operations: Executive summary

    NASA Technical Reports Server (NTRS)

    Sohn, R. L.; Becker, L.; Estes, J.; Simonett, D.; Yeh, W.

    1976-01-01

    An analysis was performed of the operations of hydropower systems, with emphasis on water resource management, to determine how aerospace derived information system technologies can effectively increase energy output. Better utilization of water resources was sought through improved reservoir inflow forecasting based on use of hydrometeorologic information systems with new or improved sensors, satellite data relay systems, and use of advanced scheduling techniques for water release. Specific mechanisms for increased energy output were determined, principally the use of more timely and accurate short term (0-7 days) inflow information to reduce spillage caused by unanticipated dynamic high inflow events. The hydrometeorologic models used in predicting inflows were examined in detail to determine the sensitivity of inflow prediction accuracy to the many variables employed in the models, and the results were used to establish information system requirements. Sensor and data handling system capabilities were reviewed and compared to the requirements, and an improved information system concept was outlined.

  12. 77 FR 3800 - Accurate NDE & Inspection, LLC; Confirmatory Order

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-25

    ... COMMISSION Accurate NDE & Inspection, LLC; Confirmatory Order In the Matter of Accurate NDE & Docket: 150... request ADR with the NRC in an attempt to resolve issues associated with this matter. In response, on August 9, 2011, Accurate NDE requested ADR to resolve this matter with the NRC. On September 28,...

  13. Retinal Connectomics: Towards Complete, Accurate Networks

    PubMed Central

    Marc, Robert E.; Jones, Bryan W.; Watt, Carl B.; Anderson, James R.; Sigulinsky, Crystal; Lauritzen, Scott

    2013-01-01

Connectomics is a strategy for mapping complex neural networks based on high-speed automated electron optical imaging, computational assembly of neural data volumes, web-based navigational tools to explore 10^12–10^15 byte (terabyte to petabyte) image volumes, and annotation and markup tools to convert images into rich networks with cellular metadata. These collections of network data and associated metadata, analyzed using tools from graph theory and classification theory, can be merged with classical systems theory, giving a more completely parameterized view of how biologic information processing systems are implemented in retina and brain. Networks have two separable features: topology and connection attributes. The first findings from connectomics strongly validate the idea that the topologies of complete retinal networks are far more complex than the simple schematics that emerged from classical anatomy. In particular, connectomics has permitted an aggressive refactoring of the retinal inner plexiform layer, demonstrating that network function cannot be simply inferred from stratification; exposing the complex geometric rules for inserting different cells into a shared network; revealing unexpected bidirectional signaling pathways between mammalian rod and cone systems; documenting selective feedforward systems, novel candidate signaling architectures, new coupling motifs, and the highly complex architecture of the mammalian AII amacrine cell. This is but the beginning, as the underlying principles of connectomics are readily transferrable to non-neural cell complexes and provide new contexts for assessing intercellular communication. PMID:24016532

  14. Accurate attitude determination of the LACE satellite

    NASA Technical Reports Server (NTRS)

    Miglin, M. F.; Campion, R. E.; Lemos, P. J.; Tran, T.

    1993-01-01

The Low-power Atmospheric Compensation Experiment (LACE) satellite, launched in February 1990 by the Naval Research Laboratory, uses a magnetic damper on a gravity gradient boom and a momentum wheel with its axis perpendicular to the plane of the orbit to stabilize and maintain its attitude. Satellite attitude is determined using three types of sensors: a conical Earth scanner, a set of sun sensors, and a magnetometer. The Ultraviolet Plume Instrument (UVPI), on board LACE, consists of two intensified CCD cameras and a gimbaled pointing mirror. The primary purpose of the UVPI is to image rocket plumes from space in the ultraviolet and visible wavelengths. Secondary objectives include imaging stars, atmospheric phenomena, and ground targets. The problem facing the UVPI experimenters is that the sensitivity of the LACE satellite attitude sensors is not always adequate to correctly point the UVPI cameras. Our solution is to point the UVPI cameras at known targets and use the information thus gained to improve attitude measurements. This paper describes the three methods developed to determine improved attitude values using the UVPI for both real-time operations and post-observation analysis.

  15. Detailed Kinetic Modeling of Gasoline Surrogate Mixtures

    SciTech Connect

    Mehl, M; Curran, H J; Pitz, W J; Westbrook, C K

    2009-03-09

    Real fuels are complex mixtures of thousands of hydrocarbon compounds including linear and branched paraffins, naphthenes, olefins and aromatics. It is generally agreed that their behavior can be effectively reproduced by simpler fuel surrogates containing a limited number of components. In this work, a recently revised version of the kinetic model by the authors is used to analyze the combustion behavior of several components relevant to gasoline surrogate formulation. Particular attention is devoted to linear and branched saturated hydrocarbons (PRF mixtures), olefins (1-hexene) and aromatics (toluene). Model predictions for pure components, binary mixtures and multi-component gasoline surrogates are compared with recent experimental information collected in rapid compression machine, shock tube and jet stirred reactors covering a wide range of conditions pertinent to internal combustion engines. Simulation results are discussed focusing attention on the mixing effects of the fuel components.

  16. Defense Infrastructure: More Accurate Data Would Allow DOD to Improve the Tracking, Management, and Security of Its Leased Facilities

    DTIC Science & Technology

    2016-03-01

DEFENSE INFRASTRUCTURE: More Accurate Data Would Allow DOD to Improve the Tracking, Management, and Security of Its Leased Facilities. Why GAO Did This Study: Overreliance on... data were sufficiently reliable to provide a basis for managing leases and externally reporting information on leases. We drew a statistical random...

  17. On the accurate simulation of tsunami wave propagation

    NASA Astrophysics Data System (ADS)

    Castro, C. E.; Käser, M.; Toro, E. F.

    2009-04-01

A very important part of any tsunami early warning system is the numerical simulation of the wave propagation in the open sea and close to geometrically complex coastlines, respecting bathymetric variations. Here we are interested in improving the numerical tools available to accurately simulate tsunami wave propagation on a Mediterranean basin scale. To this end, we need to accomplish several targets: high-order numerical simulation in space and time, preservation of steady-state conditions to avoid spurious oscillations, and description of complex geometries due to bathymetry and coastlines. We use the Arbitrary accuracy DERivatives Riemann problem method together with the Finite Volume method (ADER-FV) over non-structured triangular meshes. The novelty of this method is the improvement of the ADER-FV scheme, introducing the well-balanced property when geometrical sources are considered for unstructured meshes and arbitrary high-order accuracy. In a previous work, Castro and Toro [1] mention that ADER-FV schemes approach the well-balanced condition asymptotically, which was true for the test case mentioned in [1]. However, new evidence [2] shows that for real-scale problems such as the Mediterranean basin, with realistic bathymetry such as ETOPO-2 [3], this asymptotic behavior is not enough. Under these realistic conditions the standard ADER-FV scheme fails to accurately describe the propagation of gravity waves without being contaminated by spurious oscillations, also known as numerical waves. The main problem is that at the discrete level, i.e. from a numerical point of view, the scheme does not correctly balance the influence of the fluxes and the sources. Numerical schemes that retain this balance are said to satisfy the well-balanced property or the exact C-property. This imbalance decreases as we refine the spatial discretization or increase the order of the numerical method; however, the computational cost increases considerably this way.
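The well-balanced (exact C-) property can be checked on the simplest steady state, the "lake at rest" (h + b = constant, u = 0). The sketch below is a hedged first-order illustration using hydrostatic reconstruction, not the ADER-FV scheme itself; it verifies that the interface flux differences cancel the bed-slope source term exactly in every interior cell:

```python
import math

G = 9.81  # gravitational acceleration [m/s^2]

def lake_at_rest_residual(b, eta=1.0):
    """Momentum residual of a 1D shallow-water finite-volume step on the
    lake-at-rest state h + b = eta, u = 0, using hydrostatic reconstruction
    of the interface depths.  A well-balanced scheme returns (machine) zero."""
    n = len(b)
    # interface depth reconstructed about max(b_i, b_{i+1})
    h_if = [max(0.0, eta - max(b[i], b[i + 1])) for i in range(n - 1)]
    flux = [0.5 * G * h * h for h in h_if]          # pressure flux only (u = 0)
    res = []
    for i in range(1, n - 1):
        div = flux[i] - flux[i - 1]                           # flux difference
        src = 0.5 * G * (h_if[i] ** 2 - h_if[i - 1] ** 2)     # bed-slope source
        res.append(src - div)
    return max(abs(r) for r in res)
```

A naive centered discretization of the source term -g h db/dx would leave an O(dx) residual here, which is the "numerical wave" contamination described above.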

  18. How accurately can the peak skin dose in fluoroscopy be determined using indirect dose metrics?

    SciTech Connect

    Jones, A. Kyle; Ensor, Joe E.; Pasciak, Alexander S.

    2014-07-15

Purpose: Skin dosimetry is important for fluoroscopically-guided interventions, as peak skin doses (PSD) that result in skin reactions can be reached during these procedures. There is no consensus as to whether or not indirect skin dosimetry is sufficiently accurate for fluoroscopically-guided interventions. However, measuring PSD with film is difficult and the decision to do so must be made a priori. The purpose of this study was to assess the accuracy of different types of indirect dose estimates and to determine if PSD can be calculated within ±50% using indirect dose metrics for embolization procedures. Methods: PSD were measured directly using radiochromic film for 41 consecutive embolization procedures at two sites. Indirect dose metrics from the procedures were collected, including reference air kerma. Four different estimates of PSD were calculated from the indirect dose metrics and compared along with reference air kerma to the measured PSD for each case. The four indirect estimates included a standard calculation method, the use of detailed information from the radiation dose structured report, and two simplified calculation methods based on the standard method. Indirect dosimetry results were compared with direct measurements, including an analysis of uncertainty associated with film dosimetry. Factors affecting the accuracy of the different indirect estimates were examined. Results: When using the standard calculation method, calculated PSD were within ±35% for all 41 procedures studied. Calculated PSD were within ±50% for a simplified method using a single source-to-patient distance for all calculations. Reference air kerma was within ±50% for all but one procedure. Cases for which reference air kerma or calculated PSD exhibited large (±35%) differences from the measured PSD were analyzed, and two main causative factors were identified: unusually small or large source-to-patient distances and large contributions to reference air kerma from cone
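A minimal sketch of the kind of indirect estimate being evaluated, assuming a fixed source-to-skin geometry: scale the displayed reference air kerma from the interventional reference point to the skin plane via the inverse square law, then apply correction factors. All numeric factor values below (backscatter, air-kerma-to-tissue conversion, table transmission) are illustrative placeholders, not the study's values:

```python
def peak_skin_dose_estimate(ref_air_kerma_mgy, d_ref_cm=65.0, d_skin_cm=60.0,
                            backscatter=1.35, kerma_to_dose=1.06,
                            table_transmission=0.87):
    """Indirect PSD estimate from reference air kerma: inverse-square
    correction from the reference point distance to the actual skin
    distance, then backscatter, air-kerma-to-tissue-dose, and table/pad
    transmission factors (hypothetical round numbers)."""
    inverse_square = (d_ref_cm / d_skin_cm) ** 2
    return (ref_air_kerma_mgy * inverse_square * backscatter
            * kerma_to_dose * table_transmission)
```

Note how sensitive the estimate is to the assumed skin distance, which matches the study's finding that unusually small or large source-to-patient distances drive the largest errors.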

  19. Detailed Burnup Calculations for Testing Nuclear Data

    SciTech Connect

    Leszczynski, F.

    2005-05-24

A general method (MCQ) has been developed by introducing a microscopic burnup scheme that uses the Monte Carlo calculated fluxes and microscopic reaction rates of a complex system and a depletion code as a basis for solving the nuclide material balance equations for each spatial region into which the system is divided. Continuous energy-dependent cross-section libraries and full 3D geometry of the system can be input for the calculations. The resulting predictions for the system at successive burnup time steps are thus based on a calculation route in which both geometry and cross sections are accurately represented, without geometry simplifications and with continuous energy data, providing an independent approach for benchmarking other methods and nuclear data of actinides, fission products, and other burnable absorbers. The main advantage of this method over the classical deterministic methods currently used is that the MCQ system is a direct 3D method without the limitations and errors introduced by the homogenization of geometry and condensation of energy in deterministic methods. The Monte Carlo and burnup codes adopted until now are the widely used MCNP and ORIGEN codes, but other codes can be used as well. This method requires a well-established set of nuclear data for the isotopes involved in burnup chains, including burnable poisons, fission products, and actinides. To fix the data to be included in this set, a study of the present status of nuclear data was performed as part of the development of the MCQ method. The study begins with a review of the available cross-section data for isotopes involved in burnup chains for power and research nuclear reactors. The main data needs for burnup calculations are neutron cross sections, decay constants, branching ratios, fission energy, and yields. The present work includes results of selected experimental benchmarks and conclusions about the sensitivity of different sets of cross
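For a two-member chain, the nuclide material balance equations solved per region reduce to the classic Bateman form. The sketch below is generic depletion physics, not the MCQ implementation; in a real burnup code the one-group reaction rates (flux times cross section) would be folded into the effective removal constants:

```python
import math

def bateman_two_step(n0, lam_a, lam_b, t):
    """Analytic Bateman solution for a chain A -> B -> (removed) with a pure
    initial inventory N_A(0) = n0.  lam_a and lam_b are effective removal
    constants (decay plus transmutation).  Returns (N_A(t), N_B(t))."""
    na = n0 * math.exp(-lam_a * t)
    nb = (n0 * lam_a / (lam_b - lam_a)
          * (math.exp(-lam_a * t) - math.exp(-lam_b * t)))
    return na, nb
```

Stepping such solutions per region between Monte Carlo flux solves is exactly the coupled MC/depletion pattern the MCQ scheme automates for full chains.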

  20. A Detailed Geomorphological Sketch Map of Titan's Afekan Crater Region

    NASA Astrophysics Data System (ADS)

    Schoenfeld, A.; Malaska, M. J.; Lopes, R. M. C.; Le Gall, A. A.; Birch, S. P.; Hayes, A.

    2014-12-01

Due to Titan's uniquely thick atmosphere and organic haze layers, the most detailed images (with resolution of 300 meters per pixel) of the Saturnian moon's surface are the Synthetic Aperture Radar (SAR) images taken by Cassini's RADAR instrument. Using the SAR data, we have been assembling detailed geomorphological sketch maps of various Titan regions in an effort to piece together its geologic history. We initially examined the Afekan region of Titan because of its extensive SAR coverage. Features in the Afekan region fall into the categories (in order of geologic age, extrapolated from their relative emplacement) of hummocky, labyrinthic, plains, and dunes. During our mapping effort, we also divided each terrain category into several subclasses at the local level. Our map offers a chance to present and analyze the distribution, relationships, and potential formation hypotheses of the different terrains. In bulk, we find evidence for both aeolian and fluvial processes. A particularly important unit found in the Afekan region is the one designated "undifferentiated plains", or the "Blandlands" of Titan, a mid-latitude terrain unit comprising 25% of the moon's surface. The undifferentiated plains are notable for their relative featurelessness in radar and infrared. Our interpretation is that they are a fill unit in and around Afekan crater and other hummocky/mountainous units. The plains suggest that Titan's geomorphology is tied to ongoing erosional forces and sediment deposition. Other datasets used in characterizing Titan's geomorphological units include information obtained from radiometry, infrared (ISS), and spectrometry (VIMS). We will present the detailed geomorphological sketch map with all terrain units assigned and labeled.

  1. Strengthening of competence planning truss through instructional media development details

    NASA Astrophysics Data System (ADS)

    Handayani, Sri; Nurcahyono, M. Hadi

    2017-03-01

Competency-based learning is a model in which planning, implementation, and assessment all refer to the mastery of competencies, and lectures are conducted within this framework to comprehensively realize student competency. A competency orientation means that classroom learning activities must push students to learn more actively: to search for information themselves, to explore alone or with friends in pairs or groups, and to draw on a variety of learning resources, including printed materials, electronic media, and the environment. An analysis of instruction on wooden structures revealed a weakness in students' understanding of truss details, hence the need for media that give a clear picture of the structure of wooden trusses and their connection details. Development of the instructional media consisted of three phases: planning, production, and assessment. Media planning was tailored to the needs and conditions required to reinforce mastery of competencies, through a table of material needs. Production was carried out using hardware and software that support the creation of the learning media. Assessment of the media product included feasibility studies by subject-matter experts and media experts, while testing measured students' perceptions of the product. Analysis of the materials for the instructional aspects yielded 100% (very good), media analysis for the design aspects was rated very good at 88.93%, and the analysis of student perceptions was rated very good at 84.84%. The Truss Details learning media are feasible and can be used in teaching wooden structures to build capacity in truss planning.

  2. Development and Validation of a Multidisciplinary Tool for Accurate and Efficient Rotorcraft Noise Prediction (MUTE)

    NASA Technical Reports Server (NTRS)

    Liu, Yi; Anusonti-Inthra, Phuriwat; Diskin, Boris

    2011-01-01

A physics-based, systematically coupled, multidisciplinary prediction tool (MUTE) for rotorcraft noise was developed and validated with a wide range of flight configurations and conditions. MUTE is an aggregation of multidisciplinary computational tools that accurately and efficiently model the physics of the source of rotorcraft noise, and predict the noise at far-field observer locations. It uses systematic coupling approaches among multiple disciplines including Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and high fidelity acoustics. Within MUTE, advanced high-order CFD tools are used around the rotor blade to predict the transonic flow (shock wave) effects, which generate the high-speed impulsive noise. Predictions of the blade-vortex interaction noise in low speed flight are also improved by using the Particle Vortex Transport Method (PVTM), which preserves the wake flow details required for blade/wake and fuselage/wake interactions. The accuracy of the source noise prediction is further improved by utilizing a coupling approach between CFD and CSD, so that the effects of key structural dynamics, elastic blade deformations, and trim solutions are correctly represented in the analysis. The blade loading information and/or the flow field parameters around the rotor blade predicted by the CFD/CSD coupling approach are used to predict the acoustic signatures at far-field observer locations with a high-fidelity noise propagation code (WOPWOP3). The predicted results from the MUTE tool for rotor blade aerodynamic loading and far-field acoustic signatures are compared and validated against a variety of experimental data sets, such as UH60-A data, DNW test data and HART II test data.

  3. Accurate patient dosimetry of kilovoltage cone-beam CT in radiation therapy

    SciTech Connect

    Ding, George X.; Duggan, Dennis M.; Coffey, Charles W.

    2008-03-15

The increased utilization of x-ray imaging in image-guided radiotherapy has dramatically improved the radiation treatment and the lives of cancer patients. Daily imaging procedures, such as cone-beam computed tomography (CBCT), for patient setup may significantly increase the dose to the patient's normal tissues. This study investigates the dosimetry from a kilovoltage (kV) CBCT for real patient geometries. Monte Carlo simulations were used to study the kV beams from a Varian on-board imager integrated into the Trilogy accelerator. The Monte Carlo calculated results were benchmarked against measurements and good agreement was obtained. The authors developed a novel method to calibrate Monte Carlo simulated beams with measurements using an ionization chamber in which the air-kerma calibration factors are obtained from an Accredited Dosimetry Calibration Laboratory. The authors have introduced a new Monte Carlo calibration factor, f_MCcal, which is determined from the calibration procedure. The accuracy of the new method was validated by experiment. When a Monte Carlo simulated beam has been calibrated, the simulated beam can be used to accurately predict absolute dose distributions in the irradiated media. Using this method the authors calculated dose distributions to patient anatomies from a typical CBCT acquisition for different treatment sites, such as head and neck, lung, and pelvis. Their results have shown that, from a typical head and neck CBCT, doses to soft tissues, such as eye, spinal cord, and brain can be up to 8, 6, and 5 cGy, respectively. The dose to the bone, due to the photoelectric effect, can be as much as 25 cGy, about three times the dose to the soft tissue. The study provides detailed information on the additional doses to the normal tissues of a patient from a typical kV CBCT acquisition. The methodology of the Monte Carlo beam calibration developed and introduced in this study allows the user to calculate both relative and absolute
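The calibration idea can be sketched generically: an ion-chamber measurement under the simulated beam fixes the number of real source histories per acquisition, which converts any per-history Monte Carlo dose into absolute dose. This is a hedged illustration in the spirit of the described procedure, not the authors' f_MCcal definition; all numbers are hypothetical:

```python
def calibration_factor(measured_air_kerma_mgy, simulated_kerma_per_history_mgy):
    """Effective histories per acquisition: ratio of the chamber measurement
    (traceable to an Accredited Dosimetry Calibration Laboratory) to the
    Monte Carlo kerma scored per source history at the chamber position."""
    return measured_air_kerma_mgy / simulated_kerma_per_history_mgy

def absolute_dose_map(per_history_doses_mgy, f_cal):
    """Convert simulated per-history doses anywhere in the phantom into
    absolute doses for the acquisition."""
    return [d * f_cal for d in per_history_doses_mgy]
```

Once the factor is fixed at one reference point, the same scaling applies to every voxel of the simulated dose distribution, which is what makes absolute patient dose maps possible.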

  4. Emerging role of three-dimensional speckle tracking strain for accurate quantification of left ventricular dyssynchrony.

    PubMed

    Tanaka, Hidekazu; Tatsumi, Kazuhiro; Matsumoto, Kensuke; Kawai, Hiroya; Hirata, Ken-ichi

    2013-10-01

The case was a 53-year-old female with dilated-phase hypertrophic cardiomyopathy. She was classified as New York Heart Association functional class III heart failure despite receiving optimal medical therapy. The electrocardiogram showed intraventricular conduction delay with a QRS width of 194 msec. The left ventricular (LV) end-diastolic and systolic volumes and ejection fraction (EF) were 101 mL, 68 mL, and 32%, respectively. The patient showed no significant mechanical LV dyssynchrony as evidenced by two-dimensional (2D) speckle tracking radial strain, defined as the time difference between the anterior-septum and posterior wall, of 105 msec (<130 msec). Three-dimensional (3D) speckle tracking radial strain was then performed for a more detailed analysis of LV mechanical dyssynchrony. An especially important finding of the 3D speckle tracking radial strain analysis was that the average time-to-peak strain of 5 septal segments at 3 different LV levels (basal-anterior-septum, basal-septum, mid-anterior-septum, mid-septum, apical-septum) was significantly shorter than that of 5 posterolateral segments at 3 different LV levels (basal-posterior, basal-lateral, mid-posterior, mid-lateral, apical-lateral). The time difference between the septum and posterolateral wall was 216 msec (204 msec vs. 420 msec), indicating significant LV mechanical dyssynchrony (≥130 msec). Twelve months after cardiac resynchronization therapy (CRT), EF had improved to 47%, while end-systolic and diastolic volumes had decreased to 88 mL and 47 mL, respectively, so that the patient was classified as a responder. In conclusion, the newly developed 3D speckle tracking strain analysis can provide a comprehensive evaluation of "true" LV mechanical dyssynchrony from pyramidal 3D data sets acquired in the same beat, thus yielding more accurate information than previously possible with the 2D speckle tracking system.
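The dyssynchrony criterion applied in the case can be written down directly. A trivial sketch using the reported wall averages; the 130-msec cutoff and the 204/420-msec values come from the text, while the function name and list inputs are illustrative:

```python
def septal_posterolateral_delay(septal_ttp_ms, posterolateral_ttp_ms,
                                threshold_ms=130):
    """Average the time-to-peak radial strain over each wall's segments and
    return (delay, significant): a septal-to-posterolateral delay at or
    above the threshold indicates significant LV mechanical dyssynchrony."""
    def mean(xs):
        return sum(xs) / len(xs)
    delay = mean(posterolateral_ttp_ms) - mean(septal_ttp_ms)
    return delay, delay >= threshold_ms
```

With the case's wall averages (204 msec septal, 420 msec posterolateral), the delay is 216 msec, above the 130-msec threshold.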

  5. [Approach to academic detailing as a hospital pharmacist].

    PubMed

    Nishikori, Atsumi

    2014-01-01

    In 2012, a new medical fee system was introduced for the clinical activities of hospital pharmacists responsible for in-patient pharmacotherapy monitoring in medical institutions in Japan. The new medical system demands greater efforts to provide the most suitable and safest medicine for each patient. By applying the concept of academic detailing to clinical pharmacists' roles in hospitals, I present drug use evaluation in three disease states (peptic ulcer, insomnia, and osteoporosis). To analyze these from multiple aspects, we not only need knowledge of drug monographs (clinical and adverse drug effects), but also the ability to evaluate a patient's adherence and cost-effectiveness. If we combine the idea of academic detailing with a clinical pharmacist's role, it is necessary to strengthen drug information skills, such as guideline or literature search skills and journal evaluation. Simultaneously, it is important to introduce new pharmaceutical education curricula regarding evidence-based medicine (EBM), pharmacoeconomics, and professional communication in order to explore pharmacists' roles in the future.

  6. A detailed spectroscopic study of an Italian fresco

    SciTech Connect

    Barilaro, Donatella; Crupi, Vincenza; Majolino, Domenico; Barone, Germana; Ponterio, Rosina

    2005-02-15

    In the present work we characterized samples of plasters and pictorial layers taken from a fresco in the Acireale Cathedral. The fresco represents the Coronation of Saint Venera, patron saint of this Ionian town. By performing a detailed spectroscopic analysis of the plaster preparation layer by Fourier-transform infrared (FTIR) spectroscopy and x-ray diffraction (XRD), and of the painting layer by FTIR and confocal Raman microspectroscopy, scanning electron microscopy + energy-dispersive x-ray spectroscopy, and XRD, we were able to identify the pigments and the binders present. In particular, Raman investigation was crucial to the characterization of the pigments thanks to the high resolution of the confocal apparatus used. It is worth stressing that the simultaneous use of complementary techniques was able to provide more complete information for the conservation of the artifact we studied.

  7. Detailed H I kinematics of Tully-Fisher calibrator galaxies

    NASA Astrophysics Data System (ADS)

    Ponomareva, Anastasia A.; Verheijen, Marc A. W.; Bosma, Albert

    2016-12-01

    We present spatially resolved H I kinematics of 32 spiral galaxies which have Cepheid and/or tip of the red giant branch distances, and define a calibrator sample for the Tully-Fisher relation. The interferometric H I data for this sample were collected from available archives and supplemented with new Giant Metrewave Radio Telescope observations. This paper describes a uniform analysis of the H I kinematics of this inhomogeneous data set. Our main result is an atlas for our calibrator sample that presents global H I profiles, integrated H I column-density maps, H I surface-density profiles and, most importantly, detailed kinematic information in the form of high-quality rotation curves derived from highly resolved, two-dimensional velocity fields and position-velocity diagrams.

  8. Infrared image detail enhancement approach based on improved joint bilateral filter

    NASA Astrophysics Data System (ADS)

    Liu, Ning; Chen, Xiaohong

    2016-07-01

    In this paper, we propose a new infrared image detail enhancement approach. The approach not only enhances image detail, but also brings the processed image much closer to the real scene. Inspired by the joint bilateral filter, two adjacent images are utilized to calculate the kernel functions in order to distinguish the detail information from the raw image. We also design a new kernel function that modifies the joint bilateral filter and eliminates the gradient reversal artifacts caused by non-linear filtering. The new kernel is based on an adaptive merge coefficient that determines the detail layer. The detail information is then modified by the adaptive merge coefficient, along with two key parameters, to realize the detail enhancement. Finally, we combine the processed detail layer with the base layer and compress the high-dynamic-range image into a monitor-suited low dynamic range to achieve a better visual effect. Numerical evaluation shows that this technique outperforms previous detail-enhancement research. Figures and data flowcharts are presented in the paper.
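The base/detail decomposition underlying approaches of this kind can be sketched with a generic bilateral filter: the filtered image serves as the base layer, the residual as the detail layer, and the detail is amplified before recombination. The paper's specific joint kernel and adaptive merge coefficient are not reproduced here; the fixed gain and filter settings below are illustrative assumptions.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=25.0):
    """Naive bilateral filter: spatial Gaussian weight times range Gaussian weight."""
    h, w = img.shape
    out = np.empty_like(img, dtype=float)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
    padded = np.pad(img.astype(float), radius, mode='edge')
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rng = np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            weights = spatial * rng
            out[i, j] = (weights * patch).sum() / weights.sum()
    return out

def enhance_detail(img, gain=2.0):
    """Split the image into base + detail layers and amplify the detail layer."""
    base = bilateral_filter(img)
    detail = img - base
    return base + gain * detail

img = np.outer(np.linspace(0.0, 100.0, 16), np.ones(16))  # smooth ramp test image
img[8:, :] += 50.0                                        # add a sharp edge
enhanced = enhance_detail(img)
```

A real implementation would replace `bilateral_filter` with the modified joint bilateral filter computed from two adjacent frames, and use an adaptive per-pixel merge coefficient instead of the fixed `gain`.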

  9. The North West Adelaide Health Study: detailed methods and baseline segmentation of a cohort for selected chronic diseases.

    PubMed

    Grant, Janet F; Chittleborough, Catherine R; Taylor, Anne W; Dal Grande, Eleonora; Wilson, David H; Phillips, Patrick J; Adams, Robert J; Cheek, Julianne; Price, Kay; Gill, Tiffany; Ruffin, Richard E

    2006-04-12

    The North West Adelaide Health Study is a population-based biomedical cohort study investigating the prevalence of a number of chronic conditions and health-related risk factors along a continuum. This methodology may assist with evidence-based decisions for health policy makers and planners, and inform health professionals involved in chronic disease prevention and management, by providing a better description of people at risk of developing, or already diagnosed with, selected chronic conditions, enabling more accurate targeting of groups for health gain and improved health outcomes. Longitudinal data will provide information on the progression of chronic conditions and allow description of those who move forward and back along the continuum over time. Detailed methods are provided regarding the random recruitment and examination of a representative sample of participants (n = 4060), including the rationale for various processes and valuable lessons learnt. Self-reported and biomedical data were obtained on risk factors (smoking, alcohol consumption, physical activity, family history, body mass index, blood pressure, cholesterol) and chronic conditions (asthma, chronic obstructive pulmonary disease, diabetes) to classify participants according to their status along a continuum. Segmenting this population sample along a continuum showed that 71.5% had at least one risk factor for developing asthma, chronic obstructive pulmonary disease or diabetes. Almost one-fifth (18.8%) had been previously diagnosed with at least one of these chronic conditions, and an additional 3.9% had at least one of these conditions but had not been diagnosed. This paper provides a novel opportunity to examine how a cohort study was born. It presents the detailed methodology behind the selection, recruitment and examination of a cohort, and shows how participants with selected chronic conditions can be segmented along a continuum that may assist with health promotion and health services planning.

  10. Panel Post & Diagonal Brace Joint Detail; Crossbracing Center Joint ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Panel Post & Diagonal Brace Joint Detail; Crossbracing Center Joint Detail; Chord, Panel Post, Tie Bar, & Diagonal Brace Joint Detail; Chord, Tie Bar, & Crossbracing Joint Detail - Medora Bridge, Spanning East Fork of White River at State Route 235, Medora, Jackson County, IN

  11. Evaluation of Sensitivity and Robustness of Geothermal Resource Parameters Using Detailed and Approximate Stratigraphy

    NASA Astrophysics Data System (ADS)

    Whealton, C.; Jordan, T. E.; Frone, Z. S.; Smith, J. D.; Horowitz, F. G.; Stedinger, J. R.

    2015-12-01

    Accurate assessment of the spatial variation of geothermal heat is key to distinguishing among locations for geothermal project development. Resource assessment over large areas can be accelerated by using existing subsurface data collected for other purposes, such as petroleum industry bottom-hole temperature (BHT) datasets. BHT data are notoriously noisy, but in many sedimentary basins their abundance offsets the potential low quality of an individual BHT measurement. Analysis requires a description of the conductivity stratigraphy, which for thousands of wells with BHT values is daunting. For regional assessment, a streamlined method is to approximate the thickness and conductivity of each formation using a set of standard columns rescaled to the sediment thickness at a location. Surface heat flow and related geothermal resource metrics are estimated from these and additional parameters. This study uses Monte Carlo techniques to compare the accuracy and precision of thermal predictions at single locations by the streamlined approach against well-specific conductivity stratigraphy. For 77 wells distributed across the Appalachian Basin of NY, PA, and WV, local geological experts made available detailed information on unit thicknesses. For the streamlined method we used the Correlation of Stratigraphic Units of North America (COSUNA) columns. For both data sets, we described the thermal conductivity of the strata using generic values or values from the geologically similar Anadarko Basin. The well-specific surface heat flow and temperature-at-depth were evaluated using a one-dimensional conductive heat flow model. This research addresses the sensitivity of the estimated geothermal output to the model inputs (BHT, thermal conductivity) and the robustness of the approximate stratigraphic column assumptions when estimating the geothermal output. This research was conducted as part of the Dept. of Energy Geothermal Play Fairway Analysis program.
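The one-dimensional conductive heat flow model mentioned above amounts to summing the temperature increase q·Δz/k across each layer of the conductivity stratigraphy. A minimal sketch, with an entirely illustrative stratigraphic column (not the COSUNA data) and illustrative heat flow and surface temperature values:

```python
def temperature_at_depth(t_surface_c, heat_flow_w_m2, layers):
    """Steady-state 1-D conduction: temperature rises by q * dz / k across
    each layer, given thickness dz (m) and thermal conductivity k (W/m/K)."""
    t = t_surface_c
    for thickness_m, conductivity in layers:
        t += heat_flow_w_m2 * thickness_m / conductivity
    return t

# Hypothetical column: (thickness in m, conductivity in W/m/K) per formation.
column = [(1000.0, 2.5), (1500.0, 3.0), (500.0, 2.0)]
t_bottom = temperature_at_depth(10.0, 0.06, column)  # 10 °C surface, 60 mW/m²
```

In a Monte Carlo sensitivity study, the BHT-derived heat flow and the layer conductivities would be sampled from their uncertainty distributions and this calculation repeated.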

  12. Spurious Consensus and Opinion Revision: Why Might People Be More Confident in Their Less Accurate Judgments?

    ERIC Educational Resources Information Center

    Yaniv, Ilan; Choshen-Hillel, Shoham; Milyavsky, Maxim

    2009-01-01

    In the interest of improving their decision making, individuals revise their opinions on the basis of samples of opinions obtained from others. However, such a revision process may lead decision makers to experience greater confidence in their less accurate judgments. The authors theorize that people tend to underestimate the informative value of…

  13. Analysis of Dynamic Interactions between Different Drivetrain Components with a Detailed Wind Turbine Model

    NASA Astrophysics Data System (ADS)

    Bartschat, A.; Morisse, M.; Mertens, A.; Wenske, J.

    2016-09-01

    The presented work describes a detailed analysis of the dynamic interactions among mechanical and electrical drivetrain components of a modern wind turbine under the influence of parameter variations, different control mechanisms and transient excitations. For this study, a detailed model of a 2 MW wind turbine with a gearbox, a permanent magnet synchronous generator and a full power converter has been developed which considers all relevant characteristics of the mechanical and electrical subsystems. This model includes an accurate representation of the aerodynamics and the mechanical properties of the rotor and the complete mechanical drivetrain. Furthermore, detailed electrical modelling of the generator, the full scale power converter with discrete switching devices, its filters, the transformer and the grid as well as the control structure is considered. The analysis shows that, considering control measures based on active torsional damping, interactions between mechanical and electrical subsystems can significantly affect the loads and thus the individual lifetime of the components.

  14. Efficient eco-friendly inverted quantum dot sensitized solar cells† †Electronic supplementary information (ESI) available: TEM images of QDs, XPS spectra, UV-vis and PL spectra of the sensitized electrodes, details about photophysical characterization and IPCE spectra interpretation. See DOI: 10.1039/c5ta06769c

    PubMed Central

    Park, Jinhyung; Sajjad, Muhammad T.; Jouneau, Pierre-Henri; Ruseckas, Arvydas; Faure-Vincent, Jérôme; Reiss, Peter

    2016-01-01

    Recent progress in quantum dot (QD) sensitized solar cells has demonstrated the possibility of low-cost and efficient photovoltaics. However, the standard device structure based on n-type materials often suffers from a slow hole injection rate, which may lead to unbalanced charge transport. We have fabricated efficient p-type (inverted) QD sensitized cells, which combine the advantages of conventional QD cells with p-type dye sensitized configurations. Moreover, p-type QD sensitized cells can be used in highly promising tandem configurations with n-type ones. QDs without toxic Cd and Pb elements and with improved absorption and stability were successfully deposited onto a mesoporous NiO electrode, showing good coverage and penetration according to morphological analysis. Detailed photophysical charge transfer studies showed that the high hole injection rates (10^8 s^-1) observed in such systems are comparable with electron injection in conventional n-type QD assemblies. Inverted solar cells fabricated with various QDs demonstrate excellent power conversion efficiencies of up to 1.25%, which is 4 times higher than the best values for previous inverted QD sensitized cells. Attempts to passivate the surface of the QDs show that traditional methods of reducing recombination in QD sensitized cells are not applicable to the inverted architectures. PMID:27478616

  15. Quantum Information Science

    DTIC Science & Technology

    2012-02-01

    for constructing quantum gates. In [Miller11b] we detailed the use of multiplexing to simulate quantum teleportation. One alternative to multiplexing... (Final Technical Report, Information Directorate, Rome, NY, February 2012; dates covered: Oct 2009 – Sep 2011.)

  16. How accurate are the nonlinear chemical Fokker-Planck and chemical Langevin equations?

    PubMed

    Grima, Ramon; Thomas, Philipp; Straube, Arthur V

    2011-08-28

    The chemical Fokker-Planck equation and the corresponding chemical Langevin equation are commonly used approximations of the chemical master equation. These equations are derived from an uncontrolled, second-order truncation of the Kramers-Moyal expansion of the chemical master equation, and hence their accuracy remains to be clarified. We use the system-size expansion to show that chemical Fokker-Planck estimates of the mean concentrations and of the variance of the concentration fluctuations about the mean are accurate to order Ω^(-3/2) for reaction systems which do not obey detailed balance, and at least accurate to order Ω^(-2) for systems obeying detailed balance, where Ω is the characteristic size of the system. Hence, the chemical Fokker-Planck equation turns out to be more accurate than the linear-noise approximation of the chemical master equation (the linear Fokker-Planck equation), which leads to mean concentration estimates accurate to order Ω^(-1/2) and variance estimates accurate to order Ω^(-3/2). This higher accuracy is particularly conspicuous for chemical systems realized in small volumes, such as biochemical reactions inside cells. A formula is also obtained for the approximate size of the relative errors in the concentration and variance predictions of the chemical Fokker-Planck equation, where the relative error is defined as the difference between the predictions of the chemical Fokker-Planck equation and the master equation divided by the prediction of the master equation. For dimerization and enzyme-catalyzed reactions, the errors are typically less than a few percent, even when the steady state is characterized by merely a few tens of molecules.

  17. The impact of model detail on power grid resilience measures

    NASA Astrophysics Data System (ADS)

    Auer, S.; Kleis, K.; Schultz, P.; Kurths, J.; Hellmann, F.

    2016-05-01

    Extreme events are a challenge to natural as well as man-made systems. For critical infrastructure like power grids, we need to understand their resilience against large disturbances. Recently, new measures of the resilience of dynamical systems have been developed in the complex system literature. Basin stability and survivability respectively assess the asymptotic and transient behavior of a system when subjected to arbitrary, localized but large perturbations in frequency and phase. To employ these methods that assess power grid resilience, we need to choose a certain model detail of the power grid. For the grid topology we considered the Scandinavian grid and an ensemble of power grids generated with a random growth model. So far the most popular model that has been studied is the classical swing equation model for the frequency response of generators and motors. In this paper we study a more sophisticated model of synchronous machines that also takes voltage dynamics into account, and compare it to the previously studied model. This model has been found to give an accurate picture of the long term evolution of synchronous machines in the engineering literature for post fault studies. We find evidence that some stable fix points of the swing equation become unstable when we add voltage dynamics. If this occurs the asymptotic behavior of the system can be dramatically altered, and basin stability estimates obtained with the swing equation can be dramatically wrong. We also find that the survivability does not change significantly when taking the voltage dynamics into account. Further, the limit cycle type asymptotic behaviour is strongly correlated with transient voltages that violate typical operational voltage bounds. Thus, transient voltage bounds are dominated by transient frequency bounds and play no large role for realistic parameters.
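The classical swing equation referenced above, for a single synchronous machine against an infinite bus, can be sketched with a simple explicit-Euler integration. All parameters below are illustrative, and the voltage dynamics that the paper adds to this model are deliberately omitted.

```python
import math

def simulate_swing(p_mech=0.5, k=1.0, damping=0.5, inertia=1.0,
                   phi0=0.0, omega0=1.0, dt=0.001, steps=20000):
    """Explicit-Euler integration of the swing equation:
    M * domega/dt = P - D*omega - K*sin(phi),  dphi/dt = omega."""
    phi, omega = phi0, omega0
    for _ in range(steps):
        domega = (p_mech - damping * omega - k * math.sin(phi)) / inertia
        phi += omega * dt
        omega += domega * dt
    return phi, omega

# Perturb the machine (omega0 = 1) and integrate for 20 s of model time.
phi, omega = simulate_swing()
```

With sufficient damping the trajectory settles to the stable fixed point sin(φ) = P/K; basin stability asks how often arbitrary (φ₀, ω₀) perturbations return to that fixed point, which is exactly the estimate that adding voltage dynamics can alter.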

  18. The Combination of Laser Scanning and Structure from Motion Technology for Creation of Accurate Exterior and Interior Orthophotos of ST. Nicholas Baroque Church

    NASA Astrophysics Data System (ADS)

    Koska, B.; Křemen, T.

    2013-02-01

    Terrestrial laser scanning technology has been used for the creation of building documentation and 3D building models since its emergence at the turn of the millennium. Photogrammetry has an even longer tradition in this field. Both technologies have some technical limitations when used for the creation of a façade or interior orthophoto, but a combination of the two seems profitable. Laser scanning can be used for the creation of an accurate 3D model, and photogrammetry for the consequent application of high-quality colour information. Both technologies were used in synergy to create the building plans, 2D drawing documentation of facades and interior views, and the orthophotos of the St. Nicholas Baroque church in Prague. The case study is described in detail in the paper.

  19. What Data to Use for Forest Conservation Planning? A Comparison of Coarse Open and Detailed Proprietary Forest Inventory Data in Finland.

    PubMed

    Lehtomäki, Joona; Tuominen, Sakari; Toivonen, Tuuli; Leinonen, Antti

    2015-01-01

    The boreal region is facing intensifying resource extraction pressure, but the lack of comprehensive biodiversity data makes operative forest conservation planning difficult. Many countries have implemented forest inventory schemes and are making extensive and up-to-date forest databases increasingly available. Some of the more detailed inventory databases, however, remain proprietary and unavailable for conservation planning. Here, we investigate how well different open and proprietary forest inventory data sets suit the purpose of conservation prioritization in Finland. We also explore how much priorities are affected by using the less accurate but open data. First, we construct a set of indices for forest conservation value based on quantitative information commonly found in forest inventories. These include the maturity of the trees, tree species composition, and site fertility. Second, using these data and accounting for connectivity between forest types, we investigate the patterns in conservation priority. For prioritization, we use Zonation, a method and software for spatial conservation prioritization. We then validate the prioritizations by comparing them to known areas of high conservation value. We show that the overall priority patterns are relatively consistent across different data sources and analysis options. However, the coarse data cannot be used to accurately identify the high-priority areas, as it misses much of the fine-scale variation in forest structures. We conclude that, while inventory data collected for forestry purposes may be useful for forest conservation purposes, it needs to be detailed enough to account for the fine-scale features of high conservation value. These results underline the importance of making detailed inventory data publicly available. Finally, we discuss how the prioritization methodology we used could be integrated into operative forest management, especially in countries in the boreal zone.

  20. Towards accurate and precise estimates of lion density.

    PubMed

    Elliot, Nicholas B; Gopalaswamy, Arjun M

    2016-12-13

    Reliable estimates of animal density are fundamental to our understanding of ecological processes and population dynamics. Furthermore, their accuracy is vital to conservation biology, since wildlife authorities rely on these figures to make decisions. However, it is notoriously difficult to accurately estimate density for wide-ranging species, such as carnivores, that occur at low densities. In recent years, significant progress has been made in density estimation of Asian carnivores, but the methods have not been widely adapted to African carnivores. African lions (Panthera leo) provide an excellent example: although abundance indices have been shown to produce poor inferences, they continue to be used to estimate lion density and inform management and policy. In this study we adapt a Bayesian spatially explicit capture-recapture model to estimate lion density in the Maasai Mara National Reserve (MMNR) and surrounding conservancies in Kenya. We utilize sightings data from a three-month survey period to produce statistically rigorous spatial density estimates. Overall posterior mean lion density was estimated to be 16.85 (posterior standard deviation = 1.30) lions over one year of age per 100 km² with a sex ratio of 2.2♀:1♂. We argue that such methods should be developed, improved and favored over less reliable methods such as track and call-up surveys. We caution against trend analyses based on surveys of differing reliability and call for a unified framework to assess lion numbers across their range so that better informed management and policy decisions can be made.

  1. Accurate color images: from expensive luxury to essential resource

    NASA Astrophysics Data System (ADS)

    Saunders, David R.; Cupitt, John

    2002-06-01

    Over ten years ago the National Gallery in London began a program to make digital images of paintings in the collection using a colorimetric imaging system. This was to provide a permanent record of the state of paintings against which future images could be compared to determine whether any changes had occurred. It quickly became apparent that such images could be used not only for scientific purposes, but also in applications where transparencies were then being used, for example as source materials for printed books and catalogues or for computer-based information systems. During the 1990s we were involved in the development of a series of digital cameras that combined the high color accuracy of the original 'scientific' imaging system with the familiarity and portability of a medium format camera. This has culminated in the program of digitization now in progress at the National Gallery. By the middle of 2001 we will have digitized all the major paintings in the collection at a resolution of 10,000 pixels along their longest dimension and with calibrated color; we are on target to digitize the whole collection by the end of 2002. The images are available on-line within the museum for consultation, and so that Gallery departments can use the images in printed publications and on the Gallery's website. We describe the development of the imaging systems used at the National Gallery and how the research we have conducted into high-resolution accurate color imaging has developed from being a peripheral, if harmless, research activity to becoming a central part of the Gallery's information and publication strategy. Finally, we discuss some outstanding issues, such as interfacing our color management procedures with the systems used by external organizations.

  2. Concurrent and Accurate Short Read Mapping on Multicore Processors.

    PubMed

    Martínez, Héctor; Tárraga, Joaquín; Medina, Ignacio; Barrachina, Sergio; Castillo, Maribel; Dopazo, Joaquín; Quintana-Ortí, Enrique S

    2015-01-01

    We introduce a parallel aligner with a work-flow organization for fast and accurate mapping of RNA sequences on servers equipped with multicore processors. Our software, HPG Aligner SA (an open-source application available at http://www.opencb.org), exploits a suffix array to rapidly map a large fraction of the RNA fragments (reads), and leverages the accuracy of the Smith-Waterman algorithm to deal with conflictive reads. The aligner is enhanced with a careful strategy to detect splice junctions based on an adaptive division of RNA reads into small segments (or seeds), which are then mapped onto a number of candidate alignment locations, providing crucial information for the successful alignment of the complete reads. Experimental results on a platform with Intel multicore technology demonstrate the parallel performance of HPG Aligner SA on RNA reads of 100-400 nucleotides, showing that it excels in execution time and sensitivity compared with state-of-the-art aligners such as TopHat 2+Bowtie 2, MapSplice, and STAR.
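The suffix-array stage described above (fast exact placement of reads or seeds, with Smith-Waterman reserved for conflictive cases) can be illustrated with a minimal sketch. This is not the HPG Aligner SA implementation, just the underlying idea of binary-searching sorted suffixes for exact matches; the toy `genome` and `read` are hypothetical.

```python
import bisect

def build_suffix_array(text):
    """Return the lexicographically sorted list of suffix start positions."""
    return sorted(range(len(text)), key=lambda i: text[i:])

def find_occurrences(text, sa, read):
    """Binary-search the suffix array for all exact occurrences of `read`."""
    suffixes = [text[i:] for i in sa]  # materialized for clarity; real aligners compare in place
    lo = bisect.bisect_left(suffixes, read)
    hi = bisect.bisect_right(suffixes, read + "~")  # '~' sorts after A/C/G/T
    return sorted(sa[lo:hi])

genome = "ACGTACGTGACG"
sa = build_suffix_array(genome)
hits = find_occurrences(genome, sa, "ACG")  # start positions of exact matches
```

A production aligner builds the suffix array once in linear time, maps seeds rather than whole reads, and falls back to Smith-Waterman alignment around the candidate positions.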

  3. Economic Value Of Accurate Assessments Of Hydrological Uncertainty

    NASA Astrophysics Data System (ADS)

    Ajami, N. K.; Sunding, D. L.; Hornberger, G. M.

    2008-12-01

    The improvement of techniques to assist in the sustainable management of water resource systems is a crucial issue since our limited resources are under ever increasing pressure. A proper understanding of the sources and effects of uncertainty is needed to achieve goals related to improvements in reliability and sustainability in water resource management and planning. To date, many hydrological techniques have been developed to improve the quality and accuracy of hydrological forecasts and to assess the uncertainty associated with these forecasts. The economic value of improvements in calculations of uncertainty associated with hydrological forecasts from the water supply and demand management perspective remains largely unknown. We first explore the effect of more accurate assessments of hydrological uncertainty on the management of water resources by using an integrated approach to identify and quantify the sources of uncertainty. Subsequently, we analyze the value of a more reliable water supply forecast by studying the change in moments of the distribution of final surface water deliveries. This allows us to calculate the economic value of improving the information about uncertainty provided to stakeholders, especially during drought spells.

  4. Accurate SHAPE-directed RNA secondary structure modeling, including pseudoknots

    PubMed Central

    Hajdin, Christine E.; Bellaousov, Stanislav; Huggins, Wayne; Leonard, Christopher W.; Mathews, David H.; Weeks, Kevin M.

    2013-01-01

    A pseudoknot forms in an RNA when nucleotides in a loop pair with a region outside the helices that close the loop. Pseudoknots occur relatively rarely in RNA but are highly overrepresented in functionally critical motifs in large catalytic RNAs, in riboswitches, and in regulatory elements of viruses. Pseudoknots are usually excluded from RNA structure prediction algorithms. When included, these pairings are difficult to model accurately, especially in large RNAs, because allowing this structure dramatically increases the number of possible incorrect folds and because it is difficult to search the fold space for an optimal structure. We have developed a concise secondary structure modeling approach that combines SHAPE (selective 2′-hydroxyl acylation analyzed by primer extension) experimental chemical probing information and a simple, but robust, energy model for the entropic cost of single pseudoknot formation. Structures are predicted with iterative refinement, using a dynamic programming algorithm. This melded experimental and thermodynamic energy function predicted the secondary structures and the pseudoknots for a set of 21 challenging RNAs of known structure ranging in size from 34 to 530 nt. On average, 93% of known base pairs were predicted, and all pseudoknots in well-folded RNAs were identified. PMID:23503844

  5. Raman Spectroscopy as an Accurate Probe of Defects in Graphene

    NASA Astrophysics Data System (ADS)

    Rodriguez-Nieva, Joaquin; Barros, Eduardo; Saito, Riichiro; Dresselhaus, Mildred

    2014-03-01

    Raman Spectroscopy has proved to be an invaluable non-destructive technique that allows us to obtain intrinsic information about graphene. Furthermore, defect-induced Raman features, namely the D and D' bands, have previously been used to assess the purity of graphitic samples. However, quantitative study of the signatures of the different types of defects in the Raman spectra is still an open problem. Experimental results already suggest that the Raman intensity ratio ID/ID' may allow us to identify the nature of the defects. We study from a theoretical point of view the power and limitations of Raman spectroscopy in the study of defects in graphene. We derive an analytic model that describes the Double Resonance Raman process of disordered graphene samples, and which explicitly shows the role played by both the defect-dependent parameters and the experimentally controlled variables. We compare our model with previous Raman experiments, and use it to guide new ways in which defects in graphene can be accurately probed with Raman spectroscopy. We acknowledge support from NSF grant DMR1004147.

  6. Can numerical simulations accurately predict hydrodynamic instabilities in liquid films?

    NASA Astrophysics Data System (ADS)

    Denner, Fabian; Charogiannis, Alexandros; Pradas, Marc; van Wachem, Berend G. M.; Markides, Christos N.; Kalliadasis, Serafim

    2014-11-01

    Understanding the dynamics of hydrodynamic instabilities in liquid film flows is an active field of research in fluid dynamics and non-linear science in general. Numerical simulations offer a powerful tool to study hydrodynamic instabilities in film flows and can provide deep insights into the underlying physical phenomena. However, the direct comparison of numerical and experimental results is often hampered for several reasons. For instance, in numerical simulations the interface representation is problematic and the governing equations and boundary conditions may be oversimplified, whereas in experiments it is often difficult to extract accurate information on the fluid and its behavior, e.g., determining the fluid properties when the liquid contains particles for PIV measurements. In this contribution we present the latest results of our ongoing, extensive study on hydrodynamic instabilities in liquid film flows, which includes direct numerical simulations, low-dimensional modelling, as well as experiments. The major focus is on wave regimes, wave height and wave celerity as a function of Reynolds number and forcing frequency of a falling liquid film. Specific attention is paid to the differences between numerical and experimental results and the reasons for these differences. The authors are grateful to the EPSRC for their financial support (Grant EP/K008595/1).

  7. Evaluation of freely available ancillary data used for detailed soil mapping in Brazil

    NASA Astrophysics Data System (ADS)

    Samuel-Rosa, Alessandro; Anjos, Lúcia; Vasques, Gustavo; Heuvelink, Gerard

    2014-05-01

    Brazil is one of the world's largest food producers and is home to both the largest rainforest and the largest supply of renewable fresh water on Earth. However, it lacks detailed soil information for extensive areas of the country. The best soil map covering the entire country was published at a scale of 1:5,000,000. Termination of governmental support for systematic soil mapping in the 1980s made detailed soil mapping of the whole country a very difficult task to accomplish. Nowadays, due to new user-driven demands (e.g. precision agriculture), most detailed soil maps are produced for small areas. Many of them rely on freely available ancillary data used as is, although the accuracy of these data is usually unreported or unknown. Results from a validation exercise that we performed using ground control points from a small hilly catchment (20 km²) in Southern Brazil (-53.7995ºE, -29.6355ºN) indicate that most freely available ancillary data need some type of correction before use. Georeferenced and orthorectified RapidEye imagery (recently acquired by the Brazilian government) has a horizontal accuracy (root-mean-square error, RMSE) of 37 m, which is worse than the value published in the metadata (32 m). Like any remote sensing imagery, RapidEye imagery needs to be correctly registered before its use for soil mapping. Topographic maps produced by the Brazilian Army and the geological maps derived from them (scale of 1:25,000) have a horizontal accuracy of 65 m, which is more than four times the maximum value allowed by Brazilian legislation (15 m). Worse results were found for geological maps derived from 1:50,000 topographic maps (RMSE = 147 m), for which the maximum allowed value is 30 m. In most cases positional errors are of systematic origin and can be easily corrected (e.g., with an affine transformation). ASTER GDEM has many holes and is very noisy, making it of little use in the studied area. TOPODATA, which is SRTM kriged from its original 3 arc-seconds to 1 arc-second by the Brazilian National
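    The horizontal-accuracy figures quoted in this record (37 m, 65 m, 147 m) are planar RMSE values computed over ground control points. A minimal sketch of that computation is given below; the coordinates are hypothetical, chosen to mimic the systematic (constant-offset) errors the authors describe, which an affine fit would remove.

```python
import numpy as np

def horizontal_rmse(measured_xy, reference_xy):
    """Planar RMSE (same units as the coordinates, e.g. metres) between
    measured positions and their ground-control reference positions."""
    d = np.asarray(measured_xy, float) - np.asarray(reference_xy, float)
    return float(np.sqrt(np.mean(np.sum(d**2, axis=1))))

# Hypothetical control points (easting/northing, m) with a constant 30 m
# easting shift -- a systematic error, correctable by an affine transformation.
ref = np.array([[0.0, 0.0], [100.0, 50.0], [250.0, 300.0]])
img = ref + np.array([30.0, 0.0])
print(horizontal_rmse(img, ref))  # 30.0
```

    Against real data the same function, applied before and after registration, quantifies how much of the reported error is removable systematic offset.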

  8. Precision targeted ruthenium(ii) luminophores; highly effective probes for cell imaging by stimulated emission depletion (STED) microscopy† †Electronic supplementary information (ESI) available: Detailed synthesis and characterisation of metal complexes and peptides. See DOI: 10.1039/c6sc02588a Click here for additional data file.

    PubMed Central

    Byrne, Aisling; Burke, Christopher S.

    2016-01-01

    Fluorescence microscopy has undergone a dramatic evolution over the past two decades with the development of super-resolution far-field microscopy methods that break the diffraction-limited resolution of conventional microscopy, offering unprecedented opportunity to interrogate cellular processes at the nanoscale. However, these methods make special demands of the luminescent agents used for contrast, and the development of probes suited to super-resolution fluorescence methods is still in its infancy. In spite of their many photophysical advantages, metal complex luminophores have not yet been considered as probes in this regard, where, to date, only organic fluorophores have been applied. Here, we report the first examples of metal complex luminophores applied as probes for use in stimulated emission depletion (STED) microscopy. Exemplified with endoplasmic reticulum and nuclear targeting complexes, we demonstrate that luminescent Ru(ii) polypyridyl complexes can, through signal peptide targeting, be precisely and selectively delivered to key cell organelles without the need for membrane permeabilization, to give high quality STED images of these organelles. Detailed features of the tubular ER structure are revealed and in the case of the nuclear targeting probe we exploit the molecular light switch properties of a dipyrido[3,2-a:2′,3′-c]phenazine containing complex which emits only on DNA/RNA binding to give outstanding STED contrast and resolution of the chromosomes within the nucleus. Comparing performance with a member of the AlexaFluor family commonly recommended for STED, we find that the performance of the ruthenium complexes is superior across both CW and gated STED microscopy methods in terms of image resolution and photostability. The large Stokes shifts of the Ru probes permit excellent matching of the stimulating depletion laser with their emission whilst avoiding anti-Stokes excitation. Their long lifetimes make them particularly amenable to

  9. Method for accurate growth of vertical-cavity surface-emitting lasers

    DOEpatents

    Chalmers, S.A.; Killeen, K.P.; Lear, K.L.

    1995-03-14

    The authors report a method for accurate growth of vertical-cavity surface-emitting lasers (VCSELs). The method uses a single reflectivity spectrum measurement to determine the structure of the partially completed VCSEL at a critical point of growth. This information, along with the extracted growth rates, allows imprecisions in growth parameters to be compensated for during growth of the remaining structure, which can then be completed with very accurate critical dimensions. Using this method, they can now routinely grow lasing VCSELs with Fabry-Perot cavity resonance wavelengths controlled to within 0.5%. 4 figs.

  10. Method for accurate growth of vertical-cavity surface-emitting lasers

    DOEpatents

    Chalmers, Scott A.; Killeen, Kevin P.; Lear, Kevin L.

    1995-01-01

    We report a method for accurate growth of vertical-cavity surface-emitting lasers (VCSELs). The method uses a single reflectivity spectrum measurement to determine the structure of the partially completed VCSEL at a critical point of growth. This information, along with the extracted growth rates, allows imprecisions in growth parameters to be compensated for during growth of the remaining structure, which can then be completed with very accurate critical dimensions. Using this method, we can now routinely grow lasing VCSELs with Fabry-Perot cavity resonance wavelengths controlled to within 0.5%.

  11. Photocopy of "sheet 4 of 8" showing window details, door ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy of "sheet 4 of 8" showing window details, door sill detail, vertical wall sections, and cross sections thru front, side and rear elevations. - Badger Mountain Lookout, .125 mile northwest of Badger Mountain summit, East Wenatchee, Douglas County, WA

  12. 9. South abutment, detail of collapsed east wing wall; also ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. South abutment, detail of collapsed east wing wall; also detail of bottom lateral bracing and stringers; looking southeast - Dodd Ford Bridge, County Road 147 Spanning Blue Earth River, Amboy, Blue Earth County, MN

  13. 3. East side, details of north half of east web; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. East side, details of north half of east web; also details of roadway, railing and overhead bracing; looking northeast - Dodd Ford Bridge, County Road 147 Spanning Blue Earth River, Amboy, Blue Earth County, MN

  14. Lock 1 (Savannah River Lock), Elevation of North Wall, Detail ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Lock 1 (Savannah River Lock), Elevation of North Wall, Detail of Wall Foundation, Detail of Gate Pocket - Savannah & Ogeechee Barge Canal, Between Ogeechee & Savannah Rivers, Savannah, Chatham County, GA

  15. 19. INTERIOR WEST BAY DETAIL VIEW, FACING NORTHWEST. TRACKS FOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    19. INTERIOR WEST BAY DETAIL VIEW, FACING NORTHWEST. TRACKS FOR MOVEMENT OF MATERIALS, TRUSS DETAIL, PERSONNEL DOOR, BAY DOOR AND FENCED STORAGE AREA. - NASA Industrial Plant, Missile Research Laboratory, 12214 Lakewood Boulevard, Downey, Los Angeles County, CA

  16. 7. EXTERIOR SOUTHEAST SIDE DETAIL VIEW, FACING SOUTHWEST. BUILDINGS 101, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. EXTERIOR SOUTHEAST SIDE DETAIL VIEW, FACING SOUTHWEST. BUILDINGS 101, 278 AND CANOPY 685 DETAILED. - NASA Industrial Plant, Missile Research Laboratory, 12214 Lakewood Boulevard, Downey, Los Angeles County, CA

  17. Detailed high-accuracy megavoltage transmission measurements: A sensitive experimental benchmark of EGSnrc

    SciTech Connect

    Ali, E. S. M.; McEwen, M. R.; Rogers, D. W. O.

    2012-10-15

    Purpose: There are three goals for this study: (a) to perform detailed megavoltage transmission measurements in order to identify the factors that affect the measurement accuracy, (b) to use the measured data as a benchmark for the EGSnrc system in order to identify the computational limiting factors, and (c) to provide data for others to benchmark Monte Carlo codes. Methods: Transmission measurements are performed at the National Research Council Canada on a research linac whose incident electron parameters are independently known. Automated transmission measurements are made on-axis, down to a transmission value of ~1.7%, for eight beams between 10 MV (the lowest stable MV beam on the linac) and 30 MV, using fully stopping Be, Al, and Pb bremsstrahlung targets and no flattening filters. To diversify energy differentiation, data are acquired for each beam using low-Z and high-Z attenuators (C and Pb) and Farmer chambers with low-Z and high-Z buildup caps. Experimental corrections are applied for beam drifts (2%), polarity (2.5% typical maximum, 6% extreme), ion recombination (0.2%), leakage (0.3%), and room scatter (0.8%); the values in parentheses are the largest corrections applied. The experimental setup and the detectors are modeled using EGSnrc, with the newly added photonuclear attenuation included (up to a 5.6% effect). A detailed sensitivity analysis is carried out for the measured and calculated transmission data. Results: The developed experimental protocol allows for transmission measurements with 0.4% uncertainty on the smallest signals. Suggestions for accurate transmission measurements are provided. Measurements and EGSnrc calculations agree typically within 0.2% for the sensitivity of the transmission values to the detector details, to the bremsstrahlung target material, and to the incident electron energy. Direct comparison of the measured and calculated transmission data shows agreement better than 2% for C (3.4% for the 10 MV beam) and
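    The corrections listed in this record (polarity, ion recombination, leakage, room scatter) enter a transmission measurement as factors applied to the raw chamber readings before the attenuated-to-open ratio is formed. The sketch below illustrates only that bookkeeping; the function, the assumption that every correction is a simple multiplicative factor, and the numerical values are all illustrative, not the study's actual protocol or data.

```python
def corrected_transmission(m_att, m_open, corr_att, corr_open):
    """Transmission = corrected attenuated reading / corrected open reading.

    `corr_att` / `corr_open` are dicts of multiplicative correction factors
    (polarity, ion recombination, leakage, room scatter, ...). Treating every
    correction as multiplicative is an assumption made for this sketch.
    """
    def product(factors):
        p = 1.0
        for f in factors.values():
            p *= f
        return p
    return (m_att * product(corr_att)) / (m_open * product(corr_open))

# Illustrative readings and correction factors (not data from the study):
t = corrected_transmission(
    m_att=0.0172, m_open=1.000,
    corr_att={"recombination": 1.002, "leakage": 0.997},
    corr_open={"recombination": 1.002, "leakage": 1.000},
)
print(round(t, 5))  # a ~1.7% transmission value after corrections
```

    Because each correction is applied to both readings, factors common to the two measurement conditions largely cancel in the ratio, which is why only the condition-dependent parts of each correction matter.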

  18. A predictable and accurate technique with elastomeric impression materials.

    PubMed

    Barghi, N; Ontiveros, J C

    1999-08-01

    A method for obtaining more predictable and accurate final impressions with polyvinylsiloxane impression materials in conjunction with stock trays is proposed and tested. Heavy impression material is used in advance for construction of a modified custom tray, while extra-light material is used for obtaining a more accurate final impression.

  19. Tube dimpling tool assures accurate dip-brazed joints

    NASA Technical Reports Server (NTRS)

    Beuyukian, C. S.; Heisman, R. M.

    1968-01-01

    Portable, hand-held dimpling tool assures accurate brazed joints between tubes of different diameters. Prior to brazing, the tool performs precise dimpling and nipple forming and also provides control and accurate measuring of the height of nipples and depth of dimples so formed.

  20. Application of airborne LiDAR to the detailed geological mapping of mineralised terrain: the Troodos ophiolite, Cyprus

    NASA Astrophysics Data System (ADS)

    Grebby, S.; Cunningham, D.; Naden, J.; Tansey, K.

    2009-04-01

    The identification of mineral prospects is highly dependent upon the acquisition and synthesis of a wide variety of geological information, e.g., lithological, structural, geophysical and geochemical data. Conventionally, the majority of this information is acquired through field-based surveys. However, the quality of data collected in this manner is often affected by subjectivity and lack of detail due to coarse sampling over vast areas or inaccessible terrain. Both multi- and hyperspectral satellite remote sensing and the interpretation of aerial photography are typically used to help overcome some of the limitations associated with field-based surveys. However, the use of these approaches for the extraction of exploration data can be hindered by spatial and spectral limitations and by dense forest cover. A relatively new active remote sensing technology, airborne Light Detection And Ranging (LiDAR), offers the possibility of acquiring accurate and high-resolution (ca. 1-4 m) topographic data through dense forest cover. The ability of LiDAR systems to detect multiple returns from the emission of a single laser pulse can be utilised to generate a high-resolution digital elevation model (DEM) of the ground beneath the forest canopy. Airborne LiDAR is an important tool for geoscience research, with a wide spectrum of applications including the mapping of landslides and faults to help inform hazard assessment studies. A LiDAR system can also provide an insight into the spectral and textural properties of surface materials using intensity data, a ratio of the reflected laser energy to the emitted laser energy. Where rocks outcrop, these properties are linked to the surface mineralogy and weathering at the LiDAR footprint scale. The ability to acquire two high-resolution datasets simultaneously from a single survey makes airborne LiDAR an attractive tool for the extraction of detailed geological information in terrain with either sparse or dense

  1. 46 CFR 90.20-1 - Marine engineering details.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Marine engineering details. 90.20-1 Section 90.20-1... PROVISIONS General Marine Engineering Requirements § 90.20-1 Marine engineering details. (a) All marine engineering details such as piping, valves, fittings, boilers, pressure vessels, etc., and their...

  2. 46 CFR 24.20-1 - Marine engineering details.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Marine engineering details. 24.20-1 Section 24.20-1... Engineering Requirements § 24.20-1 Marine engineering details. (a) All marine engineering details relative to... 40 feet in length will be found in subchapter F (Marine Engineering) of this chapter....

  3. 46 CFR 188.20-1 - Marine engineering details.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Marine engineering details. 188.20-1 Section 188.20-1... PROVISIONS General Marine Engineering Requirements § 188.20-1 Marine engineering details. (a) The marine engineering details shall be in accordance with Subchapter F (Marine Engineering) of this chapter....

  4. 46 CFR 188.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Electrical engineering details. 188.25-1 Section 188.25... GENERAL PROVISIONS General Electrical Engineering Requirements § 188.25-1 Electrical engineering details. (a) The electrical engineering details shall be in accordance with subchapter J...

  5. 46 CFR 90.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Electrical engineering details. 90.25-1 Section 90.25-1... PROVISIONS General Electrical Engineering Requirements § 90.25-1 Electrical engineering details. (a) All electrical engineering details and installations shall be designed and installed in accordance...

  6. 46 CFR 70.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 3 2010-10-01 2010-10-01 false Electrical engineering details. 70.25-1 Section 70.25-1... General Electrical Engineering Requirements § 70.25-1 Electrical engineering details. All electrical engineering details and installations shall be designed and installed in accordance with subchapter...

  7. 46 CFR 90.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 4 2013-10-01 2013-10-01 false Electrical engineering details. 90.25-1 Section 90.25-1... PROVISIONS General Electrical Engineering Requirements § 90.25-1 Electrical engineering details. (a) All electrical engineering details and installations shall be designed and installed in accordance...

  8. 46 CFR 90.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 4 2011-10-01 2011-10-01 false Electrical engineering details. 90.25-1 Section 90.25-1... PROVISIONS General Electrical Engineering Requirements § 90.25-1 Electrical engineering details. (a) All electrical engineering details and installations shall be designed and installed in accordance...

  9. 46 CFR 188.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 7 2011-10-01 2011-10-01 false Electrical engineering details. 188.25-1 Section 188.25... GENERAL PROVISIONS General Electrical Engineering Requirements § 188.25-1 Electrical engineering details. (a) The electrical engineering details shall be in accordance with subchapter J...

  10. 46 CFR 90.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 4 2014-10-01 2014-10-01 false Electrical engineering details. 90.25-1 Section 90.25-1... PROVISIONS General Electrical Engineering Requirements § 90.25-1 Electrical engineering details. (a) All electrical engineering details and installations shall be designed and installed in accordance...

  11. 46 CFR 188.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 7 2014-10-01 2014-10-01 false Electrical engineering details. 188.25-1 Section 188.25... GENERAL PROVISIONS General Electrical Engineering Requirements § 188.25-1 Electrical engineering details. (a) The electrical engineering details shall be in accordance with subchapter J...

  12. 46 CFR 188.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 7 2012-10-01 2012-10-01 false Electrical engineering details. 188.25-1 Section 188.25... GENERAL PROVISIONS General Electrical Engineering Requirements § 188.25-1 Electrical engineering details. (a) The electrical engineering details shall be in accordance with subchapter J...

  13. 46 CFR 90.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 4 2012-10-01 2012-10-01 false Electrical engineering details. 90.25-1 Section 90.25-1... PROVISIONS General Electrical Engineering Requirements § 90.25-1 Electrical engineering details. (a) All electrical engineering details and installations shall be designed and installed in accordance...

  14. 46 CFR 188.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 7 2013-10-01 2013-10-01 false Electrical engineering details. 188.25-1 Section 188.25... GENERAL PROVISIONS General Electrical Engineering Requirements § 188.25-1 Electrical engineering details. (a) The electrical engineering details shall be in accordance with subchapter J...

  15. 46 CFR 24.20-1 - Marine engineering details.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 1 2012-10-01 2012-10-01 false Marine engineering details. 24.20-1 Section 24.20-1 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY UNINSPECTED VESSELS GENERAL PROVISIONS General Marine Engineering Requirements § 24.20-1 Marine engineering details. All marine engineering details relative to...

  16. 46 CFR 24.20-1 - Marine engineering details.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 1 2014-10-01 2014-10-01 false Marine engineering details. 24.20-1 Section 24.20-1 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY UNINSPECTED VESSELS GENERAL PROVISIONS General Marine Engineering Requirements § 24.20-1 Marine engineering details. All marine engineering details relative to...

  17. 46 CFR 188.20-1 - Marine engineering details.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 7 2012-10-01 2012-10-01 false Marine engineering details. 188.20-1 Section 188.20-1... PROVISIONS General Marine Engineering Requirements § 188.20-1 Marine engineering details. (a) The marine engineering details shall be in accordance with Subchapter F (Marine Engineering) of this chapter....

  18. 46 CFR 188.20-1 - Marine engineering details.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 7 2011-10-01 2011-10-01 false Marine engineering details. 188.20-1 Section 188.20-1... PROVISIONS General Marine Engineering Requirements § 188.20-1 Marine engineering details. (a) The marine engineering details shall be in accordance with Subchapter F (Marine Engineering) of this chapter....

  19. 46 CFR 188.20-1 - Marine engineering details.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 7 2013-10-01 2013-10-01 false Marine engineering details. 188.20-1 Section 188.20-1... PROVISIONS General Marine Engineering Requirements § 188.20-1 Marine engineering details. (a) The marine engineering details shall be in accordance with Subchapter F (Marine Engineering) of this chapter....

  20. 46 CFR 24.20-1 - Marine engineering details.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 1 2011-10-01 2011-10-01 false Marine engineering details. 24.20-1 Section 24.20-1 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY UNINSPECTED VESSELS GENERAL PROVISIONS General Marine Engineering Requirements § 24.20-1 Marine engineering details. (a) All marine engineering details relative...

  1. 46 CFR 188.20-1 - Marine engineering details.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 7 2014-10-01 2014-10-01 false Marine engineering details. 188.20-1 Section 188.20-1... PROVISIONS General Marine Engineering Requirements § 188.20-1 Marine engineering details. (a) The marine engineering details shall be in accordance with Subchapter F (Marine Engineering) of this chapter....

  2. 46 CFR 24.20-1 - Marine engineering details.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 1 2013-10-01 2013-10-01 false Marine engineering details. 24.20-1 Section 24.20-1 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY UNINSPECTED VESSELS GENERAL PROVISIONS General Marine Engineering Requirements § 24.20-1 Marine engineering details. All marine engineering details relative to...

  3. 14 CFR 23.685 - Control system details.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Control system details. 23.685 Section 23... Control Systems § 23.685 Control system details. (a) Each detail of each control system must be designed... cables or tubes against other parts. (d) Each element of the flight control system must have...

  4. 14 CFR 23.685 - Control system details.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Control system details. 23.685 Section 23... Control Systems § 23.685 Control system details. (a) Each detail of each control system must be designed... cables or tubes against other parts. (d) Each element of the flight control system must have...

  5. 14 CFR 25.685 - Control system details.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Control system details. 25.685 Section 25... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY AIRPLANES Design and Construction Control Systems § 25.685 Control system details. (a) Each detail of each control system must be designed and installed to prevent...

  6. 14 CFR 25.685 - Control system details.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Control system details. 25.685 Section 25... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY AIRPLANES Design and Construction Control Systems § 25.685 Control system details. (a) Each detail of each control system must be designed and installed to prevent...

  7. 14 CFR 29.685 - Control system details.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Control system details. 29.685 Section 29... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Design and Construction Control Systems § 29.685 Control system details. (a) Each detail of each control system must be designed to prevent jamming, chafing,...

  8. 14 CFR 29.685 - Control system details.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Control system details. 29.685 Section 29... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Design and Construction Control Systems § 29.685 Control system details. (a) Each detail of each control system must be designed to prevent jamming, chafing,...

  9. Shrinking the Psoriasis Assessment Gap: Early Gene-Expression Profiling Accurately Predicts Response to Long-Term Treatment.

    PubMed

    Correa da Rosa, Joel; Kim, Jaehwan; Tian, Suyan; Tomalin, Lewis E; Krueger, James G; Suárez-Fariñas, Mayte

    2017-02-01

    There is an "assessment gap" between the moment a patient's response to treatment is biologically determined and when a response can actually be determined clinically. Patients' biochemical profiles are a major determinant of clinical outcome for a given treatment. It is therefore feasible that molecular-level patient information could be used to decrease the assessment gap. Thanks to clinically accessible biopsy samples, high-quality molecular data for psoriasis patients are widely available. Psoriasis is therefore an excellent disease for testing the prospect of predicting treatment outcome from molecular data. Our study shows that gene-expression profiles of psoriasis skin lesions, taken in the first 4 weeks of treatment, can be used to accurately predict (>80% area under the receiver operating characteristic curve) the clinical endpoint at 12 weeks. This could decrease the psoriasis assessment gap by 2 months. We present two distinct prediction modes: a universal predictor, aimed at forecasting the efficacy of untested drugs, and specific predictors aimed at forecasting clinical response to treatment with four specific drugs: etanercept, ustekinumab, adalimumab, and methotrexate. We also develop two forms of prediction: one from detailed, platform-specific data and one from platform-independent, pathway-based data. We show that key biomarkers are associated with responses to drugs and doses and thus provide insight into the biology of pathogenesis reversion.

  10. How utilities can achieve more accurate decommissioning cost estimates

    SciTech Connect

    Knight, R.

    1999-07-01

    The number of commercial nuclear power plants that are undergoing decommissioning coupled with the economic pressure of deregulation has increased the focus on adequate funding for decommissioning. The introduction of spent-fuel storage and disposal of low-level radioactive waste into the cost analysis places even greater concern as to the accuracy of the fund calculation basis. The size and adequacy of the decommissioning fund have also played a major part in the negotiations for transfer of plant ownership. For all of these reasons, it is important that the operating plant owner reduce the margin of error in the preparation of decommissioning cost estimates. To date, all of these estimates have been prepared via the building block method. That is, numerous individual calculations defining the planning, engineering, removal, and disposal of plant systems and structures are performed. These activity costs are supplemented by the period-dependent costs reflecting the administration, control, licensing, and permitting of the program. This method will continue to be used in the foreseeable future until adequate performance data are available. The accuracy of the activity cost calculation is directly related to the accuracy of the inventory of plant system components, piping and equipment, and plant structural composition. Typically, it is left up to the cost-estimating contractor to develop this plant inventory. The data are generated by searching and analyzing property asset records, plant databases, piping and instrumentation drawings, piping system isometric drawings, and component assembly drawings. However, experience has shown that these sources may not be up to date, discrepancies may exist, there may be missing data, and the level of detail may not be sufficient. Again, typically, the time constraints associated with the development of the cost estimate preclude perfect resolution of the inventory questions. Another problem area in achieving accurate cost

  11. History and progress on accurate measurements of the Planck constant

    NASA Astrophysics Data System (ADS)

    Steiner, Richard

    2013-01-01

    The measurement of the Planck constant, h, is entering a new phase. The CODATA 2010 recommended value is 6.626 069 57 × 10^-34 J s, but it has been a long road, and the trip is not over yet. Since its discovery as a fundamental physical constant to explain various effects in quantum theory, h has become especially important in defining standards for electrical measurements and soon, for mass determination. Measuring h in the International System of Units (SI) started as experimental attempts merely to prove its existence. Many decades passed while newer experiments measured physical effects that were the influence of h combined with other physical constants: elementary charge, e, and the Avogadro constant, N_A. As experimental techniques improved, the precision of the value of h improved. When the Josephson and quantum Hall theories led to new electronic devices, and a hundred year old experiment, the absolute ampere, was altered into a watt balance, h not only became vital in definitions for the volt and ohm units, but suddenly it could be measured directly and even more accurately. Finally, as measurement uncertainties now approach a few parts in 10^8 from the watt balance experiments and Avogadro determinations, its importance has been linked to a proposed redefinition of a kilogram unit of mass. The path to higher accuracy in measuring the value of h was not always an example of continuous progress. Since new measurements periodically led to changes in its accepted value and the corresponding SI units, it is helpful to see why there were bumps in the road and where the different branch lines of research joined in the effort. Recalling the bumps along this road will hopefully avoid their repetition in the upcoming SI redefinition debates. This paper begins with a brief history of the methods to measure a combination of fundamental constants, thus indirectly obtaining the Planck constant. The historical path is followed in the section describing how the improved

  12. History and progress on accurate measurements of the Planck constant.

    PubMed

    Steiner, Richard

    2013-01-01

    The measurement of the Planck constant, h, is entering a new phase. The CODATA 2010 recommended value is 6.626 069 57 × 10(-34) J s, but it has been a long road, and the trip is not over yet. Since its discovery as a fundamental physical constant to explain various effects in quantum theory, h has become especially important in defining standards for electrical measurements and soon, for mass determination. Measuring h in the International System of Units (SI) started as experimental attempts merely to prove its existence. Many decades passed while newer experiments measured physical effects that were the influence of h combined with other physical constants: elementary charge, e, and the Avogadro constant, N(A). As experimental techniques improved, the precision of the value of h expanded. When the Josephson and quantum Hall theories led to new electronic devices, and a hundred year old experiment, the absolute ampere, was altered into a watt balance, h not only became vital in definitions for the volt and ohm units, but suddenly it could be measured directly and even more accurately. Finally, as measurement uncertainties now approach a few parts in 10(8) from the watt balance experiments and Avogadro determinations, its importance has been linked to a proposed redefinition of a kilogram unit of mass. The path to higher accuracy in measuring the value of h was not always an example of continuous progress. Since new measurements periodically led to changes in its accepted value and the corresponding SI units, it is helpful to see why there were bumps in the road and where the different branch lines of research joined in the effort. Recalling the bumps along this road will hopefully avoid their repetition in the upcoming SI redefinition debates. This paper begins with a brief history of the methods to measure a combination of fundamental constants, thus indirectly obtaining the Planck constant. The historical path is followed in the section describing how the

  13. Accurate compressed look up table method for CGH in 3D holographic display.

    PubMed

    Gao, Chuan; Liu, Juan; Li, Xin; Xue, Gaolei; Jia, Jia; Wang, Yongtian

    2015-12-28

    Computer generated holograms (CGHs) should be obtained with high accuracy and high speed in 3D holographic display, and most research focuses on the speed. In this paper, a simple and effective computation method for CGH is proposed based on Fresnel diffraction theory and a look up table. Numerical simulations and optical experiments are performed to demonstrate its feasibility. The proposed method can obtain more accurate reconstructed images with lower memory usage compared with the split look up table method and the compressed look up table method, without sacrificing computational speed in hologram generation, so it is called the accurate compressed look up table (AC-LUT) method. It is believed that AC-LUT is an effective method to calculate the CGH of 3D objects for real-time 3D holographic display, where huge amounts of data must be processed, and it could provide fast and accurate digital transmission in various dynamic optical fields in the future.
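
    The core idea the abstract summarizes can be sketched in a few lines: each object point contributes a quadratic-phase Fresnel zone pattern to the hologram plane, and those contributions are summed. The sketch below is a generic illustration of that principle, not the AC-LUT algorithm itself; for clarity the phase is computed directly rather than read from a precomputed table, and all parameter values (wavelength, pixel pitch, object points) are arbitrary choices for the example.

```python
import numpy as np

# Generic Fresnel CGH sketch: every object point adds a quadratic-phase
# zone pattern to the hologram plane. LUT methods precompute and reuse
# these patterns; here the phase is computed directly for clarity.
wavelength = 532e-9          # illustrative: 532 nm laser
pitch = 8e-6                 # illustrative pixel pitch of the hologram plane
N = 256                      # hologram resolution (N x N pixels)
xs = (np.arange(N) - N / 2) * pitch
X, Y = np.meshgrid(xs, xs)

def fresnel_hologram(points):
    """Superpose Fresnel contributions of object points (x, y, z, amplitude)."""
    H = np.zeros((N, N), dtype=complex)
    for px, py, pz, amp in points:
        r2 = (X - px) ** 2 + (Y - py) ** 2
        phase = np.pi * r2 / (wavelength * pz)   # paraxial (Fresnel) phase
        H += amp * np.exp(1j * phase)
    return H

# Two hypothetical object points at roughly 10 cm depth
H = fresnel_hologram([(0.0, 0.0, 0.10, 1.0), (2e-4, -1e-4, 0.12, 0.8)])
hologram = np.angle(H)       # phase-only CGH for a phase modulator
```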

  14. 3D FaceCam: a fast and accurate 3D facial imaging device for biometrics applications

    NASA Astrophysics Data System (ADS)

    Geng, Jason; Zhuang, Ping; May, Patrick; Yi, Steven; Tunnell, David

    2004-08-01

    Human faces are fundamentally three-dimensional (3D) objects, and each face has a unique 3D geometric profile. The 3D geometric features of a human face can be used, together with its 2D texture, for rapid and accurate face recognition. Due to the lack of low-cost and robust 3D sensors and effective 3D facial recognition (FR) algorithms, almost all existing FR systems use 2D face images. Genex has developed 3D solutions that overcome the inherent problems in 2D while also addressing limitations in other 3D alternatives. One important aspect of our solution is a unique 3D camera (the 3D FaceCam) that combines multiple imaging sensors within a single compact device to provide instantaneous, ear-to-ear coverage of a human face. This 3D camera uses three high-resolution CCD sensors and a color encoded pattern projection system. The RGB color information from each pixel is used to compute the range data and generate an accurate 3D surface map. The imaging system uses no moving parts and combines multiple 3D views to provide detailed and complete 3D coverage of the entire face. Images are captured within a fraction of a second and full-frame 3D data is produced within a few seconds. The described method provides much better data coverage and accuracy in areas with sharp features or fine detail (such as the nose and eyes). Using this 3D data, we have been able to demonstrate that a 3D approach can significantly improve the performance of facial recognition. We have conducted tests in which we varied the lighting conditions and angle of image acquisition in the "field." These tests have shown that the matching results are significantly improved when enrolling a 3D image rather than a single 2D image. With its 3D solutions, Genex is working toward unlocking the promise of powerful 3D FR and transferring FR from a lab technology into a real-world biometric solution.

  15. New records and detailed distribution and abundance of selected arthropod species collected between 1999 and 2011 in Azorean native forests

    PubMed Central

    Gaspar, Clara; Crespo, Luís Carlos Fonseca; Rigal, François; Cardoso, Pedro; Pereira, Fernando; Rego, Carla; Amorim, Isabel R.; Melo, Catarina; Aguiar, Carlos; André, Genage; Mendonça, Enésima P.; Ribeiro, Sérvio; Hortal, Joaquín; Santos, Ana M.C.; Barcelos, Luís; Enghoff, Henrik; Mahnert, Volker; Pita, Margarida T.; Ribes, Jordi; Baz, Arturo; Sousa, António B.; Vieira, Virgílio; Wunderlich, Jörg; Parmakelis, Aristeidis; Whittaker, Robert J.; Quartau, José Alberto; Serrano, Artur R.M.; Triantis, Kostas A.

    2016-01-01

    Background: In this contribution we present detailed distribution and abundance data for arthropod species identified during the BALA – Biodiversity of Arthropods from the Laurisilva of the Azores (1999-2004) – and BALA2 (2010-2011) projects from 18 native forest fragments on seven of the nine Azorean islands (all excluding Graciosa and Corvo, which have no native forest left). New information: Of the 286 species identified in total, 81% were captured between 1999 and 2000, a period during which only 39% of all the samples were collected. On average, arthropod richness for each island increased by 10% during the time frame of these projects. The classes Arachnida, Chilopoda and Diplopoda represent the most remarkable cases of new island records, with more than 30% of the records being novelties. This study stresses the need to expand the approaches applied in these projects to other habitats in the Azores and, more importantly, to other less-surveyed taxonomic groups (e.g. Diptera and Hymenoptera). These steps are fundamental for a more accurate assessment of biodiversity in the archipelago. PMID:28174509

  16. The neglected tool in the Bayesian ecologist's shed: a case study testing informative priors' effect on model accuracy

    PubMed Central

    Morris, William K; Vesk, Peter A; McCarthy, Michael A; Bunyavejchewin, Sarayudh; Baker, Patrick J

    2015-01-01

    Despite benefits for precision, ecologists rarely use informative priors. One reason that ecologists may prefer vague priors is the perception that informative priors reduce accuracy. To date, no ecological study has empirically evaluated data-derived informative priors' effects on precision and accuracy. To determine the impacts of priors, we evaluated mortality models for tree species using data from a forest dynamics plot in Thailand. Half the models used vague priors, and the remaining half had informative priors. We found precision was greater when using informative priors, but effects on accuracy were more variable. In some cases, prior information improved accuracy, while in others, it was reduced. On average, models with informative priors were no more or less accurate than models without. Our analyses provide a detailed case study on the simultaneous effect of prior information on precision and accuracy and demonstrate that when priors are specified appropriately, they lead to greater precision without systematically reducing model accuracy. PMID:25628867
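
    The precision-versus-accuracy trade-off described above can be illustrated with a toy conjugate model (not the paper's tree-mortality model): a Beta prior on an annual death probability, updated with new plot data. The mortality rate, counts, and prior weight below are invented for the illustration; the point is that an informative prior tightens the posterior (greater precision) while shifting its mean toward the prior (accuracy helped or hurt depending on how well the prior matches reality).

```python
import numpy as np

# Toy Beta-Binomial illustration of vague vs. informative priors.
deaths, trees = 4, 50                 # hypothetical new census: 4 of 50 trees died

# Vague prior: Beta(1, 1). Informative prior: ~8% mortality from
# (hypothetical) earlier censuses, weighted like 100 previously observed trees.
vague = (1.0, 1.0)
informative = (0.08 * 100, 0.92 * 100)

def posterior(prior, k, n):
    """Posterior mean and variance of a Beta(a, b) prior after k deaths in n."""
    a, b = prior
    a_post, b_post = a + k, b + (n - k)
    mean = a_post / (a_post + b_post)
    var = a_post * b_post / ((a_post + b_post) ** 2 * (a_post + b_post + 1))
    return mean, var

for name, prior in [("vague", vague), ("informative", informative)]:
    mean, var = posterior(prior, deaths, trees)
    print(f"{name:12s} posterior mean={mean:.3f} sd={np.sqrt(var):.4f}")
```

Running this shows the informative posterior has a visibly smaller standard deviation for the same data, which is the precision benefit the study quantifies.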

  17. Further ALMA observations and detailed modeling of the Red Rectangle

    PubMed Central

    Bujarrabal, V.; Castro-Carrizo, A.; Alcolea, J.; Santander-García, M.; Van Winckel, H.; Sánchez Contreras, C.

    2016-01-01

    Aims: We aim to study the rotating and expanding gas in the Red Rectangle, a well-known object that recently left the asymptotic giant branch (AGB) phase. We analyze the properties of both components and the relation between them. Rotating disks have been very elusive in post-AGB nebulae, in which gas is almost always found to be in expansion. Methods: We present new high-quality ALMA observations of C17O J=6−5 and H13CN J=4−3 line emission and results from a new reduction of already published 13CO J=3−2 data. A detailed model fitting of all the molecular line data, including previous maps and single-dish observations of lines of CO, CII, and CI, was performed using a sophisticated code that includes an accurate nonlocal treatment of radiative transfer in 2D. These observations (of low- and high-opacity lines requiring various degrees of excitation) and the corresponding modeling allowed us to deepen the analysis of the nebular properties. We also stress the uncertainties, particularly in the determination of the boundaries of the CO-rich gas and some properties of the outflow. Results: We confirm the presence of a rotating equatorial disk and an outflow, which is mainly formed of gas leaving the disk. The mass of the disk is ~0.01 M⊙, and that of the CO-rich outflow is around ten times smaller. High temperatures of ≳100 K are derived for most components. From comparison of the mass values, we roughly estimate the lifetime of the rotating disk to be about 10000 yr. Taking data of a few other post-AGB composite nebulae into account, we find that the lifetimes of disks around post-AGB stars typically range between 5000 and more than 20000 yr. The angular momentum of the disk is found to be high, ~9 M⊙ AU km s−1, which is comparable to that of the stellar system at present. Our observations of H13CN show a particularly wide velocity dispersion and indicate that this molecule is only abundant in the inner Keplerian disk, at

  18. Towards a scalable and accurate quantum approach for describing vibrations of molecule–metal interfaces

    PubMed Central

    Madebene, Bruno; Ulusoy, Inga; Mancera, Luis; Scribano, Yohann; Chulkov, Sergey

    2011-01-01

    Summary: We present a theoretical framework for the computation of anharmonic vibrational frequencies for large systems, with a particular focus on determining adsorbate frequencies from first principles. We give a detailed account of our local implementation of the vibrational self-consistent field approach and its correlation corrections. We show that our approach is robust and accurate, and that it can be easily deployed on computational grids to provide an efficient computational tool. We also present results on the vibrational spectrum of hydrogen fluoride on pyrene, on the thiophene molecule in the gas phase, and on small neutral gold clusters. PMID:22003450

  19. Improving light propagation Monte Carlo simulations with accurate 3D modeling of skin tissue

    SciTech Connect

    Paquit, Vincent C; Price, Jeffery R; Meriaudeau, Fabrice; Tobin Jr, Kenneth William

    2008-01-01

    In this paper, we present a 3D light propagation model to simulate multispectral reflectance images of large skin surface areas. In particular, we aim to simulate more accurately the effects of various physiological properties of the skin in the case of subcutaneous vein imaging compared to existing models. Our method combines a Monte Carlo light propagation model, a realistic three-dimensional model of the skin using parametric surfaces and a vision system for data acquisition. We describe our model in detail, present results from the Monte Carlo modeling and compare our results with those obtained with a well established Monte Carlo model and with real skin reflectance images.
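
    A minimal photon-packet Monte Carlo conveys the propagation model underlying simulations like the one described. This is a deliberately simplified sketch: the actual skin model is layered, uses a Henyey-Greenstein phase function, and a parametric 3D surface, whereas this example is a semi-infinite homogeneous medium with isotropic scattering, and the optical coefficients are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative optical properties of a homogeneous tissue-like medium (1/mm)
mu_a, mu_s = 0.1, 10.0
mu_t = mu_a + mu_s
albedo = mu_s / mu_t

def simulate_photon(max_steps=1000):
    """Launch one packet into z > 0; return the weight escaping back at z < 0."""
    pos = np.zeros(3)
    direction = np.array([0.0, 0.0, 1.0])   # launched straight into the medium
    weight = 1.0
    for _ in range(max_steps):
        step = -np.log(rng.random()) / mu_t  # sampled free path length
        pos = pos + step * direction
        if pos[2] < 0:                       # escaped: counts as reflectance
            return weight
        weight *= albedo                     # deposit the absorbed fraction
        if weight < 1e-4:                    # terminate negligible packets
            return 0.0
        # Isotropic scattering direction (real skin models use Henyey-Greenstein)
        cos_t = 2 * rng.random() - 1
        phi = 2 * np.pi * rng.random()
        sin_t = np.sqrt(1 - cos_t ** 2)
        direction = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    return 0.0

reflectance = np.mean([simulate_photon() for _ in range(2000)])
```

With a high scattering albedo like this, most of the launched weight diffuses back out of the surface, which is why subcutaneous structures can be imaged in reflectance at all.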

  20. Detailed Spectroscopy of 46Ca with the GRIFFIN Spectrometer

    NASA Astrophysics Data System (ADS)

    Pore, Jennifer; Griffin Collaboration Collaboration

    2016-09-01

    The neutron-rich calcium isotopes are currently a new frontier for modern ab-initio calculations based on NN and 3N forces. Detailed experimental data from these nuclei are necessary for a comprehensive understanding of the region. Many excited states in 46Ca have been previously identified by various reaction mechanisms, most notably from (p,p') and (p,t) reactions, but many spins are only tentatively assigned or not measured, and very few gamma-ray transitions have been placed in the level scheme. A high-statistics data set of the 46K decay into low-lying levels of 46Ca was taken with the new GRIFFIN spectrometer located at TRIUMF-ISAC. The level scheme of 46Ca has been greatly expanded to include 160 new gamma-ray transitions and 12 new excited states. Angular correlations between cascading gamma rays have been investigated to obtain information about the spins of the excited states. An overview of the experiment and a discussion of the results will be presented.

  1. Detailed analysis of structure and particle trajectories in sheared suspensions

    NASA Astrophysics Data System (ADS)

    Morris, Jeffrey; Katyal, Bhavana

    1999-11-01

    The structure and particle dynamics of sheared suspensions of hard spheres over a range of the relative strength of shear to Brownian motion (the Péclet number, Pe) have been studied by detailed analysis of extended sampling of Stokesian Dynamics simulations of simple shear. The emphasis is upon large Pe. The structure has been analyzed by decomposition of the pair distribution function, g(r), into spherical harmonics; the harmonics are a complete set for the decomposition. The results indicate a profound change in structure due to shearing. It is shown that as Pe increases, the structure is increasingly distorted from the equilibrium spherical symmetry and the number of harmonics required to recompose the original data to within an arbitrary accuracy increases; this variation depends upon particle fraction. We present information on the content of the dominant harmonics as a function of radial distance for a pair, and interpret the results in terms of preferred directions in the material. Dynamic particle trajectories at time scales long relative to that used for the Brownian step are analyzed in a novel fashion by simple differential-geometric measures, such as root-mean-square path curvature and torsion. Preliminary results illustrate that the path variation from the mean flow correlates with the particle stress.
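
    The differential-geometric path measures mentioned above (curvature and torsion along a sampled trajectory) can be computed with finite differences. The sketch below is a generic implementation validated on a helix, whose curvature and torsion are constant; it is not the authors' analysis code, and the sampling parameters are arbitrary.

```python
import numpy as np

def curvature_torsion(path):
    """Discrete curvature and torsion of a uniformly sampled 3D path (T, 3).

    Both quantities are invariant under uniform reparametrization, so the
    finite-difference spacing cancels out of the formulas.
    """
    r1 = np.gradient(path, axis=0)          # first derivative (velocity)
    r2 = np.gradient(r1, axis=0)            # second derivative
    r3 = np.gradient(r2, axis=0)            # third derivative
    cross = np.cross(r1, r2)
    speed = np.linalg.norm(r1, axis=1)
    cn = np.linalg.norm(cross, axis=1)
    kappa = cn / np.maximum(speed, 1e-12) ** 3                 # curvature
    tau = np.einsum("ij,ij->i", cross, r3) / np.maximum(cn, 1e-12) ** 2  # torsion
    return kappa, tau

# Sanity check on a unit helix: curvature = torsion = 1/(1^2 + 1^2) = 0.5
t = np.linspace(0, 4 * np.pi, 400)
helix = np.column_stack([np.cos(t), np.sin(t), t])
kappa, tau = curvature_torsion(helix)
```

Root-mean-square curvature and torsion over a trajectory, as used in the abstract, are then just `np.sqrt(np.mean(kappa**2))` and the analogue for `tau`.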

  2. Can people strategically control the encoding and retrieval of some morphologic and typographic details of words?

    PubMed

    Jou, Jerwen; Cortes, Hector M

    2012-09-01

    This study investigated whether the encoding and retrieval of plurality information and letter-case information of words in recognition memory can be inhibited. Response-deadline experiments (Hintzman & Curran, 1994) using single words have indicated a controlled processing mode, whereas studies using meaningful sentences (e.g., Jou & Harris, 1991) have indicated an automatic mode of processing plurality information. Two similar opposing views have existed on the processing of letter-case information. The abstractionist view contends that we retain the abstract lexical information and discard the superficial perceptual case information. The proceduralist view holds that perceptual details cannot be separated from the lexical information. Using an intentional and an attention-diverted learning procedure and instructions to ignore plurality and case, we found that subjects experienced a consistent interference effect of changed plurality from study to test but slightly less interference from changed case, suggesting strong automaticity for plurality and comparatively less automaticity for case processing.

  3. Problems in publishing accurate color in IEEE journals.

    PubMed

    Vrhel, Michael J; Trussell, H J

    2002-01-01

    To demonstrate the performance of color image processing algorithms, it is desirable to be able to accurately display color images in archival publications. In poster presentations, the authors have substantial control of the printing process, although little control of the illumination. For journal publication, the authors must rely on professional intermediaries (printers) to accurately reproduce their results. Our previous work describes requirements for accurately rendering images using your own equipment. This paper discusses the problems of dealing with intermediaries and offers suggestions for improved communication and rendering.

  4. M-X Environmental Technical Report. Socioeconomic Impact Estimates for Texas ROI Counties. Detailed Tables.

    DTIC Science & Technology

    1980-12-22

    The detailed socioeconomic impacts reported in this volume form background information for the analysis contained in the M-X Deployment Area Selection and Land Withdrawal/Acquisition Draft Environmental Impact Statement (DEIS). Keywords: M-X, socioeconomic impact, siting analysis, Texas.

  5. Advanced Meteor radar at Tirupati: System details and first results

    NASA Astrophysics Data System (ADS)

    Sunkara, Eswaraiah; Gurubaran, Subramanian; Sundararaman, Sathishkumar; Venkat Ratnam, Madineni; Karanam, Kishore Kumar; Eethamakula, Kosalendra; Vijaya Bhaskara Rao, S.

    An advanced meteor radar, the Enhanced Meteor Detection Radar (EMDR) operating at 35.25 MHz, was installed at Sri Venkateswara University (SVU), Tirupati (13.63°N, 79.4°E), India, in August 2013. The present communication describes the need for a meteor radar at this location, the system design, its measurement techniques and variables, and a comparison of measured mean winds with those from contemporary radars over the Indian region. The radar site was selected to fill the blind region of the Gadanki (13.5°N, 79.2°E) MST radar, covering the mesosphere and lower thermosphere (MLT) region (70-110 km). By modifying the receiving antenna structure and elements, the meteor counting capacity was increased, extending the height coverage so that, unlike other similar radars, this system provides accurate wind information between 70 and 110 km, below 80 km and above 100 km included. We also compared horizontal winds in the MLT region with those measured by similar and different (MST and MF radar) techniques over the Indian region, as well as with the HWM-07 model. The comparison showed very good agreement over the overlapping altitudes (82-98 km) of the different radars, for zonal winds as well as meridional winds. The observed discrepancies and limitations in the wind measurements are discussed. This new radar is expected to play an important role in understanding vertical and lateral coupling by forming part of a unique local network.

  6. The Iowa Flood Center's River Stage Sensors—Technical Details

    NASA Astrophysics Data System (ADS)

    Niemeier, J. J.; Kruger, A.; Ceynar, D.; Fahim Rezaei, H.

    2012-12-01

    The Iowa Flood Center (IFC), with support from the Iowa Department of Transportation (DOT) and the Iowa Department of Natural Resources (DNR), has developed a bridge-mounted river stage sensor. Each sensor consists of an ultrasonic distance measuring module, a cellular modem, a GPS unit that provides accurate time, and an embedded controller that orchestrates the sensor's operation. A sensor is powered by a battery and solar panel with a solar charge controller. All the components are housed in/on a sturdy metal box that is mounted on the side of a bridge. Additionally, each sensor incorporates a water-intrusion sensor and an internal temperature sensor. In operation, the microcontroller wakes and turns on the electronics every 15 minutes, then measures the distance between the ultrasonic sensor and the water surface. Several measurements are averaged and transmitted, along with system health information (battery voltage, state of the water-intrusion sensor, and internal temperature), via cellular modem to remote servers on the internet. The microcontroller then powers the electronics down and enters a sleep/power-saving mode. The sensor's firmware allows the remote server to adjust the measurement rate to 5, 15, or 60 minutes. Further, sensors maintain a 24-day buffer of previous measurements: if a sensor cannot transmit its data because of cellular network connection problems, it transmits the backlog on subsequent transmissions. We paid meticulous attention to all engineering aspects; the sensors are very robust and have operated essentially continuously through two Iowa winters and summers, including the record-breaking warm summer of 2012.
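
    The measure/average/transmit-with-backlog cycle described above can be sketched as follows. This is an illustrative model, not the IFC firmware: the class and method names are invented, and the real device runs this logic on an embedded microcontroller with a cellular modem rather than in Python.

```python
import collections
import statistics
import time

class StageSensor:
    """Toy model of a duty-cycled river stage sensor with a send backlog."""

    def __init__(self, buffer_days=24, interval_min=15):
        self.interval_min = interval_min                 # 5, 15, or 60 minutes
        slots = buffer_days * 24 * 60 // interval_min
        self.backlog = collections.deque(maxlen=slots)   # 24-day ring buffer

    def measure(self, read_distance, n=8):
        """Average several ultrasonic readings into one buffered record."""
        distance = statistics.mean(read_distance() for _ in range(n))
        record = {"t": time.time(), "distance_m": distance}
        self.backlog.append(record)
        return record

    def transmit(self, send):
        """Send the whole backlog; keep it for retry if the link fails."""
        if send(list(self.backlog)):
            self.backlog.clear()
        return len(self.backlog)

sensor = StageSensor()
sensor.measure(lambda: 3.42)                  # one wake/measure cycle
left = sensor.transmit(lambda batch: False)   # cellular link down: backlog kept
```

The bounded deque mirrors the 24-day buffer: old records silently age out, and a failed transmission leaves everything in place for the next attempt.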

  7. Economic analysis of a randomized trial of academic detailing interventions to improve use of antihypertensive medications.

    PubMed

    Simon, Steven R; Rodriguez, Hector P; Majumdar, Sumit R; Kleinman, Ken; Warner, Cheryl; Salem-Schatz, Susanne; Miroshnik, Irina; Soumerai, Stephen B; Prosser, Lisa A

    2007-01-01

    The authors estimated the costs and cost savings of implementing a program of mailed practice guidelines and single-visit individual and group academic detailing interventions in a randomized controlled trial to improve the use of antihypertensive medications. Analyses took the perspective of the payer. The total costs of the mailed-guideline, group-detailing, and individual-detailing interventions were estimated at $1,000, $5,500, and $7,200, respectively, corresponding to changes in the average daily per-person drug costs of -$0.0558 (95% confidence interval, -$0.1365 to $0.0250) in the individual detailing intervention and -$0.0001 (95% confidence interval, -$0.0803 to $0.0801) in the group detailing intervention, compared with the mailed intervention. For all patients with incident hypertension in the individual detailing arm, the annual total drug cost savings were estimated at $21,711 (95% confidence interval, $53,131 savings to $9,709 cost increase). Information on the costs of academic detailing could assist health plan decision making in developing interventions to improve prescribing.
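
    The savings arithmetic works as follows: a change in average daily per-person drug cost scales by 365 days and by the cohort size to give an annual total. The cohort size below is hypothetical, chosen only to make the multiplication concrete; the per-day figure is the trial's individual-detailing point estimate.

```python
# Back-of-the-envelope check of the cost arithmetic above.
daily_change = -0.0558   # dollars per person per day (individual detailing arm)
cohort = 1000            # hypothetical number of incident-hypertension patients

annual_savings = -daily_change * 365 * cohort
print(f"annual drug cost savings: ${annual_savings:,.2f}")
```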

  8. A detailed procedure for the use of small-scale photography in land use classification

    NASA Technical Reports Server (NTRS)

    Vegas, P. L.

    1974-01-01

    A procedure developed to produce accurate land use maps from available high-altitude, small-scale photography in a cost-effective manner is presented. An alternative procedure, for use when the capability for updating the resultant land use map is not required, is also presented. The technical approach is discussed in detail, and personnel and equipment needs are analyzed. Accuracy percentages are listed, and costs are cited. The experiment's land use classification categories are explained, and a proposed national land use classification system is recommended.

  9. Interactive NCORP Map Details Community Research Sites | Division of Cancer Prevention

    Cancer.gov

    An interactive map of the NCI Community Oncology Research Program (NCORP) with detailed information on hundreds of community sites that take part in clinical trials is available on the NCORP website. The map shows NCORP community sites, minority/underserved community sites, and research bases.

  10. 78 FR 69669 - Intent To Conduct a Detailed Economic Impact Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

    ... From the Federal Register Online via the Government Publishing Office EXPORT-IMPORT BANK Intent To Conduct a Detailed Economic Impact Analysis This notice is to inform the public that the Export-Import... comments on this transaction by email to economic.impact@exim.gov or by mail to 811 Vermont Avenue...

  11. 78 FR 47317 - Intent To Conduct a Detailed Economic Impact Analysis

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-05

    ... From the Federal Register Online via the Government Publishing Office EXPORT-IMPORT BANK OF THE UNITED STATES Intent To Conduct a Detailed Economic Impact Analysis This notice is to inform the public... United States. Interested parties may submit comments on this transaction by email to...

  12. Characteristics of Doctoral Scientists and Engineers in the United States: 1999. Detailed Statistical Tables.

    ERIC Educational Resources Information Center

    Kang, Kelly H.

    This report presents detailed statistical tables that reflect the demographic and employment characteristics of doctoral degree holding scientists and engineers in the United States. The data were collected from the 1999 Survey of Doctorate Recipients (SDR) with the purpose of providing information to researchers and policymakers in their decision…

  13. PBF Control Building (PER619). Interior detail of control room's severe ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    PBF Control Building (PER-619). Interior detail of the control room's severe fuel damage instrument panel. Indicators provided real-time information about tests underway in the PBF reactor. Note the audio speaker. Date: May 2004. INEEL negative no. HD-41-7-4 - Idaho National Engineering Laboratory, SPERT-I & Power Burst Facility Area, Scoville, Butte County, ID

  14. Conceptual Distinctiveness Supports Detailed Visual Long-Term Memory for Real-World Objects

    ERIC Educational Resources Information Center

    Konkle, Talia; Brady, Timothy F.; Alvarez, George A.; Oliva, Aude

    2010-01-01

    Humans have a massive capacity to store detailed information in visual long-term memory. The present studies explored the fidelity of these visual long-term memory representations and examined how conceptual and perceptual features of object categories support this capacity. Observers viewed 2,800 object images with a different number of exemplars…

  15. Controlling Hay Fever Symptoms with Accurate Pollen Counts

    MedlinePlus

    Seasonal allergic rhinitis, known as hay fever, is caused by pollen carried in the air ...

  16. Digital system accurately controls velocity of electromechanical drive

    NASA Technical Reports Server (NTRS)

    Nichols, G. B.

    1965-01-01

    Digital circuit accurately regulates electromechanical drive mechanism velocity. The gain and phase characteristics of digital circuits are relatively unimportant. Control accuracy depends only on the stability of the input signal frequency.

  17. WARP: accurate retrieval of shapes using phase of fourier descriptors and time warping distance.

    PubMed

    Bartolini, Ilaria; Ciaccia, Paolo; Patella, Marco

    2005-01-01

    Effective and efficient retrieval of similar shapes from large image databases is still a challenging problem in spite of the high relevance that shape information can have in describing image contents. In this paper, we propose a novel Fourier-based approach, called WARP, for matching and retrieving similar shapes. The unique characteristics of WARP are the exploitation of the phase of Fourier coefficients and the use of the Dynamic Time Warping (DTW) distance to compare shape descriptors. While phase information provides a more accurate description of object boundaries than using only the amplitude of Fourier coefficients, the DTW distance permits us to accurately match images even in the presence of (limited) phase shiftings. In terms of classical precision/recall measures, we experimentally demonstrate that WARP can gain, say, up to 35 percent in precision at a 20 percent recall level with respect to Fourier-based techniques that use neither phase nor DTW distance.
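
    The two ingredients of WARP named above, phase-preserving Fourier descriptors and a dynamic time warping comparison, can be sketched as follows. This is a simplified illustration rather than the authors' implementation: a practical system also normalizes the descriptors for scale, rotation, and starting point, which is omitted here.

```python
import numpy as np

def fourier_descriptors(boundary, k=8):
    """Complex Fourier descriptors of a closed (N, 2) boundary, keeping phase."""
    z = boundary[:, 0] + 1j * boundary[:, 1]     # boundary as complex signal
    F = np.fft.fft(z) / len(z)
    idx = np.r_[0:k + 1, -k:0]                   # low frequencies, both signs
    return F[idx]

def dtw(a, b):
    """Dynamic time warping distance between two complex descriptor sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Toy shapes: a unit circle versus an ellipse with the same sampling
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
ellipse = np.column_stack([1.2 * np.cos(theta), 0.8 * np.sin(theta)])
d = dtw(fourier_descriptors(circle), fourier_descriptors(ellipse))
```

Because DTW allows (limited) stretching of the alignment between the two descriptor sequences, small phase shifts between otherwise similar boundaries do not blow up the distance the way a rigid Euclidean comparison would.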

  18. Characterizing accuracy of total hemoglobin recovery using contrast-detail analysis in 3D image-guided near infrared spectroscopy with the boundary element method

    PubMed Central

    Ghadyani, Hamid R.; Srinivasan, Subhadra; Pogue, Brian W.; Paulsen, Keith D.

    2010-01-01

    The quantification of total hemoglobin concentration (HbT) obtained from multi-modality image-guided near infrared spectroscopy (IG-NIRS) was characterized using the boundary element method (BEM) for 3D image reconstruction. Multi-modality IG-NIRS systems use a priori information to guide the reconstruction process. While this has been shown to improve resolution, the effect on quantitative accuracy is unclear. Here, through systematic contrast-detail analysis, the fidelity of IG-NIRS in quantifying HbT was examined using 3D simulations. These simulations show that HbT could be recovered for medium sized (20 mm in 100 mm total diameter) spherical inclusions with an average error of 15%, for the physiologically relevant situation of 2:1 or higher contrast between background and inclusion. Using partial 3D volume meshes to reduce the ill-posed nature of the image reconstruction, inclusions as small as 14 mm could be accurately quantified with less than 15% error, for contrasts of 1.5 or higher. This suggests that 3D IG-NIRS provides quantitatively accurate results for sizes seen early in the treatment cycle of patients undergoing neoadjuvant chemotherapy, when the tumors are larger than 30 mm. PMID:20720975

  19. Accurate tracking of high dynamic vehicles with translated GPS

    NASA Astrophysics Data System (ADS)

    Blankshain, Kenneth M.

    The GPS concept and the translator processing system (TPS), which were developed for accurate and cost-effective tracking of various types of high-dynamic expendable vehicles, are described. The technique used by the TPS to accomplish very accurate high-dynamic tracking is presented. Automatic frequency control and fast Fourier transform processes are combined to track 100 g acceleration and 100 g/s jerk with a 1-sigma velocity measurement error of less than 1 ft/sec.

  20. Accurate Alignment of Plasma Channels Based on Laser Centroid Oscillations

    SciTech Connect

    Gonsalves, Anthony; Nakamura, Kei; Lin, Chen; Osterhoff, Jens; Shiraishi, Satomi; Schroeder, Carl; Geddes, Cameron; Toth, Csaba; Esarey, Eric; Leemans, Wim

    2011-03-23

    A technique has been developed to accurately align a laser beam through a plasma channel by minimizing the shift in laser centroid and angle at the channel output. If only the shift in centroid or angle is measured, accurate alignment is still provided by minimizing laser centroid motion at the channel exit as the channel properties are scanned. The improvement in alignment accuracy provided by this technique is important for minimizing electron beam pointing errors in laser plasma accelerators.