Science.gov

Sample records for accurate detailed information

  1. Accurate fundamental parameters and detailed abundance patterns from spectroscopy of 93 solar-type Kepler targets

    NASA Astrophysics Data System (ADS)

    Bruntt, H.; Basu, S.; Smalley, B.; Chaplin, W. J.; Verner, G. A.; Bedding, T. R.; Catala, C.; Gazzano, J.-C.; Molenda-Żakowicz, J.; Thygesen, A. O.; Uytterhoeven, K.; Hekker, S.; Huber, D.; Karoff, C.; Mathur, S.; Mosser, B.; Appourchaux, T.; Campante, T. L.; Elsworth, Y.; García, R. A.; Handberg, R.; Metcalfe, T. S.; Quirion, P.-O.; Régulo, C.; Roxburgh, I. W.; Stello, D.; Christensen-Dalsgaard, J.; Kawaler, S. D.; Kjeldsen, H.; Morris, R. L.; Quintana, E. V.; Sanderfer, D. T.

    2012-06-01

    We present a detailed spectroscopic study of 93 solar-type stars that are targets of the NASA/Kepler mission and provide a detailed chemical composition for each target. We find that the overall metallicity is well represented by Fe lines. Relative abundances of the light elements (CNO) and the α elements are generally higher for low-metallicity stars. Our spectroscopic analysis benefits from surface gravities measured accurately through asteroseismic analysis of the Kepler light curves. The asteroseismic log g is accurate to better than 0.03 dex and is held fixed in the analysis. We compare our Teff determination with a recent colour calibration of VT-KS [TYCHO V magnitude minus Two Micron All Sky Survey (2MASS) KS magnitude] and find very good agreement, with a scatter of only 80 K, showing that this index can be used for other nearby Kepler targets. The asteroseismic log g values agree very well with the classical determination using Fe I-Fe II ionization balance, although we find a small systematic offset of 0.08 dex (the asteroseismic log g values are lower). The abundance patterns of the metals, α elements and light elements (CNO) show that a simple scaling by [Fe/H] adequately represents the metallicity of the stars, except for stars with metallicity below -0.3, where α-enhancement becomes important. Since this affects only a very small fraction of the Kepler sample, we recommend that a simple scaling with [Fe/H] be employed in asteroseismic analyses of large ensembles of solar-type stars.

  2. Approaching system equilibrium with accurate or not accurate feedback information in a two-route system

    NASA Astrophysics Data System (ADS)

    Zhao, Xiao-mei; Xie, Dong-fan; Li, Qi

    2015-02-01

    With the development of intelligent transport systems, advanced information feedback strategies have been developed to reduce traffic congestion and enhance capacity. Previous strategies, however, provide accurate information to travelers, and our simulation results show that accurate information can have negative effects, especially when the feedback is delayed: travelers prefer the route reported to be in the best condition, but delayed information reflects past rather than current traffic conditions, so travelers make poor routing decisions, reducing capacity, increasing oscillations, and driving the system away from equilibrium. To avoid these negative effects, bounded rationality is taken into account by introducing a boundedly rational threshold BR: when the difference between the two routes is less than BR, the routes are chosen with equal probability. Bounded rationality improves efficiency in terms of capacity, oscillation, and the gap from system equilibrium.
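
A minimal sketch of the boundedly rational route choice described above; the cost values and threshold are illustrative assumptions, not figures from the paper:

```python
import random

def choose_route(cost_a: float, cost_b: float, br_threshold: float) -> str:
    """Boundedly rational route choice: if the reported travel costs of the
    two routes differ by less than the threshold BR, pick either route with
    equal probability; otherwise take the cheaper route."""
    if abs(cost_a - cost_b) < br_threshold:
        return random.choice(["A", "B"])
    return "A" if cost_a < cost_b else "B"

# With a cost gap larger than BR, the cheaper route is always chosen.
print(choose_route(10.0, 15.0, 2.0))  # → A
# With a gap inside BR, either route may be chosen, damping oscillations.
print(choose_route(10.0, 10.5, 2.0))
```

The threshold prevents the whole population from piling onto whichever route the (possibly stale) feedback currently favors, which is the mechanism the abstract credits for the improved stability.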

  3. 78 FR 34604 - Submitting Complete and Accurate Information

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-10

    ... COMMISSION 10 CFR Part 50 Submitting Complete and Accurate Information AGENCY: Nuclear Regulatory Commission... accurate information as would a licensee or an applicant for a license.'' DATES: Submit comments by August... may submit comments by any of the following methods (unless this document describes a different...

  4. DETAIL OF PLAQUE WITH ADDITIONAL DESIGN AND CONSTRUCTION INFORMATION, SOUTHEAST ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OF PLAQUE WITH ADDITIONAL DESIGN AND CONSTRUCTION INFORMATION, SOUTHEAST ABUTMENT - Connecticut Avenue Bridge, Spans Rock Creek & Potomac Parkway at Connecticut Avenue, Washington, District of Columbia, DC

  5. A review of the kinetic detail required for accurate predictions of normal shock waves

    NASA Technical Reports Server (NTRS)

    Muntz, E. P.; Erwin, Daniel A.; Pham-Van-diep, Gerald C.

    1991-01-01

    Several aspects of the kinetic models used in the collision phase of Monte Carlo direct simulations have been studied. Accurate predictions of the molecular velocity distribution function require a significantly larger number of computational cells per maximum-slope shock thickness than predictions of macroscopic properties do. The shape of the highly repulsive portion of the interatomic potential for argon is not well modeled by conventional interatomic potentials; because this portion of the potential controls high-Mach-number shock thickness predictions, the energetic repulsive portion of interatomic or intermolecular potentials must be specified with care for correct modeling of nonequilibrium flows at high temperatures. For inverse power potentials, comparison with simulations employing differential scattering in the collision phase shows that the variable hard sphere scattering assumption accurately predicts the macroscopic properties in shock waves. Velocity distribution functions, however, are not well predicted by the variable hard sphere scattering model for softer potentials at higher Mach numbers.

  6. Detailed and Highly Accurate 3d Models of High Mountain Areas by the Macs-Himalaya Aerial Camera Platform

    NASA Astrophysics Data System (ADS)

    Brauchle, J.; Hein, D.; Berger, R.

    2015-04-01

    Remote sensing in areas with extreme altitude differences is particularly challenging. In high mountain areas specifically, steep slopes result in reduced ground pixel resolution and degraded DEM quality, and exceptionally high brightness differences can exceed what the sensors are able to image. Nevertheless, detailed information about mountainous regions is highly relevant: time and again, glacier lake outburst floods (GLOFs) and debris avalanches claim dozens of victims, and glaciers are sensitive to climate change and must be carefully monitored. Very detailed and accurate 3D maps provide a basic tool for analyzing natural hazards and monitoring glacier surfaces in high mountain areas, yet the desired accuracies are often not achieved. For this reason, the DLR Institute of Optical Sensor Systems has developed a new aerial camera, the MACS-Himalaya. The measuring unit comprises four camera modules with an overall aperture angle of 116° perpendicular to the direction of flight. A High Dynamic Range (HDR) mode was introduced so that bright areas such as sun-flooded snow and dark areas such as shaded stone can be imaged within the same scene. In 2014, a measuring survey was performed on the Nepalese side of the Himalayas, with the remote sensing system carried by a Stemme S10 motor glider. Amongst other targets, the Seti Valley, Kali-Gandaki Valley and the Mt. Everest/Khumbu Region were imaged at altitudes up to 9,200 m. Products such as dense point clouds, DSMs and true orthomosaics with a ground pixel resolution of up to 15 cm were produced. Special challenges in surveying high mountain areas, approaches to resolving them, the camera system and the state of evaluation are presented with examples.

  7. 19. DETAIL OF AIR FORCE WEATHER INFORMATION TERMINAL AND CHART ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    19. DETAIL OF AIR FORCE WEATHER INFORMATION TERMINAL AND CHART RECORDER LOCATED IMMEDIATELY NORTH OF CONSOLE IN PHOTOS A-15 THROUGH A-18. - Vandenberg Air Force Base, Space Launch Complex 3, Launch Operations Building, Napa & Alden Roads, Lompoc, Santa Barbara County, CA

  8. The level of detail required in a deformable phantom to accurately perform quality assurance of deformable image registration

    NASA Astrophysics Data System (ADS)

    Saenz, Daniel L.; Kim, Hojin; Chen, Josephine; Stathakis, Sotirios; Kirby, Neil

    2016-09-01

    The primary purpose of the study was to determine how detailed deformable image registration (DIR) phantoms need to be in order to adequately simulate human anatomy and accurately assess the quality of DIR algorithms. In particular, how many distinct tissues are required in a phantom to simulate complex human anatomy? Pelvis and head-and-neck patient CT images were used as virtual phantoms, with two data sets from each site. The virtual phantoms were warped to create two pairs, each consisting of an undeformed and a deformed image. Otsu’s method was employed to create additional segmented image pairs of n distinct soft-tissue CT number ranges (fat, muscle, etc.). A realistic noise image was added to each image. Deformations were applied in MIM Software (MIM) and Velocity deformable multi-pass (DMP) and compared with the known warping. Images with more simulated tissue levels exhibit more contrast, enabling more accurate results. Deformation error (the magnitude of the vector difference between the known and predicted deformations) was used as the metric to evaluate how many CT number gray levels a phantom needs in order to serve as a realistic patient proxy. The mean deformation error stabilized at three soft-tissue levels for both Velocity DMP and MIM, though MIM exhibited a persistent difference in accuracy between the discrete images and the unprocessed image pair. A minimum detail of three levels thus makes a phantom a realistic patient proxy for use with the Velocity and MIM deformation algorithms.

  9. The level of detail required in a deformable phantom to accurately perform quality assurance of deformable image registration.

    PubMed

    Saenz, Daniel L; Kim, Hojin; Chen, Josephine; Stathakis, Sotirios; Kirby, Neil

    2016-09-01

    The primary purpose of the study was to determine how detailed deformable image registration (DIR) phantoms need to be in order to adequately simulate human anatomy and accurately assess the quality of DIR algorithms. In particular, how many distinct tissues are required in a phantom to simulate complex human anatomy? Pelvis and head-and-neck patient CT images were used as virtual phantoms, with two data sets from each site. The virtual phantoms were warped to create two pairs, each consisting of an undeformed and a deformed image. Otsu's method was employed to create additional segmented image pairs of n distinct soft-tissue CT number ranges (fat, muscle, etc.). A realistic noise image was added to each image. Deformations were applied in MIM Software (MIM) and Velocity deformable multi-pass (DMP) and compared with the known warping. Images with more simulated tissue levels exhibit more contrast, enabling more accurate results. Deformation error (the magnitude of the vector difference between the known and predicted deformations) was used as the metric to evaluate how many CT number gray levels a phantom needs in order to serve as a realistic patient proxy. The mean deformation error stabilized at three soft-tissue levels for both Velocity DMP and MIM, though MIM exhibited a persistent difference in accuracy between the discrete images and the unprocessed image pair. A minimum detail of three levels thus makes a phantom a realistic patient proxy for use with the Velocity and MIM deformation algorithms. PMID:27494827
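
The deformation-error metric used in this study is simply the per-voxel magnitude of the vector difference between the known and predicted deformation fields. A minimal sketch (array shapes and names are illustrative):

```python
import numpy as np

def deformation_error(known: np.ndarray, predicted: np.ndarray) -> np.ndarray:
    """Per-voxel deformation error: magnitude of the vector difference
    between known and predicted deformation fields. Both arrays have
    shape (..., 3), the last axis holding the (dx, dy, dz) components."""
    return np.linalg.norm(known - predicted, axis=-1)

# A predicted field off by 1 mm in x everywhere has a uniform 1 mm error.
known = np.zeros((4, 4, 4, 3))
predicted = known.copy()
predicted[..., 0] += 1.0
print(deformation_error(known, predicted).mean())  # → 1.0
```

Stabilization of the mean of this quantity as the number of segmented tissue levels grows is what defines the three-level threshold reported above.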

  10. Ultra-accurate collaborative information filtering via directed user similarity

    NASA Astrophysics Data System (ADS)

    Guo, Q.; Song, W.-J.; Liu, J.-G.

    2014-07-01

    A key challenge of collaborative filtering (CF) is how to obtain reliable and accurate results from peers' recommendations. Since the similarities from small-degree users to large-degree users are larger than those in the opposite direction, traditional second-order CF algorithms extensively recommend the selections of large-degree users. By considering the direction of user similarity and using second-order correlations to suppress the influence of mainstream preferences, we present a directed second-order CF (HDCF) algorithm that specifically addresses the accuracy and diversity of CF. Numerical results for two benchmark data sets, MovieLens and Netflix, show that the new algorithm outperforms state-of-the-art CF algorithms in accuracy. Compared with the random-walk-based CF algorithm of Liu et al. (Int. J. Mod. Phys. C, 20 (2009) 285), the average ranking score reaches 0.0767 and 0.0402, an improvement of 27.3% and 19.1% for MovieLens and Netflix, respectively. Diversity, precision, and recall are also greatly enhanced. Without relying on any context-specific information, tuning the similarity direction of CF algorithms yields accurate and diverse recommendations. This work suggests that user similarity direction is an important factor in improving personalized recommendation performance.
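
The asymmetry the abstract relies on can be made concrete with a similarity normalized by the *source* user's degree, so that sim(u→v) ≠ sim(v→u). This is only an illustration of the directionality, not the paper's exact HDCF formula:

```python
def directed_similarity(items_u: set, items_v: set) -> float:
    """Similarity from user u to user v: co-selected items normalized by
    u's own degree. Asymmetric, so sim(u->v) != sim(v->u) in general."""
    if not items_u:
        return 0.0
    return len(items_u & items_v) / len(items_u)

small = {1, 2}          # small-degree user
large = {1, 2, 3, 4}    # large-degree user
print(directed_similarity(small, large))  # → 1.0 (small -> large is larger)
print(directed_similarity(large, small))  # → 0.5
```

The small-degree user looks maximally similar to the large-degree user but not vice versa, which is exactly the imbalance that makes undirected second-order CF over-recommend popular users' selections.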

  11. Automatic and Accurate Shadow Detection Using Near-Infrared Information.

    PubMed

    Rüfenacht, Dominic; Fredembach, Clément; Süsstrunk, Sabine

    2014-08-01

    We present a method to automatically detect shadows in a fast and accurate manner by taking advantage of the inherent sensitivity of digital camera sensors to the near-infrared (NIR) part of the spectrum. Dark objects, which confound many shadow detection algorithms, often have much higher reflectance in the NIR. We can thus build an accurate shadow candidate map based on image pixels that are dark in both the visible and NIR representations. We further refine the shadow map by incorporating ratios of the visible to the NIR image, based on the observation that commonly encountered light sources have very distinct spectra in the NIR band. The results are validated on a new database containing visible/NIR images for a large variety of real-world shadow-casting illumination conditions, as well as manually labeled shadow ground truth. Both quantitative and qualitative evaluations show that our method outperforms current state-of-the-art shadow detection algorithms in accuracy and computational efficiency.
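
The shadow-candidate step described above reduces to a joint darkness test across the visible and NIR channels. A minimal sketch with an illustrative threshold (the paper's full method additionally refines the map with visible/NIR ratios):

```python
import numpy as np

def shadow_candidates(visible: np.ndarray, nir: np.ndarray,
                      dark_thresh: float = 0.3) -> np.ndarray:
    """Binary shadow-candidate map: pixels dark in BOTH the visible
    intensity image and the NIR image (float arrays in [0, 1]).
    Dark objects that are bright in the NIR are correctly rejected."""
    return (visible < dark_thresh) & (nir < dark_thresh)

vis = np.array([[0.1, 0.1], [0.8, 0.2]])
nir = np.array([[0.1, 0.9], [0.8, 0.1]])
# (0,0) dark in both -> candidate; (0,1) dark object bright in NIR -> rejected.
print(shadow_candidates(vis, nir))
```

The NIR channel is what disambiguates a genuinely shaded pixel from a dark-colored object, the failure case of purely visible-light detectors.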

  12. A new accurate pill recognition system using imprint information

    NASA Astrophysics Data System (ADS)

    Chen, Zhiyuan; Kamata, Sei-ichiro

    2013-12-01

    Great achievements in modern medicine have benefited human beings, but they have also brought about explosive growth in the number of pharmaceuticals on the market. In daily life, pharmaceuticals can be confusing when found unlabeled. In this paper, we propose an automatic pill recognition technique to solve this problem. It operates mainly on the imprint features of the pills, which are extracted by the proposed MSWT (modified stroke width transform) and described by WSC (weighted shape context). Experiments show that the proposed pill recognition method reaches an accuracy rate of up to 92.03% within the top 5 ranks when classifying more than 10,000 query pill images into around 2,000 categories.

  13. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 2 2011-01-01 2011-01-01 false Reasonable steps to assure information is... Reasonable Steps Commission Will Take To Assure Information It Discloses Is Accurate, and That Disclosure Is... Administers § 1101.32 Reasonable steps to assure information is accurate. (a) The Commission considers...

  14. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Reasonable steps to assure information is... Reasonable Steps Commission Will Take To Assure Information It Discloses Is Accurate, and That Disclosure Is... Administers § 1101.32 Reasonable steps to assure information is accurate. (a) The Commission considers...

  15. Aggregate versus Individual-Level Sexual Behavior Assessment: How Much Detail Is Needed to Accurately Estimate HIV/STI Risk?

    ERIC Educational Resources Information Center

    Pinkerton, Steven D.; Galletly, Carol L.; McAuliffe, Timothy L.; DiFranceisco, Wayne; Raymond, H. Fisher; Chesson, Harrell W.

    2010-01-01

    The sexual behaviors of HIV/sexually transmitted infection (STI) prevention intervention participants can be assessed on a partner-by-partner basis, in aggregate (i.e., total numbers of sex acts, collapsed across partners), or using a combination of these two methods (e.g., assessing five partners in detail and any remaining partners in aggregate).…

  16. 17 CFR 240.17Ad-10 - Prompt posting of certificate detail to master securityholder files, maintenance of accurate...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... recordkeeping transfer agents, maintenance of current control book, retention of certificate detail and “buy-in..., ten business days; and (iii) With respect to all other recordkeeping transfer agents, five business... security transferred or issued. For the purposes of this paragraph, “promptly” means within two...

  17. 17 CFR 240.17Ad-10 - Prompt posting of certificate detail to master securityholder files, maintenance of accurate...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... recordkeeping transfer agents, maintenance of current control book, retention of certificate detail and “buy-in... or the number of shares and related market value of equity securities comprising any buy-in executed... (ii) The reason for the buy-in. (d) Every co-transfer agent shall respond promptly to all...

  18. 17 CFR 240.17Ad-10 - Prompt posting of certificate detail to master securityholder files, maintenance of accurate...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... recordkeeping transfer agents, maintenance of current control book, retention of certificate detail and “buy-in... or the number of shares and related market value of equity securities comprising any buy-in executed... (ii) The reason for the buy-in. (d) Every co-transfer agent shall respond promptly to all...

  19. 17 CFR 240.17Ad-10 - Prompt posting of certificate detail to master securityholder files, maintenance of accurate...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... recordkeeping transfer agents, maintenance of current control book, retention of certificate detail and “buy-in... or the number of shares and related market value of equity securities comprising any buy-in executed... (ii) The reason for the buy-in. (d) Every co-transfer agent shall respond promptly to all...

  20. 17 CFR 240.17Ad-10 - Prompt posting of certificate detail to master securityholder files, maintenance of accurate...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... certificate detail and “buy-in” of physical over-issuance. (a)(1) Every recordkeeping transfer agent shall... or the number of shares and related market value of equity securities comprising any buy-in executed... (ii) The reason for the buy-in. (d) Every co-transfer agent shall respond promptly to all...

  1. Tomato Analyzer: a useful software application to collect accurate and detailed morphological and colorimetric data from two-dimensional objects.

    PubMed

    Rodríguez, Gustavo R; Moyseenko, Jennifer B; Robbins, Matthew D; Morejón, Nancy Huarachi; Francis, David M; van der Knaap, Esther

    2010-03-16

    Measuring the fruit morphology and color traits of vegetable and fruit crops in an objective and reproducible way is important for detailed phenotypic analyses of these traits. Tomato Analyzer (TA) is a software program that measures 37 attributes related to two-dimensional shape in a semi-automatic and reproducible manner. Many of these attributes, such as the angles at the distal and proximal ends of the fruit and the areas of indentation, are difficult to quantify manually. The attributes are organized in ten categories within the software: Basic Measurement, Fruit Shape Index, Blockiness, Homogeneity, Proximal Fruit End Shape, Distal Fruit End Shape, Asymmetry, Internal Eccentricity, Latitudinal Section and Morphometrics. The last category requires neither prior knowledge nor predetermined notions of the shape attributes, so morphometric analysis offers an unbiased option that may be better adapted to high-throughput analyses than attribute analysis. TA also offers the Color Test application, designed to collect color measurements from scanned images and to allow scanning devices to be calibrated using color standards. TA provides several options to export and analyze shape attribute, morphometric, and color data. The data may be exported to an Excel file in batch mode (more than 100 images at one time) or exported for individual images. The user can choose between output that displays the average for each attribute over the objects in each image (including standard deviation), and output that displays the attribute values for each object in the image. TA has been a valuable and effective tool for identifying and confirming tomato fruit shape Quantitative Trait Loci (QTL), as well as for performing in-depth analyses of the effect of key fruit shape genes on plant morphology. TA can also be used to objectively classify fruit into various shape categories. Lastly, fruit shape and color traits in other plant species as well as other plant organs such as leaves and seeds

  2. How Iron-Containing Proteins Control Dioxygen Chemistry: A Detailed Atomic Level Description Via Accurate Quantum Chemical and Mixed Quantum Mechanics/Molecular Mechanics Calculations.

    SciTech Connect

    Friesner, Richard A.; Baik, Mu-Hyun; Gherman, Benjamin F.; Guallar, Victor; Wirstam, Maria E.; Murphy, Robert B.; Lippard, Stephen J.

    2003-03-01

    Over the past several years, rapid advances in computational hardware, quantum chemical methods, and mixed quantum mechanics/molecular mechanics (QM/MM) techniques have made it possible to model accurately the interaction of ligands with metal-containing proteins at an atomic level of detail. In this paper, we describe the application of our computational methodology, based on density functional theory (DFT) quantum chemical methods, to two diiron-containing proteins that interact with dioxygen: methane monooxygenase (MMO) and hemerythrin (Hr). Although their active sites are structurally related, their biological functions differ substantially: MMO is an enzyme found in methanotrophic bacteria that hydroxylates aliphatic C-H bonds, whereas Hr is a dioxygen carrier protein used by a number of marine invertebrates. Quantitative descriptions of the structures and energetics of the key intermediates and transition states involved in the reaction with dioxygen are provided, allowing the two mechanisms to be compared and contrasted in detail. An in-depth understanding is provided of how the chemical identity of the first ligand coordination shell, structural features, and the electrostatic and van der Waals interactions of more distant shells control ligand binding and reactive chemistry, affording a systematic analysis of how iron-containing proteins process dioxygen. Extensive contact with experiment is made in both systems, and a remarkable degree of accuracy and robustness of the calculations is obtained from both a qualitative and a quantitative perspective.

  3. Effects of detailed soil spatial information on watershed modeling across different model scales

    NASA Astrophysics Data System (ADS)

    Quinn, Trevor; Zhu, A.-Xing; Burt, James E.

    2005-12-01

    Hydro-ecological modelers often use the spatial variation of soil information derived from conventional soil surveys when simulating hydro-ecological processes over watersheds at mesoscale (10-100 km²). Conventional soil surveys are not designed to provide the same level of spatial detail as the terrain and vegetation inputs derived from digital terrain analysis and remote sensing techniques, so soil property layers derived from them are often incompatible in scale with detailed terrain and remotely sensed data. The objective of this research is to examine the effect of this scale incompatibility by comparing simulations of watershed processes based on the conventional soil map with simulations based on detailed soil information across different simulation scales. The detailed soil spatial information was derived using a GIS (geographical information system), expert knowledge, and a fuzzy-logic-based predictive mapping approach (Soil Land Inference Model, SoLIM). The Regional Hydro-Ecological Simulation System (RHESSys) is used to simulate two watershed processes: net photosynthesis and stream flow. The difference between the simulation based on the conventional soil map and that based on the detailed predictive soil map at a given simulation scale is taken to be the effect of scale incompatibility between the conventional soil data and the rest of the (more detailed) data layers at that scale. Two modeling approaches were taken in this study: the lumped parameter approach and the distributed parameter approach. The results over two small watersheds indicate that the effect does not necessarily increase or decrease monotonically as the simulation scale becomes finer or coarser. For a given watershed there seems to be a fixed scale at which the effect is consistently low for the simulated processes with both the lumped parameter approach and the

  4. Restoring Detailed Geomagnetic and Environmental Information from Continuous Sediment Paleomagnetic Measurement through Optimised Deconvolution

    NASA Astrophysics Data System (ADS)

    Xuan, C.; Oda, H.

    2013-12-01

    The development of pass-through cryogenic magnetometers has greatly improved our efficiency in collecting paleomagnetic and rock magnetic data from continuous samples such as sediment half-core sections and u-channels. During a pass-through measurement, the magnetometer sensor response inevitably convolves with the remanence of the continuous sample. This convolution smooths the measurement and can seriously distort the paleomagnetic signal owing to differences in sensor response along the different measurement axes. Previous studies have demonstrated that deconvolution can effectively overcome the convolution effect of the sensor response and improve the resolution of continuous paleomagnetic data. However, the lack of an easy-to-use deconvolution tool and the difficulty of accurately measuring the magnetometer sensor response have greatly hindered its application. Here, we acquire a reliable estimate of the sensor response of a pass-through cryogenic magnetometer at Oregon State University by integrating repeated measurements of a magnetic point source. The point source is fixed in the center of a well-shaped polycarbonate cube with 5 mm edge length and measured at every 1 mm position along a 40-cm interval, with the cube placed at each of the 5 × 5 grid positions over a 2 × 2 cm² area on the cross section. The acquired sensor response reveals that the cross terms (i.e. the response of the pick-up coil for one axis to magnetic signal along the other axes), often omitted in previous deconvolution practice, are clearly not negligible. Utilizing this detailed estimate of the magnetometer sensor response, we present UDECON, a graphical tool for convenient application of optimised deconvolution based on Akaike's Bayesian Information Criterion (ABIC) minimization (Oda and Shibuya, 1996). UDECON directly reads a paleomagnetic measurement file and allows the user to view, compare, and save data before and after deconvolution. Optimised deconvolution
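
Pass-through measurement is the convolution of the sample's remanence with the sensor response, so recovery is a regularized inverse problem. A toy single-axis sketch of such a deconvolution (the real problem involves three axes with cross terms, and UDECON chooses the regularization by ABIC minimization rather than a fixed weight as assumed here):

```python
import numpy as np

def deconvolve(measured: np.ndarray, response: np.ndarray,
               lam: float = 1e-3) -> np.ndarray:
    """Recover a signal m from measured = R @ m, where R applies the
    sensor-response kernel, via Tikhonov-regularized least squares:
    minimize |R m - d|^2 + lam * |m|^2."""
    n = len(measured)
    half = len(response) // 2
    # Build the banded matrix R: each row is the shifted sensor response.
    R = np.zeros((n, n))
    for i in range(n):
        for j, r in enumerate(response):
            k = i + j - half
            if 0 <= k < n:
                R[i, k] = r
    return np.linalg.solve(R.T @ R + lam * np.eye(n), R.T @ measured)

# A remanence spike smoothed by a symmetric sensor response kernel.
true = np.zeros(50)
true[25] = 1.0
response = np.array([0.25, 0.5, 0.25])
measured = np.convolve(true, response, mode="same")
estimate = deconvolve(measured, response)
print(int(np.argmax(estimate)))  # → 25: the spike is restored at its true position
```

The regularization term is what keeps the inversion stable where the sensor response suppresses the signal, which is the same trade-off the ABIC criterion tunes automatically.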

  5. PC-based Multiple Information System Interface (PC/MISI) detailed design and implementation plan

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Hall, Philip P.

    1985-01-01

    The design plan for the personal computer multiple information system interface (PC/MISI) project is discussed. The document is intended to be used as a blueprint for the implementation of the system. Each component is described in the detail necessary to allow programmers to implement the system. A description of the system data flow and system file structures is given.

  6. Differential contribution of visual and auditory information to accurately predict the direction and rotational motion of a visual stimulus.

    PubMed

    Park, Seoung Hoon; Kim, Seonjin; Kwon, MinHyuk; Christou, Evangelos A

    2016-03-01

    Visual and auditory information are critical for perception and enhance an individual's ability to respond accurately to a stimulus. However, it is unknown whether visual and auditory information contribute differentially to identifying the direction and rotational motion of a stimulus. The purpose of this study was to determine the ability of an individual to accurately predict the direction and rotational motion of a stimulus from visual and auditory information. We recruited 9 expert table-tennis players and used the table-tennis service as our experimental model. Participants watched recorded services with different levels of visual and auditory information, and the goal was to anticipate the direction of the service (left or right) and its rotational motion (topspin, sidespin, or cut). We recorded responses and quantified two outcomes: (i) directional accuracy and (ii) rotational motion accuracy, each defined as the number of accurate predictions relative to the total number of trials. The participants' ability to predict the direction of the service accurately increased with additional visual information but not with auditory information. In contrast, their ability to predict the rotational motion of the service accurately increased when auditory information was added to visual information, but not with additional visual information alone. In conclusion, visual information enhances an individual's ability to accurately predict the direction of a stimulus, whereas additional auditory information enhances the ability to accurately predict its rotational motion.

  7. The utility of accurate mass and LC elution time information in the analysis of complex proteomes

    SciTech Connect

    Norbeck, Angela D.; Monroe, Matthew E.; Adkins, Joshua N.; Anderson, Kevin K.; Daly, Don S.; Smith, Richard D.

    2005-08-01

    Theoretical tryptic digests of all predicted proteins from the genomes of three organisms of varying complexity were evaluated for the specificity and possible utility of combined peptide accurate mass and predicted LC normalized elution time (NET) information. The uniqueness of each peptide was evaluated using its combined mass (+/- 5 ppm and 1 ppm) and NET value (no constraint, +/- 0.05 and 0.01 on a 0-1 NET scale). This set of peptides both underestimates the actual biological complexity, because it lacks specific modifications, and overestimates the expected complexity, since many proteins will not be present in the sample or observable on the mass spectrometer because of dynamic range limitations. Once a peptide is identified from an LC-MS/MS experiment, its mass and elution time represent a unique fingerprint for that peptide, and the uniqueness of that fingerprint relative to those of the other peptides present indicates the ability to confidently identify the peptide from accurate mass and NET measurements alone. These measurements can be made in a high-throughput manner using HPLC coupled with high-resolution MS. Results show that for organisms with comparatively small proteomes, such as Deinococcus radiodurans, modest mass and elution time accuracies are generally adequate for peptide identification. For more complex proteomes, increasingly accurate measurements are required. However, the majority of proteins should be uniquely identifiable by using LC-MS with mass accuracies within +/- 1 ppm and elution time measurements within +/- 0.01 NET.
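
The uniqueness test the authors describe can be sketched directly: a peptide is confidently identifiable if no other candidate falls inside both its mass tolerance (in ppm) and its NET tolerance window. The peptide masses and NET values below are illustrative, not from the paper:

```python
def is_unique(target: tuple, peptides: list,
              ppm_tol: float = 1.0, net_tol: float = 0.01) -> bool:
    """True if no OTHER peptide lies within +/- ppm_tol of the target's
    mass AND +/- net_tol of its normalized elution time (0-1 scale).
    Each peptide is a (mass, net) tuple."""
    mass, net = target
    for other_mass, other_net in peptides:
        if (other_mass, other_net) == (mass, net):
            continue  # skip the target itself
        ppm_diff = abs(other_mass - mass) / mass * 1e6
        if ppm_diff <= ppm_tol and abs(other_net - net) <= net_tol:
            return False
    return True

peptides = [(1500.7000, 0.42), (1500.7005, 0.80), (1500.9000, 0.42)]
# The near-isobaric twin elutes far away, so the fingerprint is still unique.
print(is_unique(peptides[0], peptides))  # → True
```

This shows why the two dimensions are complementary: the mass twin is resolved by elution time and the co-eluting peptide by mass, so tighter tolerances in either dimension shrink the conflict window multiplicatively.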

  8. Quantitatively mapping cellular viscosity with detailed organelle information via a designed PET fluorescent probe.

    PubMed

    Liu, Tianyu; Liu, Xiaogang; Spring, David R; Qian, Xuhong; Cui, Jingnan; Xu, Zhaochao

    2014-01-01

    Viscosity is a fundamental physical parameter that influences diffusion in biological processes. The distribution of intracellular viscosity is highly heterogeneous, and it is challenging to obtain a full map of cellular viscosity with detailed organelle information. In this work, we report 1 as the first fluorescent viscosity probe that is able to quantitatively map cellular viscosity with detailed organelle information based on the PET mechanism. This probe exhibited a significant ratiometric fluorescence intensity enhancement as solvent viscosity increased. The emission intensity increase was attributed to combined effects of the inhibition of PET due to restricted conformational access (favorable for FRET, but not for PET), and the decreased PET efficiency caused by viscosity-dependent twisted intramolecular charge transfer (TICT). A full map of subcellular viscosity was successfully constructed via fluorescent ratiometric detection and fluorescence lifetime imaging; it was found that lysosomal regions in a cell possess the highest viscosity, followed by mitochondrial regions.

  9. Quantitatively Mapping Cellular Viscosity with Detailed Organelle Information via a Designed PET Fluorescent Probe

    NASA Astrophysics Data System (ADS)

    Liu, Tianyu; Liu, Xiaogang; Spring, David R.; Qian, Xuhong; Cui, Jingnan; Xu, Zhaochao

    2014-06-01

    Viscosity is a fundamental physical parameter that influences diffusion in biological processes. The distribution of intracellular viscosity is highly heterogeneous, and it is challenging to obtain a full map of cellular viscosity with detailed organelle information. In this work, we report 1 as the first fluorescent viscosity probe that is able to quantitatively map cellular viscosity with detailed organelle information based on the PET mechanism. This probe exhibited a significant ratiometric fluorescence intensity enhancement as solvent viscosity increased. The emission intensity increase was attributed to combined effects of the inhibition of PET due to restricted conformational access (favorable for FRET, but not for PET), and the decreased PET efficiency caused by viscosity-dependent twisted intramolecular charge transfer (TICT). A full map of subcellular viscosity was successfully constructed via fluorescent ratiometric detection and fluorescence lifetime imaging; it was found that lysosomal regions in a cell possess the highest viscosity, followed by mitochondrial regions.

  10. Capturing Accurate and Useful Information on Medication-Related Telenursing Triage Calls.

    PubMed

    Lake, R; Li, L; Baysari, M; Byrne, M; Robinson, M; Westbrook, J I

    2016-01-01

    Registered nurses providing telenursing triage and advice services record information on the medication-related calls they handle. However, the quality and consistency of these data have rarely been examined. Our aim was to examine medication-related calls made to the healthdirect advice service in November 2014, to assess their basic characteristics and how the data entry format influenced the information collected and its consistency. Registered nurses selected the patient question type from a range of categories and entered the medications involved in a free-text field. Medication names were manually extracted from the free-text fields. We also compared the selected patient question type with the free-text description of the call in order to gauge data consistency. Results showed that nurses provided patients with advice on medication-related queries in a timely manner (median call duration: 9 minutes). From 1835 calls, we were able to identify and classify 2156 medications into 384 generic names. However, in 204 cases (11.2% of calls) no medication name was entered, and a further 308 (15.0%) of the medication names entered were not identifiable. When we compared the selected patient question with the free-text description of calls, we found that these were consistent in 63.27% of cases. Telenursing triage and advice services provide a valuable resource to the public, offering quick and easily accessible advice. To support nurses in providing quality services and recording accurate information about queries, appropriate data entry format and design would be beneficial. PMID:27440292

  11. Conditional mutual inclusive information enables accurate quantification of associations in gene regulatory networks.

    PubMed

    Zhang, Xiujun; Zhao, Juan; Hao, Jin-Kao; Zhao, Xing-Ming; Chen, Luonan

    2015-03-11

    Mutual information (MI), a quantity describing the nonlinear dependence between two random variables, has been widely used to construct gene regulatory networks (GRNs). Despite its good performance, MI cannot separate direct regulations from indirect ones among genes. Although conditional mutual information (CMI) is able to identify direct regulations, it generally underestimates regulation strength, i.e., it may result in false negatives when inferring gene regulations. In this work, to overcome these problems, we propose a novel concept, namely conditional mutual inclusive information (CMI2), to describe the regulations between genes. Furthermore, with CMI2, we develop a new approach, namely CMI2NI (CMI2-based network inference), for reverse-engineering GRNs. In CMI2NI, CMI2 is used to quantify the mutual information between two genes given a third one by calculating the Kullback-Leibler divergence between the postulated distributions of including and excluding the edge between the two genes. The benchmark results on the GRNs from the DREAM challenge as well as the SOS DNA repair network in Escherichia coli demonstrate the superior performance of CMI2NI. Specifically, even for gene expression data with a small sample size, CMI2NI can not only infer the correct topology of the regulation networks but also accurately quantify the regulation strength between genes. As a case study, CMI2NI was also used to reconstruct cancer-specific GRNs using gene expression data from The Cancer Genome Atlas (TCGA). CMI2NI is freely accessible at http://www.comp-sysbio.org/cmi2ni.
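
    CMI2 itself is defined in this work via a Kullback-Leibler divergence between postulated distributions. As a simpler, related illustration (not the paper's CMI2), the conditional mutual information I(X;Y|Z) of jointly Gaussian variables can be computed from pairwise Pearson correlations through the partial correlation, showing how conditioning on a common regulator Z shrinks the apparent X-Y dependence:

```python
import math

def gaussian_cmi(r_xy, r_xz, r_yz):
    """I(X;Y|Z) in nats for jointly Gaussian variables, computed from
    pairwise correlations via the partial correlation of X and Y given Z:
    I = -0.5 * ln(1 - rho_xy.z**2)."""
    r_part = (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz**2) * (1 - r_yz**2))
    return -0.5 * math.log(1 - r_part**2)

# Hypothetical correlations: X and Y look dependent (r = 0.6), but much of
# that dependence is explained by a common regulator Z.
print(round(gaussian_cmi(0.6, 0.7, 0.7), 4))
```

Under these hypothetical numbers the conditional dependence is far weaker than the marginal correlation suggests, which is the qualitative effect CMI-style estimators exploit to prune indirect edges.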

  12. Fracture Network Characteristics Informed by Detailed Studies of Chlorinated Solvent Plumes in Sedimentary Rock Aquifers

    NASA Astrophysics Data System (ADS)

    Parker, B. L.; Chapman, S.

    2015-12-01

    Various numerical approaches have been used to simulate contaminant plumes in fractured porous rock, but the one that allows field and laboratory measurements to be most directly used as inputs to these models is the Discrete Fracture Network (DFN) Approach. To effectively account for fracture-matrix interactions, emphasis must be placed on identifying and parameterizing all of the fractures that participate substantially in groundwater flow and contaminant transport. High-resolution plume studies at four primary research sites, where chlorinated solvent plumes serve as long-term (several decades) tracer tests, provide insight into the density of the fracture network unattainable by conventional methods. Datasets include contaminant profiles from detailed VOC subsampling informed by continuous core logs, hydraulic head and transmissivity profiles, packer testing, and sensitive temperature logging methods in FLUTe™ lined holes. These show the presence of many more transmissive fractures, contrasting with observations of only a few flow zones per borehole obtained from conventional hydraulic tests, including flow metering in open boreholes. Incorporating many more fractures with a wider range of transmissivities is key to predicting contaminant migration. This new understanding of dense fracture networks, combined with matrix property measurements, has informed 2-D DFN flow and transport modelling using Fractran and HydroGeosphere to simulate plume characteristics ground-truthed by detailed field site plume characterization. These process-based simulations corroborate field findings that plumes in sedimentary rock after decades of transport show limited plume front distances and strong internal plume attenuation by diffusion, transverse dispersion and slow degradation. This successful application of DFN modeling informed by field-derived parameters demonstrates how the DFN Approach can be applied to other sites to inform plume migration rates and remedial efficacy.

  13. Detailed Clinical Models: Representing Knowledge, Data and Semantics in Healthcare Information Technology

    PubMed Central

    2014-01-01

    Objectives This paper presents an overview of the developmental effort in harmonizing clinical knowledge modeling using the Detailed Clinical Models (DCMs), and explains how it can contribute to the preservation of Electronic Health Record (EHR) data. Methods Clinical knowledge modeling is vital for the management and preservation of EHRs and data. Such modeling provides common data elements and terminology binding with the intention of capturing and managing clinical information over time and location, independent of technology. Any EHR data exchange without agreed clinical knowledge modeling will potentially result in loss of information. Results Many past attempts have been made to model clinical knowledge for the benefit of semantic interoperability using standardized data representation and common terminologies. The objective of each project is similar with respect to consistent representation of clinical data, use of standardized terminologies, and an overall logical approach. However, the conceptual, logical, and technical expressions are quite different in one clinical knowledge modeling approach versus another. Synergies currently exist under the Clinical Information Modeling Initiative (CIMI) to create a harmonized reference model for clinical knowledge models. Conclusions The goal of the CIMI is to create a reference model and formalisms based on, for instance, the DCM (ISO/TS 13972), among other work. A global repository of DCMs may potentially be established in the future. PMID:25152829

  14. Advanced information processing system: The Army Fault-Tolerant Architecture detailed design overview

    NASA Technical Reports Server (NTRS)

    Harper, Richard E.; Babikyan, Carol A.; Butler, Bryan P.; Clasen, Robert J.; Harris, Chris H.; Lala, Jaynarayan H.; Masotto, Thomas K.; Nagle, Gail A.; Prizant, Mark J.; Treadwell, Steven

    1994-01-01

    The Army Avionics Research and Development Activity (AVRADA) is pursuing programs that would enable effective and efficient management of the large amounts of situational data that occur during tactical rotorcraft missions. The Computer Aided Low Altitude Night Helicopter Flight Program has identified automated Terrain Following/Terrain Avoidance, Nap of the Earth (TF/TA, NOE) operation as a key enabling technology for advanced tactical rotorcraft to enhance mission survivability and mission effectiveness. The processing of critical information at low altitudes with short reaction times is life-critical and mission-critical, necessitating an ultra-reliable, high-throughput computing platform for dependable service for flight control, fusion of sensor data, route planning, near-field/far-field navigation, and obstacle avoidance operations. To address these needs, the Army Fault Tolerant Architecture (AFTA) is being designed and developed. This computer system is based upon the Fault Tolerant Parallel Processor (FTPP) developed by Charles Stark Draper Laboratory (CSDL). AFTA is a hard real-time, Byzantine fault-tolerant parallel processor programmed in the Ada language. This document describes the results of the Detailed Design (Phases 2 and 3 of a 3-year project) of the AFTA development. It contains detailed descriptions of the program objectives, the TF/TA NOE application requirements, the architecture, the hardware design, the operating systems design, and systems performance measurements and analytical models.
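
    AFTA's actual redundancy management is far more involved than simple voting (full Byzantine agreement requires interactive consistency protocols), but the fault-masking idea behind a Byzantine fault-tolerant processor can be illustrated with a majority voter over redundant channel outputs. This is a hypothetical sketch, not the AFTA design:

```python
from collections import Counter

def majority_vote(values):
    """Mask a faulty channel by taking the majority of redundant outputs.
    Tolerating f Byzantine faults requires at least 3f + 1 channels; this
    sketch simply returns the strict-majority value, or None if there is
    no strict majority."""
    value, n = Counter(values).most_common(1)[0]
    return value if n > len(values) / 2 else None

# Four redundant channels (f = 1): one faulty channel disagrees.
print(majority_vote([42, 42, 41, 42]))  # 42
```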

  15. Using geometrical, textural, and contextual information of land parcels for classification of detailed urban land use

    USGS Publications Warehouse

    Wu, S.-S.; Qiu, X.; Usery, E.L.; Wang, L.

    2009-01-01

    Detailed urban land use data are important to government officials, researchers, and businesspeople for a variety of purposes. This article presents an approach to classifying detailed urban land use based on geometrical, textural, and contextual information of land parcels. An area of 6 by 14 km in Austin, Texas, with land parcel boundaries delineated by the Travis Central Appraisal District of Travis County, Texas, is used to test the approach. We derive fifty parcel attributes from relevant geographic information system (GIS) and remote sensing data and use them to discriminate among nine urban land uses: single family, multifamily, commercial, office, industrial, civic, open space, transportation, and undeveloped. Half of the 33,025 parcels in the study area are used as training data for land use classification and the other half are used as testing data for accuracy assessment. The best result with a decision tree classification algorithm has an overall accuracy of 96 percent and a kappa coefficient of 0.78, while two naive baseline models, based on the majority rule and the spatial autocorrelation rule, have overall accuracies of 89 percent and 79 percent, respectively. The algorithm is relatively good at classifying single-family, multifamily, commercial, open space, and undeveloped land uses and relatively poor at classifying office, industrial, civic, and transportation land uses. The most important attributes for land use classification are the geometrical attributes, particularly those related to building areas. Next are the contextual attributes, particularly those relevant to the spatial relationship between buildings, then the textural attributes, particularly the semivariance texture statistic from 0.61-m resolution images.
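
    The overall accuracy and kappa coefficient reported above are standard functions of a confusion matrix: kappa discounts the agreement expected by chance from the row and column marginals. A minimal sketch with a hypothetical two-class matrix is:

```python
def accuracy_and_kappa(confusion):
    """Overall accuracy and Cohen's kappa from a square confusion matrix
    (rows: reference classes, columns: predicted classes)."""
    k = len(confusion)
    total = sum(sum(row) for row in confusion)
    po = sum(confusion[i][i] for i in range(k)) / total  # observed agreement
    # Expected chance agreement, from row and column marginals
    pe = sum(sum(confusion[i]) * sum(row[i] for row in confusion)
             for i in range(k)) / total**2
    return po, (po - pe) / (1 - pe)

# Hypothetical two-class confusion matrix
acc, kappa = accuracy_and_kappa([[90, 10], [5, 45]])
print(round(acc, 3), round(kappa, 3))  # 0.9 0.78
```

Note that a high overall accuracy can coexist with a noticeably lower kappa when the class distribution is imbalanced, as in the parcel data above.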

  16. Coupling geostatistics to detailed reservoir description allows better visualization and more accurate characterization/simulation of turbidite reservoirs: Elk Hills oil field, California

    SciTech Connect

    Allan, M.E.; Wilson, M.L.; Wightman, J.

    1996-01-01

    The Elk Hills giant oilfield, located in the southern San Joaquin Valley of California, has produced 1.1 billion barrels of oil from Miocene and shallow Pliocene reservoirs. 65% of the current 64,000 BOPD production is from the pressure-supported, deeper Miocene turbidite sands. In the turbidite sands of the 31S structure, large porosity and permeability variations in the Main Body B and Western 31S sands cause problems with the efficiency of the waterflooding. These variations have now been quantified and visualized using geostatistics. The end result is a more detailed reservoir characterization for simulation. Traditional reservoir descriptions based on marker correlations, cross-sections and mapping do not provide enough detail to capture the short-scale stratigraphic heterogeneity needed for adequate reservoir simulation. These deterministic descriptions are inadequate to tie with production data, as the thinly bedded sand/shale sequences blur into a falsely homogeneous picture. By studying the variability of the geologic and petrophysical data vertically within each wellbore and spatially from well to well, a geostatistical reservoir description has been developed. It captures the natural variability of the sands and shales that was lacking from earlier work. These geostatistical studies allow the geologic and petrophysical characteristics to be considered in a probabilistic model. The end product is a reservoir description that captures the variability of the reservoir sequences and can be used as a more realistic starting point for history matching and reservoir simulation.
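
    Geostatistical descriptions of this kind rest on quantifying spatial variability, classically via the experimental semivariogram. A minimal 1-D sketch with hypothetical porosity values along a wellbore is:

```python
def semivariance(samples, lag, tol=0.5):
    """Experimental semivariance gamma(h) for 1-D sample locations:
    half the mean squared difference of values whose separation distance
    is within `tol` of the lag h."""
    pairs = [(v1 - v2) ** 2
             for (x1, v1) in samples
             for (x2, v2) in samples
             if x1 < x2 and abs(abs(x1 - x2) - lag) <= tol]
    return sum(pairs) / (2 * len(pairs)) if pairs else None

# Hypothetical porosity values (percent) at 1 m spacing along a wellbore
samples = [(0, 12.0), (1, 13.5), (2, 11.0), (3, 14.0), (4, 12.5)]
print(semivariance(samples, lag=1.0, tol=0.1))  # 2.46875
```

Evaluating this at increasing lags yields the variogram that a probabilistic model of the sand/shale sequences would be fitted to.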

  18. Highly Accurate Prediction of Protein-Protein Interactions via Incorporating Evolutionary Information and Physicochemical Characteristics

    PubMed Central

    Li, Zheng-Wei; You, Zhu-Hong; Chen, Xing; Gui, Jie; Nie, Ru

    2016-01-01

    Protein-protein interactions (PPIs) occur at almost all levels of cell function and play crucial roles in various cellular processes. Thus, identification of PPIs is critical for deciphering molecular mechanisms and further providing insight into biological processes. Although a variety of high-throughput experimental techniques have been developed to identify PPIs, the PPI pairs identified by experimental approaches cover only a small fraction of whole PPI networks, and those approaches hold inherent disadvantages, such as being time-consuming and expensive and having high false-positive rates. Therefore, it is urgent and imperative to develop automatic in silico approaches to predict PPIs efficiently and accurately. In this article, we propose a novel mixture of physicochemical and evolutionary-based feature extraction for predicting PPIs using our newly developed discriminative vector machine (DVM) classifier. The main improvements of the proposed method consist in introducing an effective feature extraction method that can capture discriminative features from evolutionary-based information and physicochemical characteristics, and in employing a powerful and robust DVM classifier. To the best of our knowledge, this is the first time that a DVM model has been applied to the field of bioinformatics. When applying the proposed method to the Yeast and Helicobacter pylori (H. pylori) datasets, we obtain excellent prediction accuracies of 94.35% and 90.61%, respectively. The computational results indicate that our method is effective and robust for predicting PPIs, and can be taken as a useful supplementary tool to traditional experimental methods for future proteomics research. PMID:27571061

  19. Highly Accurate Prediction of Protein-Protein Interactions via Incorporating Evolutionary Information and Physicochemical Characteristics.

    PubMed

    Li, Zheng-Wei; You, Zhu-Hong; Chen, Xing; Gui, Jie; Nie, Ru

    2016-01-01

    Protein-protein interactions (PPIs) occur at almost all levels of cell function and play crucial roles in various cellular processes. Thus, identification of PPIs is critical for deciphering molecular mechanisms and further providing insight into biological processes. Although a variety of high-throughput experimental techniques have been developed to identify PPIs, the PPI pairs identified by experimental approaches cover only a small fraction of whole PPI networks, and those approaches hold inherent disadvantages, such as being time-consuming and expensive and having high false-positive rates. Therefore, it is urgent and imperative to develop automatic in silico approaches to predict PPIs efficiently and accurately. In this article, we propose a novel mixture of physicochemical and evolutionary-based feature extraction for predicting PPIs using our newly developed discriminative vector machine (DVM) classifier. The main improvements of the proposed method consist in introducing an effective feature extraction method that can capture discriminative features from evolutionary-based information and physicochemical characteristics, and in employing a powerful and robust DVM classifier. To the best of our knowledge, this is the first time that a DVM model has been applied to the field of bioinformatics. When applying the proposed method to the Yeast and Helicobacter pylori (H. pylori) datasets, we obtain excellent prediction accuracies of 94.35% and 90.61%, respectively. The computational results indicate that our method is effective and robust for predicting PPIs, and can be taken as a useful supplementary tool to traditional experimental methods for future proteomics research. PMID:27571061

  1. Arthroscopic optical coherence tomography provides detailed information on articular cartilage lesions in horses.

    PubMed

    te Moller, N C R; Brommer, H; Liukkonen, J; Virén, T; Timonen, M; Puhakka, P H; Jurvelin, J S; van Weeren, P R; Töyräs, J

    2013-09-01

    Arthroscopy enables direct inspection of the articular surface but provides no information on deeper cartilage layers. Optical coherence tomography (OCT), based on measurement of the reflection and backscattering of light, is a diagnostic technique used in cardiovascular surgery and ophthalmology. It provides cross-sectional images at resolutions comparable to that of low-power microscopy. The aim of this study was to determine whether OCT is feasible for advanced clinical assessment of lesions in equine articular cartilage during diagnostic arthroscopy. Diagnostic arthroscopy of 36 metacarpophalangeal joints was carried out ex vivo. Of these, 18 joints with varying degrees of cartilage damage were selected, wherein OCT arthroscopy was conducted using an OCT catheter (diameter 0.9 mm) inserted through standard instrument portals. Five sites of interest, occasionally supplemented with other locations where defects were encountered, were arthroscopically graded according to the International Cartilage Repair Society (ICRS) classification system. The same sites were evaluated qualitatively (ICRS classification and morphological description of the lesions) and quantitatively (measurement of cartilage thickness) on OCT images. OCT provided high-resolution images of cartilage, enabling determination of cartilage thickness. Comparison of ICRS grades determined by arthroscopy and by OCT revealed poor agreement. Furthermore, OCT visualised a spectrum of lesions, including cavitation, fibrillation, superficial and deep clefts, erosion, ulceration and fragmentation. In addition, OCT could in some cases reach the arthroscopically inaccessible area between the dorsal MC3 and P1. Arthroscopically guided OCT provided more detailed and quantitative information on the morphology of articular cartilage lesions than conventional arthroscopy. OCT could therefore improve the diagnostic value of arthroscopy in equine orthopaedic surgery. PMID:23810744

  2. Detailed Soil Information for Hydrologic Modeling in the Conterminous United States

    NASA Astrophysics Data System (ADS)

    Bliss, N. B.; Waltman, S. W.; Neale, A. C.

    2010-12-01

    Detailed soil data for the Conterminous United States are being made available to hydrologic modelers and others in a new gridded format. The Soil Survey Geographic (SSURGO) Database is now 86 percent complete for the Conterminous United States. The soil properties of interest to hydrologists include available water capacity, bulk density, saturated hydraulic conductivity, field capacity, porosity, average soil thickness, soil organic matter or carbon content, and percentages of sand, silt, clay, and rocks. The methods for creating the gridded format summarize the attributes across soil horizons and soil components to create a value for each attribute at the map-unit level. Separate gridded products can be developed for specific depth zones, as required. The SSURGO data are being continuously improved by the National Cooperative Soil Survey under the leadership of the U.S. Department of Agriculture Natural Resources Conservation Service (NRCS). Readily accessible gridded soils data have several advantages over vector data, such as easier integration with other land surface datasets. Currently, the data are available at a 30-meter resolution in the Albers Equal Area projection. The compilation of the new database has been made possible as part of a National Atlas of Ecosystem Services being developed under the leadership of the US Environmental Protection Agency (EPA), along with many partner organizations including the NRCS and the United States Geological Survey. When complete, the atlas information will include many ecosystem features and will be used in a wide variety of ecosystem service assessments.

  3. Subjective sense of memory strength and the objective amount of information accurately remembered are related to distinct neural correlates at encoding.

    PubMed

    Qin, Shaozheng; van Marle, Hein J F; Hermans, Erno J; Fernández, Guillén

    2011-06-15

    Although commonly used, the term memory strength is not well defined in humans. Besides durability, it has been conceptualized by retrieval characteristics, such as subjective confidence associated with retrieval, or objectively, by the amount of information accurately retrieved. Behaviorally, these measures are not necessarily correlated, indicating that distinct neural processes may underlie them. Thus, we aimed at disentangling neural activity at encoding associated with either a subsequent subjective sense of memory strength or with a subsequent objective amount of information remembered. Using functional magnetic resonance imaging (fMRI), participants were scanned while incidentally encoding a series of photographs of complex scenes. The next day, they underwent two memory tests, quantifying memory strength either subjectively (confidence on remembering the gist of a scene) or objectively (the number of details accurately remembered within a scene). Correlations between these measurements were mutually partialed out in subsequent memory analyses of fMRI data. Results revealed that activation in left ventral lateral prefrontal cortex and temporoparietal junction predicted subsequent confidence ratings. In contrast, parahippocampal and hippocampal activity predicted the number of details remembered. Our findings suggest that memory strength may reflect a functionally heterogeneous set of (at least two) phenomena. One phenomenon appears related to prefrontal and temporoparietal top-down modulations, resulting in the subjective sense of memory strength that is potentially based on gist memory. The other phenomenon is likely related to medial-temporal binding processes, determining the amount of information accurately encoded into memory. Thus, our study dissociated two distinct phenomena that are usually described as memory strength.

  4. 78 FR 42796 - 30-Day Notice of Proposed Information Collection: HUD Standard Grant Application Forms: Detailed...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-17

    ...: Detailed Budget Form (HUD-424-CB), Budget Worksheet (HUD-424CBW), Application for Federal Assistance (SF-424), and the...

  5. Accurate refinement of docked protein complexes using evolutionary information and deep learning.

    PubMed

    Akbal-Delibas, Bahar; Farhoodi, Roshanak; Pomplun, Marc; Haspel, Nurit

    2016-06-01

    One of the major challenges for protein docking methods is to accurately discriminate native-like structures from false positives. Docking methods are often inaccurate, and the results have to be refined and re-ranked to obtain native-like complexes and remove outliers. In a previous work, we introduced AccuRefiner, a machine learning based tool for refining protein-protein complexes. Given a docked complex, the refinement tool produces a small set of refined versions of the input complex, with lower root-mean-square deviation (RMSD) of atomic positions with respect to the native structure. The method employs a unique ranking tool that accurately predicts the RMSD of docked complexes with respect to the native structure. In this work, we use a deep learning network with a similar set of features and five layers. We show that a properly trained deep learning network can accurately predict the RMSD of a docked complex with a 1.40 Å error margin on average, by approximating the complex relationship between a wide set of scoring function terms and the RMSD of a docked structure. The network was trained on 35,000 unbound docking complexes generated by RosettaDock. We tested our method on 25 different putative docked complexes also produced by RosettaDock for five proteins that were not included in the training data. The results demonstrate that the high accuracy of the ranking tool enables AccuRefiner to consistently choose the refinement candidates with lower RMSD values compared to the coarsely docked input structures. PMID:26846813
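
    The quantity the ranking tool predicts is the standard root-mean-square deviation over matched atomic positions. A minimal sketch (ignoring the optimal superposition a real evaluation would perform first) is:

```python
import math

def rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two equally ordered lists of
    (x, y, z) atomic positions. Assumes the structures are already
    superimposed; a full evaluation would first perform an optimal
    alignment (e.g. via the Kabsch algorithm)."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

# Hypothetical 3-atom example: each atom displaced by 0.1 Angstrom
a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
b = [(0.0, 0.0, 0.1), (1.0, 0.1, 0.0), (0.1, 1.0, 0.0)]
print(round(rmsd(a, b), 3))  # 0.1
```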
  7. Can Raters with Reduced Job Descriptive Information Provide Accurate Position Analysis Questionnaire (PAQ) Ratings?

    ERIC Educational Resources Information Center

    Friedman, Lee; Harvey, Robert J.

    1986-01-01

    Job-naive raters provided with job descriptive information made Position Analysis Questionnaire (PAQ) ratings which were validated against ratings of job analysts who were also job content experts. None of the reduced job descriptive information conditions enabled job-naive raters to obtain either acceptable levels of convergent validity with…

  8. 16 CFR 1101.32 - Reasonable steps to assure information is accurate.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... to release to the public: (1) The Commission staff or a qualified person or entity outside the... will review the information in light of the comments. The degree of review by the Commission and...

  9. An examination of information quality as a moderator of accurate personality judgment.

    PubMed

    Letzring, Tera D; Human, Lauren J

    2014-10-01

    Information quality is an important moderator of the accuracy of personality judgment, and this article describes research focusing on how specific kinds of information are related to accuracy. In this study, 228 participants (159 female, 69 male; mean age = 23.43; 86.4% Caucasian) in unacquainted dyads were assigned to discuss thoughts and feelings, discuss behaviors, or engage in behaviors. Interactions lasted 25-30 min, and participants provided ratings of their partners and themselves following the interaction on the Big Five traits, ego-control, and ego-resiliency. Next, the amount of different types of information made available by each participant was objectively coded. The accuracy criterion, composed of self- and acquaintance ratings, was used to assess distinctive and normative accuracy using the Social Accuracy Model. Participants in the discussion conditions achieved higher distinctive accuracy than participants who engaged in behaviors, but normative accuracy did not differ across conditions. Information about specific behaviors and general behaviors were among the most consistent predictors of higher distinctive accuracy. Normative accuracy was more likely to decrease than increase when higher-quality information was available. Verbal information about behaviors is the most useful for learning about how people are unique.

  10. Polyallelic structural variants can provide accurate, highly informative genetic markers focused on diagnosis and therapeutic targets: Accuracy vs. Precision.

    PubMed

    Roses, A D

    2016-02-01

    Structural variants (SVs) include all insertions, deletions, and rearrangements in the genome, with several common types of nucleotide repeats including single sequence repeats, short tandem repeats, and insertion-deletion length variants. Polyallelic SVs provide highly informative markers for association studies with well-phenotyped cohorts. SVs can influence gene regulation by affecting epigenetics, transcription, splicing, and/or translation. Accurate assays of polyallelic SV loci are required to define the range and allele frequency of variable length alleles. PMID:26517180

  11. Information Systems Security and Computer Crime in the IS Curriculum: A Detailed Examination

    ERIC Educational Resources Information Center

    Foltz, C. Bryan; Renwick, Janet S.

    2011-01-01

    The authors examined the extent to which information systems (IS) security and computer crime are covered in information systems programs. Results suggest that IS faculty believe security coverage should be increased in required, elective, and non-IS courses. However, respondent faculty members are concerned that existing curricula leave little…

  12. Exploratory Movement Generates Higher-Order Information That Is Sufficient for Accurate Perception of Scaled Egocentric Distance

    PubMed Central

    Mantel, Bruno; Stoffregen, Thomas A.; Campbell, Alain; Bardy, Benoît G.

    2015-01-01

    Body movement influences the structure of multiple forms of ambient energy, including optics and gravito-inertial force. Some researchers have argued that egocentric distance is derived from inferential integration of visual and non-visual stimulation. We suggest that accurate information about egocentric distance exists in perceptual stimulation as higher-order patterns that extend across optics and inertia. We formalize a pattern that specifies the egocentric distance of a stationary object across higher-order relations between optics and inertia. This higher-order parameter is created by self-generated movement of the perceiver in inertial space relative to the illuminated environment. For this reason, we placed minimal restrictions on the exploratory movements of our participants. We asked whether humans can detect and use the information available in this higher-order pattern. Participants judged whether a virtual object was within reach. We manipulated relations between body movement and the ambient structure of optics and inertia. Judgments were precise and accurate when the higher-order optical-inertial parameter was available. When only optic flow was available, judgments were poor. Our results reveal that participants perceived egocentric distance from the higher-order, optical-inertial consequences of their own exploratory activity. Analysis of participants’ movement trajectories revealed that self-selected movements were complex, and tended to optimize availability of the optical-inertial pattern that specifies egocentric distance. We argue that accurate information about egocentric distance exists in higher-order patterns of ambient energy, that self-generated movement can generate these higher-order patterns, and that these patterns can be detected and used to support perception of egocentric distance that is precise and accurate. PMID:25856410

  13. Combining Evolutionary Information and an Iterative Sampling Strategy for Accurate Protein Structure Prediction.

    PubMed

    Braun, Tatjana; Koehler Leman, Julia; Lange, Oliver F

    2015-12-01

    Recent work has shown that the accuracy of ab initio structure prediction can be significantly improved by integrating evolutionary information in form of intra-protein residue-residue contacts. Following this seminal result, much effort is put into the improvement of contact predictions. However, there is also a substantial need to develop structure prediction protocols tailored to the type of restraints gained by contact predictions. Here, we present a structure prediction protocol that combines evolutionary information with the resolution-adapted structural recombination approach of Rosetta, called RASREC. Compared to the classic Rosetta ab initio protocol, RASREC achieves improved sampling, better convergence and higher robustness against incorrect distance restraints, making it the ideal sampling strategy for the stated problem. To demonstrate the accuracy of our protocol, we tested the approach on a diverse set of 28 globular proteins. Our method is able to converge for 26 out of the 28 targets and improves the average TM-score of the entire benchmark set from 0.55 to 0.72 when compared to the top ranked models obtained by the EVFold web server using identical contact predictions. Using a smaller benchmark, we furthermore show that the prediction accuracy of our method is only slightly reduced when the contact prediction accuracy is comparatively low. This observation is of special interest for protein sequences that only have a limited number of homologs.
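The TM-score used as the benchmark metric above has a standard closed form; a small sketch, assuming the usual d0 normalization over per-residue distances after optimal superposition (the toy inputs are invented):

```python
def tm_score(distances, l_target):
    """TM-score from per-residue C-alpha distances (in Angstroms) after
    superposition, using the standard d0 length normalization."""
    d0 = 1.24 * (l_target - 15) ** (1.0 / 3.0) - 1.8
    return sum(1.0 / (1.0 + (d / d0) ** 2) for d in distances) / l_target

# A perfect model scores 1.0; grossly wrong models approach 0.
print(tm_score([0.0] * 100, 100))  # 1.0
```

Unlike RMSD, the score saturates for large per-residue errors, which is why an improvement from 0.55 to 0.72 reflects a substantial gain in global fold accuracy.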

  14. Detailed requirements document for common software of shuttle program information management system

    NASA Technical Reports Server (NTRS)

    Everette, J. M.; Bradfield, L. D.; Horton, C. L.

    1975-01-01

Common software was investigated as a method for minimizing development and maintenance cost of the shuttle program information management system (SPIMS) applications while reducing the time-frame of their development. Those requirements satisfying these criteria are presented along with the stand-alone modules which may be used directly by applications. The SPIMS applications, operating on the CYBER 74 computer, are specialized information management systems which use System 2000 as a data base manager. Common software provides the features to support user interactions on a CRT terminal using form input and command response capabilities. These features are available as subroutines to the applications.

  15. Honey bees can perform accurately directed waggle dances based solely on information from a homeward trip.

    PubMed

    Edrich, Wolfgang

    2015-10-01

Honey bees were displaced several hundred metres from their hive to an unfamiliar site and provisioned with honey. After feeding, almost two-thirds of the bees flew home to their hive within a 50 min observation time. About half of these returning bees signalled the direction of the release site in waggle dances, thus demonstrating that the dance can be guided entirely by information gathered on a single homeward trip. The likely reason for the bees' enthusiastic dancing on their initial return from this new site was the highly rewarding honeycomb that they were given there. The attractive nature of the site is confirmed by many of these bees revisiting the site and continuing to forage there.

  16. Accurately decoding visual information from fMRI data obtained in a realistic virtual environment

    PubMed Central

    Floren, Andrew; Naylor, Bruce; Miikkulainen, Risto; Ress, David

    2015-01-01

Three-dimensional interactive virtual environments (VEs) are a powerful but presently under-utilized tool for brain-imaging-based cognitive neuroscience. This paper presents machine-learning-based methods for identifying brain states induced by realistic VEs with improved accuracy, as well as the capability for mapping their spatial topography on the neocortex. VEs provide the ability to study the brain under conditions closer to the environment in which humans evolved, and thus to probe deeper into the complexities of human cognition. As a test case, we designed a stimulus to reflect a military combat situation in the Middle East, motivated by the potential of using real-time functional magnetic resonance imaging (fMRI) in the treatment of post-traumatic stress disorder. Each subject experienced moving through the virtual town, where they encountered 1–6 animated combatants at different locations, while fMRI data were collected. To analyze the data from what is, compared to most studies, a more complex and less controlled stimulus, we employed statistical machine learning in the form of Multi-Voxel Pattern Analysis (MVPA), with special attention given to artificial Neural Networks (NN). Extensions to NN that exploit the block structure of the stimulus were developed to improve the accuracy of the classification, achieving performances from 58 to 93% (chance was 16.7%) with six subjects. This demonstrates that MVPA can decode a complex cognitive state, viewing a number of characters, in a dynamic virtual environment. To better understand the source of this information in the brain, a novel form of sensitivity analysis was developed that uses NN to quantify the degree to which each voxel contributed to classification. Compared with maps produced by general linear models and the searchlight approach, these sensitivity maps revealed a more diverse pattern of information relevant to the classification of cognitive state. PMID:26106315
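The core MVPA idea (classifying condition labels from many-voxel activity patterns) can be illustrated with a deliberately simple nearest-centroid classifier; this is not the paper's neural-network pipeline, and the voxel counts, effect sizes, and six-way design below are invented for the sketch:

```python
import random

def train_centroids(patterns, labels):
    """Nearest-centroid MVPA-style classifier: average pattern per class."""
    sums, counts = {}, {}
    for pat, lab in zip(patterns, labels):
        vec = sums.setdefault(lab, [0.0] * len(pat))
        for i, v in enumerate(pat):
            vec[i] += v
        counts[lab] = counts.get(lab, 0) + 1
    return {lab: [v / counts[lab] for v in vec] for lab, vec in sums.items()}

def classify(centroids, pattern):
    """Assign the label whose centroid is nearest in squared distance."""
    return min(centroids,
               key=lambda lab: sum((a - b) ** 2
                                   for a, b in zip(centroids[lab], pattern)))

# Synthetic "voxel" patterns: each condition boosts a distinct voxel subset.
random.seed(1)
N_VOX = 20

def sample(cond):
    base = [random.gauss(0.0, 1.0) for _ in range(N_VOX)]
    for i in range(cond, N_VOX, 6):   # condition-specific activation
        base[i] += 3.0
    return base

train_pats = [(sample(c), c) for c in range(6) for _ in range(20)]
centroids = train_centroids([p for p, _ in train_pats],
                            [c for _, c in train_pats])
test_pats = [(sample(c), c) for c in range(6) for _ in range(10)]
acc = sum(classify(centroids, p) == c for p, c in test_pats) / len(test_pats)
```

Real MVPA replaces the synthetic patterns with beta estimates or raw BOLD time points per voxel, and the centroid rule with stronger classifiers such as the NN extensions the paper describes.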

  17. A Tale of Two Course Guides: Providing Students with Detailed Course Information

    ERIC Educational Resources Information Center

    Hanson, Karen; Williamson, Kasi

    2010-01-01

    Where do students find out about courses they might take? Potentially, from just about anywhere: friends, bulletin boards, department Web sites, advisors, e-mails, or flyers posted in the halls. Of course, some of these sources are more trustworthy than others. Where should students go to get reliable information that can help them make wise…

  18. Detailed design specification for the ALT Shuttle Information Extraction Subsystem (SIES)

    NASA Technical Reports Server (NTRS)

    Clouette, G. L.; Fitzpatrick, W. N.

    1976-01-01

The approach and landing test (ALT) shuttle information extraction system (SIES) is described in terms of general requirements and system characteristics, output products and processing options, output products and data sources, and system data flow. The ALT SIES is a data reduction system designed to satisfy certain data processing requirements for the ALT phase of the space shuttle program. The specific ALT SIES data processing requirements are stated in the data reduction complex approach and landing test data processing requirements. In general, ALT SIES must produce time-correlated data products as a result of standardized data reduction or special purpose analytical processes. The main characteristics of ALT SIES are: (1) the system operates in a batch (non-interactive) mode; (2) the processing is table driven; (3) it is data base oriented; (4) it has simple operating procedures; and (5) it requires a minimum of run time information.

  19. Accurate prediction of interfacial residues in two-domain proteins using evolutionary information: implications for three-dimensional modeling.

    PubMed

    Bhaskara, Ramachandra M; Padhi, Amrita; Srinivasan, Narayanaswamy

    2014-07-01

With the preponderance of multidomain proteins in eukaryotic genomes, it is essential to recognize the constituent domains and their functions. Often function involves communications across the domain interfaces, and knowledge of the interacting sites is essential to our understanding of the structure-function relationship. Using evolutionary information extracted from homologous domains in at least two diverse domain architectures (single and multidomain), we predict the interface residues corresponding to domains from the two-domain proteins. We also use information from the three-dimensional structures of individual domains of two-domain proteins to train a naïve Bayes classifier model to predict the interfacial residues. Our predictions are highly accurate (∼85%) and specific (∼95%) to the domain-domain interfaces. This method is specific to multidomain proteins which contain domains in more than one protein architectural context. Using predicted residues to constrain domain-domain interaction, rigid-body docking was able to provide us with accurate full-length protein structures with correct orientation of domains. We believe that these results can be of considerable interest toward rational protein and interaction design, apart from providing us with valuable information on the nature of interactions.
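A naïve Bayes classifier of the kind mentioned can be sketched in a few lines; the two per-residue features here (a conservation score and a relative accessibility) and their distributions are hypothetical stand-ins, not the paper's actual feature set:

```python
import math
import random

def fit_gnb(samples, labels):
    """Gaussian naive Bayes: per-class mean/variance for each feature."""
    stats = {}
    for lab in set(labels):
        rows = [s for s, l in zip(samples, labels) if l == lab]
        means = [sum(col) / len(rows) for col in zip(*rows)]
        variances = [sum((v - m) ** 2 for v in col) / len(rows) + 1e-6
                     for col, m in zip(zip(*rows), means)]
        stats[lab] = (means, variances, len(rows) / len(samples))
    return stats

def predict(stats, x):
    """Pick the class with the highest log-posterior under the model."""
    def log_post(lab):
        means, variances, prior = stats[lab]
        lp = math.log(prior)
        for v, m, var in zip(x, means, variances):
            lp -= 0.5 * math.log(2 * math.pi * var) + (v - m) ** 2 / (2 * var)
        return lp
    return max(stats, key=log_post)

# Hypothetical per-residue features: (conservation score, relative accessibility).
random.seed(2)
iface = [(random.gauss(0.8, 0.1), random.gauss(0.3, 0.1)) for _ in range(100)]
surf = [(random.gauss(0.4, 0.1), random.gauss(0.7, 0.1)) for _ in range(100)]
gnb = fit_gnb(iface + surf, ["interface"] * 100 + ["surface"] * 100)
```

The "naïve" assumption is the per-feature independence within each class; it keeps training to simple per-class means and variances, which suits small structural datasets.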

  20. Informed Consent for Interventional Radiology Procedures: A Survey Detailing Current European Practice

    SciTech Connect

    O'Dwyer, H.M.; Lyon, S.M.; Fotheringham, T.; Lee, M.J.

    2003-09-15

    Purpose: Official recommendations for obtaining informed consent for interventional radiology procedures are that the patient gives their consent to the operator more than 24 hr prior to the procedure. This has significant implications for interventional radiology practice. The purpose of this study was to identify the proportion of European interventional radiologists who conform to these guidelines. Methods: A questionnaire was designed consisting of 12 questions on current working practice and opinions regarding informed consent. These questions related to where, when and by whom consent was obtained from the patient. Questions also related to the use of formal consent forms and written patient information leaflets. Respondents were asked whether they felt patients received adequate explanation regarding indications for intervention,the procedure, alternative treatment options and complications. The questionnaire was distributed to 786 European interventional radiologists who were members of interventional societies. The anonymous replies were then entered into a database and analyzed. Results: Two hundred and fifty-four (32.3%) questionnaires were returned. Institutions were classified as academic (56.7%),non-academic (40.5%) or private (2.8%). Depending on the procedure,in a significant proportion of patients consent was obtained in the outpatient department (22%), on the ward (65%) and in the radiology day case ward (25%), but in over half (56%) of patients consent or re-consent was obtained in the interventional suite. Fifty percent of respondents indicated that they obtain consent more than 24 hr before some procedures, in 42.9% consent is obtained on the morning of the procedure and 48.8% indicated that in some patients consent is obtained immediately before the procedure. We found that junior medical staff obtained consent in 58% of cases. Eighty-two percent of respondents do not use specific consent forms and 61% have patient information leaflets. The

  1. Preferential access to genetic information from endogenous hominin ancient DNA and accurate quantitative SNP-typing via SPEX

    PubMed Central

    Brotherton, Paul; Sanchez, Juan J.; Cooper, Alan; Endicott, Phillip

    2010-01-01

    The analysis of targeted genetic loci from ancient, forensic and clinical samples is usually built upon polymerase chain reaction (PCR)-generated sequence data. However, many studies have shown that PCR amplification from poor-quality DNA templates can create sequence artefacts at significant levels. With hominin (human and other hominid) samples, the pervasive presence of highly PCR-amplifiable human DNA contaminants in the vast majority of samples can lead to the creation of recombinant hybrids and other non-authentic artefacts. The resulting PCR-generated sequences can then be difficult, if not impossible, to authenticate. In contrast, single primer extension (SPEX)-based approaches can genotype single nucleotide polymorphisms from ancient fragments of DNA as accurately as modern DNA. A single SPEX-type assay can amplify just one of the duplex DNA strands at target loci and generate a multi-fold depth-of-coverage, with non-authentic recombinant hybrids reduced to undetectable levels. Crucially, SPEX-type approaches can preferentially access genetic information from damaged and degraded endogenous ancient DNA templates over modern human DNA contaminants. The development of SPEX-type assays offers the potential for highly accurate, quantitative genotyping from ancient hominin samples. PMID:19864251

  2. Sewerage Mapping and Information System of the Metropolis of Tokyo (SEMIS) : Details of the Development and Outline of the System

    NASA Astrophysics Data System (ADS)

    Kawakami, Kouichi; Sekita, Mitsunobu

Managing sewerage ledgers as information is essential to maintaining and controlling sewerage, one of the basic infrastructures of a city. The Bureau of Sewerage developed the full-scale Sewerage Mapping and Information System (SEMIS), the first such effort by a local government in Japan, and has operated it since 1985. Before development, questionnaires were conducted to survey how staff engaged in sewage works used the sewerage ledgers, and means of improving the preparation of sewerage plans were considered on that basis. Employing these means, the Bureau built a database of the plans and descriptions that comprise the sewerage ledgers and then constructed a computer system that manages it comprehensively. The details of the development and an outline of the system are described.

  3. Academic detailing.

    PubMed

    Shankar, P R; Jha, N; Piryani, R M; Bajracharya, O; Shrestha, R; Thapa, H S

    2010-01-01

There are a number of sources available to prescribers to stay up to date about medicines. Prescribers in rural areas of developing countries, however, may not be able to access some of them. Interventions to improve prescribing can be educational, managerial, or regulatory, or use a mix of strategies. Detailing by the pharmaceutical industry is widespread. Academic detailing (AD) has been classically seen as a form of continuing medical education in which a trained health professional such as a physician or pharmacist visits physicians in their offices to provide evidence-based information. Face-to-face sessions, preferably on an individual basis, clear educational and behavioural objectives, establishing credibility with respect to objectivity, stimulating physician interaction, use of concise graphic educational materials, highlighting key messages, and, when possible, providing positive reinforcement of improved practices in follow-up visits can increase the success of AD initiatives. AD is common in developed countries and certain examples have been cited in this review. In developing countries the authors have come across reports of AD in Pakistan, Sudan, Argentina and Uruguay, Bihar state in India, Zambia, Cuba, Indonesia and Mexico. AD has had a consistent, small but potentially significant impact on prescribing practices. AD has far fewer resources at its command compared to the efforts of the industry. Steps have to be taken to formally start AD in Nepal, and there may be specific hindering factors similar to those in other developing nations. PMID:21209521

  4. When the Details Matter – Sensitivities in PRA Calculations That Could Affect Risk-Informed Decision-Making

    SciTech Connect

    Dana L. Kelly; Nathan O. Siu

    2010-06-01

    As the U.S. Nuclear Regulatory Commission (NRC) continues its efforts to increase its use of risk information in decision making, the detailed, quantitative results of probabilistic risk assessment (PRA) calculations are coming under increased scrutiny. Where once analysts and users were not overly concerned with figure of merit variations that were less than an order of magnitude, now factors of two or even less can spark heated debate regarding modeling approaches and assumptions. The philosophical and policy-related aspects of this situation are well-recognized by the PRA community. On the other hand, the technical implications for PRA methods and modeling have not been as widely discussed. This paper illustrates the potential numerical effects of choices as to the details of models and methods for parameter estimation with three examples: 1) the selection of the time period data for parameter estimation, and issues related to component boundary and failure mode definitions; 2) the selection of alternative diffuse prior distributions, including the constrained noninformative prior distribution, in Bayesian parameter estimation; and 3) the impact of uncertainty in calculations for recovery of offsite power.

  5. Center for Information Services, Phase II: Detailed System Design and Programming, Part 7 - Text Processing, Phase IIA Final Report.

    ERIC Educational Resources Information Center

    Silva, Georgette M.

    Libraries, as well as larger information networks, are necessarily based upon the storage of information files consisting in many cases of written materials and texts such as books, serials, abstracts, manuscripts and archives. At the present stage of the "information explosion" no librarian can afford to ignore the contribution of modern…

  6. Transient Auditory Storage of Acoustic Details Is Associated with Release of Speech from Informational Masking in Reverberant Conditions

    ERIC Educational Resources Information Center

    Huang, Ying; Huang, Qiang; Chen, Xun; Wu, Xihong; Li, Liang

    2009-01-01

    Perceptual integration of the sound directly emanating from the source with reflections needs both temporal storage and correlation computation of acoustic details. We examined whether the temporal storage is frequency dependent and associated with speech unmasking. In Experiment 1, a break in correlation (BIC) between interaurally correlated…

  7. The role of cognitive switching in head-up displays. [to determine pilot ability to accurately extract information from either of two sources

    NASA Technical Reports Server (NTRS)

    Fischer, E.

    1979-01-01

    The pilot's ability to accurately extract information from either one or both of two superimposed sources of information was determined. Static, aerial, color 35 mm slides of external runway environments and slides of corresponding static head-up display (HUD) symbology were used as the sources. A three channel tachistoscope was utilized to show either the HUD alone, the scene alone, or the two slides superimposed. Cognitive performance of the pilots was assessed by determining the percentage of correct answers given to two HUD related questions, two scene related questions, or one HUD and one scene related question.

  8. Robust fundamental frequency estimation in sustained vowels: Detailed algorithmic comparisons and information fusion with adaptive Kalman filtering

    PubMed Central

    Tsanas, Athanasios; Zañartu, Matías; Little, Max A.; Fox, Cynthia; Ramig, Lorraine O.; Clifford, Gari D.

    2014-01-01

    There has been consistent interest among speech signal processing researchers in the accurate estimation of the fundamental frequency (F0) of speech signals. This study examines ten F0 estimation algorithms (some well-established and some proposed more recently) to determine which of these algorithms is, on average, better able to estimate F0 in the sustained vowel /a/. Moreover, a robust method for adaptively weighting the estimates of individual F0 estimation algorithms based on quality and performance measures is proposed, using an adaptive Kalman filter (KF) framework. The accuracy of the algorithms is validated using (a) a database of 117 synthetic realistic phonations obtained using a sophisticated physiological model of speech production and (b) a database of 65 recordings of human phonations where the glottal cycles are calculated from electroglottograph signals. On average, the sawtooth waveform inspired pitch estimator and the nearly defect-free algorithms provided the best individual F0 estimates, and the proposed KF approach resulted in a ∼16% improvement in accuracy over the best single F0 estimation algorithm. These findings may be useful in speech signal processing applications where sustained vowels are used to assess vocal quality, when very accurate F0 estimation is required. PMID:24815269
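The fusion step can be illustrated with a scalar Kalman filter that treats each algorithm's F0 estimate as a noisy measurement of the true pitch; in the paper the per-algorithm weights come from quality and performance measures, whereas the variances, process noise, and simulated estimate streams below are fixed, illustrative values:

```python
import random

def kf_fuse(streams, meas_vars, process_var=1.0):
    """Fuse several noisy F0 estimate streams with a scalar Kalman filter.

    streams   : list of per-algorithm F0 estimate sequences (Hz)
    meas_vars : assumed measurement-noise variance for each algorithm
    """
    x, p = streams[0][0], 100.0          # initial state and its variance
    fused = []
    for t in range(len(streams[0])):
        p += process_var                 # predict: F0 modelled as a random walk
        for stream, r in zip(streams, meas_vars):
            k = p / (p + r)              # Kalman gain for this estimator
            x += k * (stream[t] - x)     # sequential measurement update
            p *= 1.0 - k
        fused.append(x)
    return fused

random.seed(0)
TRUE_F0 = 120.0
good = [TRUE_F0 + random.gauss(0.0, 1.0) for _ in range(50)]  # accurate algorithm
bad = [TRUE_F0 + random.gauss(0.0, 8.0) for _ in range(50)]   # noisy algorithm
fused = kf_fuse([good, bad], meas_vars=[1.0, 64.0])
```

Because the gain shrinks with the assumed measurement variance, the noisy estimator contributes little, which is the mechanism behind the adaptive weighting the abstract describes.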

  9. Robust fundamental frequency estimation in sustained vowels: detailed algorithmic comparisons and information fusion with adaptive Kalman filtering.

    PubMed

    Tsanas, Athanasios; Zañartu, Matías; Little, Max A; Fox, Cynthia; Ramig, Lorraine O; Clifford, Gari D

    2014-05-01

    There has been consistent interest among speech signal processing researchers in the accurate estimation of the fundamental frequency (F(0)) of speech signals. This study examines ten F(0) estimation algorithms (some well-established and some proposed more recently) to determine which of these algorithms is, on average, better able to estimate F(0) in the sustained vowel /a/. Moreover, a robust method for adaptively weighting the estimates of individual F(0) estimation algorithms based on quality and performance measures is proposed, using an adaptive Kalman filter (KF) framework. The accuracy of the algorithms is validated using (a) a database of 117 synthetic realistic phonations obtained using a sophisticated physiological model of speech production and (b) a database of 65 recordings of human phonations where the glottal cycles are calculated from electroglottograph signals. On average, the sawtooth waveform inspired pitch estimator and the nearly defect-free algorithms provided the best individual F(0) estimates, and the proposed KF approach resulted in a ∼16% improvement in accuracy over the best single F(0) estimation algorithm. These findings may be useful in speech signal processing applications where sustained vowels are used to assess vocal quality, when very accurate F(0) estimation is required. PMID:24815269

  10. Crowdsourcing detailed flood data

    NASA Astrophysics Data System (ADS)

    Walliman, Nicholas; Ogden, Ray; Amouzad*, Shahrzhad

    2015-04-01

Over the last decade the average annual loss across the European Union due to flooding has been 4.5bn Euros, but increasingly intense rainfall, as well as population growth, urbanisation and the rising costs of asset replacements, may see this rise to 23bn Euros a year by 2050. Equally disturbing are the profound social costs to individuals, families and communities, which in addition to loss of lives include: loss of livelihoods, decreased purchasing and production power, relocation and migration, adverse psychosocial effects, and hindrance of economic growth and development. Flood prediction, management and defence strategies rely on the availability of accurate information and flood modelling. Whilst automated data gathering (by measurement and satellite) of the extent of flooding is already advanced, it is least reliable in urban and physically complex geographies, where the need for precise estimation is often most acute. Crowdsourced data on actual flood events is a potentially critical component of this, allowing improved accuracy and identification of the effects of local landscape and topography, where the height of a simple kerb or a discontinuity in a boundary wall can have profound importance. Mobile 'App'-based data acquisition using crowdsourcing in critical areas can combine camera records with GPS positional data and time, as well as descriptive data relating to the event. This will automatically produce a dataset, managed in ArcView GIS, with the potential for follow-up calls to get more information through structured scripts for each strand. Through this, local residents can provide highly detailed information that can be reflected in sophisticated flood protection models and be core to framing urban resilience strategies and optimising the effectiveness of investment. This paper describes this pioneering approach, which will develop flood event data in support of systems that advance existing approaches such as those developed in the UK.
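A crowdsourced record of the kind described (camera reference, GPS position, time, and free-text description) can be modelled as a simple data structure; this schema is hypothetical, since the abstract does not specify the app's actual fields:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class FloodReport:
    """One crowdsourced flood observation (hypothetical schema)."""
    latitude: float                         # GPS position of the observer
    longitude: float
    timestamp: datetime                     # capture time of the record
    photo_path: str                         # reference to the camera image
    water_depth_cm: Optional[float] = None  # optional structured estimate
    notes: str = ""                         # free-text description of conditions

report = FloodReport(51.752, -1.258, datetime.now(timezone.utc), "img_001.jpg",
                     water_depth_cm=12.0, notes="water over kerb; drain blocked")
```

Records like this can then be loaded into a GIS layer, with the structured fields supporting the follow-up scripts the abstract mentions.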

  11. Assignment of Calibration Information to Deeper Phylogenetic Nodes is More Effective in Obtaining Precise and Accurate Divergence Time Estimates.

    PubMed

    Mello, Beatriz; Schrago, Carlos G

    2014-01-01

    Divergence time estimation has become an essential tool for understanding macroevolutionary events. Molecular dating aims to obtain reliable inferences, which, within a statistical framework, means jointly increasing the accuracy and precision of estimates. Bayesian dating methods exhibit the property of a linear relationship between uncertainty and estimated divergence dates. This relationship occurs even if the number of sites approaches infinity and places a limit on the maximum precision of node ages. However, how the placement of calibration information may affect the precision of divergence time estimates remains an open question. In this study, relying on simulated and empirical data, we investigated how the location of calibration within a phylogeny affects the accuracy and precision of time estimates. We found that calibration priors set at median and deep phylogenetic nodes were associated with higher precision values compared to analyses involving calibration at the shallowest node. The results were independent of the tree symmetry. An empirical mammalian dataset produced results that were consistent with those generated by the simulated sequences. Assigning time information to the deeper nodes of a tree is crucial to guarantee the accuracy and precision of divergence times. This finding highlights the importance of the appropriate choice of outgroups in molecular dating. PMID:24855333

  12. The Lunar Laser Ranging Experiment: Accurate ranges have given a large improvement in the lunar orbit and new selenophysical information.

    PubMed

    Bender, P L; Currie, D G; Poultney, S K; Alley, C O; Dicke, R H; Wilkinson, D T; Eckhardt, D H; Faller, J E; Kaula, W M; Mulholland, J D; Plotkin, H H; Silverberg, E C; Williams, J G

    1973-10-19

    The lunar ranging measurements now being made at the McDonald Observatory have an accuracy of 1 nsec in round-trip travel time. This corresponds to 15 cm in the one-way distance. The use of lasers with pulse-lengths of less than 1 nsec is expected to give an accuracy of 2 to 3 cm in the next few years. A new station is under construction in Hawaii, and additional stations in other countries are either in operation or under development. It is hoped that these stations will form the basis for a worldwide network to determine polar motion and earth rotation on a regular basis, and will assist in providing information about movement of the tectonic plates making up the earth's surface. Several mobile lunar ranging stations with telescopes having diameters of 1.0 m or less could, in the future, greatly extend the information obtainable about motions within and between the tectonic plates. The data obtained so far by the McDonald Observatory have been used to generate a new lunar ephemeris based on direct numerical integration of the equations of motion for the moon and planets. With this ephemeris, the range to the three Apollo retro-reflectors can be fit to an accuracy of 5 m by adjusting the differences in moments of inertia of the moon about its principal axes, the selenocentric coordinates of the reflectors, and the McDonald longitude. The accuracy of fitting the results is limited currently by errors of the order of an arc second in the angular orientation of the moon, as derived from the best available theory of how the moon rotates in response to the torques acting on it. Both a new calculation of the moon's orientation as a function of time based on direct numerical integration of the torque equations and a new analytic theory of the moon's orientation are expected to be available soon, and to improve considerably the accuracy of fitting the data. The accuracy already achieved routinely in lunar laser ranging represents a hundredfold improvement over any

  14. Eye lens proteomics: from global approach to detailed information about phakinin and gamma E and F crystallin genes.

    PubMed

    Hoehenwarter, Wolfgang; Kumar, Nalin M; Wacker, Maik; Zimny-Arndt, Ursula; Klose, Joachim; Jungblut, Peter R

    2005-01-01

    Exploration of the lenticular proteome poses a challenging and worthwhile undertaking, as cataracts, the product of a disease phenotype elicited by this proteome, remain the leading cause of vision impairment worldwide. The complete ten-day-old lens proteome of Mus musculus C57BL/6J was resolved into 900 distinct spots by large-gel carrier-ampholyte-based 2-DE. The predicted amino acid sequences of all 16 crystallins ubiquitous in mammals were corroborated by mass spectrometry (MS). In detailed individual spot analyses, the primary structure of the full murine C57BL/6J beaded filament component phakinin CP49 was sequenced by liquid chromatography/electrospray ionization-tandem MS and amended at two positions. This definitive polypeptide sequence was aligned to the mouse genome, thus identifying the entire C57BL/6J genomic coding region. Also, two murine C57BL/6J polypeptides, both previously classified as gamma F crystallin, were clearly distinguished by MS and electrophoretic mobility. Both were assigned to their respective genes, and one of the polypeptides was reclassified as C57BL/6J gamma E crystallin. Building on these data and previous investigations, an updated crystallin reference map was put forth and several non-crystallin lenticular components were examined. These results represent the first part of a comprehensive investigation of the mouse lens proteome (http://www.mpiib-berlin.mpg.de/2D-PAGE) with emphasis on understanding genetic effects on proteins and disease development.

  15. Argon Cluster Sputtering Source for ToF-SIMS Depth Profiling of Insulating Materials: High Sputter Rate and Accurate Interfacial Information.

    PubMed

    Wang, Zhaoying; Liu, Bingwen; Zhao, Evan W; Jin, Ke; Du, Yingge; Neeway, James J; Ryan, Joseph V; Hu, Dehong; Zhang, Kelvin H L; Hong, Mina; Le Guernic, Solenne; Thevuthasan, Suntharampilai; Wang, Fuyi; Zhu, Zihua

    2015-08-01

    An argon cluster ion sputtering source has been demonstrated to perform better than traditional oxygen and cesium ion sputtering sources for ToF-SIMS depth profiling of insulating materials. The superior performance is attributed to effective alleviation of surface charging. A simulated nuclear waste glass (SON68) and layered hole-perovskite oxide thin films were selected as model systems because of their fundamental and practical significance. Our results show that high sputter rates and accurate interfacial information can be achieved simultaneously with argon cluster sputtering, whereas this is not the case for cesium and oxygen sputtering. Therefore, the implementation of an argon cluster sputtering source can significantly improve the analysis efficiency of insulating materials and, thus, can expand its applications to the study of glass corrosion, perovskite oxide thin film characterization, and many other systems of interest.

  16. IrisPlex: a sensitive DNA tool for accurate prediction of blue and brown eye colour in the absence of ancestry information.

    PubMed

    Walsh, Susan; Liu, Fan; Ballantyne, Kaye N; van Oven, Mannis; Lao, Oscar; Kayser, Manfred

    2011-06-01

    A new era of 'DNA intelligence' is arriving in forensic biology, due to the impending ability to predict externally visible characteristics (EVCs) from biological material such as that found at crime scenes. EVC prediction from forensic samples, or from body parts, is expected to help concentrate police investigations towards finding unknown individuals, at times when conventional DNA profiling fails to provide informative leads. Here we present a robust and sensitive tool, termed IrisPlex, for the accurate prediction of blue and brown eye colour from DNA in future forensic applications. We used the six currently most eye colour-informative single nucleotide polymorphisms (SNPs) that previously revealed prevalence-adjusted prediction accuracies of over 90% for blue and brown eye colour in 6168 Dutch Europeans. The single multiplex assay, based on SNaPshot chemistry and capillary electrophoresis, both widely used in forensic laboratories, displays high levels of genotyping sensitivity with complete profiles generated from as little as 31 pg of DNA, approximately six human diploid cell equivalents. We also present a prediction model to correctly classify an individual's eye colour, via probability estimation solely based on DNA data, and illustrate the accuracy of the developed prediction test on 40 individuals from various geographic origins. Moreover, we obtained insights into the worldwide allele distribution of these six SNPs using the HGDP-CEPH samples of 51 populations. Eye colour prediction analyses from HGDP-CEPH samples provide evidence that the test and model presented here perform reliably without prior ancestry information, although future worldwide genotype and phenotype data shall confirm this notion. As our IrisPlex eye colour prediction test is capable of immediate implementation in forensic casework, it represents one of the first steps forward in the creation of a fully individualised EVC prediction system for future use in forensic DNA intelligence.

  17. Details of assessing information content of the Tropospheric Infrared Mapping Spectrometers (TIMS) GEO-CAPE instrument concept when applied for several infrared ozone bands

    NASA Astrophysics Data System (ADS)

    Rairden, R. L.; Kumer, J. B.; Roche, A. E.; Desouza-Machado, S. G.; Chatfield, R. B.; Blatherwick, R.

    2009-12-01

    With support of the NASA ESTO Instrument Incubator Program (IIP), Tropospheric Infrared Mapping Spectrometers (TIMS) have been demonstrated for multi-layer retrieval of atmospheric CO. Two TIMS units operating in spectral regions centered at 2.33 and 4.68 µm were developed for this demonstration. Here we present the details of scaling the characteristics of the demonstration measurements, including spectral range, sample spacing and resolution, and noise per sample, to the scenario of the GEO-CAPE mission and to several additional wavelength regions. This includes the details of expanding to more than two spectral regions, and an example of scaling the demonstrated noise to the space case and to other spectral regions. As in our oral presentation, methods based on these scaled instrument characteristics for estimating vertical information content are reviewed. The methods are applied, and the estimated vertical information content of measurements in ozone bands near 9.4, 4.7, 3.6 and 3.3 µm, and in various combinations of these bands, is presented. A simple simultaneous retrieval of humidity and ozone from atmospheric spectral absorption data in the 3.3 and 3.6 µm regions, obtained by a solar-viewing FTS, is briefly presented. This is partially analogous to the retrieval of ozone from the earth’s surface diffuse reflection of sunlight as viewed from space. It supports the premise that these space-borne measurements can contribute to the quality of the GEO-CAPE ozone measurements.

  18. General Information about Testicular Cancer

    MedlinePlus

    ... professional versions have detailed information written in technical language. The patient versions are written in easy-to-understand, nontechnical language. Both versions have cancer information that is accurate ...

  19. General Information about Prostate Cancer

    MedlinePlus

    ... professional versions have detailed information written in technical language. The patient versions are written in easy-to-understand, nontechnical language. Both versions have cancer information that is accurate ...

  20. General Information about Urethral Cancer

    MedlinePlus

    ... professional versions have detailed information written in technical language. The patient versions are written in easy-to-understand, nontechnical language. Both versions have cancer information that is accurate ...

  1. Establishing an Appropriate Level of Detail (LoD) for a Building Information Model (BIM) - West Block, Parliament Hill, Ottawa, Canada

    NASA Astrophysics Data System (ADS)

    Fai, S.; Rafeiro, J.

    2014-05-01

    In 2011, Public Works and Government Services Canada (PWGSC) embarked on a comprehensive rehabilitation of the historically significant West Block of Canada's Parliament Hill. With over 17 thousand square meters of floor space, the West Block is one of the largest projects of its kind in the world. As part of the rehabilitation, PWGSC is working with the Carleton Immersive Media Studio (CIMS) to develop a building information model (BIM) that can serve as a maintenance and life-cycle management tool once construction is completed. The scale and complexity of the model have presented many challenges. One of these challenges is determining appropriate levels of detail (LoD). While still a matter of debate in the development of international BIM standards, LoD is further complicated in the context of heritage buildings because we must reconcile the LoD of the BIM with that used in the documentation process (terrestrial laser scan and photogrammetric survey data). In this paper, we will discuss our work to date on establishing appropriate LoD within the West Block BIM that will best serve the end use. To facilitate this, we have developed a single parametric model for gothic pointed arches that can be used for over seventy-five unique window types present in the West Block. Using the AEC (CAN) BIM as a reference, we have developed a workflow to test each of these window types at three distinct levels of detail. We have found that the parametric Gothic arch significantly reduces the amount of time necessary to develop scenarios to test appropriate LoD.

  2. Accurate quantum chemical calculations

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.

    1989-01-01

    An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics is discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.

  3. General Information about Adult Brain Tumors

    MedlinePlus

    ... professional versions have detailed information written in technical language. The patient versions are written in easy-to-understand, nontechnical language. Both versions have cancer information that is accurate ...

  4. A Homegrown Design for Data Warehousing: A District Customizes Its Own Process for Generating Detailed Information about Students in Real Time

    ERIC Educational Resources Information Center

    Thompson, Terry J.; Gould, Karen J.

    2005-01-01

    In recent years the Metropolitan School District of Wayne Township in Indianapolis has been awash in data. In attempts to improve levels of student achievement, the authors collected all manner of statistical details about students and schools and attempted to perform data analysis as part of the school improvement process. The authors were never…

  5. Detailed cross sections of the Eocene Green River Formation along the north and east margins of the Piceance Basin, western Colorado, using measured sections and drill hole information

    USGS Publications Warehouse

    Johnson, Ronald C.

    2014-01-01

    This report presents two detailed cross sections of the Eocene Green River Formation in the Piceance Basin, northwestern Colorado, constructed from eight detailed measured sections, fourteen core holes, and two rotary holes. The Eocene Green River Formation in the Piceance Basin contains the world’s largest known oil shale deposit with more than 1.5 trillion barrels of oil in place. It was deposited in Lake Uinta, a long-lived saline lake that once covered much of the Piceance Basin and the Uinta Basin to the west. The cross sections extend across the northern and eastern margins of the Piceance Basin and are intended to aid in correlating between surface sections and the subsurface in the basin.

  6. Student Information Systems Demystified: The Increasing Demand for Accurate, Timely Data Means Schools and Districts Are Relying Heavily on SIS Technologies

    ERIC Educational Resources Information Center

    McIntire, Todd

    2004-01-01

    Student information systems, one of the first applications of computer technology in education, are undergoing a significant transition yet again. The first major shift in SIS technologies occurred about 15 years ago when they evolved from mainframe programs to client-server solutions. Now, vendors across the board are offering centralized…

  7. Detailed design package for design of a video system providing optimal visual information for controlling payload and experiment operations with television

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A detailed description of a video system for controlling space shuttle payloads and experiments is presented in the preliminary design review and critical design review, first and second engineering design reports respectively, and in the final report submitted jointly with the design package. The material contained in the four subsequent sections of the package contains system descriptions, design data, and specifications for the recommended 2-view system. Section 2 contains diagrams relating to the simulation test configuration of the 2-view system. Section 3 contains descriptions and drawings of the deliverable breadboard equipment. A description of the recommended system is contained in Section 4 with equipment specifications in Section 5.

  8. Effect of detailed information in the minority game: optimality of 2-day memory and enhanced efficiency due to random exogenous data

    NASA Astrophysics Data System (ADS)

    Sasidevan, V.

    2016-07-01

    In the minority game (MG), an odd number of heterogeneous and adaptive agents choose between two alternatives, and those who end up on the minority side win. When the information available to the agents to make their choice is the identity of the minority side for the past m days, it is well known that the emergent coordination among the agents is maximum when m ∼ log₂(N). The optimal memory length thus increases with the system size. In this work we show that, in an MG where the information available to the agents is instead the strength of the minority side for the past m days, the optimal memory length for the agents is always two (m = 2) for large enough system sizes. The system is inefficient for m = 1 and converges to random-choice behaviour for m > 2 at large N. Surprisingly, providing the agents with uniformly and randomly sampled m = 1 exogenous information results in an increase in coordination between them compared to the case of endogenous information with any value of m. This is in stark contrast to the conventional MG, where the agents' coordination is invariant or gets worse under such random exogenous information.
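
For context, the conventional MG the abstract contrasts against can be simulated in a few dozen lines. This sketch implements the standard Challet-Zhang setup (binary minority-side history, lookup-table strategies), not the strength-of-minority variant studied in the paper; it returns the attendance variance per agent, the usual inefficiency measure that is minimized near m ∼ log₂(N):

```python
import random

def play_minority_game(n_agents=101, m=3, n_strategies=2, rounds=2000, seed=0):
    """Simulate the conventional minority game and return the variance of
    attendance per agent (lower = better coordination among agents)."""
    rng = random.Random(seed)
    n_hist = 2 ** m  # number of possible m-bit histories
    # Each strategy is a fixed 0/1 choice for every possible history.
    strategies = [[[rng.randrange(2) for _ in range(n_hist)]
                   for _ in range(n_strategies)] for _ in range(n_agents)]
    scores = [[0] * n_strategies for _ in range(n_agents)]
    history = rng.randrange(n_hist)
    attendance = []
    for _ in range(rounds):
        # Each agent plays its currently best-scoring strategy.
        choices = [strategies[a][max(range(n_strategies),
                                     key=lambda s: scores[a][s])][history]
                   for a in range(n_agents)]
        ones = sum(choices)
        minority = 1 if ones < n_agents - ones else 0  # n_agents odd: no ties
        attendance.append(ones)
        # Reward every strategy that would have picked the minority side.
        for a in range(n_agents):
            for s in range(n_strategies):
                if strategies[a][s][history] == minority:
                    scores[a][s] += 1
        history = ((history << 1) | minority) % n_hist  # slide the m-bit window
    mean = sum(attendance) / len(attendance)
    var = sum((x - mean) ** 2 for x in attendance) / len(attendance)
    return var / n_agents
```

Sweeping m for fixed N and plotting the returned value reproduces the well-known minimum of the inefficiency near m ∼ log₂(N).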

  9. Detailed mechanism of benzene oxidation

    NASA Technical Reports Server (NTRS)

    Bittker, David A.

    1987-01-01

    A detailed quantitative mechanism for the oxidation of benzene in both argon- and nitrogen-diluted systems is presented. Computed ignition delay times for argon-diluted mixtures are in satisfactory agreement with experimental results for a wide range of initial conditions. An experimental temperature-versus-time profile for a nitrogen-diluted oxidation was accurately matched, and several concentration profiles were matched qualitatively. Application of sensitivity analysis has given approximate rate-constant expressions for the two dominant heat-release reactions, the oxidation of C6H5 and C5H5 radicals by molecular oxygen.

  10. Detailed requirements document for Stowage List and Hardware Tracking System (SLAHTS). [computer based information management system in support of space shuttle orbiter stowage configuration

    NASA Technical Reports Server (NTRS)

    Keltner, D. J.

    1975-01-01

    The stowage list and hardware tracking system, a computer based information management system, used in support of the space shuttle orbiter stowage configuration and the Johnson Space Center hardware tracking is described. The input, processing, and output requirements that serve as a baseline for system development are defined.

  11. Systematic assessment of coordinated activity cliffs formed by kinase inhibitors and detailed characterization of activity cliff clusters and associated SAR information.

    PubMed

    Dimova, Dilyana; Stumpfe, Dagmar; Bajorath, Jürgen

    2015-01-27

    From currently available kinase inhibitors and their activity data, clusters of coordinated activity cliffs were systematically derived and subjected to cluster index and index map analysis. Type I-like inhibitors with well-defined IC50 measurements were found to provide a large knowledge base of activity cliff clusters for 266 targets from nine kinase groups. On the basis of index map analysis, these clusters were systematically organized according to structural similarity of inhibitors and activity cliff diversity and prioritized for structure-activity relationship (SAR) analysis. From prioritized clusters, interpretable SAR information can be extracted. It is also shown that activity cliff clusters formed by ATP site-directed inhibitors often represent local SAR environments of rather different complexity and interpretability. In addition, activity cliff clusters including promiscuous kinase inhibitors have been determined. Only a small subset of inhibitors was found to change activity cliff roles in different clusters. The activity cliff clusters described herein and their index map organization substantially enrich SAR information associated with kinase inhibitors in compound subsets of limited size. The cluster and index map information is made available upon request to provide opportunities for further SAR exploration. On the basis of our analysis and the data provided, activity cliff clusters and corresponding inhibitor series for kinase targets of interest can be readily selected.

  12. Profitable capitation requires accurate costing.

    PubMed

    West, D A; Hicks, L L; Balas, E A; West, T D

    1996-01-01

    In the name of costing accuracy, nurses are asked to track inventory use on a per-treatment basis, while more significant costs, such as general overhead and nursing salaries, are usually allocated to patients or treatments on an average-cost basis. Accurate treatment costing and financial viability require analysis of all resources actually consumed in treatment delivery, including nursing services and inventory. More precise costing information enables more profitable decisions, as is demonstrated by comparing the ratio-of-cost-to-treatment method (aggregate costing) with alternative activity-based costing (ABC) methods. Nurses must participate in this costing process to ensure that capitation bids are based upon accurate costs rather than simple averages. PMID:8788799
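
The aggregate-versus-ABC contrast the abstract draws can be illustrated with arithmetic; every figure below is invented for illustration, and the activity driver (nursing minutes) is one plausible choice among several:

```python
# Hypothetical numbers comparing aggregate (ratio-of-cost-to-treatment)
# costing with a simple activity-based costing (ABC) allocation.
treatments = {"dialysis": 40, "infusion": 60}   # treatment counts
total_cost = 50_000.0                           # overhead + nursing + inventory

# Aggregate: every treatment gets the same average cost.
avg_cost = total_cost / sum(treatments.values())

# ABC: allocate by nursing minutes actually consumed per treatment type.
minutes = {"dialysis": 240, "infusion": 60}     # nursing minutes per treatment
total_minutes = sum(minutes[t] * n for t, n in treatments.items())
cost_per_minute = total_cost / total_minutes
abc_cost = {t: minutes[t] * cost_per_minute for t in treatments}
```

With these numbers the average cost is 500 per treatment, while ABC assigns roughly 909 to each dialysis and 227 to each infusion: a capitation bid priced at the average would undercharge the resource-intensive service.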

  13. Horizontal Cross Bracing Detail, Vertical Cross Bracing Detail, Horizontal Cross ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Horizontal Cross Bracing Detail, Vertical Cross Bracing Detail, Horizontal Cross Bracing Detail, Vertical Cross Bracing-End Detail - Cumberland Covered Bridge, Spanning Mississinewa River, Matthews, Grant County, IN

  14. We Built This House; It's Time to Move in: Leveraging Existing DICOM Structure to More Completely Utilize Readily Available Detailed Contrast Administration Information.

    PubMed

    Hirsch, Jeffrey D; Siegel, Eliot L; Balasubramanian, Sridhar; Wang, Kenneth C

    2015-08-01

    The Digital Imaging and Communications in Medicine (DICOM) standard is the universal format for interoperability in medical imaging. In addition to imaging data, DICOM has evolved to support a wide range of imaging metadata including contrast administration data that is readily available from many modern contrast injectors. Contrast agent, route of administration, start and stop time, volume, flow rate, and duration can be recorded using DICOM attributes [1]. While this information is sparsely and inconsistently recorded in routine clinical practice, it could potentially be of significant diagnostic value. This work will describe parameters recorded by automatic contrast injectors, summarize the DICOM mechanisms available for tracking contrast injection data, and discuss the role of such data in clinical radiology. PMID:25700615
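
As a sketch of the mapping the abstract describes, the injection parameters it lists correspond to standard attribute keywords in the DICOM data dictionary (PS3.6). The injector-side field names below are hypothetical; the DICOM keywords and tags are standard:

```python
# Injection parameters mapped to standard DICOM attributes (keyword, tag).
CONTRAST_ATTRIBUTES = {
    "agent":           ("ContrastBolusAgent",     (0x0018, 0x0010)),
    "route":           ("ContrastBolusRoute",     (0x0018, 0x1040)),
    "volume_ml":       ("ContrastBolusVolume",    (0x0018, 0x1041)),
    "start_time":      ("ContrastBolusStartTime", (0x0018, 0x1042)),
    "stop_time":       ("ContrastBolusStopTime",  (0x0018, 0x1043)),
    "flow_rate_ml_s":  ("ContrastFlowRate",       (0x0018, 0x1046)),
    "flow_duration_s": ("ContrastFlowDuration",   (0x0018, 0x1047)),
}

def summarize_injection(record: dict) -> dict:
    """Map an injector log entry (hypothetical field names) onto the DICOM
    keywords above; fields without a DICOM counterpart are omitted."""
    return {CONTRAST_ATTRIBUTES[k][0]: v
            for k, v in record.items() if k in CONTRAST_ATTRIBUTES}
```

A toolkit such as pydicom could then write the resulting keyword/value pairs into the image headers, making the injection data queryable alongside the pixel data.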

  15. Accurate monotone cubic interpolation

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1991-01-01

    Monotone piecewise cubic interpolants are simple and effective. They are generally third-order accurate, except near strict local extrema where accuracy degenerates to second-order due to the monotonicity constraint. Algorithms for piecewise cubic interpolants, which preserve monotonicity as well as uniform third and fourth-order accuracy are presented. The gain of accuracy is obtained by relaxing the monotonicity constraint in a geometric framework in which the median function plays a crucial role.
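
For context, a standard monotonicity-preserving piecewise cubic Hermite interpolant can be sketched as below. This is the second-order-limited Fritsch-Carlson scheme that the abstract's algorithms improve upon, not Huynh's median-based higher-order method:

```python
from bisect import bisect_right

def monotone_cubic(xs, ys):
    """Return an evaluator for a monotone piecewise cubic Hermite
    interpolant through (xs, ys), with Fritsch-Carlson derivative limiting."""
    n = len(xs)
    h = [xs[i + 1] - xs[i] for i in range(n - 1)]
    delta = [(ys[i + 1] - ys[i]) / h[i] for i in range(n - 1)]
    # Initial derivative estimates: one-sided at the ends, averaged inside.
    d = ([delta[0]]
         + [(delta[i - 1] + delta[i]) / 2 for i in range(1, n - 1)]
         + [delta[-1]])
    # Limit derivatives so each cubic piece stays monotone.
    for i in range(n - 1):
        if delta[i] == 0.0:
            d[i] = d[i + 1] = 0.0            # flat data must stay flat
        else:
            a, b = d[i] / delta[i], d[i + 1] / delta[i]
            if a < 0.0:
                d[i], a = 0.0, 0.0           # wrong-signed slope: clamp
            if b < 0.0:
                d[i + 1], b = 0.0, 0.0
            if a * a + b * b > 9.0:          # Fritsch-Carlson circle criterion
                t = 3.0 / (a * a + b * b) ** 0.5
                d[i], d[i + 1] = t * a * delta[i], t * b * delta[i]

    def evaluate(x):
        i = min(max(bisect_right(xs, x) - 1, 0), n - 2)
        t = (x - xs[i]) / h[i]
        h00 = 2 * t**3 - 3 * t**2 + 1        # cubic Hermite basis functions
        h10 = t**3 - 2 * t**2 + t
        h01 = -2 * t**3 + 3 * t**2
        h11 = t**3 - t**2
        return (h00 * ys[i] + h10 * h[i] * d[i]
                + h01 * ys[i + 1] + h11 * h[i] * d[i + 1])

    return evaluate
```

The limiting step is exactly where accuracy degrades to second order near strict local extrema: the clamped derivatives no longer match the underlying function's, which is the defect the abstract's median-based algorithms address.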

  16. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single-step explicit methods, they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high-order and high-resolution algorithms can produce accurate results after O(10(exp 6)) periods of propagation with eight grid points per wavelength.
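
As a generic illustration of the order-of-accuracy notion (not the paper's aeroacoustics algorithms), here are second- and fourth-order central-difference stencils for the first derivative; halving the grid spacing shrinks the fourth-order error by roughly 2⁴ = 16:

```python
import math

def central_diff(f, x, dx, order=4):
    """First derivative of f at x by a central-difference stencil.
    order=2: (f(x+h) - f(x-h)) / 2h, error O(h^2).
    order=4: five-point stencil, error O(h^4)."""
    if order == 2:
        return (f(x + dx) - f(x - dx)) / (2 * dx)
    return (-f(x + 2 * dx) + 8 * f(x + dx)
            - 8 * f(x - dx) + f(x - 2 * dx)) / (12 * dx)
```

Measuring the error against a known derivative (e.g. d/dx sin x = cos x) for two grid spacings confirms the convergence rate empirically.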

  17. Detailed Clinical Models: A Review

    PubMed Central

    Goossen-Baremans, Anneke; van der Zel, Michael

    2010-01-01

    Objectives Due to the increasing use of electronic patient records and other health care information technology, we see an increase in requests to utilize these data. A high level of standardization is required when gathering these data in the clinical context in order to use them for analyses. Detailed Clinical Models (DCM) have been created toward this purpose, and several initiatives have been implemented in various parts of the world to create standardized models. This paper presents a review of DCM. Methods Two types of analyses are presented: one comparing DCM against health care information architectures, and a second, bottom-up approach from concept analysis to representation. In addition, core parts of the draft ISO standard 13972 on DCM are used, such as clinician involvement, data element specification, modeling, meta information, and repository and governance. Results Six initiatives were selected: Intermountain Healthcare, 13606/OpenEHR Archetypes, Clinical Templates, Clinical Contents Models, Health Level 7 templates, and Dutch Detailed Clinical Models. Each model selected was reviewed for its overall development, involvement of clinicians, use of data types, code bindings, expression of semantics, modeling, meta information, use of repository, and governance. Conclusions Using both a top-down and a bottom-up approach to comparison reveals many commonalities and differences between initiatives. Important differences include the use of or lack of a reference model and the expressiveness of models. Applying clinical data element standards facilitates the use of conceptual DCM models in different technical representations. PMID:21818440

  18. Chord Splicing & Joining Detail; Chord & CrossBracing Joint Details; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Chord Splicing & Joining Detail; Chord & Cross-Bracing Joint Details; Cross Bracing Center Joint Detail; Chord & Diagonal Joint Detail - Vermont Covered Bridge, Highland Park, spanning Kokomo Creek at West end of Deffenbaugh Street (moved to), Kokomo, Howard County, IN

  19. Arch & Chord Joint Detail; Crossbracing Center Joint Detail; Chord, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Arch & Chord Joint Detail; Crossbracing Center Joint Detail; Chord, Panel Post, Tie & Diagonal Brace Joint Detail; Chord, Panel Post, Tie & Crossbracing Joint Detail - Dunlapsville Covered Bridge, Spanning East Fork Whitewater River, Dunlapsville, Union County, IN

  20. Accurate and Accidental Empathy.

    ERIC Educational Resources Information Center

    Chandler, Michael

    The author offers two controversial criticisms of what are rapidly becoming standard assessment procedures for the measurement of empathic skill. First, he asserts that assessment procedures which attend exclusively to the accuracy with which subjects are able to characterize other people's feelings provide little or no useful information about…

  1. Detailed sensory memory, sloppy working memory.

    PubMed

    Sligte, Ilja G; Vandenbroucke, Annelinde R E; Scholte, H Steven; Lamme, Victor A F

    2010-01-01

    Visual short-term memory (VSTM) enables us to actively maintain information in mind for a brief period of time after stimulus disappearance. According to recent studies, VSTM consists of three stages - iconic memory, fragile VSTM, and visual working memory - with increasingly stricter capacity limits and progressively longer lifetimes. Still, the resolution (or amount of visual detail) of each VSTM stage has remained unexplored, and we test this in the present study. We presented people with a change detection task that measures the capacity of all three forms of VSTM, and we added an identification display after each change trial that required people to identify the "pre-change" object. Accurate change detection plus pre-change identification requires subjects to have a high-resolution representation of the "pre-change" object, whereas change detection or identification alone can be based on a hunch that something has changed, without knowing exactly what was presented before. We observed that people maintained 6.1 objects in iconic memory, 4.6 objects in fragile VSTM, and 2.1 objects in visual working memory. Moreover, when people detected the change, they could also identify the pre-change object on 88% of the iconic memory trials, on 71% of the fragile VSTM trials, and on merely 53% of the visual working memory trials. This suggests that people maintain many high-resolution representations in iconic memory and fragile VSTM, but only one high-resolution object representation in visual working memory. PMID:21897823
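The capacity figures quoted (6.1, 4.6 and 2.1 objects) are estimated from change-detection performance. The paper's exact estimator is not given in this abstract; a common capacity measure for such tasks is Cowan's K, sketched below with hypothetical hit and false-alarm rates:

```python
def cowan_k(hit_rate, false_alarm_rate, set_size):
    # Cowan's K: estimated number of items held in visual short-term memory,
    # correcting hits for the guessing rate implied by false alarms
    return set_size * (hit_rate - false_alarm_rate)

# hypothetical rates for a display of 8 items
k = cowan_k(0.9, 0.15, 8)
print(k)  # 6.0, in the range reported here for iconic memory
```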

  2. Accurate Optical Reference Catalogs

    NASA Astrophysics Data System (ADS)

    Zacharias, N.

    2006-08-01

    Current and near future all-sky astrometric catalogs on the ICRF are reviewed with the emphasis on reference star data at optical wavelengths for user applications. The standard error of a Hipparcos Catalogue star position is now about 15 mas per coordinate. For the Tycho-2 data it is typically 20 to 100 mas, depending on magnitude. The USNO CCD Astrograph Catalog (UCAC) observing program was completed in 2004 and reductions toward the final UCAC3 release are in progress. This all-sky reference catalogue will have positional errors of 15 to 70 mas for stars in the 10 to 16 mag range, with a high degree of completeness. Proper motions for the roughly 60 million UCAC stars will be derived by combining UCAC astrometry with available early-epoch data, including as-yet-unpublished scans of the complete set of AGK2, Hamburg Zone astrograph and USNO Black Birch programs. Accurate positional and proper motion data are combined in the Naval Observatory Merged Astrometric Dataset (NOMAD), which includes Hipparcos, Tycho-2, UCAC2, USNO-B1 and NPM+SPM plate scan data for astrometry, and is supplemented by multi-band optical photometry as well as 2MASS near infrared photometry. The Milli-Arcsecond Pathfinder Survey (MAPS) mission is currently being planned at USNO. This is a micro-satellite to obtain 1 mas positions, parallaxes, and 1 mas/yr proper motions for all bright stars down to about 15th magnitude. This program will be supplemented by a ground-based program to reach 18th magnitude at the 5 mas level.

  3. LF460 detail design

    NASA Technical Reports Server (NTRS)

    1971-01-01

    This is the final technical report documenting the detail design of the LF460, an advanced turbotip lift fan intended for application with the YJ97-GE-100 turbojet gas generator to a V/STOL transport research aircraft. The primary objective of the design was to achieve a low noise level while maintaining the high thrust/weight ratio capability of a high-pressure-ratio lift fan. The report covers design requirements and summarizes activities and final results in the areas of aerodynamic and mechanical design, component and system performance, acoustic features and final noise predictions.

  4. Details of meiosis

    SciTech Connect

    1993-12-31

    Chapter 18 discusses the details of meiosis, beginning with the structure and number of chiasmata, the cytological term for the points at which two homologous chromosomes forming a bivalent, having begun to repel each other, remain held together by crossing-over. The synaptonemal complex, which consists of two lateral elements containing protein and RNA, is also discussed. The chapter concludes with a description of meiosis in polyploids, human meiosis, and the behavior of X and Y chromosomes. 28 refs., 8 figs.

  5. Detailed Simulations of Weak-to-Strong Ignition of a H2/O2/Ar Mixture in Shock-Tubes

    NASA Astrophysics Data System (ADS)

    Ihme, Matthias; Sun, Yong; Deiterding, Ralf

    The accurate description of chemical-kinetic models is critical for characterizing effects of new fuel compositions on existing propulsion systems and for developing future combustion technologies. Among other facilities, shock tubes remain hereby invaluable in providing detailed information about ignition delay times, extinction limits, and species time-histories for the development and validation of reaction mechanisms.

  6. Detailed Debunking of Denial

    NASA Astrophysics Data System (ADS)

    Enting, I. G.; Abraham, J. P.

    2012-12-01

    The disinformation campaign against climate science has been compared to a guerilla war whose tactics undermine the traditional checks and balances of science. One comprehensive approach has been to produce archives of generic responses, such as the websites of RealClimate and SkepticalScience. We review our experiences with an alternative approach of detailed responses to a small number of high-profile cases. Our particular examples were Professor Ian Plimer and Christopher Monckton, the Third Viscount Monckton of Brenchley, each of whom has been taken seriously by political leaders in our respective countries. We relate our experiences to comparable examples such as John Mashey's analysis of the Wegman report and the formal complaints about Lomborg's "Skeptical Environmentalist" and Durkin's "Great Global Warming Swindle". Our two cases took contrasting forms: an online video of a lecture vs an evolving compendium of misrepresentations. They also differed in emphasis. The analysis of Monckton concentrated on misrepresentation of the science, while the analysis of Plimer concentrated on departures from accepted scientific practice: fabrication of data, misrepresentation of cited sources and unattributed use of the work of others. Benefits of an evolving compendium were the ability to incorporate contributions from members of the public who had identified additional errors and the scope for addressing new aspects as they came to public attention. `Detailed debunking' gives non-specialists a reference point for distinguishing non-science when engaging in public debate.

  7. Clinical professional governance for detailed clinical models.

    PubMed

    Goossen, William; Goossen-Baremans, Anneke

    2013-01-01

    This chapter describes the need for Detailed Clinical Models for contemporary electronic health systems, data exchange and data reuse. It starts with an explanation of the components related to Detailed Clinical Models, with a brief summary of knowledge representation, including terminologies representing clinically relevant "things" in the real world, and information models that abstract these in order to let computers process data about them. Next, Detailed Clinical Models are defined and their purpose is described. The work builds on existing developments around the world and culminates in current work to create a technical specification at the level of the International Organization for Standardization. The core components of properly expressed Detailed Clinical Models are illustrated, including clinical knowledge and context, data element specification, code bindings to terminologies, and meta-information about authors and versioning, among others. Detailed Clinical Models to date are heavily based on user requirements and specify the conceptual and logical levels of modelling. This is not precise enough for specific implementations, which require an additional step; however, it allows Detailed Clinical Models to serve as specifications for many different kinds of implementations. Examples of Detailed Clinical Models are presented both in text and in Unified Modelling Language. Detailed Clinical Models can be positioned in health information architectures, where they serve at the most detailed granular level. The chapter ends with examples of projects that create and deploy Detailed Clinical Models. All have in common that they can often reuse materials from earlier projects, and that strict governance of these models is essential to use them safely in health care information and communication technology. Clinical validation is one point of such governance, and model testing another. The Plan-Do-Check-Act cycle can be applied for governance of Detailed Clinical Models.

  8. Detail of Triton

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This color photo of Neptune's large satellite Triton was obtained on Aug. 24, 1989 at a range of 530,000 kilometers (330,000 miles). The resolution is about 10 kilometers (6.2 miles), sufficient to begin to show topographic detail. The image was made from pictures taken through the green, violet and ultraviolet filters. In this technique, regions that are highly reflective in the ultraviolet appear blue in color. In reality, there is no part of Triton that would appear blue to the eye. The bright southern hemisphere of Triton, which fills most of this frame, is generally pink in tone, as is the even brighter equatorial band. The darker regions north of the equator also tend to be pink or reddish in color. JPL manages the Voyager project for NASA's Office of Space Science, Washington, DC.

  9. Horizontal Cross Bracing Detail, Vertical Cross Bracing Detail, Horizontal Cross ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Horizontal Cross Bracing Detail, Vertical Cross Bracing Detail, Horizontal Cross Bracing Joint, Vertical Cross Bracing End Detail - Ceylon Covered Bridge, Limberlost Park, spanning Wabash River at County Road 900 South, Geneva, Adams County, IN

  10. Detailing 'measures that matter'.

    PubMed

    Heavisides, Bob

    2010-04-01

    In a paper originally presented at last October's Healthcare Estates conference in Harrogate, Bob Heavisides, director of facilities at the Milton Keynes NHS Foundation Trust, explains how estates and facilities directors can provide a package of information based on a number of "measures that matter" to demonstrate to their boards that safe systems of work, operational efficiency and effectiveness, and operational parameters, are within, or better than, equivalent-sized Trusts.

  11. roof truss detail, historic strap hinge detail Chopawamsic Recreational ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    roof truss detail, historic strap hinge detail - Chopawamsic Recreational Demonstration Area - Cabin Camp 1, Main Arts and Crafts Lodge, Prince William Forest Park, Triangle, Prince William County, VA

  12. DETAILED STUDIES OF ELECTRON COOLING FRICTION FORCE.

    SciTech Connect

    FEDOTOV, A.V.; BRUHWILER, D.L.; ABELL, D.T.; SIDORIN, A.O.

    2005-09-18

    High-energy electron cooling for RHIC presents many unique features and challenges. An accurate estimate of the cooling times requires detailed simulation of the electron cooling process. The first step towards such calculations is to have an accurate description of the cooling force. Numerical simulations are being used to explore various features of the friction force which arise from several effects, including the anisotropy of the electron distribution in velocity space and the effect of a strong solenoidal magnetic field. These aspects are being studied in detail using the VORPAL code, which explicitly resolves close binary collisions. Results are compared with available asymptotic and empirical formulas and also, using the BETACOOL code, with direct numerical integration of less approximate expressions over the specified electron distribution function.

  13. Morphological details in bloodstain particles.

    PubMed

    De Wael, K; Lepot, L

    2015-01-01

    During the commission of crimes, blood can be transferred to the clothing of the offender or to other crime-related objects. Bloodstain particles are sub-millimetre-sized flakes that are lost from dried bloodstains. The nature of these red particles is easily confirmed using spectroscopic methods. In casework, bloodstain particles showing highly detailed morphological features were observed. These provided a rationale for the series of experiments described in this work. It was found that the "largest" particles are shed from blood deposited on polyester and polyamide woven fabrics. No particles are lost from stains made on absorbent fabrics or from those made on knitted fabrics. The morphological features observed in bloodstain particles can provide important information on the substrates from which they were lost. PMID:25437904

  14. The Finer Details: Climate Modeling

    NASA Technical Reports Server (NTRS)

    2000-01-01

    If you want to know whether you will need sunscreen or an umbrella for tomorrow's picnic, you can simply read the local weather report. However, if you are calculating the impact of gas combustion on global temperatures, or anticipating next year's rainfall levels to set water conservation policy, you must conduct a more comprehensive investigation. Such complex matters require long-range modeling techniques that predict broad trends in climate development rather than day-to-day details. Climate models are built from equations that calculate the progression of weather-related conditions over time. Based on the laws of physics, climate model equations have been developed to predict a number of environmental factors, for example: 1. Amount of solar radiation that hits the Earth. 2. Varying proportions of gases that make up the air. 3. Temperature at the Earth's surface. 4. Circulation of ocean and wind currents. 5. Development of cloud cover. Numerical modeling of the climate can improve our understanding of both the past and the future. A model can confirm the accuracy of environmental measurements taken in the past and can even fill in gaps in those records. In addition, by quantifying the relationship between different aspects of climate, scientists can estimate how a future change in one aspect may alter the rest of the world. For example, could an increase in the temperature of the Pacific Ocean somehow set off a drought on the other side of the world? A computer simulation could lead to an answer for this and other questions. Quantifying the chaotic, nonlinear activities that shape our climate is no easy matter. You cannot run these simulations on your desktop computer and expect results by the time you have finished checking your morning e-mail. Efficient and accurate climate modeling requires powerful computers that can process billions of mathematical calculations in a single second. The NCCS exists to provide this degree of vast computing capability.
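None of the model equations are spelled out in the piece. As a hedged illustration of the simplest kind of physics-based climate equation, a zero-dimensional energy-balance model equates absorbed sunlight with emitted infrared radiation; the solar constant, albedo and effective emissivity below are textbook round numbers, not values from any NCCS model:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def equilibrium_temp(solar=1361.0, albedo=0.3, emissivity=0.61):
    # Balance absorbed solar radiation, S * (1 - a) / 4, against emitted
    # infrared, eps * sigma * T**4, and solve for the temperature T.
    absorbed = solar * (1 - albedo) / 4
    return (absorbed / (emissivity * SIGMA)) ** 0.25

T = equilibrium_temp()
print(round(T))  # close to Earth's observed mean surface temperature of ~288 K
```

Real climate models replace this single algebraic balance with millions of coupled equations stepped forward in time, which is why the supercomputing capability described above is needed.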

  15. Detail, Scandia Hotel, view to southwest showing details of balloon ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail, Scandia Hotel, view to southwest showing details of balloon framing, including full two-story studs notched to carry girts supporting second story floor joists (210mm lens) - Scandia Hotel, 225 First Street, Eureka, Humboldt County, CA

  16. Detail East Pier Elevation, Transverse Section, Detail Roof Plan As ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail East Pier Elevation, Transverse Section, Detail Roof Plan As Found - Sulphite Railroad Bridge, Former Boston & Maine Railroad (originally Tilton & Franklin Railroad) spanning Winnipesautee River, Franklin, Merrimack County, NH

  17. Detail view of ornamental lighting detail of southwest corner of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail view of ornamental lighting detail of southwest corner of Sixth Street Bridge. Looking northeast - Sixth Street Bridge, Spanning 101 Freeway at Sixth Street, Los Angeles, Los Angeles County, CA

  18. 58. DETAIL OF PINION AND BULL GEARS: Detail view towards ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    58. DETAIL OF PINION AND BULL GEARS: Detail view towards northeast of the pinion and bull gears of the winding machinery. - San Francisco Cable Railway, Washington & Mason Streets, San Francisco, San Francisco County, CA

  19. 6. Detail of front entry on E elevation. Detail of ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. Detail of front entry on E elevation. Detail of round, terra cotta medallions on E elevation indicating date of building. - Central of Georgia Railway, Red (Administration) Building, 233 West Broad Street, Savannah, Chatham County, GA

  20. Detail of pumps in troughs, detail of truss attachment ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Detail of pumps in troughs, detail of truss - attachment to the wall - as well as the troughs themselves. Interior of the main hatchery building, view to the east. - Prairie Creek Fish Hatchery, Hwy. 101, Orick, Humboldt County, CA

  1. A DArT marker genetic map of perennial ryegrass (Lolium perenne L.) integrated with detailed comparative mapping information; comparison with existing DArT marker genetic maps of Lolium perenne, L. multiflorum and Festuca pratensis

    PubMed Central

    2013-01-01

    Background Ryegrasses and fescues (genera Lolium and Festuca) are species of forage and turf grasses which are used widely in agricultural and amenity situations. They are classified within the sub-family Pooideae and so are closely related to Brachypodium distachyon, wheat, barley, rye and oats. Recently, a DArT array has been developed which can be used in generating marker and mapping information for ryegrasses and fescues. This represents a potential common marker set for ryegrass and fescue researchers which can be linked through to comparative genomic information for the grasses. Results An F2 perennial ryegrass genetic map was developed consisting of 7 linkage groups defined by 1316 markers, with a total map length of 683 cM. The marker set included 866 DArT and 315 gene sequence-based markers. Comparison with previous DArT mapping studies in perennial and Italian ryegrass (L. multiflorum) identified 87 and 105 DArT markers in common, respectively, of which 94% and 87% mapped to homoeologous linkage groups. A similar comparison with meadow fescue (F. pratensis) identified only 28 DArT markers in common, of which c. 50% mapped to non-homoeologous linkage groups. In L. perenne, the genetic distance spanned by the DArT markers encompassed the majority of the regions that could be described in terms of comparative genomic relationships with rice, Brachypodium distachyon, and Sorghum bicolor. Conclusions DArT markers are likely to be a useful common marker resource for ryegrasses and fescues, though the success in aligning different populations through the mapping of common markers will be influenced by degrees of population interrelatedness. The detailed mapping of DArT and gene-based markers in this study potentially allows comparative relationships to be derived in future mapping populations characterised using solely DArT markers. PMID:23819624

  2. Fast and accurate low-dimensional reduction of biophysically detailed neuron models.

    PubMed

    Marasco, Addolorata; Limongiello, Alessandro; Migliore, Michele

    2012-01-01

    Realistic models of neurons are quite successful in complementing traditional experimental techniques. However, their networks require a computational power beyond the capabilities of current supercomputers, and the methods used so far to reduce their complexity take into account neither the key features of the cells nor critical physiological properties. Here we introduce a new, automatic and fast method to map realistic neurons into equivalent reduced models running up to > 40 times faster while maintaining a very high accuracy of the membrane potential dynamics during synaptic inputs, and a direct link with experimental observables. The mapping of arbitrary sets of synaptic inputs, without additional fine tuning, would also allow the convenient and efficient implementation of a new generation of large-scale simulations of brain regions reproducing the biological variability observed in real neurons, with unprecedented advances to understand higher brain functions. PMID:23226594

  4. NNLOPS accurate associated HW production

    NASA Astrophysics Data System (ADS)

    Astill, William; Bizon, Wojciech; Re, Emanuele; Zanderighi, Giulia

    2016-06-01

    We present a next-to-next-to-leading order accurate description of associated HW production consistently matched to a parton shower. The method is based on reweighting events obtained with the HW plus one jet NLO accurate calculation implemented in POWHEG, extended with the MiNLO procedure, to reproduce NNLO accurate Born distributions. Since the Born kinematics are more complex than in the cases treated before, we use a parametrization of the Collins-Soper angles to reduce the number of variables required for the reweighting. We present phenomenological results at 13 TeV, with cuts suggested by the Higgs Cross Section Working Group.

  5. Influences on physicians' adoption of electronic detailing (e-detailing).

    PubMed

    Alkhateeb, Fadi M; Doucette, William R

    2009-01-01

    E-detailing means using digital technology for detailing: the internet, video conferencing and interactive voice response. There are two types of e-detailing: interactive (virtual) and video. Currently, little is known about what factors influence physicians' adoption of e-detailing. The objectives of this study were to test a model of physicians' adoption of e-detailing and to describe physicians using e-detailing. A mail survey was sent to a random sample of 2000 physicians practicing in Iowa. Binomial logistic regression was used to test the model of influences on physician adoption of e-detailing. On the basis of Rogers' model of adoption, the independent variables included relative advantage, compatibility, complexity, peer influence, attitudes, years in practice, presence of restrictive access to traditional detailing, type of specialty, academic affiliation, type of practice setting and control variables. A total of 671 responses were received, giving a response rate of 34.7%. A total of 141 physicians (21.0%) reported using e-detailing. The overall adoption model for using either type of e-detailing was found to be significant. Relative advantage, peer influence, attitudes, type of specialty, presence of restrictive access and years of practice had significant influences on physician adoption of e-detailing. The model of adoption of innovation is useful to explain physicians' adoption of e-detailing. PMID:19306198
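The study's actual regression, with its full set of predictors and the authors' data, is not reproduced here. A minimal single-predictor sketch of the kind of binomial logistic model involved, fit by gradient ascent on made-up data (the predictor scale and responses below are hypothetical), looks like this:

```python
import math

def fit_logistic(xs, ys, lr=0.1, steps=2000):
    # One-predictor binomial logistic regression: P(adopt) = 1/(1+exp(-(b0+b1*x))),
    # fit by gradient ascent on the log-likelihood
    b0, b1 = 0.0, 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1 / (1 + math.exp(-(b0 + b1 * x)))
            g0 += y - p
            g1 += (y - p) * x
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

# hypothetical data: perceived relative advantage (1-5) vs. adoption (0/1)
xs = [1, 2, 2, 3, 3, 4, 4, 5]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
b0, b1 = fit_logistic(xs, ys)
print(b1 > 0)  # positive slope: higher relative advantage, higher adoption odds
```

A significant positive coefficient on a predictor such as relative advantage is the pattern the study reports.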

  6. Making detailed predictions makes (some) predictions worse

    NASA Astrophysics Data System (ADS)

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  7. Accurate mask model for advanced nodes

    NASA Astrophysics Data System (ADS)

    Zine El Abidine, Nacer; Sundermann, Frank; Yesilada, Emek; Ndiaye, El Hadji Omar; Mishra, Kushlendra; Paninjath, Sankaranarayanan; Bork, Ingo; Buck, Peter; Toublan, Olivier; Schanen, Isabelle

    2014-07-01

    Standard OPC models consist of a physical optical model and an empirical resist model. The resist model compensates for the imprecision of the optical model, in addition to modeling resist development. The optical model imprecision may result from mask topography effects and real mask information, including mask e-beam writing and mask process contributions. For advanced technology nodes, significant progress has been made in modeling mask topography to improve optical model accuracy. However, mask information is difficult to decorrelate from the standard OPC model. Our goal is to establish an accurate mask model through a dedicated calibration exercise. In this paper, we present a flow to calibrate an accurate mask model and enable its implementation. The study covers the different effects that should be embedded in the mask model as well as the experiments required to model them.

  8. The fine details of evolution.

    PubMed

    Laskowski, Roman A; Thornton, Janet M; Sternberg, Michael J E

    2009-08-01

    Charles Darwin's theory of evolution was based on studies of biology at the species level. In the time since his death, studies at the molecular level have confirmed his ideas about the kinship of all life on Earth and have provided a wealth of detail about the evolutionary relationships between different species and a deeper understanding of the finer workings of natural selection. We now have a wealth of data, including the genome sequences of a wide range of organisms, an even larger number of protein sequences, a significant knowledge of the three-dimensional structures of proteins, DNA and other biological molecules, and a huge body of information about the operation of these molecules as systems in the molecular machinery of all living things. This issue of Biochemical Society Transactions contains papers from oral presentations given at a Biochemical Society Focused Meeting to commemorate the 200th Anniversary of Charles Darwin's birth, held on 26-27 January 2009 at the Wellcome Trust Conference Centre, Cambridge. The talks reported on some of the insights into evolution which have been obtained from the study of protein sequences, structures and systems. PMID:19614583

  9. Automatic classification and accurate size measurement of blank mask defects

    NASA Astrophysics Data System (ADS)

    Bhamidipati, Samir; Paninjath, Sankaranarayanan; Pereira, Mark; Buck, Peter

    2015-07-01

    A blank mask and its preparation stages, such as cleaning or resist coating, play an important role in the eventual yield obtained by using it. Analysis of the impact of blank mask defects depends directly on the amount of available information, such as the number of defects observed and their accurate locations and sizes. Mask usability qualification at the start of the preparation process is crudely based on the number of defects. Similarly, defect information such as size is sought to estimate eventual defect printability on the wafer. Tracking of defect characteristics, specifically size and shape, across multiple stages can further be indicative of process-related information such as cleaning or coating process efficiencies. At the first level, inspection machines address the requirement of defect characterization by detecting and reporting relevant defect information. The analysis of this information, though, is still largely a manual process. With advancing technology nodes and shrinking half-pitch sizes, a large number of defects are observed, and the detailed knowledge required makes the manual defect review process an arduous task, in addition to adding sensitivity to human error. In cases where the defect information reported by the inspection machine is not sufficient, mask shops rely on other tools; use of CDSEM tools is one such option. However, these additional steps translate into increased costs. The Calibre NxDAT based MDPAutoClassify tool provides an automated software alternative to the manual defect review process. Working on defect images generated by inspection machines, the tool extracts and reports additional information such as defect location, useful for defect avoidance[4][5]; defect size, useful in estimating defect printability; and defect nature, e.g. particle, scratch, resist void, etc., useful for process monitoring. The tool makes use of smart and elaborate post-processing algorithms to achieve this. Their elaborateness is a consequence of the variety and

  10. The high cost of accurate knowledge.

    PubMed

    Sutcliffe, Kathleen M; Weber, Klaus

    2003-05-01

    Many business thinkers believe it's the role of senior managers to scan the external environment to monitor contingencies and constraints, and to use that precise knowledge to modify the company's strategy and design. As these thinkers see it, managers need accurate and abundant information to carry out that role. According to that logic, it makes sense to invest heavily in systems for collecting and organizing competitive information. Another school of pundits contends that, since today's complex information often isn't precise anyway, it's not worth going overboard with such investments. In other words, it's not the accuracy and abundance of information that should matter most to top executives--rather, it's how that information is interpreted. After all, the role of senior managers isn't just to make decisions; it's to set direction and motivate others in the face of ambiguities and conflicting demands. Top executives must interpret information and communicate those interpretations--they must manage meaning more than they must manage information. So which of these competing views is the right one? Research conducted by academics Sutcliffe and Weber found that how accurate senior executives are about their competitive environments is indeed less important for strategy and corresponding organizational changes than the way in which they interpret information about their environments. Investments in shaping those interpretations, therefore, may create a more durable competitive advantage than investments in obtaining and organizing more information. And what kinds of interpretations are most closely linked with high performance? Their research suggests that high performers respond positively to opportunities, yet they aren't overconfident in their abilities to take advantage of those opportunities.

  11. Memory for details with self-referencing.

    PubMed

    Serbun, Sarah J; Shih, Joanne Y; Gutchess, Angela H

    2011-11-01

    Self-referencing benefits item memory, but little is known about the ways in which referencing the self affects memory for details. Experiment 1 assessed whether the effects of self-referencing operate only at the item, or general, level or whether they also enhance memory for specific visual details of objects. Participants incidentally encoded objects by making judgements in reference to the self, a close other (one's mother), or a familiar other (Bill Clinton). Results indicate that referencing the self or a close other enhances both specific and general memory. Experiments 2 and 3 assessed verbal memory for source in a task that relied on distinguishing between different mental operations (internal sources). The results indicate that self-referencing disproportionately enhances source memory, relative to conditions referencing other people, semantic, or perceptual information. We conclude that self-referencing not only enhances specific memory for both visual and verbal information, but can also disproportionately improve memory for specific internal source details.

  12. 15. CYLINDER DETAILS; DETAILS OF STEEL FOR CYLINDERS NO. 50 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. CYLINDER DETAILS; DETAILS OF STEEL FOR CYLINDERS NO. 50 (PIER 5) AND NO. 66 (PIER 6), DWG. 83, CH BY AF, ECL, APPROVED BY O.F. LACKEY, MAY 18, 1908 - Baltimore Inner Harbor, Pier 5, South of Pratt Street between Market Place & Concord Street, Baltimore, Independent City, MD

  13. 10. CYLINDER DETAILS: DETAIL OF STEEL FOR CYLINDER NO. 59, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. CYLINDER DETAILS: DETAIL OF STEEL FOR CYLINDER NO. 59, PIER NO. 6, DWG. 86, 3/4" = 1', MADE BY A.F., CHECKED BY E.C.L., APPROVED BY O.F. LACKEY, JUNE 2, 1908 - Baltimore Inner Harbor, Pier 6, South of Pratt Street between Concord Street & Jones Falls outlet, Baltimore, Independent City, MD

  14. 33 CFR 116.20 - Detailed investigation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., Bridge Administration Program determines that a Detailed Investigation should be conducted, the District Commander will initiate an investigation that addresses all of the pertinent data regarding the bridge, including information obtained at a public meeting held under § 116.25. As part of the investigation,...

  15. Occupation Competency Profile: Steel Detailer Program.

    ERIC Educational Resources Information Center

    Alberta Learning, Edmonton. Apprenticeship and Industry Training.

    This document presents information about the apprenticeship training program of Alberta, Canada, in general and the steel detailer program in particular. The first part of the document discusses the following items: Alberta's apprenticeship and industry training system; the apprenticeship and industry training committee structure; local…

  16. An attempt to obtain a detailed declination chart from the United States magnetic anomaly map

    USGS Publications Warehouse

    Alldredge, L.R.

    1989-01-01

    Modern declination charts of the United States show almost no details. It was hoped that declination details could be derived from the information contained in the existing magnetic anomaly map of the United States. This could be realized only if all of the survey data were corrected to a common epoch, at which time a main-field vector model was known, before the anomaly values were computed. Because this was not done, accurate declination values cannot be determined. In spite of this conclusion, declination values were computed using a common main-field model for the entire United States to see how well they compared with observed values. The computed detailed declination values were found to compare less favourably with observed values of declination than declination values computed from the IGRF 1985 model itself. -from Author

  17. Accurate phase-shift velocimetry in rock.

    PubMed

    Shukla, Matsyendra Nath; Vallatos, Antoine; Phoenix, Vernon R; Holmes, William M

    2016-06-01

    Spatially resolved Pulsed Field Gradient (PFG) velocimetry techniques can provide precious information concerning flow through opaque systems, including rocks. This velocimetry data is used to enhance flow models in a wide range of systems, from oil behaviour in reservoir rocks to contaminant transport in aquifers. Phase-shift velocimetry is the fastest way to produce velocity maps but critical issues have been reported when studying flow through rocks and porous media, leading to inaccurate results. Combining PFG measurements for flow through Bentheimer sandstone with simulations, we demonstrate that asymmetries in the molecular displacement distributions within each voxel are the main source of phase-shift velocimetry errors. We show that when flow-related average molecular displacements are negligible compared to self-diffusion ones, symmetric displacement distributions can be obtained while phase measurement noise is minimised. We elaborate a complete method for the production of accurate phase-shift velocimetry maps in rocks and low porosity media and demonstrate its validity for a range of flow rates. This development of accurate phase-shift velocimetry now enables more rapid and accurate velocity analysis, potentially helping to inform both industrial applications and theoretical models. PMID:27111139
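
    The phase-shift method described above maps coherent flow linearly onto an NMR phase. A minimal sketch of that conversion, assuming the standard narrow-gradient-pulse relation phi = gamma·G·delta·Delta·v (the relation and all parameter values are illustrative assumptions, not taken from the paper):

```python
# Hedged sketch (assumed relation, not the authors' exact pipeline): in PFG
# NMR the flow-induced phase shift is phi = gamma * G * delta * Delta * v,
# so the per-voxel velocity follows by inversion.

GAMMA_H = 2.675e8  # 1H gyromagnetic ratio in rad s^-1 T^-1

def velocity_from_phase(phase_rad, G, delta, Delta, gamma=GAMMA_H):
    """Mean velocity (m/s) from a measured PFG phase shift.

    phase_rad -- phase shift attributable to coherent flow (rad)
    G         -- gradient amplitude (T/m)
    delta     -- gradient pulse duration (s)
    Delta     -- observation time between gradient pulses (s)
    """
    return phase_rad / (gamma * G * delta * Delta)

# 0.1 rad with G = 0.1 T/m, delta = 2 ms, Delta = 50 ms gives ~37 um/s
v = velocity_from_phase(0.1, 0.1, 2e-3, 50e-3)
```

    The paper's contribution is precisely about when this simple inversion fails (asymmetric displacement distributions within a voxel); the formula above is only the idealised starting point.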

  18. Accurate phase-shift velocimetry in rock

    NASA Astrophysics Data System (ADS)

    Shukla, Matsyendra Nath; Vallatos, Antoine; Phoenix, Vernon R.; Holmes, William M.

    2016-06-01

    Spatially resolved Pulsed Field Gradient (PFG) velocimetry techniques can provide precious information concerning flow through opaque systems, including rocks. This velocimetry data is used to enhance flow models in a wide range of systems, from oil behaviour in reservoir rocks to contaminant transport in aquifers. Phase-shift velocimetry is the fastest way to produce velocity maps but critical issues have been reported when studying flow through rocks and porous media, leading to inaccurate results. Combining PFG measurements for flow through Bentheimer sandstone with simulations, we demonstrate that asymmetries in the molecular displacement distributions within each voxel are the main source of phase-shift velocimetry errors. We show that when flow-related average molecular displacements are negligible compared to self-diffusion ones, symmetric displacement distributions can be obtained while phase measurement noise is minimised. We elaborate a complete method for the production of accurate phase-shift velocimetry maps in rocks and low porosity media and demonstrate its validity for a range of flow rates. This development of accurate phase-shift velocimetry now enables more rapid and accurate velocity analysis, potentially helping to inform both industrial applications and theoretical models.

  20. Computed tomography: the details.

    SciTech Connect

    Doerry, Armin Walter

    2007-07-01

    Computed Tomography (CT) is a well established technique, particularly in medical imaging, but also applied in Synthetic Aperture Radar (SAR) imaging. Basic CT imaging via back-projection is treated in many texts, but often with insufficient detail to appreciate subtleties such as the role of non-uniform sampling densities. Herein are given some details often neglected in many texts.
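
    Back-projection, named above as the basic CT reconstruction step, can be illustrated in a few lines. This is a toy, unfiltered version (nearest-neighbour sampling, no ramp filter), not a production reconstruction; real implementations must also handle the non-uniform sampling densities the report discusses:

```python
import numpy as np

# Toy unfiltered back-projection: each 1-D projection is smeared back across
# the image plane along its view angle, and the results are averaged.

def backproject(sinogram, angles, size):
    recon = np.zeros((size, size))
    ys, xs = np.mgrid[0:size, 0:size] - (size - 1) / 2.0  # centred pixel grid
    for proj, theta in zip(sinogram, angles):
        # detector coordinate of every pixel for this view angle
        t = xs * np.cos(theta) + ys * np.sin(theta)
        idx = np.clip(np.round(t + (size - 1) / 2.0).astype(int), 0, size - 1)
        recon += proj[idx]
    return recon / len(angles)

# A point source at the centre back-projects to a peak at the centre pixel.
angles = np.linspace(0.0, np.pi, 36, endpoint=False)
sino = np.zeros((36, 21))
sino[:, 10] = 1.0
rec = backproject(sino, angles, 21)
```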

  1. Hdr Imaging for Feature Detection on Detailed Architectural Scenes

    NASA Astrophysics Data System (ADS)

    Kontogianni, G.; Stathopoulou, E. K.; Georgopoulos, A.; Doulamis, A.

    2015-02-01

    3D reconstruction relies on accurate detection, extraction, description and matching of image features. This is even truer for complex architectural scenes, which require 3D models of high quality without any loss of detail in geometry or color. Illumination conditions influence the radiometric quality of images, as standard sensors cannot properly depict a wide range of intensities in the same scene. Indeed, overexposed or underexposed pixels cause irreplaceable information loss and degrade the digital representation. Images taken under extreme lighting environments may thus be prohibitive for feature detection and extraction, and consequently for matching and 3D reconstruction. High Dynamic Range (HDR) images could be helpful for these operators because they broaden the limits of the illumination range that Standard or Low Dynamic Range (SDR/LDR) images can capture, and in this way increase the amount of detail contained in the image. Experimental results of this study support this assumption, examining state-of-the-art feature detectors applied both to standard dynamic range and HDR images.
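
    The core idea of broadening the captured intensity range can be sketched with a toy exposure-fusion routine. This is an assumption-laden illustration (linear sensor response, simple hat-shaped weighting); real HDR pipelines, e.g. Debevec and Malik's method, also recover the camera response curve from the exposure stack:

```python
import numpy as np

# Hedged sketch of merging differently exposed images of the same scene into
# one relative radiance map, weighting pixels that are neither under- nor
# over-exposed most heavily.

def merge_exposures(images, exposure_times):
    """images: arrays with values in [0, 1]; returns relative radiance."""
    num = np.zeros_like(images[0], dtype=float)
    den = np.zeros_like(images[0], dtype=float)
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)  # 0 at the extremes, 1 at mid-range
        num += w * img / t                 # weighted per-pixel radiance estimate
        den += w
    return num / np.maximum(den, 1e-8)

# Two exposures of the same scene recover the underlying radiance of 0.5.
merged = merge_exposures([np.array([0.5]), np.array([0.25])], [1.0, 0.5])
```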

  2. Accurate Mass Measurements in Proteomics

    SciTech Connect

    Liu, Tao; Belov, Mikhail E.; Jaitly, Navdeep; Qian, Weijun; Smith, Richard D.

    2007-08-01

    To understand different aspects of life at the molecular level, one would think that ideally all components of specific processes should be individually isolated and studied in detail. Reductionist approaches, i.e., studying one biological event on a one-gene or one-protein-at-a-time basis, have indeed made significant contributions to our understanding of many basic facts of biology. However, these individual “building blocks” cannot be assembled into a comprehensive “model” of the life of cells, tissues, and organisms without using more integrative approaches.1,2 For example, the emerging field of “systems biology” aims to quantify all of the components of a biological system to assess their interactions and to integrate diverse types of information obtainable from this system into models that could explain and predict behaviors.3-6 Recent breakthroughs in genomics, proteomics, and bioinformatics are making this daunting task a reality.7-14 Proteomics, the systematic study of the entire complement of proteins expressed by an organism, tissue, or cell under a specific set of conditions at a specific time (i.e., the proteome), has become an essential enabling component of systems biology. While the genome of an organism may be considered static over short timescales, the expression of that genome as the actual gene products (i.e., mRNAs and proteins) is a dynamic event that is constantly changing due to the influence of environmental and physiological conditions. Exclusive monitoring of the transcriptome can be carried out using high-throughput cDNA microarray analysis,15-17 however the measured mRNA levels do not necessarily correlate strongly with the corresponding abundances of proteins.18-20 The actual amount of functional protein can be altered significantly, and become independent of mRNA levels, as a result of post-translational modifications (PTMs),21 alternative splicing,22,23 and protein turnover.24,25 Moreover, the functions of expressed

  3. 13. DETAIL VIEW OF BRIDGE, LOOKING SOUTH FROM ROADWAY. DETAIL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. DETAIL VIEW OF BRIDGE, LOOKING SOUTH FROM ROADWAY. DETAIL VIEW OF THE PIERS AND LIGHTING FIXTURES ON THE COLORADO STREET BRIDGE. THIS VIEW SHOWS A PORTION OF THE BRIDGE ALONG THE SOUTH SIDE OF THE ROADWAY. EACH FIXTURE ALSO ORIGINALLY HAD FOUR ADDITIONAL GLOBES, WHICH EXTENDED FROM THE COLUMN BELOW THE MAIN GLOBE. THE 'REFUGE' SEATING AREAS ARE ORIGINAL, WHILE THE RAILING IS A LATER ADDITION. - Colorado Street Bridge, Spanning Arroyo Seco at Colorado Boulevard, Pasadena, Los Angeles County, CA

  4. 5 CFR 370.104 - Length of details.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Length of details. 370.104 Section 370.104 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.104 Length of details. (a) Details may be for a period of between 3...

  5. Micron Accurate Absolute Ranging System: Range Extension

    NASA Technical Reports Server (NTRS)

    Smalley, Larry L.; Smith, Kely L.

    1999-01-01

    The purpose of this research is to investigate Fresnel diffraction as a means of obtaining absolute distance measurements with micron or greater accuracy. It is believed that such a system would prove useful to the Next Generation Space Telescope (NGST) as a non-intrusive, non-contact measuring system for use with secondary concentrator station-keeping systems. The present research attempts to validate past experiments and develop ways to apply the phenomenon of Fresnel diffraction to micron-accurate measurement. This report discusses past research on the phenomenon and the basis of using Fresnel diffraction for distance metrology. The apparatus used in the recent investigations, the experimental procedures and the preliminary results are discussed in detail. Continued research and the equipment requirements for extending the effective range of the Fresnel diffraction systems are also described.
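
    The connection between a Fresnel diffraction pattern and distance can be shown with a toy calculation. The governing dimensionless parameter is the Fresnel number N = a²/(λL), so a value of N inferred from the fringe structure fixes the aperture-to-screen distance L. The numbers below are illustrative assumptions, not values from the report:

```python
# Hedged toy calculation: invert the Fresnel number N = a^2 / (lambda * L)
# to recover the aperture-to-screen distance L.

def distance_from_fresnel_number(aperture_radius, wavelength, fresnel_number):
    """Distance L (m) from aperture radius a (m), wavelength (m) and N."""
    return aperture_radius ** 2 / (wavelength * fresnel_number)

# 1 mm aperture radius, 633 nm HeNe light, N = 10 -> L of roughly 0.158 m
L = distance_from_fresnel_number(1e-3, 633e-9, 10)
```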

  6. Accurate Telescope Mount Positioning with MEMS Accelerometers

    NASA Astrophysics Data System (ADS)

    Mészáros, L.; Jaskó, A.; Pál, A.; Csépány, G.

    2014-08-01

    This paper describes the advantages and challenges of applying microelectromechanical accelerometer systems (MEMS accelerometers) in order to attain precise, accurate, and stateless positioning of telescope mounts. This provides a method completely independent from other forms of electronic, optical, mechanical or magnetic feedback or real-time astrometry. Our goal is to reach the subarcminute range, which is considerably smaller than the field of view of conventional imaging telescope systems. Here we present how this subarcminute accuracy can be achieved with very cheap MEMS sensors, and we also detail how our procedures can be extended in order to attain even finer measurements. In addition, our paper discusses how a complete system design can be implemented in order to be part of a telescope control system.
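
    The basic measurement behind such a system can be sketched simply: a stationary 3-axis accelerometer senses only gravity, so mount tilt follows from the measured components. The axis convention and unit scaling below are assumptions for illustration, not the authors' calibration procedure:

```python
import math

# Hedged sketch: recover pitch and roll of a stationary mount from a 3-axis
# MEMS accelerometer reading expressed in units of g.

def tilt_from_accel(ax, ay, az):
    """Return (pitch, roll) in degrees from accelerations in g units."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# A reading of (0, 0, 1) g corresponds to a level mount.
pitch, roll = tilt_from_accel(0.0, 0.0, 1.0)
```

    Reaching subarcminute accuracy with cheap sensors then becomes a question of noise averaging and calibration on top of this elementary geometry.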

  7. New law requires 'medically accurate' lesson plans.

    PubMed

    1999-09-17

    The California Legislature has passed a bill requiring all textbooks and materials used to teach about AIDS be medically accurate and objective. Statements made within the curriculum must be supported by research conducted in compliance with scientific methods, and published in peer-reviewed journals. Some of the current lesson plans were found to contain scientifically unsupported and biased information. In addition, the bill requires material to be "free of racial, ethnic, or gender biases." The legislation is supported by a wide range of interests, but opposed by the California Right to Life Education Fund, because they believe it discredits abstinence-only material.

  9. 49 CFR 7.6 - Deletion of identifying detail.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false Deletion of identifying detail. 7.6 Section 7.6 Transportation Office of the Secretary of Transportation PUBLIC AVAILABILITY OF INFORMATION Information Required To Be Made Public by DOT § 7.6 Deletion of identifying detail. Whenever it is determined to...

  10. Eros details enhanced by computer processing

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The NEAR camera's ability to show details of Eros's surface is limited by the spacecraft's distance from the asteroid: the closer the spacecraft is to the surface, the more detail is visible. However, mission scientists regularly use computer processing to squeeze an extra measure of information from the returned data. In a technique known as 'superresolution', many images of the same scene, acquired with very slightly different camera pointings, are carefully overlain and processed to bring out details even smaller than would normally be visible. In this rendition, constructed from 20 image frames acquired Feb. 12, 2000, the images were first enhanced ('high-pass filtered') to accentuate small-scale details. Superresolution was then used to bring out features below the normal resolving ability of the camera.

    Built and managed by The Johns Hopkins University Applied Physics Laboratory, Laurel, Maryland, NEAR was the first spacecraft launched in NASA's Discovery Program of low-cost, small-scale planetary missions. See the NEAR web page at http://near.jhuapl.edu for more details.
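
    The superresolution idea described above can be sketched with a toy 1-D "shift-and-add": frames offset by known sub-pixel shifts are placed onto a finer grid and averaged. This assumes the offsets are already known as integer positions on the fine grid; real pipelines (including NEAR's) estimate the shifts by image registration first:

```python
import numpy as np

# Toy 1-D shift-and-add superresolution: low-resolution samples from several
# offset frames tile a grid `factor` times finer, recovering detail below the
# single-frame sampling limit.

def shift_and_add(frames, shifts, factor):
    n = len(frames[0])
    hi_sum = np.zeros(n * factor)
    hi_cnt = np.zeros(n * factor)
    for frame, s in zip(frames, shifts):
        idx = np.arange(n) * factor + s  # fine-grid position of each sample
        hi_sum[idx] += frame
        hi_cnt[idx] += 1
    return hi_sum / np.maximum(hi_cnt, 1)

# Two half-sampled frames, offset by one fine-grid step, tile the fine grid.
truth = np.arange(8.0)
restored = shift_and_add([truth[0::2], truth[1::2]], [0, 1], 2)
```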

  11. Accurate documentation and wound measurement.

    PubMed

    Hampton, Sylvie

    This article, part 4 in a series on wound management, addresses the sometimes routine yet crucial task of documentation. Clear and accurate records of a wound enable its progress to be determined so the appropriate treatment can be applied. Thorough records mean any practitioner picking up a patient's notes will know when the wound was last checked, how it looked and what dressing and/or treatment was applied, ensuring continuity of care. Documenting every assessment also has legal implications, demonstrating due consideration and care of the patient and the rationale for any treatment carried out. Part 5 in the series discusses wound dressing characteristics and selection.

  12. Accurate calculation of the absolute free energy of binding for drug molecules† †Electronic supplementary information (ESI) available. See DOI: 10.1039/c5sc02678d Click here for additional data file.

    PubMed Central

    Aldeghi, Matteo; Heifetz, Alexander; Bodkin, Michael J.; Knapp, Stefan

    2016-01-01

    Accurate prediction of binding affinities has been a central goal of computational chemistry for decades, yet remains elusive. Despite good progress, the required accuracy for use in a drug-discovery context has not been consistently achieved for drug-like molecules. Here, we perform absolute free energy calculations based on a thermodynamic cycle for a set of diverse inhibitors binding to bromodomain-containing protein 4 (BRD4) and demonstrate that a mean absolute error of 0.6 kcal mol–1 can be achieved. We also show a similar level of accuracy (1.0 kcal mol–1) can be achieved in pseudo prospective approach. Bromodomains are epigenetic mark readers that recognize acetylation motifs and regulate gene transcription, and are currently being investigated as therapeutic targets for cancer and inflammation. The unprecedented accuracy offers the exciting prospect that the binding free energy of drug-like compounds can be predicted for pharmacologically relevant targets. PMID:26798447
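
    For scale, the quantity being computed above relates to a measurable affinity through dG = RT ln(Kd/c0), with standard concentration c0 = 1 M. The numbers in this sketch are generic textbook values, not results from the paper:

```python
import math

# Illustration: convert a dissociation constant into a standard binding free
# energy via dG = RT ln(Kd / c0), c0 = 1 M.

R_KCAL = 1.987204e-3  # gas constant in kcal mol^-1 K^-1

def binding_free_energy(kd_molar, temperature=298.15):
    """Standard binding free energy in kcal/mol (more negative = tighter)."""
    return R_KCAL * temperature * math.log(kd_molar)

# A 1 nM inhibitor at 298.15 K binds with roughly -12.3 kcal/mol.
dg = binding_free_energy(1e-9)
```

    Against this scale, the paper's reported 0.6 kcal/mol mean absolute error corresponds to well under an order of magnitude in predicted affinity.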

  13. Detailed spectral analysis of decellularized skin implants

    NASA Astrophysics Data System (ADS)

    Timchenko, E. V.; Timchenko, P. E.; Volova, L. T.; Dolgushkin, D. A.; Shalkovsky, P. Y.; Pershutkina, S. V.

    2016-08-01

    The results of a detailed analysis of donor skin implants using the Raman spectroscopy method are presented. The Fourier-deconvolution method was used to separate overlapping spectral lines and to improve their informativeness. Based on the processed spectra, coefficients were introduced that represent changes in the relative concentration of implant components, which determines the quality of the implants. It was established that the Raman spectroscopy method can be used in the assessment of skin implants.

  14. NOSS Altimeter Detailed Algorithm specifications

    NASA Technical Reports Server (NTRS)

    Hancock, D. W.; Mcmillan, J. D.

    1982-01-01

    The details of the algorithms and data sets required for satellite radar altimeter data processing are documented in a form suitable for (1) development of the benchmark software and (2) coding the operational software. The algorithms reported in detail are those established for altimeter processing. The algorithms which required some additional development before documenting for production were only scoped. The algorithms are divided into two levels of processing. The first level converts the data to engineering units and applies corrections for instrument variations. The second level provides geophysical measurements derived from altimeter parameters for oceanographic users.

  15. Seductive Details in Multimedia Messages

    ERIC Educational Resources Information Center

    Rey, Gunter Daniel

    2011-01-01

    The seductive detail principle asserts that people learn more deeply from a multimedia presentation when interesting but irrelevant adjuncts are excluded rather than included. However, critics could argue that studies about this principle contain methodological problems. The present experiment attempts to overcome these problems. Students (N = 108)…

  16. SPLASH: Accurate OH maser positions

    NASA Astrophysics Data System (ADS)

    Walsh, Andrew; Gomez, Jose F.; Jones, Paul; Cunningham, Maria; Green, James; Dawson, Joanne; Ellingsen, Simon; Breen, Shari; Imai, Hiroshi; Lowe, Vicki; Jones, Courtney

    2013-10-01

    The hydroxyl (OH) 18 cm lines are powerful and versatile probes of diffuse molecular gas, that may trace a largely unstudied component of the Galactic ISM. SPLASH (the Southern Parkes Large Area Survey in Hydroxyl) is a large, unbiased and fully-sampled survey of OH emission, absorption and masers in the Galactic Plane that will achieve sensitivities an order of magnitude better than previous work. In this proposal, we request ATCA time to follow up OH maser candidates. This will give us accurate (~10") positions of the masers, which can be compared to other maser positions from HOPS, MMB and MALT-45 and will provide full polarisation measurements towards a sample of OH masers that have not been observed in MAGMO.

  17. Accurate thickness measurement of graphene

    NASA Astrophysics Data System (ADS)

    Shearer, Cameron J.; Slattery, Ashley D.; Stapleton, Andrew J.; Shapter, Joseph G.; Gibson, Christopher T.

    2016-03-01

    Graphene has emerged as a material with a vast variety of applications. The electronic, optical and mechanical properties of graphene are strongly influenced by the number of layers present in a sample. As a result, the dimensional characterization of graphene films is crucial, especially with the continued development of new synthesis methods and applications. A number of techniques exist to determine the thickness of graphene films including optical contrast, Raman scattering and scanning probe microscopy techniques. Atomic force microscopy (AFM), in particular, is used extensively since it provides three-dimensional images that enable the measurement of the lateral dimensions of graphene films as well as the thickness, and by extension the number of layers present. However, in the literature AFM has proven to be inaccurate with a wide range of measured values for single layer graphene thickness reported (between 0.4 and 1.7 nm). This discrepancy has been attributed to tip-surface interactions, image feedback settings and surface chemistry. In this work, we use standard and carbon nanotube modified AFM probes and a relatively new AFM imaging mode known as PeakForce tapping mode to establish a protocol that will allow users to accurately determine the thickness of graphene films. In particular, the error in measuring the first layer is reduced from 0.1-1.3 nm to 0.1-0.3 nm. Furthermore, in the process we establish that the graphene-substrate adsorbate layer and imaging force, in particular the pressure the tip exerts on the surface, are crucial components in the accurate measurement of graphene using AFM. These findings can be applied to other 2D materials.

  18. Accurate thickness measurement of graphene.

    PubMed

    Shearer, Cameron J; Slattery, Ashley D; Stapleton, Andrew J; Shapter, Joseph G; Gibson, Christopher T

    2016-03-29

    Graphene has emerged as a material with a vast variety of applications. The electronic, optical and mechanical properties of graphene are strongly influenced by the number of layers present in a sample. As a result, the dimensional characterization of graphene films is crucial, especially with the continued development of new synthesis methods and applications. A number of techniques exist to determine the thickness of graphene films including optical contrast, Raman scattering and scanning probe microscopy techniques. Atomic force microscopy (AFM), in particular, is used extensively since it provides three-dimensional images that enable the measurement of the lateral dimensions of graphene films as well as the thickness, and by extension the number of layers present. However, in the literature AFM has proven to be inaccurate with a wide range of measured values for single layer graphene thickness reported (between 0.4 and 1.7 nm). This discrepancy has been attributed to tip-surface interactions, image feedback settings and surface chemistry. In this work, we use standard and carbon nanotube modified AFM probes and a relatively new AFM imaging mode known as PeakForce tapping mode to establish a protocol that will allow users to accurately determine the thickness of graphene films. In particular, the error in measuring the first layer is reduced from 0.1-1.3 nm to 0.1-0.3 nm. Furthermore, in the process we establish that the graphene-substrate adsorbate layer and imaging force, in particular the pressure the tip exerts on the surface, are crucial components in the accurate measurement of graphene using AFM. These findings can be applied to other 2D materials.
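
    A minimal sketch of the naive thickness readout that the two abstracts above are concerned with: the difference between the typical flake height and the typical substrate height in an AFM image, taken as medians over masked regions. The masking strategy and values are illustrative assumptions; the paper's protocol (PeakForce tapping, modified probes) exists precisely because this naive readout suffers systematic errors:

```python
import numpy as np

# Hedged sketch: film thickness as the median flake height minus the median
# substrate height over masked regions of an AFM height map.

def step_height(height_map, flake_mask):
    """Median flake height minus median substrate height."""
    flake = np.median(height_map[flake_mask])
    substrate = np.median(height_map[~flake_mask])
    return flake - substrate

# A synthetic image: substrate at 0 nm, a flake region at 0.7 nm.
h = np.zeros((4, 4))
h[:, 2:] = 0.7
thickness = step_height(h, h > 0.3)
```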

  19. Challenges in accurate quantitation of lysophosphatidic acids in human biofluids

    PubMed Central

    Onorato, Joelle M.; Shipkova, Petia; Minnich, Anne; Aubry, Anne-Françoise; Easter, John; Tymiak, Adrienne

    2014-01-01

    Lysophosphatidic acids (LPAs) are biologically active signaling molecules involved in the regulation of many cellular processes and have been implicated as potential mediators of fibroblast recruitment to the pulmonary airspace, pointing to possible involvement of LPA in the pathology of pulmonary fibrosis. LPAs have been measured in various biological matrices and many challenges involved with their analyses have been documented. However, little published information is available describing LPA levels in human bronchoalveolar lavage fluid (BALF). We therefore conducted detailed investigations into the effects of extensive sample handling and sample preparation conditions on LPA levels in human BALF. Further, targeted lipid profiling of human BALF and plasma identified the most abundant lysophospholipids likely to interfere with LPA measurements. We present the findings from these investigations, highlighting the importance of well-controlled sample handling for the accurate quantitation of LPA. Further, we show that chromatographic separation of individual LPA species from their corresponding lysophospholipid species is critical to avoid reporting artificially elevated levels. The optimized sample preparation and LC/MS/MS method was qualified using a stable isotope-labeled LPA as a surrogate calibrant and used to determine LPA levels in human BALF and plasma from a Phase 0 clinical study comparing idiopathic pulmonary fibrosis patients to healthy controls. PMID:24872406

  20. Memory for Details with Self-Referencing

    PubMed Central

    Serbun, Sarah J.; Shih, Joanne Y.; Gutchess, Angela H.

    2011-01-01

    Self-referencing benefits item memory, but little is known about the ways in which referencing the self affects memory for details. Experiment 1 assessed whether the effects of self-referencing operate only at the item, or general, level or also enhance memory for specific visual details of objects. Participants incidentally encoded objects by making judgments in reference to the self, a close other (one’s mother), or a familiar other (Bill Clinton). Results indicate that referencing the self or a close other enhances both specific and general memory. Experiments 2 and 3 assessed verbal memory for source in a task that relied on distinguishing between different mental operations (internal sources). Results indicate that self-referencing disproportionately enhances source memory, relative to conditions referencing other people, semantic, or perceptual information. We conclude that self-referencing not only enhances specific memory for both visual and verbal information, but can disproportionately improve memory for specific internal source details as well. PMID:22092106

  1. Must Kohn-Sham oscillator strengths be accurate at threshold?

    SciTech Connect

    Yang Zenghui; Burke, Kieron; Faassen, Meta van

    2009-09-21

    The exact ground-state Kohn-Sham (KS) potential for the helium atom is known from accurate wave function calculations of the ground-state density. The threshold for photoabsorption from this potential matches the physical system exactly. By carefully studying its absorption spectrum, we show the answer to the title question is no. To address this problem in detail, we generate a highly accurate simple fit of a two-electron spectrum near the threshold, and apply the method to both the experimental spectrum and that of the exact ground-state Kohn-Sham potential.

  2. Aircraft empennage structural detail design

    NASA Technical Reports Server (NTRS)

    Meholic, Greg; Brown, Rhonda; Hall, Melissa; Harvey, Robert; Singer, Michael; Tella, Gustavo

    1993-01-01

    This project involved the detailed design of the aft fuselage and empennage structure, vertical stabilizer, rudder, horizontal stabilizer, and elevator for the Triton primary flight trainer. The main design goals under consideration were to illustrate the integration of the control systems devices used in the tail surfaces and their necessary structural supports as well as the elevator trim, navigational lighting system, electrical systems, tail-located ground tie, and fuselage/cabin interface structure. Accommodations for maintenance, lubrication, adjustment, and repairability were devised. Weight, fabrication, and (sub)assembly goals were addressed. All designs were in accordance with the FAR Part 23 stipulations for a normal category aircraft.

  3. Non-targeted analysis of electronics waste by comprehensive two-dimensional gas chromatography combined with high-resolution mass spectrometry: Using accurate mass information and mass defect analysis to explore the data.

    PubMed

    Ubukata, Masaaki; Jobst, Karl J; Reiner, Eric J; Reichenbach, Stephen E; Tao, Qingping; Hang, Jiliang; Wu, Zhanpin; Dane, A John; Cody, Robert B

    2015-05-22

Comprehensive two-dimensional gas chromatography (GC×GC) and high-resolution mass spectrometry (HRMS) offer the best possible separation of their respective techniques. Recent commercialization of combined GC×GC-HRMS systems offers new possibilities for the analysis of complex mixtures. However, such experiments yield enormous data sets that require new informatics tools to facilitate the interpretation of the rich information content. This study reports on the analysis of dust obtained from an electronics recycling facility by using GC×GC in combination with a new high-resolution time-of-flight (TOF) mass spectrometer. New software tools for (non-traditional) Kendrick mass defect analysis were developed in this research and greatly aided in the identification of compounds containing chlorine and bromine, elements that feature in most persistent organic pollutants (POPs). In essence, the mass defect plot serves as a visual aid from which halogenated compounds are recognizable on the basis of their mass defect and isotope patterns. Mass chromatograms were generated based on specific ions identified in the plots, as well as regions of the plot predominantly occupied by halogenated contaminants. Tentative identification was aided by database searches, complementary electron-capture negative ionization experiments and elemental composition determinations from the exact mass data. The compounds identified included known and emerging flame retardants, such as polybrominated diphenyl ethers (PBDEs), hexabromobenzene, tetrabromobisphenol A and tris(1-chloro-2-propyl) phosphate (TCPP), as well as other legacy contaminants such as polychlorinated biphenyls (PCBs) and polychlorinated terphenyls (PCTs).
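The Kendrick mass defect idea used above can be sketched in a few lines. The classical CH2-based variant below rescales m/z so that one CH2 unit weighs exactly 14; homologous hydrocarbons then share almost the same defect, while brominated compounds fall far from them. This is a minimal illustration with literature monoisotopic masses, not the authors' software.

```python
def kendrick_mass(mz, base_exact=14.01565, base_nominal=14.0):
    """Kendrick mass with a CH2 base: rescale m/z so a CH2 unit weighs exactly 14."""
    return mz * (base_nominal / base_exact)

def kendrick_mass_defect(mz, **kw):
    """KMD = nearest-integer Kendrick mass minus exact Kendrick mass."""
    km = kendrick_mass(mz, **kw)
    return round(km) - km

# A saturated hydrocarbon sits near KMD = 0; the tetrabrominated flame
# retardant BDE-47 falls far below it, which is what makes halogenated
# compounds stand out in a mass defect plot.
for name, mz in [("C20H42 alkane", 282.3287),
                 ("BDE-47 (C12H6Br4O)", 481.7152)]:
    print(f"{name}: KMD = {kendrick_mass_defect(mz):+.4f}")
```

The "non-traditional" plots in the study generalize this by choosing other base units than CH2, which shifts different compound classes onto horizontal lines.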

  4. Fast and Provably Accurate Bilateral Filtering.

    PubMed

    Chaudhury, Kunal N; Dabhade, Swapnil D

    2016-06-01

The bilateral filter is a non-linear filter that uses a range filter along with a spatial filter to perform edge-preserving smoothing of images. A direct computation of the bilateral filter requires O(S) operations per pixel, where S is the size of the support of the spatial filter. In this paper, we present a fast and provably accurate algorithm for approximating the bilateral filter when the range kernel is Gaussian. In particular, for box and Gaussian spatial filters, the proposed algorithm can cut down the complexity to O(1) per pixel for arbitrary S. The algorithm has a simple implementation involving N+1 spatial filterings, where N is the approximation order. We give a detailed analysis of the filtering accuracy that can be achieved by the proposed approximation in relation to the target bilateral filter. This allows us to estimate the order N required to obtain a given accuracy. We also present comprehensive numerical results to demonstrate that the proposed algorithm is competitive with the state-of-the-art methods in terms of speed and accuracy. PMID:27093722
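For contrast with the O(1) method, a brute-force implementation makes the O(S)-per-pixel cost of the target bilateral filter explicit. This is a minimal numpy sketch of the standard definition, not the paper's fast algorithm; the parameter values are illustrative.

```python
import numpy as np

def bilateral_direct(img, sigma_s=2.0, sigma_r=0.1, radius=4):
    """Brute-force bilateral filter: O(S) work per pixel, S = (2*radius + 1)**2."""
    H, W = img.shape
    out = np.empty((H, W))
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2.0 * sigma_s**2))  # Gaussian spatial kernel
    pad = np.pad(img, radius, mode="edge")
    for i in range(H):
        for j in range(W):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Gaussian range kernel, evaluated per pixel -- this is the costly part
            rng = np.exp(-(patch - img[i, j])**2 / (2.0 * sigma_r**2))
            w = spatial * rng
            out[i, j] = (w * patch).sum() / w.sum()
    return out
```

The fast algorithm replaces this per-pixel range weighting with N+1 spatial filterings of fixed images, so the cost no longer grows with the support size S.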

  5. Accurate, reliable prototype earth horizon sensor head

    NASA Technical Reports Server (NTRS)

    Schwarz, F.; Cohen, H.

    1973-01-01

The design and performance of an accurate and reliable prototype earth sensor head (ARPESH) are described. The ARPESH employs a detection-logic 'locator' concept and horizon sensor mechanization that should lead to high-accuracy horizon sensing, minimally degraded by spatial or temporal variations in sensing attitude, from a satellite in orbit around the earth at altitudes near 500 km. An accuracy of horizon location to within 0.7 km has been predicted, independent of meteorological conditions; this corresponds to an error of 0.015 deg at 500 km altitude. Laboratory evaluation of the sensor indicates that this accuracy is achieved. First, the basic operating principles of ARPESH are described; next, detailed design and construction data are presented; finally, the sensor's performance is reported under laboratory conditions, with the sensor installed in a simulator that lets it scan over a blackbody source against backgrounds representing the earth-space interface for various equivalent planet temperatures.

  6. Recovering and preventing loss of detailed memory: differential rates of forgetting for detail types in episodic memory.

    PubMed

    Sekeres, Melanie J; Bonasia, Kyra; St-Laurent, Marie; Pishdadian, Sara; Winocur, Gordon; Grady, Cheryl; Moscovitch, Morris

    2016-02-01

    Episodic memories undergo qualitative changes with time, but little is known about how different aspects of memory are affected. Different types of information in a memory, such as perceptual detail, and central themes, may be lost at different rates. In patients with medial temporal lobe damage, memory for perceptual details is severely impaired, while memory for central details is relatively spared. Given the sensitivity of memory to loss of details, the present study sought to investigate factors that mediate the forgetting of different types of information from naturalistic episodic memories in young healthy adults. The study investigated (1) time-dependent loss of "central" and "peripheral" details from episodic memories, (2) the effectiveness of cuing with reminders to reinstate memory details, and (3) the role of retrieval in preventing forgetting. Over the course of 7 d, memory for naturalistic events (film clips) underwent a time-dependent loss of peripheral details, while memory for central details (the core or gist of events) showed significantly less loss. Giving brief reminders of the clips just before retrieval reinstated memory for peripheral details, suggesting that loss of details is not always permanent, and may reflect both a storage and retrieval deficit. Furthermore, retrieving a memory shortly after it was encoded prevented loss of both central and peripheral details, thereby promoting retention over time. We consider the implications of these results for behavioral and neurobiological models of retention and forgetting.

  7. Details for Manuscript Number SSM-D-06-00377R1 “Targeted Ethnography as a Critical Step to Inform Cultural Adaptations of HIV Prevention Interventions for Adults with Severe Mental Illness.”

    PubMed Central

    Gonzalez, M. Alfredo; McKinnon, Karen; Elkington, Katherine S; Pinto, Diana; Mann, Claudio Gruber; Mattos, Paulo E

    2007-01-01

As in other countries worldwide, adults with severe mental illness (SMI) in Brazil are disproportionately infected with HIV relative to the general population. Brazilian psychiatric facilities lack tested HIV prevention interventions. To adapt existing interventions, developed only in the U.S., we conducted targeted ethnography with adults with SMI and staff from two psychiatric institutions in Brazil. We sought to characterize individual, institutional, and interpersonal factors that may affect HIV risk behavior in this population. We conducted 350 hours of ethnographic field observations in two mental health service settings in Rio de Janeiro, and 9 focus groups (n = 72) and 16 key-informant interviews with patients and staff in these settings. Data comprised field notes and audiotapes of all exchanges, which were transcribed, coded, and systematically analyzed. The ethnography characterized the institutional culture and identified: 1) patients' risk behaviors; 2) the institutional setting; 3) intervention content; and 4) intervention format and delivery strategies. Targeted ethnography also illuminated broader contextual issues for development and implementation of HIV prevention interventions for adults with SMI in Brazil, including an institutional culture that did not systematically address patients' sexual behavior, sexual health, or HIV sexual risk, yet strongly impacted the structure of patients' sexual networks. Further, ethnography identified the Brazilian concept of "social responsibility" as important to prevention work with psychiatric patients. Targeted ethnography with adults with SMI and institutional staff provided information critical to the adaptation of tested U.S. HIV prevention interventions for Brazilians with SMI. PMID:17475382

  8. A Generalized Detailed Balance Relation

    NASA Astrophysics Data System (ADS)

    Ruelle, David

    2016-08-01

Given a system M in a thermal bath we obtain a generalized detailed balance relation for the ratio r = π_τ(K→J)/π_τ(J→K) of the transition probabilities M: J→K and M: K→J in time τ. We assume an active bath, containing solute molecules in metastable states. These molecules may react with M, and the transition J→K can occur through different channels α involving different reactions with the bath. We find that r = Σ_α p^α r^α, where p^α is the probability that channel α occurs, and r^α depends on the amount of heat (more precisely, enthalpy) released to the bath in channel α.

  9. Parallel kinetic Monte Carlo simulation framework incorporating accurate models of adsorbate lateral interactions

    NASA Astrophysics Data System (ADS)

    Nielsen, Jens; d'Avezac, Mayeul; Hetherington, James; Stamatakis, Michail

    2013-12-01

    Ab initio kinetic Monte Carlo (KMC) simulations have been successfully applied for over two decades to elucidate the underlying physico-chemical phenomena on the surfaces of heterogeneous catalysts. These simulations necessitate detailed knowledge of the kinetics of elementary reactions constituting the reaction mechanism, and the energetics of the species participating in the chemistry. The information about the energetics is encoded in the formation energies of gas and surface-bound species, and the lateral interactions between adsorbates on the catalytic surface, which can be modeled at different levels of detail. The majority of previous works accounted for only pairwise-additive first nearest-neighbor interactions. More recently, cluster-expansion Hamiltonians incorporating long-range interactions and many-body terms have been used for detailed estimations of catalytic rate [C. Wu, D. J. Schmidt, C. Wolverton, and W. F. Schneider, J. Catal. 286, 88 (2012)]. In view of the increasing interest in accurate predictions of catalytic performance, there is a need for general-purpose KMC approaches incorporating detailed cluster expansion models for the adlayer energetics. We have addressed this need by building on the previously introduced graph-theoretical KMC framework, and we have developed Zacros, a FORTRAN2003 KMC package for simulating catalytic chemistries. To tackle the high computational cost in the presence of long-range interactions we introduce parallelization with OpenMP. We further benchmark our framework by simulating a KMC analogue of the NO oxidation system established by Schneider and co-workers [J. Catal. 286, 88 (2012)]. We show that taking into account only first nearest-neighbor interactions may lead to large errors in the prediction of the catalytic rate, whereas for accurate estimates thereof, one needs to include long-range terms in the cluster expansion.
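The rejection-free event selection at the heart of such simulations, with rates modulated by lateral interactions, can be sketched on a toy lattice. The example below is a one-dimensional adsorption/desorption model with pairwise first-nearest-neighbour repulsion only (the simplest adlayer energetics the abstract mentions); all energies and rates are illustrative, and this is a sketch of the general method, not of the Zacros package.

```python
import math
import random

# Toy 1D lattice KMC: adsorption on empty sites, desorption whose rate
# depends on first-nearest-neighbour repulsion. Illustrative values only.
random.seed(1)
L_SITES = 50
KT = 0.025       # thermal energy (eV)
E_BIND = 0.30    # binding energy (eV)
E_NN = 0.05      # repulsion per occupied neighbour (eV)
K_ADS = 1.0      # adsorption rate per empty site (arbitrary units)
occ = [0] * L_SITES

def desorption_rate(i):
    """Lateral repulsion lowers the effective desorption barrier."""
    nn = occ[(i - 1) % L_SITES] + occ[(i + 1) % L_SITES]
    barrier = E_BIND - nn * E_NN
    return math.exp(-barrier / KT)

def kmc_step(t):
    """One rejection-free (Gillespie/BKL) event selection."""
    rates, moves = [], []
    for i in range(L_SITES):
        if occ[i]:
            rates.append(desorption_rate(i)); moves.append((0, i))
        else:
            rates.append(K_ADS); moves.append((1, i))
    total = sum(rates)
    t += -math.log(random.random()) / total      # exponential waiting time
    target, acc = random.random() * total, 0.0
    for rate, (new_state, i) in zip(rates, moves):
        acc += rate
        if acc >= target:
            occ[i] = new_state
            break
    return t

t = 0.0
for _ in range(2000):
    t = kmc_step(t)
coverage = sum(occ) / L_SITES
print(f"steady-state coverage ~ {coverage:.2f}")
```

A cluster-expansion Hamiltonian generalizes `desorption_rate` by summing many-body and long-range terms over the local environment, which is exactly where the computational cost discussed in the abstract comes from.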

  10. 5 CFR 370.107 - Details to small business concerns.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Details to small business concerns. 370.107 Section 370.107 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.107 Details to small business concerns. (a) The head of...

  11. 5 CFR 370.107 - Details to small business concerns.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Details to small business concerns. 370.107 Section 370.107 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.107 Details to small business concerns. (a) The head of...

  12. Space Telecommunications Radio System (STRS) Architecture, Tutorial Part 2 - Detailed

    NASA Technical Reports Server (NTRS)

    Handler, Louis

    2014-01-01

This detailed presentation of the STRS architecture walks through each requirement in the STRS Architecture Standard with examples and supporting information. The purpose is to give a platform provider, application provider, or application integrator a better, more detailed understanding of the STRS Architecture Standard and its use.

  13. Radiometrically accurate thermal imaging in the Landsat program

    NASA Astrophysics Data System (ADS)

    Lansing, Jack C., Jr.

    1988-01-01

    Methods of calibrating Landsat TM thermal IR data have been developed so that the residual error is reduced to 0.9 K (1 standard deviation). Methods for verifying the radiometric performance of TM on orbit and ground calibration methods are discussed. The preliminary design of the enhanced TM for Landsat-6 is considered. A technique for accurately reducing raw data from the Landsat-5 thermal band is described in detail.
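The reduction of thermal-band radiance to temperature rests on the inversion T = K2 / ln(K1/L + 1). The sketch below uses the commonly published Landsat-5 TM band-6 calibration constants; they are assumed here for illustration, and the paper's calibration refinements are precisely about reducing the residual error of this conversion.

```python
import math

# Radiance-to-brightness-temperature inversion for a thermal band.
# K1, K2 are the widely published Landsat-5 TM band-6 constants (assumed).
K1 = 607.76    # W m^-2 sr^-1 um^-1
K2 = 1260.56   # K

def brightness_temperature(radiance):
    """Convert at-sensor spectral radiance to brightness temperature in kelvin."""
    return K2 / math.log(K1 / radiance + 1.0)

print(f"{brightness_temperature(10.0):.1f} K")  # a typical scene radiance
```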

  14. Accurate Method for Determining Adhesion of Cantilever Beams

    SciTech Connect

    Michalske, T.A.; de Boer, M.P.

    1999-01-08

    Using surface micromachined samples, we demonstrate the accurate measurement of cantilever beam adhesion by using test structures which are adhered over long attachment lengths. We show that this configuration has a deep energy well, such that a fracture equilibrium is easily reached. When compared to the commonly used method of determining the shortest attached beam, the present method is much less sensitive to variations in surface topography or to details of capillary drying.

  15. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  16. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2013-07-01 2013-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  17. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2011-07-01 2011-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  18. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2014-07-01 2014-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  19. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2012-07-01 2012-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  20. Recovering and Preventing Loss of Detailed Memory: Differential Rates of Forgetting for Detail Types in Episodic Memory

    ERIC Educational Resources Information Center

    Sekeres, Melanie J.; Bonasia, Kyra; St-Laurent, Marie; Pishdadian, Sara; Winocur, Gordon; Grady, Cheryl; Moscovitch, Morris

    2016-01-01

    Episodic memories undergo qualitative changes with time, but little is known about how different aspects of memory are affected. Different types of information in a memory, such as perceptual detail, and central themes, may be lost at different rates. In patients with medial temporal lobe damage, memory for perceptual details is severely impaired,…

  1. Accurate On-Line Intervention Practices for Efficient Improvement of Reading Skills in Africa

    ERIC Educational Resources Information Center

    Marshall, Minda B.

    2016-01-01

    Lifelong learning is the only way to sustain proficient learning in a rapidly changing world. Knowledge and information are exploding across the globe. We need accurate ways to facilitate the process of drawing external factual information into an internal perceptive advantage from which to interpret and argue new information. Accurate and…

  2. The importance of accurate atmospheric modeling

    NASA Astrophysics Data System (ADS)

    Payne, Dylan; Schroeder, John; Liang, Pang

    2014-11-01

This paper will focus on the effect of atmospheric conditions on EO sensor performance using computer models. We have shown the importance of accurately modeling atmospheric effects for predicting the performance of an EO sensor. A simple example demonstrates how real conditions at several sites in China significantly impact image correction, hyperspectral imaging, and remote sensing. The current state-of-the-art model for computing atmospheric transmission and radiance is MODTRAN® 5, developed by the US Air Force Research Laboratory and Spectral Sciences, Inc. Research by the US Air Force, Navy and Army resulted in the public release of LOWTRAN 2 in the early 1970's. Subsequent releases of LOWTRAN and MODTRAN® have continued until the present. The paper demonstrates the importance of using validated models and locally measured meteorological, atmospheric and aerosol conditions to accurately simulate the atmospheric transmission and radiance. Frequently, default conditions are used, which can produce errors of as much as 75% in these values. This can have a significant impact on remote sensing applications.

  3. What input data are needed to accurately model electromagnetic fields from mobile phone base stations?

    PubMed

    Beekhuizen, Johan; Kromhout, Hans; Bürgi, Alfred; Huss, Anke; Vermeulen, Roel

    2015-01-01

    The increase in mobile communication technology has led to concern about potential health effects of radio frequency electromagnetic fields (RF-EMFs) from mobile phone base stations. Different RF-EMF prediction models have been applied to assess population exposure to RF-EMF. Our study examines what input data are needed to accurately model RF-EMF, as detailed data are not always available for epidemiological studies. We used NISMap, a 3D radio wave propagation model, to test models with various levels of detail in building and antenna input data. The model outcomes were compared with outdoor measurements taken in Amsterdam, the Netherlands. Results showed good agreement between modelled and measured RF-EMF when 3D building data and basic antenna information (location, height, frequency and direction) were used: Spearman correlations were >0.6. Model performance was not sensitive to changes in building damping parameters. Antenna-specific information about down-tilt, type and output power did not significantly improve model performance compared with using average down-tilt and power values, or assuming one standard antenna type. We conclude that 3D radio wave propagation modelling is a feasible approach to predict outdoor RF-EMF levels for ranking exposure levels in epidemiological studies, when 3D building data and information on the antenna height, frequency, location and direction are available.
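The baseline that any RF-EMF propagation model refines is the free-space field of the antenna: power density S = EIRP/(4πd²) and field strength E = √(Z₀S). A model like NISMap adds 3D building damping and diffraction on top of this. The sketch below is that baseline only, with illustrative EIRP and distance values, not the NISMap model itself.

```python
import math

# Free-space estimate of the E-field from a base-station antenna.
Z0 = 377.0  # impedance of free space, ohm

def efield_free_space(eirp_w, distance_m):
    """Field strength (V/m) at distance_m from an antenna radiating eirp_w EIRP."""
    s = eirp_w / (4.0 * math.pi * distance_m**2)  # power density, W/m^2
    return math.sqrt(Z0 * s)                      # V/m

# e.g. a 100 W EIRP sector antenna seen at 50 m
print(f"{efield_free_space(100.0, 50.0):.2f} V/m")
```

The study's finding that antenna location, height, frequency and direction matter more than fine-grained antenna parameters fits this picture: the dominant terms are the geometry (d) and radiated power, which the baseline already captures.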

  5. The ACPMAPS system: A detailed overview

    SciTech Connect

    Fischler, M.

    1992-01-01

This paper describes the ACPMAPS computing system -- its purpose, its hardware architecture, how the system is used, and relevant programming paradigms and concepts. Features of the hardware and software will be discussed in some detail, both quantitative and qualitative. This should give some perspective as to the suitability of the ACPMAPS system for various classes of applications, and as to where this system stands in the spectrum of today's supercomputers. The ACPMAPS project at Fermilab was initiated in 1987 as a collaboration between the Advanced Computer Program (now the Computer R&D department) and the lattice gauge physicists in the Theory department. ACPMAPS is an acronym for Advanced Computer Program Multiple Array Processor System; the acronym is no longer accurate, but the name has stuck. Although research physics computations were done on ACPMAPS as early as 1989, the full-scale system was commissioned as a reliable physics tool in early 1991. The original ACPMAPS was a 5 Gflop (peak) system. An upgrade by a factor of ten in computing power and memory size, achieved by substituting a new CPU board, will occur during early 1992 -- this is referred to as the ACPMAPS Upgrade or 50 GF ACPMAPS. The appellation ACPMAPS II has also been applied to the upgrade; this is somewhat of a misnomer, since only one of five major components was changed.

  6. Detailed Aerosol Characterization using Polarimetric Measurements

    NASA Astrophysics Data System (ADS)

    Hasekamp, Otto; di Noia, Antonio; Stap, Arjen; Rietjens, Jeroen; Smit, Martijn; van Harten, Gerard; Snik, Frans

    2016-04-01

Anthropogenic aerosols are believed to cause the second most important anthropogenic forcing of climate change after greenhouse gases. In contrast to the climate effect of greenhouse gases, which is understood relatively well, the negative forcing (cooling effect) caused by aerosols represents the largest reported uncertainty in the most recent assessment of the Intergovernmental Panel on Climate Change (IPCC). To reduce the large uncertainty on the aerosol effects on cloud formation and climate, accurate satellite measurements of aerosol optical properties (optical thickness, single scattering albedo, phase function) and microphysical properties (size distribution, refractive index, shape) are essential. There is growing consensus in the aerosol remote sensing community that multi-angle measurements of intensity and polarization are essential to unambiguously determine all relevant aerosol properties. This presentation addresses the different aspects of polarimetric remote sensing of atmospheric aerosols, including retrieval algorithm development, validation, and data needs for climate and air quality applications. During past years, retrieval algorithms that make full use of the capabilities of polarimetric measurements have been developed at SRON Netherlands Institute for Space Research. We will show results of detailed aerosol properties from ground-based (groundSPEX), airborne (NASA Research Scanning Polarimeter), and satellite (POLDER) measurements. We will also discuss observational needs for future instrumentation in order to improve our understanding of the role of aerosols in climate change and air quality.

  7. Accurate deterministic solutions for the classic Boltzmann shock profile

    NASA Astrophysics Data System (ADS)

    Yue, Yubei

The Boltzmann equation or Boltzmann transport equation is a classical kinetic equation devised by Ludwig Boltzmann in 1872. It is regarded as a fundamental law in rarefied gas dynamics. Rather than using macroscopic quantities such as density, temperature, and pressure to describe the underlying physics, the Boltzmann equation uses a distribution function in phase space to describe the physical system, and all the macroscopic quantities are weighted averages of the distribution function. The information contained in the Boltzmann equation is surprisingly rich, and the Euler and Navier-Stokes equations of fluid dynamics can be derived from it using series expansions. Moreover, the Boltzmann equation can reach regimes far from the capabilities of fluid dynamical equations, such as the realm of rarefied gases---the topic of this thesis. Although the Boltzmann equation is very powerful, it is extremely difficult to solve in most situations. Thus the only hope is to solve it numerically. But soon one finds that even a numerical simulation of the equation is extremely difficult, due to both the complex and high-dimensional integral in the collision operator, and the hyperbolic phase-space advection terms. For this reason, until a few years ago most numerical simulations had to rely on Monte Carlo techniques. In this thesis I will present a new and robust numerical scheme to compute direct deterministic solutions of the Boltzmann equation, and I will use it to explore some classical gas-dynamical problems. In particular, I will study in detail one of the most famous and intrinsically nonlinear problems in rarefied gas dynamics, namely the accurate determination of the Boltzmann shock profile for a gas of hard spheres.

  8. Towards Accurate Application Characterization for Exascale (APEX)

    SciTech Connect

    Hammond, Simon David

    2015-09-01

Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyped clusters as far back as the CPLANT machines, and many large capability resources including ASCI Red and Red Storm, were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. Primarily, the research was intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role associated with evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and directly engaging with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these counts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia's production users/developers.

  9. Detailed Astrometric Analysis of Pluto

    NASA Astrophysics Data System (ADS)

    ROSSI, GUSTAVO B.; Vieira-Martins, R.; Camargo, J. I.; Assafin, M.

    2013-05-01

Pluto is the main representative of the transneptunian objects (TNOs), presenting some peculiarities such as an atmosphere and a satellite system with 5 known moons: Charon, discovered in 1978; Nix and Hydra, in 2006; P4 in 2011; and P5 in 2012. Until the arrival of the New Horizons spacecraft to this system (July 2015), stellar occultations are the most efficient ground-based method to determine physical and dynamical properties of this system. In 2010, a drift in declination (about 20 mas/year) relative to the ephemerides became evident. This fact motivated us to redo the reductions and analysis of a large set of our observations at OPD/LNA, spanning 15 years in total. The ephemeris and occultation results were then compared with the astrometric and photometric reductions of CCD images of Pluto (around 6500 images). Two corrections were used to refine the data set: differential chromatic refraction and the photocenter correction. The first is due to the mean color of background stars being redder than the color of Pluto, resulting in a slightly different path of light through the atmosphere (which may cause a difference in position of 0.1"). It became more evident because Pluto is crossing the region of the galactic plane. The photocenter correction is based on two overlapped Gaussian curves with different heights and non-coincident centers, corresponding to Pluto and Charon (since they have less than 1" of angular separation). The objective is to separate these two Gaussian curves from the observed one and find the correct position of Pluto. The method is strongly dependent on the height of each Gaussian curve, related to the respective albedos of Charon and Pluto. A detailed analysis of the astrometric results, as well as a comparison with occultation results, was made. Since Pluto has an orbital period of 248.9 years and our interval of observation is about 15 years, we have around 12% of its observed orbit and also, our
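The need for a photocenter correction can be illustrated with a forward model: a blended Pluto+Charon profile is the sum of two offset Gaussians of different heights, and the naive centroid of the blend is pulled toward Charon. The separation, seeing width and flux ratio below are illustrative values, not the paper's measurements.

```python
import numpy as np

# Blended 1D image profile of two unresolved sources (illustrative values).
x = np.linspace(-3.0, 3.0, 601)  # arcsec
sigma = 0.5                      # seeing width (assumed)
sep = 0.8                        # Pluto-Charon separation (assumed)
flux_ratio = 0.12                # Charon/Pluto brightness ratio (assumed)

pluto = np.exp(-x**2 / (2 * sigma**2))                        # Pluto at x = 0
charon = flux_ratio * np.exp(-(x - sep)**2 / (2 * sigma**2))  # Charon offset
blend = pluto + charon

# The naive intensity-weighted photocenter is displaced from Pluto's true
# position by roughly sep * flux_ratio / (1 + flux_ratio).
centroid = float((x * blend).sum() / blend.sum())
print(f"photocenter offset from Pluto: {centroid:.3f} arcsec")
```

Separating the two Gaussians therefore requires an independent estimate of the flux ratio (i.e. the relative albedos), which is why the method is so sensitive to the Gaussian heights.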

  10. Measuring Fisher information accurately in correlated neural populations.

    PubMed

    Kanitscheider, Ingmar; Coen-Cagli, Ruben; Kohn, Adam; Pouget, Alexandre

    2015-06-01

    Neural responses are known to be variable. In order to understand how this neural variability constrains behavioral performance, we need to be able to measure the reliability with which a sensory stimulus is encoded in a given population. However, such measures are challenging for two reasons: first, they must take into account noise correlations, which can have a large influence on reliability; second, they need to be as efficient as possible, since the number of trials available in a set of neural recordings is usually limited by experimental constraints. Traditionally, cross-validated decoding has been used as a reliability measure, but it only provides a lower bound on reliability and underestimates reliability substantially in small datasets. We show that, if the number of trials per condition is larger than the number of neurons, there is an alternative, direct estimate of reliability which consistently leads to smaller errors and is much faster to compute. The superior performance of the direct estimator is evident both for simulated data and for neuronal population recordings from macaque primary visual cortex. Furthermore, we propose generalizations of the direct estimator which measure changes in stimulus encoding across conditions and the impact of correlations on encoding and decoding, typically denoted by Ishuffle and Idiag, respectively.
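The direct estimator described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code; in particular, the finite-sample bias-correction constants used below are an assumption of this sketch, and the function name is ours.

```python
import numpy as np

def fisher_info_direct(r1, r2, ds):
    """Direct estimate of linear Fisher information from responses to two
    nearby stimulus conditions (illustrative sketch, not the published code).

    r1, r2 : (T, N) arrays of responses (T trials, N neurons), T > N
    ds     : stimulus difference between the two conditions
    """
    T, N = r1.shape
    # empirical derivative of the tuning curves
    df = (r2.mean(axis=0) - r1.mean(axis=0)) / ds
    # pooled noise covariance across the two conditions
    cov = 0.5 * (np.cov(r1, rowvar=False) + np.cov(r2, rowvar=False))
    naive = df @ np.linalg.solve(cov, df)
    # finite-trial bias correction (constants assumed for this sketch):
    # a multiplicative Wishart factor for the inverted covariance, and a
    # subtractive term for the noisy tuning-curve derivative estimate
    return naive * (2 * T - N - 3) / (2 * T - 2) - 2 * N / (T * ds ** 2)
```

On simulated independent Gaussian responses the estimate converges to df' Σ⁻¹ df, and it requires only a mean and covariance per condition rather than repeated decoder fits, which is what makes it faster than cross-validated decoding.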

  11. A Detailed Chemical Kinetic Model for TNT

    SciTech Connect

    Pitz, W J; Westbrook, C K

    2005-01-13

    A detailed chemical kinetic mechanism for 2,4,6-tri-nitrotoluene (TNT) has been developed to explore problems of explosive performance and soot formation during the destruction of munitions. The TNT mechanism treats only gas-phase reactions. Reactions for the decomposition of TNT and for the consumption of intermediate products formed from TNT are assembled based on information from the literature and on current understanding of aromatic chemistry. Thermodynamic properties of intermediate and radical species are estimated by group additivity. Reaction paths are developed based on similar paths for aromatic hydrocarbons. Reaction-rate constant expressions are estimated from the literature and from analogous reactions where the rate constants are available. The detailed reaction mechanism for TNT is added to existing reaction mechanisms for RDX and for hydrocarbons. Computed results show the effect of oxygen concentration on the amount of soot precursors that are formed in the combustion of RDX and TNT mixtures in N{sub 2}/O{sub 2} mixtures.

  12. Surprising the Writer: Discovering Details through Research and Reading.

    ERIC Educational Resources Information Center

    Broaddus, Karen; Ivey, Gay

    2002-01-01

    Describes how students parallel the process of author Megan McDonald in conducting research and collecting information to provide ideas for the form and content of their writing. Notes that guiding students to record and organize information in a graphic format helps them to transfer those interesting details to new types of writing. (SG)

  13. Accurate lineshape spectroscopy and the Boltzmann constant

    PubMed Central

    Truong, G.-W.; Anstie, J. D.; May, E. F.; Stace, T. M.; Luiten, A. N.

    2015-01-01

    Spectroscopy has an illustrious history delivering serendipitous discoveries and providing a stringent testbed for new physical predictions, including applications from trace materials detection, to understanding the atmospheres of stars and planets, and even constraining cosmological models. Reaching fundamental-noise limits permits optimal extraction of spectroscopic information from an absorption measurement. Here, we demonstrate a quantum-limited spectrometer that delivers high-precision measurements of the absorption lineshape. These measurements yield a very accurate determination of the excited-state (6P1/2) hyperfine splitting in Cs and reveal a breakdown in the well-known Voigt spectral profile. We develop a theoretical model that accounts for this breakdown, explaining the observations to within the shot-noise limit. Our model enables us to infer the thermal velocity dispersion of the Cs vapour with an uncertainty of 35 p.p.m. within an hour. This allows us to determine a value for Boltzmann's constant with a precision of 6 p.p.m., and an uncertainty of 71 p.p.m. PMID:26465085

  14. Accurate SHAPE-directed RNA structure determination

    PubMed Central

    Deigan, Katherine E.; Li, Tian W.; Mathews, David H.; Weeks, Kevin M.

    2009-01-01

    Almost all RNAs can fold to form extensive base-paired secondary structures. Many of these structures then modulate numerous fundamental elements of gene expression. Deducing these structure–function relationships requires that it be possible to predict RNA secondary structures accurately. However, RNA secondary structure prediction for large RNAs, such that a single predicted structure for a single sequence reliably represents the correct structure, has remained an unsolved problem. Here, we demonstrate that quantitative, nucleotide-resolution information from a SHAPE experiment can be interpreted as a pseudo-free energy change term and used to determine RNA secondary structure with high accuracy. Free energy minimization, by using SHAPE pseudo-free energies, in conjunction with nearest neighbor parameters, predicts the secondary structure of deproteinized Escherichia coli 16S rRNA (>1,300 nt) and a set of smaller RNAs (75–155 nt) with accuracies of up to 96–100%, which are comparable to the best accuracies achievable by comparative sequence analysis. PMID:19109441
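The pseudo-free energy change term at the heart of this approach has a simple linear-log form; a minimal sketch follows, assuming the slope and intercept values reported in the SHAPE-directed folding literature (m = 2.6, b = -0.8 kcal/mol), which are used here purely as illustrative defaults.

```python
import math

def shape_pseudo_energy(reactivity, m=2.6, b=-0.8):
    """Pseudo-free energy change (kcal/mol) for one nucleotide, added to
    that nucleotide's stacking term during free energy minimization:
        dG_SHAPE = m * ln(reactivity + 1) + b
    High SHAPE reactivity (flexible, likely unpaired) gives a positive
    penalty against pairing; low reactivity gives a bonus (b < 0)."""
    return m * math.log(reactivity + 1.0) + b
```

The monotone ln(reactivity + 1) mapping keeps the term finite at zero reactivity and compresses the long tail of highly reactive positions, so a single pair of fitted constants transfers across RNAs of very different sizes.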

  15. Enabling Detailed Energy Analyses via the Technology Performance Exchange: Preprint

    SciTech Connect

    Studer, D.; Fleming, K.; Lee, E.; Livingood, W.

    2014-08-01

    One of the key tenets to increasing adoption of energy efficiency solutions in the built environment is improving confidence in energy performance. Current industry practices make extensive use of predictive modeling, often via the use of sophisticated hourly or sub-hourly energy simulation programs, to account for site-specific parameters (e.g., climate zone, hours of operation, and space type) and arrive at a performance estimate. While such methods are highly precise, they invariably provide less than ideal accuracy due to a lack of high-quality, foundational energy performance input data. The Technology Performance Exchange was constructed to allow the transparent sharing of foundational, product-specific energy performance data, and leverages significant, external engineering efforts and a modular architecture to efficiently identify and codify the minimum information necessary to accurately predict product energy performance. This strongly-typed database resource represents a novel solution to a difficult and established problem. One of the most exciting benefits is the way in which the Technology Performance Exchange's application programming interface has been leveraged to integrate contributed foundational data into the Building Component Library. Via a series of scripts, data is automatically translated and parsed into the Building Component Library in a format that is immediately usable to the energy modeling community. This paper (1) presents a high-level overview of the project drivers and the structure of the Technology Performance Exchange; (2) offers a detailed examination of how technologies are incorporated and translated into powerful energy modeling code snippets; and (3) examines several benefits of this robust workflow.

  16. Numerical Methodology for Coupled Time-Accurate Simulations of Primary and Secondary Flowpaths in Gas Turbines

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Athavale, M. M.; Hendricks, R. C.; Steinetz, B. M.

    2006-01-01

    Detailed information of the flow-fields in the secondary flowpaths and their interaction with the primary flows in gas turbine engines is necessary for successful designs with optimized secondary flow streams. Present work is focused on the development of a simulation methodology for coupled time-accurate solutions of the two flowpaths. The secondary flowstream is treated using SCISEAL, an unstructured adaptive Cartesian grid code developed for secondary flows and seals, while the mainpath flow is solved using TURBO, a density based code with capability of resolving rotor-stator interaction in multi-stage machines. An interface is being tested that links the two codes at the rim seal to allow data exchange between the two codes for parallel, coupled execution. A description of the coupling methodology and the current status of the interface development is presented. Representative steady-state solutions of the secondary flow in the UTRC HP Rig disc cavity are also presented.

  17. Accurate oscillator strengths for ultraviolet lines of Ar I - Implications for interstellar material

    NASA Technical Reports Server (NTRS)

    Federman, S. R.; Beideck, D. J.; Schectman, R. M.; York, D. G.

    1992-01-01

    Analysis of absorption from interstellar Ar I in lightly reddened lines of sight provides information on the warm and hot components of the interstellar medium near the sun. The details of the analysis are limited by the quality of the atomic data. Accurate oscillator strengths for the Ar I lines at 1048 and 1067 A and the astrophysical implications are presented. From lifetimes measured with beam-foil spectroscopy, an f-value for 1048 A of 0.257 +/- 0.013 is obtained. Through the use of a semiempirical formalism for treating singlet-triplet mixing, an oscillator strength of 0.064 +/- 0.003 is derived for 1067 A. Because of the accuracy of the results, the conclusions of York and colleagues from spectra taken with the Copernicus satellite are strengthened. In particular, for interstellar gas in the solar neighborhood, argon has a solar abundance, and the warm, neutral material is not pervasive.

  18. Detailed Globes Enhance Education and Recreation

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Orbis World Globes creates inflatable globes (EarthBalls) in many sizes that depict Earth as it is seen from space, complete with atmospheric cloud cover. Orbis designs and produces the most visually authentic replicas of Earth ever created, and NASA took notice of Orbis globes, employing a 16-inch diameter EarthBall for an educational film it made aboard the STS-45 shuttle mission. Orbis later collaborated with NASA to create two 16-foot diameter world globes for display at the 2002 Olympic Winter Games in Salt Lake City, using more detailed satellite imagery. The satellite image now printed on all Orbis globes displays 1-kilometer resolution and is 21,600 by 43,200 pixels in size; the globes are otherwise meteorologically accurate, though the cloud cover has been slightly reduced so that most of the landforms remain visible. Orbis also developed the exclusive NightGlow Cities feature, enabling EarthBalls to display the world's cities as they appear as the Earth revolves from daylight into night. Orbis inflatable globes are available in sizes from 1 to 100 feet in diameter, with the most common being the standard 16-inch and 1-meter diameter EarthBalls. Applications include educational uses from preschools to universities, games, and a variety of display purposes at conferences, trade shows, festivals, concerts, and parades. A 16-foot diameter Orbis globe was exhibited at the United Nations' World Urban Forum, in Vancouver, Canada; the Space 2006 conference, in San Jose, California; and the X-Prize Cup Personal Spaceflight Exposition in Las Cruces, New Mexico.

  19. Quantification of the Information Limit of Transmission Electron Microscopes

    SciTech Connect

    Barthel, J.; Thust, A.

    2008-11-14

    The resolving power of high-resolution transmission electron microscopes is characterized by the information limit, which reflects the size of the smallest object detail observable with a particular instrument. We introduce a highly accurate measurement method for the information limit, which is suitable for modern aberration-corrected electron microscopes. An experimental comparison with the traditionally applied Young's fringe method yields severe discrepancies and confirms theoretical considerations according to which the Young's fringe method does not reveal the information limit.

  20. Spectroscopically Accurate Line Lists for Application in Sulphur Chemistry

    NASA Astrophysics Data System (ADS)

    Underwood, D. S.; Azzam, A. A. A.; Yurchenko, S. N.; Tennyson, J.

    2013-09-01

    Monitoring sulphur chemistry is thought to be of great importance for exoplanets. Doing this requires detailed knowledge of the spectroscopic properties of sulphur-containing molecules such as hydrogen sulphide (H2S) [1], sulphur dioxide (SO2), and sulphur trioxide (SO3). Each of these molecules can be found in terrestrial environments, produced in volcano emissions on Earth, and analysis of their spectroscopic data can prove useful for the characterisation of exoplanets, as well as the study of planets in our own solar system, with both having a possible presence on Venus. A complete, high-temperature list of line positions and intensities for H2(32S) is presented. The DVR3D program suite is used to calculate the bound ro-vibrational energy levels, wavefunctions, and dipole transition intensities using Radau coordinates. The calculations are based on a newly determined, spectroscopically refined potential energy surface (PES) and a new, high-accuracy, ab initio dipole moment surface (DMS). Tests show that the PES enables us to calculate the line positions accurately and the DMS gives satisfactory results for line intensities. Comparisons with experiment as well as with previous theoretical spectra will be presented. The results of this study will form an important addition to the databases which serve as sources of information for space applications; especially in analysing the spectra of extrasolar planets, in remote sensing studies for Venus and Earth, and in laboratory investigations and pollution studies. An ab initio line list for SO3 was previously computed using the variational nuclear motion program TROVE [2], and was suitable for modelling room-temperature SO3 spectra. The calculations considered transitions in the region of 0-4000 cm-1 with rotational states up to J = 85, and include 174,674,257 transitions. A list of 10,878 experimental transitions had relative intensities placed on an absolute scale, and were provided in a form suitable

  1. Cornice Detail of Rake, Cornice Detail of Eave, Wood DoubleHung ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Cornice Detail of Rake, Cornice Detail of Eave, Wood Double-Hung Window Details, Wood Door Details - Boxley Grist Mill, Boxley vicinity on State Route 43, Buffalo National River, Ponca, Newton County, AR

  2. Study on detailed geological modelling for fluvial sandstone reservoir in Daqing oil field

    SciTech Connect

    Zhao Hanqing; Fu Zhiguo; Lu Xiaoguang

    1997-08-01

    Guided by sedimentation theory and knowledge of modern and ancient fluvial deposition, and utilizing the abundant information on sedimentary series, microfacies types and petrophysical parameters from the well-logging curves of thousands of closely spaced wells located in a large area, a new method for establishing detailed sedimentation and permeability distribution models for fluvial reservoirs has been developed successfully. This study aimed at the geometry and internal architecture of sandbodies, in accordance with their hierarchical levels of heterogeneity, building up sedimentation and permeability distribution models of fluvial reservoirs and describing reservoir heterogeneity in light of fluvial sedimentary rules. The results and methods obtained in outcrop and modern sedimentation studies have successfully supported this work. Taking advantage of this method, the major producing layers (PI{sub 1-2}), which have been considered as heterogeneous and thick fluvial reservoirs extending widely laterally, were researched in detail. These layers are subdivided vertically into single sedimentary units, and the microfacies are identified horizontally. Furthermore, a complex system is recognized according to hierarchical levels from large to small: meander belt, single channel sandbody, meander scroll, point bar, and lateral accretion bodies of point bars. The achieved results improve the description of the areal distribution of point-bar sandbodies and provide an accurate and detailed framework for establishing a high-resolution predictive model. By using geostatistical techniques, it also plays an important role in the search for enriched zones of residual oil distribution.

  3. Validation of a fast and accurate chromatographic method for detailed quantification of vitamin E in green leafy vegetables.

    PubMed

    Cruz, Rebeca; Casal, Susana

    2013-11-15

    Vitamin E analysis in green vegetables is performed by an array of different methods, making it difficult to compare published data or choosing the adequate one for a particular sample. Aiming to achieve a consistent method with wide applicability, the current study reports the development and validation of a fast micro-method for quantification of vitamin E in green leafy vegetables. The methodology uses solid-liquid extraction based on the Folch method, with tocol as internal standard, and normal-phase HPLC with fluorescence detection. A large linear working range was confirmed, being highly reproducible, with inter-day precisions below 5% (RSD). Method sensitivity was established (below 0.02 μg/g fresh weight), and accuracy was assessed by recovery tests (>96%). The method was tested in different green leafy vegetables, evidencing diverse tocochromanol profiles, with variable ratios and amounts of α- and γ-tocopherol, and other minor compounds. The methodology is adequate for routine analyses, with a reduced chromatographic run (<7 min) and organic solvent consumption, and requires only standard chromatographic equipment available in most laboratories.

  4. Acoustic emission monitoring for assessment of steel bridge details

    SciTech Connect

    Kosnik, D. E.; Corr, D. J.; Hopwood, T.

    2011-06-23

    Acoustic emission (AE) testing was deployed on details of two large steel Interstate Highway bridges: one cantilever through-truss and one trapezoidal box girder bridge. Quantitative measurements of activity levels at known and suspected crack locations were made by monitoring AE under normal service loads (e.g., live traffic and wind). AE indications were used to direct application of radiography, resulting in identification of a previously unknown flaw, and to inform selection of a retrofit detail.

  5. Identification of potential surgical site infections leveraging an enterprise clinical information warehouse.

    PubMed

    Santangelo, Jennifer; Erdal, Selnur; Wellington, Linda; Mekhjian, Hagop; Kamal, Jyoti

    2008-11-06

    At The Ohio State University Medical Center (OSUMC), infection control practitioners (ICPs) need an accurate list of patients undergoing defined operative procedures to track surgical site infections. Using data from the OSUMC Information Warehouse (IW), we have created an automated report detailing required data. This report also displays associated surgical and pathology text or dictated reports providing additional information to the ICPs.

  6. Mill profiler machines soft materials accurately

    NASA Technical Reports Server (NTRS)

    Rauschl, J. A.

    1966-01-01

    Mill profiler machines bevels, slots, and grooves in soft materials, such as styrofoam phenolic-filled cores, to any desired thickness. A single operator can accurately control cutting depths in contour or straight line work.

  7. Selecting accurate statements from the cognitive interview using confidence ratings.

    PubMed

    Roberts, Wayne T; Higham, Philip A

    2002-03-01

    Participants viewed a videotape of a simulated murder, and their recall (and confidence) was tested 1 week later with the cognitive interview. Results indicated that (a) the subset of statements assigned high confidence was more accurate than the full set of statements; (b) the accuracy benefit was limited to information that forensic experts considered relevant to an investigation, whereas peripheral information showed the opposite pattern; (c) the confidence-accuracy relationship was higher for relevant than for peripheral information; (d) the focused-retrieval phase was associated with a greater proportion of peripheral and a lesser proportion of relevant information than the other phases; and (e) only about 50% of the relevant information was elicited, and most of this was elicited in Phase 1.

  8. Optoelectronic pH Meter: Further Details

    NASA Technical Reports Server (NTRS)

    Jeevarajan, Antony S.; Anderson, Mejody M.; Macatangay, Ariel V.

    2009-01-01

    A collection of documents provides further detailed information about an optoelectronic instrument that measures the pH of an aqueous cell-culture medium to within 0.1 unit in the range from 6.5 to 7.5. The instrument at an earlier stage of development was reported in Optoelectronic Instrument Monitors pH in a Culture Medium (MSC-23107), NASA Tech Briefs, Vol. 28, No. 9 (September 2004), page 4a. To recapitulate: The instrument includes a quartz cuvette through which the medium flows as it is circulated through a bioreactor. The medium contains some phenol red, which is an organic pH-indicator dye. The cuvette sits between a light source and a photodetector. [The light source in the earlier version comprised red (625 nm) and green (558 nm) light-emitting diodes (LEDs); the light source in the present version comprises a single green- (560 nm)-or-red (623 nm) LED.] The red and green are repeatedly flashed in alternation. The responses of the photodiode to the green and red are processed electronically to obtain the ratio between the amounts of green and red light transmitted through the medium. The optical absorbance of the phenol red in the green light varies as a known function of pH. Hence, the pH of the medium can be calculated from the aforesaid ratio.
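The ratio-to-pH conversion described above can be sketched as follows. All calibration constants here (the indicator pKa and the endpoint absorbances) are hypothetical placeholders for illustration, not the instrument's actual calibration, and the function name is ours.

```python
import math

def ph_from_intensities(i_green, i_red, i_green0, i_red0,
                        a_acid=0.10, a_base=0.90, pka=7.6):
    """Estimate pH of a phenol-red medium from two-colour photodetector
    readings (illustrative sketch; constants are hypothetical).

    i_green, i_red   : detector readings through the sample
    i_green0, i_red0 : blank (reference) readings
    a_acid, a_base   : absorbances of fully acidic / fully basic indicator
    """
    # normalize the pH-sensitive green channel by the pH-insensitive red
    # channel, referenced to the blank, to cancel source/detector drift
    ratio = (i_green / i_red) / (i_green0 / i_red0)
    a = -math.log10(ratio)                 # Beer-Lambert absorbance
    frac = (a - a_acid) / (a_base - a)     # basic/acidic indicator ratio
    return pka + math.log10(frac)          # Henderson-Hasselbalch mapping
```

Using the red channel as a reference is what makes the flashed-LED design robust: drifts in source brightness or detector gain affect both colours and cancel in the ratio.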

  9. Developments in remote sensing technology enable more detailed urban flood risk analysis.

    NASA Astrophysics Data System (ADS)

    Denniss, A.; Tewkesbury, A.

    2009-04-01

    Spaceborne remote sensors have allowed us to build up a profile of planet Earth for many years. With each new satellite launched we see the capabilities improve: new bands of data, higher resolution imagery, the ability to derive better elevation information. The combination of these geospatial data to create land cover and usage maps helps inform catastrophe modelling systems. From Landsat 30 m resolution to 2.44 m QuickBird multispectral imagery, and from 1 m radar data collected by TerraSAR-X, which enables rapid tracking of the rise and fall of a flood event and will shortly have a twin satellite launched enabling elevation data creation, we are spoilt for choice in available data. However, just what is cost effective? It is always a question of choosing the appropriate level of input data detail for modelling, depending on the value of the risk. In the summer of 2007, the cost of the flooding in the UK was approximately £3bn, affecting over 58,000 homes and businesses. When it comes to flood risk, we have traditionally considered rising river levels and surge tides, but with climate change and variations in our own construction behaviour, there are other factors to be taken into account. During those summer 2007 events, the Environment Agency suggested that around 70% of the properties damaged were the result of pluvial flooding, where high localised rainfall events overload localised drainage infrastructure, causing widespread flooding of properties and infrastructure. To create a risk model that is able to simulate such an event requires much more accurate source data than can be provided from satellite or radar. As these flood events cause considerable damage within relatively small, complex urban environments, new high resolution remote sensing techniques have to be applied to better model them. Detailed terrain data of England and Wales, plus cities in Scotland, have been produced by combining terrain measurements from the latest

  10. Leveraging Two Kinect Sensors for Accurate Full-Body Motion Capture.

    PubMed

    Gao, Zhiquan; Yu, Yao; Zhou, Yu; Du, Sidan

    2015-09-22

    Accurate motion capture plays an important role in sports analysis, the medical field and virtual reality. Current methods for motion capture often suffer from occlusions, which limits the accuracy of their pose estimation. In this paper, we propose a complete system to measure the pose parameters of the human body accurately. Different from previous monocular depth camera systems, we leverage two Kinect sensors to acquire more information about human movements, which ensures that we can still get an accurate estimation even when significant occlusion occurs. Because human motion is temporally constant, we adopt a learning analysis to mine the temporal information across the posture variations. Using this information, we estimate human pose parameters accurately, regardless of rapid movement. Our experimental results show that our system can perform an accurate pose estimation of the human body with the constraint of information from the temporal domain.

  11. Leveraging Two Kinect Sensors for Accurate Full-Body Motion Capture

    PubMed Central

    Gao, Zhiquan; Yu, Yao; Zhou, Yu; Du, Sidan

    2015-01-01

    Accurate motion capture plays an important role in sports analysis, the medical field and virtual reality. Current methods for motion capture often suffer from occlusions, which limits the accuracy of their pose estimation. In this paper, we propose a complete system to measure the pose parameters of the human body accurately. Different from previous monocular depth camera systems, we leverage two Kinect sensors to acquire more information about human movements, which ensures that we can still get an accurate estimation even when significant occlusion occurs. Because human motion is temporally constant, we adopt a learning analysis to mine the temporal information across the posture variations. Using this information, we estimate human pose parameters accurately, regardless of rapid movement. Our experimental results show that our system can perform an accurate pose estimation of the human body with the constraint of information from the temporal domain. PMID:26402681

  12. Detailed observations of the source of terrestrial narrowband electromagnetic radiation

    NASA Technical Reports Server (NTRS)

    Kurth, W. S.

    1982-01-01

    Detailed observations are presented of a region near the terrestrial plasmapause where narrowband electromagnetic radiation (previously called escaping nonthermal continuum radiation) is being generated. These observations show a direct correspondence between the narrowband radio emissions and electron cyclotron harmonic waves near the upper hybrid resonance frequency. In addition, electromagnetic radiation propagating in the Z-mode is observed in the source region which provides an extremely accurate determination of the electron plasma frequency and, hence, density profile of the source region. The data strongly suggest that electrostatic waves and not Cerenkov radiation are the source of the banded radio emissions and define the coupling which must be described by any viable theory.

  13. Towards cleaner combustion engines through groundbreaking detailed chemical kinetic models

    PubMed Central

    Battin-Leclerc, Frédérique; Blurock, Edward; Bounaceur, Roda; Fournet, René; Glaude, Pierre-Alexandre; Herbinet, Olivier; Sirjean, Baptiste; Warth, V.

    2013-01-01

    In the context of limiting the environmental impact of transportation, this paper reviews new directions which are being followed in the development of more predictive and more accurate detailed chemical kinetic models for the combustion of fuels. In the first part, the performance of current models, especially in terms of the prediction of pollutant formation, is evaluated. In the next parts, recent methods and ways to improve these models are described. An emphasis is given on the development of detailed models based on elementary reactions, on the production of the related thermochemical and kinetic parameters, and on the experimental techniques available to produce the data necessary to evaluate model predictions under well defined conditions. PMID:21597604

  14. Novel serologic biomarkers provide accurate estimates of recent Plasmodium falciparum exposure for individuals and communities

    PubMed Central

    Helb, Danica A.; Tetteh, Kevin K. A.; Felgner, Philip L.; Skinner, Jeff; Hubbard, Alan; Arinaitwe, Emmanuel; Mayanja-Kizza, Harriet; Ssewanyana, Isaac; Kamya, Moses R.; Beeson, James G.; Tappero, Jordan; Smith, David L.; Crompton, Peter D.; Rosenthal, Philip J.; Dorsey, Grant; Drakeley, Christopher J.; Greenhouse, Bryan

    2015-01-01

    Tools to reliably measure Plasmodium falciparum (Pf) exposure in individuals and communities are needed to guide and evaluate malaria control interventions. Serologic assays can potentially produce precise exposure estimates at low cost; however, current approaches based on responses to a few characterized antigens are not designed to estimate exposure in individuals. Pf-specific antibody responses differ by antigen, suggesting that selection of antigens with defined kinetic profiles will improve estimates of Pf exposure. To identify novel serologic biomarkers of malaria exposure, we evaluated responses to 856 Pf antigens by protein microarray in 186 Ugandan children, for whom detailed Pf exposure data were available. Using data-adaptive statistical methods, we identified combinations of antibody responses that maximized information on an individual’s recent exposure. Responses to three novel Pf antigens accurately classified whether an individual had been infected within the last 30, 90, or 365 d (cross-validated area under the curve = 0.86–0.93), whereas responses to six antigens accurately estimated an individual’s malaria incidence in the prior year. Cross-validated incidence predictions for individuals in different communities provided accurate stratification of exposure between populations and suggest that precise estimates of community exposure can be obtained from sampling a small subset of that community. In addition, serologic incidence predictions from cross-sectional samples characterized heterogeneity within a community similarly to 1 y of continuous passive surveillance. Development of simple ELISA-based assays derived from the successful selection strategy outlined here offers the potential to generate rich epidemiologic surveillance data that will be widely accessible to malaria control programs. PMID:26216993

  15. Site Plan, Brief History, Site Elevation, Main Gate Detail, Southern ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Site Plan, Brief History, Site Elevation, Main Gate Detail, Southern Live Oak (Quercus Virginiana) Information - Main Gate and Auburn Oaks at Toomer's Corner, Entrance to Auburn University's Campus, Intersection of West Magnolia Avenue and South College Street, Auburn, Lee County, AL

  16. 21. DETAIL OF AREA WHERE FIRST FLOOR PASSAGEWAY USED TO ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    21. DETAIL OF AREA WHERE FIRST FLOOR PASSAGEWAY USED TO BE SHOWING VERTICAL WOOD MOLDING COVERING JOINT WHERE PARTITION USED TO BE (LEFT), TELLER'S WINDOW LINKING PASSAGEWAY WITH INFORMATION BOOTH (CENTER), AND TYPICAL FURNITURE. VIEW TO EAST. - Boise Project, Boise Project Office, 214 Broadway, Boise, Ada County, ID

  17. Modeling heterogeneous materials via two-point correlation functions. II. Algorithmic details and applications.

    PubMed

    Jiao, Y; Stillinger, F H; Torquato, S

    2008-03-01

In the first part of this series of two papers, we proposed a theoretical formalism that enables one to model and categorize heterogeneous materials (media) via two-point correlation functions S(2) and introduced an efficient heterogeneous-medium (re)construction algorithm called the "lattice-point" algorithm. Here we discuss the algorithmic details of the lattice-point procedure and an algorithm modification using surface optimization to further speed up the (re)construction process. The importance of the error tolerance, which indicates to what accuracy the media are (re)constructed, is also emphasized and discussed. We apply the algorithm to generate three-dimensional digitized realizations of a Fontainebleau sandstone and a boron-carbide/aluminum composite from the two-dimensional tomographic images of their slices through the materials. To ascertain whether the information contained in S(2) is sufficient to capture the salient structural features, we compute the two-point cluster functions of the media, which are superior signatures of the microstructure because they incorporate topological connectedness information. We also study the reconstruction of a binary laser-speckle pattern in two dimensions, in which the algorithm fails to reproduce the pattern accurately. We conclude that, in general, reconstructions using S(2) alone work well for heterogeneous materials with single-scale structures; two-point information via S(2) is not, however, sufficient to accurately model multiscale random media. Moreover, we construct realizations of hypothetical materials with desired structural characteristics obtained by manipulating their two-point correlation functions.
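The two-point function S(2) at the heart of this record has a compact definition: for a statistically homogeneous binary medium under periodic boundary conditions, it is the autocorrelation of the phase indicator function. A minimal NumPy sketch of that definition (not the paper's lattice-point (re)construction algorithm, which optimizes a trial medium against a target S(2)):

```python
import numpy as np

def two_point_s2(img):
    """Two-point probability S2(r) for a binary medium (periodic BCs).

    S2(r) is the probability that two points separated by r both lie in
    the phase marked 1; computed via the FFT autocorrelation theorem.
    """
    f = np.fft.fftn(img.astype(float))
    return np.fft.ifftn(f * np.conj(f)).real / img.size

rng = np.random.default_rng(0)
img = (rng.random((64, 64)) < 0.3).astype(int)  # uncorrelated test medium
s2 = two_point_s2(img)
phi = img.mean()  # volume fraction of phase 1
# at zero separation S2 equals the volume fraction: S2(0) = phi
```

For uncorrelated media S2 decays from phi at zero lag toward phi**2 at large separations; a reconstruction algorithm iteratively rearranges pixels until its S2 matches a measured target.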

  18. Processing of airborne lidar bathymetry data for detailed sea floor mapping

    NASA Astrophysics Data System (ADS)

    Tulldahl, H. Michael

    2014-10-01

    Airborne bathymetric lidar has proven to be a valuable sensor for rapid and accurate sounding of shallow water areas. With advanced processing of the lidar data, detailed mapping of the sea floor with various objects and vegetation is possible. This mapping capability has a wide range of applications including detection of mine-like objects, mapping marine natural resources, and fish spawning areas, as well as supporting the fulfillment of national and international environmental monitoring directives. Although data sets collected by subsea systems give a high degree of credibility they can benefit from a combination with lidar for surveying and monitoring larger areas. With lidar-based sea floor maps containing information of substrate and attached vegetation, the field investigations become more efficient. Field data collection can be directed into selected areas and even focused to identification of specific targets detected in the lidar map. The purpose of this work is to describe the performance for detection and classification of sea floor objects and vegetation, for the lidar seeing through the water column. With both experimental and simulated data we examine the lidar signal characteristics depending on bottom depth, substrate type, and vegetation. The experimental evaluation is based on lidar data from field documented sites, where field data were taken from underwater video recordings. To be able to accurately extract the information from the received lidar signal, it is necessary to account for the air-water interface and the water medium. The information content is hidden in the lidar depth data, also referred to as point data, and also in the shape of the received lidar waveform. The returned lidar signal is affected by environmental factors such as bottom depth and water turbidity, as well as lidar system factors such as laser beam footprint size and sounding density.

  19. Modified chemiluminescent NO analyzer accurately measures NOX

    NASA Technical Reports Server (NTRS)

    Summers, R. L.

    1978-01-01

Installation of molybdenum nitric oxide (NO)-to-higher oxides of nitrogen (NOx) converter in chemiluminescent gas analyzer and use of air purge allow accurate measurements of NOx in exhaust gases containing as much as thirty percent carbon monoxide (CO). Measurements using conventional analyzer are highly inaccurate for NOx if as little as five percent CO is present. In modified analyzer, molybdenum has high tolerance to CO, and air purge substantially quenches NOx destruction. In test, modified chemiluminescent analyzer accurately measured NO and NOx concentrations for over 4 months with no degradation in performance.

  20. Cloud Imagers Offer New Details on Earth's Health

    NASA Technical Reports Server (NTRS)

    2009-01-01

… limited scientists' ability to acquire detailed information about individual particles. Now, experiments with specialized equipment can be flown on standard jets, making it possible for researchers to monitor and more accurately anticipate changes in Earth's atmosphere and weather patterns.

  1. 24. 'HANGAR SHEDS ELEVATIONS DETAILS; ARCHITECTURAL PLANS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    24. 'HANGAR SHEDS - ELEVATIONS - DETAILS; ARCHITECTURAL PLANS - PLANT AREA; MODIFICATION CENTER NO. 1, DAGGETT, CALIFORNIA.' Partial elevations, and details of sliding doors and ventilator flaps, as built. Contract no. W509 Eng. 2743; File no. 555/81, revision B, dated April 6, 1943. No sheet number. - Barstow-Daggett Airport, Hangar Shed No. 4, 39500 National Trails Highway, Daggett, San Bernardino County, CA

  2. 18 CFR 401.122 - Supplementary details.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 2 2011-04-01 2011-04-01 false Supplementary details. 401.122 Section 401.122 Conservation of Power and Water Resources DELAWARE RIVER BASIN COMMISSION ADMINISTRATIVE MANUAL RULES OF PRACTICE AND PROCEDURE General Provisions § 401.122 Supplementary details....

  3. Understanding Brains: Details, Intuition, and Big Data

    PubMed Central

    Marder, Eve

    2015-01-01

    Understanding how the brain works requires a delicate balance between the appreciation of the importance of a multitude of biological details and the ability to see beyond those details to general principles. As technological innovations vastly increase the amount of data we collect, the importance of intuition into how to analyze and treat these data may, paradoxically, become more important. PMID:25965068

  4. High dynamic range compression and detail enhancement of infrared images in the gradient domain

    NASA Astrophysics Data System (ADS)

    Zhang, Feifei; Xie, Wei; Ma, Guorui; Qin, Qianqing

    2014-11-01

To find the trade-off between providing an accurate perception of the global scene and improving the visibility of details without excessively distorting radiometric infrared information, a novel gradient-domain-based visualization method for high dynamic range infrared images is proposed in this study. The proposed method adopts an energy function which includes a data constraint term and a gradient constraint term. In the data constraint term, the classical histogram projection method is used to perform the initial dynamic range compression to obtain the desired pixel values and preserve the global contrast. In the gradient constraint term, the moment matching method is adopted to obtain the normalized image; then a gradient gain factor function is designed to adjust the magnitudes of the normalized image gradients and obtain the desired gradient field. Lastly, the low dynamic range image is solved from the proposed energy function. The final image is obtained by linearly mapping the low dynamic range image to the 8-bit display range. The effectiveness and robustness of the proposed method are analyzed using infrared images obtained under different operating conditions. Compared with other well-established methods, our method performs significantly better in terms of dynamic range compression, while enhancing details and avoiding common artifacts such as halos, gradient reversal, haziness, and saturation.
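An energy of the kind described above — a data term holding pixel values near a tone-mapped target plus a gradient term pulling image gradients toward an attenuated field — is a screened-Poisson least-squares problem. A 1-D toy sketch (the gain factor 0.3 and the weight lam are illustrative assumptions, not the paper's parameters):

```python
import numpy as np

def screened_poisson_1d(d, g, lam):
    """Minimize sum (u - d)^2 + lam * sum (Du - g)^2 for a 1-D signal.

    Normal equations: (I + lam * D^T D) u = d + lam * D^T g,
    where D is the forward-difference operator.
    """
    n = len(d)
    D = np.zeros((n - 1, n))
    idx = np.arange(n - 1)
    D[idx, idx] = -1.0
    D[idx, idx + 1] = 1.0
    A = np.eye(n) + lam * D.T @ D
    b = d + lam * D.T @ g
    return np.linalg.solve(A, b)

d = np.array([0.0, 1.0, 10.0, 11.0])  # desired pixel values (data term)
g = np.diff(d) * 0.3                  # attenuated gradients (gain factor 0.3)
u = screened_poisson_1d(d, g, lam=5.0)
```

With lam = 0 the solver returns the data term exactly; as lam grows, the large jump of 9 between the second and third samples is compressed toward the attenuated gradient of 2.7, which is the dynamic-range-compression effect the abstract describes in two dimensions.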

  5. Can Appraisers Rate Work Performance Accurately?

    ERIC Educational Resources Information Center

    Hedge, Jerry W.; Laue, Frances J.

    The ability of individuals to make accurate judgments about others is examined and literature on this subject is reviewed. A wide variety of situational factors affects the appraisal of performance. It is generally accepted that the purpose of the appraisal influences the accuracy of the appraiser. The instrumentation, or tools, available to the…

  6. Accurate pointing of tungsten welding electrodes

    NASA Technical Reports Server (NTRS)

    Ziegelmeier, P.

    1971-01-01

    Thoriated-tungsten is pointed accurately and quickly by using sodium nitrite. Point produced is smooth and no effort is necessary to hold the tungsten rod concentric. The chemically produced point can be used several times longer than ground points. This method reduces time and cost of preparing tungsten electrodes.

  7. Light Field Imaging Based Accurate Image Specular Highlight Removal.

    PubMed

    Wang, Haoqian; Xu, Chenxue; Wang, Xingzheng; Zhang, Yongbing; Peng, Bo

    2016-01-01

Specular reflection removal is indispensable to many computer vision tasks. However, most existing methods fail or degrade in complex real scenarios for their individual drawbacks. Benefiting from the light field imaging technology, this paper proposes a novel and accurate approach to remove specularity and improve image quality. We first capture images with specularity by the light field camera (Lytro ILLUM). After accurately estimating the image depth, a simple and concise threshold strategy is adopted to cluster the specular pixels into "unsaturated" and "saturated" categories. Finally, a color variance analysis of multiple views and a local color refinement are individually conducted on the two categories to recover diffuse color information. Experimental evaluation by comparison with existing methods, based on our light field dataset together with the Stanford light field archive, verifies the effectiveness of our proposed algorithm. PMID:27253083

  8. A catalog of isolated galaxy pairs with accurate radial velocities

    NASA Astrophysics Data System (ADS)

    Chamaraux, P.; Nottale, L.

    2016-07-01

The present paper is devoted to the construction of a catalog of isolated galaxy pairs from the Uppsala Galaxy Catalog (UGC), using accurate radial velocities. The UGC lists 12 921 galaxies to δ > -2°30' and is complete to an apparent diameter of 1'. The criteria used to define the isolated galaxy pairs are based on velocity, interdistance, reciprocity and isolation information. A dedicated investigation allowed us to gather very accurate radial velocities for pair members from high-quality HI and optical measurements (median uncertainty on velocity differences of 10 km s-1). Our final catalog contains 1005 galaxy pairs with ρ > 2.5, of which 509 have ρ > 5 (50% of the pairs, i.e. 8% of the UGC galaxies) and 273 are highly isolated with ρ > 10 (27% of the pairs, i.e. 4% of the UGC galaxies). Some global properties of the pair catalog are given.

  9. Methods for accurate homology modeling by global optimization.

    PubMed

    Joo, Keehyoung; Lee, Jinwoo; Lee, Jooyoung

    2012-01-01

High-accuracy protein modeling from sequence information is an important step toward revealing the sequence-structure-function relationship of proteins, and it is becoming increasingly useful for practical purposes such as drug discovery and protein design. We have developed a protocol for protein structure prediction that can generate highly accurate protein models in terms of backbone structure, side-chain orientation, hydrogen bonding, and binding sites of ligands. To obtain accurate protein models, we have combined a powerful global optimization method with traditional homology modeling procedures such as multiple sequence alignment, chain building, and side-chain remodeling. We have built a series of specific score functions for these steps, and optimized them by utilizing conformational space annealing, which is one of the most successful combinatorial optimization algorithms currently available.

  10. Accurate Development of Thermal Neutron Scattering Cross Section Libraries

    SciTech Connect

    Hawari, Ayman; Dunn, Michael

    2014-06-10

The objective of this project is to develop a holistic (fundamental and accurate) approach for generating thermal neutron scattering cross section libraries for a collection of important neutron moderators and reflectors. The primary components of this approach are the physical accuracy and completeness of the generated data libraries. Consequently, for the first time, thermal neutron scattering cross section data libraries will be generated that are based on accurate theoretical models, that are carefully benchmarked against experimental and computational data, and that contain complete covariance information that can be used in propagating the data uncertainties through the various components of the nuclear design and execution process. To achieve this objective, computational and experimental investigations will be performed on a carefully selected subset of materials that play a key role in all stages of the nuclear fuel cycle.

  11. Light Field Imaging Based Accurate Image Specular Highlight Removal

    PubMed Central

    Wang, Haoqian; Xu, Chenxue; Wang, Xingzheng; Zhang, Yongbing; Peng, Bo

    2016-01-01

Specular reflection removal is indispensable to many computer vision tasks. However, most existing methods fail or degrade in complex real scenarios for their individual drawbacks. Benefiting from the light field imaging technology, this paper proposes a novel and accurate approach to remove specularity and improve image quality. We first capture images with specularity by the light field camera (Lytro ILLUM). After accurately estimating the image depth, a simple and concise threshold strategy is adopted to cluster the specular pixels into “unsaturated” and “saturated” categories. Finally, a color variance analysis of multiple views and a local color refinement are individually conducted on the two categories to recover diffuse color information. Experimental evaluation by comparison with existing methods, based on our light field dataset together with the Stanford light field archive, verifies the effectiveness of our proposed algorithm. PMID:27253083

  12. 25. 'HANGAR SHEDS TRUSSES DETAILS; ARCHITECTURAL PLANS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    25. 'HANGAR SHEDS - TRUSSES - DETAILS; ARCHITECTURAL PLANS - PLANT AREA; MODIFICATION CENTER NO. 1, DAGGETT, CALIFORNIA.' Sections and details of trusses, ironwork, and joints, as modified to show ridge joint detail. As built. This blueline also shows the fire suppression system, added in orange pencil for 'Project 13: Bldgs. T-30, T-50, T-70, T-90' at a later, unspecified date. Contract no. W509 Eng. 2743; File no. 555/84, revision B, dated August 24, 1942. No sheet number. - Barstow-Daggett Airport, Hangar Shed No. 4, 39500 National Trails Highway, Daggett, San Bernardino County, CA

  13. Feedback about More Accurate versus Less Accurate Trials: Differential Effects on Self-Confidence and Activation

    ERIC Educational Resources Information Center

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-01-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On Day 1, participants performed a golf putting task under one of…

  14. Accurate object tracking system by integrating texture and depth cues

    NASA Astrophysics Data System (ADS)

    Chen, Ju-Chin; Lin, Yu-Hang

    2016-03-01

A robust object tracking system that is invariant to object appearance variations and background clutter is proposed. Multiple instance learning with a boosting algorithm is applied to select discriminant texture information between the object and background data. Additionally, depth information, which is important to distinguish the object from a complicated background, is integrated. We propose two depth-based models that can complement texture information to cope with both appearance variations and background clutter. Moreover, to reduce the risk of drift introduced by textureless depth templates, an update mechanism is proposed to select more precise tracking results and avoid incorrect model updates. In the experiments, the robustness of the proposed system is evaluated and quantitative results are provided for performance analysis. Experimental results show that the proposed system achieves the best success rate and more accurate tracking results than other well-known algorithms.

  15. Efficient and Accurate Indoor Localization Using Landmark Graphs

    NASA Astrophysics Data System (ADS)

    Gu, F.; Kealy, A.; Khoshelham, K.; Shang, J.

    2016-06-01

Indoor localization is important for a variety of applications such as location-based services, mobile social networks, and emergency response. Fusing spatial information is an effective way to achieve accurate indoor localization with little or no need for extra hardware. However, existing indoor localization methods that make use of spatial information are either too computationally expensive or too sensitive to the completeness of landmark detection. In this paper, we solve this problem by using the proposed landmark graph. The landmark graph is a directed graph where nodes are landmarks (e.g., doors, staircases, and turns) and edges are accessible paths with heading information. We compared the proposed method with two common Dead Reckoning (DR)-based methods (namely, Compass + Accelerometer + Landmarks and Gyroscope + Accelerometer + Landmarks) by a series of experiments. Experimental results show that the proposed method can achieve 73% accuracy with a positioning error less than 2.5 meters, which outperforms the other two DR-based methods.
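A toy illustration of the idea behind landmark-aided dead reckoning — the heading-driven position estimate drifts over time, and a detected landmark node with a surveyed position snaps the estimate back. All coordinates, names, and the heading convention (degrees clockwise from north) are hypothetical, not the paper's dataset:

```python
import math

landmarks = {                       # assumed surveyed positions (metres)
    "door_A": (0.0, 0.0),
    "turn_1": (5.0, 0.0),
    "stairs": (5.0, 8.0),
}
graph = {                           # directed edges: (next landmark, heading)
    "door_A": [("turn_1", 90.0)],
    "turn_1": [("stairs", 0.0)],
}

def dead_reckon(pos, heading_deg, step_len):
    """One pedestrian dead-reckoning step (heading clockwise from north)."""
    h = math.radians(heading_deg)
    return (pos[0] + step_len * math.sin(h), pos[1] + step_len * math.cos(h))

# walk the door_A -> turn_1 edge with a 2-degree heading bias: the
# estimate accumulates cross-track drift over ten half-metre steps
pos = landmarks["door_A"]
for _ in range(10):
    pos = dead_reckon(pos, 92.0, 0.5)
drift = abs(pos[1] - landmarks["turn_1"][1])   # drift before correction
pos = landmarks["turn_1"]                      # landmark detected: reset
```

The graph's edge headings also let a localizer reject spurious detections: a landmark reachable only by an edge whose heading disagrees with the recent motion direction is unlikely to be the one actually passed.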

  16. Feedback about more accurate versus less accurate trials: differential effects on self-confidence and activation.

    PubMed

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-06-01

One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On day 1, participants performed a golf putting task under one of two conditions: one group received feedback on the most accurate trials, whereas another group received feedback on the least accurate trials. On day 2, participants completed an anxiety questionnaire and performed a retention test. Skin conductance level, as a measure of arousal, was determined. The results indicated that feedback about more accurate trials resulted in more effective learning as well as increased self-confidence. Also, activation was a predictor of performance. PMID:22808705

  17. Accurate, practical simulation of satellite infrared radiometer spectral data

    SciTech Connect

    Sullivan, T.J.

    1982-09-01

    This study's purpose is to determine whether a relatively simple random band model formulation of atmospheric radiation transfer in the infrared region can provide valid simulations of narrow interval satellite-borne infrared sounder system data. Detailed ozonesondes provide the pertinent atmospheric information and sets of calibrated satellite measurements provide the validation. High resolution line-by-line model calculations are included to complete the evaluation.

  18. Two highly accurate methods for pitch calibration

    NASA Astrophysics Data System (ADS)

    Kniel, K.; Härtig, F.; Osawa, S.; Sato, O.

    2009-11-01

    Among profiles, helix and tooth thickness pitch is one of the most important parameters of an involute gear measurement evaluation. In principle, coordinate measuring machines (CMM) and CNC-controlled gear measuring machines as a variant of a CMM are suited for these kinds of gear measurements. Now the Japan National Institute of Advanced Industrial Science and Technology (NMIJ/AIST) and the German national metrology institute the Physikalisch-Technische Bundesanstalt (PTB) have each developed independently highly accurate pitch calibration methods applicable to CMM or gear measuring machines. Both calibration methods are based on the so-called closure technique which allows the separation of the systematic errors of the measurement device and the errors of the gear. For the verification of both calibration methods, NMIJ/AIST and PTB performed measurements on a specially designed pitch artifact. The comparison of the results shows that both methods can be used for highly accurate calibrations of pitch standards.
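The closure technique named in this record can be illustrated in a few lines: measure the artifact in every rotated mounting, so that each tooth visits each machine position; averaging over the mountings cancels the machine's systematic error and leaves the artifact's pitch deviations, and subtracting those from one mounting recovers the machine errors. A hedged NumPy sketch (zero-mean error vectors assumed; not either institute's actual procedure):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 12                               # teeth / angular positions
gear = rng.normal(0, 1.0, n)         # unknown pitch deviations of artifact
gear -= gear.mean()                  # deviations defined relative to mean
device = rng.normal(0, 0.5, n)       # unknown systematic machine errors
device -= device.mean()

# measure the artifact in n rotated mountings: in mounting k, tooth j
# lands on machine position (j + k) % n
M = np.array([[gear[j] + device[(j + k) % n] for j in range(n)]
              for k in range(n)])

# averaging over all mountings cancels the machine error -> gear errors
gear_est = M.mean(axis=0)
# subtracting the gear estimate from one mounting -> machine errors
device_est = M[0] - gear_est
```

Because every machine position contributes equally to each tooth's average, the separation is exact for zero-mean systematic errors; in practice random measurement noise is reduced by the averaging rather than eliminated.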

  19. Accurate guitar tuning by cochlear implant musicians.

    PubMed

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼ 30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081

  20. Accurate Guitar Tuning by Cochlear Implant Musicians

    PubMed Central

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081

  1. Preparation and accurate measurement of pure ozone.

    PubMed

    Janssen, Christof; Simone, Daniela; Guinet, Mickaël

    2011-03-01

    Preparation of high purity ozone as well as precise and accurate measurement of its pressure are metrological requirements that are difficult to meet due to ozone decomposition occurring in pressure sensors. The most stable and precise transducer heads are heated and, therefore, prone to accelerated ozone decomposition, limiting measurement accuracy and compromising purity. Here, we describe a vacuum system and a method for ozone production, suitable to accurately determine the pressure of pure ozone by avoiding the problem of decomposition. We use an inert gas in a particularly designed buffer volume and can thus achieve high measurement accuracy and negligible degradation of ozone with purities of 99.8% or better. The high degree of purity is ensured by comprehensive compositional analyses of ozone samples. The method may also be applied to other reactive gases. PMID:21456766

  2. Accurate guitar tuning by cochlear implant musicians.

    PubMed

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼ 30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task.

  3. Accurate modeling of parallel scientific computations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Townsend, James C.

    1988-01-01

    Scientific codes are usually parallelized by partitioning a grid among processors. To achieve top performance it is necessary to partition the grid so as to balance workload and minimize communication/synchronization costs. This problem is particularly acute when the grid is irregular, changes over the course of the computation, and is not known until load time. Critical mapping and remapping decisions rest on the ability to accurately predict performance, given a description of a grid and its partition. This paper discusses one approach to this problem, and illustrates its use on a one-dimensional fluids code. The models constructed are shown to be accurate, and are used to find optimal remapping schedules.
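The mapping decision described here — partition a grid so that per-processor workload is balanced — can be sketched for the 1-D case with a greedy prefix-sum split. This is a simplification under stated assumptions (contiguous chunks, workload balance only); the paper's performance models also weigh communication and synchronization costs:

```python
def partition_1d(weights, p):
    """Split cells into p contiguous chunks of roughly equal total weight.

    Greedy rule: cut whenever the running sum reaches the next ideal
    share. Returns cut indices; chunk i is weights[cuts[i]:cuts[i+1]].
    """
    total = sum(weights)
    target = total / p
    cuts, acc, made = [0], 0.0, 0
    for i, w in enumerate(weights):
        acc += w
        if acc >= target * (made + 1) and made < p - 1:
            cuts.append(i + 1)
            made += 1
    cuts.append(len(weights))
    return cuts

# heavier cells near the end of the grid pull the cut point left
print(partition_1d([1, 1, 1, 1, 4, 4, 4, 4], 2))  # → [0, 6, 8]
```

For an irregular grid that changes during the run, this split would be recomputed from updated weights, and the remapping is worthwhile only when the predicted gain exceeds the data-movement cost — exactly the prediction problem the abstract's models address.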

  4. Line gas sampling system ensures accurate analysis

    SciTech Connect

    Not Available

    1992-06-01

    Tremendous changes in the natural gas business have resulted in new approaches to the way natural gas is measured. Electronic flow measurement has altered the business forever, with developments in instrumentation and a new sensitivity to the importance of proper natural gas sampling techniques. This paper reports that YZ Industries Inc., Snyder, Texas, combined its 40 years of sampling experience with the latest in microprocessor-based technology to develop the KynaPak 2000 series, the first on-line natural gas sampling system that is both compact and extremely accurate. This means the composition of the sampled gas must be representative of the whole and related to flow. If so, relative measurement and sampling techniques are married, gas volumes are accurately accounted for and adjustments to composition can be made.

  5. Remembering the Specific Visual Details of Presented Objects: Neuroimaging Evidence for Effects of Emotion

    ERIC Educational Resources Information Center

    Kensinger, Elizabeth A.; Schacter, Daniel L.

    2007-01-01

    Memories can be retrieved with varied amounts of visual detail, and the emotional content of information can influence the likelihood that visual detail is remembered. In the present fMRI experiment (conducted with 19 adults scanned using a 3T magnet), we examined the neural processes that correspond with recognition of the visual details of…

  6. Principle of Detailed Balance in Kinetics

    ERIC Educational Resources Information Center

    Alberty, Robert A.

    2004-01-01

The effects of detailed balance on chemical kinetics are illustrated using monomolecular triangle reactions. A simple experiment illustrating oscillations, limit cycles, bifurcations, and noise is described along with oscillating reactions.

  7. Accurate maser positions for MALT-45

    NASA Astrophysics Data System (ADS)

    Jordan, Christopher; Bains, Indra; Voronkov, Maxim; Lo, Nadia; Jones, Paul; Muller, Erik; Cunningham, Maria; Burton, Michael; Brooks, Kate; Green, James; Fuller, Gary; Barnes, Peter; Ellingsen, Simon; Urquhart, James; Morgan, Larry; Rowell, Gavin; Walsh, Andrew; Loenen, Edo; Baan, Willem; Hill, Tracey; Purcell, Cormac; Breen, Shari; Peretto, Nicolas; Jackson, James; Lowe, Vicki; Longmore, Steven

    2013-10-01

    MALT-45 is an untargeted survey, mapping the Galactic plane in CS (1-0), Class I methanol masers, SiO masers and thermal emission, and high frequency continuum emission. After obtaining images from the survey, a number of masers were detected, but without accurate positions. This project seeks to resolve each maser and its environment, with the ultimate goal of placing the Class I methanol maser into a timeline of high mass star formation.

  8. Accurate maser positions for MALT-45

    NASA Astrophysics Data System (ADS)

    Jordan, Christopher; Bains, Indra; Voronkov, Maxim; Lo, Nadia; Jones, Paul; Muller, Erik; Cunningham, Maria; Burton, Michael; Brooks, Kate; Green, James; Fuller, Gary; Barnes, Peter; Ellingsen, Simon; Urquhart, James; Morgan, Larry; Rowell, Gavin; Walsh, Andrew; Loenen, Edo; Baan, Willem; Hill, Tracey; Purcell, Cormac; Breen, Shari; Peretto, Nicolas; Jackson, James; Lowe, Vicki; Longmore, Steven

    2013-04-01

    MALT-45 is an untargeted survey, mapping the Galactic plane in CS (1-0), Class I methanol masers, SiO masers and thermal emission, and high frequency continuum emission. After obtaining images from the survey, a number of masers were detected, but without accurate positions. This project seeks to resolve each maser and its environment, with the ultimate goal of placing the Class I methanol maser into a timeline of high mass star formation.

  9. Accurate Molecular Polarizabilities Based on Continuum Electrostatics

    PubMed Central

    Truchon, Jean-François; Nicholls, Anthony; Iftimie, Radu I.; Roux, Benoît; Bayly, Christopher I.

    2013-01-01

A novel approach for representing the intramolecular polarizability as a continuum dielectric is introduced to account for molecular electronic polarization. It is shown, using a finite-difference solution to the Poisson equation, that the Electronic Polarization from Internal Continuum (EPIC) model yields accurate gas-phase molecular polarizability tensors for a test set of 98 challenging molecules composed of heteroaromatics, alkanes and diatomics. The electronic polarization originates from a high intramolecular dielectric that produces polarizabilities consistent with B3LYP/aug-cc-pVTZ and experimental values when surrounded by vacuum dielectric. In contrast to other approaches to model electronic polarization, this simple model avoids the polarizability catastrophe and accurately calculates molecular anisotropy with the use of very few fitted parameters and without resorting to auxiliary sites or anisotropic atomic centers. On average, the unsigned errors in the average polarizability and anisotropy compared to B3LYP are 2% and 5%, respectively. The correlation between the polarizability components from B3LYP and this approach leads to an R2 of 0.990 and a slope of 0.999. Even the F2 anisotropy, shown to be a difficult case for existing polarizability models, can be reproduced within 2% error. In addition to providing new parameters for a rapid method directly applicable to the calculation of polarizabilities, this work extends the widely used Poisson equation to areas where accurate molecular polarizabilities matter. PMID:23646034

  10. High Frequency QRS ECG Accurately Detects Cardiomyopathy

    NASA Technical Reports Server (NTRS)

    Schlegel, Todd T.; Arenare, Brian; Poulin, Gregory; Moser, Daniel R.; Delgado, Reynolds

    2005-01-01

    High frequency (HF, 150-250 Hz) analysis over the entire QRS interval of the ECG is more sensitive than conventional ECG for detecting myocardial ischemia. However, the accuracy of HF QRS ECG for detecting cardiomyopathy is unknown. We obtained simultaneous resting conventional and HF QRS 12-lead ECGs in 66 patients with cardiomyopathy (EF = 23.2 ± 6.1%, mean ± SD) and in 66 age- and gender-matched healthy controls using PC-based ECG software recently developed at NASA. The single most accurate ECG parameter for detecting cardiomyopathy was an HF QRS morphological score that takes into consideration the total number and severity of reduced amplitude zones (RAZs) present plus the clustering of RAZs together in contiguous leads. This RAZ score had an area under the receiver operator curve (ROC) of 0.91, and was 88% sensitive, 82% specific and 85% accurate for identifying cardiomyopathy at the optimum score cut-off of 140 points. Although conventional ECG parameters such as the QRS and QTc intervals were also significantly longer in patients than controls (P < 0.001, BBBs excluded), these conventional parameters were less accurate (area under the ROC = 0.77 and 0.77, respectively) than HF QRS morphological parameters for identifying underlying cardiomyopathy. The total amplitude of the HF QRS complexes, as measured by summed root mean square voltages (RMSVs), also differed between patients and controls (33.8 ± 11.5 vs. 41.5 ± 13.6 mV, respectively, P < 0.003), but this parameter was even less accurate in distinguishing the two groups (area under ROC = 0.67) than the HF QRS morphologic and conventional ECG parameters. Diagnostic accuracy was optimal (86%) when the RAZ score from the HF QRS ECG and the QTc interval from the conventional ECG were used simultaneously with cut-offs of ≥ 40 points and ≥ 445 ms, respectively. In conclusion 12-lead HF QRS ECG employing
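
    The sensitivity, specificity and accuracy figures above follow from applying a score cut-off to the two groups. A small sketch of that computation with made-up scores (not the study's data):

```python
# Illustrative computation of sensitivity, specificity and accuracy for a
# score cut-off, in the spirit of the RAZ score; the scores are invented.
def classify_metrics(scores_patients, scores_controls, cutoff):
    tp = sum(s >= cutoff for s in scores_patients)   # patients flagged positive
    fn = len(scores_patients) - tp                   # patients missed
    tn = sum(s < cutoff for s in scores_controls)    # controls flagged negative
    fp = len(scores_controls) - tn                   # controls falsely flagged
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

sens, spec, acc = classify_metrics([160, 150, 120, 180], [100, 130, 90, 145], 140)
print(sens, spec, acc)  # 0.75 0.75 0.75
```

    Sweeping the cut-off and plotting sensitivity against (1 - specificity) traces out the ROC curve whose area is reported above.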

  11. Interior building details of Building C, Room C203: detail decorative ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Interior building details of Building C, Room C-203: detail decorative radiator and four-over-four windows; southwesterly view - San Quentin State Prison, Building 22, Point San Quentin, San Quentin, Marin County, CA

  12. Accurate thermoelastic tensor and acoustic velocities of NaCl

    NASA Astrophysics Data System (ADS)

    Marcondes, Michel L.; Shukla, Gaurav; da Silveira, Pedro; Wentzcovitch, Renata M.

    2015-12-01

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.
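
    The acoustic velocities mentioned above follow from the (thermo)elastic moduli and the density; for an isotropic aggregate the textbook relations are v_p = sqrt((K + 4G/3)/rho) and v_s = sqrt(G/rho). A sketch with rough ambient-condition NaCl values (illustrative numbers, not the paper's corrected results):

```python
import math

# Acoustic velocities from bulk modulus K, shear modulus G and density rho
# (isotropic aggregate). Moduli and density below are approximate
# ambient-condition NaCl values, used purely for illustration.
def velocities(K_gpa, G_gpa, rho_kgm3):
    K, G = K_gpa * 1e9, G_gpa * 1e9          # GPa -> Pa
    v_p = math.sqrt((K + 4.0 * G / 3.0) / rho_kgm3)  # compressional
    v_s = math.sqrt(G / rho_kgm3)                    # shear
    return v_p, v_s

vp, vs = velocities(25.0, 14.5, 2160.0)
print(vp, vs)  # roughly 4.5 km/s and 2.6 km/s
```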

  13. Can blind persons accurately assess body size from the voice?

    PubMed

    Pisanski, Katarzyna; Oleszkiewicz, Anna; Sorokowska, Agnieszka

    2016-04-01

    Vocal tract resonances provide reliable information about a speaker's body size that human listeners use for biosocial judgements as well as speech recognition. Although humans can accurately assess men's relative body size from the voice alone, how this ability is acquired remains unknown. In this study, we test the prediction that accurate voice-based size estimation is possible without prior audiovisual experience linking low frequencies to large bodies. Ninety-one healthy congenitally or early blind, late blind and sighted adults (aged 20-65) participated in the study. On the basis of vowel sounds alone, participants assessed the relative body sizes of male pairs of varying heights. Accuracy of voice-based body size assessments significantly exceeded chance and did not differ among sighted, congenitally blind and late blind participants. Accuracy increased significantly with relative differences in physical height between men, suggesting that both blind and sighted participants used reliable vocal cues to size (i.e. vocal tract resonances). Our findings demonstrate that prior visual experience is not necessary for accurate body size estimation. This capacity, integral to both nonverbal communication and speech perception, may be present at birth or may generalize from broader cross-modal correspondences. PMID:27095264
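
    The "reliable vocal cues to size" are the vocal tract resonances (formants), which in the uniform closed-open tube model scale inversely with vocal tract length: F_n = (2n - 1)c / 4L. A sketch inverting that relation (formant values illustrative, not from the study):

```python
# Rough vocal tract length estimate from formant frequencies, using the
# uniform closed-open tube model F_n = (2n - 1) * c / (4 * L).
SPEED_OF_SOUND = 35000.0  # cm/s in warm, humid air (approximate)

def vtl_estimate(formants_hz):
    # Each formant gives an independent length estimate; average them.
    estimates = [(2 * n - 1) * SPEED_OF_SOUND / (4.0 * f)
                 for n, f in enumerate(formants_hz, start=1)]
    return sum(estimates) / len(estimates)

print(vtl_estimate([500, 1500, 2500]))  # 17.5 (cm): a uniform ~17.5 cm tube
```

    Lower formants imply a longer tract, hence a larger speaker, which is the acoustic regularity both blind and sighted listeners appear to exploit.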

  14. Accurate whole human genome sequencing using reversible terminator chemistry.

    PubMed

    Bentley, David R; Balasubramanian, Shankar; Swerdlow, Harold P; Smith, Geoffrey P; Milton, John; Brown, Clive G; Hall, Kevin P; Evers, Dirk J; Barnes, Colin L; Bignell, Helen R; Boutell, Jonathan M; Bryant, Jason; Carter, Richard J; Keira Cheetham, R; Cox, Anthony J; Ellis, Darren J; Flatbush, Michael R; Gormley, Niall A; Humphray, Sean J; Irving, Leslie J; Karbelashvili, Mirian S; Kirk, Scott M; Li, Heng; Liu, Xiaohai; Maisinger, Klaus S; Murray, Lisa J; Obradovic, Bojan; Ost, Tobias; Parkinson, Michael L; Pratt, Mark R; Rasolonjatovo, Isabelle M J; Reed, Mark T; Rigatti, Roberto; Rodighiero, Chiara; Ross, Mark T; Sabot, Andrea; Sankar, Subramanian V; Scally, Aylwyn; Schroth, Gary P; Smith, Mark E; Smith, Vincent P; Spiridou, Anastassia; Torrance, Peta E; Tzonev, Svilen S; Vermaas, Eric H; Walter, Klaudia; Wu, Xiaolin; Zhang, Lu; Alam, Mohammed D; Anastasi, Carole; Aniebo, Ify C; Bailey, David M D; Bancarz, Iain R; Banerjee, Saibal; Barbour, Selena G; Baybayan, Primo A; Benoit, Vincent A; Benson, Kevin F; Bevis, Claire; Black, Phillip J; Boodhun, Asha; Brennan, Joe S; Bridgham, John A; Brown, Rob C; Brown, Andrew A; Buermann, Dale H; Bundu, Abass A; Burrows, James C; Carter, Nigel P; Castillo, Nestor; Chiara E Catenazzi, Maria; Chang, Simon; Neil Cooley, R; Crake, Natasha R; Dada, Olubunmi O; Diakoumakos, Konstantinos D; Dominguez-Fernandez, Belen; Earnshaw, David J; Egbujor, Ugonna C; Elmore, David W; Etchin, Sergey S; Ewan, Mark R; Fedurco, Milan; Fraser, Louise J; Fuentes Fajardo, Karin V; Scott Furey, W; George, David; Gietzen, Kimberley J; Goddard, Colin P; Golda, George S; Granieri, Philip A; Green, David E; Gustafson, David L; Hansen, Nancy F; Harnish, Kevin; Haudenschild, Christian D; Heyer, Narinder I; Hims, Matthew M; Ho, Johnny T; Horgan, Adrian M; Hoschler, Katya; Hurwitz, Steve; Ivanov, Denis V; Johnson, Maria Q; James, Terena; Huw Jones, T A; Kang, Gyoung-Dong; Kerelska, Tzvetana H; Kersey, Alan D; Khrebtukova, Irina; Kindwall, Alex P; Kingsbury, 
Zoya; Kokko-Gonzales, Paula I; Kumar, Anil; Laurent, Marc A; Lawley, Cynthia T; Lee, Sarah E; Lee, Xavier; Liao, Arnold K; Loch, Jennifer A; Lok, Mitch; Luo, Shujun; Mammen, Radhika M; Martin, John W; McCauley, Patrick G; McNitt, Paul; Mehta, Parul; Moon, Keith W; Mullens, Joe W; Newington, Taksina; Ning, Zemin; Ling Ng, Bee; Novo, Sonia M; O'Neill, Michael J; Osborne, Mark A; Osnowski, Andrew; Ostadan, Omead; Paraschos, Lambros L; Pickering, Lea; Pike, Andrew C; Pike, Alger C; Chris Pinkard, D; Pliskin, Daniel P; Podhasky, Joe; Quijano, Victor J; Raczy, Come; Rae, Vicki H; Rawlings, Stephen R; Chiva Rodriguez, Ana; Roe, Phyllida M; Rogers, John; Rogert Bacigalupo, Maria C; Romanov, Nikolai; Romieu, Anthony; Roth, Rithy K; Rourke, Natalie J; Ruediger, Silke T; Rusman, Eli; Sanches-Kuiper, Raquel M; Schenker, Martin R; Seoane, Josefina M; Shaw, Richard J; Shiver, Mitch K; Short, Steven W; Sizto, Ning L; Sluis, Johannes P; Smith, Melanie A; Ernest Sohna Sohna, Jean; Spence, Eric J; Stevens, Kim; Sutton, Neil; Szajkowski, Lukasz; Tregidgo, Carolyn L; Turcatti, Gerardo; Vandevondele, Stephanie; Verhovsky, Yuli; Virk, Selene M; Wakelin, Suzanne; Walcott, Gregory C; Wang, Jingwen; Worsley, Graham J; Yan, Juying; Yau, Ling; Zuerlein, Mike; Rogers, Jane; Mullikin, James C; Hurles, Matthew E; McCooke, Nick J; West, John S; Oaks, Frank L; Lundberg, Peter L; Klenerman, David; Durbin, Richard; Smith, Anthony J

    2008-11-01

    DNA sequence information underpins genetic research, enabling discoveries of important biological or medical benefit. Sequencing projects have traditionally used long (400-800 base pair) reads, but the existence of reference sequences for the human and many other genomes makes it possible to develop new, fast approaches to re-sequencing, whereby shorter reads are compared to a reference to identify intraspecies genetic variation. Here we report an approach that generates several billion bases of accurate nucleotide sequence per experiment at low cost. Single molecules of DNA are attached to a flat surface, amplified in situ and used as templates for synthetic sequencing with fluorescent reversible terminator deoxyribonucleotides. Images of the surface are analysed to generate high-quality sequence. We demonstrate application of this approach to human genome sequencing on flow-sorted X chromosomes and then scale the approach to determine the genome sequence of a male Yoruba from Ibadan, Nigeria. We build an accurate consensus sequence from >30x average depth of paired 35-base reads. We characterize four million single-nucleotide polymorphisms and four hundred thousand structural variants, many of which were previously unknown. Our approach is effective for accurate, rapid and economical whole-genome re-sequencing and many other biomedical applications.
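
    The claim that >30x average depth supports an accurate consensus can be illustrated with the simple Poisson (Lander-Waterman) coverage model, under which the expected uncovered fraction of the genome at mean depth c is e^-c. A sketch (model choice is mine, not the paper's analysis):

```python
import math

# Poisson coverage model: at mean depth c, the expected fraction of the
# genome left uncovered is exp(-c).
def uncovered_fraction(depth):
    return math.exp(-depth)

# Mean depth from read count, read length and genome length.
def expected_depth(n_reads, read_len, genome_len):
    return n_reads * read_len / genome_len

print(uncovered_fraction(30))  # ~9.4e-14: essentially complete coverage at 30x
```

    Real coverage is less uniform than Poisson (GC bias, repeats), which is one reason depths well above the naive minimum are used in practice.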

  15. Accurate thermoelastic tensor and acoustic velocities of NaCl

    SciTech Connect

    Marcondes, Michel L.; Shukla, Gaurav; Silveira, Pedro da; Wentzcovitch, Renata M.

    2015-12-15

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  16. Balancing detail and scale in assessing transparency to improve the governance of agricultural commodity supply chains

    NASA Astrophysics Data System (ADS)

    Godar, Javier; Suavet, Clément; Gardner, Toby A.; Dawkins, Elena; Meyfroidt, Patrick

    2016-03-01

    To date, assessments of the sustainability of agricultural commodity supply chains have largely relied on some combination of macro-scale footprint accounts, detailed life-cycle analyses and fine-scale traceability systems. Yet these approaches are limited in their ability to support the sustainability governance of agricultural supply chains, whether because they are intended for coarser-grained analyses, do not identify individual actors, or are too costly to be implemented in a consistent manner for an entire region of production. Here we illustrate some of the advantages of a complementary middle-ground approach that balances detail and scale of supply chain transparency information by combining consistent country-wide data on commodity production at the sub-national (e.g. municipal) level with per shipment customs data to describe trade flows of a given commodity covering all companies and production regions within that country. This approach can support supply chain governance in two key ways. First, enhanced spatial resolution of the production regions that connect to individual supply chains allows for a more accurate consideration of geographic variability in measures of risk and performance that are associated with different production practices. Second, identification of key actors that operate within a specific supply chain, including producers, traders, shippers and consumers can help discriminate coalitions of actors that have shared stake in a particular region, and that together are capable of delivering more cost-effective and coordinated interventions. We illustrate the potential of this approach with examples from Brazil, Indonesia and Colombia. We discuss how transparency information can deepen understanding of the environmental and social impacts of commodity production systems, how benefits are distributed among actors, and some of the trade-offs involved in efforts to improve supply chain sustainability. We then discuss the challenges and

  17. D-BRAIN: Anatomically Accurate Simulated Diffusion MRI Brain Data.

    PubMed

    Perrone, Daniele; Jeurissen, Ben; Aelterman, Jan; Roine, Timo; Sijbers, Jan; Pizurica, Aleksandra; Leemans, Alexander; Philips, Wilfried

    2016-01-01

    Diffusion Weighted (DW) MRI allows for the non-invasive study of water diffusion inside living tissues. As such, it is useful for the investigation of human brain white matter (WM) connectivity in vivo through fiber tractography (FT) algorithms. Many DW-MRI tailored restoration techniques and FT algorithms have been developed. However, it is not clear how accurately these methods reproduce the WM bundle characteristics in real-world conditions, such as in the presence of noise, partial volume effect, and a limited spatial and angular resolution. The difficulty lies in the lack of a realistic brain phantom on the one hand, and a sufficiently accurate way of modeling the acquisition-related degradation on the other. This paper proposes a software phantom that approximates a human brain to a high degree of realism and that can incorporate complex brain-like structural features. We refer to it as a Diffusion BRAIN (D-BRAIN) phantom. Also, we propose an accurate model of a (DW) MRI acquisition protocol to allow for validation of methods in realistic conditions with data imperfections. The phantom model simulates anatomical and diffusion properties for multiple brain tissue components, and can serve as a ground-truth to evaluate FT algorithms, among others. The simulation of the acquisition process allows one to include noise, partial volume effects, and limited spatial and angular resolution in the images. In this way, the effect of image artifacts on, for instance, fiber tractography can be investigated with great detail. The proposed framework enables reliable and quantitative evaluation of DW-MR image processing and FT algorithms at the level of large-scale WM structures. The effect of noise levels and other data characteristics on cortico-cortical connectivity and tractography-based grey matter parcellation can be investigated as well. PMID:26930054
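
    A phantom like D-BRAIN rests on a forward model of the DW signal; the simplest such model is the diffusion tensor, S = S0 exp(-b g^T D g). A minimal sketch of that forward model (tensor values illustrative; D-BRAIN's actual simulation is richer than this):

```python
import numpy as np

# Minimal diffusion-tensor signal model S = S0 * exp(-b * g^T D g).
# b in s/mm^2, D in mm^2/s, g a gradient direction.
def dwi_signal(S0, b, D, g):
    g = np.asarray(g, dtype=float)
    g = g / np.linalg.norm(g)            # unit gradient direction
    return S0 * np.exp(-b * g @ D @ g)

# Anisotropic tensor: fast diffusion along x (e.g. along a fiber bundle).
D = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
print(dwi_signal(1.0, 1000.0, D, [1, 0, 0]))  # stronger attenuation along the fiber
print(dwi_signal(1.0, 1000.0, D, [0, 1, 0]))  # weaker attenuation across it
```

    Adding Rician noise, partial-volume mixing and limited resolution to such signals is exactly the kind of acquisition degradation the phantom framework is meant to model.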

  18. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy

    PubMed Central

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T.; Cerutti, Francesco; Chin, Mary P. W.; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G.; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R.; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both 4He and 12C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions leads to the excellent agreement of calculated depth–dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is also capable of importing radiotherapy treatment data described in the DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features. PMID:27242956

  19. The FLUKA Code: An Accurate Simulation Tool for Particle Therapy.

    PubMed

    Battistoni, Giuseppe; Bauer, Julia; Boehlen, Till T; Cerutti, Francesco; Chin, Mary P W; Dos Santos Augusto, Ricardo; Ferrari, Alfredo; Ortega, Pablo G; Kozłowska, Wioletta; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with both (4)He and (12)C ion beams. Accurate description of ionization energy losses and of particle scattering and interactions leads to the excellent agreement of calculated depth-dose profiles with those measured at leading European hadron therapy centers, both with proton and ion beams. In order to support the application of FLUKA in hospital-based environments, Flair, the FLUKA graphical interface, has been enhanced with the capability of translating CT DICOM images into voxel-based computational phantoms in a fast and well-structured way. The interface is also capable of importing radiotherapy treatment data described in the DICOM RT standard. In addition, the interface is equipped with an intuitive PET scanner geometry generator and automatic recording of coincidence events. Clinically, similar cases will be presented both in terms of absorbed dose and biological dose calculations describing the various available features. PMID:27242956

  20. Vadose zone transport field study: Detailed test plan for simulated leak tests

    SciTech Connect

    AL Ward; GW Gee

    2000-06-23

    The US Department of Energy (DOE) Groundwater/Vadose Zone Integration Project Science and Technology initiative was created in FY 1999 to reduce the uncertainty associated with vadose zone transport processes beneath waste sites at DOE's Hanford Site near Richland, Washington. This information is needed not only to evaluate the risks from transport, but also to support the adoption of measures for minimizing impacts to the groundwater and surrounding environment. The principal uncertainties in vadose zone transport are the current distribution of source contaminants and the natural heterogeneity of the soil in which the contaminants reside. Oversimplified conceptual models resulting from these uncertainties and limited use of hydrologic characterization and monitoring technologies have hampered the understanding of contaminant migration through Hanford's vadose zone. Essential prerequisites for reducing vadose transport uncertainty include the development of accurate conceptual models and the development or adoption of monitoring techniques capable of delineating the current distributions of source contaminants and characterizing natural site heterogeneity. The Vadose Zone Transport Field Study (VZTFS) was conceived as part of the initiative to address the major uncertainties confronting vadose zone fate and transport predictions at the Hanford Site and to overcome the limitations of previous characterization attempts. Pacific Northwest National Laboratory (PNNL) is managing the VZTFS for DOE. The VZTFS will conduct field investigations that will improve the understanding of field-scale transport and lead to the development or identification of efficient and cost-effective characterization methods. Ideally, these methods will capture the extent of contaminant plumes using existing infrastructure (i.e., more than 1,300 steel-cased boreholes). The objectives of the VZTFS are to conduct controlled transport experiments at well-instrumented field sites at Hanford to

  1. Accurately Mapping M31's Microlensing Population

    NASA Astrophysics Data System (ADS)

    Crotts, Arlin

    2004-07-01

    We propose to augment an existing microlensing survey of M31 with source identifications provided by a modest amount of ACS (and WFPC2 parallel) observations to yield an accurate measurement of the masses responsible for microlensing in M31, and presumably much of its dark matter. The main benefit of these data is the determination of the physical (or "Einstein") timescale of each microlensing event, rather than an effective ("FWHM") timescale, allowing masses to be determined more than twice as accurately as without HST data. The Einstein timescale is the ratio of the lensing cross-sectional radius and relative velocities. Velocities are known from kinematics, and the cross-section is directly proportional to the (unknown) lensing mass. We cannot easily measure these quantities without knowing the amplification, hence the baseline magnitude, which requires the resolution of HST to find the source star. This makes a crucial difference because M31 lens mass determinations can be more accurate than those towards the Magellanic Clouds through our Galaxy's halo (for the same number of microlensing events) due to the better constrained geometry in the M31 microlensing situation. Furthermore, our larger survey, just completed, should yield at least 100 M31 microlensing events, more than any Magellanic survey. A small amount of ACS+WFPC2 imaging will deliver the potential of this large database (about 350 nights). For the whole survey (and a delta-function mass distribution) the mass error should approach only about 15%, or about 6% error in slope for a power-law distribution. These results will better allow us to pinpoint the lens halo fraction, and the shape of the halo lens spatial distribution, and allow generalization/comparison of the nature of halo dark matter in spiral galaxies. In addition, we will be able to establish the baseline magnitude for about 50,000 variable stars, as well as measure an unprecedentedly detailed color-magnitude diagram and luminosity
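
    The Einstein timescale discussed above is t_E = R_E / v_rel, with R_E = sqrt(4 G M D_L D_LS / (c^2 D_S)), which is why pinning down the amplification (and hence the true t_E) constrains the lens mass. A back-of-envelope sketch with illustrative numbers, not values from the proposal:

```python
import math

# Einstein radius and timescale for a point lens:
#   R_E = sqrt(4 G M D_L (D_S - D_L) / (c^2 D_S)),  t_E = R_E / v_rel.
G = 6.674e-11        # m^3 kg^-1 s^-2
C = 2.998e8          # m/s
M_SUN = 1.989e30     # kg
KPC = 3.086e19       # m

def einstein_timescale(mass_kg, d_lens_m, d_source_m, v_rel_ms):
    r_e = math.sqrt(4 * G * mass_kg * d_lens_m * (d_source_m - d_lens_m)
                    / (C**2 * d_source_m))
    return r_e / v_rel_ms  # seconds

# Illustrative: 0.5 solar-mass halo lens halfway to an M31 source (~770 kpc),
# relative velocity ~200 km/s.
t_days = einstein_timescale(0.5 * M_SUN, 385 * KPC, 770 * KPC, 2.0e5) / 86400.0
print(t_days)
```

    With v_rel known from kinematics, a measured t_E converts directly into a lens mass estimate, which is the chain of inference the proposal relies on.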

  2. Accurate upwind methods for the Euler equations

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1993-01-01

    A new class of piecewise linear methods for the numerical solution of the one-dimensional Euler equations of gas dynamics is presented. These methods are uniformly second-order accurate, and can be considered as extensions of Godunov's scheme. With an appropriate definition of monotonicity preservation for the case of linear convection, it can be shown that they preserve monotonicity. Similar to Van Leer's MUSCL scheme, they consist of two key steps: a reconstruction step followed by an upwind step. For the reconstruction step, a monotonicity constraint that preserves uniform second-order accuracy is introduced. Computational efficiency is enhanced by devising a criterion that detects the 'smooth' part of the data where the constraint is redundant. The concept and coding of the constraint are simplified by the use of the median function. A slope steepening technique, which has no effect at smooth regions and can resolve a contact discontinuity in four cells, is described. As for the upwind step, existing and new methods are applied in a manner slightly different from those in the literature. These methods are derived by approximating the Euler equations via linearization and diagonalization. At a 'smooth' interface, Harten, Lax, and Van Leer's one intermediate state model is employed. A modification for this model that can resolve contact discontinuities is presented. Near a discontinuity, either this modified model or a more accurate one, namely, Roe's flux-difference splitting, is used. The current presentation of Roe's method, via the conceptually simple flux-vector splitting, not only establishes a connection between the two splittings, but also leads to an admissibility correction with no conditional statement, and an efficient approximation to Osher's approximate Riemann solver. These reconstruction and upwind steps result in schemes that are uniformly second-order accurate and economical at smooth regions, and yield high resolution at discontinuities.
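
    The median function simplifies limiter coding because of the identity median(a, b, c) = a + minmod(b - a, c - a). A sketch of a median-based limited slope for the reconstruction step; this is a generic MC-type monotonicity constraint in the spirit described above, not necessarily the paper's exact formula:

```python
# Sign-aware "smaller magnitude" of two numbers; zero if they disagree in sign.
def minmod(x, y):
    if x * y <= 0.0:
        return 0.0
    return x if abs(x) < abs(y) else y

# Median of three numbers via the minmod identity used in limiter coding.
def median(a, b, c):
    return a + minmod(b - a, c - a)

def limited_slope(u_left, u_center, u_right):
    s_b = u_center - u_left          # backward difference
    s_f = u_right - u_center         # forward difference
    s_c = 0.5 * (s_b + s_f)          # central (second-order) slope
    # Constrain the central slope to lie between 0 and 2*minmod(s_b, s_f):
    # untouched in smooth monotone data, zeroed at extrema.
    return median(0.0, s_c, 2.0 * minmod(s_b, s_f))

print(limited_slope(1.0, 2.0, 3.0), limited_slope(1.0, 2.0, 1.0))  # 1.0 0.0
```

    On smooth monotone data the limiter returns the second-order central slope unchanged, which is how uniform second-order accuracy is preserved away from extrema.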

  3. Accurate measurement of unsteady state fluid temperature

    NASA Astrophysics Data System (ADS)

    Jaremkiewicz, Magdalena

    2016-07-01

    In this paper, two accurate methods for determining the transient fluid temperature were presented. Measurements were conducted for boiling water since its temperature is known. At the beginning the thermometers are at the ambient temperature and next they are immediately immersed into saturated water. The measurements were carried out with two thermometers of different construction but with the same housing outer diameter equal to 15 mm. One of them is a K-type industrial thermometer widely available commercially. The temperature indicated by the thermometer was corrected considering the thermometers as first- or second-order inertia devices. The new design of a thermometer was proposed and also used to measure the temperature of boiling water. Its characteristic feature is a cylinder-shaped housing with the sheath thermocouple located in its center. The temperature of the fluid was determined based on measurements taken in the axis of the solid cylindrical element (housing) using the inverse space marching method. Measurements of the transient temperature of the air flowing through the wind tunnel using the same thermometers were also carried out. The proposed measurement technique provides more accurate results compared with measurements using industrial thermometers in conjunction with a simple temperature correction based on a first- or second-order inertia model. By comparing the results, it was demonstrated that the new thermometer yields the fluid temperature much faster and with higher accuracy than the industrial thermometer. Accurate measurements of the fast-changing fluid temperature are possible due to the low-inertia thermometer and the fast space marching method applied for solving the inverse heat conduction problem.
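
    The first-order inertia correction referred to above recovers the fluid temperature from the indicated temperature via T_fluid = T_ind + tau * dT_ind/dt. A sketch on synthetic step-response data (the tau value and the step are illustrative, not the paper's measurements):

```python
import math

# First-order thermometer model: tau * dT_ind/dt + T_ind = T_fluid,
# so T_fluid can be recovered as T_ind + tau * dT_ind/dt.
tau = 3.0                      # thermometer time constant, s (assumed)
dt = 0.01                      # sampling interval, s
t = [i * dt for i in range(1000)]

# Simulated indication for a fluid step from 20 C to 100 C at t = 0:
T_ind = [100.0 - 80.0 * math.exp(-ti / tau) for ti in t]

# Central-difference derivative, then the corrected (fluid) temperature:
T_fluid = [T_ind[i] + tau * (T_ind[i + 1] - T_ind[i - 1]) / (2 * dt)
           for i in range(1, len(T_ind) - 1)]

print(T_fluid[10])  # ~100.0: recovered even though the indication still reads ~23
```

    With real, noisy signals the derivative term amplifies noise, which is one motivation for the more elaborate inverse space marching method used with the new thermometer design.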

  4. Detailed seafloor habitat mapping to enhance marine-resource management

    USGS Publications Warehouse

    Zawada, David G.; Hart, Kristen M.

    2010-01-01

    Pictures of the seafloor capture important information about the sediments, exposed geologic features, submerged aquatic vegetation, and animals found in a given habitat. With the emergence of marine protected areas (MPAs) as a favored tactic for preserving coral reef resources, knowledge of essential habitat components is paramount to designing effective management strategies. Surprisingly, detailed information on seafloor habitat components is not available in many areas that are being considered for MPA designation or that are already designated as MPAs. A task of the U.S. Geological Survey Coral Reef Ecosystem STudies (USGS CREST) project is addressing this issue.

  5. The first accurate description of an aurora

    NASA Astrophysics Data System (ADS)

    Schröder, Wilfried

    2006-12-01

    As technology has advanced, the scientific study of auroral phenomena has increased by leaps and bounds. A look back at the earliest descriptions of aurorae offers interesting insight into how medieval scholars viewed the subjects that we study. Although there are earlier fragmentary references in the literature, the first accurate description of the aurora borealis appears to be that published by the German Catholic scholar Konrad von Megenberg (1309-1374) in his book Das Buch der Natur (The Book of Nature). The book was written between 1349 and 1350.

  6. Accurate density functional thermochemistry for larger molecules.

    SciTech Connect

    Raghavachari, K.; Stefanov, B. B.; Curtiss, L. A.; Lucent Tech.

    1997-06-20

    Density functional methods are combined with isodesmic bond separation reaction energies to yield accurate thermochemistry for larger molecules. Seven different density functionals are assessed for the evaluation of heats of formation, ΔHf(298 K), for a test set of 40 molecules composed of H, C, O and N. The use of bond separation energies results in a dramatic improvement in the accuracy of all the density functionals. The B3-LYP functional has the smallest mean absolute deviation from experiment (1.5 kcal/mol).
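
    The isodesmic bond-separation route combines a computed reaction enthalpy with experimental reference heats of formation, so systematic DFT errors largely cancel. A sketch for propane via C3H8 + CH4 -> 2 C2H6; the "computed" reaction enthalpy below is a stand-in value, not a result from the paper:

```python
# Isodesmic bond-separation estimate of a heat of formation (kcal/mol, 298 K).
# Reference heats of formation are experimental values for the small species.
dHf_CH4 = -17.8      # experimental
dHf_C2H6 = -20.0     # experimental
dH_rxn_dft = 2.8     # hypothetical DFT bond-separation enthalpy for
                     # C3H8 + CH4 -> 2 C2H6 (products minus reactants)

# dH_rxn = 2*dHf(C2H6) - dHf(C3H8) - dHf(CH4)  =>  solve for dHf(C3H8):
dHf_C3H8 = 2 * dHf_C2H6 - dHf_CH4 - dH_rxn_dft
print(dHf_C3H8)  # -25.0
```

    Because both sides of an isodesmic reaction have the same number of each bond type, the raw DFT error in dH_rxn is far smaller than the error in a direct atomization-energy route.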

  7. Universality: Accurate Checks in Dyson's Hierarchical Model

    NASA Astrophysics Data System (ADS)

    Godina, J. J.; Meurice, Y.; Oktay, M. B.

    2003-06-01

    In this talk we present high-accuracy calculations of the susceptibility near βc for Dyson's hierarchical model in D = 3. Using linear fitting, we estimate the leading (γ) and subleading (Δ) exponents. Independent estimates are obtained by calculating the first two eigenvalues of the linearized renormalization group transformation. We found γ = 1.29914073 ± 10^-8 and Δ = 0.4259469 ± 10^-7, independently of the choice of local integration measure (Ising or Landau-Ginzburg). After a suitable rescaling, the approximate fixed points for a large class of local measures coincide accurately with a fixed point constructed by Koch and Wittwer.
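
    The "linear fitting" referred to above extracts γ as the negative slope of log χ versus log t, with t = (βc - β)/βc the reduced coupling. A sketch of the extraction on synthetic power-law data (the paper's high-accuracy fits also handle the subleading correction governed by Δ):

```python
import math

# Synthetic susceptibility data chi = A * t^(-gamma); values illustrative.
gamma_true, A = 1.2991, 0.7
ts = [10.0**(-k) for k in range(2, 8)]
chis = [A * t**(-gamma_true) for t in ts]

# Least-squares slope of log(chi) vs log(t) gives -gamma.
xs = [math.log(t) for t in ts]
ys = [math.log(c) for c in chis]
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar)**2 for x in xs))
print(-slope)  # recovers gamma = 1.2991 on this exact power law
```

    With real data, the subleading term A' * t^(-gamma + Delta) bends the log-log line near t -> 0, which is what the two-exponent fits disentangle.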

  8. Informational and Normative Influences in Conformity from a Neurocomputational Perspective.

    PubMed

    Toelch, Ulf; Dolan, Raymond J

    2015-10-01

    We consider two distinct influences that drive conformity behaviour. Whereas informational influences facilitate adaptive and accurate responses, normative influences bias decisions to enhance social acceptance. We explore these influences from a perspective of perceptual and value-based decision-making models and apply these models to classical works on conformity. We argue that an informational account predicts a surprising tendency to conform. Moreover, we detail how normative influences fit into this framework and interact with social influences. Finally, we explore potential neuronal substrates for informational and normative influences based on a consideration of the neurobiological literature, highlighting conceptual shortcomings particularly with regard to a failure to segregate informational and normative influences.

  9. Detail in architecture: Between arts & crafts

    NASA Astrophysics Data System (ADS)

    Dulencin, Juraj

    2016-06-01

    Architectural detail represents an important part of architecture. Not only can it serve as an identifier of a specific building, but it also enhances the experience of the realized project. Within it lie the signs of a great architect and clues to understanding his or her way of thinking. It is therefore the central topic of a seminar offered to architecture students at the Brno University of Technology. During the semester-long class, the students acquaint themselves with atypical architectural details of domestic and international architects by learning to read them, understand them and subsequently draw them in architectural blueprints. In other words, through general analysis of a detail, the students learn the theoretical thinking of its architect who, depending on the nature of the design, had to incorporate a variety of techniques and crafts. Students apply this analysis to their own architectural detail designs. The methodology of the seminar consists of experiential learning through project management and is complemented by a series of lectures discussing a diversity of details as well as the materials and technologies required to implement them. The architectural detail design is also part of the students' bachelor's theses; the realistic nature of their blueprints can therefore be verified in the production of a physical counterpart. Based on their own documentation, the students choose the most suitable manufacturing process, whether it is supplied by a specific technology or by a craftsman. Students actively participate in the production and correct their design proposals at real scale with the actual material. A student, as a future architect, stands somewhere between a client and an artisan, materializing his or her idea and adjusting the manufacturing process so that the final detail achieves aesthetic consistency and is in harmony with its initial concept. One of the very important aspects of the design is its economic cost, an

  10. Iterative feature refinement for accurate undersampled MR image reconstruction

    NASA Astrophysics Data System (ADS)

    Wang, Shanshan; Liu, Jianbo; Liu, Qiegen; Ying, Leslie; Liu, Xin; Zheng, Hairong; Liang, Dong

    2016-05-01

    Accelerating MR scanning is of great significance for clinical, research and advanced applications, and one main effort to achieve this is the utilization of compressed sensing (CS) theory. Nevertheless, existing CS-MRI approaches still have limitations such as fine-structure loss or high computational complexity. This paper proposes a novel iterative feature refinement (IFR) module for accurate MR image reconstruction from undersampled k-space data. Integrating IFR with CS-MRI equipped with fixed transforms, we develop an IFR-CS method to restore meaningful structures and details that would otherwise be discarded, without introducing much additional complexity. Specifically, the proposed IFR-CS is realized with three iterative steps, namely sparsity-promoting denoising, feature refinement and Tikhonov regularization. Experimental results on both simulated and in vivo MR datasets show that the proposed module has a strong capability to capture image details, and that IFR-CS is comparable and even superior to other state-of-the-art reconstruction approaches.
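    The three iterative steps named in the abstract can be sketched schematically. This is a re-implementation under simplifying assumptions (an image-domain sparsity model, a soft-thresholding denoiser, and arbitrary refinement/regularization weights), not the authors' code:

```python
import numpy as np

def soft_threshold(z, t):
    """Complex soft-thresholding: shrink magnitudes by t."""
    mag = np.abs(z)
    return z / np.maximum(mag, 1e-12) * np.maximum(mag - t, 0.0)

def ifr_cs(y, mask, n_iter=100, thresh=0.05, lam=0.01, w=0.2):
    """Schematic IFR-CS loop: (1) sparsity-promoting denoising,
    (2) feature refinement, (3) Tikhonov-regularized data consistency.
    y is the measured k-space, zero outside the sampling mask."""
    x = np.fft.ifft2(y)                    # zero-filled starting image
    for _ in range(n_iter):
        xd = soft_threshold(x, thresh)     # 1) denoise (sparse-image model)
        x_ref = xd + w * (x - xd)          # 2) refine: restore some detail
        X_ref = np.fft.fft2(x_ref)
        # 3) Tikhonov data consistency, closed form in k-space:
        #    blend measured samples with the reference on the mask
        X = np.where(mask, (y + lam * X_ref) / (1.0 + lam), X_ref)
        x = np.fft.ifft2(X)
    return x

# Demo on a sparse "phantom": a few point sources, 60% random sampling
rng = np.random.default_rng(0)
img = np.zeros((32, 32))
img[[5, 12, 20, 27, 9], [7, 25, 14, 3, 30]] = 1.0
mask = rng.random((32, 32)) < 0.6
y = np.fft.fft2(img) * mask
recon = ifr_cs(y, mask)
err_zf = np.linalg.norm(np.fft.ifft2(y) - img)   # zero-filled error
err_ifr = np.linalg.norm(recon - img)            # IFR-CS error
```

    On this toy phantom the iterative loop substantially reduces the reconstruction error relative to the zero-filled baseline, illustrating why the data-consistency step anchors the sampled k-space while the denoiser suppresses aliasing.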

  11. Accurate shear measurement with faint sources

    SciTech Connect

    Zhang, Jun; Foucaud, Sebastien; Luo, Wentao E-mail: walt@shao.ac.cn

    2015-01-01

    For cosmic shear to become an accurate cosmological probe, systematic errors in the shear measurement method must be unambiguously identified and corrected for. Previous work in this series has demonstrated that cosmic shear can be measured accurately in Fourier space in the presence of background noise and finite pixel size, without assumptions about the morphologies of the galaxy and PSF. The remaining major source of error is source Poisson noise, due to the finite number of source photons. This problem is particularly important for faint galaxies in space-based weak lensing measurements, and for ground-based images with short exposure times. In this work, we propose a simple and rigorous way of removing the shear bias caused by source Poisson noise. Our noise treatment can be generalized to images made of multiple exposures through MultiDrizzle. This is demonstrated with the SDSS and COSMOS/ACS data. With a large ensemble of mock galaxy images of unrestricted morphologies, we show that our shear measurement method can achieve sub-percent accuracy even for images with signal-to-noise ratios below 5, making it the most promising technique for cosmic shear measurement in the ongoing and upcoming large-scale galaxy surveys.

  12. Accurate basis set truncation for wavefunction embedding

    NASA Astrophysics Data System (ADS)

    Barnes, Taylor A.; Goodpaster, Jason D.; Manby, Frederick R.; Miller, Thomas F.

    2013-07-01

    Density functional theory (DFT) provides a formally exact framework for performing embedded subsystem electronic structure calculations, including DFT-in-DFT and wavefunction theory-in-DFT descriptions. In the interest of efficiency, it is desirable to truncate the atomic orbital basis set in which the subsystem calculation is performed, thus avoiding high-order scaling with respect to the size of the MO virtual space. In this study, we extend a recently introduced projection-based embedding method [F. R. Manby, M. Stella, J. D. Goodpaster, and T. F. Miller III, J. Chem. Theory Comput. 8, 2564 (2012)], 10.1021/ct300544e to allow for the systematic and accurate truncation of the embedded subsystem basis set. The approach is applied to both covalently and non-covalently bound test cases, including water clusters and polypeptide chains, and it is demonstrated that errors associated with basis set truncation are controllable to well within chemical accuracy. Furthermore, we show that this approach allows for switching between accurate projection-based embedding and DFT embedding with approximate kinetic energy (KE) functionals; in this sense, the approach provides a means of systematically improving upon the use of approximate KE functionals in DFT embedding.

  13. Accurate determination of characteristic relative permeability curves

    NASA Astrophysics Data System (ADS)

    Krause, Michael H.; Benson, Sally M.

    2015-09-01

    A recently developed technique to accurately characterize sub-core scale heterogeneity is applied to investigate the factors responsible for the flowrate-dependent effective relative permeability curves measured on core samples in the laboratory. The dependency of laboratory-measured relative permeability on flowrate has long been both supported and challenged by a number of investigators. Studies have shown that this apparent flowrate dependency is a result of both sub-core scale heterogeneity and outlet boundary effects. However, this has only been demonstrated numerically for highly simplified models of porous media. In this paper, flowrate dependency of effective relative permeability is demonstrated using two rock cores, a Berea Sandstone and a heterogeneous sandstone from the Otway Basin Pilot Project in Australia. Numerical simulations of steady-state coreflooding experiments are conducted at a number of injection rates using a single set of input characteristic relative permeability curves. Effective relative permeability is then calculated from the simulation data using standard interpretation methods for calculating relative permeability from steady-state tests. Results show that simplified approaches may be used to determine flowrate-independent characteristic relative permeability provided the flow rate is sufficiently high and the core heterogeneity is relatively low. It is also shown that characteristic relative permeability can be determined at any typical flowrate, and even for geologically complex models, when using accurate three-dimensional models.

  14. How Accurately can we Calculate Thermal Systems?

    SciTech Connect

    Cullen, D; Blomquist, R N; Dean, C; Heinrichs, D; Kalugin, M A; Lee, M; Lee, Y; MacFarlan, R; Nagaya, Y; Trkov, A

    2004-04-20

    I would like to determine how accurately a variety of neutron transport code packages (code and cross section libraries) can calculate simple integral parameters, such as K_eff, for systems that are sensitive to thermal neutron scattering. Since we will only consider theoretical systems, we cannot really determine absolute accuracy compared to any real system. Therefore, rather than accuracy, it would be more precise to say that I would like to determine the spread in answers that we obtain from a variety of code packages. This spread should serve as an excellent indicator of how accurately we can really model and calculate such systems today. Hopefully, eventually this will lead to improvements in both our codes and the thermal scattering models that they use in the future. In order to accomplish this I propose a number of extremely simple systems that involve thermal neutron scattering that can be easily modeled and calculated by a variety of neutron transport codes. These are theoretical systems designed to emphasize the effects of thermal scattering, since that is what we are interested in studying. I have attempted to keep these systems very simple, and yet at the same time they include most, if not all, of the important thermal scattering effects encountered in a large, water-moderated, uranium fueled thermal system, i.e., our typical thermal reactors.

  15. Accurate Stellar Parameters for Exoplanet Host Stars

    NASA Astrophysics Data System (ADS)

    Brewer, John Michael; Fischer, Debra; Basu, Sarbani; Valenti, Jeff A.

    2015-01-01

    A large impediment to our understanding of planet formation is obtaining a clear picture of planet radii and densities. Although determining precise ratios between a planet and its stellar host is relatively easy, determining accurate stellar parameters is still a difficult and costly undertaking. High-resolution spectral analysis has traditionally yielded precise values for some stellar parameters, but stars in common between catalogs from different authors, or analyzed using different techniques, often show offsets far in excess of their uncertainties. Most analyses now use some external constraint, when available, to break observed degeneracies between surface gravity, effective temperature, and metallicity, which can otherwise lead to correlated errors in results. However, these external constraints are impossible to obtain for all stars and can require more costly observations than the initial high-resolution spectra. We demonstrate that these discrepancies can be mitigated by use of a larger line list with carefully tuned atomic line data. We use an iterative modeling technique that does not require external constraints. We compare the surface gravity obtained with our spectral synthesis modeling to asteroseismically determined values for 42 Kepler stars. Our analysis agrees well, with only a 0.048 dex offset and an rms scatter of 0.05 dex. Such accurate stellar gravities can reduce the primary source of uncertainty in radii by almost an order of magnitude over unconstrained spectral analysis.
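    The offset and rms scatter quoted for the surface-gravity comparison are straightforward to compute from paired values. The sketch below uses hypothetical log g pairs, not the actual 42 Kepler stars:

```python
import numpy as np

# Hypothetical spectroscopic vs. asteroseismic log g pairs (dex);
# placeholder values, not the paper's sample.
logg_spec = np.array([4.35, 4.10, 3.95, 4.42, 4.20])
logg_seis = np.array([4.30, 4.06, 3.91, 4.37, 4.15])

diff = logg_spec - logg_seis
offset = diff.mean()                             # systematic offset (dex)
scatter = np.sqrt(np.mean((diff - offset)**2))   # rms scatter about offset
```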

  16. Can radiation therapy treatment planning system accurately predict surface doses in postmastectomy radiation therapy patients?

    SciTech Connect

    Wong, Sharon; Back, Michael; Tan, Poh Wee; Lee, Khai Mun; Baggarley, Shaun; Lu, Jaide Jay

    2012-07-01

    Skin doses have been an important factor in the dose prescription for breast radiotherapy. Recent advances in radiotherapy treatment techniques, such as intensity-modulated radiation therapy (IMRT), and new treatment schemes such as hypofractionated breast therapy have made the precise determination of the surface dose necessary. Detailed information on the dose at various depths of the skin is also critical in designing new treatment strategies. The purpose of this work was to assess the accuracy of surface doses calculated by a clinically used treatment planning system against those measured by thermoluminescent dosimeters (TLDs) in a customized chest wall phantom. This study involved the construction of a chest wall phantom for skin dose assessment. Seven TLDs were distributed throughout each right chest wall phantom to give adequate representation of measured radiation doses. Point doses from the CMS Xio® treatment planning system (TPS) were calculated for each relevant TLD position and the results correlated. There was no significant difference between the absorbed doses measured by TLD and those calculated by the TPS (p > 0.05, 1-tailed). Dose agreement to within 2.21% was found. The deviations from the calculated absorbed doses were larger overall (3.4%) when wedges and bolus were used. A 3D radiotherapy TPS is a useful and accurate tool to assess the accuracy of surface dose. Our studies have shown that radiation treatment accuracy, expressed as a comparison between calculated doses (by TPS) and measured doses (by TLD dosimetry), can be accurately predicted for tangential treatment of the chest wall after mastectomy.

  17. Memory for Specific Visual Details can be Enhanced by Negative Arousing Content

    ERIC Educational Resources Information Center

    Kensinger, Elizabeth A.; Garoff-Eaton, Rachel J.; Schacter, Daniel L.

    2006-01-01

    Individuals often claim that they vividly remember information with negative emotional content. At least two types of information could lead to this sense of enhanced vividness: Information about the emotional item itself (e.g., the exact visual details of a snake) and information about the context in which the emotional item was encountered…

  18. Detailed ultraviolet asymptotics for AdS scalar field perturbations

    NASA Astrophysics Data System (ADS)

    Evnin, Oleg; Jai-akson, Puttarak

    2016-04-01

    We present a range of methods suitable for accurate evaluation of the leading asymptotics for integrals of products of Jacobi polynomials in limits when the degrees of some or all polynomials inside the integral become large. The structures in question have recently emerged in the context of effective descriptions of small amplitude perturbations in anti-de Sitter (AdS) spacetime. The limit of high degree polynomials corresponds in this situation to effective interactions involving extreme short-wavelength modes, whose dynamics is crucial for the turbulent instabilities that determine the ultimate fate of small AdS perturbations. We explicitly apply the relevant asymptotic techniques to the case of a self-interacting probe scalar field in AdS and extract a detailed form of the leading large degree behavior, including closed form analytic expressions for the numerical coefficients appearing in the asymptotics.

  19. Temperature Measurements of Dense Plasmas by Detailed Balance

    SciTech Connect

    Holl, A; Redmer, R; Ropke, G; Reinholz, H; Thiele, R; Fortmann, C; Forster, E; Cao, L; Tschentscher, T; Toleikis, S; Glenzer, S H

    2006-03-14

    Plasmas at high electron densities of n_e = 10^20 - 10^26 cm^-3 and moderate temperatures T_e = 1 - 20 eV are important for laboratory astrophysics, high energy density science and inertial confinement fusion. These plasmas are usually referred to as Warm Dense Matter (WDM) and are characterized by a coupling parameter of Λ ≳ 1, where correlations become important. The characterization of such plasmas is still a challenging task due to the lack of direct measurement techniques for temperatures and densities. We propose to measure the Thomson scattering spectrum of vacuum-UV radiation off density fluctuations in the plasma. Collective Thomson scattering provides accurate data for the electron temperature from first principles. Further, this method takes advantage of the spectral asymmetry resulting from detailed balance and is independent of collisional effects in these dense systems.
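    The detailed-balance asymmetry fixes the electron temperature directly: S(k,−ω)/S(k,ω) = exp(−ħω/k_B·T_e), so the ratio of the down- and up-shifted scattering peaks yields T_e with no collisional modeling. A minimal sketch, with hypothetical peak intensities and energy shift:

```python
import numpy as np

# Hypothetical intensities of the down-shifted (energy-loss) and
# up-shifted (energy-gain) plasmon peaks at a 2 eV spectral shift.
hbar_w_eV = 2.0            # hbar*omega of the plasmon shift, in eV
S_down, S_up = 1.0, 0.42   # relative peak intensities (placeholders)

# Detailed balance: S_up / S_down = exp(-hbar*w / (kB*Te))
Te_eV = hbar_w_eV / np.log(S_down / S_up)          # kB*Te in eV
Te_K = Te_eV * 1.602176634e-19 / 1.380649e-23      # convert to kelvin
```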

  20. Detailed modeling of cluster galaxies in free-form lenses

    NASA Astrophysics Data System (ADS)

    Lam, Daniel

    2015-08-01

    The main goal of the Frontier Fields is to characterize the population of high-redshift galaxies that are gravitationally lensed and magnified by foreground massive galaxy clusters. The magnification received by lensed images has to be accurately quantified in order to derive the correct science results. The magnification is in turn computed from lens models, which are constructed from various constraints, most commonly the positions and redshifts of multiply-lensed galaxies. The locations and magnifications of multiple images that appear near cluster galaxies are very sensitive to the mass distributions of those individual galaxies. In current free-form lens models, they are at best crudely approximated by arbitrary mass halos and are usually neglected entirely. Given sufficient free parameters and iterations, such models may be highly consistent, but their predictive power is rather limited. This shortcoming is particularly pronounced in light of the recent discovery of the first multiply-lensed supernova in the Frontier Fields cluster MACSJ1149. The proximity of its images to cluster galaxies mandates detailed modeling on galaxy scales, where free-form methods based solely on grid solutions simply fail. We present a hybrid free-form lens model of Abell 2744, which for the first time incorporates a detailed mass component, modeled with GALFIT, that accurately captures the stellar light distribution of the hundred brightest cluster galaxies. The model reproduces the image positions better than a previous version, which modeled cluster galaxies with simplistic NFW halos. Curiously, this improvement is found in all but system 2, which has two radial images appearing around the BCG. Although its complex light profile is captured by GALFIT, the persistent discrepancies suggest considering mass distributions that may be largely offset from the stellar light distribution.

  1. Perception of detail in 3D images

    NASA Astrophysics Data System (ADS)

    Heynderickx, Ingrid; Kaptein, Ronald

    2009-01-01

    Many current 3D displays suffer from the fact that their spatial resolution is lower than that of their 2D counterparts. One reason for this is that the multiple views needed to generate 3D are often spatially multiplexed. Besides this, imperfect separation of the left- and right-eye views leads to blurring or ghosting, and therefore to a decrease in perceived sharpness. However, people watching stereoscopic videos have reported that the 3D scene contained more detail than the 2D scene with identical spatial resolution. This is an interesting notion that had never been tested in a systematic and quantitative way. To investigate this effect, we had people compare the amount of detail ("detailedness") in pairs of 2D and 3D images. A blur filter was applied to one of the two images, and the blur level was varied using an adaptive staircase procedure. In this way, the blur threshold at which the 2D and 3D images contained perceptually the same amount of detail could be found. Our results show that the 3D image needed to be blurred more than the 2D image. This confirms the earlier qualitative findings that 3D images contain perceptually more detail than 2D images with the same spatial resolution.
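    An adaptive staircase of the kind described can be sketched as a one-up/two-down rule, which converges near the 70.7%-correct point of the psychometric function. The simulated observer and threshold below are hypothetical, purely to show the mechanics:

```python
def run_staircase(respond, start, step, n_reversals=8):
    """One-up/two-down staircase sketch: lower the level after two
    consecutive correct responses, raise it after one incorrect one,
    and estimate the threshold as the mean of the reversal levels."""
    level, streak, last_dir = start, 0, 0
    reversals = []
    while len(reversals) < n_reversals:
        if respond(level):                 # one trial at current level
            streak += 1
            if streak == 2:                # two correct in a row: go down
                streak = 0
                if last_dir == +1:
                    reversals.append(level)
                last_dir = -1
                level -= step
        else:                              # one incorrect: go up
            streak = 0
            if last_dir == -1:
                reversals.append(level)
            last_dir = +1
            level += step
    return sum(reversals) / len(reversals)

# Idealized (deterministic) observer: detects the difference whenever
# the blur level exceeds a hypothetical true threshold of 3.2.
true_threshold = 3.2
estimate = run_staircase(lambda lvl: lvl > true_threshold, start=5.0, step=0.5)
```

    With the deterministic observer the levels settle into an oscillation bracketing the true threshold, and averaging the reversal levels recovers it to within a step size.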

  2. Detailed numerical simulations of laser cooling processes

    NASA Technical Reports Server (NTRS)

    Ramirez-Serrano, J.; Kohel, J.; Thompson, R.; Yu, N.

    2001-01-01

    We developed a detailed semiclassical numerical code of the forces applied on atoms in optical and magnetic fields to increase the understanding of the different roles that light, atomic collisions, background pressure, and number of particles play in experiments with laser cooled and trapped atoms.

  3. The rich detail of cultural symbol systems.

    PubMed

    Read, Dwight W

    2014-08-01

    The goal of forming a science of intentional behavior requires a more richly detailed account of symbolic systems than is assumed by the authors. Cultural systems are not simply the equivalent in the ideational domain of culture of the purported Baldwin Effect in the genetic domain. PMID:25162879

  4. Big Heads, Small Details and Autism

    ERIC Educational Resources Information Center

    White, Sarah; O'Reilly, Helen; Frith, Uta

    2009-01-01

    Autism is thought to be associated with a bias towards detail-focussed processing. While the cognitive basis remains controversial, one strong hypothesis is that there are high processing costs associated with changing from local into global processing. A possible neural mechanism underlying this processing style is abnormal neural connectivity;…

  5. 33 CFR 116.20 - Detailed investigation.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Bridge Programs determines that a Detailed Investigation should be conducted, the District Commander will initiate an investigation that addresses all of the pertinent data regarding the bridge, including... discuss: the obstructive character of the bridge in question; the impact of that bridge upon...

  6. 33 CFR 116.20 - Detailed investigation.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Bridge Programs determines that a Detailed Investigation should be conducted, the District Commander will initiate an investigation that addresses all of the pertinent data regarding the bridge, including... discuss: the obstructive character of the bridge in question; the impact of that bridge upon...

  7. 33 CFR 116.20 - Detailed investigation.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., Office of Bridge Programs determines that a Detailed Investigation should be conducted, the District Commander will initiate an investigation that addresses all of the pertinent data regarding the bridge... will discuss: the obstructive character of the bridge in question; the impact of that bridge...

  8. 33 CFR 116.20 - Detailed investigation.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., Office of Bridge Programs determines that a Detailed Investigation should be conducted, the District Commander will initiate an investigation that addresses all of the pertinent data regarding the bridge... will discuss: the obstructive character of the bridge in question; the impact of that bridge...

  9. Contrast-detail phantom scoring methodology.

    PubMed

    Thomas, Jerry A; Chakrabarti, Kish; Kaczmarek, Richard; Romanyukha, Alexander

    2005-03-01

    Published results of medical imaging studies which make use of contrast detail mammography (CDMAM) phantom images for analysis are difficult to compare since data are often not analyzed in the same way. In order to address this situation, the concept of ideal contrast detail curves is suggested. The ideal contrast detail curves are constructed based on the requirement of having the same product of the diameter and contrast (disk thickness) of the minimal correctly determined object for every row of the CDMAM phantom image. A correlation and comparison of five different quality parameters of the CDMAM phantom image determined for obtained ideal contrast detail curves is performed. The image quality parameters compared include: (1) contrast detail curve--a graph correlation between "minimal correct reading" diameter and disk thickness; (2) correct observation ratio--the ratio of the number of correctly identified objects to the actual total number of objects multiplied by 100; (3) image quality figure--the sum of the product of the diameter of the smallest scored object and its relative contrast; (4) figure-of-merit--the zero disk diameter value obtained from extrapolation of the contrast detail curve to the origin (e.g., zero disk diameter); and (5) k-factor--the product of the thickness and the diameter of the smallest correctly identified disks. The analysis carried out showed the existence of a nonlinear relationship between the above parameters, which means that use of different parameters of CDMAM image quality potentially can cause different conclusions about changes in image quality. Construction of the ideal contrast detail curves for CDMAM phantom is an attempt to determine the quantitative limits of the CDMAM phantom as employed for image quality evaluation. These limits are determined by the relationship between certain parameters of a digital mammography system and the set of the gold disks sizes in the CDMAM phantom. Recommendations are made on
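    The quality parameters defined as (2), (3) and (5) can be computed directly from per-row phantom readings. The sketch below follows the definitions quoted in the abstract; the readings and identification counts are placeholders, not CDMAM data:

```python
import numpy as np

# Hypothetical per-row readings: diameter (mm) and thickness ("contrast")
# of the smallest correctly identified disk in each row.
diameter = np.array([0.10, 0.13, 0.16, 0.20, 0.25])
thickness = np.array([1.40, 0.90, 0.60, 0.40, 0.25])

n_correct, n_total = 112, 168  # hypothetical identification counts

# (2) correct observation ratio: correct / total, times 100
correct_observation_ratio = 100 * n_correct / n_total
# (3) image quality figure: sum of (smallest scored diameter x contrast)
iqf = np.sum(diameter * thickness)
# (5) k-factor: thickness x diameter of the smallest correctly seen disk
k_factor = diameter[0] * thickness[0]
```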

  11. Highly accurate articulated coordinate measuring machine

    DOEpatents

    Bieg, Lothar F.; Jokiel, Jr., Bernhard; Ensz, Mark T.; Watson, Robert D.

    2003-12-30

    Disclosed is a highly accurate articulated coordinate measuring machine, comprising a revolute joint, comprising a circular encoder wheel, having an axis of rotation; a plurality of marks disposed around at least a portion of the circumference of the encoder wheel; bearing means for supporting the encoder wheel, while permitting free rotation of the encoder wheel about the wheel's axis of rotation; and a sensor, rigidly attached to the bearing means, for detecting the motion of at least some of the marks as the encoder wheel rotates; a probe arm, having a proximal end rigidly attached to the encoder wheel, and having a distal end with a probe tip attached thereto; and coordinate processing means, operatively connected to the sensor, for converting the output of the sensor into a set of cylindrical coordinates representing the position of the probe tip relative to a reference cylindrical coordinate system.
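    The coordinate-processing step can be sketched for a single revolute joint: an encoder count is converted to a rotation angle, and the rigid probe arm maps that angle to cylindrical coordinates of the probe tip. The geometry below (arm length, counts per revolution, z offset) is hypothetical, a simplified version of the patented machine:

```python
import math

def probe_tip_position(counts, counts_per_rev, arm_length, z_offset):
    """Convert an encoder reading to the probe tip's cylindrical
    coordinates (r, theta, z) for a single revolute joint."""
    theta = 2.0 * math.pi * counts / counts_per_rev  # wheel rotation angle
    r = arm_length                                   # tip sweeps a circle
    return r, theta, z_offset

r, theta, z = probe_tip_position(4096, 16384, arm_length=250.0, z_offset=40.0)
# Cartesian equivalent, if needed downstream
x, y = r * math.cos(theta), r * math.sin(theta)
```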

  12. Practical aspects of spatially high accurate methods

    NASA Technical Reports Server (NTRS)

    Godfrey, Andrew G.; Mitchell, Curtis R.; Walters, Robert W.

    1992-01-01

    The computational qualities of high order spatially accurate methods for the finite volume solution of the Euler equations are presented. Two dimensional essentially non-oscillatory (ENO), k-exact, and 'dimension by dimension' ENO reconstruction operators are discussed and compared in terms of reconstruction and solution accuracy, computational cost and oscillatory behavior in supersonic flows with shocks. Inherent steady state convergence difficulties are demonstrated for adaptive stencil algorithms. An exact solution to the heat equation is used to determine reconstruction error, and the computational intensity is reflected in operation counts. Standard MUSCL differencing is included for comparison. Numerical experiments presented include the Ringleb flow for numerical accuracy and a shock reflection problem. A vortex-shock interaction demonstrates the ability of the ENO scheme to excel in simulating unsteady high-frequency flow physics.

  13. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  14. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, Douglas D.

    1985-01-01

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube, and a chamber for containing a charge of high pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high pressure gas to purge the sight tube of air-borne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  15. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, D.D.

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube, and a chamber for containing a charge of high pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high pressure gas to purge the sight tube of air-borne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  16. Accurate metacognition for visual sensory memory representations.

    PubMed

    Vandenbroucke, Annelinde R E; Sligte, Ilja G; Barrett, Adam B; Seth, Anil K; Fahrenfort, Johannes J; Lamme, Victor A F

    2014-04-01

    The capacity to attend to multiple objects in the visual field is limited. However, introspectively, people feel that they see the whole visual world at once. Some scholars suggest that this introspective feeling is based on short-lived sensory memory representations, whereas others argue that the feeling of seeing more than can be attended to is illusory. Here, we investigated this phenomenon by combining objective memory performance with subjective confidence ratings during a change-detection task. This allowed us to compute a measure of metacognition--the degree of knowledge that subjects have about the correctness of their decisions--for different stages of memory. We show that subjects store more objects in sensory memory than they can attend to but, at the same time, have similar metacognition for sensory memory and working memory representations. This suggests that these subjective impressions are not an illusion but accurate reflections of the richness of visual perception.

  17. Accurate Thermal Stresses for Beams: Normal Stress

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore F.; Pilkey, Walter D.

    2003-01-01

    Formulations for a general theory of thermoelasticity to generate accurate thermal stresses for structural members of aeronautical vehicles were developed in 1954 by Boley. The formulation also provides three normal stresses and a shear stress along the entire length of the beam. The Poisson effect of the lateral and transverse normal stresses on a thermally loaded beam is taken into account in this theory by employing an Airy stress function. The Airy stress function enables the reduction of the three-dimensional thermal stress problem to a two-dimensional one. Numerical results from the general theory of thermoelasticity are compared to those obtained from strength of materials. It is concluded that the theory of thermoelasticity for prismatic beams proposed in this paper can be used instead of strength of materials when precise stress results are desired.

  18. Accurate Thermal Stresses for Beams: Normal Stress

    NASA Technical Reports Server (NTRS)

    Johnson, Theodore F.; Pilkey, Walter D.

    2002-01-01

    Formulations for a general theory of thermoelasticity to generate accurate thermal stresses for structural members of aeronautical vehicles were developed in 1954 by Boley. The formulation also provides three normal stresses and a shear stress along the entire length of the beam. The Poisson effect of the lateral and transverse normal stresses on a thermally loaded beam is taken into account in this theory by employing an Airy stress function. The Airy stress function enables the reduction of the three-dimensional thermal stress problem to a two-dimensional one. Numerical results from the general theory of thermoelasticity are compared to those obtained from strength of materials. It is concluded that the theory of thermoelasticity for prismatic beams proposed in this paper can be used instead of strength of materials when precise stress results are desired.

  19. Accurate metacognition for visual sensory memory representations.

    PubMed

    Vandenbroucke, Annelinde R E; Sligte, Ilja G; Barrett, Adam B; Seth, Anil K; Fahrenfort, Johannes J; Lamme, Victor A F

    2014-04-01

    The capacity to attend to multiple objects in the visual field is limited. However, introspectively, people feel that they see the whole visual world at once. Some scholars suggest that this introspective feeling is based on short-lived sensory memory representations, whereas others argue that the feeling of seeing more than can be attended to is illusory. Here, we investigated this phenomenon by combining objective memory performance with subjective confidence ratings during a change-detection task. This allowed us to compute a measure of metacognition--the degree of knowledge that subjects have about the correctness of their decisions--for different stages of memory. We show that subjects store more objects in sensory memory than they can attend to but, at the same time, have similar metacognition for sensory memory and working memory representations. This suggests that these subjective impressions are not an illusion but accurate reflections of the richness of visual perception. PMID:24549293

  20. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  1. Accurate Weather Forecasting for Radio Astronomy

    NASA Astrophysics Data System (ADS)

    Maddalena, Ronald J.

    2010-01-01

    The NRAO Green Bank Telescope routinely observes at wavelengths from 3 mm to 1 m. As with all mm-wave telescopes, observing conditions depend upon the variable atmospheric water content. The site provides over 100 days/yr when opacities are low enough for good observing at 3 mm, but winds on the open-air structure reduce the time suitable for 3-mm observing, where pointing is critical. Thus, to maximize productivity the observing wavelength needs to match weather conditions. For 6 years the telescope has used a dynamic scheduling system (recently upgraded; www.gb.nrao.edu/DSS) that requires accurate multi-day forecasts for winds and opacities. Since opacity forecasts are not provided by the National Weather Service (NWS), I have developed an automated system that takes available forecasts, derives forecasted opacities, and deploys the results on the web in user-friendly graphical overviews (www.gb.nrao.edu/~rmaddale/Weather). The system relies on the "North American Mesoscale" models, which are updated by the NWS every 6 hrs, have a 12 km horizontal resolution, 1 hr temporal resolution, run to 84 hrs, and have 60 vertical layers that extend to 20 km. Each forecast consists of a time series of ground conditions, cloud coverage, etc., and, most importantly, temperature, pressure, and humidity as a function of height. I use Liebe's MPM model (Radio Science, 20, 1069, 1985) to determine the absorption in each layer for each hour for 30 observing wavelengths. Radiative transfer provides, for each hour and wavelength, the total opacity and the radio brightness of the atmosphere, which contributes substantially at some wavelengths to Tsys and the observational noise. Comparisons of measured and forecasted Tsys at 22.2 and 44 GHz imply that the forecasted opacities are good to about 0.01 nepers, which is sufficient for forecasting and accurate calibration. Reliability is high out to 2 days and degrades slowly for longer-range forecasts.
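
    The layered radiative-transfer step described in the abstract can be sketched as follows. This is a hedged illustration, not the forecasting system's code: the per-layer absorption and temperature values are invented placeholders, and a real system would take them from the NWS model levels.

```python
import math

# Sum per-layer absorption into a zenith opacity, then accumulate the
# brightness temperature each layer contributes as seen from the ground.
def zenith_opacity_and_tb(layers):
    """layers: list of (absorption_nepers, temperature_K), top to ground."""
    tau_total = sum(a for a, _ in layers)
    tau_above = 0.0  # opacity between the emitting layer and the observer
    tb = 0.0
    for a, T in reversed(layers):  # integrate from the ground layer upward
        tb += T * (1.0 - math.exp(-a)) * math.exp(-tau_above)
        tau_above += a
    return tau_total, tb

# Illustrative three-layer atmosphere (absorption in nepers, temperature in K)
layers = [(0.002, 220.0), (0.005, 250.0), (0.010, 280.0)]
tau, tb = zenith_opacity_and_tb(layers)
```

    The returned brightness temperature is the atmospheric contribution to Tsys at zenith; dividing the opacity by the sine of the elevation gives the usual plane-parallel airmass correction.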

  2. Radio Astronomers Set New Standard for Accurate Cosmic Distance Measurement

    NASA Astrophysics Data System (ADS)

    1999-06-01

    the distance to NGC 4258 as either 27 or 29 million light-years, depending on assumptions about the characteristics of this type of star in that galaxy. Other Cepheid-based galaxy distances were used to calculate the expansion rate of the universe, called the Hubble Constant, announced by a team of HST observers last week. "This difference could mean that there may be more uncertainty in Cepheid-determined distances than people have realized," said Moran. "Providing this directly-determined distance to one galaxy -- a distance that can serve as a milestone -- should be helpful in determining distances to other galaxies, and thus the Hubble Constant and the size and age of the universe." The VLBA is a system of ten radio-telescope antennas, each 25 meters (82 feet) in diameter, stretching some 5,000 miles from Mauna Kea in Hawaii to St. Croix in the U.S. Virgin Islands. Operated from NRAO's Array Operations Center in Socorro, NM, the VLBA offers astronomers the greatest resolving power of any telescope anywhere. The NRAO is a facility of the National Science Foundation, operated under cooperative agreement by Associated Universities, Inc. Background information: Determining Cosmic Distances Determining cosmic distances obviously is vital to understanding the size of the universe. In turn, knowing the size of the universe is an important step in determining its age. "The size puts a limit on how much expansion could have occurred since the Big Bang, and thus tells us something about the age," said Moran. However, determining cosmic distances has proven to be a particularly thorny problem for astronomers. In the third century B.C., the Greek astronomer Aristarchus devised a method of using trigonometry to determine the relative distances of the Moon and Sun, but in practice his method was difficult to use. Though a great first step, he missed the mark by a factor of 20.
It wasn't until 1761 that trigonometric methods produced a relatively accurate distance to Venus, thus

  3. On detailed 3D reconstruction of large indoor environments

    NASA Astrophysics Data System (ADS)

    Bondarev, Egor

    2015-03-01

    In this paper we present techniques for highly detailed 3D reconstruction of extra large indoor environments. We discuss the benefits and drawbacks of low-range, far-range and hybrid sensing and reconstruction approaches. The proposed techniques for low-range and hybrid reconstruction, enabling a reconstruction density of 125 points/cm³ on large 100,000 m³ models, are presented in detail. The techniques tackle the core challenges for the above requirements, such as multi-modal data fusion (fusion of LIDAR data with Kinect data), accurate sensor pose estimation, high-density scanning and depth-data noise filtering. Other important aspects of extra large 3D indoor reconstruction are point cloud decimation and real-time rendering. In this paper, we present a method for planar-based point cloud decimation, allowing for reduction of the point cloud size by 80-95%. Besides this, we introduce a method for online rendering of extra large point clouds, enabling real-time visualization of huge cloud spaces in conventional web browsers.

  4. Detail enhancement of blurred infrared images based on frequency extrapolation

    NASA Astrophysics Data System (ADS)

    Xu, Fuyuan; Zeng, Deguo; Zhang, Jun; Zheng, Ziyang; Wei, Fei; Wang, Tiedan

    2016-05-01

    A novel algorithm for enhancing the details of blurred infrared images based on frequency extrapolation is presented in this paper. Unlike other researchers' work, this algorithm mainly focuses on how to predict the higher frequency information based on the Laplacian pyramid separation of the blurred image. The algorithm uses the first level of the high frequency component of the pyramid of the blurred image to reverse-generate a higher, non-existing frequency component, which is added back to the histogram-equalized input blurred image. A simple nonlinear operator is used to analyze the extracted first-level high frequency component of the pyramid. Two critical parameters take part in the calculation: the clipping parameter C and the scaling parameter S. A detailed analysis of how these two parameters affect the procedure is demonstrated with figures in this paper. The blurred image becomes clearer, and the detail is enhanced by the added higher frequency information. The algorithm has the advantages of computational simplicity and good performance, and it can be deployed in real-time industrial applications. Extensive experiments and illustrations of the algorithm's performance are presented to demonstrate its effectiveness.
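
    The clip-and-scale idea behind this kind of detail enhancement can be sketched in a few lines. This is a hedged illustration only: the separable blur kernel and the values of C and S below are assumptions for demonstration, not the paper's pyramid construction or its tuned parameters.

```python
import numpy as np

def boost_detail(img, C=0.1, S=2.0):
    """Extract the finest high-frequency band, apply a clip-then-scale
    nonlinear operator, and add the predicted detail back to the image."""
    k = np.array([0.25, 0.5, 0.25])  # simple separable low-pass kernel
    blur = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    blur = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, blur)
    high = img - blur                # finest Laplacian-like band
    high = np.clip(high, -C, C) * S  # nonlinear operator: clip by C, scale by S
    return img + high                # add the synthesized detail back

# Toy 8x8 gradient image in place of an infrared frame
img = np.outer(np.linspace(0.0, 1.0, 8), np.ones(8))
out = boost_detail(img)
```

    Clipping before scaling bounds the added detail to at most C·S per pixel, which is what keeps the enhancement from amplifying noise spikes.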

  5. Bolivia-Brazil gas line route detailed

    SciTech Connect

    Not Available

    1992-05-11

    This paper reports that the state oil companies of Brazil and Bolivia have signed an agreement outlining the route for a 2,270 km pipeline system to deliver natural gas from Bolivian fields to Southeast Brazil. The two sides currently are negotiating details about construction costs as well as contract volumes and prices. Capacity is projected at 283-565 MMcfd. No official details are available, but Roberto Y. Hukai, a director of the Sao Paulo engineering company Jaako Poyry/Technoplan, estimates the transportation cost of the Bolivian gas at 90 cents/MMBTU. That would be competitive with the price of gas delivered to the Sao Paulo gas utility Comgas, he said. Brazil's Petroleos Brasileiro SA estimates construction of the pipeline on the Brazilian side alone will cost $1.2-1.4 billion. Bolivia's Yacimientos Petroliferos Fiscales Bolivianos (YPFB) is negotiating with private domestic and foreign investors for construction of the Bolivian portion of the project.

  6. A detailed DSMC surface chemistry model

    NASA Astrophysics Data System (ADS)

    Molchanova Shumakova, A. N.; Kashkovsky, A. V.; Bondar, Ye. A.

    2014-12-01

    This work is aimed at the development of detailed molecular surface chemistry models for the DSMC method, their implementation into the SMILE++ software system, and their verification and validation. An approach to the construction of DSMC surface chemistry models based on macroscopic reaction rate data was proposed. The approach was applied to macroscopic data for the air mixture of Deutschmann et al. The resulting DSMC surface chemistry model was implemented into the SMILE++ software system and verified for thermal equilibrium conditions.

  7. Detailed Jet Dynamics in a Collapsing Bubble

    NASA Astrophysics Data System (ADS)

    Supponen, Outi; Obreschkow, Danail; Kobel, Philippe; Farhat, Mohamed

    2015-12-01

    We present detailed visualizations of the micro-jet forming inside an aspherically collapsing cavitation bubble near a free surface. The high-quality visualizations of large and strongly deformed bubbles disclose so far unseen features of the dynamics inside the bubble, such as a mushroom-like flattened jet-tip, crown formation and micro-droplets. We also find that jetting near a free surface reduces the collapse time relative to the Rayleigh time.

  8. Detailed scour measurements around a debris accumulation

    USGS Publications Warehouse

    Mueller, David S.; Parola, Arthur C.

    1998-01-01

    Detailed scour measurements were made at Farm-Market 2004 over the Brazos River near Lake Jackson, Tex., during flooding in October 1994. Woody debris accumulations on bents 6, 7, and 8 obstructed flow through the bridge, causing scour of the streambed. Measurements at the site included three-dimensional velocities, channel bathymetry, water-surface elevations, water-surface slope, and discharge. Channel geometry upstream from the bridge caused approach conditions to be nonuniform.

  9. Structural concepts and details for seismic design

    SciTech Connect

    Not Available

    1991-09-01

    This manual discusses building and building component behavior during earthquakes, and provides suggested details for seismic resistance which have been shown by experience to provide adequate performance during earthquakes. Special design and construction practices are also described which, although they might be common in some high-seismic regions, may not be common in low and moderate seismic-hazard regions of the United States. Special attention is given to describing the level of detailing appropriate for each seismic region. The UBC seismic criteria for all seismic zones are carefully examined, and many examples of connection details are given. The general scope of discussion is limited to materials and construction types common to Department of Energy (DOE) sites. Although the manual is primarily written for professional engineers engaged in performing seismic-resistant design for DOE facilities, the first two chapters, plus the introductory sections of succeeding chapters, contain descriptions which are also directed toward project engineers who authorize, review, or supervise the design and construction of DOE facilities. 88 refs., 188 figs.

  10. Downscaling NASA Climatological Data to Produce Detailed Climate Zone Maps

    NASA Technical Reports Server (NTRS)

    Chandler, William S.; Hoell, James M.; Westberg, David J.; Whitlock, Charles H.; Zhang, Taiping; Stackhouse, P. W.

    2011-01-01

    The design of energy efficient sustainable buildings is heavily dependent on accurate long-term and near real-time local weather data. To varying degrees the current meteorological networks over the globe have been used to provide these data, albeit often from sites far removed from the desired location. The national need is for access to weather and solar resource data accurate enough to develop preliminary building designs within a short proposal time limit, usually within 60 days. The NASA Prediction Of Worldwide Energy Resource (POWER) project was established by NASA to provide industry-friendly access to globally distributed solar and meteorological data. As a result, the POWER web site (power.larc.nasa.gov) now provides global information on many renewable energy parameters and several buildings-related items, but at a relatively coarse resolution. This paper describes a method of downscaling NASA atmospheric assimilation model results to higher resolution and mapping those parameters to produce building climate zone maps using estimates of temperature and precipitation. The distribution of climate zones for North America, with an emphasis on the Pacific Northwest, for just one year shows very good correspondence to the currently defined distribution. The method has the potential to provide a consistent procedure for deriving climate zone information on a global basis that can be assessed for variability and updated more regularly.
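
    The final mapping step, from downscaled temperature and precipitation estimates to a zone label, can be sketched as below. This is a hypothetical illustration: the zone names and thresholds are invented placeholders, not the ASHRAE/DOE climate zone definitions the paper would actually use.

```python
# Classify a grid cell into a coarse building climate zone from annual
# mean temperature and precipitation. Thresholds are illustrative only.
def climate_zone(mean_temp_c, annual_precip_mm):
    if mean_temp_c < 0:
        thermal = "very cold"
    elif mean_temp_c < 10:
        thermal = "cold"
    elif mean_temp_c < 20:
        thermal = "mixed"
    else:
        thermal = "hot"
    moisture = "humid" if annual_precip_mm >= 500 else "dry"
    return f"{thermal}-{moisture}"

# Hypothetical Pacific Northwest grid cell
zone = climate_zone(11.5, 900)
```

    Running the same classifier over every downscaled grid cell yields the zone map; the real definitions also use degree-day sums rather than annual means.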

  11. In situ studies on controlling an atomically-accurate formation process of gold nanoclusters

    NASA Astrophysics Data System (ADS)

    Yang, Lina; Cheng, Hao; Jiang, Yong; Huang, Ting; Bao, Jie; Sun, Zhihu; Jiang, Zheng; Ma, Jingyuan; Sun, Fanfei; Liu, Qinghua; Yao, Tao; Deng, Huijuan; Wang, Shuxin; Zhu, Manzhou; Wei, Shiqiang

    2015-08-01

    fragmentation of the initial larger Aun clusters into metastable intermediate Au8-Au13 smaller clusters. This is a critical step, which allows for the secondary size-growth step of the intermediates toward the atomically monodisperse Au13 clusters via incorporating the reactive Au(i)-Cl species in the solution. Such a secondary-growth pathway is further confirmed by the successful growth of Au13 through reaction of isolated Au11 clusters with AuClPPh3 in the HCl environment. This work addresses the importance of reaction intermediates in guiding the way towards controllable synthesis of metal nanoclusters. Electronic supplementary information (ESI) available: Synthesis and characterization of the starting and end Au nanoclusters, assignment of the MALDI-MS peaks, details for the EXAFS curve-fitting and fitting results, parallel experiments using sulfuric acid and acetic acid as etchants, and experimental details for growing isolated Au11 into Au13 clusters in the HCl environment. See DOI: 10.1039/c5nr03711e

  12. Accurate masses for dispersion-supported galaxies

    NASA Astrophysics Data System (ADS)

    Wolf, Joe; Martinez, Gregory D.; Bullock, James S.; Kaplinghat, Manoj; Geha, Marla; Muñoz, Ricardo R.; Simon, Joshua D.; Avedo, Frank F.

    2010-08-01

    We derive an accurate mass estimator for dispersion-supported stellar systems and demonstrate its validity by analysing resolved line-of-sight velocity data for globular clusters, dwarf galaxies and elliptical galaxies. Specifically, by manipulating the spherical Jeans equation we show that the mass enclosed within the 3D deprojected half-light radius r1/2 can be determined with only mild assumptions about the spatial variation of the stellar velocity dispersion anisotropy as long as the projected velocity dispersion profile is fairly flat near the half-light radius, as is typically observed. We find M1/2 = 3 G⁻¹⟨σ²los⟩ r1/2 ≈ 4 G⁻¹⟨σ²los⟩ Re, where ⟨σ²los⟩ is the luminosity-weighted square of the line-of-sight velocity dispersion and Re is the 2D projected half-light radius. While deceptively familiar in form, this formula is not the virial theorem, which cannot be used to determine accurate masses unless the radial profile of the total mass is known a priori. We utilize this finding to show that all of the Milky Way dwarf spheroidal galaxies (MW dSphs) are consistent with having formed within a halo of a mass of approximately 3 × 10⁹ Msolar, assuming a Λ cold dark matter cosmology. The faintest MW dSphs seem to have formed in dark matter haloes that are at least as massive as those of the brightest MW dSphs, despite the almost five orders of magnitude spread in luminosity between them. We expand our analysis to the full range of observed dispersion-supported stellar systems and examine their dynamical I-band mass-to-light ratios ΥI1/2. The ΥI1/2 versus M1/2 relation for dispersion-supported galaxies follows a U shape, with a broad minimum near ΥI1/2 ≈ 3 that spans dwarf elliptical galaxies to normal ellipticals, a steep rise to ΥI1/2 ≈ 3200 for ultra-faint dSphs and a more shallow rise to ΥI1/2 ≈ 800 for galaxy cluster spheroids.
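
    The quoted half-light mass estimator is simple enough to evaluate numerically. A minimal sketch follows; the value of G in astrophysical units is the standard one, but the dispersion and radius in the example are illustrative values, not measurements from the paper.

```python
# M_1/2 = 3 G^-1 <sigma_los^2> r_1/2 (Wolf et al. estimator, as quoted above)
G = 4.302e-6  # gravitational constant in kpc (km/s)^2 / Msun

def m_half(sigma_los_kms, r_half_kpc):
    """Mass within the 3D deprojected half-light radius, in solar masses."""
    return 3.0 * sigma_los_kms**2 * r_half_kpc / G

# Illustrative dwarf-spheroidal-like numbers: sigma_los = 10 km/s, r_1/2 = 0.3 kpc
mass = m_half(10.0, 0.3)  # roughly 2e7 solar masses
```

    Note the quadratic dependence on the dispersion: doubling σlos quadruples the inferred mass, which is why flat, well-measured dispersion profiles matter for this estimator.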

  13. Revisiting the Seductive Details Effect in Multimedia Learning: Context-Dependency of Seductive Details

    ERIC Educational Resources Information Center

    Ozdemir, Devrim; Doolittle, Peter

    2015-01-01

    The purpose of this study was to investigate the effects of context-dependency of seductive details on recall and transfer in multimedia learning environments. Seductive details were interesting yet irrelevant sentences in the instructional text. Two experiments were conducted. The purpose of Experiment 1 was to identify context-dependent and…

  14. Detailed close-ups and the big picture of spliceosomes

    PubMed Central

    Jurica, Melissa S.

    2008-01-01

    Summary The spliceosome is the huge macromolecular assembly responsible for the removal of introns from pre-mRNA transcripts. The size and complexity of this dynamic cellular machine dictates that structural analysis of the spliceosome is best served by a combination of techniques. Electron microscopy is providing a more global, albeit less detailed, view of spliceosome assemblies. X-ray crystallographers and NMR spectroscopists are steadily reporting more atomic resolution structures of individual spliceosome components and fragments. Increasingly, structures of these individual pieces in complex with binding partners are yielding insights into the interfaces that hold the entire spliceosome assembly together. Although the information arising from the various structural studies of splicing machinery has not yet fully converged into a complete model, we can expect that a detailed understanding of spliceosome structure will arise at the juncture of structural and computational modeling methods. PMID:18550358

  15. Exploring Architectural Details Through a Wearable Egocentric Vision Device.

    PubMed

    Alletto, Stefano; Abati, Davide; Serra, Giuseppe; Cucchiara, Rita

    2016-01-01

    Augmented user experiences in the cultural heritage domain are in increasing demand by the new digital native tourists of the 21st century. In this paper, we propose a novel solution that aims at assisting the visitor during an outdoor tour of a cultural site using the unique first person perspective of wearable cameras. In particular, the approach exploits computer vision techniques to retrieve architectural details, using a robust descriptor based on the covariance of local features. Using a lightweight wearable board, the solution can localize the user with respect to the 3D point cloud of the historical landmark and provide him with information about the details at which he is currently looking. Experimental results validate the method both in terms of accuracy and computational effort. Furthermore, user evaluation based on real-world experiments shows that the proposal is deemed effective in enriching a cultural experience. PMID:26901197

  16. Using ecological zones to increase the detail of Landsat classifications

    NASA Technical Reports Server (NTRS)

    Fox, L., III; Mayer, K. E.

    1981-01-01

    Changes in classification detail of forest species descriptions were made for Landsat data on 2.2 million acres in northwestern California. Because basic forest canopy structures may exhibit very similar E-M energy reflectance patterns in different environmental regions, classification labels based on Landsat spectral signatures alone become very generalized when mapping large heterogeneous ecological regions. By adding a seven ecological zone stratification, a 167% improvement in classification detail was made over the results achieved without it. The seven zone stratification is a less costly alternative to the inclusion of complex collateral information, such as terrain data and soil type, into the Landsat data base when making inventories of areas greater than 500,000 acres.

  17. Exploring Architectural Details Through a Wearable Egocentric Vision Device

    PubMed Central

    Alletto, Stefano; Abati, Davide; Serra, Giuseppe; Cucchiara, Rita

    2016-01-01

    Augmented user experiences in the cultural heritage domain are in increasing demand by the new digital native tourists of the 21st century. In this paper, we propose a novel solution that aims at assisting the visitor during an outdoor tour of a cultural site using the unique first person perspective of wearable cameras. In particular, the approach exploits computer vision techniques to retrieve architectural details, using a robust descriptor based on the covariance of local features. Using a lightweight wearable board, the solution can localize the user with respect to the 3D point cloud of the historical landmark and provide him with information about the details at which he is currently looking. Experimental results validate the method both in terms of accuracy and computational effort. Furthermore, user evaluation based on real-world experiments shows that the proposal is deemed effective in enriching a cultural experience. PMID:26901197

  18. A highly accurate heuristic algorithm for the haplotype assembly problem

    PubMed Central

    2013-01-01

    Background Single nucleotide polymorphisms (SNPs) are the most common form of genetic variation in human DNA. The sequence of SNPs in each of the two copies of a given chromosome in a diploid organism is referred to as a haplotype. Haplotype information has many applications such as diagnosis of genetic diseases, drug design, etc. The haplotype assembly problem is defined as follows: Given a set of fragments sequenced from the two copies of a chromosome of a single individual, and their locations in the chromosome, which can be pre-determined by aligning the fragments to a reference DNA sequence, the goal here is to reconstruct two haplotypes (h1, h2) from the input fragments. Existing algorithms do not work well when the error rate of fragments is high. Here we design an algorithm that can give accurate solutions, even if the error rate of fragments is high. Results We first give a dynamic programming algorithm that can give exact solutions to the haplotype assembly problem. The time complexity of the algorithm is O(n × 2^t × t), where n is the number of SNPs, and t is the maximum coverage of a SNP site. The algorithm is slow when t is large. To solve the problem when t is large, we further propose a heuristic algorithm on the basis of the dynamic programming algorithm. Experiments show that our heuristic algorithm can give very accurate solutions. Conclusions We have tested our algorithm on a set of benchmark datasets. Experiments show that our algorithm can give very accurate solutions. It outperforms most of the existing programs when the error rate of the input fragments is high. PMID:23445458
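
    The objective such algorithms optimize can be illustrated with a tiny scoring function. This is a hedged sketch of the standard minimum-error-correction (MEC) idea, not the paper's dynamic programming or heuristic: each fragment is charged the mismatches against whichever of the two complementary haplotypes it fits better, and here the best haplotype is found by brute-force enumeration over three SNPs only for demonstration.

```python
def mec_cost(fragments, h1):
    """Each fragment maps SNP index -> observed allele (0/1); gaps omitted.
    Cost = mismatches against the nearer of h1 and its complement h2."""
    h2 = [1 - a for a in h1]
    cost = 0
    for frag in fragments:
        d1 = sum(frag[i] != h1[i] for i in frag)
        d2 = sum(frag[i] != h2[i] for i in frag)
        cost += min(d1, d2)  # assign the fragment to its closer haplotype
    return cost

# Three fragments over three SNP sites; one observation conflicts with the
# other two, so the best achievable MEC cost is 1 (one corrected "error").
frags = [{0: 0, 1: 0}, {1: 1, 2: 1}, {0: 0, 2: 1}]
best = min(
    ([b0, b1, b2] for b0 in (0, 1) for b1 in (0, 1) for b2 in (0, 1)),
    key=lambda h: mec_cost(frags, h),
)
```

    Exhaustive search is exponential in the number of SNPs; the paper's dynamic program instead enumerates only the 2^t partitions of the fragments covering one SNP site at a time, which is what gives the O(n × 2^t × t) bound.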

  19. The Nigerian national blindness and visual impairment survey: Rationale, objectives and detailed methodology

    PubMed Central

    Dineen, Brendan; Gilbert, Clare E; Rabiu, Mansur; Kyari, Fatima; Mahdi, Abdull M; Abubakar, Tafida; Ezelum, Christian C; Gabriel, Entekume; Elhassan, Elizabeth; Abiose, Adenike; Faal, Hannah; Jiya, Jonathan Y; Ozemela, Chinenyem P; Lee, Pak Sang; Gudlavalleti, Murthy VS

    2008-01-01

    Background Despite having the largest population in Africa, Nigeria has no accurate population based data to plan and evaluate eye care services. A national survey was undertaken to estimate the prevalence and determine the major causes of blindness and low vision. This paper presents the detailed methodology used during the survey. Methods A nationally representative sample of persons aged 40 years and above was selected. Children aged 10–15 years and individuals aged <10 or 16–39 years with visual impairment were also included if they lived in households with an eligible adult. All participants had their height, weight, and blood pressure measured followed by assessment of presenting visual acuity, refractokeratometry, A-scan ultrasonography, visual fields and best corrected visual acuity. Anterior and posterior segments of each eye were examined with a torch and direct ophthalmoscope. Participants with visual acuity of ≤ 6/12 in one or both eyes underwent detailed examination including applanation tonometry, dilated slit lamp biomicroscopy, lens grading and fundus photography. All those who had undergone cataract surgery were refracted and best corrected vision recorded. Causes of visual impairment by eye and for the individual were determined using a clinical algorithm recommended by the World Health Organization. In addition, 1 in 7 adults also underwent a complete work up as described for those with vision ≤ 6/12 for constructing a normative data base for Nigerians. Discussion The field work for the study was completed in 30 months over the period 2005–2007 and covered 305 clusters across the entire country. Concurrently, persons aged 40+ years were examined to form a normative data base. Analysis of the data is currently underway. Conclusion The methodology used was robust and adequate to provide estimates on the prevalence and causes of blindness in Nigeria. The survey would also provide information on barriers to accessing services, quality of life of

  20. Accurate free energy calculation along optimized paths.

    PubMed

    Chen, Changjun; Xiao, Yi

    2010-05-01

    The path-based methods of free energy calculation, such as thermodynamic integration and free energy perturbation, are simple in theory, but difficult in practice because in most cases smooth paths do not exist, especially for large molecules. In this article, we present a novel method to build the transition path of a peptide. We use harmonic potentials to restrain its nonhydrogen atom dihedrals in the initial state and set the equilibrium angles of the potentials as those in the final state. Through a series of steps of geometrical optimization, we can construct a smooth and short path from the initial state to the final state. This path can be used to calculate free energy difference. To validate this method, we apply it to a small 10-ALA peptide and find that the calculated free energy changes in helix-helix and helix-hairpin transitions are both self-convergent and cross-convergent. We also calculate the free energy differences between different stable states of beta-hairpin trpzip2, and the results show that this method is more efficient than the conventional molecular dynamics method in accurate free energy calculation.
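The free energy perturbation method named in the abstract rests on the Zwanzig estimator, ΔF = −kT·ln⟨exp(−ΔU/kT)⟩. A minimal sketch on synthetic Gaussian energy gaps, not the authors' restrained-dihedral implementation; the kT value and sample distribution are assumed for illustration:

```python
import math
import random

def fep_delta_f(du_samples, kT=0.593):
    """Zwanzig estimator: dF = -kT * ln < exp(-dU/kT) >, kT in kcal/mol (~298 K)."""
    avg = sum(math.exp(-du / kT) for du in du_samples) / len(du_samples)
    return -kT * math.log(avg)

# Synthetic energy gaps (kcal/mol) standing in for samples collected
# at one window along an optimized transition path.
random.seed(0)
samples = [random.gauss(1.0, 0.5) for _ in range(10000)]
dF = fep_delta_f(samples)
# For Gaussian gaps the estimate should approach mu - sigma**2/(2*kT),
# here about 0.79 kcal/mol.
```

Summing such per-window estimates along a smooth path gives the total free energy difference between the end states, which is why constructing a short, smooth path matters for convergence.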

  1. Accurate adiabatic correction in the hydrogen molecule

    NASA Astrophysics Data System (ADS)

    Pachucki, Krzysztof; Komasa, Jacek

    2014-12-01

    A new formalism for the accurate treatment of adiabatic effects in the hydrogen molecule is presented, in which the electronic wave function is expanded in the James-Coolidge basis functions. Systematic increase in the size of the basis set permits estimation of the accuracy. Numerical results for the adiabatic correction to the Born-Oppenheimer interaction energy reveal a relative precision of 10⁻¹² at an arbitrary internuclear distance. Such calculations have been performed for 88 internuclear distances in the range of 0 < R ⩽ 12 bohrs to construct the adiabatic correction potential and to solve the nuclear Schrödinger equation. Finally, the adiabatic correction to the dissociation energies of all rovibrational levels in H2, HD, HT, D2, DT, and T2 has been determined. For the ground state of H2 the estimated precision is 3 × 10⁻⁷ cm⁻¹, which is almost three orders of magnitude higher than that of the best previous result. The achieved accuracy removes the adiabatic contribution from the overall error budget of the present day theoretical predictions for the rovibrational levels.

  2. Fast and Accurate Exhaled Breath Ammonia Measurement

    PubMed Central

    Solga, Steven F.; Mudalel, Matthew L.; Spacek, Lisa A.; Risby, Terence H.

    2014-01-01

    This exhaled breath ammonia method uses a fast and highly sensitive spectroscopic technique known as quartz-enhanced photoacoustic spectroscopy (QEPAS), with a quantum cascade laser as the source. The monitor is coupled to a sampler that measures mouth pressure and carbon dioxide. The system is temperature controlled and specifically designed to address the reactivity of this compound. The sampler provides immediate feedback to the subject and the technician on the quality of the breath effort. Together with the quick response time of the monitor, this system is capable of accurately measuring exhaled breath ammonia representative of deep lung systemic levels. Because the system is easy to use and produces real-time results, it has enabled experiments to identify factors that influence measurements. For example, mouth rinse and oral pH reproducibly and significantly affect results and therefore must be controlled. Temperature and mode of breathing are other examples. As our understanding of these factors evolves, error is reduced, and clinical studies become more meaningful. This system is very reliable and individual measurements are inexpensive. The sampler is relatively inexpensive and quite portable, but the monitor is neither. This limits options for some clinical studies and provides a rationale for future innovations. PMID:24962141

  3. Accurate adiabatic correction in the hydrogen molecule

    SciTech Connect

    Pachucki, Krzysztof; Komasa, Jacek

    2014-12-14

    A new formalism for the accurate treatment of adiabatic effects in the hydrogen molecule is presented, in which the electronic wave function is expanded in the James-Coolidge basis functions. Systematic increase in the size of the basis set permits estimation of the accuracy. Numerical results for the adiabatic correction to the Born-Oppenheimer interaction energy reveal a relative precision of 10⁻¹² at an arbitrary internuclear distance. Such calculations have been performed for 88 internuclear distances in the range of 0 < R ⩽ 12 bohrs to construct the adiabatic correction potential and to solve the nuclear Schrödinger equation. Finally, the adiabatic correction to the dissociation energies of all rovibrational levels in H2, HD, HT, D2, DT, and T2 has been determined. For the ground state of H2 the estimated precision is 3 × 10⁻⁷ cm⁻¹, which is almost three orders of magnitude higher than that of the best previous result. The achieved accuracy removes the adiabatic contribution from the overall error budget of the present day theoretical predictions for the rovibrational levels.

  4. Accurate adiabatic correction in the hydrogen molecule.

    PubMed

    Pachucki, Krzysztof; Komasa, Jacek

    2014-12-14

    A new formalism for the accurate treatment of adiabatic effects in the hydrogen molecule is presented, in which the electronic wave function is expanded in the James-Coolidge basis functions. Systematic increase in the size of the basis set permits estimation of the accuracy. Numerical results for the adiabatic correction to the Born-Oppenheimer interaction energy reveal a relative precision of 10⁻¹² at an arbitrary internuclear distance. Such calculations have been performed for 88 internuclear distances in the range of 0 < R ⩽ 12 bohrs to construct the adiabatic correction potential and to solve the nuclear Schrödinger equation. Finally, the adiabatic correction to the dissociation energies of all rovibrational levels in H2, HD, HT, D2, DT, and T2 has been determined. For the ground state of H2 the estimated precision is 3 × 10⁻⁷ cm⁻¹, which is almost three orders of magnitude higher than that of the best previous result. The achieved accuracy removes the adiabatic contribution from the overall error budget of the present day theoretical predictions for the rovibrational levels. PMID:25494728

  5. MEMS accelerometers in accurate mount positioning systems

    NASA Astrophysics Data System (ADS)

    Mészáros, László; Pál, András.; Jaskó, Attila

    2014-07-01

    In order to attain precise, accurate and stateless positioning of telescope mounts we apply microelectromechanical systems (MEMS) accelerometers. In common practice, feedback from the mount position is provided by electronic, optical or magneto-mechanical systems, or via a real-time astrometric solution based on the acquired images. MEMS-based systems are completely independent of these mechanisms. Our goal is to investigate the advantages and challenges of applying such devices and to reach the sub-arcminute range, which is well below the field of view of conventional imaging telescope systems. We present how this sub-arcminute accuracy can be achieved with very cheap MEMS sensors. Out of the box, these sensors yield raw output accurate only to a few degrees. We show what kind of calibration procedures can exploit spherical and cylindrical constraints between accelerometer output channels in order to achieve the previously mentioned accuracy level. We also demonstrate how our implementation can be inserted into a telescope control system. Although this attainable precision is less than both the resolution of telescope mount drive mechanics and the accuracy of astrometric solutions, the independent nature of attitude determination could significantly increase the reliability of autonomous or remotely operated astronomical observations.
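The spherical constraint mentioned in the abstract (a static accelerometer must report a vector of magnitude 1 g in every orientation) is enough to recover per-axis biases and scale factors by linear least squares. A minimal sketch on synthetic data, not the authors' calibration code; the bias and scale values are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
# Assumed "true" per-axis biases (in g) and scale factors of a cheap MEMS part.
bias = np.array([0.10, -0.05, 0.20])
scale = np.array([1.05, 0.95, 1.02])

# Static readings in many orientations: calibrated outputs must lie on the
# unit sphere, so uncalibrated readings lie on an axis-aligned ellipsoid.
g = rng.normal(size=(500, 3))
g /= np.linalg.norm(g, axis=1, keepdims=True)
raw = g * scale + bias  # what the uncalibrated sensor reports

# Fit the axis-aligned quadric  a*x^2 + b*y^2 + c*z^2 + d*x + e*y + f*z = 1
x, y, z = raw.T
A = np.column_stack([x * x, y * y, z * z, x, y, z])
p, *_ = np.linalg.lstsq(A, np.ones(len(raw)), rcond=None)
quad, lin = p[:3], p[3:]

est_bias = -lin / (2 * quad)                               # ellipsoid center
est_scale = np.sqrt((1 + np.sum(quad * est_bias**2)) / quad)  # semi-axes
```

With noiseless synthetic readings the fit recovers the assumed biases and scales essentially exactly; real sensor noise would spread the residuals but leave the procedure unchanged.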

  6. Uncertainty partition challenges the predictability of vital details of climate change

    NASA Astrophysics Data System (ADS)

    Fatichi, Simone; Ivanov, Valeriy Y.; Paschalis, Athanasios; Peleg, Nadav; Molnar, Peter; Rimkus, Stefan; Kim, Jongho; Burlando, Paolo; Caporali, Enrica

    2016-05-01

    Decision makers and consultants are particularly interested in "detailed" information on future climate to prepare adaptation strategies and adjust design criteria. Projections of future climate at local spatial scales and fine temporal resolutions are subject to the same uncertainties as those at the global scale but the partition among uncertainty sources (emission scenarios, climate models, and internal climate variability) remains largely unquantified. At the local scale, the uncertainty of the mean and extremes of precipitation is shown to be irreducible for mid and end-of-century projections because it is almost entirely caused by internal climate variability (stochasticity). Conversely, projected changes in mean air temperature and other meteorological variables can be largely constrained, even at local scales, if more accurate emission scenarios can be developed. The results were obtained by applying a comprehensive stochastic downscaling technique to climate model outputs for three exemplary locations. In contrast with earlier studies, the three sources of uncertainty are considered as dependent and, therefore, non-additive. The evidence of the predominant role of internal climate variability leaves little room for uncertainty reduction in precipitation projections; however, the inference is not necessarily negative, because the uncertainty of historic observations is almost as large as that for future projections with direct implications for climate change adaptation measures.

  7. Mouse models of human AML accurately predict chemotherapy response

    PubMed Central

    Zuber, Johannes; Radtke, Ina; Pardee, Timothy S.; Zhao, Zhen; Rappaport, Amy R.; Luo, Weijun; McCurrach, Mila E.; Yang, Miao-Miao; Dolan, M. Eileen; Kogan, Scott C.; Downing, James R.; Lowe, Scott W.

    2009-01-01

    The genetic heterogeneity of cancer influences the trajectory of tumor progression and may underlie clinical variation in therapy response. To model such heterogeneity, we produced genetically and pathologically accurate mouse models of common forms of human acute myeloid leukemia (AML) and developed methods to mimic standard induction chemotherapy and efficiently monitor therapy response. We see that murine AMLs harboring two common human AML genotypes show remarkably diverse responses to conventional therapy that mirror clinical experience. Specifically, murine leukemias expressing the AML1/ETO fusion oncoprotein, associated with a favorable prognosis in patients, show a dramatic response to induction chemotherapy owing to robust activation of the p53 tumor suppressor network. Conversely, murine leukemias expressing MLL fusion proteins, associated with a dismal prognosis in patients, are drug-resistant due to an attenuated p53 response. Our studies highlight the importance of genetic information in guiding the treatment of human AML, functionally establish the p53 network as a central determinant of chemotherapy response in AML, and demonstrate that genetically engineered mouse models of human cancer can accurately predict therapy response in patients. PMID:19339691

  8. Accurate eye center location through invariant isocentric patterns.

    PubMed

    Valenti, Roberto; Gevers, Theo

    2012-09-01

    Locating the center of the eyes allows for valuable information to be captured and used in a wide range of applications. Accurate eye center location can be determined using commercial eye-gaze trackers, but additional constraints and expensive hardware make these existing solutions unattractive and impossible to use on standard (i.e., visible wavelength), low-resolution images of eyes. Systems based solely on appearance have been proposed in the literature, but their accuracy does not allow us to accurately locate and distinguish eye center movements in these low-resolution settings. Our aim is to bridge this gap by locating the center of the eye within the area of the pupil on low-resolution images taken from a webcam or a similar device. The proposed method makes use of isophote properties to gain invariance to linear lighting changes (contrast and brightness), to achieve in-plane rotational invariance, and to keep computational costs low. To further gain scale invariance, the approach is applied to a scale-space pyramid. In this paper, we extensively test our approach for its robustness to changes in illumination, head pose, scale, occlusion, and eye rotation. We demonstrate that our system can achieve a significant improvement in accuracy over state-of-the-art techniques for eye center location in standard low-resolution imagery. PMID:22813958

  9. Unstable total hip arthroplasty: detailed overview.

    PubMed

    Berry, D J

    2001-01-01

    Hip dislocation is one of the most common complications of THA. Good preoperative planning, good postoperative patient education, accurate intraoperative component positioning, rigorous intraoperative testing of hip stability, and good repair of soft tissues during closure all help prevent dislocation. Early postoperative dislocations and first or second dislocations usually are treated with closed reduction and a hip guide brace or hip spica cast, but when dislocation becomes recurrent, surgical treatment usually is needed. When possible, surgical treatment is based on identifying and treating a specific problem leading to the dislocation, such as implant malposition, inadequate soft-tissue tension, or impingement. In selected circumstances, constrained implants or bipolar or tripolar implants provide powerful tools to restore hip stability.

  10. A detailed phylogeny for the Methanomicrobiales

    NASA Technical Reports Server (NTRS)

    Rouviere, P.; Mandelco, L.; Winker, S.; Woese, C. R.

    1992-01-01

    The small subunit rRNA sequence of twenty archaea, members of the Methanomicrobiales, permits a detailed phylogenetic tree to be inferred for the group. The tree confirms earlier studies, based on far fewer sequences, in showing the group to be divided into two major clusters, temporarily designated the "methanosarcina" group and the "methanogenium" group. The tree also defines phylogenetic relationships within these two groups, which in some cases do not agree with the phylogenetic relationships implied by current taxonomic names--a problem most acute for the genus Methanogenium and its relatives. The present phylogenetic characterization provides the basis for a consistent taxonomic restructuring of this major methanogenic taxon.

  11. Detailed chemical kinetic model for ethanol oxidation

    SciTech Connect

    Marinov, N.

    1997-04-01

    A detailed chemical kinetic model for ethanol oxidation has been developed and validated against a variety of experimental data sets. Laminar flame speed data obtained from a constant-volume bomb, ignition delay data behind reflected shock waves, and ethanol oxidation product profiles from a turbulent flow reactor were used in this study. Very good agreement was found in modeling the data sets obtained from the three different experimental systems. The computational modeling results show that high-temperature ethanol oxidation exhibits strong sensitivity to the fall-off kinetics of ethanol decomposition, branching ratio selection for C2H5OH + OH = products, and reactions involving the hydroperoxyl (HO2) radical.
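The rate coefficients and branching ratios such a mechanism is built from follow the modified Arrhenius form k = A·Tⁿ·exp(−Ea/RT). A generic illustration with made-up parameters for two hypothetical C2H5OH + OH product channels, not Marinov's fitted coefficients:

```python
import math

R = 1.987204e-3  # gas constant, kcal/(mol*K)

def k_arrhenius(A, n, Ea, T):
    """Modified Arrhenius rate coefficient: k = A * T**n * exp(-Ea/(R*T))."""
    return A * T**n * math.exp(-Ea / (R * T))

# Hypothetical parameters for two competing abstraction channels
# (illustrative values only).
T = 1100.0
k_a = k_arrhenius(A=1.0e12, n=0.3, Ea=2.0, T=T)
k_b = k_arrhenius(A=5.0e11, n=0.5, Ea=4.0, T=T)
branching_a = k_a / (k_a + k_b)  # fraction of flux through channel a
```

The branching ratio shifts with temperature because the channels have different activation energies, which is why its selection is a sensitive parameter in the model.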

  12. Instrumentation for detailed bridge-scour measurements

    USGS Publications Warehouse

    Landers, Mark N.; Mueller, David S.; Trent, Roy E.

    1993-01-01

    A portable instrumentation system is being developed to obtain channel bathymetry during floods for detailed bridge-scour measurements. Portable scour measuring systems have four components: sounding instrument, horizontal positioning instrument, deployment mechanisms, and data storage device. The sounding instrument will be a digital fathometer. Horizontal position will be measured using a range-azimuth based hydrographic survey system. The deployment mechanism designed for this system is a remote-controlled boat using a small waterplane area, twin-hull design. An on-board computer and radio will monitor the vessel instrumentation, record measured data, and telemeter data to shore.

  13. The Clinical Impact of Accurate Cystine Calculi Characterization Using Dual-Energy Computed Tomography.

    PubMed

    Haley, William E; Ibrahim, El-Sayed H; Qu, Mingliang; Cernigliaro, Joseph G; Goldfarb, David S; McCollough, Cynthia H

    2015-01-01

    Dual-energy computed tomography (DECT) has recently been suggested as the imaging modality of choice for kidney stones due to its ability to provide information on stone composition. Standard postprocessing of the dual-energy images accurately identifies uric acid stones, but not other types. Cystine stones can be identified from DECT images when analyzed with advanced postprocessing. This case report describes clinical implications of accurate diagnosis of cystine stones using DECT.

  14. Factors affecting the accurate determination of cerebrovascular blood flow using high-speed droplet imaging

    NASA Astrophysics Data System (ADS)

    Rudin, Stephen; Divani, Afshin; Wakhloo, Ajay K.; Lieber, Baruch B.; Granger, William; Bednarek, Daniel R.; Yang, Chang-Ying J.

    1998-07-01

    Detailed cerebrovascular blood flow can be more accurately determined radiographically from the new droplet tracking method previously introduced by the authors than from standard soluble contrast techniques. For example, arteriovenous malformation (AVM) transit times, which are crucial for proper glue embolization treatments, were shown to be about half when using droplets compared to those measured using soluble contrast techniques. In this work, factors such as x-ray pulse duration, frame rate, system spatial resolution (focal spot size), droplet size, droplet and system contrast parameters, and system noise are considered in relation to their effect on the accurate determination of droplet location and velocity.

  15. Chord, Tie Bar & Crossbracing Joint Detail in Plan; Crossbracing ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Chord, Tie Bar & Crossbracing Joint Detail in Plan; Crossbracing Center Joint Detail in Plan; Chord Joining Detail in Plan & Elevation; Chord, Panel Post, Tie Bar, & Diagonal Brace Joint Detail; Crossbracing Center Joint Detail in Section; Chord, Panel Post, Tie Bar & Horizontal Brace Joint Detail - Narrows Bridge, Spanning Sugar Creek at Old County Road 280 East, Marshall, Parke County, IN

  16. Fast and accurate determination of modularity and its effect size

    NASA Astrophysics Data System (ADS)

    Treviño, Santiago, III; Nyberg, Amy; Del Genio, Charo I.; Bassler, Kevin E.

    2015-02-01

    We present a fast spectral algorithm for community detection in complex networks. Our method searches for the partition with the maximum value of the modularity via the interplay of several refinement steps that include both agglomeration and division. We validate the accuracy of the algorithm by applying it to several real-world benchmark networks. On all these, our algorithm performs as well or better than any other known polynomial scheme. This allows us to extensively study the modularity distribution in ensembles of Erdős-Rényi networks, producing theoretical predictions for means and variances inclusive of finite-size corrections. Our work provides a way to accurately estimate the effect size of modularity, providing a z-score measure of it and enabling a more informative comparison of networks with different numbers of nodes and links.
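Modularity itself, the quantity the algorithm maximizes, can be evaluated directly from its definition Q = (1/2m) Σᵢⱼ (Aᵢⱼ − kᵢkⱼ/2m) δ(cᵢ, cⱼ). A minimal pure-Python sketch (not the authors' spectral algorithm), evaluated on a toy graph of two triangles joined by one edge:

```python
def modularity(adj, communities):
    """Newman modularity Q = (1/2m) * sum_ij (A_ij - k_i*k_j/2m) * delta(c_i, c_j)
    for an undirected 0/1 adjacency matrix."""
    two_m = sum(sum(row) for row in adj)   # each edge counted twice
    deg = [sum(row) for row in adj]
    n = len(adj)
    q = 0.0
    for i in range(n):
        for j in range(n):
            if communities[i] == communities[j]:
                q += adj[i][j] - deg[i] * deg[j] / two_m
    return q / two_m

# Two triangles joined by one edge; the natural split scores Q = 5/14.
adj = [[0, 1, 1, 0, 0, 0],
       [1, 0, 1, 0, 0, 0],
       [1, 1, 0, 1, 0, 0],
       [0, 0, 1, 0, 1, 1],
       [0, 0, 0, 1, 0, 1],
       [0, 0, 0, 1, 1, 0]]
q = modularity(adj, [0, 0, 0, 1, 1, 1])
```

The z-score effect-size measure described in the abstract would additionally require the mean and variance of Q over an ensemble of random (e.g. Erdős-Rényi) graphs with matching size, which this sketch does not attempt.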

  17. Accurate paleointensities - the multi-method approach

    NASA Astrophysics Data System (ADS)

    de Groot, Lennart

    2016-04-01

    The accuracy of models describing rapid changes in the geomagnetic field over the past millennia critically depends on the availability of reliable paleointensity estimates. Over the past decade, methods to derive paleointensities from lavas (the only recorder of the geomagnetic field that is available all over the globe and through geologic time) have seen significant improvements, and various alternative techniques have been proposed. The 'classical' Thellier-style approach was optimized and selection criteria were defined in the 'Standard Paleointensity Definitions' (Paterson et al, 2014). The Multispecimen approach was validated, with emphasis on the additional tests and criteria needed to assess Multispecimen results. Recently, a non-heating, relative paleointensity technique was proposed, the pseudo-Thellier protocol, which shows great potential in both accuracy and efficiency but currently lacks a solid theoretical underpinning. Here I present work using all three of the aforementioned paleointensity methods on suites of young lavas taken from the volcanic islands of Hawaii, La Palma, Gran Canaria, Tenerife, and Terceira. Many of the sampled cooling units are <100 years old, so the actual field strength at the time of cooling is reasonably well known. Rather intuitively, flows that produce coherent results from two or more different paleointensity methods yield the most accurate estimates of the paleofield. Furthermore, the results for some flows pass the selection criteria for one method but fail in other techniques. Scrutinizing and combining all acceptable results yielded reliable paleointensity estimates for 60-70% of all sampled cooling units, an exceptionally high success rate. This 'multi-method paleointensity approach' therefore has high potential to provide the much-needed paleointensities to improve geomagnetic field models for the Holocene.

  18. Important Nearby Galaxies without Accurate Distances

    NASA Astrophysics Data System (ADS)

    McQuinn, Kristen

    2014-10-01

    The Spitzer Infrared Nearby Galaxies Survey (SINGS) and its offspring programs (e.g., THINGS, HERACLES, KINGFISH) have resulted in a fundamental change in our view of star formation and the ISM in galaxies, and together they represent the most complete multi-wavelength data set yet assembled for a large sample of nearby galaxies. These great investments of observing time have been dedicated to the goal of understanding the interstellar medium, the star formation process, and, more generally, galactic evolution at the present epoch. Nearby galaxies provide the basis for which we interpret the distant universe, and the SINGS sample represents the best studied nearby galaxies. Accurate distances are fundamental to interpreting observations of galaxies. Surprisingly, many of the SINGS spiral galaxies have numerous distance estimates resulting in confusion. We can rectify this situation for 8 of the SINGS spiral galaxies within 10 Mpc at a very low cost through measurements of the tip of the red giant branch. The proposed observations will provide an accuracy of better than 0.1 in distance modulus. Our sample includes such well known galaxies as M51 (the Whirlpool), M63 (the Sunflower), M104 (the Sombrero), and M74 (the archetypal grand design spiral). We are also proposing coordinated parallel WFC3 UV observations of the central regions of the galaxies, rich with high-mass UV-bright stars. As a secondary science goal we will compare the resolved UV stellar populations with integrated UV emission measurements used in calibrating star formation rates. Our observations will complement the growing HST UV atlas of high resolution images of nearby galaxies.

  19. Accurate Thermal Conductivities from First Principles

    NASA Astrophysics Data System (ADS)

    Carbogno, Christian

    2015-03-01

    In spite of significant research efforts, a first-principles determination of the thermal conductivity at high temperatures has remained elusive. On the one hand, Boltzmann transport techniques that include anharmonic effects in the nuclear dynamics only perturbatively become inaccurate or inapplicable under such conditions. On the other hand, non-equilibrium molecular dynamics (MD) methods suffer from enormous finite-size artifacts in the computationally feasible supercells, which prevent an accurate extrapolation to the bulk limit of the thermal conductivity. In this work, we overcome this limitation by performing ab initio MD simulations in thermodynamic equilibrium that account for all orders of anharmonicity. The thermal conductivity is then assessed from the auto-correlation function of the heat flux using the Green-Kubo formalism. Foremost, we discuss the fundamental theory underlying a first-principles definition of the heat flux using the virial theorem. We validate our approach and in particular the techniques developed to overcome finite time and size effects, e.g., by inspecting silicon, the thermal conductivity of which is particularly challenging to converge. Furthermore, we use this framework to investigate the thermal conductivity of ZrO2, which is known for its high degree of anharmonicity. Our calculations shed light on the heat resistance mechanism active in this material, which eventually allows us to discuss how the thermal conductivity can be controlled by doping and co-doping. This work has been performed in collaboration with R. Ramprasad (University of Connecticut), C. G. Levi and C. G. Van de Walle (University of California Santa Barbara).

  20. How flatbed scanners upset accurate film dosimetry

    NASA Astrophysics Data System (ADS)

    van Battum, L. J.; Huizenga, H.; Verdaasdonk, R. M.; Heukelom, S.

    2016-01-01

    Film is an excellent dosimeter for verification of dose distributions due to its high spatial resolution. Irradiated film can be digitized with low-cost, transmission, flatbed scanners. However, a disadvantage is their lateral scan effect (LSE): a scanner readout change over its lateral scan axis. Although anisotropic light scattering was presented as the origin of the LSE, this paper presents an alternative cause. Hereto, the LSE for two flatbed scanners (Epson 1680 Expression Pro and Epson 10000XL) and Gafchromic film (EBT, EBT2, EBT3) was investigated, focusing on three effects: cross talk, optical path length and polarization. Cross talk was examined using triangular sheets of various optical densities. The optical path length effect was studied using absorptive and reflective neutral density filters with well-defined optical characteristics (OD range 0.2-2.0). Linear polarizer sheets were used to investigate light polarization on the CCD signal in the absence and presence of (un)irradiated Gafchromic film. Film dose values ranged from 0.2 to 9 Gy, i.e. an optical density range from 0.25 to 1.1. Measurements were performed in the scanner’s transmission mode, with red-green-blue channels. The LSE was found to depend on scanner construction and film type. Its magnitude depends on dose: for 9 Gy it increases up to 14% at the maximum lateral position. Cross talk was only significant in high-contrast regions, up to 2% for very small fields. The optical path length effect introduced by film on the scanner causes a 3% change for pixels in the extreme lateral position. Light polarization due to film and the scanner’s optical mirror system is the main contributor, different in magnitude for the red, green and blue channels. We conclude that any Gafchromic EBT-type film scanned with a flatbed scanner will face these optical effects. Accurate dosimetry requires correction of the LSE and, therefore, determination of the LSE per color channel and per dose delivered to the film.
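The per-channel correction the authors call for can be sketched as a flat-field calibration: scan a uniformly irradiated strip, fit the readout droop versus lateral position, and divide it out. A simplified illustration for a single channel and dose level, with an assumed parabolic LSE model rather than measured scanner data:

```python
import numpy as np

# Hypothetical readout of a uniformly irradiated strip across the lateral
# axis: response droops toward the edges (assumed parabolic, ~12% at the edge).
lateral = np.linspace(-1.0, 1.0, 201)           # normalized lateral position
true_od = 0.8                                   # uniform optical density
measured = true_od * (1.0 - 0.12 * lateral**2)  # synthetic LSE-distorted scan

# Fit a correction polynomial from the flat-field strip, then divide it out.
coeff = np.polyfit(lateral, measured / true_od, deg=2)
correction = np.polyval(coeff, lateral)
corrected = measured / correction
```

Since the abstract reports that the LSE magnitude also depends on dose, a real correction would need one such curve per color channel and per dose level, indexed by the film's optical density.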

  1. How flatbed scanners upset accurate film dosimetry.

    PubMed

    van Battum, L J; Huizenga, H; Verdaasdonk, R M; Heukelom, S

    2016-01-21

    Film is an excellent dosimeter for verification of dose distributions due to its high spatial resolution. Irradiated film can be digitized with low-cost, transmission, flatbed scanners. However, a disadvantage is their lateral scan effect (LSE): a scanner readout change over its lateral scan axis. Although anisotropic light scattering was presented as the origin of the LSE, this paper presents an alternative cause. Hereto, the LSE for two flatbed scanners (Epson 1680 Expression Pro and Epson 10000XL) and Gafchromic film (EBT, EBT2, EBT3) was investigated, focusing on three effects: cross talk, optical path length and polarization. Cross talk was examined using triangular sheets of various optical densities. The optical path length effect was studied using absorptive and reflective neutral density filters with well-defined optical characteristics (OD range 0.2-2.0). Linear polarizer sheets were used to investigate light polarization on the CCD signal in the absence and presence of (un)irradiated Gafchromic film. Film dose values ranged from 0.2 to 9 Gy, i.e. an optical density range from 0.25 to 1.1. Measurements were performed in the scanner's transmission mode, with red-green-blue channels. The LSE was found to depend on scanner construction and film type. Its magnitude depends on dose: for 9 Gy it increases up to 14% at the maximum lateral position. Cross talk was only significant in high-contrast regions, up to 2% for very small fields. The optical path length effect introduced by film on the scanner causes a 3% change for pixels in the extreme lateral position. Light polarization due to film and the scanner's optical mirror system is the main contributor, different in magnitude for the red, green and blue channels. We conclude that any Gafchromic EBT-type film scanned with a flatbed scanner will face these optical effects. Accurate dosimetry requires correction of the LSE and, therefore, determination of the LSE per color channel and per dose delivered to the film.

  2. Chewing simulation with a physically accurate deformable model.

    PubMed

    Pascale, Andra Maria; Ruge, Sebastian; Hauth, Steffen; Kordaß, Bernd; Linsen, Lars

    2015-01-01

    Nowadays, CAD/CAM software is being used to compute the optimal shape and position of a new tooth model meant for a patient. With this possible future application in mind, we present in this article an independent, stand-alone interactive application that simulates the human chewing process and the deformation it produces in the food substrate. Chewing motion sensors are used to produce an accurate representation of the jaw movement. The substrate is represented by a deformable elastic model based on the linear finite element method, which preserves physical accuracy. Collision detection based on spatial partitioning is used to calculate the forces acting on the deformable model. Based on the calculated information, geometry elements are added to the scene to enhance the information available to the user. The goal of the simulation is to present a complete scene to the dentist, highlighting the points where the teeth came into contact with the substrate and reporting how much force acted at these points, which makes it possible to indicate whether a tooth is being used incorrectly in the mastication process. Real-time interactivity is desired and achieved within limits, depending on the complexity of the employed geometric models. The presented simulation is a first step towards the overall project goal of interactively optimizing tooth position and shape under the investigation of a virtual chewing process using real patient data (Fig 1). PMID:26389135

  3. Super Resolution Reconstruction Based on Adaptive Detail Enhancement for ZY-3 Satellite Images

    NASA Astrophysics Data System (ADS)

    Zhu, Hong; Song, Weidong; Tan, Hai; Wang, Jingxue; Jia, Di

    2016-06-01

    Super-resolution reconstruction of sequential remote sensing images is a technology that processes multiple low-resolution satellite images carrying complementary information to obtain one or more high-resolution images. The core of the technology is high-precision matching between images together with the extraction and fusion of fine detail information. This paper puts forward a new image super-resolution model framework that can adaptively enhance the details of the reconstructed image at multiple scales. First, the sequence images were decomposed by a bilateral filter into a detail layer containing fine detail information and a smooth layer containing large-scale edge information. Then, a texture detail enhancement function was constructed to boost the magnitude of medium and small details. Next, the non-redundant information for the super-resolution reconstruction was obtained by differential processing of the detail layer, and an initial super-resolution result was achieved by interpolating and fusing the non-redundant information with the smooth layer. Finally, the final reconstructed image was acquired by applying a local optimization model to the initial result. Experiments on ZY-3 satellite images of the same and different phases show that the proposed method improves both the information entropy and the image detail evaluation standard compared with the interpolation method, the traditional TV algorithm and the MAP algorithm, indicating that our method clearly highlights image details and retains more ground texture information. A large number of experimental results reveal that the proposed method is robust and universal for different kinds of ZY-3 satellite images.
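
The decomposition step described above (a bilateral filter splitting the signal into a smooth layer and a detail layer, followed by detail amplification) can be sketched in one dimension. The filter parameters and the constant-gain enhancement are illustrative stand-ins for the paper's texture detail enhancement function:

```python
import numpy as np

# Sketch (not the paper's exact pipeline): split a signal into a smooth
# layer and a detail layer with a bilateral filter, boost the detail
# layer, and recombine. Window radius, sigmas and gain are assumptions.
def bilateral_1d(signal, radius=5, sigma_s=2.0, sigma_r=0.1):
    out = np.empty_like(signal)
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        patch = signal[lo:hi]
        dist = np.arange(lo, hi) - i
        # spatial Gaussian x range Gaussian: edge-preserving smoothing
        w = (np.exp(-dist**2 / (2 * sigma_s**2)) *
             np.exp(-(patch - signal[i])**2 / (2 * sigma_r**2)))
        out[i] = np.sum(w * patch) / np.sum(w)
    return out

def enhance_details(signal, gain=1.5):
    smooth = bilateral_1d(signal)   # large-scale edge layer
    detail = signal - smooth        # medium/small-scale texture layer
    return smooth + gain * detail   # recombine with amplified detail

rng = np.random.default_rng(0)
sig = np.sin(np.linspace(0, 4 * np.pi, 200)) + 0.05 * rng.normal(size=200)
enhanced = enhance_details(sig)
```

Because the bilateral filter's range kernel suppresses averaging across strong edges, the boosted detail layer amplifies texture without ringing at the large-scale edges kept in the smooth layer.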

  4. HUBBLE CAPTURES DETAILED IMAGE OF URANUS' ATMOSPHERE

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Hubble Space Telescope has peered deep into Uranus' atmosphere to see clear and hazy layers created by a mixture of gases. Using infrared filters, Hubble captured detailed features of three layers of Uranus' atmosphere. Hubble's images are different from the ones taken by the Voyager 2 spacecraft, which flew by Uranus 10 years ago. Those images - not taken in infrared light - showed a greenish-blue disk with very little detail. The infrared image allows astronomers to probe the structure of Uranus' atmosphere, which consists of mostly hydrogen with traces of methane. The red around the planet's edge represents a very thin haze at a high altitude. The haze is so thin that it can only be seen by looking at the edges of the disk, and is similar to looking at the edge of a soap bubble. The yellow near the bottom of Uranus is another hazy layer. The deepest layer, the blue near the top of Uranus, shows a clearer atmosphere. Image processing has been used to brighten the rings around Uranus so that astronomers can study their structure. In reality, the rings are as dark as black lava or charcoal. This false color picture was assembled from several exposures taken July 3, 1995 by the Wide Field Planetary Camera-2. CREDIT: Erich Karkoschka (University of Arizona Lunar and Planetary Lab) and NASA

  5. Thirty Meter Telescope Detailed Science Case: 2015

    NASA Astrophysics Data System (ADS)

    Skidmore, Warren; TMT International Science Development Teams; Science Advisory Committee, TMT

    2015-12-01

    The TMT Detailed Science Case describes the transformational science that the Thirty Meter Telescope will enable. Planned to begin science operations in 2024, TMT will open up opportunities for revolutionary discoveries in essentially every field of astronomy, astrophysics and cosmology, seeing much fainter objects much more clearly than existing telescopes. Matching this capability, TMT's science agenda spans all of space and time, from nearby comets and asteroids, to exoplanets, to the most distant galaxies, and all the way back to the very first sources of light in the universe. More than 150 astronomers from within the TMT partnership and beyond offered input in compiling the new 2015 Detailed Science Case. The contributing astronomers represent the entire TMT partnership, including the California Institute of Technology (Caltech), the Indian Institute of Astrophysics (IIA), the National Astronomical Observatories of the Chinese Academy of Sciences (NAOC), the National Astronomical Observatory of Japan (NAOJ), the University of California, the Association of Canadian Universities for Research in Astronomy (ACURA) and US associate partner, the Association of Universities for Research in Astronomy (AURA). Cover image: artist's rendition of the TMT International Observatory on Mauna Kea opening in the late evening before beginning operations.

  6. Photoacoustic computed tomography without accurate ultrasonic transducer responses

    NASA Astrophysics Data System (ADS)

    Sheng, Qiwei; Wang, Kun; Xia, Jun; Zhu, Liren; Wang, Lihong V.; Anastasio, Mark A.

    2015-03-01

    Conventional photoacoustic computed tomography (PACT) image reconstruction methods assume that the object and surrounding medium are described by a constant speed-of-sound (SOS) value. In order to accurately recover fine structures, SOS heterogeneities should be quantified and compensated for during PACT reconstruction. To address this problem, several groups have proposed hybrid systems that combine PACT with ultrasound computed tomography (USCT). In such systems, a SOS map is reconstructed first via USCT. Subsequently, this SOS map is employed to inform the PACT reconstruction method. Additionally, the SOS map can provide structural information regarding tissue, which is complementary to the functional information from the PACT image. We propose a paradigm shift in the way that images are reconstructed in hybrid PACT-USCT imaging. Inspired by our observation that information about the SOS distribution is encoded in PACT measurements, we propose to jointly reconstruct the absorbed optical energy density and SOS distributions from a combined set of USCT and PACT measurements, thereby reducing the two reconstruction problems to one. This innovative approach has several advantages over conventional approaches in which PACT and USCT images are reconstructed independently: (1) Variations in the SOS will automatically be accounted for, optimizing PACT image quality; (2) The reconstructed PACT and USCT images will possess minimal systematic artifacts because errors in the imaging models will be optimally balanced during the joint reconstruction; (3) Due to the exploitation of information regarding the SOS distribution in the full-view PACT data, our approach will permit high-resolution reconstruction of the SOS distribution from sparse array data.

  7. How accurate are Scottish cancer registration data?

    PubMed Central

    Brewster, D.; Crichton, J.; Muir, C.

    1994-01-01

    In order to assess the accuracy of Scottish cancer registration data, a random sample of 2,200 registrations, attributed to the year 1990, was generated. Relevant medical records were available for review in 2,021 (92%) cases. Registration details were reabstracted from available records and compared with data in the registry. Discrepancies in identifying items of data (surname, forename, sex and date of birth) were found in 3.5% of cases. Most were trivial and would not disturb record linkage. Discrepancy rates of 7.1% in post code of residence at the time of diagnosis (excluding differences arising through boundary changes), 11.0% in anniversary date (excluding differences of 6 weeks or less), 7.7% in histological verification status, 5.4% in ICD-9 site codes (the first three digits) and 14.5% in ICD-O morphology codes (excluding 'inferred' morphology codes) were recorded. Overall, serious discrepancies were judged to have occurred in 2.8% of cases. In many respects, therefore, Scottish cancer registration data show a high level of accuracy that compares favourably to the reported accuracy of the few other cancer registries undertaking such analyses. PMID:7947104

  8. Hounsfield unit density accurately predicts ESWL success.

    PubMed

    Magnuson, William J; Tomera, Kevin M; Lance, Raymond S

    2005-01-01

    Extracorporeal shockwave lithotripsy (ESWL) is a commonly used non-invasive treatment for urolithiasis. Helical CT scans provide much better and more detailed imaging of the patient with urolithiasis, including the ability to measure the density of urinary stones. In this study we tested the hypothesis that the density of urinary calculi as measured by CT can predict successful ESWL treatment. 198 patients were treated at Alaska Urological Associates with ESWL between January 2002 and April 2004. Of these, 101 met study inclusion criteria, with accessible CT scans and stones ranging from 5 to 15 mm. Follow-up imaging demonstrated stone freedom in 74.2%. The overall mean Hounsfield density values for the stone-free and residual-stone groups were significantly different (93.61 vs 122.80, p < 0.0001). We determined by receiver operator characteristic (ROC) curve analysis that a Hounsfield density value (HDV) of 93 or less carries a 90% or better chance of stone freedom following ESWL for upper tract calculi between 5 and 15 mm.
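
How an ROC-derived cutoff like the reported 93 HU might be selected can be illustrated with a minimal threshold sweep that maximizes Youden's J statistic. The cohort below is synthetic, generated only to mimic the reported group means; the resulting cutoff is not the study's value:

```python
import numpy as np

# Illustrative threshold selection: sweep candidate cutoffs and keep
# the one maximizing Youden's J = sensitivity + specificity - 1.
def best_cutoff(density, stone_free):
    """density: stone densities (HU); stone_free: boolean outcomes."""
    best_j, best_t = -1.0, None
    for t in np.unique(density):
        pred = density <= t                 # predict success at <= t HU
        sens = np.mean(pred[stone_free])    # true positive rate
        spec = np.mean(~pred[~stone_free])  # true negative rate
        j = sens + spec - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t

# Synthetic cohort mimicking the reported group means (93.61 vs 122.80 HU)
rng = np.random.default_rng(0)
density = np.concatenate([rng.normal(93.61, 15, 300),    # stone-free group
                          rng.normal(122.80, 15, 300)])  # residual group
stone_free = np.repeat([True, False], 300)
cutoff = best_cutoff(density, stone_free)
```

A study would additionally read the sensitivity at clinically motivated operating points off the ROC curve, which is how a "90% or better chance of stone freedom" statement arises.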

  9. Higher Education in France: A Handbook of Information Concerning Fields of Study in Each Institution. Bulletin, 1952, No. 6

    ERIC Educational Resources Information Center

    Kahler, Edith

    1952-01-01

    Advising students who wish to study in other countries is often difficult because accurate, up-to-date, and detailed information about the offerings in their higher institutions is frequently unavailable. A student wishing to study in a given country needs to know what the several institutions offer not only in his own subject area but also in…

  10. Provenance management in Swift with implementation details.

    SciTech Connect

    Gadelha, L. M. R; Clifford, B.; Mattoso, M.; Wilde, M.; Foster, I.

    2011-04-01

    The Swift parallel scripting language allows for the specification, execution and analysis of large-scale computations in parallel and distributed environments. It incorporates a data model for recording and querying provenance information. In this article we describe these capabilities and evaluate interoperability with other systems through the use of the Open Provenance Model. We describe Swift's provenance data model and compare it to the Open Provenance Model. We also describe and evaluate activities performed within the Third Provenance Challenge, which consisted of implementing a specific scientific workflow, capturing and recording provenance information of its execution, performing provenance queries, and exchanging provenance information with other systems. Finally, we propose improvements to both the Open Provenance Model and Swift's provenance system.

  11. An exposure-response database for detailed toxicity data

    SciTech Connect

    Woodall, George M.

    2008-11-15

    Risk assessment for human health effects often depends on evaluation of toxicological literature from a variety of sources. Risk assessors have limited resources for obtaining raw data, performing follow-on analyses or initiating new studies. These constraints must be balanced against a need to improve scientific credibility through improved statistical and analytical methods that optimize the use of available information. Computerized databases are used in toxicological risk assessment both for storing data and performing predictive analyses. Many systems provide primarily either bibliographic information or summary factual data from toxicological studies; few provide adequate information to allow application of dose-response models. The Exposure-Response database (ERDB) described here fills this gap by allowing entry of sufficiently detailed information on experimental design and results for each study, while limiting data entry to the most relevant items. ERDB was designed to contain information from the open literature to support dose-response assessment and allow a high level of automation in performance of various types of dose-response analyses. Specifically, ERDB supports emerging analytical approaches for dose-response assessment, while accommodating the diverse nature of published literature. Exposure and response data are accessible in a relational multi-table design, with closely controlled standard fields for recording values and free-text fields to describe unique aspects of the study. Additional comparative analyses are made possible through summary tables and graphic representations of the data contained within ERDB.
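
A relational multi-table design of the kind described (controlled standard fields plus free-text fields) might look like the following minimal sketch; every table and column name here is an assumption, not ERDB's actual schema:

```python
import sqlite3

# Hypothetical two-table exposure-response layout: one study table with
# a free-text notes field, one exposure-response table keyed to it with
# closely controlled numeric fields.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE study (
    study_id   INTEGER PRIMARY KEY,
    citation   TEXT NOT NULL,       -- open-literature reference
    species    TEXT NOT NULL,
    route      TEXT NOT NULL,       -- e.g. inhalation, oral
    notes      TEXT                 -- free-text: unique study aspects
);
CREATE TABLE exposure_response (
    study_id   INTEGER REFERENCES study(study_id),
    dose       REAL NOT NULL,       -- controlled standard field
    dose_unit  TEXT NOT NULL,
    n_exposed  INTEGER NOT NULL,
    n_affected INTEGER NOT NULL
);
""")
con.execute("INSERT INTO study VALUES (1, 'Doe 2005', 'rat', 'inhalation', NULL)")
con.executemany("INSERT INTO exposure_response VALUES (1, ?, 'mg/m3', 10, ?)",
                [(0.0, 0), (5.0, 2), (25.0, 7)])
rows = con.execute("SELECT dose, n_affected FROM exposure_response "
                   "WHERE study_id = 1 ORDER BY dose").fetchall()
```

Dose-response analyses (e.g. benchmark-dose fitting) can then be automated directly against the `exposure_response` table, which is the kind of automation the abstract emphasizes.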

  12. Accurate theoretical chemistry with coupled pair models.

    PubMed

    Neese, Frank; Hansen, Andreas; Wennmohs, Frank; Grimme, Stefan

    2009-05-19

    Quantum chemistry has found its way into the everyday work of many experimental chemists. Calculations can predict the outcome of chemical reactions, afford insight into reaction mechanisms, and be used to interpret structure and bonding in molecules. Thus, contemporary theory offers tremendous opportunities in experimental chemical research. However, even with present-day computers and algorithms, we cannot solve the many-particle Schrödinger equation exactly; inevitably some error is introduced in approximating the solutions of this equation. Thus, the accuracy of quantum chemical calculations is of critical importance. The affordable accuracy depends on molecular size and particularly on the total number of atoms: for orientation, ethanol has 9 atoms, aspirin 21 atoms, morphine 40 atoms, sildenafil 63 atoms, paclitaxel 113 atoms, insulin nearly 800 atoms, and quaternary hemoglobin almost 12,000 atoms. Currently, molecules with up to approximately 10 atoms can be very accurately studied by coupled cluster (CC) theory, approximately 100 atoms with second-order Møller-Plesset perturbation theory (MP2), approximately 1000 atoms with density functional theory (DFT), and beyond that number with semiempirical quantum chemistry and force-field methods. The overwhelming majority of present-day calculations in the 100-atom range use DFT. Although these methods have been very successful in quantum chemistry, they do not offer a well-defined hierarchy of calculations that allows one to systematically converge to the correct answer. Recently a number of rather spectacular failures of DFT methods have been found, even for seemingly simple systems such as hydrocarbons, fueling renewed interest in wave function-based methods that incorporate the relevant physics of electron correlation in a more systematic way. Thus, it would be highly desirable to fill the gap between 10 and 100 atoms with highly correlated ab initio methods. We have found that one of the earliest (and now

  13. FIM measurement properties and Rasch model details.

    PubMed

    Wright, B D; Linacre, J M; Smith, R M; Heinemann, A W; Granger, C V

    1997-12-01

    To summarize, we take issue with the criticisms of Dickson & Köhler for two main reasons: 1. Rasch analysis provides a model from which to approach the analysis of the FIM, an ordinal scale, as an interval scale. The existence of examples of items or individuals which do not fit the model does not disprove the overall efficacy of the model; and 2. the principal components analysis of FIM motor items as presented by Dickson & Köhler tends to undermine rather than support their argument. Their own analyses produce a single major factor explaining between 58.5 and 67.1% of the variance, depending upon the sample, with secondary factors explaining much less variance. Finally, analysis of item response, or latent trait, is a powerful method for understanding the meaning of a measure. However, it presumes that item scores are accurate. Another concern is that Dickson & Köhler do not address the issue of reliability of scoring the FIM items on which they report, a critical point in comparing results. The Uniform Data System for Medical Rehabilitation (UDSMRSM) expends extensive effort in the training of clinicians of subscribing facilities to score items accurately. This is followed up with a credentialing process. Phase 1 involves the testing of individual clinicians who are submitting data to determine if they have achieved mastery over the use of the FIM instrument. Phase 2 involves examining the data for outlying values. When Dickson & Köhler investigate more carefully the application of the Rasch model to their FIM data, they will discover that the results presented in their paper support rather than contradict their application of the Rasch model! This paper is typical of supposed refutations of Rasch model applications. Dickson & Köhler will find that idiosyncrasies in their data and misunderstandings of the Rasch model are the only basis for a claim to have disproven the relevance of the model to FIM data. The Rasch model is a mathematical theorem (like
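
For readers unfamiliar with the model under discussion, the dichotomous Rasch model expresses the probability of success as a logistic function of the difference between person ability and item difficulty; it is this logit difference that places ordinal responses on an interval scale. (FIM items are scored polytomously in practice; this is the simplest illustrative form.)

```python
import math

# Dichotomous Rasch model: probability that a person with ability theta
# succeeds on an item of difficulty b. The quantity (theta - b) is the
# interval-scale logit underlying the ordinal observations.
def rasch_probability(theta, b):
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals difficulty, success probability is exactly 0.5.
print(rasch_probability(1.0, 1.0))  # 0.5
```

Fitting the model estimates the abilities and difficulties jointly from the response matrix; item or person misfit is then judged against the model's expected probabilities, which is the sense in which isolated misfitting items "do not disprove the overall efficacy of the model."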

  15. Accurate calculation of field and carrier distributions in doped semiconductors

    NASA Astrophysics Data System (ADS)

    Yang, Wenji; Tang, Jianping; Yu, Hongchun; Wang, Yanguo

    2012-06-01

    We use the numerical squeezing algorithm (NSA) combined with the shooting method to accurately calculate the built-in fields and carrier distributions in doped silicon films (SFs) in the micron and sub-micron thickness range, and results are presented in graphical form for a variety of doping profiles under different boundary conditions. As a complementary approach, we also present the methods and results for the inverse problem (IVP): finding the doping profile in the SFs for a given field distribution. The solution of the IVP provides an approach to arbitrarily design the field distribution in SFs, which is very important for low-dimensional (LD) systems and device design. Furthermore, the solution of the IVP is both direct and straightforward for one-, two-, and three-dimensional semiconductor systems. With current efforts focused on LD physics, knowledge of the field and carrier distribution details in LD systems will facilitate further research on other aspects, and hence the current work provides a platform for that research.
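
The shooting-method component can be sketched generically: integrate the 1D ODE from one boundary with a guessed initial slope, then bisect on that slope until the far boundary condition is met. The ODE, boundary values and simple Euler integrator below are illustrative, not the paper's NSA formulation:

```python
# Generic shooting method for a two-point boundary-value problem
# u''(x) = f(x), u(0) = 0, u(L) = target: guess the initial slope,
# integrate forward, and bisect on the slope until u(L) matches.
def integrate(slope, f, L=1.0, n=1000):
    x, h = 0.0, L / n
    u, du = 0.0, slope              # u(0) = 0, u'(0) = guessed slope
    for _ in range(n):
        u  += h * du                # simple forward-Euler step
        du += h * f(x)
        x  += h
    return u                        # u(L) for this slope guess

def shoot(f, target, lo=-10.0, hi=10.0, tol=1e-9):
    """Bisect on the initial slope so that u(L) == target."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if (integrate(lo, f) - target) * (integrate(mid, f) - target) <= 0:
            hi = mid
        else:
            lo = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# u'' = 2 with u(0)=0, u(1)=1 has the exact solution u = x^2, so u'(0)=0.
slope = shoot(lambda x: 2.0, target=1.0)
```

For a Poisson-type field calculation the right-hand side would come from the doping and carrier densities, and a higher-order integrator would replace the Euler step; the bracketing logic is unchanged.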

  16. Automatically Generated, Anatomically Accurate Meshes for Cardiac Electrophysiology Problems

    PubMed Central

    Prassl, Anton J.; Kickinger, Ferdinand; Ahammer, Helmut; Grau, Vicente; Schneider, Jürgen E.; Hofer, Ernst; Vigmond, Edward J.; Trayanova, Natalia A.

    2010-01-01

    Significant advancements in imaging technology and the dramatic increase in computer power over the last few years broke the ground for the construction of anatomically realistic models of the heart at an unprecedented level of detail. To effectively make use of high-resolution imaging datasets for modeling purposes, the imaged objects have to be discretized. This procedure is trivial for structured grids. However, to develop generally applicable heart models, unstructured grids are much preferable. In this study, a novel image-based unstructured mesh generation technique is proposed. It uses the dual mesh of an octree applied directly to segmented 3-D image stacks. The method produces conformal, boundary-fitted, and hexahedra-dominant meshes. The algorithm operates fully automatically with no requirements for interactivity and generates accurate volume-preserving representations of arbitrarily complex geometries with smooth surfaces. The method is very well suited for cardiac electrophysiological simulations. In the myocardium, the algorithm minimizes variations in element size, whereas in the surrounding medium, the element size is grown larger with the distance to the myocardial surfaces to reduce the computational burden. The numerical feasibility of the approach is demonstrated by discretizing and solving the monodomain and bidomain equations on the generated grids for two preparations of high experimental relevance, a left ventricular wedge preparation, and a papillary muscle. PMID:19203877

  17. Generating Facial Expressions Using an Anatomically Accurate Biomechanical Model.

    PubMed

    Wu, Tim; Hung, Alice; Mithraratne, Kumar

    2014-11-01

    This paper presents a computational framework for modelling the biomechanics of human facial expressions. A detailed high-order (Cubic-Hermite) finite element model of the human head was constructed using anatomical data segmented from magnetic resonance images. The model includes a superficial soft-tissue continuum consisting of skin, the subcutaneous layer and the superficial Musculo-Aponeurotic system. Embedded within this continuum mesh, are 20 pairs of facial muscles which drive facial expressions. These muscles were treated as transversely-isotropic and their anatomical geometries and fibre orientations were accurately depicted. In order to capture the relative composition of muscles and fat, material heterogeneity was also introduced into the model. Complex contact interactions between the lips, eyelids, and between superficial soft tissue continuum and deep rigid skeletal bones were also computed. In addition, this paper investigates the impact of incorporating material heterogeneity and contact interactions, which are often neglected in similar studies. Four facial expressions were simulated using the developed model and the results were compared with surface data obtained from a 3D structured-light scanner. Predicted expressions showed good agreement with the experimental data.

  18. Fine Details of the Icy Surface of Ganymede

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Dramatic view of fine details in ice hills and valleys in an unnamed region on Jupiter's moon Ganymede. North is to the top of the picture and the sun illuminates the surface from the left. The finest details that can be discerned in this picture are only 11 meters across (similar to the size of an average house) some 2000 times better than previous images of this region. The bright areas in the left hand version are the sides of hills facing the sun; the dark areas are shadows. In the right hand version the processing has been changed to bring out details in the shadowed regions that are illuminated by the bright hillsides. The brightness of some of the hillsides is so high that the picture elements 'spill over' down the columns of the picture. The image was taken on June 28, 1996 from a distance of about 1000 kilometers. The Jet Propulsion Laboratory, Pasadena, CA manages the mission for NASA's Office of Space Science, Washington, DC. This image and other images and data received from Galileo are posted on the World Wide Web, on the Galileo mission home page at URL http://galileo.jpl.nasa.gov. Background information and educational context for the images can be found at URL http://www.jpl.nasa.gov/galileo/sepo

  19. Detailed assays conducted on Vietnamese crude oils

    SciTech Connect

    Du, P.Q. )

    1990-07-16

    More oil property data, in the form of recent crude oil assays, have been made available for two Vietnamese crude oils, Bach Ho (White Tiger) and Dai Hung (Big Bear). Crude oil data presented earlier gave limited properties of the crudes, which are from the Miocene formations. Further analyses have been conducted on Bach Ho crude from the Oligocene formations. Production from the Oligocene is far more representative of the oils produced from the Bach Ho field and marketed worldwide. Currently, Bach Ho is the only producing field. Dai Hung is expected to be in production during the next few years. Bach Ho is currently producing at the rate of 20,000 b/d. That figure is projected to grow to 100,000 b/d by 1992 and to 120,000 b/d by 1995. Detailed assays of both crude oils are presented.

  20. Most Detailed Image of the Crab Nebula

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This new Hubble image -- one among the largest ever produced with the Earth-orbiting observatory -- shows the most detailed view of the entire Crab Nebula made so far. The Crab is arguably the single most interesting object, as well as one of the most studied, in all of astronomy. The image is the largest ever taken with Hubble's WFPC2 workhorse camera.

    The Crab Nebula is one of the most intricately structured and highly dynamical objects ever observed. The new Hubble image of the Crab was assembled from 24 individual exposures taken with the NASA/ESA Hubble Space Telescope and is the highest resolution image of the entire Crab Nebula ever made.

  1. Detailed mechanism for oxidation of benzene

    NASA Technical Reports Server (NTRS)

    Bittker, David A.

    1990-01-01

    A detailed mechanism for the oxidation of benzene is presented and used to compute experimentally obtained concentration profiles and ignition delay times over a wide range of equivalence ratio and temperature. The computed results agree qualitatively with all the experimental trends. Quantitative agreement is obtained with several of the composition profiles and for the temperature dependence of the ignition delay times. There are indications, however, that some important reactions are as yet undiscovered in this mechanism. Recent literature expressions have been used for the rate coefficients of most important reactions, except for some involving phenol. The discrepancy between the phenol pyrolysis rate coefficient used in this work and a recent literature expression remains to be explained.

  2. Picornavirus uncoating intermediate captured in atomic detail

    PubMed Central

    Ren, Jingshan; Wang, Xiangxi; Hu, Zhongyu; Gao, Qiang; Sun, Yao; Li, Xuemei; Porta, Claudine; Walter, Thomas S.; Gilbert, Robert J.; Zhao, Yuguang; Axford, Danny; Williams, Mark; McAuley, Katherine; Rowlands, David J.; Yin, Weidong; Wang, Junzhi; Stuart, David I.; Rao, Zihe; Fry, Elizabeth E.

    2013-01-01

    It remains largely mysterious how the genomes of non-enveloped eukaryotic viruses are transferred across a membrane into the host cell. Picornaviruses are simple models for such viruses, and initiate this uncoating process through particle expansion, which reveals channels through which internal capsid proteins and the viral genome presumably exit the particle, although this has not been clearly seen until now. Here we present the atomic structure of an uncoating intermediate for the major human picornavirus pathogen CAV16, which reveals VP1 partly extruded from the capsid, poised to embed in the host membrane. Together with previous low-resolution results, we are able to propose a detailed hypothesis for the ordered egress of the internal proteins, using two distinct sets of channels through the capsid, and suggest a structural link to the condensed RNA within the particle, which may be involved in triggering RNA release. PMID:23728514

  3. Capture barrier distributions: Some insights and details

    SciTech Connect

    Rowley, N.; Grar, N.; Trotta, M.

    2007-10-15

    The 'experimental barrier distribution' provides a parameter-free representation of experimental heavy-ion capture cross sections that highlights the effects of entrance-channel couplings. Its relation to the s-wave transmission is discussed, and in particular it is shown how the full capture cross section can be generated from an l=0 coupled-channels calculation. Furthermore, it is shown how this transmission can be simply exploited in calculations of quasifission and evaporation-residue cross sections. The system {sup 48}Ca+{sup 154}Sm is studied in detail. A calculation of the compound-nucleus spin distribution reveals a possible energy dependence of barrier weights due to polarization arising from target and projectile quadrupole phonon states; this effect also gives rise to an entrance-channel 'extra-push'.
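
The "experimental barrier distribution" referred to above is conventionally D(E) = d²(Eσ)/dE², evaluated by finite differences on measured capture cross sections. A sketch with a classical sharp-barrier cross section (illustrative, not the ⁴⁸Ca+¹⁵⁴Sm data):

```python
import numpy as np

# Barrier distribution D(E) = d^2(E*sigma)/dE^2 by second finite
# differences on an evenly spaced energy grid. The cross-section model
# below (classical single sharp barrier) is illustrative only; for it,
# E*sigma is piecewise linear with a kink at E = B, so D(E) peaks at B.
def barrier_distribution(E, sigma):
    """Second finite difference of E*sigma; valid at E[1:-1]."""
    Es = E * sigma
    dE = E[1] - E[0]
    return (Es[2:] - 2 * Es[1:-1] + Es[:-2]) / dE**2

# Classical sharp-barrier form: sigma = pi R^2 (1 - B/E) for E > B
E = np.linspace(40.0, 60.0, 201)          # energy grid (MeV)
B, piR2 = 50.0, 1000.0                    # barrier height (MeV), pi*R^2 (mb)
sigma = np.where(E > B, piR2 * (1 - B / E), 0.0)
D = barrier_distribution(E, sigma)        # sharply peaked at E = B
```

With coupled channels, the single barrier splits into several, and D(E) extracted this way from data shows the corresponding multi-peaked structure that the abstract exploits.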

  4. A Look Inside: MRI Shows the Detail

    ERIC Educational Resources Information Center

    Gosman, Derek; Rose, Mary Annette

    2015-01-01

    Understanding the advantages, risks, and financial costs of medical technology is one way that technologically literate citizens can make better-informed decisions regarding their health and medical care. A cascade of advancements in medical imaging technologies (Ulmer & Jansen 2010) offers an exciting backdrop from which to help students…

  5. Generation and Memory for Contextual Detail

    ERIC Educational Resources Information Center

    Mulligan, Neil W.

    2004-01-01

    Generation enhances item memory but may not enhance other aspects of memory. In 12 experiments, the author investigated the effect of generation on context memory, motivated in part by the hypothesis that generation produces a trade-off in encoding item and contextual information. Participants generated some study words (e.g., hot-___) and read…

  6. Hubble Captures Detailed Image of Uranus' Atmosphere

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Hubble Space Telescope has peered deep into Uranus' atmosphere to see clear and hazy layers created by a mixture of gases. Using infrared filters, Hubble captured detailed features of three layers of Uranus' atmosphere.

    Hubble's images are different from the ones taken by the Voyager 2 spacecraft, which flew by Uranus 10 years ago. Those images - not taken in infrared light - showed a greenish-blue disk with very little detail.

    The infrared image allows astronomers to probe the structure of Uranus' atmosphere, which consists of mostly hydrogen with traces of methane. The red around the planet's edge represents a very thin haze at a high altitude. The haze is so thin that it can only be seen by looking at the edges of the disk, and is similar to looking at the edge of a soap bubble. The yellow near the bottom of Uranus is another hazy layer. The deepest layer, the blue near the top of Uranus, shows a clearer atmosphere.

    Image processing has been used to brighten the rings around Uranus so that astronomers can study their structure. In reality, the rings are as dark as black lava or charcoal.

    This false color picture was assembled from several exposures taken July 3, 1995 by the Wide Field Planetary Camera-2.

    The Wide Field/Planetary Camera 2 was developed by the Jet Propulsion Laboratory and managed by the Goddard Space Flight Center for NASA's Office of Space Science.

    This image and other images and data received from the Hubble Space Telescope are posted on the World Wide Web on the Space Telescope Science Institute home page at URL http://oposite.stsci.edu/pubinfo/

  7. Effects of Trainer Expressiveness, Seductive Details, and Trainee Goal Orientation on Training Outcomes

    ERIC Educational Resources Information Center

    Towler, Annette

    2009-01-01

    This study focuses on trainer expressiveness and trainee mastery orientation within the context of the seductive details effect. The seductive details effect refers to inclusion of "highly interesting and entertaining information that is only tangentially related to the topic" (Harp & Mayer, 1998, p. 1). One hundred thirty-two participants…

  8. Emplacement of Long Lava Flows: Detailed Topography of the Carrizozo Basalt Lava Flow, New Mexico

    NASA Technical Reports Server (NTRS)

    Zimbelman, J. R.; Johnston, A. K.

    2000-01-01

    The Carrizozo flow in south-central New Mexico was examined to obtain detailed topography for a long basaltic lava flow. This information will be helpful in evaluating emplacement models for long lava flows.

  9. Interactive NCORP Map Details Community Research Sites | Division of Cancer Prevention

    Cancer.gov

    An interactive map of the NCI Community Oncology Research Program (NCORP) with detailed information on hundreds of community sites that take part in clinical trials is available on the NCORP website.

  10. Approaches for classifying the indications for colonoscopy using detailed clinical data

    PubMed Central

    2014-01-01

    estimates from only referral notes (% difference in coefficients = 34.9%, p-value = 0.12) or procedure reports (% difference in coefficients = 27.4%, p-value = 0.23). Conclusion There was no single gold-standard source of information in medical records. The estimates of colonoscopy effectiveness from progress notes alone were the closest to estimates using adjudicated indications. Thus, the details in the medical records are necessary for accurate indication classification. PMID:24529031

  11. Seismic Waves, 4th order accurate

    SciTech Connect

    2013-08-16

    SW4 is a program for simulating seismic wave propagation on parallel computers. SW4 solves the seismic wave equations in Cartesian coordinates. It is therefore appropriate for regional simulations, where the curvature of the earth can be neglected. SW4 implements a free surface boundary condition on a realistic topography, absorbing super-grid conditions on the far-field boundaries, and a kinematic source model consisting of point force and/or point moment tensor source terms. SW4 supports a fully 3-D heterogeneous material model that can be specified in several formats. SW4 can output synthetic seismograms in an ASCII text format, or in the SAC binary format. It can also present simulation information as GMT scripts, which can be used to create annotated maps. Furthermore, SW4 can output the solution as well as the material model along 2-D grid planes.

  13. Robust High-Resolution Cloth Using Parallelism, History-Based Collisions and Accurate Friction

    PubMed Central

    Selle, Andrew; Su, Jonathan; Irving, Geoffrey; Fedkiw, Ronald

    2015-01-01

    In this paper we simulate high-resolution cloth consisting of up to 2 million triangles, which allows us to achieve highly detailed folds and wrinkles. Since the level of detail is also influenced by object collision and self-collision, we propose a more accurate model for cloth-object friction. We also propose a robust history-based repulsion/collision framework where repulsions are treated accurately and efficiently on a per-time-step basis. Distributed memory parallelism is used for both time evolution and collisions, and we specifically address Gauss-Seidel ordering of the repulsion/collision response. This algorithm is demonstrated by several high-resolution and high-fidelity simulations. PMID:19147895

  14. A data-management system for detailed areal interpretive data

    USGS Publications Warehouse

    Ferrigno, C.F.

    1986-01-01

    A data storage and retrieval system has been developed to organize and preserve areal interpretive data. This system can be used by any study where there is a need to store areal interpretive data that generally is presented in map form. This system provides the capability to grid areal interpretive data for input to groundwater flow models at any spacing and orientation. The data storage and retrieval system is designed to be used for studies that cover small areas such as counties. The system is built around a hierarchically structured data base consisting of related latitude-longitude blocks. The information in the data base can be stored at different levels of detail, with the finest detail being a block of 6 sec of latitude by 6 sec of longitude (approximately 0.01 sq mi). This system was implemented on a mainframe computer using a hierarchical data base management system. The computer programs are written in Fortran IV and PL/1. The design and capabilities of the data storage and retrieval system, and the computer programs that are used to implement the system are described. Supplemental sections contain the data dictionary, user documentation of the data-system software, changes that would need to be made to use this system for other studies, and information on the computer software tape. (Lantz-PTT)
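    The hierarchical latitude-longitude blocking described above can be sketched in a few lines. The 6-second finest block size comes from the record; the key format, function names, and the factor-of-10 coarsening level are illustrative assumptions, not the USGS system's actual scheme.

    ```python
    # Sketch of hierarchical latitude-longitude block keys (assumption: key
    # format and coarsening factor are illustrative, not the USGS system's).

    BLOCK_ARCSEC = 6  # finest block: 6 sec of latitude by 6 sec of longitude

    def block_key(lat_deg: float, lon_deg: float, arcsec: int = BLOCK_ARCSEC) -> tuple:
        """Return the (row, col) index of the finest block containing a point."""
        row = int((lat_deg * 3600) // arcsec)
        col = int((lon_deg * 3600) // arcsec)
        return row, col

    def parent_key(key: tuple, factor: int = 10) -> tuple:
        """Coarser-level key: groups factor x factor fine blocks into one."""
        row, col = key
        return row // factor, col // factor

    k = block_key(35.1234, -106.6543)
    print(k, parent_key(k))
    ```

    Interpolating such block values onto a model grid of arbitrary spacing and orientation then reduces to looking up the block key for each grid node.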

  15. Towards a detailed soot model for internal combustion engines

    SciTech Connect

    Mosbach, Sebastian; Celnik, Matthew S.; Raj, Abhijeet; Kraft, Markus; Zhang, Hongzhi R.; Kubo, Shuichi; Kim, Kyoung-Oh

    2009-06-15

    In this work, we present a detailed model for the formation of soot in internal combustion engines describing not only bulk quantities such as soot mass, number density, volume fraction, and surface area but also the morphology and chemical composition of soot aggregates. The new model is based on the Stochastic Reactor Model (SRM) engine code, which uses detailed chemistry and takes into account convective heat transfer and turbulent mixing; soot formation is accounted for by SWEEP, a population balance solver based on a Monte Carlo method. In order to couple the gas phase to the particulate phase, a detailed chemical kinetic mechanism describing the combustion of Primary Reference Fuels (PRFs) is extended to include small Polycyclic Aromatic Hydrocarbons (PAHs) such as pyrene, which function as soot precursor species for particle inception in the soot model. Apart from providing averaged quantities as functions of crank angle (for example, soot mass, volume fraction, aggregate diameter, and the number of primary particles per aggregate), the integrated model also gives detailed information such as aggregate and primary particle size distribution functions. In addition, specifics about aggregate structure and composition, including C/H ratio and PAH ring count distributions, and images similar to those produced with Transmission Electron Microscopes (TEMs), can be obtained. The new model is applied to simulate an n-heptane fuelled Homogeneous Charge Compression Ignition (HCCI) engine which is operated at an equivalence ratio of 1.93. In-cylinder pressure and heat release predictions show satisfactory agreement with measurements. Furthermore, simulated aggregate size distributions as well as their time evolution are found to qualitatively agree with those obtained experimentally through snatch sampling. It is also observed both in the experiment as well as in the simulation that aggregates in the trapped residual gases play a vital role in the soot

  16. Teacher Referral Information and Statistical Information Forms

    ERIC Educational Resources Information Center

    Short, N. J.

    This rating information form, used to refer children to the PIC program, elicits information concerning the child's emotional, cognitive, and personality development. See TM 001 111 for details of the program in which it is used. (DLG)

  17. Gigantic Cosmic Corkscrew Reveals New Details About Mysterious Microquasar

    NASA Astrophysics Data System (ADS)

    2004-10-01

    [Image captions: SS 433 with a red-and-blue line showing the path of constant-speed jets (note the poor match of the path to the image); the same image with colored beads representing particle ejections at different speeds (the particle path now matches). Credit: Blundell & Bowler, NRAO/AUI/NSF.] The new VLA image shows two full turns of the jets' corkscrew on both sides of the core. Analyzing the image showed that if material came from the core at a constant speed, the jet paths would not accurately match the details of the image. "By simulating ejections at varying speeds, we were able to produce an exact match to the observed structure," Blundell explained. The scientists first did their match to one of the jets. "We then were stunned to see that the varying speeds that matched the structure of one jet also exactly reproduced the other jet's path," Blundell said. Matching the speeds in the two jets reproduced the observed structure even allowing for the fact that, because one jet is moving more nearly away from us than the other, it takes light longer to reach us from it, she added. The astrophysicists speculate that the changes in ejection speed may be caused by changes in the rate at which material is transferred from the companion star onto the accretion disk. The detailed new VLA image also allowed the astrophysicists to determine that SS 433 is nearly 18,000 light-years distant from Earth. Earlier estimates had the object, in the constellation Aquila, as near as 10,000 light-years. An accurate distance, the scientists said, now allows them to better determine the age of the shell of debris blown out by the supernova explosion that created the dense, compact object in the microquasar. Knowing the distance accurately also allows them to measure the actual brightness of the microquasar's components, and this, they said, improves their understanding of the physical processes at work in the system.
The breakthrough image…

  18. An accurate dynamical electron diffraction algorithm for reflection high-energy electron diffraction

    NASA Astrophysics Data System (ADS)

    Huang, J.; Cai, C. Y.; Lv, C. L.; Zhou, G. W.; Wang, Y. G.

    2015-12-01

    The conventional multislice (CMS) method, one of the most popular dynamical electron diffraction calculation procedures in transmission electron microscopy, was introduced to calculate reflection high-energy electron diffraction (RHEED) as it is well adapted to deal with deviations from periodicity in the direction parallel to the surface. However, in the present work, we show that the CMS method is no longer sufficiently accurate for simulating RHEED at accelerating voltages of 3-100 kV because of the high-energy approximation. An accurate multislice (AMS) method can be an alternative for more accurate RHEED calculations with reasonable computing time. A detailed comparison of the numerical calculations of the AMS and CMS methods is carried out with respect to different accelerating voltages, surface structure models, Debye-Waller factors and glancing angles.

  19. Reference module selection criteria for accurate testing of photovoltaic (PV) panels

    SciTech Connect

    Roy, J.N.; Gariki, Govardhan Rao; Nagalakhsmi, V.

    2010-01-15

    It is shown that for accurate testing of PV panels the correct selection of reference modules is important. A detailed description of the test methodology is given. Three different types of reference modules, having different I{sub SC} (short-circuit current) and power (in Wp), have been used for this study. These reference modules have been calibrated by NREL. It has been found that for accurate testing, both the I{sub SC} and the power of the reference module must be similar to, or exceed, those of the modules under test. In case the corresponding values of the test modules are less than a particular limit, the measurements may not be accurate. The experimental results obtained have been modeled by using a simple equivalent circuit model and the associated I-V equations. (author)
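    The record mentions a "simple equivalent circuit model and associated I-V equations" without reproducing them. A minimal sketch of the widely used ideal single-diode model is shown below; this is my assumption of the kind of model meant, not necessarily the authors' exact formulation, and all parameter values are illustrative.

    ```python
    import math

    def pv_current(v, i_l=8.0, i_0=1e-9, n=1.3, cells=60, t=298.15):
        """Ideal single-diode PV model: I = I_L - I_0*(exp(V/(n*cells*Vt)) - 1).
        i_l: light-generated (short-circuit) current [A]; i_0: diode saturation
        current [A]; n: ideality factor; cells: series-connected cells;
        t: cell temperature [K]. Series and shunt resistances are neglected
        in this sketch."""
        k_b, q = 1.380649e-23, 1.602176634e-19  # Boltzmann const., electron charge
        vt = k_b * t / q  # thermal voltage, ~25.7 mV at 298 K
        return i_l - i_0 * (math.exp(v / (n * cells * vt)) - 1.0)

    # At V = 0 the model returns the short-circuit current I_SC = I_L.
    print(pv_current(0.0))  # 8.0
    ```

    In this picture, matching a reference module means choosing one whose I_SC (the V = 0 intercept) and peak power are at least those of the module under test.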

  20. Novel accurate and scalable 3-D MT forward solver based on a contracting integral equation method

    NASA Astrophysics Data System (ADS)

    Kruglyakov, M.; Geraskin, A.; Kuvshinov, A.

    2016-11-01

    We present a novel, open source 3-D MT forward solver based on a method of integral equations (IE) with contracting kernel. Special attention in the solver is paid to accurate calculations of Green's functions and their integrals which are cornerstones of any IE solution. The solver supports massive parallelization and is able to deal with highly detailed and contrasting models. We report results of a 3-D numerical experiment aimed at analyzing the accuracy and scalability of the code.

  1. 77 FR 3800 - Accurate NDE & Inspection, LLC; Confirmatory Order

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-25

    ... COMMISSION Accurate NDE & Inspection, LLC; Confirmatory Order In the Matter of Accurate NDE & Docket: 150... request ADR with the NRC in an attempt to resolve issues associated with this matter. In response, on August 9, 2011, Accurate NDE requested ADR to resolve this matter with the NRC. On September 28,...

  2. Learning accurate very fast decision trees from uncertain data streams

    NASA Astrophysics Data System (ADS)

    Liang, Chunquan; Zhang, Yang; Shi, Peng; Hu, Zhengguo

    2015-12-01

    Most existing works on data stream classification assume the streaming data is precise and definite. Such assumption, however, does not always hold in practice, since data uncertainty is ubiquitous in data stream applications due to imprecise measurement, missing values, privacy protection, etc. The goal of this paper is to learn accurate decision tree models from uncertain data streams for classification analysis. On the basis of very fast decision tree (VFDT) algorithms, we proposed an algorithm for constructing an uncertain VFDT tree with classifiers at tree leaves (uVFDTc). The uVFDTc algorithm can exploit uncertain information effectively and efficiently in both the learning and the classification phases. In the learning phase, it uses Hoeffding bound theory to learn from uncertain data streams and yield fast and reasonable decision trees. In the classification phase, at tree leaves it uses uncertain naive Bayes (UNB) classifiers to improve the classification performance. Experimental results on both synthetic and real-life datasets demonstrate the strong ability of uVFDTc to classify uncertain data streams. The use of UNB at tree leaves has improved the performance of uVFDTc, especially the any-time property, the benefit of exploiting uncertain information, and the robustness against uncertainty.

  3. Accurate Satellite-Derived Estimates of Tropospheric Ozone Radiative Forcing

    NASA Technical Reports Server (NTRS)

    Joiner, Joanna; Schoeberl, Mark R.; Vasilkov, Alexander P.; Oreopoulos, Lazaros; Platnick, Steven; Livesey, Nathaniel J.; Levelt, Pieternel F.

    2008-01-01

    Estimates of the radiative forcing due to anthropogenically-produced tropospheric O3 are derived primarily from models. Here, we use tropospheric ozone and cloud data from several instruments in the A-train constellation of satellites as well as information from the GEOS-5 Data Assimilation System to accurately estimate the instantaneous radiative forcing from tropospheric O3 for January and July 2005. We improve upon previous estimates of tropospheric ozone mixing ratios from a residual approach using the NASA Earth Observing System (EOS) Aura Ozone Monitoring Instrument (OMI) and Microwave Limb Sounder (MLS) by incorporating cloud pressure information from OMI. Since we cannot distinguish between natural and anthropogenic sources with the satellite data, our estimates reflect the total forcing due to tropospheric O3. We focus specifically on the magnitude and spatial structure of the cloud effect on both the short- and long-wave radiative forcing. The estimates presented here can be used to validate present-day O3 radiative forcing produced by models.

  4. Personalized Orthodontic Accurate Tooth Arrangement System with Complete Teeth Model.

    PubMed

    Cheng, Cheng; Cheng, Xiaosheng; Dai, Ning; Liu, Yi; Fan, Qilei; Hou, Yulin; Jiang, Xiaotong

    2015-09-01

    Accuracy, validity, and the lack of positional information relating the dental root to the jaw are key problems in tooth arrangement technology. This paper describes a newly developed virtual, personalized and accurate tooth arrangement system based on complete information about the dental root and skull. Firstly, a feature constraint database of a 3D teeth model is established. Secondly, for computed simulation of tooth movement, reference planes and lines are defined from anatomical reference points. A mathematical model for matching tooth patterns and the principles of rigid-body pose transformation are fully utilized. The positional relation between the dental root and the alveolar bone is considered during the design process. Finally, the relative pose relationships among the various teeth are optimized using the object mover, and a personalized therapeutic schedule is formulated. Experimental results show that the virtual tooth arrangement system can arrange abnormal teeth very well and is sufficiently flexible. The positional relation between root and jaw is favorable. This newly developed system is characterized by high-speed processing and quantitative evaluation of the amount of 3D movement of an individual tooth.

  5. Some articulatory details of emotional speech

    NASA Astrophysics Data System (ADS)

    Lee, Sungbok; Yildirim, Serdar; Bulut, Murtaza; Kazemzadeh, Abe; Narayanan, Shrikanth

    2005-09-01

    Differences in speech articulation among four emotion types, neutral, anger, sadness, and happiness, are investigated by analyzing tongue tip, jaw, and lip movement data collected from one male and one female speaker of American English. The data were collected using an electromagnetic articulography (EMA) system while the subjects produced simulated emotional speech. Pitch, root-mean-square (rms) energy and the first three formants were estimated for vowel segments. For both speakers, angry speech exhibited the largest rms energy and the largest articulatory activity in terms of displacement range and movement speed. Happy speech is characterized by the largest pitch variability. It has higher rms energy than neutral speech, but its articulatory activity is comparable to, or less than, that of neutral speech. That is, happy speech is more prominent in voicing activity than in articulation. Sad speech exhibits the longest sentence durations and lower rms energy. However, its articulatory activity is no less than that of neutral speech. Interestingly, for the male speaker, articulation of vowels in sad speech is consistently more peripheral (i.e., more forwarded displacements) when compared to the other emotions. However, this does not hold for the female subject. These and other results will be discussed in detail with the associated acoustics and perceived emotional qualities. [Work supported by NIH.]
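    The per-frame rms energy measure compared across emotions above is standard in speech analysis; a minimal sketch follows, where the frame contents and frame length are illustrative, not the study's data.

    ```python
    import math

    def rms_energy(samples):
        """Root-mean-square energy of one frame of speech samples."""
        return math.sqrt(sum(s * s for s in samples) / len(samples))

    # Hypothetical 6-sample frame purely for illustration.
    frame = [0.0, 0.5, -0.5, 0.5, -0.5, 0.0]
    print(rms_energy(frame))
    ```

    In practice the signal is windowed into short overlapping frames (e.g. 20-30 ms) and rms energy is computed per frame, giving an energy contour comparable across utterances.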

  6. Details of tetrahedral anisotropic mesh adaptation

    NASA Astrophysics Data System (ADS)

    Jensen, Kristian Ejlebjerg; Gorman, Gerard

    2016-04-01

    We have implemented tetrahedral anisotropic mesh adaptation using the local operations of coarsening, swapping, refinement and smoothing in MATLAB without the use of any for-loops, i.e. the script is fully vectorised. In the process of doing so, we have made three observations related to details of the implementation: 1. restricting refinement to a single edge split per element not only simplifies the code, it also improves mesh quality, 2. face-to-edge swapping is unnecessary, and 3. optimising for the Vassilevski functional tends to give a slightly higher value for the mean condition number functional than optimising for the condition number functional directly. These observations have been made for a uniform and a radial shock metric field, both starting from a structured mesh in a cube. Finally, we compare two coarsening techniques and demonstrate the importance of applying smoothing in the mesh adaptation loop. The results pertain to a unit cube geometry, but we also show the effect of corners and edges by applying the implementation in a spherical geometry.

  7. Detailed Chemical Kinetic Modeling of Cyclohexane Oxidation

    SciTech Connect

    Silke, E J; Pitz, W J; Westbrook, C K; Ribaucour, M

    2006-11-10

    A detailed chemical kinetic mechanism has been developed and used to study the oxidation of cyclohexane at both low and high temperatures. Reaction rate constant rules are developed for the low temperature combustion of cyclohexane. These rules can be used in chemical kinetic mechanisms for other cycloalkanes. Since cyclohexane produces only one type of cyclohexyl radical, much of the low temperature chemistry of cyclohexane is described in terms of one potential energy diagram showing the reaction of cyclohexyl radical + O{sub 2} through five-, six- and seven-membered ring transition states. The direct elimination of cyclohexene and HO{sub 2} from RO{sub 2} is included in the treatment using a modified rate constant of Cavallotti et al. Published and unpublished data from the Lille rapid compression machine, as well as jet-stirred reactor data, are used to validate the mechanism. The effect of heat loss is included in the simulations, an improvement on previous studies of cyclohexane. Calculations indicated that the production of 1,2-epoxycyclohexane observed in the experiments cannot be simulated based on the current understanding of low temperature chemistry. Possible 'alternative' H-atom isomerizations leading to different products from the parent O{sub 2}QOOH radical were included in the low temperature chemical kinetic mechanism and were found to play a significant role.

  8. The purchasable chemical space: a detailed picture.

    PubMed

    Lucas, Xavier; Grüning, Björn A; Bleher, Stefan; Günther, Stefan

    2015-05-26

    The screening of a reduced yet diverse and synthesizable region of the chemical space is a critical step in drug discovery. The ZINC database is nowadays routinely used to freely access and screen millions of commercially available compounds. We collected ∼125 million compounds from chemical catalogs and the ZINC database, yielding more than 68 million unique molecules, including a large portion of described natural products (NPs) and drugs. The data set was filtered using advanced medicinal chemistry rules to remove potentially toxic, promiscuous, metabolically labile, or reactive compounds. We studied the physicochemical properties of this compilation and identified millions of NP-like, fragment-like, i-PPI-like (inhibitors of protein-protein interactions), and drug-like compounds. The related focused libraries were subjected to a detailed scaffold diversity analysis and compared to reference NPs and marketed drugs. This study revealed thousands of diverse chemotypes with distinct representations of building block combinations among the data sets. An analysis of the stereogenic and shape complexity properties of the libraries also showed that they present well-defined levels of complexity, following the tendency: i-PPI-like < drug-like < fragment-like < NP-like. As the collected compounds are of huge interest in drug discovery, and particularly in virtual screening and library design, we offer a freely available collection comprising over 37 million molecules at http://pbox.pharmaceutical-bioinformatics.org , as well as the filtering rules used to build the focused libraries described herein.

  9. Parabiosis in Mice: A Detailed Protocol

    PubMed Central

    Kamran, Paniz; Sereti, Konstantina-Ioanna; Zhao, Peng; Ali, Shah R.; Weissman, Irving L.; Ardehali, Reza

    2013-01-01

    Parabiosis is a surgical union of two organisms allowing sharing of the blood circulation. Attaching the skin of two animals promotes formation of microvasculature at the site of inflammation. Parabiotic partners share their circulating antigens and thus are free of adverse immune reactions. First described by Paul Bert in 1864 [1], the parabiosis surgery was refined by Bunster and Meyer in 1933 to improve animal survival [2]. In the current protocol, two mice are surgically joined following a modification of the Bunster and Meyer technique. Animals are connected through the elbow and knee joints followed by attachment of the skin, allowing firm support that prevents strain on the sutured skin. Herein, we describe in detail the parabiotic joining of a ubiquitously GFP-expressing mouse to a wild type (WT) mouse. Two weeks after the procedure, the pair is separated and GFP-positive cells can be detected by flow cytometric analysis in the blood circulation of the WT mouse. The blood chimerism allows one to examine the contribution of the circulating cells from one animal in the other. PMID:24145664

  10. Retinal Connectomics: Towards Complete, Accurate Networks

    PubMed Central

    Marc, Robert E.; Jones, Bryan W.; Watt, Carl B.; Anderson, James R.; Sigulinsky, Crystal; Lauritzen, Scott

    2013-01-01

    Connectomics is a strategy for mapping complex neural networks based on high-speed automated electron optical imaging, computational assembly of neural data volumes, web-based navigational tools to explore 10^12-10^15 byte (terabyte to petabyte) image volumes, and annotation and markup tools to convert images into rich networks with cellular metadata. These collections of network data and associated metadata, analyzed using tools from graph theory and classification theory, can be merged with classical systems theory, giving a more completely parameterized view of how biologic information processing systems are implemented in retina and brain. Networks have two separable features: topology and connection attributes. The first findings from connectomics strongly validate the idea that the topologies of complete retinal networks are far more complex than the simple schematics that emerged from classical anatomy. In particular, connectomics has permitted an aggressive refactoring of the retinal inner plexiform layer, demonstrating that network function cannot be simply inferred from stratification; exposing the complex geometric rules for inserting different cells into a shared network; revealing unexpected bidirectional signaling pathways between mammalian rod and cone systems; documenting selective feedforward systems, novel candidate signaling architectures, new coupling motifs, and the highly complex architecture of the mammalian AII amacrine cell. This is but the beginning, as the underlying principles of connectomics are readily transferrable to non-neural cell complexes and provide new contexts for assessing intercellular communication. PMID:24016532

  11. Chromatin States Accurately Classify Cell Differentiation Stages

    PubMed Central

    Larson, Jessica L.; Yuan, Guo-Cheng

    2012-01-01

    Gene expression is controlled by the concerted interactions between transcription factors and chromatin regulators. While recent studies have identified global chromatin state changes across cell-types, it remains unclear to what extent these changes are co-regulated during cell-differentiation. Here we present a comprehensive computational analysis by assembling a large dataset containing genome-wide occupancy information of 5 histone modifications in 27 human cell lines (including 24 normal and 3 cancer cell lines) obtained from the public domain, followed by independent analysis at three different representations. We classified the differentiation stage of a cell-type based on its genome-wide pattern of chromatin states, and found that our method was able to identify normal cell lines with nearly 100% accuracy. We then applied our model to classify the cancer cell lines and found that each can be unequivocally classified as differentiated cells. The differences can be in part explained by the differential activities of three regulatory modules associated with embryonic stem cells. We also found that the “hotspot” genes, whose chromatin states change dynamically in accordance to the differentiation stage, are not randomly distributed across the genome but tend to be embedded in multi-gene chromatin domains, and that specialized gene clusters tend to be embedded in stably occupied domains. PMID:22363642

  12. Retinal connectomics: towards complete, accurate networks.

    PubMed

    Marc, Robert E; Jones, Bryan W; Watt, Carl B; Anderson, James R; Sigulinsky, Crystal; Lauritzen, Scott

    2013-11-01

    Connectomics is a strategy for mapping complex neural networks based on high-speed automated electron optical imaging, computational assembly of neural data volumes, web-based navigational tools to explore 10(12)-10(15) byte (terabyte to petabyte) image volumes, and annotation and markup tools to convert images into rich networks with cellular metadata. These collections of network data and associated metadata, analyzed using tools from graph theory and classification theory, can be merged with classical systems theory, giving a more completely parameterized view of how biologic information processing systems are implemented in retina and brain. Networks have two separable features: topology and connection attributes. The first findings from connectomics strongly validate the idea that the topologies of complete retinal networks are far more complex than the simple schematics that emerged from classical anatomy. In particular, connectomics has permitted an aggressive refactoring of the retinal inner plexiform layer, demonstrating that network function cannot be simply inferred from stratification; exposing the complex geometric rules for inserting different cells into a shared network; revealing unexpected bidirectional signaling pathways between mammalian rod and cone systems; documenting selective feedforward systems, novel candidate signaling architectures, new coupling motifs, and the highly complex architecture of the mammalian AII amacrine cell. This is but the beginning, as the underlying principles of connectomics are readily transferrable to non-neural cell complexes and provide new contexts for assessing intercellular communication. PMID:24016532

  13. Locating PD in Transformers through Detailed Model and Neural Networks

    NASA Astrophysics Data System (ADS)

    Nafisi, Hamed; Abedi, Mehrdad; Gharehpetian, Gevorg B.

    2014-03-01

    In a power transformer, one of the major components of electric power networks, partial discharge (PD) is a major source of insulation failure. Accurate, high-speed techniques for locating PD sources are therefore required for repair and maintenance. In this paper an attempt has been made to introduce novel methods based on two different artificial neural networks (ANNs) for identifying PD location in power transformers. Fuzzy ARTmap and Bayesian neural networks are employed for PD locating, using a detailed model (DM) of a power transformer for simulation purposes. The PD phenomenon is implemented at different points of the transformer winding using a three-capacitor model. An impulse test is then applied to the transformer terminals, and the current produced at the neutral point is used for training and testing the employed ANNs. In practice, the obtained current signals include noise components, so the performance of the Fuzzy ARTmap and Bayesian networks in correctly identifying PD location from the detected currents under noisy conditions is also investigated. In this paper an RBF learning procedure is used for the Bayesian network, while the Markov chain Monte Carlo (MCMC) method is employed for training the Fuzzy ARTmap network for locating PD in a power transformer winding, and the results are compared.

  14. Detailed Evaluation of MODIS Fire Radiative Power Measurements

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles

    2010-01-01

    Satellite remote sensing provides tremendous opportunities to measure the fire radiative energy (FRE) release rate, or power (FRP), from open biomass burning, which affects many vegetated regions of the world on a seasonal basis. Knowledge of the biomass burning characteristics and emission source strengths of different (particulate and gaseous) smoke constituents is one of the principal ingredients upon which the assessment, modeling, and forecasting of their distribution and impacts depend. This knowledge can be gained through accurate measurement of FRP, which has been shown to have a direct relationship with the rates of biomass consumption and emissions of major smoke constituents. Over the last decade or so, FRP has been routinely measured from space by both the MODIS sensors aboard the polar-orbiting Terra and Aqua satellites, and the SEVIRI sensor aboard the Meteosat Second Generation (MSG) geostationary satellite. During the last few years, FRP has been gaining recognition as an important parameter for facilitating the development of various scientific studies relating to the quantitative characterization of biomass burning and its emissions. Therefore, we are conducting a detailed analysis of the FRP products from MODIS to characterize the uncertainties associated with them, such as those due to the MODIS bow-tie effects and other factors, in order to establish their error budget for use in scientific research and applications. In this presentation, we will show preliminary results of the MODIS FRP data analysis, including comparisons with airborne measurements.
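    The "direct relationship" between FRP and biomass consumption mentioned above is commonly treated as linear in the literature. A minimal sketch, assuming the widely cited coefficient of roughly 0.368 kg of dry biomass per MJ of radiated energy (from Wooster et al.; the coefficient and example value are illustrative, not from this abstract):

    ```python
    # Sketch: convert fire radiative power (FRP, in MW) to a biomass
    # consumption rate (kg/s) with an assumed linear coefficient.
    BIOMASS_PER_MJ = 0.368  # kg dry biomass per MJ radiated (assumed value)

    def consumption_rate(frp_mw: float) -> float:
        """Biomass consumption rate in kg/s; MW = MJ/s, so units cancel."""
        return frp_mw * BIOMASS_PER_MJ

    rate = consumption_rate(100.0)  # hypothetical 100 MW fire pixel
    ```

    Integrating FRP over a fire's lifetime gives FRE and hence total fuel consumed, which is what links FRP to emission source strengths.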

  15. Detailed Uncertainty Analysis of the ZEM-3 Measurement System

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    The measurement of the Seebeck coefficient and electrical resistivity is critical to the investigation of all thermoelectric systems. The measurement uncertainty must therefore be well understood in order to report ZT values which are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multi-meter uncertainty. The uncertainty on the Seebeck coefficient includes probe wire correction factors, statistical error, multi-meter uncertainty, and most importantly the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through the thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows the phenomenon to be quantified, and provides an estimate of the uncertainty on the Seebeck coefficient. The thermoelectric power factor has been found to have an uncertainty of ±9-14% at high temperature and ±9% near room temperature.
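    Since the power factor is PF = S²/ρ, its relative uncertainty follows from standard propagation of independent errors in S and ρ. A minimal sketch (the formula is the textbook quadrature rule; the numerical values are illustrative, not the ZEM-3 figures):

    ```python
    import math

    def power_factor_rel_uncertainty(S, dS, rho, drho):
        """Relative uncertainty of PF = S^2 / rho, assuming independent
        errors: the S term enters twice (squared), hence the factor of 2."""
        return math.sqrt((2 * dS / S) ** 2 + (drho / rho) ** 2)

    # e.g. 4% error on the Seebeck coefficient and 3% on resistivity
    rel = power_factor_rel_uncertainty(200e-6, 8e-6, 1e-5, 3e-7)
    ```

    With these example inputs the power-factor uncertainty is about 8.5%, dominated by the Seebeck term, consistent with the abstract's point that the cold-finger effect on S matters most.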

  16. Detailed gravity anomalies from Geos 3 satellite altimetry data

    NASA Technical Reports Server (NTRS)

    Gopalapillai, G. S.; Mourad, A. G.

    1979-01-01

    Detailed gravity anomalies are computed from a combination of Geos 3 satellite altimeter and terrestrial gravity data using least-squares principles. The mathematical model used is based on the Stokes equation modified for a nonglobal solution. Using Geos 3 data in the calibration area, the effects of several anomaly parameter configurations and data densities/distributions on the anomalies and their accuracy estimates are studied. The accuracy estimates for 1 deg x 1 deg mean anomalies from low-density altimetry data are of the order of 4 mgal. Comparison of these anomalies with the terrestrial data, and also with Rapp's data derived using collocation techniques, shows rms differences of 7.2 and 4.9 mgal, respectively. Indications are that the anomaly accuracies can be improved to about 2 mgal with high-density data. Estimation of 30 arcmin x 30 arcmin mean anomalies indicates accuracies of the order of 5 mgal. Proper verification of these results will be possible only when accurate ground truth data become available.

  17. Urban scale air quality modelling using detailed traffic emissions estimates

    NASA Astrophysics Data System (ADS)

    Borrego, C.; Amorim, J. H.; Tchepel, O.; Dias, D.; Rafael, S.; Sá, E.; Pimentel, C.; Fontes, T.; Fernandes, P.; Pereira, S. R.; Bandeira, J. M.; Coelho, M. C.

    2016-04-01

    The atmospheric dispersion of NOx and PM10 was simulated with a second-generation Gaussian model over a medium-size south-European city. Microscopic traffic models calibrated with GPS data were used to derive typical driving cycles for each road link, while instantaneous emissions were estimated applying a combined Vehicle Specific Power/Co-operative Programme for Monitoring and Evaluation of the Long-range Transmission of Air Pollutants in Europe (VSP/EMEP) methodology. Site-specific background concentrations were estimated using time series analysis and a low-pass filter applied to local observations. Air quality modelling results are compared against measurements at two locations for a 1-week period. 78% of the results are within a factor of two of the observations for 1-h average concentrations, increasing to 94% for daily averages. Correlation significantly improves when background is added, with an average of 0.89 for the 24 h record. The results highlight the potential of detailed traffic and instantaneous exhaust emissions estimates, together with filtered urban background, to provide accurate input data to Gaussian models applied at the urban scale.
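    The "within a factor of two" statistic quoted above is the standard FAC2 metric for dispersion-model evaluation. A minimal sketch of how it is computed (function name and inputs are illustrative):

    ```python
    import numpy as np

    def fac2(predicted, observed):
        """Fraction of predictions within a factor of two of the
        observations, i.e. 0.5 <= predicted/observed <= 2.0."""
        predicted = np.asarray(predicted, dtype=float)
        observed = np.asarray(observed, dtype=float)
        ratio = predicted / observed
        return float(np.mean((ratio >= 0.5) & (ratio <= 2.0)))
    ```

    A FAC2 of 0.78 for hourly values rising to 0.94 for daily means, as reported here, reflects how averaging smooths out hour-to-hour dispersion errors.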

  18. Detailed 3D models of the induced electric field of transcranial magnetic stimulation coils

    NASA Astrophysics Data System (ADS)

    Salinas, F. S.; Lancaster, J. L.; Fox, P. T.

    2007-05-01

    Previous models neglected contributions from current elements spanning the full geometric extent of wires in transcranial magnetic stimulation (TMS) coils. A detailed account of TMS coil wiring geometry is shown to provide significant improvements in the accuracy of electric field (E-field) models. Modeling E-field dependence based on the TMS coil's wire width, height, shape and number of turns clearly improved the fit of calculated-to-measured E-fields near the coil body. Detailed E-field models were accurate up to the surface of the coil body (within 0.5% of measured) where simple models were often inadequate (up to 32% different from measured).

  19. Accurate, Fully-Automated NMR Spectral Profiling for Metabolomics

    PubMed Central

    Ravanbakhsh, Siamak; Liu, Philip; Bjorndahl, Trent C.; Mandal, Rupasri; Grant, Jason R.; Wilson, Michael; Eisner, Roman; Sinelnikov, Igor; Hu, Xiaoyu; Luchinat, Claudio; Greiner, Russell; Wishart, David S.

    2015-01-01

    Many diseases cause significant changes to the concentrations of small molecules (a.k.a. metabolites) that appear in a person’s biofluids, which means such diseases can often be readily detected from a person’s “metabolic profile"—i.e., the list of concentrations of those metabolites. This information can be extracted from a biofluid’s Nuclear Magnetic Resonance (NMR) spectrum. However, due to its complexity, NMR spectral profiling has remained manual, resulting in slow, expensive and error-prone procedures that have hindered clinical and industrial adoption of metabolomics via NMR. This paper presents a system, BAYESIL, which can quickly, accurately, and autonomously produce a person’s metabolic profile. Given a 1D 1H NMR spectrum of a complex biofluid (specifically serum or cerebrospinal fluid), BAYESIL can automatically determine the metabolic profile. This requires first performing several spectral processing steps, then matching the resulting spectrum against a reference compound library, which contains the “signatures” of each relevant metabolite. BAYESIL views spectral matching as an inference problem within a probabilistic graphical model that rapidly approximates the most probable metabolic profile. Our extensive studies on a diverse set of complex mixtures including real biological samples (serum and CSF), defined mixtures, and realistic computer-generated spectra involving > 50 compounds show that BAYESIL can autonomously find the concentration of NMR-detectable metabolites accurately (~ 90% correct identification and ~ 10% quantification error), in less than 5 minutes on a single CPU. These results demonstrate that BAYESIL is the first fully-automatic publicly-accessible system that provides quantitative NMR spectral profiling effectively—with an accuracy on these biofluids that meets or exceeds the performance of trained experts. We anticipate this tool will usher in high-throughput metabolomics and enable a wealth of new applications

  20. Accurate, fully-automated NMR spectral profiling for metabolomics.

    PubMed

    Ravanbakhsh, Siamak; Liu, Philip; Bjorndahl, Trent C; Mandal, Rupasri; Grant, Jason R; Wilson, Michael; Eisner, Roman; Sinelnikov, Igor; Hu, Xiaoyu; Luchinat, Claudio; Greiner, Russell; Wishart, David S

    2015-01-01

    Many diseases cause significant changes to the concentrations of small molecules (a.k.a. metabolites) that appear in a person's biofluids, which means such diseases can often be readily detected from a person's "metabolic profile"-i.e., the list of concentrations of those metabolites. This information can be extracted from a biofluid's Nuclear Magnetic Resonance (NMR) spectrum. However, due to its complexity, NMR spectral profiling has remained manual, resulting in slow, expensive and error-prone procedures that have hindered clinical and industrial adoption of metabolomics via NMR. This paper presents a system, BAYESIL, which can quickly, accurately, and autonomously produce a person's metabolic profile. Given a 1D 1H NMR spectrum of a complex biofluid (specifically serum or cerebrospinal fluid), BAYESIL can automatically determine the metabolic profile. This requires first performing several spectral processing steps, then matching the resulting spectrum against a reference compound library, which contains the "signatures" of each relevant metabolite. BAYESIL views spectral matching as an inference problem within a probabilistic graphical model that rapidly approximates the most probable metabolic profile. Our extensive studies on a diverse set of complex mixtures including real biological samples (serum and CSF), defined mixtures, and realistic computer-generated spectra involving > 50 compounds show that BAYESIL can autonomously find the concentration of NMR-detectable metabolites accurately (~ 90% correct identification and ~ 10% quantification error), in less than 5 minutes on a single CPU. These results demonstrate that BAYESIL is the first fully-automatic publicly-accessible system that provides quantitative NMR spectral profiling effectively-with an accuracy on these biofluids that meets or exceeds the performance of trained experts. We anticipate this tool will usher in high-throughput metabolomics and enable a wealth of new applications of NMR in

  1. Accurate, low-cost 3D-models of gullies

    NASA Astrophysics Data System (ADS)

    Onnen, Nils; Gronz, Oliver; Ries, Johannes B.; Brings, Christine

    2015-04-01

    Soil erosion is a widespread problem in arid and semi-arid areas, and its most severe form is gully erosion. Gullies often cut into agricultural farmland and can render an area completely unproductive. To understand the development and processes inside and around gullies, we calculated detailed 3D models of gullies in the Souss Valley in southern Morocco. Near Taroudant, we had four study areas with five gullies differing in size, volume and activity. Using a Canon HF G30 camcorder, we recorded series of Full HD videos at 25 fps. We then used the Structure from Motion (SfM) method to create the models. To generate accurate models while maintaining feasible runtimes, it is necessary to select around 1500-1700 images from the video, with neighboring images overlapping by at least 80%. In addition, it is very important to avoid selecting photos that are blurry or out of focus. Neighboring pixels of a blurry image tend to have similar color values, so we used a MATLAB script to compare the derivatives of the images: for similar objects, the higher the sum of the derivatives, the sharper the image. MATLAB subdivides the video into image intervals and, from each interval, selects the image with the highest sum. For example, a 20 min video at 25 fps yields 30,000 single images; the program inspects the first 20 images, saves the sharpest, moves on to the next 20 images, and so on. Using this algorithm, we selected 1500 images for our modeling. With VisualSFM, we calculated features and the matches between all images and produced a point cloud. MeshLab was then used to build a surface from it using the Poisson surface reconstruction approach. Afterwards we are able to calculate the size and the volume of the gullies. It is also possible to determine soil erosion rates if we compare the data with old recordings. The final step would be the combination of the terrestrial data with the data from our aerial photography. So far, the method works well and we
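    The frame-selection step described above (score each frame by the sum of its derivatives, keep the sharpest per interval) can be sketched as follows. This is a hedged Python re-implementation of the idea, not the authors' MATLAB script; function names and the gradient measure are illustrative:

    ```python
    import numpy as np

    def select_sharpest(frames, interval=20):
        """From each consecutive block of `interval` frames, keep the frame
        with the largest sum of absolute pixel differences. Blurry frames
        have similar neighbouring pixel values, hence small gradient sums."""
        selected = []
        for start in range(0, len(frames), interval):
            block = frames[start:start + interval]
            scores = [np.abs(np.diff(f.astype(float), axis=0)).sum()
                      + np.abs(np.diff(f.astype(float), axis=1)).sum()
                      for f in block]
            selected.append(block[int(np.argmax(scores))])
        return selected
    ```

    Applied to 30,000 frames with interval=20, this yields the 1500 images used for the SfM reconstruction.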

  2. Detailed transcriptome atlas of the pancreatic beta cell

    PubMed Central

    Kutlu, Burak; Burdick, David; Baxter, David; Rasschaert, Joanne; Flamez, Daisy; Eizirik, Decio L; Welsh, Nils; Goodman, Nathan; Hood, Leroy

    2009-01-01

    Background Gene expression patterns provide a detailed view of cellular functions. Comparison of profiles in disease vs normal conditions provides insights into the processes underlying disease progression. However, availability and integration of public gene expression datasets remains a major challenge. The aim of the present study was to explore the transcriptome of pancreatic islets and, based on this information, to prepare a comprehensive and open access inventory of insulin-producing beta cell gene expression, the Beta Cell Gene Atlas (BCGA). Methods We performed Massively Parallel Signature Sequencing (MPSS) analysis of human pancreatic islet samples and microarray analyses of purified rat beta cells, alpha cells and INS-1 cells, and compared the information with available array data in the literature. Results MPSS analysis detected around 7600 mRNA transcripts, of which around a third were of low abundance. We identified 2000 and 1400 transcripts that are enriched/depleted in beta cells compared to alpha cells and INS-1 cells, respectively. Microarray analysis identified around 200 transcription factors that are differentially expressed in either beta or alpha cells. We reanalyzed publicly available gene expression data and integrated these results with the new data from this study to build the BCGA. The BCGA contains basal (untreated conditions) gene expression level estimates in beta cells as well as in different cell types in human, rat and mouse pancreas. Hierarchical clustering of the expression profile estimates grouped cell types by species, while beta cells clustered together across species. Conclusion Our gene atlas is a valuable source for detailed information on the gene expression distribution in beta cells and pancreatic islets along with insulin-producing cell lines. The BCGA tool, as well as the data and code used to generate the Atlas, are available at the T1Dbase website (T1DBase.org). PMID:19146692

  3. Local detailed balance: a microscopic derivation

    NASA Astrophysics Data System (ADS)

    Bauer, M.; Cornu, F.

    2015-01-01

    Thermal contact is the archetype of non-equilibrium processes driven by constant non-equilibrium constraints when the latter are enforced by reservoirs exchanging conserved microscopic quantities. At a mesoscopic scale only the energies of the macroscopic bodies are accessible together with the configurations of the contact system. We consider a class of models where the contact system, as well as macroscopic bodies, have a finite number of possible configurations. The global system, with only discrete degrees of freedom, has no microscopic Hamiltonian dynamics, but it is shown that, if the microscopic dynamics is assumed to be deterministic and ergodic and to conserve energy according to some specific pattern, and if the mesoscopic evolution of the global system is approximated by a Markov process as closely as possible, then the mesoscopic transition rates obey three constraints. In the limit where macroscopic bodies can be considered as reservoirs at thermodynamic equilibrium (but with different intensive parameters), the mesoscopic transition rates turn into transition rates for the contact system and the third constraint becomes local detailed balance; the latter is generically expressed in terms of the microscopic exchange entropy variation, namely the opposite of the variation of the thermodynamic entropy of the reservoir involved in a given microscopic jump of the contact system configuration. For a finite-time evolution after contact has been switched on, we derive a fluctuation relation for the joint probability of the heat amounts received from the various reservoirs. The generalization to systems exchanging energy, volume and matter with several reservoirs, with a possible conservative external force acting on the contact system, is given explicitly.
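    The local detailed balance condition summarized above can be sketched in its standard form (notation here is the conventional one, not taken from the paper): for a configuration jump C → C' of the contact system, the mesoscopic transition rates w satisfy

    ```latex
    \ln \frac{w(\mathcal{C} \to \mathcal{C}')}{w(\mathcal{C}' \to \mathcal{C})}
      = \frac{\delta S_{\mathrm{exch}}}{k_{B}}
      = - \sum_{r} \frac{\delta E_{r}}{k_{B} T_{r}},
    ```

    where δE_r is the energy received by reservoir r at temperature T_r during the jump, so that the exchange entropy variation δS_exch is the opposite of the thermodynamic entropy variation of the reservoirs involved, as stated in the abstract.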

  4. Ancillary-service details: Dynamic scheduling

    SciTech Connect

    Hirst, E.; Kirby, B.

    1997-01-01

    Dynamic scheduling (DS) is the electronic transfer from one control area to another of the time-varying electricity consumption associated with a load or the time-varying electricity production associated with a generator. Although electric utilities have been using this technique for at least two decades, its use is growing in popularity and importance. This growth is a consequence of the major changes under way in US bulk-power markets, in particular efforts to unbundle generation from transmission and to increase competition among generation providers. DS can promote competition and increase choices. It allows consumers to purchase certain services from entities outside their physical-host area and it allows generators to sell certain services to entities other than their physical host. These services include regulation (following minute-to-minute variations in load) and operating reserves, among others. Such an increase in the number of possible suppliers and customers should encourage innovation and reduce the costs and prices of providing electricity services. The purpose of the project reported here was to collect and analyze data on utility experiences with DS. Chapter 2 provides additional details and examples of the definitions of DS. Chapter 3 explains why DS might be an attractive service that customers and generators, as well as transmission providers, might want to use. Chapter 4 presents some of the many current DS examples the authors uncovered in their interviews. Chapter 5 discusses the costs and cost-effectiveness of DS. Chapter 6 explains what they believe can and cannot be electronically moved from one control area to another, primarily in terms of the six ancillary services that FERC defined in Order 888. Chapter 7 discusses the need for additional research on DS.

  5. Chord, Horizontal Tie Bar & Crossbracing Joint Details; Crossbracing Center ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Chord, Horizontal Tie Bar & Crossbracing Joint Details; Crossbracing Center Joint Detail; Chord, Panel Posts, Braces & Counterbrace Joint Detail - Brownsville Covered Bridge, Spanning East Fork Whitewater River (moved to Eagle Creek Park, Indianapolis), Brownsville, Union County, IN

  6. How accurately can the peak skin dose in fluoroscopy be determined using indirect dose metrics?

    SciTech Connect

    Jones, A. Kyle; Ensor, Joe E.; Pasciak, Alexander S.

    2014-07-15

    Purpose: Skin dosimetry is important for fluoroscopically-guided interventions, as peak skin doses (PSD) that result in skin reactions can be reached during these procedures. There is no consensus as to whether or not indirect skin dosimetry is sufficiently accurate for fluoroscopically-guided interventions. However, measuring PSD with film is difficult and the decision to do so must be made a priori. The purpose of this study was to assess the accuracy of different types of indirect dose estimates and to determine if PSD can be calculated within ±50% using indirect dose metrics for embolization procedures. Methods: PSD were measured directly using radiochromic film for 41 consecutive embolization procedures at two sites. Indirect dose metrics from the procedures were collected, including reference air kerma. Four different estimates of PSD were calculated from the indirect dose metrics and compared along with reference air kerma to the measured PSD for each case. The four indirect estimates included a standard calculation method, the use of detailed information from the radiation dose structured report, and two simplified calculation methods based on the standard method. Indirect dosimetry results were compared with direct measurements, including an analysis of uncertainty associated with film dosimetry. Factors affecting the accuracy of the different indirect estimates were examined. Results: When using the standard calculation method, calculated PSD were within ±35% for all 41 procedures studied. Calculated PSD were within ±50% for a simplified method using a single source-to-patient distance for all calculations. Reference air kerma was within ±50% for all but one procedure. Cases for which reference air kerma or calculated PSD exhibited large (±35%) differences from the measured PSD were analyzed, and two main causative factors were identified: unusually small or large source-to-patient distances and large contributions to reference air kerma from cone
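    The simplified single-distance estimate mentioned in the abstract typically scales the reference air kerma by an inverse-square correction and a few conversion factors. A hedged sketch, assuming illustrative factor values and a generic form of the calculation (not the authors' exact formula or notation):

    ```python
    def estimate_psd(k_ref_mgy, d_skin_cm, d_ref_cm=60.0,
                     backscatter=1.3, f_tissue=1.06, table_factor=0.8):
        """Rough peak-skin-dose estimate (mGy) from reference air kerma.
        Assumed corrections: inverse-square scaling from the reference
        point to a single source-to-skin distance, a backscatter factor,
        an air-to-tissue kerma conversion, and table/pad attenuation.
        All default values here are illustrative assumptions."""
        inverse_square = (d_ref_cm / d_skin_cm) ** 2
        return k_ref_mgy * inverse_square * backscatter * f_tissue * table_factor

    psd = estimate_psd(k_ref_mgy=1000.0, d_skin_cm=60.0)
    ```

    The abstract's two causative error factors map directly onto this sketch: an unusually small or large source-to-patient distance breaks the single-distance inverse-square assumption, and kerma accrued in other geometries (e.g. cone-beam runs) inflates k_ref without hitting the same skin site.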

  7. Dialing Up Telecommunications Information.

    ERIC Educational Resources Information Center

    Bates, Mary Ellen

    1993-01-01

    Describes how to find accurate, current information about telecommunications industries, products and services, rates and tariffs, and regulatory information using electronic information resources available from the private and public sectors. A sidebar article provides contact information for producers and service providers. (KRN)

  8. ACCURATE CHARACTERIZATION OF HIGH-DEGREE MODES USING MDI OBSERVATIONS

    SciTech Connect

    Korzennik, S. G.; Rabello-Soares, M. C.; Schou, J.; Larson, T. P.

    2013-08-01

    We present the first accurate characterization of high-degree modes, derived using the best Michelson Doppler Imager (MDI) full-disk full-resolution data set available. A 90 day long time series of full-disk 2 arcsec pixel^-1 resolution Dopplergrams was acquired in 2001, thanks to the high rate telemetry provided by the Deep Space Network. These Dopplergrams were spatially decomposed using our best estimate of the image scale and the known components of MDI's image distortion. A multi-taper power spectrum estimator was used to generate power spectra for all degrees and all azimuthal orders, up to l = 1000. We used a large number of tapers to reduce the realization noise, since at high degrees the individual modes blend into ridges and thus there is no reason to preserve a high spectral resolution. These power spectra were fitted for all degrees and all azimuthal orders, between l = 100 and l = 1000, and for all the orders with substantial amplitude. This fitting generated in excess of 5.2 x 10^6 individual estimates of ridge frequencies, line widths, amplitudes, and asymmetries (singlets), corresponding to some 5700 multiplets (l, n). Fitting at high degrees generates ridge characteristics that do not correspond to the underlying mode characteristics. We used a sophisticated forward modeling to recover the best possible estimate of the underlying mode characteristics (mode frequencies, as well as line widths, amplitudes, and asymmetries). We describe in detail this modeling and its validation. The modeling has been extensively reviewed and refined, by including an iterative process to improve its input parameters to better match the observations. Also, the contribution of the leakage matrix on the accuracy of the procedure has been carefully assessed. We present the derived set of corrected mode characteristics, which includes not only frequencies, but line widths, asymmetries, and amplitudes. We present and discuss
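    The multi-taper estimator described above averages periodograms computed with orthogonal Slepian (DPSS) tapers; more tapers lower the realization noise at the cost of spectral resolution, which is acceptable when modes blend into ridges. A minimal sketch using SciPy's DPSS windows (parameter values are illustrative, not those used for the MDI analysis):

    ```python
    import numpy as np
    from scipy.signal.windows import dpss

    def multitaper_psd(x, n_tapers=7, nw=4.0):
        """Multi-taper power spectrum: average the periodograms of the
        signal windowed by each of `n_tapers` orthogonal DPSS tapers
        with time-bandwidth product `nw`."""
        n = len(x)
        tapers = dpss(n, nw, Kmax=n_tapers)               # shape (n_tapers, n)
        spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
        return spectra.mean(axis=0)                       # shape (n // 2 + 1,)
    ```

    Averaging K independent tapered periodograms reduces the variance of the spectral estimate by roughly a factor of K, which is why a large number of tapers was used for the ridge fits.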

  9. Detailed Kinetic Modeling of Gasoline Surrogate Mixtures

    SciTech Connect

    Mehl, M; Curran, H J; Pitz, W J; Westbrook, C K

    2009-03-09

    Real fuels are complex mixtures of thousands of hydrocarbon compounds including linear and branched paraffins, naphthenes, olefins and aromatics. It is generally agreed that their behavior can be effectively reproduced by simpler fuel surrogates containing a limited number of components. In this work, a recently revised version of the kinetic model by the authors is used to analyze the combustion behavior of several components relevant to gasoline surrogate formulation. Particular attention is devoted to linear and branched saturated hydrocarbons (PRF mixtures), olefins (1-hexene) and aromatics (toluene). Model predictions for pure components, binary mixtures and multi-component gasoline surrogates are compared with recent experimental information collected in rapid compression machine, shock tube and jet stirred reactors covering a wide range of conditions pertinent to internal combustion engines. Simulation results are discussed focusing attention on the mixing effects of the fuel components.

  10. Accurate patient dosimetry of kilovoltage cone-beam CT in radiation therapy

    SciTech Connect

    Ding, George X.; Duggan, Dennis M.; Coffey, Charles W.

    2008-03-15

    The increased utilization of x-ray imaging in image-guided radiotherapy has dramatically improved the radiation treatment and the lives of cancer patients. Daily imaging procedures, such as cone-beam computed tomography (CBCT), for patient setup may significantly increase the dose to the patient's normal tissues. This study investigates the dosimetry from a kilovoltage (kV) CBCT for real patient geometries. Monte Carlo simulations were used to study the kV beams from a Varian on-board imager integrated into the Trilogy accelerator. The Monte Carlo calculated results were benchmarked against measurements and good agreement was obtained. The authors developed a novel method to calibrate Monte Carlo simulated beams with measurements using an ionization chamber in which the air-kerma calibration factors are obtained from an Accredited Dosimetry Calibration Laboratory. The authors have introduced a new Monte Carlo calibration factor, f_MCcal, which is determined from the calibration procedure. The accuracy of the new method was validated by experiment. When a Monte Carlo simulated beam has been calibrated, the simulated beam can be used to accurately predict absolute dose distributions in the irradiated media. Using this method the authors calculated dose distributions to patient anatomies from a typical CBCT acquisition for different treatment sites, such as head and neck, lung, and pelvis. Their results have shown that, from a typical head and neck CBCT, doses to soft tissues, such as eye, spinal cord, and brain can be up to 8, 6, and 5 cGy, respectively. The dose to the bone, due to the photoelectric effect, can be as much as 25 cGy, about three times the dose to the soft tissue. The study provides detailed information on the additional doses to the normal tissues of a patient from a typical kV CBCT acquisition. The methodology of the Monte Carlo beam calibration developed and introduced in this study allows the user to calculate both relative and absolute
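    The idea behind a calibration factor like f_MCcal is to convert simulated dose-per-source-particle into absolute dose by anchoring the simulation to a chamber measurement. A hedged sketch of that bookkeeping (names and the exact form of the factor are illustrative, not the authors' notation):

    ```python
    def calibration_factor(measured_kerma, simulated_kerma_per_history):
        """Calibration factor from one chamber measurement: ratio of the
        traceably measured air kerma to the Monte Carlo kerma scored per
        source history at the same point."""
        return measured_kerma / simulated_kerma_per_history

    def calibrated_dose(dose_per_history, n_histories, f_mc_cal):
        """Absolute dose once the beam is calibrated: simulated
        dose-per-history scaled by the number of delivered histories
        and the calibration factor."""
        return dose_per_history * n_histories * f_mc_cal
    ```

    Once f_mc_cal is fixed for a beam, every simulated dose map for that beam becomes an absolute dose map, which is what enables the organ-dose numbers (e.g. up to 25 cGy to bone) quoted above.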

  11. Development and Validation of a Multidisciplinary Tool for Accurate and Efficient Rotorcraft Noise Prediction (MUTE)

    NASA Technical Reports Server (NTRS)

    Liu, Yi; Anusonti-Inthra, Phuriwat; Diskin, Boris

    2011-01-01

    A physics-based, systematically coupled, multidisciplinary prediction tool (MUTE) for rotorcraft noise was developed and validated with a wide range of flight configurations and conditions. MUTE is an aggregation of multidisciplinary computational tools that accurately and efficiently model the physics of the source of rotorcraft noise, and predict the noise at far-field observer locations. It uses systematic coupling approaches among multiple disciplines including Computational Fluid Dynamics (CFD), Computational Structural Dynamics (CSD), and high fidelity acoustics. Within MUTE, advanced high-order CFD tools are used around the rotor blade to predict the transonic flow (shock wave) effects, which generate the high-speed impulsive noise. Predictions of the blade-vortex interaction noise in low speed flight are also improved by using the Particle Vortex Transport Method (PVTM), which preserves the wake flow details required for blade/wake and fuselage/wake interactions. The accuracy of the source noise prediction is further improved by utilizing a coupling approach between CFD and CSD, so that the effects of key structural dynamics, elastic blade deformations, and trim solutions are correctly represented in the analysis. The blade loading information and/or the flow field parameters around the rotor blade predicted by the CFD/CSD coupling approach are used to predict the acoustic signatures at far-field observer locations with a high-fidelity noise propagation code (WOPWOP3). The predicted results from the MUTE tool for rotor blade aerodynamic loading and far-field acoustic signatures are compared and validated against a variety of experimental data sets, such as UH60-A data, DNW test data and HART II test data.

  12. The importance of accurate road data for spatial applications in public health: customizing a road network

    PubMed Central

    Frizzelle, Brian G; Evenson, Kelly R; Rodriguez, Daniel A; Laraia, Barbara A

    2009-01-01

    Background Health researchers have increasingly adopted the use of geographic information systems (GIS) for analyzing environments in which people live and how those environments affect health. One aspect of this research that is often overlooked is the quality and detail of the road data and whether or not it is appropriate for the scale of analysis. Many readily available road datasets, both public domain and commercial, contain positional errors or generalizations that may not be compatible with highly accurate geospatial locations. This study examined the accuracy, completeness, and currency of four readily available public and commercial sources for road data (North Carolina Department of Transportation, StreetMap Pro, TIGER/Line 2000, TIGER/Line 2007) relative to a custom road dataset which we developed and used for comparison. Methods and Results A custom road network dataset was developed to examine associations between health behaviors and the environment among pregnant and postpartum women living in central North Carolina in the United States. Three analytical measures were developed to assess the comparative accuracy and utility of four publicly and commercially available road datasets and the custom dataset in relation to participants' residential locations over three time periods. The exclusion of road segments and positional errors in the four comparison road datasets resulted in between 5.9% and 64.4% of respondents lying farther than 15.24 meters from their nearest road, the distance of the threshold set by the project to facilitate spatial analysis. Agreement, using a Pearson's correlation coefficient, between the customized road dataset and the four comparison road datasets ranged from 0.01 to 0.82. Conclusion This study demonstrates the importance of examining available road datasets and assessing their completeness, accuracy, and currency for their particular study area. This paper serves as an example for assessing the feasibility of readily
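    The 15.24 m (50 ft) threshold test described above reduces to a nearest-distance computation between residential points and road segments. A minimal planar sketch (function names are illustrative; a real GIS workflow would use projected coordinates and a spatial index):

    ```python
    import math

    def point_segment_distance(p, a, b):
        """Shortest distance from point p to the line segment a-b,
        all given as (x, y) pairs in projected (meter) coordinates."""
        (px, py), (ax, ay), (bx, by) = p, a, b
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:
            return math.hypot(px - ax, py - ay)
        # clamp the projection parameter to stay on the segment
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

    def share_beyond(points, segments, threshold=15.24):
        """Fraction of points farther than `threshold` meters (the study's
        50 ft cutoff) from their nearest road segment."""
        far = sum(1 for p in points
                  if min(point_segment_distance(p, a, b) for a, b in segments) > threshold)
        return far / len(points)
    ```

    Running this against each candidate road dataset is one way to reproduce the 5.9%-64.4% exclusion figures reported above.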

  13. Detailed investigations on radiative opacity and emissivity of tin plasmas in the extreme-ultraviolet region.

    PubMed

    Zeng, Jiaolong; Gao, Cheng; Yuan, Jianmin

    2010-08-01

    Radiative opacity and emissivity of tin plasmas at an average ionization degree of about 10 were investigated in detail using a fully relativistic detailed-level-accounting approach, in which the main physical effects on the opacity were carefully taken into account. Among these effects, configuration interaction, in particular core-valence electron correlation, plays an important role in the determination of the accurate atomic data required in the calculation of opacity. It results in a strong narrowing of lines from all transition arrays, so that strong absorption is concentrated in a narrow wavelength region of 12.5-14 nm for Sn plasmas. Using a complete set of accurate atomic data, we investigated the opacity of Sn plasmas at a variety of physical conditions. Among the respective ions Xe6+-Xe15+, Xe10+ has the largest absorption cross section at 13.5 nm, although the physical conditions favorable for maximal absorption at 13.5 nm do not imply that Xe10+ has the largest fraction. Comparison with other theoretical results showed that a complete set of consistent, accurate atomic data, which is still largely lacking, is essential for predicting accurate opacities. Our atomic model is useful and can be applied to interpret opacity experiments. Further benchmark experiments are urgently needed to clarify the physical effects on the opacity of Sn plasmas.

  14. How seductive details do their damage: A cognitive theory of interest in science learning

    NASA Astrophysics Data System (ADS)

    Harp, Shannon Frankie

    1997-12-01

    When interesting irrelevant adjuncts, known as seductive details, are included in a passage, readers are less able to recall structurally important information and are less able to solve problems based on the main ideas in the passage, as compared with readers who read the same passage without seductive details. This phenomenon is known as the seductive details effect. To investigate how seductive details interfere with learning, a series of four experiments was conducted in which college students read a scientific explanation of the formation of lightning that either contained or did not contain seductive details. The seductive details effect was replicated in each of the four experiments. Students who read passages containing seductive details recalled significantly fewer main ideas and generated significantly fewer transfer solutions than students who read the passage with no seductive details. In Experiment 1, students read passages with or without seductive details, and with or without typographical highlighting of the main ideas. Highlighting the main ideas did not reduce the seductive details effect for either the retention of main ideas or problem-solving performance. In Experiment 2, students read passages with or without seductive details, and with or without a statement of learning objectives prior to reading the passage. Providing learning objectives did not reduce the seductive details effect for retention of main ideas or for problem-solving performance. In Experiment 3, students read passages with or without seductive details, and with or without organizational signaling. Signaling the main ideas in the passage did not reduce the seductive details effect for retention or for problem-solving performance. In Experiment 4, the placement of the seductive details was varied, such that they were presented before the passage, after the passage, or interspersed within the passage. A control group read a passage with no seductive details. Early

  15. Aperture taper determination for the half-scale accurate antenna reflector

    NASA Technical Reports Server (NTRS)

    Lambert, Kevin M.

    1990-01-01

    A simulation is described of a proposed microwave reflectance measurement in which the half scale reflector is used in a compact range type of application. The simulation is used to determine an acceptable aperture taper for the reflector which will allow for accurate measurements. Information on the taper is used in the design of a feed for the reflector.

  16. Getting a Picture that Is Both Accurate and Stable: Situation Models and Epistemic Validation

    ERIC Educational Resources Information Center

    Schroeder, Sascha; Richter, Tobias; Hoever, Inga

    2008-01-01

    Text comprehension entails the construction of a situation model that prepares individuals for situated action. In order to meet this function, situation model representations are required to be both accurate and stable. We propose a framework according to which comprehenders rely on epistemic validation to prevent inaccurate information from…

  17. [Approach to academic detailing as a hospital pharmacist].

    PubMed

    Nishikori, Atsumi

    2014-01-01

    In 2012, a new medical fee system was introduced for the clinical activities of hospital pharmacists responsible for in-patient pharmacotherapy monitoring in medical institutions in Japan. The new medical system demands greater efforts to provide the most suitable and safest medicine for each patient. By applying the concept of academic detailing to clinical pharmacists' roles in hospitals, I present drug use evaluation in three disease states (peptic ulcer, insomnia, and osteoporosis). To analyze these from multiple aspects, we not only need knowledge of drug monographs (clinical and adverse drug effects), but also the ability to evaluate a patient's adherence and cost-effectiveness. If we combine the idea of academic detailing with a clinical pharmacist's role, it is necessary to strengthen drug information skills, such as guideline or literature search skills and journal evaluation. Simultaneously, it is important to introduce new pharmaceutical education curricula regarding evidence-based medicine (EBM), pharmacoeconomics, and professional communication in order to explore pharmacists' roles in the future. PMID:24584015

  18. Detailed HI kinematics of Tully-Fisher calibrator galaxies

    NASA Astrophysics Data System (ADS)

    Ponomareva, Anastasia A.; Verheijen, Marc A. W.; Bosma, Albert

    2016-09-01

    We present spatially resolved HI kinematics of 32 spiral galaxies which have Cepheid and/or Tip of the Red Giant Branch distances, and define a calibrator sample for the Tully-Fisher relation. The interferometric HI data for this sample were collected from available archives and supplemented with new GMRT observations. This paper describes a uniform analysis of the HI kinematics of this inhomogeneous data set. Our main result is an atlas for our calibrator sample that presents global HI profiles, integrated HI column-density maps, HI surface density profiles and, most importantly, detailed kinematic information in the form of high-quality rotation curves derived from highly resolved, two-dimensional velocity fields and position-velocity diagrams.

  19. A detailed spectroscopic study of an Italian fresco

    SciTech Connect

    Barilaro, Donatella; Crupi, Vincenza; Majolino, Domenico; Barone, Germana; Ponterio, Rosina

    2005-02-15

    In the present work we characterized samples of plasters and pictorial layers taken from a fresco in the Acireale Cathedral. The fresco represents the Coronation of Saint Venera, patron saint of this Ionian town. By performing a detailed spectroscopic analysis of the plaster preparation layer by Fourier-transform infrared (FTIR) spectroscopy and x-ray diffraction (XRD), and of the painting layer by FTIR and confocal Raman microspectroscopy, scanning electron microscopy+energy dispersive x-ray spectroscopy, and XRD, we were able to identify the pigments and the binders present. In particular, Raman investigation was crucial to the characterization of the pigments thanks to the high resolution of the confocal apparatus used. It is worth stressing that the simultaneous use of complementary techniques was able to provide more complete information for the conservation of the artifact we studied.

  20. Panel Post & Diagonal Brace Joint Detail; Crossbracing Center Joint ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Panel Post & Diagonal Brace Joint Detail; Crossbracing Center Joint Detail; Chord, Panel Post, Tie Bar, & Diagonal Brace Joint Detail; Chord, Tie Bar, & Crossbracing Joint Detail - Medora Bridge, Spanning East Fork of White River at State Route 235, Medora, Jackson County, IN

  1. How accurate are the nonlinear chemical Fokker-Planck and chemical Langevin equations?

    PubMed

    Grima, Ramon; Thomas, Philipp; Straube, Arthur V

    2011-08-28

    The chemical Fokker-Planck equation and the corresponding chemical Langevin equation are commonly used approximations of the chemical master equation. These equations are derived from an uncontrolled, second-order truncation of the Kramers-Moyal expansion of the chemical master equation and hence their accuracy remains to be clarified. We use the system-size expansion to show that chemical Fokker-Planck estimates of the mean concentrations and of the variance of the concentration fluctuations about the mean are accurate to order Ω^(-3/2) for reaction systems which do not obey detailed balance and at least accurate to order Ω^(-2) for systems obeying detailed balance, where Ω is the characteristic size of the system. Hence, the chemical Fokker-Planck equation turns out to be more accurate than the linear-noise approximation of the chemical master equation (the linear Fokker-Planck equation), which leads to mean concentration estimates accurate to order Ω^(-1/2) and variance estimates accurate to order Ω^(-3/2). This higher accuracy is particularly conspicuous for chemical systems realized in small volumes, such as biochemical reactions inside cells. A formula is also obtained for the approximate size of the relative errors in the concentration and variance predictions of the chemical Fokker-Planck equation, where the relative error is defined as the difference between the predictions of the chemical Fokker-Planck equation and the master equation divided by the prediction of the master equation. For dimerization and enzyme-catalyzed reactions, the errors are typically less than a few percent even when the steady state is characterized by merely a few tens of molecules.

  2. Slim hole MWD tool accurately measures downhole annular pressure

    SciTech Connect

    Burban, B.; Delahaye, T.

    1994-02-14

    Measurement-while-drilling of downhole pressure accurately determines annular pressure losses from circulation and drillstring rotation and helps monitor swab and surge pressures during tripping. In early 1993, two slim-hole wells (3.4 in. and 3 in. diameter) were drilled with continuous real-time electromagnetic wave transmission of downhole temperature and annular pressure. The data were obtained during all stages of the drilling operation and proved useful for operations personnel. The use of real-time measurements demonstrated the characteristic hydraulic effects of pressure surges induced by drillstring rotation in the small slim-hole annulus under field conditions. The interest in this information is not restricted to the slim-hole geometry. Monitoring or estimating downhole pressure is a key element for drilling operations. Except in special cases, no real-time measurements of downhole annular pressure during drilling and tripping have been used on an operational basis. The hydraulic effects are significant in conventional-geometry wells (3 1/2-in. drill pipe in a 6-in. hole). This paper describes the tool and the results from the field test.

  3. Novel Cortical Thickness Pattern for Accurate Detection of Alzheimer's Disease.

    PubMed

    Zheng, Weihao; Yao, Zhijun; Hu, Bin; Gao, Xiang; Cai, Hanshu; Moore, Philip

    2015-01-01

    Brain networks occupy an important position in representing abnormalities in Alzheimer's disease (AD) and mild cognitive impairment (MCI). Currently, most studies have focused only on morphological features of regions of interest without exploring the interregional alterations. In order to investigate the potential discriminative power of a morphological network in AD diagnosis and to provide supportive evidence on the feasibility of an individual structural network study, we propose a novel approach of extracting correlative features from magnetic resonance imaging, which consists of a two-step approach for constructing an individual thickness network with low computational complexity. First, multi-distance combination is utilized for accurate evaluation of between-region dissimilarity; then the dissimilarity is transformed to connectivity via calculation of a correlation function. An evaluation of the proposed approach has been conducted with 189 normal controls, 198 MCI subjects, and 163 AD patients using machine learning techniques. Results show that the observed correlative feature yields a significant improvement in classification performance compared with cortical thickness, with an accuracy of 89.88% and an area under the receiver operating characteristic curve of 0.9588. We further improved the performance by integrating both thickness and apolipoprotein E ɛ4 allele information with the correlative features. The newly achieved accuracies are 92.11% and 79.37% in separating AD from normal controls and AD converters from non-converters, respectively. Differences between using diverse distance measurements and various correlation transformation functions are also discussed to explore an optimal way for network establishment. PMID:26444768
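    The two-step construction (between-region dissimilarity, then a transformation to connectivity) can be illustrated roughly as below. The specific distance measures combined here and the Gaussian decay used as the "correlation function" are stand-in assumptions for illustration, not the authors' exact choices, and the region data are synthetic.

    ```python
    import numpy as np

    def region_dissimilarity(t1, t2):
        """Combine several simple distance measures between two regions'
        vertex-wise thickness samples (a stand-in for the paper's
        'multi-distance combination'; the exact measures are assumptions)."""
        d_mean = abs(t1.mean() - t2.mean())          # difference of means
        d_std = abs(t1.std() - t2.std())             # difference of spreads
        d_med = abs(np.median(t1) - np.median(t2))   # difference of medians
        return (d_mean + d_std + d_med) / 3.0

    def thickness_network(regions):
        """Individual thickness network: dissimilarity -> connectivity
        via a Gaussian decay (one possible transformation function)."""
        n = len(regions)
        w = np.eye(n)
        for i in range(n):
            for j in range(i + 1, n):
                d = region_dissimilarity(regions[i], regions[j])
                w[i, j] = w[j, i] = np.exp(-d ** 2)
        return w

    rng = np.random.default_rng(0)
    # Three synthetic "cortical regions" with vertex-wise thickness in mm.
    regions = [rng.normal(2.5, 0.3, 200),
               rng.normal(2.6, 0.3, 200),
               rng.normal(3.4, 0.5, 200)]
    W = thickness_network(regions)
    print(W.round(3))
    ```

    The resulting symmetric matrix W can then be vectorized and fed to a classifier, which is the role the correlative features play in the evaluation described above.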

  4. HOW ACCURATE IS OUR KNOWLEDGE OF THE GALAXY BIAS?

    SciTech Connect

    More, Surhud

    2011-11-01

    Observations of the clustering of galaxies can provide useful information about the distribution of dark matter in the universe. In order to extract accurate cosmological parameters from galaxy surveys, it is important to understand how the distribution of galaxies is biased with respect to the matter distribution. The large-scale bias of galaxies can be quantified either by directly measuring the large-scale (λ ≳ 60 h⁻¹ Mpc) power spectrum of galaxies or by modeling the halo occupation distribution of galaxies using their clustering on small scales (λ ≲ 30 h⁻¹ Mpc). We compare the luminosity dependence of the galaxy bias (both the shape and the normalization) obtained by these methods and check for consistency. Our comparison reveals that the bias of galaxies obtained by the small-scale clustering measurements is systematically larger than that obtained from the large-scale power spectrum methods. We also find systematic discrepancies in the shape of the galaxy-bias-luminosity relation. We comment on the origin and possible consequences of these discrepancies, which had remained unnoticed thus far.

  5. Evaluation of Sensitivity and Robustness of Geothermal Resource Parameters Using Detailed and Approximate Stratigraphy

    NASA Astrophysics Data System (ADS)

    Whealton, C.; Jordan, T. E.; Frone, Z. S.; Smith, J. D.; Horowitz, F. G.; Stedinger, J. R.

    2015-12-01

    Accurate assessment of the spatial variation of geothermal heat is key to distinguishing among locations for geothermal project development. Resource assessment over large areas can be accelerated by using existing subsurface data collected for other purposes, such as petroleum industry bottom-hole temperature (BHT) datasets. BHT data are notoriously noisy, but in many sedimentary basins their abundance offsets the potential low quality of an individual BHT measurement. Analysis requires description of conductivity stratigraphy, which for thousands of wells with BHT values is daunting. For regional assessment, a streamlined method is to approximate the thickness and conductivity of each formation using a set of standard columns rescaled to the sediment thickness at a location. Surface heat flow and related geothermal resource metrics are estimated from these and additional parameters. This study uses Monte Carlo techniques to compare the accuracy and precision of thermal predictions at single locations from the streamlined approach with those from well-specific conductivity stratigraphy. For 77 wells distributed across the Appalachian Basin of NY, PA, and WV, local geological experts made available detailed information on unit thicknesses. For the streamlined method we used the Correlation of Stratigraphic Units of North America (COSUNA) columns. For both data sets, we described the thermal conductivity of the strata using generic values or values from the geologically similar Anadarko Basin. The well-specific surface heat flow and temperature-at-depth were evaluated using a one-dimensional conductive heat flow model. This research addresses the sensitivity of the estimated geothermal output to the model inputs (BHT, thermal conductivity) and the robustness of the approximate stratigraphic column assumptions when estimating the geothermal output. This research was conducted as part of the Dept. of Energy Geothermal Play Fairway Analysis program.
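    The one-dimensional conductive model underlying both the detailed and the streamlined columns can be sketched as follows. Layer thicknesses, conductivities, and temperatures below are illustrative values, not Appalachian Basin data, and radiogenic heat production is neglected as a simplifying assumption.

    ```python
    # Layered column as (thickness [m], thermal conductivity [W/(m·K)]) pairs.
    # Illustrative values only, not basin-specific stratigraphy.
    column = [(1000.0, 2.5), (800.0, 1.8), (1200.0, 3.0)]

    T_SURF = 10.0   # mean surface temperature [°C]
    T_BHT = 80.0    # corrected bottom-hole temperature [°C] at the column base

    def thermal_resistance(layers):
        """Total thermal resistance of the column: sum of h_i / k_i [m²·K/W]."""
        return sum(h / k for h, k in layers)

    # Steady 1-D conduction with no internal heat production: q = ΔT / R.
    q = (T_BHT - T_SURF) / thermal_resistance(column)  # surface heat flow [W/m²]

    def temperature_at(depth, layers, t_surf, q):
        """Walk down the column accumulating ΔT = q · h/k per (partial) layer."""
        t, z = t_surf, 0.0
        for h, k in layers:
            dz = min(h, depth - z)
            if dz <= 0.0:
                break
            t += q * dz / k
            z += dz
        return t

    print(f"heat flow: {q * 1e3:.2f} mW/m^2")
    print(f"T at 1.8 km: {temperature_at(1800.0, column, T_SURF, q):.1f} °C")
    ```

    In a Monte Carlo setting such as the one described above, the BHT and the per-layer conductivities would be perturbed repeatedly to propagate their uncertainty into heat flow and temperature-at-depth.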

  6. Infrared image detail enhancement approach based on improved joint bilateral filter

    NASA Astrophysics Data System (ADS)

    Liu, Ning; Chen, Xiaohong

    2016-07-01

    In this paper, we propose a new infrared image detail enhancement approach. This approach not only enhances fine detail but also keeps the processed image close to the real scene. Inspired by the joint bilateral filter, two adjacent images are utilized to calculate the kernel functions in order to distinguish the detail information from the raw image. We also designed a new kernel function to modify the joint bilateral filter and to eliminate the gradient-reversal artifacts caused by non-linear filtering. The new kernel is based on an adaptive emerge coefficient to realize the detail-layer determination. The detail information is modified by the adaptive emerge coefficient along with two key parameters to realize the detail enhancement. Finally, we combine the processed detail layer with the base layer and remap the high-dynamic-range image into a monitor-suited low dynamic range to achieve better visual effect. Numerical evaluation showed that this technique outperforms previous detail-enhancement methods. Figures and data flowcharts are presented in the paper.
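    A generic base/detail decomposition of this kind can be sketched as follows. A plain (not joint) bilateral filter and a fixed gain stand in for the paper's adaptive modified kernel and emerge coefficient; the toy infrared frame and all parameter values are assumptions for illustration.

    ```python
    import numpy as np

    def bilateral_filter(img, sigma_s=2.0, sigma_r=20.0, radius=4):
        """Naive bilateral filter (spatial × range Gaussian weights); a
        stand-in for the paper's modified joint-bilateral kernel."""
        h, w = img.shape
        out = np.empty_like(img, dtype=float)
        ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))
        pad = np.pad(img.astype(float), radius, mode="reflect")
        for i in range(h):
            for j in range(w):
                patch = pad[i:i + 2*radius + 1, j:j + 2*radius + 1]
                rng_w = np.exp(-((patch - img[i, j])**2) / (2 * sigma_r**2))
                wgt = spatial * rng_w
                out[i, j] = (wgt * patch).sum() / wgt.sum()
        return out

    # Toy high-dynamic-range IR frame: a smooth ramp plus a small hot spot.
    img = np.tile(np.linspace(1000.0, 2000.0, 32), (32, 1))
    img[15:17, 15:17] += 300.0

    base = bilateral_filter(img)   # large-scale structure
    detail = img - base            # fine detail layer
    GAIN = 2.5                     # detail boost (assumed value)
    enhanced = base + GAIN * detail

    # Compress the enhanced result into the 8-bit display range.
    lo, hi = enhanced.min(), enhanced.max()
    display = ((enhanced - lo) / (hi - lo) * 255.0).astype(np.uint8)
    ```

    The edge-preserving filter keeps gradient reversal milder than a plain Gaussian base layer would, which is the property the paper's modified kernel is designed to guarantee.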

  7. Accurate description of calcium solvation in concentrated aqueous solutions.

    PubMed

    Kohagen, Miriam; Mason, Philip E; Jungwirth, Pavel

    2014-07-17

    Calcium is one of the biologically most important ions; however, its accurate description by classical molecular dynamics simulations is complicated by strong electrostatic and polarization interactions with surroundings due to its divalent nature. Here, we explore the recently suggested approach for effectively accounting for polarization effects via ionic charge rescaling and develop a new and accurate parametrization of the calcium dication. Comparison to neutron scattering and viscosity measurements demonstrates that our model allows for an accurate description of concentrated aqueous calcium chloride solutions. The present model should find broad use in efficient and accurate modeling of calcium in aqueous environments, such as those encountered in biological and technological applications.
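    The charge-rescaling idea (the electronic continuum correction) amounts to dividing formal ionic charges by the square root of the solvent's electronic (high-frequency) dielectric constant, which for water is about 1.78. A minimal sketch of that arithmetic, with the exact scaling treated as an assumption of this illustration rather than the paper's final parametrization:

    ```python
    import math

    EPS_EL_WATER = 1.78  # electronic (high-frequency) dielectric constant of water

    def ecc_charge(formal_charge, eps_el=EPS_EL_WATER):
        """Effective charge under the electronic continuum correction:
        q_eff = q / sqrt(eps_el), mimicking the electronic polarization
        that nonpolarizable force fields omit."""
        return formal_charge / math.sqrt(eps_el)

    print(round(ecc_charge(+2.0), 3))   # effective Ca2+ charge, about +1.5 e
    print(round(ecc_charge(-1.0), 3))   # effective Cl- charge, about -0.75 e
    ```

    In a simulation, these rescaled charges would replace the formal ones in the force-field topology, with the ion's Lennard-Jones parameters refit against experimental data such as the neutron scattering and viscosity measurements mentioned above.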

  8. Detailed modeling analysis for soot formation and radiation in microgravity gas jet diffusion flames

    NASA Technical Reports Server (NTRS)

    Ku, Jerry C.; Tong, LI; Greenberg, Paul S.

    1995-01-01

    Radiation heat transfer in combustion systems has been receiving increasing interest. In the case of hydrocarbon fuels, a significant portion of the radiation comes from soot particles, justifying the need for a detailed soot formation model and radiation transfer calculations. For laminar gas jet diffusion flames, results from this project (4/1/91-8/22/95) and another NASA study show that flame shape, soot concentration, and radiation heat fluxes are substantially different under microgravity conditions. Our emphasis is on including detailed soot transport models and a detailed solution for radiation heat transfer, and on coupling them with the flame structure calculations. In this paper, we will discuss the following three specific areas: (1) comparing two existing soot formation models and identifying possible improvements; (2) a simple yet reasonably accurate approach to calculating total radiative properties and/or fluxes over the spectral range; and (3) investigating the convergence of iterations between the flame structure solver and the radiation heat transfer solver.

  9. Analysis of Dynamic Interactions between Different Drivetrain Components with a Detailed Wind Turbine Model

    NASA Astrophysics Data System (ADS)

    Bartschat, A.; Morisse, M.; Mertens, A.; Wenske, J.

    2016-09-01

    The presented work describes a detailed analysis of the dynamic interactions among mechanical and electrical drivetrain components of a modern wind turbine under the influence of parameter variations, different control mechanisms and transient excitations. For this study, a detailed model of a 2MW wind turbine with a gearbox, a permanent magnet synchronous generator and a full power converter has been developed which considers all relevant characteristics of the mechanical and electrical subsystems. This model includes an accurate representation of the aerodynamics and the mechanical properties of the rotor and the complete mechanical drivetrain. Furthermore, a detailed electrical modelling of the generator, the full scale power converter with discrete switching devices, its filters, the transformer and the grid as well as the control structure is considered. The analysis shows that, considering control measures based on active torsional damping, interactions between mechanical and electrical subsystems can significantly affect the loads and thus the individual lifetime of the components.

  10. The Combination of Laser Scanning and Structure from Motion Technology for Creation of Accurate Exterior and Interior Orthophotos of ST. Nicholas Baroque Church

    NASA Astrophysics Data System (ADS)

    Koska, B.; Křemen, T.

    2013-02-01

    Terrestrial laser scanning technology has been used for the creation of building documentation and 3D building models since its emergence at the turn of the millennium. Photogrammetry has an even longer tradition in this field. Both technologies have some technical limitations when used for creation of a façade or even an interior orthophoto, but a combination of the two is profitable. Laser scanning can be used for creation of an accurate 3D model and photogrammetry for the consequent application of high-quality colour information. Both technologies were used in synergy to create the building plans, 2D drawing documentation of facades and interior views, and the orthophotos of the St. Nicholas Baroque church in Prague. The case study is described in detail in the paper.

  11. Development of Detailed Kinetic Models for Fischer-Tropsch Fuels

    SciTech Connect

    Westbrook, C K; Pitz, W J; Carstensen, H; Dean, A M

    2008-10-28

    Fischer-Tropsch (FT) fuels can be synthesized from a syngas stream generated by the gasification of biomass. As such they have the potential to be a renewable hydrocarbon fuel with many desirable properties. However, both the chemical and physical properties are somewhat different from the petroleum-based hydrocarbons that they might replace, and it is important to account for such differences when considering using them as replacements for conventional fuels in devices such as diesel engines and gas turbines. FT fuels generally contain iso-alkanes with one or two substituted methyl groups to meet the pour-point specifications. Although models have been developed for smaller branched alkanes such as isooctane, additional efforts are required to properly capture the kinetics of the larger branched alkanes. Recently, Westbrook et al. developed a chemical kinetic model that can be used to represent the entire series of n-alkanes from C1 to C16 (Figure 1). In the current work, the model is extended to treat 2,2,4,4,6,8,8-heptamethylnonane (HMN), a large iso-alkane. The same reaction rate rules used in the iso-octane mechanism were incorporated in the HMN mechanism. Both high- and low-temperature chemistry was included so that the chemical kinetic model would be applicable to advanced internal combustion engines using low-temperature combustion strategies. The chemical kinetic model consists of 1114 species and 4468 reactions. Concurrently with this effort, work is underway to improve the details of specific reaction classes in the mechanism, guided by high-level electronic structure calculations. Attention is focused upon development of accurate rate rules for abstraction of the tertiary hydrogens present in branched alkanes and properly accounting for the pressure dependence of the β-scission, isomerization, and R + O2 reactions.
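    Rate rules of this kind assign one set of modified-Arrhenius parameters per abstraction-site class (primary, secondary, tertiary H). A minimal sketch of how such rules are evaluated, with entirely illustrative parameters (not the mechanism's values); the H-atom counts are derived from the HMN structure:

    ```python
    import math

    R_CAL = 1.987  # gas constant [cal/(mol·K)]

    def k_mod_arrhenius(A, n, Ea, T):
        """Modified Arrhenius rate constant k(T) = A · T^n · exp(-Ea/(R·T))."""
        return A * T**n * math.exp(-Ea / (R_CAL * T))

    # Illustrative per-site rules for H-abstraction (A [cm³/(mol·s)], n, Ea [cal/mol]).
    # Placeholder numbers, NOT the values used in the actual mechanism.
    RATE_RULES = {
        "primary":   (1.0e6, 2.0, 1000.0),
        "secondary": (2.0e6, 2.0,  500.0),
        "tertiary":  (4.0e6, 2.0,  100.0),
    }

    def total_abstraction_rate(site_counts, T):
        """Sum per-H rates weighted by the number of equivalent H atoms per class."""
        return sum(nH * k_mod_arrhenius(*RATE_RULES[site], T)
                   for site, nH in site_counts.items())

    # H-atom inventory of 2,2,4,4,6,8,8-heptamethylnonane (C16H34):
    # 9 CH3 groups -> 27 primary H, 3 CH2 groups -> 6 secondary H, 1 CH -> 1 tertiary H.
    sites_hmn = {"primary": 27, "secondary": 6, "tertiary": 1}
    k_total = total_abstraction_rate(sites_hmn, 1000.0)
    print(f"total abstraction rate at 1000 K: {k_total:.3e}")
    ```

    The lower activation energy for the lone tertiary H is what makes accurate tertiary-abstraction rules, singled out above, disproportionately important for branched alkanes.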

  12. The impact of model detail on power grid resilience measures

    NASA Astrophysics Data System (ADS)

    Auer, S.; Kleis, K.; Schultz, P.; Kurths, J.; Hellmann, F.

    2016-05-01

    Extreme events are a challenge to natural as well as man-made systems. For critical infrastructure like power grids, we need to understand their resilience against large disturbances. Recently, new measures of the resilience of dynamical systems have been developed in the complex systems literature. Basin stability and survivability respectively assess the asymptotic and transient behavior of a system when subjected to arbitrary, localized but large perturbations in frequency and phase. To employ these methods to assess power grid resilience, we need to choose a certain level of model detail for the power grid. For the grid topology we considered the Scandinavian grid and an ensemble of power grids generated with a random growth model. So far the most popular model that has been studied is the classical swing equation model for the frequency response of generators and motors. In this paper we study a more sophisticated model of synchronous machines that also takes voltage dynamics into account, and compare it to the previously studied model. This model has been found to give an accurate picture of the long-term evolution of synchronous machines in the engineering literature for post-fault studies. We find evidence that some stable fixed points of the swing equation become unstable when we add voltage dynamics. If this occurs, the asymptotic behavior of the system can be dramatically altered, and basin stability estimates obtained with the swing equation can be dramatically wrong. We also find that the survivability does not change significantly when the voltage dynamics are taken into account. Further, the limit-cycle-type asymptotic behavior is strongly correlated with transient voltages that violate typical operational voltage bounds. Thus, transient voltage bounds are dominated by transient frequency bounds and play no major role for realistic parameters.
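    A single-machine version of the basin-stability estimate can be sketched as follows: sample large random perturbations in phase and frequency, integrate the swing equation, and count the fraction of trajectories that return to the synchronous fixed point. Parameters, perturbation ranges, and tolerances are illustrative choices, not those of the paper.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Swing equation of one machine against an infinite bus (per-unit):
    #   phi' = omega,   omega' = P - alpha*omega - K*sin(phi)
    P, ALPHA, K = 0.5, 0.1, 1.0
    PHI_STAR = float(np.arcsin(P / K))   # stable fixed point (omega = 0)

    def swing(t, y):
        phi, omega = y
        return [omega, P - ALPHA * omega - K * np.sin(phi)]

    def returns_to_sync(phi0, omega0, t_end=150.0, tol=0.05):
        """Integrate from a perturbed state and test convergence to the fixed point."""
        sol = solve_ivp(swing, (0.0, t_end), [phi0, omega0], rtol=1e-6, atol=1e-8)
        phi_end, omega_end = sol.y[:, -1]
        dphi = (phi_end - PHI_STAR + np.pi) % (2 * np.pi) - np.pi  # phase mod 2π
        return bool(abs(dphi) < tol and abs(omega_end) < tol)

    def basin_stability(n_samples=150, seed=1):
        """Fraction of large random (phase, frequency) perturbations that return;
        the non-returning trajectories end on the drifting limit cycle."""
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(n_samples):
            phi0 = PHI_STAR + rng.uniform(-np.pi, np.pi)
            omega0 = rng.uniform(-10.0, 10.0)
            hits += returns_to_sync(phi0, omega0)
        return hits / n_samples

    S = basin_stability()
    print(f"single-node basin stability estimate: {S:.2f}")
    ```

    Adding voltage dynamics, as the abstract discusses, enlarges the state vector per machine; the same sampling scheme then probes whether fixed points that are stable under the swing equation remain stable in the fuller model.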

  13. Method for accurate growth of vertical-cavity surface-emitting lasers

    DOEpatents

    Chalmers, S.A.; Killeen, K.P.; Lear, K.L.

    1995-03-14

    The authors report a method for accurate growth of vertical-cavity surface-emitting lasers (VCSELs). The method uses a single reflectivity spectrum measurement to determine the structure of the partially completed VCSEL at a critical point of growth. This information, along with the extracted growth rates, allows imprecisions in growth parameters to be compensated for during growth of the remaining structure, which can then be completed with very accurate critical dimensions. Using this method, they can now routinely grow lasing VCSELs with Fabry-Perot cavity resonance wavelengths controlled to within 0.5%. 4 figs.

  14. Accurate localization and echocardiographic-pathologic correlation of tricuspid valve angiolipoma by intraoperative transesophageal echocardiography.

    PubMed

    Misra, Satyajeet; Sinha, Prabhat K; Koshy, Thomas; Sandhyamani, Samavedam; Parija, Chandrabhanu; Gopal, Kirun

    2009-11-01

    Angiolipoma (angiolipohamartoma) of the tricuspid valve (TV) is a rare tumor which may be occasionally misdiagnosed as right atrial (RA) myxoma. Transesophageal echocardiography (TEE) provides accurate information regarding the size, shape, mobility as well as site of attachment of RA tumors and is a superior modality as compared to transthoracic echocardiography (TTE). Correct diagnosis of RA tumors has therapeutic significance and guides management of patients, as myxomas are generally more aggressively managed than lipomas. We describe a rare case of a pedunculated angiolipoma of the TV which was misdiagnosed as RA myxoma on TTE and discuss the echocardiographic-pathologic correlates of the tumor as well as its accurate localization by TEE.

  15. Method for accurate growth of vertical-cavity surface-emitting lasers

    DOEpatents

    Chalmers, Scott A.; Killeen, Kevin P.; Lear, Kevin L.

    1995-01-01

    We report a method for accurate growth of vertical-cavity surface-emitting lasers (VCSELs). The method uses a single reflectivity spectrum measurement to determine the structure of the partially completed VCSEL at a critical point of growth. This information, along with the extracted growth rates, allows imprecisions in growth parameters to be compensated for during growth of the remaining structure, which can then be completed with very accurate critical dimensions. Using this method, we can now routinely grow lasing VCSELs with Fabry-Perot cavity resonance wavelengths controlled to within 0.5%.

  16. Shocking Detail of Superstar's Activity Revealed

    NASA Astrophysics Data System (ADS)

    1999-10-01

    High resolution digital versions of the X-ray image (JPG, 300 dpi TIFF) and other information associated with this release are available on the Internet at: http://chandra.harvard.edu/photo/0099/index.html or via links in: http://chandra.harvard.edu

  17. What Data to Use for Forest Conservation Planning? A Comparison of Coarse Open and Detailed Proprietary Forest Inventory Data in Finland

    PubMed Central

    Lehtomäki, Joona; Tuominen, Sakari; Toivonen, Tuuli; Leinonen, Antti

    2015-01-01

    The boreal region is facing intensifying resource extraction pressure, but the lack of comprehensive biodiversity data makes operative forest conservation planning difficult. Many countries have implemented forest inventory schemes and are making extensive and up-to-date forest databases increasingly available. Some of the more detailed inventory databases, however, remain proprietary and unavailable for conservation planning. Here, we investigate how well different open and proprietary forest inventory data sets suit the purpose of conservation prioritization in Finland. We also explore how much priorities are affected by using the less accurate but open data. First, we construct a set of indices for forest conservation value based on quantitative information commonly found in forest inventories. These include the maturity of the trees, tree species composition, and site fertility. Second, using these data and accounting for connectivity between forest types, we investigate the patterns in conservation priority. For prioritization, we use Zonation, a method and software for spatial conservation prioritization. We then validate the prioritizations by comparing them to known areas of high conservation value. We show that the overall priority patterns are relatively consistent across different data sources and analysis options. However, the coarse data cannot be used to accurately identify the high-priority areas, as it misses much of the fine-scale variation in forest structures. We conclude that, while inventory data collected for forestry purposes may be useful for conservation planning, the data need to be detailed enough to account for fine-scale features of high conservation value. These results underline the importance of making detailed inventory data publicly available. Finally, we discuss how the prioritization methodology we used could be integrated into operative forest management, especially in countries in the boreal zone. PMID

  18. Tube dimpling tool assures accurate dip-brazed joints

    NASA Technical Reports Server (NTRS)

    Beuyukian, C. S.; Heisman, R. M.

    1968-01-01

    Portable, hand-held dimpling tool assures accurate brazed joints between tubes of different diameters. Prior to brazing, the tool performs precise dimpling and nipple forming and also provides control and accurate measuring of the height of nipples and depth of dimples so formed.

  19. 31 CFR 205.24 - How are accurate estimates maintained?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 31 Money and Finance: Treasury 2 2010-07-01 2010-07-01 false How are accurate estimates maintained... Treasury-State Agreement § 205.24 How are accurate estimates maintained? (a) If a State has knowledge that an estimate does not reasonably correspond to the State's cash needs for a Federal assistance...

  20. Evaluation of freely available ancillary data used for detailed soil mapping in Brazil

    NASA Astrophysics Data System (ADS)

    Samuel-Rosa, Alessandro; Anjos, Lúcia; Vasques, Gustavo; Heuvelink, Gerard

    2014-05-01

    Brazil is one of the world's largest food producers, and is home to both the largest rainforest and the largest supply of renewable fresh water on Earth. However, it lacks detailed soil information for extensive areas of the country. The best soil map covering the entire country was published at a scale of 1:5,000,000. Termination of governmental support for systematic soil mapping in the 1980s made detailed soil mapping of the whole country a very difficult task to accomplish. Nowadays, due to new user-driven demands (e.g. precision agriculture), most detailed soil maps are produced for small areas. Many of them rely on freely available ancillary data used as is, although the accuracy of these data is usually not reported or is unknown. Results from a validation exercise that we performed using ground control points from a small hilly catchment (20 km²) in Southern Brazil (53.7995°W, 29.6355°S) indicate that most freely available ancillary data need some type of correction before use. Georeferenced and orthorectified RapidEye imagery (recently acquired by the Brazilian government) has a horizontal accuracy (root-mean-square error, RMSE) of 37 m, which is worse than the value published in the metadata (32 m). Like any remote sensing imagery, RapidEye imagery needs to be correctly registered before its use for soil mapping. Topographic maps produced by the Brazilian Army and derived geological maps (scale of 1:25,000) have a horizontal accuracy of 65 m, which is more than four times the maximum value allowed by Brazilian legislation (15 m). Worse results were found for geological maps derived from 1:50,000 topographic maps (RMSE = 147 m), for which the maximum allowed value is 30 m. In most cases positional errors are of systematic origin and can be easily corrected (e.g., by an affine transformation). ASTER GDEM has many holes and is very noisy, making it of little use in the studied area. TOPODATA, which is SRTM kriged from originally 3 to 1 arc-second by the Brazilian National
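The horizontal RMSE figures quoted above, and the affine correction suggested for systematic positional errors, can be illustrated with a short sketch. The control-point coordinates below are invented planar values; only the 37 m shift echoes the RapidEye RMSE reported in the abstract.

```python
import numpy as np

# Illustrative sketch: horizontal RMSE of ground control points, and a
# least-squares 2-D affine transform that removes systematic error.
# Coordinates are made-up planar values, not the study's actual GCPs.
def horizontal_rmse(measured, reference):
    d = np.asarray(measured, float) - np.asarray(reference, float)
    return np.sqrt(np.mean(np.sum(d**2, axis=1)))

def fit_affine(measured, reference):
    """Least-squares 2-D affine: reference ≈ [x, y, 1] @ A."""
    m = np.asarray(measured, float)
    X = np.column_stack([m, np.ones(len(m))])          # (n, 3) design matrix
    A, *_ = np.linalg.lstsq(X, np.asarray(reference, float), rcond=None)
    return X @ A                                        # corrected coordinates

ref = np.array([[0., 0.], [100., 0.], [0., 100.], [100., 100.]])
shifted = ref + np.array([37., 0.])                     # purely systematic shift
print(horizontal_rmse(shifted, ref))                    # 37.0
print(horizontal_rmse(fit_affine(shifted, ref), ref))   # ~0: shift removed
```

A purely systematic error like this vanishes after the affine fit; random georeferencing noise would not, which is why registration quality still matters.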

  1. Characteristics of physicians targeted by the pharmaceutical industry to participate in e-detailing.

    PubMed

    Alkhateeb, Fadi M; Khanfar, Nile M; Doucette, William R; Loudon, David

    2009-01-01

    Electronic detailing (e-detailing) has been introduced in the last few years by the pharmaceutical industry as a new communication channel through which to promote pharmaceutical products to physicians. E-detailing involves using digital technology, such as the Internet, video conferencing, and interactive voice response, by which drug companies target their marketing efforts toward specific physicians with pinpoint accuracy. A mail survey of 671 Iowa physicians was used to gather information about the physician characteristics and practice setting characteristics of those who are usually targeted by pharmaceutical companies to participate in e-detailing. A model is developed and tested to explain firms' strategy for targeting physicians for e-detailing. PMID:19408179

  2. The Devil is in the Details: Using X-Ray Computed Tomography to Develop Accurate 3D Grain Characteristics and Bed Structure Metrics for Gravel Bed Rivers

    NASA Astrophysics Data System (ADS)

    Voepel, H.; Hodge, R. A.; Leyland, J.; Sear, D. A.; Ahmed, S. I.

    2014-12-01

    Uncertainty for bedload estimates in gravel bed rivers is largely driven by our inability to characterize the arrangement and orientation of the sediment grains within the bed. The characteristics of the surface structure are produced by the water working of grains, which leads to structural differences in bedforms through differential patterns of grain sorting, packing, imbrication, mortaring and degree of bed armoring. Until recently, the technical and logistical difficulties of characterizing the arrangement of sediment in 3D have prohibited a full understanding of how grains interact with stream flow and the feedback mechanisms that exist. Micro-focus X-ray CT has been used for non-destructive 3D imaging of grains within a series of intact sections of river bed taken from key morphological units (see Figure 1). Volume, center of mass, points of contact, protrusion and spatial orientation of individual surface grains are derived from these 3D images, which, in turn, facilitates estimates of 3D static force properties at the grain scale such as pivoting angles, buoyancy and gravity forces, and grain exposure. By aggregating representative samples of grain-scale properties of localized interacting sediment into overall metrics, we can compare and contrast bed stability at a macro-scale with respect to stream bed morphology. Understanding differences in bed stability through representative metrics derived at the grain scale will ultimately lead to improved bedload estimates with reduced uncertainty and increased understanding of the interactions between grain-scale properties and channel morphology. Figure 1. CT scans of a water-worked gravel-filled pot. a. 3D rendered scan showing the outer mesh, and b. the same pot with the mesh removed. c. vertical change in porosity of the gravels sampled in 5 mm volumes. Values are typical of those measured in the field and lab. d. 2-D slices through the gravels at 20% depth from surface (porosity = 0.35), and e. at 75% depth from surface (porosity = 0.24), showing the presence of fine sediments 'mortaring' the larger gravels. f. a longitudinal slice from which pivot angle measurements can be determined for contact points between particles. g. Example of two-particle extraction from the CT scan showing how particle contact areas can be measured (dark area).
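One of the metrics in the figure caption, the vertical porosity profile computed over fixed-depth volumes, can be sketched from a segmented CT volume. This is an assumption-laden illustration: the voxel array is synthetic, and the slab thickness is given in voxels rather than the paper's 5 mm.

```python
import numpy as np

# Hedged sketch of one caption metric: a vertical porosity profile from a
# boolean voxel volume (True = solid grain). The volume is synthetic and
# the slab thickness is an assumed voxel count, not the paper's 5 mm.
def porosity_profile(solid, slab_voxels):
    """solid: (z, y, x) boolean volume; returns porosity per depth slab."""
    nz = solid.shape[0] - solid.shape[0] % slab_voxels   # trim partial slab
    slabs = solid[:nz].reshape(-1, slab_voxels, *solid.shape[1:])
    solid_fraction = slabs.mean(axis=(1, 2, 3))          # per-slab solid ratio
    return 1.0 - solid_fraction                          # porosity = void ratio

rng = np.random.default_rng(0)
volume = rng.random((40, 16, 16)) < 0.65                 # ~65% solid voxels
print(porosity_profile(volume, slab_voxels=10))          # ~0.35 in each slab
```

On real scans the hard part is the preceding segmentation of grains from voids; once the volume is binarized, the profile itself is a per-slab mean as above.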

  3. Patterns of Communication through Interpreters: A Detailed Sociolinguistic Analysis

    PubMed Central

    Aranguri, Cesar; Davidson, Brad; Ramirez, Robert

    2006-01-01

    BACKGROUND Numerous articles have detailed how the presence of an interpreter leads to less satisfactory communication with physicians; few have studied how actual communication takes place through an interpreter in a clinical setting. OBJECTIVE Record and analyze physician-interpreter-patient interactions. DESIGN Primary care physicians with high-volume Hispanic practices were recruited for a communication study. Dyslipidemic Hispanic patients, either monolingual Spanish or bilingual Spanish-English, were recruited on the day of a normally scheduled appointment and, once consented, recorded without a researcher present in the room. Separate postvisit interviews were conducted with the patient and the physician. All interactions were fully transcribed and analyzed. PARTICIPANTS Sixteen patients were recorded interacting with 9 physicians. Thirteen patients used an interpreter with 8 physicians, and 3 patients spoke Spanish with the 1 bilingual physician. APPROACH Transcript analysis based on sociolinguistic and discourse analytic techniques, including but not limited to time speaking, analysis of questions asked and answered, and the loss of semantic information. RESULTS Speech was significantly reduced and revised by the interpreter, resulting in an alteration of linguistic features such as content, meaning, reinforcement/validation, repetition, and affect. In addition, visits that included an interpreter had virtually no rapport-building “small talk,” which typically enables the physician to gain comprehensive patient history, learn clinically relevant information, and increase emotional engagement in treatment. CONCLUSIONS The presence of an interpreter increases the difficulty of achieving good physician-patient communication. Physicians and interpreters should be trained in the process of communication and interpretation, to minimize conversational loss and maximize the information and relational exchange with interpreted patients. PMID:16808747

  4. Measurement of Fracture Geometry for Accurate Computation of Hydraulic Conductivity

    NASA Astrophysics Data System (ADS)

    Chae, B.; Ichikawa, Y.; Kim, Y.

    2003-12-01

    Fluid flow in a rock mass is controlled by the geometry of fractures, which is mainly characterized by roughness, aperture and orientation. Fracture roughness and aperture were observed with a new confocal laser scanning microscope (CLSM; Olympus OLS1100). The wavelength of the laser is 488 nm, and the laser scanning is managed by a light polarization method using two galvanometer scanner mirrors. The system improves resolution in the optical axis (z) direction because of the confocal optics. Sampling is performed at a spacing of 2.5 μm along the x and y directions. The highest measurement resolution in the z direction is 0.05 μm, which is more accurate than other methods. For the roughness measurements, core specimens of coarse- and fine-grained granites were prepared. Measurements were performed along three scan lines on each fracture surface. The measured data were represented as 2-D and 3-D digital images showing detailed features of roughness. Spectral analyses by the fast Fourier transform (FFT) were performed to characterize the roughness data quantitatively and to identify the influential frequencies of roughness. The FFT results showed that components of low frequencies were dominant in the fracture roughness. This study also verifies that spectral analysis is a good approach to understanding the complicated characteristics of fracture roughness. For the aperture measurements, digital images of the aperture were acquired under five stages of applied uniaxial normal stress. This method can characterize the response of the aperture directly using the same specimen. Results of the measurements show that the reduction in aperture differs from part to part due to the rough geometry of the fracture walls. Laboratory permeability tests were also conducted to evaluate changes in hydraulic conductivity related to aperture variation at different stress levels. The results showed non-uniform reduction of hydraulic conductivity under increasing normal stress and different values of
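The spectral-analysis step can be sketched as follows: a 1-D roughness profile sampled at the abstract's 2.5 μm spacing is transformed with an FFT and the dominant spatial frequency is identified. The synthetic profile (one large low-frequency undulation plus a small high-frequency ripple) is an assumption for illustration, mimicking the low-frequency dominance the abstract reports.

```python
import numpy as np

# Sketch of FFT spectral analysis of a 1-D roughness profile sampled at
# 2.5 µm (the paper's spacing). The profile itself is synthetic: a large
# low-frequency undulation plus a small high-frequency ripple.
spacing = 2.5e-6                                   # sample spacing [m]
n = 1024
x = np.arange(n) * spacing
f_low, f_high = 4 / (n * spacing), 128 / (n * spacing)   # cycles per metre
profile = (5e-6 * np.sin(2 * np.pi * f_low * x)
           + 0.2e-6 * np.sin(2 * np.pi * f_high * x))

freqs = np.fft.rfftfreq(n, d=spacing)              # spatial frequencies [1/m]
power = np.abs(np.fft.rfft(profile)) ** 2 / n
dominant = freqs[np.argmax(power[1:]) + 1]         # skip the DC bin
# dominant == f_low here: the low-frequency component carries most power,
# as the FFT results in the abstract indicate for natural fracture roughness.
```

Both synthetic components fall exactly on FFT bins (integer cycles over the window), so the spectrum shows two clean peaks with no leakage.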

  5. Gravitational waves from compact binaries inspiralling along post-Newtonian accurate eccentric orbits: Data analysis implications

    SciTech Connect

    Tessmer, Manuel; Gopakumar, Achamveedu

    2008-10-15

    Compact binaries inspiralling along eccentric orbits are plausible gravitational-wave (GW) sources for the ground-based laser interferometers. We explore the losses in the event rates incurred when searching for GWs from compact binaries inspiralling along post-Newtonian accurate eccentric orbits with certain obvious nonoptimal search templates. For the present analysis, GW signals having 2.5 post-Newtonian (PN) accurate orbital evolution are modeled following the phasing formalism, presented by T. Damour, A. Gopakumar, and B. R. Iyer [Phys. Rev. D 70, 064028 (2004)]. We demonstrate that the search templates that model in a gauge-invariant manner GWs from compact binaries inspiralling under quadrupolar radiation reaction along 2PN accurate circular orbits are very efficient in capturing our somewhat realistic GW signals. However, three types of search templates based on the adiabatic, complete adiabatic, and gauge-dependent complete nonadiabatic approximants, detailed in P. Ajith, B. R. Iyer, C. A. K. Robinson, and B. S. Sathyaprakash, Phys. Rev. D 71, 044029 (2005), relevant for the circular inspiral under the quadrupolar radiation reaction were found to be inefficient in capturing the above-mentioned eccentric signal. We conclude that further investigations will be required to probe the ability of various types of PN accurate circular templates, employed to analyze the LIGO/VIRGO data, to capture GWs from compact binaries having tiny orbital eccentricities.

  6. Gravitational waves from compact binaries inspiralling along post-Newtonian accurate eccentric orbits: Data analysis implications

    NASA Astrophysics Data System (ADS)

    Tessmer, Manuel; Gopakumar, Achamveedu

    2008-10-01

    Compact binaries inspiralling along eccentric orbits are plausible gravitational-wave (GW) sources for the ground-based laser interferometers. We explore the losses in the event rates incurred when searching for GWs from compact binaries inspiralling along post-Newtonian accurate eccentric orbits with certain obvious nonoptimal search templates. For the present analysis, GW signals having 2.5 post-Newtonian (PN) accurate orbital evolution are modeled following the phasing formalism, presented by T. Damour, A. Gopakumar, and B. R. Iyer [Phys. Rev. D 70, 064028 (2004)]. We demonstrate that the search templates that model in a gauge-invariant manner GWs from compact binaries inspiralling under quadrupolar radiation reaction along 2PN accurate circular orbits are very efficient in capturing our somewhat realistic GW signals. However, three types of search templates based on the adiabatic, complete adiabatic, and gauge-dependent complete nonadiabatic approximants, detailed in P. Ajith, B. R. Iyer, C. A. K. Robinson, and B. S. Sathyaprakash [Phys. Rev. D 71, 044029 (2005)], relevant for the circular inspiral under the quadrupolar radiation reaction were found to be inefficient in capturing the above-mentioned eccentric signal. We conclude that further investigations will be required to probe the ability of various types of PN accurate circular templates, employed to analyze the LIGO/VIRGO data, to capture GWs from compact binaries having tiny orbital eccentricities.

  7. History and progress on accurate measurements of the Planck constant.

    PubMed

    Steiner, Richard

    2013-01-01

    The measurement of the Planck constant, h, is entering a new phase. The CODATA 2010 recommended value is 6.626 069 57 × 10⁻³⁴ J s, but it has been a long road, and the trip is not over yet. Since its discovery as a fundamental physical constant to explain various effects in quantum theory, h has become especially important in defining standards for electrical measurements and soon, for mass determination. Measuring h in the International System of Units (SI) started as experimental attempts merely to prove its existence. Many decades passed while newer experiments measured physical effects that were the influence of h combined with other physical constants: the elementary charge, e, and the Avogadro constant, N_A. As experimental techniques improved, so did the precision of the value of h. When the Josephson and quantum Hall theories led to new electronic devices, and a hundred-year-old experiment, the absolute ampere, was altered into a watt balance, h not only became vital in definitions for the volt and ohm units, but suddenly it could be measured directly and even more accurately. Finally, as measurement uncertainties now approach a few parts in 10⁸ from the watt balance experiments and Avogadro determinations, its importance has been linked to a proposed redefinition of the kilogram unit of mass. The path to higher accuracy in measuring the value of h was not always an example of continuous progress. Since new measurements periodically led to changes in its accepted value and the corresponding SI units, it is helpful to see why there were bumps in the road and where the different branch lines of research joined in the effort. Recalling the bumps along this road will hopefully avoid their repetition in the upcoming SI redefinition debates. This paper begins with a brief history of the methods to measure a combination of fundamental constants, thus indirectly obtaining the Planck constant. The historical path is followed in the section describing how the
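The indirect electrical route the abstract mentions can be made concrete: the Josephson constant K_J = 2e/h and the von Klitzing constant R_K = h/e² combine as K_J² · R_K = 4/h, so h = 4 / (K_J² · R_K). The snippet below uses the conventional 1990 values as an illustration, not as a metrological determination.

```python
# Sketch of the indirect route: h inferred from electrical quantum effects.
# Josephson: K_J = 2e/h; quantum Hall: R_K = h/e^2.
# Hence K_J^2 * R_K = 4/h, so h = 4 / (K_J^2 * R_K).
K_J_90 = 483597.9e9      # conventional Josephson constant [Hz/V]
R_K_90 = 25812.807       # conventional von Klitzing constant [ohm]
h = 4.0 / (K_J_90**2 * R_K_90)
print(h)                 # ≈ 6.626e-34 J s, close to the CODATA 2010 value
```

The agreement is built in: the 1990 conventional values were fixed from the best measurements of the day, which is precisely why h became central to the volt and ohm definitions.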

  8. Accurate compressed look up table method for CGH in 3D holographic display.

    PubMed

    Gao, Chuan; Liu, Juan; Li, Xin; Xue, Gaolei; Jia, Jia; Wang, Yongtian

    2015-12-28

    Computer-generated holograms (CGHs) should be obtained with high accuracy and high speed for 3D holographic display, and most research focuses on the high speed. In this paper, a simple and effective computation method for CGH is proposed based on Fresnel diffraction theory and a look-up table. Numerical simulations and optical experiments are performed to demonstrate its feasibility. The proposed method can obtain more accurate reconstructed images with lower memory usage compared with the split look-up table method and the compressed look-up table method, without sacrificing computational speed in hologram generation, so it is called the accurate compressed look-up table (AC-LUT) method. It is believed that the AC-LUT method is an effective way to calculate the CGH of 3D objects for real-time 3D holographic display, where huge amounts of data are required, and it could provide fast and accurate digital transmission in various dynamic optical fields in the future. PMID:26831987
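The compressed look-up-table idea that AC-LUT builds on rests on the fact that the 2-D Fresnel point-spread term exp(iπ((x−x_j)² + (y−y_j)²)/(λz)) factorizes into two 1-D terms, so only 1-D tables of length 2N−1 need be stored for an N×N hologram. The sketch below illustrates that factorization only; it is not the paper's AC-LUT algorithm, and all numeric parameters (grid size, pixel pitch, wavelength, distance) are assumptions.

```python
import numpy as np

# Illustration of the compressed-LUT principle (NOT the paper's AC-LUT):
# the 2-D Fresnel term factorizes, so an N x N hologram needs only a 1-D
# lookup table of length 2N-1. All numeric parameters are assumptions.
N, pitch, lam, z = 64, 8e-6, 532e-9, 0.1
offsets = np.arange(-(N - 1), N) * pitch             # every possible x - xj
lut = np.exp(1j * np.pi * offsets**2 / (lam * z))    # compressed 1-D table

def hologram(points):
    """points: list of (ix, iy, amplitude) object samples on the grid."""
    H = np.zeros((N, N), complex)
    idx = np.arange(N)
    for ix, iy, a in points:
        # Outer product of two 1-D LUT slices reproduces the 2-D term.
        H += a * np.outer(lut[idx - iy + N - 1], lut[idx - ix + N - 1])
    return H

H = hologram([(32, 32, 1.0), (10, 50, 0.5)])
# Direct evaluation of the 2-D Fresnel term gives the same hologram.
```

The memory saving is the point: a 2-D table for an N×N grid costs O(N²) complex entries per depth, while the factorized table costs O(N).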

  10. Detailed high-accuracy megavoltage transmission measurements: A sensitive experimental benchmark of EGSnrc

    SciTech Connect

    Ali, E. S. M.; McEwen, M. R.; Rogers, D. W. O.

    2012-10-15

    Purpose: There are three goals for this study: (a) to perform detailed megavoltage transmission measurements in order to identify the factors that affect the measurement accuracy, (b) to use the measured data as a benchmark for the EGSnrc system in order to identify the computational limiting factors, and (c) to provide data for others to benchmark Monte Carlo codes. Methods: Transmission measurements are performed at the National Research Council Canada on a research linac whose incident electron parameters are independently known. Automated transmission measurements are made on-axis, down to a transmission value of ≈1.7%, for eight beams between 10 MV (the lowest stable MV beam on the linac) and 30 MV, using fully stopping Be, Al, and Pb bremsstrahlung targets and no flattening filters. To diversify energy differentiation, data are acquired for each beam using low-Z and high-Z attenuators (C and Pb) and Farmer chambers with low-Z and high-Z buildup caps. Experimental corrections are applied for beam drifts (2%), polarity (2.5% typical maximum, 6% extreme), ion recombination (0.2%), leakage (0.3%), and room scatter (0.8%); the values in parentheses are the largest corrections applied. The experimental setup and the detectors are modeled using EGSnrc, with the newly added photonuclear attenuation included (up to a 5.6% effect). A detailed sensitivity analysis is carried out for the measured and calculated transmission data. Results: The developed experimental protocol allows for transmission measurements with 0.4% uncertainty on the smallest signals. Suggestions for accurate transmission measurements are provided. Measurements and EGSnrc calculations agree typically within 0.2% for the sensitivity of the transmission values to the detector details, to the bremsstrahlung target material, and to the incident electron energy. Direct comparison of the measured and calculated transmission data shows agreement better than 2% for C (3.4% for the 10 MV beam) and

  11. Analysis of information systems for hydropower operations: Executive summary

    NASA Technical Reports Server (NTRS)

    Sohn, R. L.; Becker, L.; Estes, J.; Simonett, D.; Yeh, W.

    1976-01-01

    An analysis was performed of the operations of hydropower systems, with emphasis on water resource management, to determine how aerospace-derived information system technologies can effectively increase energy output. Better utilization of water resources was sought through improved reservoir inflow forecasting based on the use of hydrometeorologic information systems with new or improved sensors, satellite data relay systems, and the use of advanced scheduling techniques for water release. Specific mechanisms for increased energy output were determined, principally the use of more timely and accurate short-term (0-7 days) inflow information to reduce spillage caused by unanticipated dynamic high-inflow events. The hydrometeorologic models used in predicting inflows were examined in detail to determine the sensitivity of inflow prediction accuracy to the many variables employed in the models, and the results were used to establish information system requirements. Sensor and data handling system capabilities were reviewed and compared to the requirements, and an improved information system concept was outlined.

  12. 7. DETAIL OF INCLINED END POST WITH PIN CONNECTION, EYEBARS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. DETAIL OF INCLINED END POST WITH PIN CONNECTION, EYEBARS AND LATTICE PORTAL BRACE. FINIAL DETAILS SEEN ATOP. - Slates' Mill Bridge, Township Road 439 spanning South Branch of Tunkhannock Creek in Benton Township, Dalton, Lackawanna County, PA

  13. 9. VIEW TO NORTHEAST. DETAIL, OBLIQUE VIEW OF WEST APPROACH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. VIEW TO NORTHEAST. DETAIL, OBLIQUE VIEW OF WEST APPROACH SPAN. NOTE PIN CONNECTIONS, UNDERSIDE DETAILS, SHADOW PATTERN CAST BY STEEL OPEN GRATE DECK. - Gianella Bridge, Spanning Sacramento River at State Highway 32, Hamilton City, Glenn County, CA

  14. Lock 1 (Savannah River Lock), Elevation of North Wall, Detail ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Lock 1 (Savannah River Lock), Elevation of North Wall, Detail of Wall Foundation, Detail of Gate Pocket - Savannah & Ogeechee Barge Canal, Between Ogeechee & Savannah Rivers, Savannah, Chatham County, GA

  15. 9. South abutment, detail of collapsed east wing wall; also ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. South abutment, detail of collapsed east wing wall; also detail of bottom lateral bracing and stringers; looking southeast - Dodd Ford Bridge, County Road 147 Spanning Blue Earth River, Amboy, Blue Earth County, MN

  16. 3. East side, details of north half of east web; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. East side, details of north half of east web; also details of roadway, railing and overhead bracing; looking northeast - Dodd Ford Bridge, County Road 147 Spanning Blue Earth River, Amboy, Blue Earth County, MN

  17. 3. Detail of north loading dock area showing column, insulated ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. Detail of north loading dock area showing column, insulated doors, and detail of underside of canopy - Fort Hood, World War II Temporary Buildings, Cold Storage Building, Seventeenth Street, Killeen, Bell County, TX

  18. 6. HOUSE NO. 2. DETAIL AT EAST END OF FRONT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. HOUSE NO. 2. DETAIL AT EAST END OF FRONT SHOWING SIDING AND ROOF-WALL JUNCTURE DETAILS. VIEW TO SOUTHWEST. - Holter Hydroelectric Facility, House No. 2, End of Holter Dam Road, Wolf Creek, Lewis and Clark County, MT

  19. 7. EXTERIOR SOUTHEAST SIDE DETAIL VIEW, FACING SOUTHWEST. BUILDINGS 101, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. EXTERIOR SOUTHEAST SIDE DETAIL VIEW, FACING SOUTHWEST. BUILDINGS 101, 278 AND CANOPY 685 DETAILED. - NASA Industrial Plant, Missile Research Laboratory, 12214 Lakewood Boulevard, Downey, Los Angeles County, CA

  20. Photocopy of "sheet 4 of 8" showing window details, door ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Photocopy of "sheet 4 of 8" showing window details, door sill detail, vertical wall sections, and cross sections thru front, side and rear elevations. - Badger Mountain Lookout, .125 mile northwest of Badger Mountain summit, East Wenatchee, Douglas County, WA

  1. 14 CFR 27.685 - Control system details.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... accordance with 5 U.S.C. section 552(a) and 1 CFR part 51. Copies may be obtained from the Naval Publications... system details. (a) Each detail of each control system must be designed to prevent jamming, chafing,...

  2. A fast and accurate algorithm for diploid individual haplotype reconstruction.

    PubMed

    Wu, Jingli; Liang, Binbin

    2013-08-01

    Haplotypes can provide significant information in many research fields, including molecular biology and medical therapy. However, haplotyping is much more difficult than genotyping using only biological techniques. With the development of sequencing technologies, it has become possible to obtain haplotypes by combining sequence fragments. The haplotype reconstruction problem for a diploid individual has received considerable attention in recent years: it assembles the two haplotypes for a chromosome given the collection of fragments coming from the two haplotypes. Fragment errors significantly increase the difficulty of the problem, which has been shown to be NP-hard. In this paper, a fast and accurate algorithm, named FAHR, is proposed for haplotyping a single diploid individual. The FAHR algorithm reconstructs the SNP sites of a pair of haplotypes one after another. The SNP fragments that cover a given SNP site are partitioned into two groups according to their alleles at that site, and the SNP values of the pair of haplotypes are ascertained using the fragments in the group that contains more SNP fragments. Experimental comparisons were conducted among the FAHR, Fast Hare, and DGS algorithms using the haplotypes on chromosome 1 of 60 individuals in the CEPH samples, which were released by the International HapMap Project. Experimental results under different parameter settings indicate that the reconstruction rate of the FAHR algorithm is higher than those of the Fast Hare and DGS algorithms, and its running time is shorter. Moreover, the FAHR algorithm has high efficiency even for the reconstruction of long haplotypes and is very practical for realistic applications.
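The per-site step the abstract describes can be sketched as follows. This is a deliberately simplified illustration, not the full FAHR algorithm (which also handles fragment-to-haplotype assignment and sequencing errors): at each SNP site, the fragments covering it are split into two groups by allele, and the larger group fixes that site on the first haplotype, with the complement going to the second.

```python
# Simplified sketch of the per-site majority step (NOT the full FAHR
# algorithm): fragments covering a SNP site are split by allele, and the
# larger group fixes the site on haplotype 1; haplotype 2 takes the
# heterozygous complement. Fragments are dicts {site_index: allele}.
def reconstruct_sites(fragments, n_sites):
    h1, h2 = [], []
    for site in range(n_sites):
        alleles = [f[site] for f in fragments if site in f]
        ones = sum(alleles)
        zeros = len(alleles) - ones
        major = 1 if ones >= zeros else 0
        h1.append(major)
        h2.append(1 - major)          # assume the site is heterozygous
    return h1, h2

frags = [{0: 1, 1: 0}, {0: 1, 1: 0, 2: 1}, {0: 0, 1: 1}, {2: 1}]
print(reconstruct_sites(frags, 3))    # ([1, 0, 1], [0, 1, 0])
```

The real algorithm must first decide which fragments belong to which haplotype (the NP-hard part under errors); this sketch shows only the voting step applied afterwards.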

  3. 5 CFR 352.305 - Eligibility for detail.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... RIGHTS Detail and Transfer of Federal Employees to International Organizations § 352.305 Eligibility for detail. An employee is eligible for detail to an international organization with the rights provided for... Service (SES). (d) A person serving under a temporary appointment....

  4. 14 CFR 23.685 - Control system details.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... cables or tubes against other parts. (d) Each element of the flight control system must have design... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Control system details. 23.685 Section 23... Control Systems § 23.685 Control system details. (a) Each detail of each control system must be...

  5. 14 CFR 23.685 - Control system details.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... cables or tubes against other parts. (d) Each element of the flight control system must have design... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Control system details. 23.685 Section 23... Control Systems § 23.685 Control system details. (a) Each detail of each control system must be...

  6. 14 CFR 23.685 - Control system details.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... cables or tubes against other parts. (d) Each element of the flight control system must have design... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Control system details. 23.685 Section 23... Control Systems § 23.685 Control system details. (a) Each detail of each control system must be...

  7. 14 CFR 23.685 - Control system details.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... cables or tubes against other parts. (d) Each element of the flight control system must have design... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Control system details. 23.685 Section 23... Control Systems § 23.685 Control system details. (a) Each detail of each control system must be...

  8. 14 CFR 23.685 - Control system details.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... cables or tubes against other parts. (d) Each element of the flight control system must have design... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Control system details. 23.685 Section 23... Control Systems § 23.685 Control system details. (a) Each detail of each control system must be...

  9. 14 CFR 27.685 - Control system details.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Control system details. 27.685 Section 27.685 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS STANDARDS: NORMAL CATEGORY ROTORCRAFT Design and Construction Control Systems § 27.685 Control system details. (a) Each detail of...

  10. 46 CFR 70.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 3 2011-10-01 2011-10-01 false Electrical engineering details. 70.25-1 Section 70.25-1... General Electrical Engineering Requirements § 70.25-1 Electrical engineering details. All electrical engineering details and installations shall be designed and installed in accordance with subchapter...

  11. 46 CFR 188.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 7 2011-10-01 2011-10-01 false Electrical engineering details. 188.25-1 Section 188.25... GENERAL PROVISIONS General Electrical Engineering Requirements § 188.25-1 Electrical engineering details. (a) The electrical engineering details shall be in accordance with subchapter J...

  12. 46 CFR 90.20-1 - Marine engineering details.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 4 2011-10-01 2011-10-01 false Marine engineering details. 90.20-1 Section 90.20-1... PROVISIONS General Marine Engineering Requirements § 90.20-1 Marine engineering details. (a) All marine engineering details such as piping, valves, fittings, boilers, pressure vessels, etc., and their...

  13. 46 CFR 90.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 4 2011-10-01 2011-10-01 false Electrical engineering details. 90.25-1 Section 90.25-1... PROVISIONS General Electrical Engineering Requirements § 90.25-1 Electrical engineering details. (a) All electrical engineering details and installations shall be designed and installed in accordance...

  14. 46 CFR 24.20-1 - Marine engineering details.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 1 2011-10-01 2011-10-01 false Marine engineering details. 24.20-1 Section 24.20-1... Engineering Requirements § 24.20-1 Marine engineering details. (a) All marine engineering details relative to... 40 feet in length will be found in subchapter F (Marine Engineering) of this chapter....

  15. 46 CFR 188.20-1 - Marine engineering details.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Marine engineering details. 188.20-1 Section 188.20-1... PROVISIONS General Marine Engineering Requirements § 188.20-1 Marine engineering details. (a) The marine engineering details shall be in accordance with Subchapter F (Marine Engineering) of this chapter....

  16. 46 CFR 90.20-1 - Marine engineering details.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Marine engineering details. 90.20-1 Section 90.20-1... PROVISIONS General Marine Engineering Requirements § 90.20-1 Marine engineering details. (a) All marine engineering details such as piping, valves, fittings, boilers, pressure vessels, etc., and their...

  17. 46 CFR 188.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Electrical engineering details. 188.25-1 Section 188.25... GENERAL PROVISIONS General Electrical Engineering Requirements § 188.25-1 Electrical engineering details. (a) The electrical engineering details shall be in accordance with subchapter J...

  18. 46 CFR 70.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 3 2010-10-01 2010-10-01 false Electrical engineering details. 70.25-1 Section 70.25-1... General Electrical Engineering Requirements § 70.25-1 Electrical engineering details. All electrical engineering details and installations shall be designed and installed in accordance with subchapter...

  19. 46 CFR 90.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Electrical engineering details. 90.25-1 Section 90.25-1... PROVISIONS General Electrical Engineering Requirements § 90.25-1 Electrical engineering details. (a) All electrical engineering details and installations shall be designed and installed in accordance...

  20. 46 CFR 24.20-1 - Marine engineering details.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Marine engineering details. 24.20-1 Section 24.20-1... Engineering Requirements § 24.20-1 Marine engineering details. (a) All marine engineering details relative to... 40 feet in length will be found in subchapter F (Marine Engineering) of this chapter....

  1. 46 CFR 188.20-1 - Marine engineering details.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 7 2011-10-01 2011-10-01 false Marine engineering details. 188.20-1 Section 188.20-1... PROVISIONS General Marine Engineering Requirements § 188.20-1 Marine engineering details. (a) The marine engineering details shall be in accordance with Subchapter F (Marine Engineering) of this chapter....

  2. 5 CFR 352.905 - Employees on detail.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false Employees on detail. 352.905 Section 352.905 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS REEMPLOYMENT RIGHTS Reemployment Rights After Service With the Panama Canal Commission § 352.905 Employees on detail. (a) An employee detailed to the Commission...

  3. 46 CFR 188.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... GENERAL PROVISIONS General Electrical Engineering Requirements § 188.25-1 Electrical engineering details. (a) The electrical engineering details shall be in accordance with subchapter J (Electrical... 46 Shipping 7 2013-10-01 2013-10-01 false Electrical engineering details. 188.25-1 Section...

  4. 46 CFR 188.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... GENERAL PROVISIONS General Electrical Engineering Requirements § 188.25-1 Electrical engineering details. (a) The electrical engineering details shall be in accordance with subchapter J (Electrical... 46 Shipping 7 2012-10-01 2012-10-01 false Electrical engineering details. 188.25-1 Section...

  5. 46 CFR 188.25-1 - Electrical engineering details.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... GENERAL PROVISIONS General Electrical Engineering Requirements § 188.25-1 Electrical engineering details. (a) The electrical engineering details shall be in accordance with subchapter J (Electrical... 46 Shipping 7 2014-10-01 2014-10-01 false Electrical engineering details. 188.25-1 Section...

  6. 33 CFR 263.19 - Detailed project reports.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... DEFENSE CONTINUING AUTHORITIES PROGRAMS General § 263.19 Detailed project reports. (a) The Detailed Project Report serves a dual purpose: the report serves both as basis for approval of a project for... 33 Navigation and Navigable Waters 3 2013-07-01 2013-07-01 false Detailed project reports....

  7. 33 CFR 263.19 - Detailed project reports.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... DEFENSE CONTINUING AUTHORITIES PROGRAMS General § 263.19 Detailed project reports. (a) The Detailed Project Report serves a dual purpose: the report serves both as basis for approval of a project for... 33 Navigation and Navigable Waters 3 2014-07-01 2014-07-01 false Detailed project reports....

  8. 33 CFR 263.19 - Detailed project reports.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... DEFENSE CONTINUING AUTHORITIES PROGRAMS General § 263.19 Detailed project reports. (a) The Detailed Project Report serves a dual purpose: the report serves both as basis for approval of a project for... 33 Navigation and Navigable Waters 3 2011-07-01 2011-07-01 false Detailed project reports....

  9. 33 CFR 263.19 - Detailed project reports.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... DEFENSE CONTINUING AUTHORITIES PROGRAMS General § 263.19 Detailed project reports. (a) The Detailed Project Report serves a dual purpose: the report serves both as basis for approval of a project for... 33 Navigation and Navigable Waters 3 2012-07-01 2012-07-01 false Detailed project reports....

  10. 33 CFR 263.19 - Detailed project reports.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... DEFENSE CONTINUING AUTHORITIES PROGRAMS General § 263.19 Detailed project reports. (a) The Detailed Project Report serves a dual purpose: the report serves both as basis for approval of a project for... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Detailed project reports....

  11. 46 CFR 24.20-1 - Marine engineering details.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 1 2012-10-01 2012-10-01 false Marine engineering details. 24.20-1 Section 24.20-1... Engineering Requirements § 24.20-1 Marine engineering details. All marine engineering details relative to the... in length will be found in subchapter F (Marine Engineering) of this chapter....

  12. 46 CFR 24.20-1 - Marine engineering details.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 1 2013-10-01 2013-10-01 false Marine engineering details. 24.20-1 Section 24.20-1... Engineering Requirements § 24.20-1 Marine engineering details. All marine engineering details relative to the... in length will be found in subchapter F (Marine Engineering) of this chapter....

  13. 46 CFR 90.20-1 - Marine engineering details.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 4 2014-10-01 2014-10-01 false Marine engineering details. 90.20-1 Section 90.20-1... PROVISIONS General Marine Engineering Requirements § 90.20-1 Marine engineering details. (a) All marine engineering details such as piping, valves, fittings, boilers, pressure vessels, etc., and their...

  14. 46 CFR 188.20-1 - Marine engineering details.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 7 2013-10-01 2013-10-01 false Marine engineering details. 188.20-1 Section 188.20-1... PROVISIONS General Marine Engineering Requirements § 188.20-1 Marine engineering details. (a) The marine engineering details shall be in accordance with Subchapter F (Marine Engineering) of this chapter....

  15. 46 CFR 188.20-1 - Marine engineering details.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 7 2014-10-01 2014-10-01 false Marine engineering details. 188.20-1 Section 188.20-1... PROVISIONS General Marine Engineering Requirements § 188.20-1 Marine engineering details. (a) The marine engineering details shall be in accordance with Subchapter F (Marine Engineering) of this chapter....

  16. 46 CFR 90.20-1 - Marine engineering details.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 4 2012-10-01 2012-10-01 false Marine engineering details. 90.20-1 Section 90.20-1... PROVISIONS General Marine Engineering Requirements § 90.20-1 Marine engineering details. (a) All marine engineering details such as piping, valves, fittings, boilers, pressure vessels, etc., and their...

  17. 46 CFR 188.20-1 - Marine engineering details.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 7 2012-10-01 2012-10-01 false Marine engineering details. 188.20-1 Section 188.20-1... PROVISIONS General Marine Engineering Requirements § 188.20-1 Marine engineering details. (a) The marine engineering details shall be in accordance with Subchapter F (Marine Engineering) of this chapter....

  18. 46 CFR 90.20-1 - Marine engineering details.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 4 2013-10-01 2013-10-01 false Marine engineering details. 90.20-1 Section 90.20-1... PROVISIONS General Marine Engineering Requirements § 90.20-1 Marine engineering details. (a) All marine engineering details such as piping, valves, fittings, boilers, pressure vessels, etc., and their...

  19. 46 CFR 24.20-1 - Marine engineering details.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 1 2014-10-01 2014-10-01 false Marine engineering details. 24.20-1 Section 24.20-1... Engineering Requirements § 24.20-1 Marine engineering details. All marine engineering details relative to the... in length will be found in subchapter F (Marine Engineering) of this chapter....

  20. Accurate calculation of diffraction-limited encircled and ensquared energy.

    PubMed

    Andersen, Torben B

    2015-09-01

    Mathematical properties of the encircled and ensquared energy functions for the diffraction-limited point-spread function (PSF) are presented. These include power series and a set of linear differential equations that facilitate the accurate calculation of these functions. Asymptotic expressions are derived that provide very accurate estimates of the relative amount of energy in the diffraction PSF that falls outside a large square or rectangular detector. Tables with accurate values of the encircled and ensquared energy functions are also presented. PMID:26368873
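
    The encircled energy discussed in this abstract has a classical closed form for a circular aperture, EE(v) = 1 - J0(v)^2 - J1(v)^2 (Rayleigh's result), where v is the normalized radius and J0, J1 are Bessel functions of the first kind. The sketch below illustrates that formula only; it is not the paper's own power-series/differential-equation method, and it uses a simple trapezoidal quadrature of the Bessel integral representation rather than a library special-function routine.

    ```python
    import math

    def bessel_j(n, x, steps=2000):
        """Bessel function of the first kind via its integral representation,
        J_n(x) = (1/pi) * integral_0^pi cos(n*t - x*sin(t)) dt,
        evaluated with the trapezoidal rule (stdlib only)."""
        h = math.pi / steps
        total = 0.5 * (math.cos(0.0) + math.cos(n * math.pi - x * math.sin(math.pi)))
        for k in range(1, steps):
            t = k * h
            total += math.cos(n * t - x * math.sin(t))
        return total * h / math.pi

    def encircled_energy(v):
        """Fraction of total energy of the diffraction-limited (Airy) PSF
        inside normalized radius v: EE(v) = 1 - J0(v)^2 - J1(v)^2."""
        return 1.0 - bessel_j(0, v) ** 2 - bessel_j(1, v) ** 2

    # The first Airy dark ring (v ~ 3.8317, the first zero of J1) encloses
    # roughly 83.8% of the total energy.
    print(f"{encircled_energy(3.8317):.3f}")
    ```

    The trapezoidal rule converges very fast here because the integrand is smooth and periodic; for production work a library Bessel routine (e.g. `scipy.special.jv`) would be the usual choice.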