Sommargren, Gary E.
1999-01-01
An interferometer which has the capability of measuring optical elements and systems with an accuracy of λ/1000, where λ is the wavelength of visible light. Whereas current interferometers employ a reference surface, which inherently limits the accuracy of the measurement to about λ/50, this interferometer uses an essentially perfect spherical reference wavefront generated by the fundamental process of diffraction. Whereas current interferometers illuminate the optic to be tested with an aberrated wavefront, which also limits the accuracy of the measurement, this interferometer uses an essentially perfect spherical measurement wavefront generated by the fundamental process of diffraction. This interferometer is adjustable to give unity fringe visibility, which maximizes the signal-to-noise ratio, and has the means to introduce a controlled, prescribed relative phase shift between the reference wavefront and the wavefront from the optics under test, which permits analysis of the interference fringe pattern using standard phase extraction algorithms.
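The abstract above refers to "standard phase extraction algorithms" without naming one. As context, the sketch below implements the common four-step algorithm (phase steps of 0, π/2, π, 3π/2); the synthetic fringe data and function name are illustrative and are not taken from the patent.

```python
import numpy as np

# Four-step phase-shifting: intensity frames I_k = A + B*cos(phi + k*pi/2)
# for k = 0..3; the wrapped phase follows from phi = arctan2(I3 - I1, I0 - I2).


def extract_phase(i0, i1, i2, i3):
    """Recover the wrapped phase (radians) from four phase-stepped frames."""
    return np.arctan2(i3 - i1, i0 - i2)


# Illustrative synthetic test: a tilted wavefront with 1.5 fringes across the field
x = np.linspace(0.0, 1.0, 256)
true_phase = 2.0 * np.pi * 1.5 * x
frames = [1.0 + 0.8 * np.cos(true_phase + k * np.pi / 2.0) for k in range(4)]

wrapped = extract_phase(*frames)
unwrapped = np.unwrap(wrapped)
print("max recovery error (rad):", np.max(np.abs(unwrapped - true_phase)))
```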
Sommargren, G.E.
1999-08-03
An interferometer is disclosed which has the capability of measuring optical elements and systems with an accuracy of λ/1000, where λ is the wavelength of visible light. Whereas current interferometers employ a reference surface, which inherently limits the accuracy of the measurement to about λ/50, this interferometer uses an essentially perfect spherical reference wavefront generated by the fundamental process of diffraction. Whereas current interferometers illuminate the optic to be tested with an aberrated wavefront, which also limits the accuracy of the measurement, this interferometer uses an essentially perfect spherical measurement wavefront generated by the fundamental process of diffraction. This interferometer is adjustable to give unity fringe visibility, which maximizes the signal-to-noise ratio, and has the means to introduce a controlled, prescribed relative phase shift between the reference wavefront and the wavefront from the optics under test, which permits analysis of the interference fringe pattern using standard phase extraction algorithms. 11 figs.
Wollaston prism phase-stepping point diffraction interferometer and method
Rushford, Michael C.
2004-10-12
A Wollaston prism phase-stepping point diffraction interferometer for testing a test optic. The Wollaston prism shears light into reference and signal beams, and provides phase stepping at increased accuracy by translating the Wollaston prism in a lateral direction with respect to the optical path. The reference beam produced by the Wollaston prism is directed through a pinhole of a diaphragm to produce a perfect spherical reference wave. The spherical reference wave is recombined with the signal beam to produce an interference fringe pattern of greater accuracy.
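As an illustration of the phase-stepping mechanism described above, the sketch below estimates how far a Wollaston prism would need to be translated laterally to produce a given phase step, assuming the textbook relation OPD(x) = 2(n_e − n_o)·tan(θ)·x; the material, wedge angle, and wavelength are assumed values, not parameters from the patent.

```python
import math

# Hypothetical illustration: lateral translation of a Wollaston prism changes the
# relative phase between its two sheared, orthogonally polarized beams.  Assuming
# the textbook relation OPD(x) = 2*(n_e - n_o)*tan(theta)*x, the translation for a
# desired phase step dphi is
#   dx = (dphi / (2*pi)) * wavelength / (2*(n_e - n_o)*tan(theta))
# The quartz indices, 1-degree wedge, and 633 nm wavelength below are assumptions.

wavelength = 633e-9          # m
n_e, n_o = 1.5517, 1.5427    # quartz, approximate values
theta = math.radians(1.0)    # wedge angle
birefringence = n_e - n_o


def translation_for_step(dphi):
    """Lateral translation (m) producing a relative phase step of dphi radians."""
    return (dphi / (2.0 * math.pi)) * wavelength / (2.0 * birefringence * math.tan(theta))


print("translation for a pi/2 step: %.1f um" % (translation_for_step(math.pi / 2) * 1e6))
```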
Time Reference of Verbs in Biblical Hebrew Poetry
ERIC Educational Resources Information Center
Zwyghuizen, Jill E.
2012-01-01
This dissertation suggests that the time reference of verbs in Hebrew poetry can be determined from a combination of form (aspect) and "Aktionsart" (stative vs. fientive). Specifically, perfective forms of stative verbs have past or present time reference. Perfective forms of fientive verbs have past time reference. Imperfective forms of…
Observations on the predictive value of short-term stake tests
Stan Lebow; Bessie Woodward; Patricia Lebow
2008-01-01
This paper compares average ratings of test stakes after 3, 4, 5, and 7 years exposure to their subsequent ratings after 11 years. Average ratings from over 200 treatment groups exposed in plots in southern Mississippi were compared to average ratings of a reference preservative. The analysis revealed that even perfect ratings after three years were not a reliable...
Phase shifting diffraction interferometer
Sommargren, Gary E.
1996-01-01
An interferometer which has the capability of measuring optical elements and systems with an accuracy of λ/1000, where λ is the wavelength of visible light. Whereas current interferometers employ a reference surface, which inherently limits the accuracy of the measurement to about λ/50, this interferometer uses an essentially perfect spherical reference wavefront generated by the fundamental process of diffraction. This interferometer is adjustable to give unity fringe visibility, which maximizes the signal-to-noise ratio, and has the means to introduce a controlled, prescribed relative phase shift between the reference wavefront and the wavefront from the optics under test, which permits analysis of the interference fringe pattern using standard phase extraction algorithms.
Phase shifting diffraction interferometer
Sommargren, G.E.
1996-08-29
An interferometer which has the capability of measuring optical elements and systems with an accuracy of λ/1000, where λ is the wavelength of visible light. Whereas current interferometers employ a reference surface, which inherently limits the accuracy of the measurement to about λ/50, this interferometer uses an essentially perfect spherical reference wavefront generated by the fundamental process of diffraction. This interferometer is adjustable to give unity fringe visibility, which maximizes the signal-to-noise ratio, and has the means to introduce a controlled, prescribed relative phase shift between the reference wavefront and the wavefront from the optics under test, which permits analysis of the interference fringe pattern using standard phase extraction algorithms. 8 figs.
Analysis of Four Automated Urinalysis Systems Compared to Reference Methods.
Bartosova, Kamila; Kubicek, Zdenek; Franekova, Janka; Louzensky, Gustav; Lavrikova, Petra; Jabor, Antonin
2016-11-01
The aim of this study was to compare four automated urinalysis systems: the Iris iQ200 Sprint (Iris Diagnostics, U.S.A.) combined with the Arkray AUTION MAX AX 4030 (Iris + AUTION), the Arkray AU 4050 (Arkray Global Business, Inc., Japan), the Dirui FUS 2000 (Dirui Industrial Co., P.R.C.), and the Menarini sediMAX (Menarini, Italy). Urine concentrations of protein and glucose (Iris, Dirui) were compared using reference quantitative analysis on an Abbott Architect c16000. Leukocytes, erythrocytes, epithelia, and casts (Iris, Arkray, Dirui, Menarini) were compared to urine sediment under reference light microscopy, Leica DM2000 (Leica Microsystems GmbH, Germany) with calibrated FastRead plates (Biosigma S.r.l., Italy), using both native and stained preparations. Total protein and glucose levels were measured using the Iris + AUTION system with borderline trueness, while the Dirui analysis showed worse performance for the protein and glucose measurements. True classifications of leukocytes and erythrocytes were above 85% and 72%, respectively. Kappa statistics revealed a nearly perfect evaluation of leukocytes for all tested systems; the erythrocyte evaluation was nearly perfect for the Iris, Dirui, and Arkray analyzers and substantial for the Menarini analyzer. The epithelia identification was associated with high false negativity (above 15%) in the Iris, Arkray, and Menarini analyses. False-negative casts were above 70% for all tested systems. The use of automated urinalysis demonstrated some weaknesses and should be checked by experienced laboratory staff using light microscopy.
NASA Astrophysics Data System (ADS)
Çayören, M.; Akduman, I.; Yapar, A.; Crocco, L.
2010-03-01
The reference list should have included the conference communications [1] and [2], wherein we introduced the algorithm described in this paper. Note that a less complete description of the algorithm was given in [1]. However, the example considering a bean-shaped target is the same in the two papers and it is reused in this paper by kind permission of the Applied Computational Electromagnetics Society. References [1] Crocco L, Akduman I, Çayören M and Yapar A 2007 A new method for shape reconstruction of perfectly conducting targets The 23rd Annual Review of Progress in Applied Computational Electromagnetics (Verona, Italy) [2] Çayören M, Akduman I, Yapar A and Crocco L 2007 A new algorithm for the shape reconstruction of perfectly conducting objects Progress in Electromagnetics Research Symposium (PIERS) (Beijing, PRC)
Al Sidairi, Hilal; Binkhamis, Khalifa; Jackson, Colleen; Roberts, Catherine; Heinstein, Charles; MacDonald, Jimmy; Needle, Robert; Hatchette, Todd F; LeBlanc, Jason J
2017-11-01
Serology remains the mainstay for diagnosis of Epstein-Barr virus (EBV) infection. This study compared two automated platforms (BioPlex 2200 and Architect i2000SR) to test three EBV serological markers: viral capsid antigen (VCA) immunoglobulins of class M (IgM), VCA immunoglobulins of class G (IgG) and EBV nuclear antigen-1 (EBNA-1) IgG. Using sera from 65 patients at various stages of EBV disease, BioPlex demonstrated near-perfect agreement for all EBV markers compared to a consensus reference. The agreement for Architect was near-perfect for VCA IgG and EBNA-1 IgG, and substantial for VCA IgM despite five equivocal results. Since the majority of testing in our hospital was from adults with EBNA-1 IgG positive results, post-implementation analysis of an EBNA-based algorithm showed advantages over parallel testing of the three serologic markers. This small verification demonstrated that both automated systems for EBV serology had good performance for all EBV markers, and an EBNA-based testing algorithm is ideal for an adult hospital.
Design and tolerance analysis of a transmission sphere by interferometer model
NASA Astrophysics Data System (ADS)
Peng, Wei-Jei; Ho, Cheng-Fong; Lin, Wen-Lung; Yu, Zong-Ru; Huang, Chien-Yao; Hsu, Wei-Yao
2015-09-01
The design of a 6-in, f/2.2 transmission sphere for Fizeau interferometry is presented in this paper. To predict the actual performance during the design phase, we build an interferometer model combined with tolerance analysis in Zemax. Evaluating focus imaging alone is not sufficient for a double-pass optical system, so the interferometer model includes the system error and the wavefronts reflected from the reference surface and the tested surface. First, we generate a deformation map of the tested surface. Using multiple configurations in Zemax, we obtain the test wavefront and the reference wavefront reflected from the tested surface and from the reference surface of the transmission sphere, respectively. Following the theory of interferometry, we subtract the two wavefronts to recover the phase of the tested surface. Zernike polynomials are applied to convert the map from phase to sag and to remove piston, tilt, and power. The restored map is identical to the original map because no system error exists. Second, perturbed tolerances, including lens fabrication and assembly, are considered. A system error arises because the test and reference beams are no longer perfectly common-path, and the restored map becomes inaccurate once this system error is added. Although the system error can be subtracted by calibration, it should still be controlled within a small range to avoid calibration error. Generally, the reference wavefront error, including the system error and the irregularity of the reference surface of a 6-in transmission sphere, must be within 0.1 λ peak-to-valley (PV) (λ = 0.6328 µm), which is not easy to achieve. Consequently, it is necessary to predict the value of the system error before manufacture. Finally, a prototype is developed and tested against a reference surface with PV 0.1 λ irregularity.
ERIC Educational Resources Information Center
Boardman, Randolph M.
2010-01-01
In a perfect world, students would never talk back to school staff and never argue or fight with each other. They would complete all their assigned tasks, and disciplinary actions never would be needed. Unfortunately, people don't live in a perfect world. Student behavior is a daily concern. Teachers continue to refer students to the office as a…
2003-11-04
VANDENBERG AFB, CALIF. - In the NASA spacecraft processing facility on North Vandenberg Air Force Base, a balloon gently lifts the solar array panel to be installed on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
2003-11-04
VANDENBERG AFB, CALIF. - In the NASA spacecraft processing facility on North Vandenberg Air Force Base, the Gravity Probe B spacecraft is seen with all four solar array panels installed. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
2003-11-03
VANDENBERG AFB, CALIF. - A worker in the NASA spacecraft processing facility on North Vandenberg Air Force Base adjusts the supports on a solar array panel to be lifted and installed on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
2003-11-04
VANDENBERG AFB, CALIF. - In the NASA spacecraft processing facility on North Vandenberg Air Force Base, the Gravity Probe B spacecraft is seen with two solar array panels installed. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
2003-11-04
VANDENBERG AFB, CALIF. - In the NASA spacecraft processing facility on North Vandenberg Air Force Base, a worker checks the installation of a solar array panel onto the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
2003-11-03
VANDENBERG AFB, CALIF. - Workers in the NASA spacecraft processing facility on North Vandenberg Air Force Base prepare for the installation of solar array panel 3 on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
2003-11-03
VANDENBERG AFB, CALIF. - Workers in the NASA spacecraft processing facility on North Vandenberg Air Force Base prepare to rotate the framework containing one of four solar panels to be installed on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
2003-11-03
VANDENBERG AFB, CALIF. - Workers in the NASA spacecraft processing facility on North Vandenberg Air Force Base work on a solar array panel to be installed on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
2003-11-04
VANDENBERG AFB, CALIF. - In the NASA spacecraft processing facility on North Vandenberg Air Force Base, workers prepare to attach the top of a solar array panel onto the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
2003-11-03
VANDENBERG AFB, CALIF. - Workers in the NASA spacecraft processing facility on North Vandenberg Air Force Base attach a solar array panel on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
2003-11-03
VANDENBERG AFB, CALIF. - Workers in the NASA spacecraft processing facility on North Vandenberg Air Force Base attach supports to a solar array panel to be lifted and installed on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
Computation of Thermally Perfect Compressible Flow Properties
NASA Technical Reports Server (NTRS)
Witte, David W.; Tatum, Kenneth E.; Williams, S. Blake
1996-01-01
A set of compressible flow relations for a thermally perfect, calorically imperfect gas is derived for a value of c_p (specific heat at constant pressure) expressed as a polynomial function of temperature and developed into a computer program, referred to as the Thermally Perfect Gas (TPG) code. The code is available free from the NASA Langley Software Server at URL http://www.larc.nasa.gov/LSS. The code produces tables of compressible flow properties similar to those found in NACA Report 1135. Unlike the NACA Report 1135 tables, which are valid only in the calorically perfect temperature regime, the TPG code results are also valid in the thermally perfect, calorically imperfect temperature regime, giving the TPG code a considerably larger range of temperature application. The accuracy of the TPG code in the calorically perfect and in the thermally perfect, calorically imperfect temperature regimes is verified by comparisons with the methods of NACA Report 1135. The advantages of the TPG code compared to the thermally perfect, calorically imperfect method of NACA Report 1135 are its applicability to any type of gas (monatomic, diatomic, triatomic, or polyatomic) or any specified mixture of gases, its ease of use, and its tabulated results.
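To make the thermally perfect, calorically imperfect idea concrete, the sketch below evaluates a polynomial c_p(T) and the resulting temperature-dependent γ and speed of sound; the polynomial coefficients are placeholders, not the curve fits used by the TPG code.

```python
import math

# Sketch of the thermally perfect (calorically imperfect) gas model behind the TPG
# code: cp is a polynomial in temperature, so gamma and the speed of sound vary
# with T.  The coefficients below are placeholders, not the TPG data.

R_AIR = 287.05  # J/(kg*K), specific gas constant for air

# Placeholder cp(T) polynomial: cp = a0 + a1*T + a2*T^2  (J/(kg*K))
CP_COEFFS = (1004.0, -0.05, 2.0e-4)


def cp(T):
    """Specific heat at constant pressure as a polynomial in temperature."""
    return sum(a * T**i for i, a in enumerate(CP_COEFFS))


def gamma(T):
    """Ratio of specific heats for a thermally perfect gas: cp/(cp - R)."""
    return cp(T) / (cp(T) - R_AIR)


def speed_of_sound(T):
    """a = sqrt(gamma(T) * R * T); gamma is temperature dependent here."""
    return math.sqrt(gamma(T) * R_AIR * T)


for T in (300.0, 1000.0, 2000.0):
    print(f"T = {T:6.0f} K   gamma = {gamma(T):.4f}   a = {speed_of_sound(T):7.1f} m/s")
```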
Exact Solutions in Three-Dimensional Gravity
NASA Astrophysics Data System (ADS)
García-Díaz, Alberto A.
2017-09-01
Preface; 1. Introduction; 2. Point particles; 3. Dust solutions; 4. AdS cyclic symmetric stationary solutions; 5. Perfect fluid static stars; 6. Static perfect fluid stars with Λ; 7. Hydrodynamic equilibrium; 8. Stationary perfect fluid with Λ; 9. Friedmann–Robertson–Walker cosmologies; 10. Dilaton-inflaton FRW cosmologies; 11. Einstein–Maxwell solutions; 12. Nonlinear electrodynamics black hole; 13. Dilaton minimally coupled to gravity; 14. Dilaton non-minimally coupled to gravity; 15. Low energy 2+1 string gravity; 16. Topologically massive gravity; 17. Bianchi type spacetimes in TMG; 18. Petrov type N wave metrics; 19. Kundt spacetimes in TMG; 20. Cotton tensor in Riemannian spacetimes; References; Index.
Performance of the Xpert HIV-1 Viral Load Assay: a Systematic Review and Meta-analysis
Nash, Madlen; Huddart, Sophie; Badar, Sayema; Baliga, Shrikala; Saravu, Kavitha
2018-01-01
ABSTRACT Viral load (VL) is the preferred treatment-monitoring approach for HIV-positive patients. However, more rapid, near-patient, and low-complexity assays are needed to scale up VL testing. The Xpert HIV-1 VL assay (Cepheid, Sunnyvale, CA) is a new, automated molecular test, and it can leverage the GeneXpert systems that are being used widely for tuberculosis diagnosis. We systematically reviewed the evidence on the performance of this new tool in comparison to established reference standards. A total of 12 articles (13 studies) in which HIV patient VLs were compared between Xpert HIV VL assay and a reference standard VL assay were identified. Study quality was generally high, but substantial variability was observed in the number and type of agreement measures reported. Correlation coefficients between Xpert and reference assays were high, with a pooled Pearson correlation (n = 8) of 0.94 (95% confidence interval [CI], 0.89, 0.97) and Spearman correlation (n = 3) of 0.96 (95% CI, 0.86, 0.99). Bland-Altman metrics (n = 11) all were within 0.35 log copies/ml of perfect agreement. Overall, Xpert HIV-1 VL performed well compared to current reference tests. The minimal training and infrastructure requirements for the Xpert HIV-1 VL assay make it attractive for use in resource-constrained settings, where point-of-care VL testing is most needed. PMID:29386266
Performance of the Xpert HIV-1 Viral Load Assay: a Systematic Review and Meta-analysis.
Nash, Madlen; Huddart, Sophie; Badar, Sayema; Baliga, Shrikala; Saravu, Kavitha; Pai, Madhukar
2018-04-01
Viral load (VL) is the preferred treatment-monitoring approach for HIV-positive patients. However, more rapid, near-patient, and low-complexity assays are needed to scale up VL testing. The Xpert HIV-1 VL assay (Cepheid, Sunnyvale, CA) is a new, automated molecular test, and it can leverage the GeneXpert systems that are being used widely for tuberculosis diagnosis. We systematically reviewed the evidence on the performance of this new tool in comparison to established reference standards. A total of 12 articles (13 studies) in which HIV patient VLs were compared between Xpert HIV VL assay and a reference standard VL assay were identified. Study quality was generally high, but substantial variability was observed in the number and type of agreement measures reported. Correlation coefficients between Xpert and reference assays were high, with a pooled Pearson correlation (n = 8) of 0.94 (95% confidence interval [CI], 0.89, 0.97) and Spearman correlation (n = 3) of 0.96 (95% CI, 0.86, 0.99). Bland-Altman metrics (n = 11) all were within 0.35 log copies/ml of perfect agreement. Overall, Xpert HIV-1 VL performed well compared to current reference tests. The minimal training and infrastructure requirements for the Xpert HIV-1 VL assay make it attractive for use in resource-constrained settings, where point-of-care VL testing is most needed. Copyright © 2018 Nash et al.
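A minimal sketch of the Bland-Altman agreement metric summarized above (bias and 95% limits of agreement between paired log10 viral loads); the paired values are invented for illustration.

```python
import numpy as np

# Bland-Altman summary for paired log10 viral loads from an index assay and a
# reference assay.  The arrays below are made-up numbers, not study data.

xpert_log_vl = np.array([2.1, 3.4, 4.8, 5.2, 3.9, 4.1, 2.7])      # hypothetical
reference_log_vl = np.array([2.0, 3.5, 4.9, 5.0, 4.0, 4.3, 2.6])  # hypothetical


def bland_altman(a, b):
    """Return (bias, lower LoA, upper LoA) for paired measurements a and b."""
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd


bias, lo, hi = bland_altman(xpert_log_vl, reference_log_vl)
print(f"bias = {bias:+.3f} log copies/ml, 95% LoA = [{lo:+.3f}, {hi:+.3f}]")
```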
Role of color memory in successive color constancy.
Ling, Yazhu; Hurlbert, Anya
2008-06-01
We investigate color constancy for real 2D paper samples using a successive matching paradigm in which the observer memorizes a reference surface color under neutral illumination and after a temporal interval selects a matching test surface under the same or different illumination. We find significant effects of the illumination, reference surface, and their interaction on the matching error. We characterize the matching error in the absence of illumination change as the "pure color memory shift" and introduce a new index for successive color constancy that compares this shift against the matching error under changing illumination. The index also incorporates the vector direction of the matching errors in chromaticity space, unlike the traditional constancy index. With this index, we find that color constancy is nearly perfect.
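For context, the sketch below computes the traditional constancy index that the abstract contrasts with the authors' new index; the new index, which additionally subtracts the pure color memory shift and uses the direction of the matching errors, is not reproduced here, and the chromaticity coordinates are invented.

```python
import numpy as np

# Traditional (Brunswik-style) constancy index, shown only as background; this is
# not the authors' new index.  Chromaticity coordinates are invented.

ref_under_neutral = np.array([0.31, 0.33])   # reference surface, neutral illuminant
ref_under_test = np.array([0.36, 0.37])      # same surface, test illuminant
observer_match = np.array([0.35, 0.36])      # observer's selected match


def constancy_index(ref_neutral, ref_test, match):
    """1 means perfect constancy (match lands on ref_test), 0 means no constancy."""
    a = np.linalg.norm(ref_test - ref_neutral)   # physical chromaticity shift
    b = np.linalg.norm(ref_test - match)         # residual error of the match
    return 1.0 - b / a


print(f"traditional constancy index: "
      f"{constancy_index(ref_under_neutral, ref_under_test, observer_match):.2f}")
```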
ERIC Educational Resources Information Center
Thompson, Michael; Tsui, Stella; Leung, Chi Fan
2011-01-01
A sweet spot is referred to in sport as the perfect place to strike a ball with a racquet or bat. It is the point of contact between bat and ball where maximum results can be produced with minimal effort from the hand of the player. Similar physics can be applied to the less inspiring examples of door stops; the perfect position of a door stop is…
Validity of flowmeter data in heterogeneous alluvial aquifers
NASA Astrophysics Data System (ADS)
Bianchi, Marco
2017-04-01
Numerical simulations are performed to evaluate the impact of medium-scale sedimentary architecture and small-scale heterogeneity on the validity of the borehole flowmeter test, a widely used method for measuring hydraulic conductivity (K) at the scale required for detailed groundwater flow and solute transport simulations. Reference data from synthetic K fields representing the range of structures and small-scale heterogeneity typically observed in alluvial systems are compared with estimated values from numerical simulations of flowmeter tests. Systematic errors inherent in the flowmeter K estimates are significant when the reference K field structure deviates from the hypothetical perfectly stratified conceptual model at the basis of the interpretation method of flowmeter tests. Because of these errors, the true variability of the K field is underestimated and the distributions of the reference K data and log-transformed spatial increments are also misconstrued. The presented numerical analysis shows that the validity of flowmeter-based K data depends on measurable parameters defining the architecture of the hydrofacies, the conductivity contrasts between the hydrofacies, and the sub-facies-scale K variability. A preliminary geological characterization is therefore essential for evaluating the optimal approach for accurate K field characterization.
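The "perfectly stratified conceptual model at the basis of the interpretation method" can be illustrated with the standard profile formula K_i = K_mean·(ΔQ_i/Δz_i)/(Q/B); the sketch below applies it to invented layer data and is not the simulator used in the study.

```python
import numpy as np

# Standard flowmeter interpretation under a perfectly stratified assumption:
# each layer's K is proportional to the net borehole flow it contributes,
#   K_i = K_mean * (dQ_i / dz_i) / (Q_total / B)
# The numbers below are invented for illustration.

K_mean = 1.0e-4          # m/s, depth-averaged K from a standard pumping test
Q_total = 2.0e-3         # m3/s, total pumping rate
dz = np.array([1.0, 1.0, 1.0, 1.0])              # layer thicknesses (m)
dQ = np.array([2.0e-4, 8.0e-4, 7.0e-4, 3.0e-4])  # net inflow per layer (m3/s)

B = dz.sum()
K_layers = K_mean * (dQ / dz) / (Q_total / B)
print("layer K estimates (m/s):", K_layers)
print("check: thickness-weighted mean =", np.sum(K_layers * dz) / B)
```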
An implicit adaptation algorithm for a linear model reference control system
NASA Technical Reports Server (NTRS)
Mabius, L.; Kaufman, H.
1975-01-01
This paper presents a stable implicit adaptation algorithm for model reference control. The constraints for stability are found using Lyapunov's second method and do not depend on perfect model following between the system and the reference model. Methods are proposed for satisfying these constraints without estimating the parameters on which the constraints depend.
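The paper's implicit algorithm is not reproduced here; as background on model reference adaptive control, the sketch below runs a minimal explicit Lyapunov-rule MRAC for a scalar plant, with all gains, the plant, and the square-wave reference chosen for illustration only.

```python
# Minimal explicit model-reference adaptive control (Lyapunov rule) for a scalar
# plant, shown only as background for the abstract above; the paper's algorithm is
# an *implicit* scheme and is not reproduced here.  All values are illustrative.

dt, T = 0.001, 20.0
a, b = -1.0, 3.0           # plant (parameters assumed unknown): dx/dt = a*x + b*u
am, bm = -4.0, 4.0         # reference model: dxm/dt = am*xm + bm*r
gamma = 2.0                # adaptation gain

x = xm = 0.0
th1 = th2 = 0.0            # adaptive feedforward / feedback gains
steps = int(T / dt)
for k in range(steps):
    r = 1.0 if (k * dt) % 10.0 < 5.0 else -1.0   # square-wave reference
    u = th1 * r + th2 * x
    e = x - xm
    # Lyapunov-based adaptation laws (sign of b assumed known and positive)
    th1 += dt * (-gamma * e * r)
    th2 += dt * (-gamma * e * x)
    # Euler integration of plant and reference model
    x += dt * (a * x + b * u)
    xm += dt * (am * xm + bm * r)

print(f"final tracking error {x - xm:+.4f}; ideal gains th1*={bm/b:.2f}, th2*={(am - a)/b:.2f}")
```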
Flinkman, Mika; Nakauchi, Shigeki
2017-10-01
In this research, three illuminants that improve the color discrimination ability of people with red-green color vision deficiency were developed. The illuminants are close to daylight in color and were produced by spectral optimization. Deutans were the focus of this research, but a few protans were also tested for reference. The illuminants were produced by combining different types of LEDs, and their effects were tested with several test subjects with and without color vision deficiency using the Ishihara color vision test and the Farnsworth Panel D-15 test. The illuminant with the most powerful effect provided near perfect results with the Ishihara test for deutans, while the other two illuminants produced smaller improvements. The Farnsworth Panel D-15 test produced results similar to those of the Ishihara test, though generally the color discrimination of blue hues was weaker under the most powerful illuminant.
Classical and quantum communication without a shared reference frame.
Bartlett, Stephen D; Rudolph, Terry; Spekkens, Robert W
2003-07-11
We show that communication without a shared reference frame is possible using entangled states. Both classical and quantum information can be communicated with perfect fidelity without a shared reference frame at a rate that asymptotically approaches one classical bit or one encoded qubit per transmitted qubit. We present an optical scheme to communicate classical bits without a shared reference frame using entangled photon pairs and linear optical Bell state measurements.
Palacios-Flores, Kim; García-Sotelo, Jair; Castillo, Alejandra; Uribe, Carina; Aguilar, Luis; Morales, Lucía; Gómez-Romero, Laura; Reyes, José; Garciarubio, Alejandro; Boege, Margareta; Dávila, Guillermo
2018-01-01
We present a conceptually simple, sensitive, precise, and essentially nonstatistical solution for the analysis of genome variation in haploid organisms. The generation of a Perfect Match Genomic Landscape (PMGL), which computes intergenome identity with single nucleotide resolution, reveals signatures of variation wherever a query genome differs from a reference genome. Such signatures encode the precise location of different types of variants, including single nucleotide variants, deletions, insertions, and amplifications, effectively introducing the concept of a general signature of variation. The precise nature of variants is then resolved through the generation of targeted alignments between specific sets of sequence reads and known regions of the reference genome. Thus, the perfect match logic decouples the identification of the location of variants from the characterization of their nature, providing a unified framework for the detection of genome variation. We assessed the performance of the PMGL strategy via simulation experiments. We determined the variation profiles of natural genomes and of a synthetic chromosome, both in the context of haploid yeast strains. Our approach uncovered variants that have previously escaped detection. Moreover, our strategy is ideally suited for further refining high-quality reference genomes. The source codes for the automated PMGL pipeline have been deposited in a public repository. PMID:29367403
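A toy illustration of the perfect-match logic described above: slide fixed-length reads from a query genome along a reference and count, per position, how many reads cover it with an exact match, so that a variant appears as a local dip. This is a didactic sketch, not the published PMGL pipeline.

```python
# Toy perfect-match landscape: per-reference-position count of reads that cover
# the position with an exact match.  A variant in the query produces a local dip.
# Didactic sketch only; not the published PMGL code.

def perfect_match_landscape(reference, reads):
    coverage = [0] * len(reference)
    read_len = len(reads[0])
    # Record every perfect placement of every read on the reference.
    for read in reads:
        for start in range(len(reference) - read_len + 1):
            if reference[start:start + read_len] == read:
                for pos in range(start, start + read_len):
                    coverage[pos] += 1
    return coverage


reference = "ACGTACGTTAGCCATG"
# Reads simulated from a query genome carrying a single substitution (T->G at index 8).
query = "ACGTACGTGAGCCATG"
reads = [query[i:i + 6] for i in range(len(query) - 5)]

landscape = perfect_match_landscape(reference, reads)
print(reference)
print(landscape)   # coverage dips to zero around the substituted base (index 8)
```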
2003-11-04
VANDENBERG AFB, CALIF. - In the NASA spacecraft processing facility on North Vandenberg Air Force Base, workers stand by as the balloon at right is released to lift the solar array panel into position for installation on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
Biedermann, N; Hayes, B; Usher, K; Williams, A
2000-01-01
In research, there is no perfection: no perfect method, no perfect sample, and no perfect data analysis tool. Coming to this understanding helps researchers identify the inadequacies of their preferred method. This paper discusses the criticisms of the oral history method, with reference to its challenges and difficulties in relation to its use in nursing research. Oral history has the advantage over more traditional historical approaches in that the narrators can interpret events, personalities and relationships within the interview that are not accessible from written sources. The oral history interview may also provide a forum for unveiling documents and photographs, which might not have been otherwise discovered. Nonetheless, oral history, like most methodologies, is not flawless. This paper discusses the limitations of oral history and suggests ways in which a nurse can use oral history to provide an account of aspects of nursing history.
Experimental and Computational Aerothermodynamics of a Mars Entry Vehicle
NASA Technical Reports Server (NTRS)
Hollis, Brian R.
1996-01-01
An aerothermodynamic database has been generated through both experimental testing and computational fluid dynamics simulations for a 70 deg sphere-cone configuration based on the NASA Mars Pathfinder entry vehicle. The aerothermodynamics of several related parametric configurations were also investigated. Experimental heat-transfer data were obtained at hypersonic test conditions in both a perfect gas air wind tunnel and in a hypervelocity, high-enthalpy expansion tube in which both air and carbon dioxide were employed as test gases. In these facilities, measurements were made with thin-film temperature-resistance gages on both the entry vehicle models and on the support stings of the models. Computational results for freestream conditions equivalent to those of the test facilities were generated using an axisymmetric/2D laminar Navier-Stokes solver with both perfect-gas and nonequilibrium thermochemical models. Forebody computational and experimental heating distributions agreed to within the experimental uncertainty for both the perfect-gas and high-enthalpy test conditions. In the wake, quantitative differences between experimental and computational heating distributions for the perfect-gas conditions indicated transition of the free shear layer near the reattachment point on the sting. For the high enthalpy cases, agreement to within, or slightly greater than, the experimental uncertainty was achieved in the wake except within the recirculation region, where further grid resolution appeared to be required. Comparisons between the perfect-gas and high-enthalpy results indicated that the wake remained laminar at the high-enthalpy test conditions, for which the Reynolds number was significantly lower than that of the perfect-gas conditions.
When Practice Doesn't Make Perfect: Effects of Task Goals on Learning Computing Concepts
ERIC Educational Resources Information Center
Miller, Craig S.; Settle, Amber
2011-01-01
Specifying file references for hypertext links is an elementary competence that nevertheless draws upon core computational thinking concepts such as tree traversal and the distinction between relative and absolute references. In this article we explore the learning effects of different instructional strategies in the context of an introductory…
Palacios-Flores, Kim; García-Sotelo, Jair; Castillo, Alejandra; Uribe, Carina; Aguilar, Luis; Morales, Lucía; Gómez-Romero, Laura; Reyes, José; Garciarubio, Alejandro; Boege, Margareta; Dávila, Guillermo
2018-04-01
We present a conceptually simple, sensitive, precise, and essentially nonstatistical solution for the analysis of genome variation in haploid organisms. The generation of a Perfect Match Genomic Landscape (PMGL), which computes intergenome identity with single nucleotide resolution, reveals signatures of variation wherever a query genome differs from a reference genome. Such signatures encode the precise location of different types of variants, including single nucleotide variants, deletions, insertions, and amplifications, effectively introducing the concept of a general signature of variation. The precise nature of variants is then resolved through the generation of targeted alignments between specific sets of sequence reads and known regions of the reference genome. Thus, the perfect match logic decouples the identification of the location of variants from the characterization of their nature, providing a unified framework for the detection of genome variation. We assessed the performance of the PMGL strategy via simulation experiments. We determined the variation profiles of natural genomes and of a synthetic chromosome, both in the context of haploid yeast strains. Our approach uncovered variants that have previously escaped detection. Moreover, our strategy is ideally suited for further refining high-quality reference genomes. The source codes for the automated PMGL pipeline have been deposited in a public repository. Copyright © 2018 by the Genetics Society of America.
NASA Astrophysics Data System (ADS)
Ren, Zhi Ying; Gao, ChengHui; Han, GuoQiang; Ding, Shen; Lin, JianXing
2014-04-01
The dual-tree complex wavelet transform (DT-CWT) exhibits the advantages of shift invariance, directional selectivity, perfect reconstruction (PR), and limited redundancy, and it can effectively separate various surface components. At the nanoscale, however, the morphology contains pits and convexities and is more complex to characterize. This paper presents an improved approach that can simultaneously separate the reference and waviness components while remaining robust against abnormal signals in an image. We include a bilateral filtering (BF) stage in the DT-CWT to address these imaging problems. To verify the feasibility of the new method and test its performance, we used a computer simulation based on three generations of wavelets and the improved DT-CWT, and we conducted two case studies. Our results show that the improved DT-CWT not only enhances the robustness of filtering under abnormal interference, but also provides accurate and reliable extraction of the reference and waviness from 3D nanoscale surfaces.
2003-11-10
VANDENBERG AFB, CALIF. - In the NASA spacecraft processing facility on North Vandenberg Air Force Base, Dr. Francis Everitt, principal investigator, and Brad Parkinson, co-principal investigator, both from Stanford University, hold one of the small gyroscopes used in the Gravity Probe B spacecraft. The GP-B towers behind them. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
Rychlik, Michał; Samborski, Włodzimierz
2015-01-01
The aim of this study was to assess the validity and test-retest reliability of the Thermovision Technique of Dry Needling (TTDN) for the gluteus minimus muscle. TTDN is a new thermography approach used to support trigger point (TrP) diagnostic criteria by the presence of short-term vasomotor reactions occurring in the area where TrPs refer pain. Method. Thirty chronic sciatica patients (n=15 TrP-positive and n=15 TrP-negative) and 15 healthy volunteers were evaluated by TTDN three times during two consecutive days based on TrPs of the gluteus minimus muscle, confirmed additionally by the presence of referred pain. TTDN employs average temperature (Tavr), maximum temperature (Tmax), low/high isothermal area, and the autonomic referred pain phenomenon (AURP), which reflects vasodilatation/vasoconstriction. Validity and test-retest reliability were assessed concurrently. Results. Two components of TTDN validity and reliability, Tavr and AURP, had almost perfect agreement according to κ (e.g., thigh: 0.880 and 0.938; calf: 0.902 and 0.956, resp.). The sensitivity for Tavr, Tmax, AURP, and high isothermal area was 100% for each, but a specificity of 100% was achieved for Tavr and AURP only. Conclusion. TTDN is a valid and reliable method for Tavr and AURP measurement to support TrP diagnostic criteria for the gluteus minimus muscle when a digitally evoked referred pain pattern is present. PMID:26137486
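For reference, the κ statistic cited above can be computed as in the short sketch below; the two rating vectors are invented binary outcomes and do not come from the study.

```python
import numpy as np

# Cohen's kappa, the agreement statistic cited in the abstract (values near 0.9
# are "almost perfect" on the usual Landis-Koch scale).  The two rating vectors
# below are invented binary outcomes (e.g., AURP present/absent on two days).

def cohens_kappa(r1, r2):
    r1, r2 = np.asarray(r1), np.asarray(r2)
    categories = np.union1d(r1, r2)
    p_observed = np.mean(r1 == r2)
    p_expected = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in categories)
    return (p_observed - p_expected) / (1.0 - p_expected)


rater_day1 = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0]
rater_day2 = [1, 1, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0]
print(f"kappa = {cohens_kappa(rater_day1, rater_day2):.3f}")
```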
ERIC Educational Resources Information Center
Merwin, Rhonda M.; Wilson, Kelly G.
2005-01-01
Thirty-two subjects completed 2 stimulus equivalence tasks using a matching-to-sample paradigm. One task involved direct reinforcement of conditional discriminations designed to produce derived relations between self-referring stimuli (e.g., me, myself, I) and positive evaluation words (e.g., whole, desirable, perfect). The other task was designed…
Hit by a Perfect Storm? Art & Design in the National Student Survey
ERIC Educational Resources Information Center
Yorke, Mantz; Orr, Susan; Blair, Bernadette
2014-01-01
There has long been the suspicion amongst staff in Art & Design that the ratings given to their subject disciplines in the UK's National Student Survey are adversely affected by a combination of circumstances--a "perfect storm". The "perfect storm" proposition is tested by comparing ratings for Art & Design with those…
Too Many References, Just Cut a Few and It Will Be Perfect: APA vs. Chicago
ERIC Educational Resources Information Center
Matusov, Eugene
2011-01-01
The purpose of the author's rebuttal is to provide a sociocultural critique of Wolff-Michael Roth and Michael Cole's arguments and of the current "Mind, Culture, and Activity" ("MCA") policy regarding referencing. In sum, Roth and Cole write that some unnamed "MCA" authors (and some scholars in general) abuse the reference practice by…
Jafarzadeh, S Reza; Johnson, Wesley O; Gardner, Ian A
2016-03-15
The area under the receiver operating characteristic (ROC) curve (AUC) is used as a performance metric for quantitative tests. Although multiple biomarkers may be available for diagnostic or screening purposes, diagnostic accuracy is often assessed individually rather than in combination. In this paper, we consider the interesting problem of combining multiple biomarkers for use in a single diagnostic criterion with the goal of improving the diagnostic accuracy above that of an individual biomarker. The diagnostic criterion created from multiple biomarkers is based on the predictive probability of disease, conditional on given multiple biomarker outcomes. If the computed predictive probability exceeds a specified cutoff, the corresponding subject is allocated as 'diseased'. This defines a standard diagnostic criterion that has its own ROC curve, namely, the combined ROC (cROC). The AUC metric for cROC, namely, the combined AUC (cAUC), is used to compare the predictive criterion based on multiple biomarkers to one based on fewer biomarkers. A multivariate random-effects model is proposed for modeling multiple normally distributed dependent scores. Bayesian methods for estimating ROC curves and corresponding (marginal) AUCs are developed when a perfect reference standard is not available. In addition, cAUCs are computed to compare the accuracy of different combinations of biomarkers for diagnosis. The methods are evaluated using simulations and are applied to data for Johne's disease (paratuberculosis) in cattle. Copyright © 2015 John Wiley & Sons, Ltd.
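A simplified sketch of the cAUC comparison: combine biomarkers into a single score, then compare its AUC (Mann-Whitney estimator) with that of an individual biomarker. A fixed linear score with assumed weights stands in for the paper's Bayesian multivariate random-effects model, which is not reproduced here; the data are simulated.

```python
import numpy as np

# Compare the AUC of one biomarker with the AUC of a combined score.  The linear
# combination and its weights are illustrative stand-ins for the paper's Bayesian
# predictive probability; all data are simulated.

rng = np.random.default_rng(0)
n = 200
disease = rng.random(n) < 0.3
# Two hypothetical biomarkers that are higher, on average, in diseased subjects
b1 = rng.normal(loc=np.where(disease, 1.0, 0.0), scale=1.0)
b2 = rng.normal(loc=np.where(disease, 0.8, 0.0), scale=1.0)


def auc(scores, labels):
    """AUC via the Mann-Whitney statistic: P(score_diseased > score_healthy)."""
    pos, neg = scores[labels], scores[~labels]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties


combined = 1.0 * b1 + 0.8 * b2   # assumed weights, for illustration only
print(f"AUC(biomarker 1)  = {auc(b1, disease):.3f}")
print(f"cAUC(combination) = {auc(combined, disease):.3f}")
```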
FY16 Status Report on Development of Integrated EPP and SMT Design Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jetter, R. I.; Sham, T. -L.; Wang, Y.
2016-08-01
The goal of the Elastic-Perfectly Plastic (EPP) combined integrated creep-fatigue damage evaluation approach is to incorporate a Simplified Model Test (SMT) data-based approach for creep-fatigue damage evaluation into the EPP methodology to avoid the separate evaluation of creep and fatigue damage and eliminate the requirement for stress classification in current methods, thus greatly simplifying evaluation of elevated temperature cyclic service. The EPP methodology is based on the idea that creep damage and strain accumulation can be bounded by a properly chosen “pseudo” yield strength used in an elastic-perfectly plastic analysis, thus avoiding the need for stress classification. The original SMT approach is based on the use of elastic analysis. The experimental data, cycles to failure, are correlated using the elastically calculated strain range in the test specimen, and the corresponding component strain is also calculated elastically. The advantage of this approach is that it is no longer necessary to use the damage interaction, or D-diagram, because the damage due to the combined effects of creep and fatigue is accounted for in the test data by means of a specimen that is designed to replicate or bound the stress and strain redistribution that occurs in actual components when loaded in the creep regime. The reference approach to combining the two methodologies and the corresponding uncertainties and validation plans are presented. Results from recent key feature tests are discussed to illustrate the applicability of the EPP methodology and the behavior of materials at elevated temperature when undergoing stress and strain redistribution due to plasticity and creep.
NASA Astrophysics Data System (ADS)
Russano, G.; Cavalleri, A.; Cesarini, A.; Dolesi, R.; Ferroni, V.; Gibert, F.; Giusteri, R.; Hueller, M.; Liu, L.; Pivato, P.; Tu, H. B.; Vetrugno, D.; Vitale, S.; Weber, W. J.
2018-02-01
LISA Pathfinder is a differential accelerometer with the main goal being to demonstrate the near perfect free-fall of reference test masses, as is needed for an orbiting gravitational wave observatory, with a target sensitivity of 30 fm s⁻² Hz⁻¹/² at 1 mHz. Any lasting background differential acceleration between the two test masses must be actively compensated, and noise associated with the applied actuation force can be a dominant source of noise. To remove this actuation, and the associated force noise, a ‘free-fall’ actuation control scheme has been designed; actuation is limited to brief impulses, with both test masses in free-fall in the time between the impulses, allowing measurement of the remaining acceleration noise sources. In this work, we present an on-ground torsion pendulum testing campaign of this technique and associated data analysis algorithms at a level nearing the sub-femto-g/√Hz performance required for LISA Pathfinder.
QUALITY CONTROL - VARIABILITY IN PROTOCOLS
The EPA Risk Reduction Engineering Laboratory’s Quality Assurance Office, which published the popular pocket guide Preparing Perfect Project Plans, is now introducing another quality assurance reference aid. The document Variability in Protocols (VIP) was initially designed as a ...
Object Based Systems Engineering
2011-10-17
…practically impossible where the original SMEs are unavailable or lack perfect recall. 7. Capture the precious and transient logic behind this … complex system. References: 1. Fitch, J., Exploiting Decision-to-Requirements Traceability, briefing to NDIA CMMI Conference, November 2009; 2. …
Validation of extremes within the Perfect-Predictor Experiment of the COST Action VALUE
NASA Astrophysics Data System (ADS)
Hertig, Elke; Maraun, Douglas; Wibig, Joanna; Vrac, Mathieu; Soares, Pedro; Bartholy, Judith; Pongracz, Rita; Mares, Ileana; Gutierrez, Jose Manuel; Casanueva, Ana; Alzbutas, Robertas
2016-04-01
Extreme events are of widespread concern due to their damaging consequences for natural and anthropogenic systems. From science to applications, the statistical attributes of rare and infrequent occurrence and low probability become connected with the socio-economic aspect of strong impact. Specific end-user needs regarding information about extreme events depend on the type of application, but as a joining element there is always the request for easily accessible climate change information with a clear description of its uncertainties and limitations. Within the Perfect-Predictor Experiment of the COST Action VALUE, extreme indices modelled with a wide range of downscaling methods are compared to reference indices calculated from observational data. The experiment uses reference data from a selection of 86 weather stations representative of the different climates in Europe. Results are presented for temperature and precipitation extremes and include aspects of the marginal distribution as well as spell-length-related aspects.
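The evaluation logic described above reduces, for each index, to computing it from the downscaled series and from the station reference series and reporting the difference; the sketch below does this for a placeholder index (mean annual maximum of daily precipitation) on synthetic data.

```python
import numpy as np

# Compute an extreme index from a downscaled series and from the reference series,
# then report the bias.  The index and the random data are placeholders for the
# VALUE index set and real station observations.

rng = np.random.default_rng(1)
years, days = 30, 365
reference = rng.gamma(shape=0.4, scale=6.0, size=(years, days))   # mm/day, synthetic
downscaled = rng.gamma(shape=0.4, scale=5.5, size=(years, days))  # a biased "model"


def mean_annual_max(daily):
    return daily.max(axis=1).mean()


idx_ref = mean_annual_max(reference)
idx_mod = mean_annual_max(downscaled)
print(f"reference index = {idx_ref:.1f} mm, downscaled = {idx_mod:.1f} mm, "
      f"relative bias = {100.0 * (idx_mod - idx_ref) / idx_ref:+.1f}%")
```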
A nudging-based data assimilation method: the Back and Forth Nudging (BFN) algorithm
NASA Astrophysics Data System (ADS)
Auroux, D.; Blum, J.
2008-03-01
This paper deals with a new data assimilation algorithm, called Back and Forth Nudging (BFN). The standard nudging technique consists in adding to the equations of the model a relaxation term that forces the model toward the observations. The BFN algorithm consists in repeatedly performing forward and backward integrations of the model with relaxation (or nudging) terms, using opposite signs in the direct and inverse integrations, so as to make the backward evolution numerically stable. This algorithm was first tested on the standard Lorenz model with discrete observations (perfect or noisy) and compared with the variational assimilation method. The same type of study was then performed on the viscous Burgers equation, again comparing with the variational method and focusing on the time evolution of the reconstruction error, i.e. the difference between the reference trajectory and the identified one over a time period composed of an assimilation period followed by a prediction period. The possible use of the BFN algorithm as an initialization for the variational method has also been investigated. Finally, the algorithm was tested on a layered quasi-geostrophic model with sea-surface height observations. The behaviours of the two algorithms were compared in the presence of perfect or noisy observations, and also for imperfect models. This allowed us to reach a conclusion concerning the relative performances of the two algorithms.
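A minimal sketch of the BFN iteration on a scalar decay model with perfect observations at every step; the toy model, gain, and window length are illustrative and are not the Lorenz, Burgers, or quasi-geostrophic configurations studied in the paper.

```python
import numpy as np

# Minimal Back and Forth Nudging (BFN) sketch on a scalar decay model
# dx/dt = -a*x, with perfect observations at every time step.  The forward pass
# adds +K*(y - x); the backward pass, integrated from T back to 0, uses the
# opposite sign so that it stays numerically stable.  All values are illustrative.

a, K = 0.5, 2.0
dt, nsteps = 0.01, 500                      # assimilation window T = 5

t = np.arange(nsteps + 1) * dt
y_obs = 1.0 * np.exp(-a * t)                # observations from the true run, x(0) = 1

x0 = 0.2                                    # wrong first guess of the initial state
for iteration in range(5):
    # forward pass with nudging toward the observations
    x = x0
    for k in range(nsteps):
        x += dt * (-a * x + K * (y_obs[k] - x))
    # backward pass: same nudging term with opposite sign, integrated in reverse
    for k in range(nsteps, 0, -1):
        x -= dt * (-a * x - K * (y_obs[k] - x))
    x0 = x                                  # updated estimate of the initial state
    print(f"iteration {iteration + 1}: estimated x(0) = {x0:.4f}")
```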
NASA Technical Reports Server (NTRS)
2003-01-01
VANDENBERG AFB, CALIF. - In the NASA spacecraft processing facility on North Vandenberg Air Force Base, a worker checks the installation of a solar array panel onto the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
NASA Technical Reports Server (NTRS)
2003-01-01
VANDENBERG AFB, CALIF. - In the NASA spacecraft processing facility on North Vandenberg Air Force Base, the Gravity Probe B spacecraft is seen with two solar array panels installed. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
NASA Technical Reports Server (NTRS)
2003-01-01
VANDENBERG AFB, CALIF. - Workers in the NASA spacecraft processing facility on North Vandenberg Air Force Base attach a solar array panel on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
NASA Technical Reports Server (NTRS)
2003-01-01
VANDENBERG AFB, CALIF. - In the NASA spacecraft processing facility on North Vandenberg Air Force Base, the Gravity Probe B spacecraft is seen with all four solar array panels installed. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA’s Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein’s general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth’s rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look in a precision manner for tiny changes in the direction of spin.
NASA Technical Reports Server (NTRS)
2003-01-01
VANDENBERG AFB, CALIF. Workers in the NASA spacecraft processing facility on North Vandenberg Air Force Base work on a solar array panel to be installed on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA's Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein's general theory of relativity, which he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth's rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look, with great precision, for tiny changes in the direction of spin.
NASA Technical Reports Server (NTRS)
2003-01-01
VANDENBERG AFB, CALIF. Workers in the NASA spacecraft processing facility on North Vandenberg Air Force Base attach a solar array panel to the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA's Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein's general theory of relativity, which he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth's rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look, with great precision, for tiny changes in the direction of spin.
NASA Technical Reports Server (NTRS)
2003-01-01
VANDENBERG AFB, CALIF. Workers in the NASA spacecraft processing facility on North Vandenberg Air Force Base prepare for the installation of solar array panel 3 on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA's Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein's general theory of relativity, which he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth's rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look, with great precision, for tiny changes in the direction of spin.
ERIC Educational Resources Information Center
Besken, Miri
2016-01-01
The perceptual fluency hypothesis claims that items that are easy to perceive at encoding induce an illusion that they will be easier to remember, despite the finding that perception does not generally affect recall. The current set of studies tested the predictions of the perceptual fluency hypothesis with a picture generation manipulation.…
Yang, Ming; Yang, Yuan-Zhen; Wang, Ya-Qi; Wu, Zhen-Feng; Wang, Xue-Cheng; Luo, Jing
2017-03-01
Product quality relies not only on testing methods but also on design and development, production control, and all aspects of manufacturing and logistics management; quality ultimately comes from the level of process control. It is therefore very important to accurately identify the factors that may introduce quality risk in the production process and to define corresponding quality control measures. This article systematically analyzes the sources of quality risk in each stage of the production process for traditional Chinese medicine preparations, discusses ways and methods of quality risk identification for traditional Chinese medicine preparations, and provides a reference for perfecting whole-process quality management of traditional Chinese medicine preparations. Copyright© by the Chinese Pharmaceutical Association.
WOGEN. Work Order Generation Macros for Word Perfect 6.X for Windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grenz, G.
Included are three general WP macros (two independent and one multiple) and a template used at the Test Reactor Area (TRA) for the generation of the Work Orders (WOs) used to perform corrective and preventive maintenance, as well as modifications of existing systems and installation of new systems. They incorporate facility-specific requirements as well as selected federal/state orders. These macros are used to generate a WP document, which is then converted into ASCII text for import into the maintenance software. Currently we are using MCRS, but the output should be compatible with other platforms such as Passport. Reference the included file Wogen.txt for installation and usage instructions.
Work Order Generation Macros for Word Perfect 6.X for Windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grenz, Gordon G.
Included are three general WP macros (two independent and one multiple) and a template used at the Test Reactor Area (TRA) for the generation of the Work Orders (WOs) used to perform corrective and preventive maintenance, as well as modifications of existing systems and installation of new systems. They incorporate facility-specific requirements as well as selected federal/state orders. These macros are used to generate a WP document, which is then converted into ASCII text for import into the maintenance software. Currently we are using MCRS, but the output should be compatible with other platforms such as Passport. Reference the included file Wogen.txt for installation and usage instructions.
Yang, Jubiao; Yu, Feimi; Krane, Michael; Zhang, Lucy T
2018-01-01
In this work, a non-reflective boundary condition, the Perfectly Matched Layer (PML) technique, is adapted and implemented in a fluid-structure interaction numerical framework to demonstrate that proper boundary conditions are necessary not only to capture correct wave propagation in a flow field, but also to capture the behavior and response of the interacting solid. While most research on non-reflective boundary conditions has focused on fluids, little has been done in a fluid-structure interaction setting. In this study, the effectiveness of the PML is closely examined in both pure fluid and fluid-structure interaction settings by incorporating the PML algorithm into a fully coupled fluid-structure interaction framework, the Immersed Finite Element Method. The performance of the PML boundary condition is evaluated and compared to reference solutions for a variety of benchmark test cases, including known and expected solutions of aeroacoustic wave propagation as well as vortex shedding and advection. The application of the PML in numerical simulations of fluid-structure interaction is then investigated to demonstrate the efficacy and necessity of such boundary treatment for capturing the correct solid deformation and flow field without requiring a significantly larger computational domain.
The structure of paranoia in the general population.
Bebbington, Paul E; McBride, Orla; Steel, Craig; Kuipers, Elizabeth; Radovanovic, Mirjana; Brugha, Traolach; Jenkins, Rachel; Meltzer, Howard I; Freeman, Daniel
2013-06-01
Psychotic phenomena appear to form a continuum with normal experience and beliefs, and may build on common emotional interpersonal concerns. We tested predictions that paranoid ideation is exponentially distributed and hierarchically arranged in the general population, and that persecutory ideas build on more common cognitions of mistrust, interpersonal sensitivity and ideas of reference. Items were chosen from the Structured Clinical Interview for DSM-IV Axis II Disorders (SCID-II) questionnaire and the Psychosis Screening Questionnaire in the second British National Survey of Psychiatric Morbidity (n = 8580), to test a putative hierarchy of paranoid development using confirmatory factor analysis, latent class analysis and factor mixture modelling analysis. Different types of paranoid ideation ranged in frequency from less than 2% to nearly 30%. Total scores on these items followed an almost perfect exponential distribution (r = 0.99). Our four a priori first-order factors were corroborated (interpersonal sensitivity; mistrust; ideas of reference; ideas of persecution). These mapped onto four classes of individual respondents: a rare, severe, persecutory class with high endorsement of all item factors, including persecutory ideation; a quasi-normal class with infrequent endorsement of interpersonal sensitivity, mistrust and ideas of reference, and no ideas of persecution; and two intermediate classes, characterised respectively by relatively high endorsement of items relating to mistrust and to ideas of reference. The paranoia continuum has implications for the aetiology, mechanisms and treatment of psychotic disorders, while confirming the lack of a clear distinction from normal experiences and processes.
Wang, Zhuoyu; Dendukuri, Nandini; Pai, Madhukar; Joseph, Lawrence
2017-11-01
When planning a study to estimate disease prevalence to a pre-specified precision, it is of interest to minimize total testing cost. This is particularly challenging in the absence of a perfect reference test for the disease because different combinations of imperfect tests need to be considered. We illustrate the problem and a solution by designing a study to estimate the prevalence of childhood tuberculosis in a hospital setting. All possible combinations of 3 commonly used tuberculosis tests, including chest X-ray, tuberculin skin test, and a sputum-based test, either culture or Xpert, are considered. For each of the 11 possible test combinations, 3 Bayesian sample size criteria, including average coverage criterion, average length criterion and modified worst outcome criterion, are used to determine the required sample size and total testing cost, taking into consideration prior knowledge about the accuracy of the tests. In some cases, the required sample sizes and total testing costs were both reduced when more tests were used, whereas, in other examples, lower costs are achieved with fewer tests. Total testing cost should be formally considered when designing a prevalence study.
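For orientation, a minimal sketch of why an imperfect reference test complicates prevalence estimation: the classical Rogan-Gladen correction below (not the Bayesian design criteria used in the study; the accuracy values are hypothetical) converts the apparent, test-positive proportion into an estimate of true prevalence from assumed sensitivity and specificity.

```python
# Hypothetical illustration: Rogan-Gladen correction of apparent prevalence
# for a single imperfect test (not the Bayesian sample size criteria of the paper).

def true_prevalence(apparent_prev, sensitivity, specificity):
    """Correct the apparent (test-positive) prevalence for known test accuracy."""
    denom = sensitivity + specificity - 1.0
    if denom <= 0:
        raise ValueError("Test must be informative (Se + Sp > 1).")
    est = (apparent_prev + specificity - 1.0) / denom
    return min(max(est, 0.0), 1.0)  # clip to [0, 1]

# Example with assumed (hypothetical) accuracy values for a TB test:
print(true_prevalence(apparent_prev=0.20, sensitivity=0.80, specificity=0.95))  # 0.20
```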
DOE Office of Scientific and Technical Information (OSTI.GOV)
Longhi, Stefano, E-mail: stefano.longhi@fisi.polimi.it
Quantum recurrence and dynamic localization are investigated in a class of ac-driven tight-binding Hamiltonians, the Krawtchouk quantum chain, which in the undriven case provides a paradigmatic Hamiltonian model that realizes perfect quantum state transfer and mirror inversion. The equivalence between the ac-driven single-particle Krawtchouk Hamiltonian Ĥ(t) and the non-interacting ac-driven bosonic junction Hamiltonian makes it possible to determine in closed form the quasi-energy spectrum of Ĥ(t) and the conditions for exact wave packet reconstruction (dynamic localization). In particular, we show that quantum recurrence, which is predicted by the general quantum recurrence theorem, is exact for the Krawtchouk quantum chain in a dense range of the driving amplitude. Exact quantum recurrence provides perfect wave packet reconstruction at a frequency that is a fraction of the driving frequency, a phenomenon that can be referred to as fractional dynamic localization.
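As background for the undriven limit mentioned above, the following is a minimal numerical sketch (an illustration under stated assumptions, not the paper's driven quasi-energy analysis): the single-excitation hopping matrix with Krawtchouk-type couplings J_n proportional to sqrt((n+1)(N-1-n)) is the standard perfect-state-transfer chain, and with the normalization below an excitation launched at site 0 arrives at the opposite end with unit probability at t = pi.

```python
# Sketch (assumptions noted in the lead-in): undriven Krawtchouk-type chain in the
# single-excitation subspace, H[n, n+1] = 0.5*sqrt((n+1)*(N-1-n)). With this
# normalization H equals the spin-(N-1)/2 operator Jx, so exp(-i*pi*H) mirror-inverts
# the chain: site 0 is mapped onto site N-1 (perfect state transfer).
import numpy as np
from scipy.linalg import expm

N = 8                                    # number of sites (arbitrary choice)
H = np.zeros((N, N))
for n in range(N - 1):
    J = 0.5 * np.sqrt((n + 1) * (N - 1 - n))
    H[n, n + 1] = H[n + 1, n] = J

U = expm(-1j * np.pi * H)                # evolve for t = pi
print(abs(U[N - 1, 0]))                  # ~1.0: excitation transferred from site 0 to N-1
```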
Risk Aversion and the Value of Information.
ERIC Educational Resources Information Center
Eeckhoudt, Louis; Godfroid, Phillippe
2000-01-01
Explains why greater risk aversion does not always imply a greater value of information and may instead lower it. Presents a basic model defining the concept of the value of perfect information and provides a numerical illustration. Includes references. (CMK)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amouyal, Gregory, E-mail: gregamouyal@hotmail.com; Thiounn, Nicolas, E-mail: nicolas.thiounn@aphp.fr; Pellerin, Olivier, E-mail: olivier.pellerin@aphp.fr
Background: Prostatic artery embolization (PAE) has been performed for a few years, but there is no report on PAE using the PErFecTED technique from outside the team that initiated this approach. Objective: This single-center retrospective open-label study reports our experience and clinical results for patients suffering from symptomatic BPH who underwent PAE aiming to use the PErFecTED technique. Materials and Methods: We treated 32 consecutive patients, mean age 65 (52–84 years old), between December 2013 and January 2015. Patients were referred for PAE after failure of medical treatment and refusal of, or contraindication to, surgery. They were treated using the PErFecTED technique, when feasible, with 300–500 µm calibrated microspheres (two-night hospital stay or outpatient procedure). Follow-up was performed at 3, 6, and 12 months. Results: We had a 100 % immediate technical success of embolization (68 % feasibility of the PErFecTED technique) with no immediate complications. After a mean follow-up of 7.7 months, we observed a 78 % rate of clinical success. Mean IPSS decreased from 15.3 to 4.2 (p = .03), mean QoL from 5.4 to 2 (p = .03), mean Qmax increased from 9.2 to 19.2 (p = .25), and mean prostatic volume decreased from 91 to 62 mL (p = .009). There was no retrograde ejaculation and no major complication. Conclusion: PAE using the PErFecTED technique is a safe and efficient technique to treat bothersome LUTS related to BPH. It is of interest to note that the PErFecTED technique cannot be performed in some cases for anatomical reasons.
Comparing diagnostic tests on benefit-risk.
Pennello, Gene; Pantoja-Galicia, Norberto; Evans, Scott
2016-01-01
Comparing diagnostic tests on accuracy alone can be inconclusive. For example, a test may have better sensitivity than another test yet worse specificity. Comparing tests on benefit-risk may be more conclusive because the clinical consequences of diagnostic error are considered. For benefit-risk evaluation, we propose diagnostic yield, the expected distribution of subjects with true positive, false positive, true negative, and false negative test results in a hypothetical population. We construct a table of diagnostic yield that includes the number of false positive subjects experiencing adverse consequences from unnecessary work-up. We then develop a decision theory for evaluating tests. The theory provides additional interpretation to quantities in the diagnostic yield table. It also indicates that the expected utility of a test relative to a perfect test is a weighted accuracy measure: the average of sensitivity and specificity weighted for prevalence and for the relative importance of false positive and false negative testing errors, which is also interpretable as the cost-benefit ratio of treating non-diseased and diseased subjects. We propose plots of diagnostic yield, weighted accuracy, and relative net benefit of tests as functions of prevalence or cost-benefit ratio. Concepts are illustrated with hypothetical screening tests for colorectal cancer, with test-positive subjects being referred to colonoscopy.
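One plausible way to make the "relative net benefit" idea concrete is the decision-analytic sketch below; it does not reproduce the authors' exact weighting scheme, and the prevalence and cost-benefit ratio are assumed values chosen only for illustration.

```python
# Hedged sketch (not the authors' formulas): decision-analytic net benefit of a
# diagnostic test, and its value relative to a perfect test.
def net_benefit(sens, spec, prevalence, fp_to_tp_cost_ratio):
    """Expected true positives minus cost-weighted false positives, per subject."""
    tp_rate = sens * prevalence
    fp_rate = (1.0 - spec) * (1.0 - prevalence)
    return tp_rate - fp_to_tp_cost_ratio * fp_rate

def relative_net_benefit(sens, spec, prevalence, fp_to_tp_cost_ratio):
    """Net benefit relative to a perfect test (every case detected, no false positives)."""
    return net_benefit(sens, spec, prevalence, fp_to_tp_cost_ratio) / prevalence

# Hypothetical colorectal-cancer screening test (all numbers assumed):
print(relative_net_benefit(sens=0.90, spec=0.85, prevalence=0.05, fp_to_tp_cost_ratio=0.1))
```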
Can the Farnsworth D15 Color Vision Test Be Defeated through Practice?
Ng, Jason S; Liem, Sophia C
2018-05-01
This study suggests that it is possible for some patients with severe red-green color vision deficiency to perform perfectly on the Farnsworth D15 test after practicing it. The Farnsworth D15 is a commonly used test to qualify people for certain occupations. For patients with color vision deficiency, there may be high motivation to try to pass the test through practice in order to gain entry into a particular occupation. There is no evidence in the literature on whether it is possible for patients to learn to pass the D15 test through practice. Ten subjects with inherited red-green color vision deficiency and 15 color-normal subjects enrolled in the study. All subjects had anomaloscope testing, color vision book tests, and a Farnsworth D15 at an initial visit. For the D15, the number of major crossovers was determined for each subject; failing the D15 was defined as more than one major crossover. Subjects with color vision deficiency practiced the D15 as long as desired to achieve a perfect score and then returned for a second visit for D15 testing. A paired t test was used to analyze the number of major crossovers at visit 1 versus visit 2. Color-normal subjects did not have any major crossovers. Subjects with color vision deficiency had significantly (P < .001) fewer major crossovers on the D15 test at visit 2 (mean/SD = 2.5/3.0), including five subjects with dichromacy who achieved perfect D15 performance, compared to visit 1 (mean/SD = 8.7/1.3). Practice of the Farnsworth D15 test can lead to perfect performance for some patients with color vision deficiency, and this should be considered in certain cases where occupational entry depends on D15 testing.
Borrini, Francesco; Bolognese, Antonio; Lamy, Aude; Sabourin, Jean-Christophe
2015-01-01
KRAS genotyping is mandatory in metastatic colorectal cancer treatment prior to undertaking antiepidermal growth factor receptor (EGFR) monoclonal antibody therapy. BRAF V600E mutation is often present in colorectal carcinoma with CpG island methylator phenotype and microsatellite instability. Currently, KRAS and BRAF evaluation is based on molecular biology techniques such as SNaPshot or Sanger sequencing. As molecular testing is performed on formalin-fixed paraffin-embedded (FFPE) samples, immunodetection would appear to be an attractive alternative for detecting mutations. Thus, our objective was to assess the validity of KRAS and BRAF immunodetection of mutations compared with the genotyping reference method in colorectal adenocarcinoma. KRAS and BRAF genotyping was assessed by SNaPshot. A rabbit anti-human KRAS polyclonal antibody was tested on 33 FFPE colorectal tumor samples with known KRAS status. Additionally, a mouse anti-human BRAF monoclonal antibody was tested on 30 FFPE tumor samples with known BRAF status. KRAS immunostaining demonstrated both poor sensitivity (27%) and specificity (64%) in detecting KRAS mutation. Conversely, BRAF immunohistochemistry showed perfect sensitivity (100%) and specificity (100%) in detecting V600E mutation. Although molecular biology remains the reference method for detecting KRAS mutation, immunohistochemistry could be an attractive method for detecting BRAF V600E mutation in colorectal cancer. PMID:25983749
Highlights from BNL and RHIC 2015
NASA Astrophysics Data System (ADS)
Tannenbaum, M. J.
The following sections are included: * Introduction * News from BNL since ISSP2014 * RHIC Operations in 2015 and accelerator future plans * 10th Anniversary Celebration of the Perfect Liquid * RHIC Beam Energy Scan (BES) in search of a critical point-aided by Lattice QCD * References
NASA Astrophysics Data System (ADS)
Guerrout, EL-Hachemi; Ait-Aoudia, Samy; Michelucci, Dominique; Mahiou, Ramdane
2018-05-01
Many routine medical examinations produce images of patients suffering from various pathologies. Given the huge number of medical images, manual analysis and interpretation has become a tedious task, so automatic image segmentation has become essential for diagnosis assistance. Segmentation consists in dividing the image into homogeneous and significant regions. We focus on hidden Markov random fields, referred to as HMRF, to model the segmentation problem. This modelling leads to a classical function minimisation problem. The Broyden-Fletcher-Goldfarb-Shanno algorithm, referred to as BFGS, is one of the most powerful methods for solving unconstrained optimisation problems. In this paper, we investigate the combination of HMRF and the BFGS algorithm to perform the segmentation operation. The proposed method shows very good segmentation results compared with well-known approaches. The tests are conducted on brain magnetic resonance image databases (BrainWeb and IBSR) widely used to objectively compare the results obtained. The well-known Dice coefficient (DC) was used as the similarity metric. The experimental results show that, in many cases, our proposed method approaches the perfect segmentation with a Dice coefficient above 0.9. Moreover, it generally outperforms other methods in the tests conducted.
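For reference, the Dice coefficient used above as the similarity metric is simply twice the overlap of two masks divided by their total size; a minimal sketch with toy masks follows (the masks are hypothetical, not study data).

```python
# Minimal sketch: the Dice coefficient (DC) between a computed segmentation
# and a ground-truth mask.
import numpy as np

def dice(seg, truth):
    """Dice coefficient between two binary masks: 2*|A and B| / (|A| + |B|)."""
    seg, truth = np.asarray(seg, bool), np.asarray(truth, bool)
    denom = seg.sum() + truth.sum()
    return 1.0 if denom == 0 else 2.0 * np.logical_and(seg, truth).sum() / denom

# Toy example (hypothetical 1-D "masks"):
print(dice([1, 1, 0, 0, 1], [1, 1, 1, 0, 1]))  # 2*3/(3+4) ~ 0.857
```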
NASA Technical Reports Server (NTRS)
2003-01-01
VANDENBERG AFB, CALIF. In the NASA spacecraft processing facility on North Vandenberg Air Force Base, workers stand by as the balloon at right is released to lift the solar array panel into position for installation on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA's Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein's general theory of relativity, which he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth's rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look, with great precision, for tiny changes in the direction of spin.
NASA Technical Reports Server (NTRS)
2003-01-01
VANDENBERG AFB, CALIF. In the NASA spacecraft processing facility on North Vandenberg Air Force Base, workers prepare to attach the top of a solar array panel onto the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA's Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein's general theory of relativity, which he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth's rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look, with great precision, for tiny changes in the direction of spin.
NASA Technical Reports Server (NTRS)
2003-01-01
VANDENBERG AFB, CALIF. Workers in the NASA spacecraft processing facility on North Vandenberg Air Force Base attach supports to a solar array panel to be lifted and installed on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA's Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein's general theory of relativity, which he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth's rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look, with great precision, for tiny changes in the direction of spin.
NASA Technical Reports Server (NTRS)
2003-01-01
VANDENBERG AFB, CALIF. In the NASA spacecraft processing facility on North Vandenberg Air Force Base, workers prepare to attach the top of a solar array panel onto the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA's Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein's general theory of relativity, which he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth's rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look, with great precision, for tiny changes in the direction of spin.
NASA Technical Reports Server (NTRS)
2003-01-01
VANDENBERG AFB, CALIF. Workers in the NASA spacecraft processing facility on North Vandenberg Air Force Base prepare to rotate the framework containing one of four solar panels to be installed on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA's Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein's general theory of relativity, which he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth's rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look, with great precision, for tiny changes in the direction of spin.
NASA Technical Reports Server (NTRS)
2003-01-01
VANDENBERG AFB, CALIF. A worker in the NASA spacecraft processing facility on North Vandenberg Air Force Base adjusts the supports on a solar array panel to be lifted and installed on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA's Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein's general theory of relativity, which he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth's rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look, with great precision, for tiny changes in the direction of spin.
NASA Technical Reports Server (NTRS)
2003-01-01
VANDENBERG AFB, CALIF. In the NASA spacecraft processing facility on North Vandenberg Air Force Base, a balloon gently lifts the solar array panel to be installed on the Gravity Probe B spacecraft. Installing each array is a 3-day process and includes a functional deployment test. The Gravity Probe B mission is a relativity experiment developed by NASA's Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein's general theory of relativity, which he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth's rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look, with great precision, for tiny changes in the direction of spin.
Educational Gymnastics: Enhancing Children's Physical Literacy
ERIC Educational Resources Information Center
Baumgarten, Sam; Pagnano-Richardson, Karen
2010-01-01
Virtually all current physical education curriculum guides and textbooks include sections on learner outcomes based on the national standards for physical education, which often refer to gymnastics skills. Gymnastics is a perfect venue for teaching movement concepts, developing and maintaining overall body fitness, fostering personal and social…
NASA Technical Reports Server (NTRS)
Larimer, James; Gille, Jennifer; Luszcz, Jeff; Hindson, William S. (Technical Monitor)
1997-01-01
Carlson and Cohen suggest that 'the perfect image is one that looks like a piece of the world viewed through a picture frame.' They propose that the metric for the perfect image be the discriminability of the reconstructed image from the ideal image the reconstruction is meant to represent. If these two images, the ideal and the reconstruction, are noticeably different, then the reconstruction is less than perfect. If they cannot be discriminated, then the reconstructed image is perfect. This definition has the advantage that it can be used to define 'good enough' image quality. An image that fully satisfies a task's image quality requirements, for example text legibility, is selected to be the standard. Rendered images are then compared to the standard. Rendered images that are indiscriminable from the standard are good enough. Test patterns and test image sets serve as standards for many tasks and are commonplace in the image communications and display industries, so this is not a new or novel idea.
Boshkova, T; Mitev, K
2016-03-01
In this work we present test procedures, approval criteria and results from two metrological inspections of a certified large-volume (152)Eu source (a drum of about 200 L) intended for calibration of HPGe gamma assay systems used for activity measurement of radioactive waste drums. The aim of the inspections was to prove the stability of the calibration source during its working life. The large-volume source was designed and produced in 2007. It consists of 448 identical sealed radioactive sources (modules) apportioned in 32 transparent plastic tubes, which were placed in a wooden matrix that filled the drum. During the inspections the modules were subjected to tests for verification of their certified characteristics. The results show perfect compliance with the NIST basic guidelines for the properties of a radioactive certified reference material (CRM) and demonstrate the stability of the large-volume CRM drum after 7 years of operation. Copyright © 2015 Elsevier Ltd. All rights reserved.
A differential equation for the Generalized Born radii.
Fogolari, Federico; Corazza, Alessandra; Esposito, Gennaro
2013-06-28
The Generalized Born (GB) model offers a convenient way of representing electrostatics in complex macromolecules like proteins or nucleic acids. The computation of atomic GB radii is currently performed by different non-local approaches involving volume or surface integrals. Here we obtain a non-linear second-order partial differential equation for the Generalized Born radius, which may be solved using local iterative algorithms. The equation is derived under the assumption that the usual GB approximation to the reaction field obeys Laplace's equation. The equation admits as particular solutions the correct GB radii for the sphere and the plane. The tests performed on a set of 55 different proteins show an overall agreement with other reference GB models and "perfect" Poisson-Boltzmann based values.
High stability wavefront reference source
Feldman, M.; Mockler, D.J.
1994-05-03
A thermally and mechanically stable wavefront reference source which produces a collimated output laser beam is disclosed. The output beam comprises substantially planar reference wavefronts which are useful for aligning and testing optical interferometers. The invention receives coherent radiation from an input optical fiber, directs a diverging input beam of the coherent radiation to a beam folding mirror (to produce a reflected diverging beam), and collimates the reflected diverging beam using a collimating lens. In a class of preferred embodiments, the invention includes a thermally and mechanically stable frame comprising rod members connected between a front end plate and a back end plate. The beam folding mirror is mounted on the back end plate, and the collimating lens mounted to the rods between the end plates. The end plates and rods are preferably made of thermally stable metal alloy. Preferably, the input optical fiber is a single mode fiber coupled to an input end of a second single mode optical fiber that is wound around a mandrel fixedly attached to the frame of the apparatus. The output end of the second fiber is cleaved so as to be optically flat, so that the input beam emerging therefrom is a nearly perfect diverging spherical wave. 7 figures.
High stability wavefront reference source
Feldman, Mark; Mockler, Daniel J.
1994-01-01
A thermally and mechanically stable wavefront reference source which produces a collimated output laser beam. The output beam comprises substantially planar reference wavefronts which are useful for aligning and testing optical interferometers. The invention receives coherent radiation from an input optical fiber, directs a diverging input beam of the coherent radiation to a beam folding mirror (to produce a reflected diverging beam), and collimates the reflected diverging beam using a collimating lens. In a class of preferred embodiments, the invention includes a thermally and mechanically stable frame comprising rod members connected between a front end plate and a back end plate. The beam folding mirror is mounted on the back end plate, and the collimating lens mounted to the rods between the end plates. The end plates and rods are preferably made of thermally stable metal alloy. Preferably, the input optical fiber is a single mode fiber coupled to an input end of a second single mode optical fiber that is wound around a mandrel fixedly attached to the frame of the apparatus. The output end of the second fiber is cleaved so as to be optically flat, so that the input beam emerging therefrom is a nearly perfect diverging spherical wave.
Code of Federal Regulations, 2010 CFR
2010-07-01
... property subject to forfeiture. (f) The terms Chief, Asset Forfeiture and Money Laundering Section, and Chief, refer to the Chief of the Asset Forfeiture and Money Laundering Section, Criminal Division...) Was created as a result of an exchange of money, goods, or services; and (3) Is perfected against the...
Overemphasis on Perfectly Competitive Markets in Microeconomics Principles Textbooks
ERIC Educational Resources Information Center
Hill, Roderick; Myatt, Anthony
2007-01-01
Microeconomic principles courses focus on perfectly competitive markets far more than other market structures. The authors examine five possible reasons for this but find none of them sufficiently compelling. They conclude that textbook authors should place more emphasis on how economists select appropriate models and test models' predictions…
Time reference in agrammatic aphasia: A cross-linguistic study
Bastiaanse, Roelien; Bamyaci, Elif; Hsu, Chien-Ju; Lee, Jiyeon; Duman, Tuba Yarbay; Thompson, Cynthia K.
2015-01-01
It has been shown across several languages that verb inflection is difficult for agrammatic aphasic speakers. In particular, Tense inflection is vulnerable. Several theoretical accounts for this have been posed, for example, a pure syntactic one suggesting that the Tense node is unavailable due to its position in the syntactic tree (Friedmann & Grodzinsky, 1997); one suggesting that the interpretable features of the Tense node are underspecified (Burchert, Swoboda-Moll, & De Bleser, 2005; Wenzlaff & Clahsen, 2004, 2005); and a morphosemantic one, arguing that the diacritic Tense features are affected in agrammatism (Faroqi–Shah & Dickey, 2009; Lee, Milman, & Thompson, 2008). However recent findings (Bastiaanse, 2008) and a reanalysis of some oral production studies (e.g. Lee et al., 2008; Nanousi, Masterson, Druks, & Atkinson, 2006) suggest that both Tense and Aspect are impaired and, most importantly, reference to the past is selectively impaired, both through simple verb forms (such as simple past in English) and through periphrastic verb forms (such as the present perfect, ‘has V-ed’, in English). It will be argued that reference to the past is discourse linked and reference to the present and future is not (Zagona, 2003, in press). In-line with Avrutin’s (2000) theory that suggests discourse linking is impaired in Broca’s aphasia, the PAst DIscourse LInking Hypothesis (PADILIH) has been formulated. Three predictions were tested: (1) patients with agrammatic aphasia are selectively impaired in use of grammatical morphology associated with reference to the past, whereas, inflected forms which refer to the present and future are relatively spared; (2) this impairment is language-independent; and (3) this impairment will occur in both production and comprehension. Agrammatic Chinese, English and Turkish speakers were tested with the Test for Assessing Reference of Time (TART; Bastiaanse, Jonkers, & Thompson, unpublished). Results showed that both the English and Turkish agrammatic speakers performed as hypothesized, showing a selective deficit for production of inflected forms referring to the past, despite the typological difference between the languages. The Chinese agrammatic speakers were poor in reference to the past as well, but reference to the present and future also was severely impaired. For comprehension, the results were strikingly similar for the three languages: reference to the past was impaired for all. These results confirmed our hypothesis that reference to the past is discourse linked and, therefore, grammatical morphology used for reference to the past is impaired in agrammatic aphasia, whether this is done through Tense and/or Aspect markers. PMID:26451073
NASA Plum Brook's B-2 Test Facility: Thermal Vacuum and Propellant Test Facility
NASA Technical Reports Server (NTRS)
Kudlac, Maureen T.; Weaver, Harold F.; Cmar, Mark D.
2012-01-01
The National Aeronautics and Space Administration (NASA) Glenn Research Center (GRC) Plum Brook Station (PBS) Spacecraft Propulsion Research Facility, commonly referred to as B-2, is NASA's third largest thermal vacuum facility. It is the largest designed to store and transfer large quantities of liquid hydrogen and liquid oxygen, and is perfectly suited to support developmental testing of upper stage chemical propulsion systems as well as fully integrated stages. The facility is also capable of providing thermal-vacuum simulation services to support testing of large lightweight structures, Cryogenic Fluid Management (CFM) systems, electric propulsion test programs, and other In-Space propulsion programs. A recently completed integrated system test demonstrated the refurbished thermal vacuum capabilities of the facility. The test used the modernized data acquisition and control system to monitor the facility. The heat sink provided a uniform temperature environment of approximately 77 K. The modernized infrared lamp array produced a nominal heat flux of 1.4 kW/sq m. With the lamp array and heat sink operating simultaneously, the thermal systems produced a heat flux pattern simulating radiation to space on one surface and solar exposure on the other surface.
Pairagon: a highly accurate, HMM-based cDNA-to-genome aligner.
Lu, David V; Brown, Randall H; Arumugam, Manimozhiyan; Brent, Michael R
2009-07-01
The most accurate way to determine the intron-exon structures in a genome is to align spliced cDNA sequences to the genome. Thus, cDNA-to-genome alignment programs are a key component of most annotation pipelines. The scoring system used to choose the best alignment is a primary determinant of alignment accuracy, while heuristics that prevent consideration of certain alignments are a primary determinant of runtime and memory usage. Both accuracy and speed are important considerations in choosing an alignment algorithm, but scoring systems have received much less attention than heuristics. We present Pairagon, a pair hidden Markov model based cDNA-to-genome alignment program, as the most accurate aligner for sequences with high- and low-identity levels. We conducted a series of experiments testing alignment accuracy with varying sequence identity. We first created 'perfect' simulated cDNA sequences by splicing the sequences of exons in the reference genome sequences of fly and human. The complete reference genome sequences were then mutated to various degrees using a realistic mutation simulator and the perfect cDNAs were aligned to them using Pairagon and 12 other aligners. To validate these results with natural sequences, we performed cross-species alignment using orthologous transcripts from human, mouse and rat. We found that aligner accuracy is heavily dependent on sequence identity. For sequences with 100% identity, Pairagon achieved accuracy levels of >99.6%, with one quarter of the errors of any other aligner. Furthermore, for human/mouse alignments, which are only 85% identical, Pairagon achieved 87% accuracy, higher than any other aligner. Pairagon source and executables are freely available at http://mblab.wustl.edu/software/pairagon/
Perfect polydactylism in hind feet of a gray squirrel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dunaway, P.B.
1969-01-01
An adult gray squirrel from an isolated natural population had seven toes and nine plantar tubercles on each hind foot. The extra digits were articulated with supernumerary cuneiforms and appeared to have been functional. Polydactylism in wild adult rodents is probably a rare anomaly. 6 references, 2 figures.
Cyber and the American Way of War
2015-04-13
perfect fit in the American way of war, cyber’s uniqueness will challenge the current American way of war. To operate effectively in war that includes...Counter Terrorism Reference Center. 36 Danzig, Richard J. Surviving on a Diet of Poisoned Fruit: Reducing the National Security Risks of America’s
The Digital Sublime: Lessons from Kelli Connell's "Double Life"
ERIC Educational Resources Information Center
Huang, Yi-hui
2012-01-01
The digital sublime refers to digital-composite photography that presents "the existence of something unpresentable" and that renders a matchless look a sophisticated fabrication, a perfect and clean composition, a maximum color saturation, a multiple-point perspective, and stunning or newfangled content. Abandoning the traditional one-shot mode…
Perfect count: a novel approach for the single platform enumeration of absolute CD4+ T-lymphocytes.
Storie, Ian; Sawle, Alex; Goodfellow, Karen; Whitby, Liam; Granger, Vivian; Ward, Rosalie Y; Peel, Janet; Smart, Theresa; Reilly, John T; Barnett, David
2004-01-01
The derivation of reliable CD4(+) T lymphocyte counts is vital for monitoring disease progression and therapeutic effectiveness in HIV(+) individuals. Flow cytometry has emerged as the method of choice for CD4(+) T lymphocyte enumeration, with single-platform technology, coupled with reference counting beads, fast becoming the "gold standard." However, although single-platform, bead-based sample acquisition requires the ratio of beads to cells to remain unchanged, until recently there has been no available method to monitor this. Perfect Count beads have been developed to address this issue and incorporate two bead populations, with different densities, to allow the detection of inadequate mixing. Comparison of the relative proportions of both beads with the manufacturer's defined limits enables an internal QC check during sample acquisition. In this study, we have compared CD4(+) T lymphocyte counts, obtained from 104 HIV(+) patients, using TruCount beads with MultiSet software (defined as the predicate method) and the new Perfect Count beads, incorporating an in-house sequential gating strategy. We have demonstrated an excellent degree of correlation between the predicate method and the Perfect Count system (r(2) = 0.9955; Bland-Altman bias +27 CD4(+) T lymphocytes/microl). The Perfect Count system is a robust method for performing single-platform absolute counts and has the added advantage of having internal QC checks. Such an approach enables the operator to identify potential problems during sample preparation, acquisition and analysis. Copyright 2003 Wiley-Liss, Inc.
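The single-platform arithmetic underlying bead-based counting can be sketched as follows; the bead concentration, event counts, and the tolerance in the ratio check are hypothetical, and the ratio check only imitates, in spirit, the kind of internal QC that dual-density beads provide.

```python
# Hedged sketch of single-platform, bead-based absolute counting (all numbers hypothetical;
# this imitates, but does not reproduce, the Perfect Count / TruCount arithmetic).
def absolute_count(cell_events, bead_events, beads_per_ul):
    """Cells per microlitre, assuming cells and beads are acquired in the same ratio
    as they are present in the stained sample."""
    return cell_events / bead_events * beads_per_ul

def bead_ratio_ok(low_density_events, high_density_events, expected_ratio=1.0, tolerance=0.2):
    """QC check in the spirit of dual-density beads: the observed ratio of the two bead
    populations should stay within assumed (here, hypothetical) limits."""
    observed = low_density_events / high_density_events
    return abs(observed - expected_ratio) <= tolerance * expected_ratio

print(absolute_count(cell_events=2500, bead_events=5000, beads_per_ul=1000))  # 500 cells/ul
print(bead_ratio_ok(980, 1020))                                               # True
```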
Junction Potentials Bias Measurements of Ion Exchange Membrane Permselectivity.
Kingsbury, Ryan S; Flotron, Sophie; Zhu, Shan; Call, Douglas F; Coronell, Orlando
2018-04-17
Ion exchange membranes (IEMs) are versatile materials relevant to a variety of water and waste treatment, energy production, and industrial separation processes. The defining characteristic of IEMs is their ability to selectively allow positive or negative ions to permeate, which is referred to as permselectivity. Measured values of permselectivity that equal unity (corresponding to a perfectly selective membrane) or exceed unity (theoretically impossible) have been reported for cation exchange membranes (CEMs). Such nonphysical results call into question our ability to correctly measure this crucial membrane property. Because weighing errors, temperature, and measurement uncertainty have been shown not to explain these anomalous permselectivity results, we hypothesized that junction potentials occurring at the tips of reference electrodes are a possible explanation. In this work, we tested this hypothesis by comparing permselectivity values obtained from bare Ag/AgCl wire electrodes (which have no junction) to values obtained from single-junction reference electrodes containing two different electrolytes. We show that permselectivity values obtained using reference electrodes with junctions were greater than unity for CEMs. In contrast, electrodes without junctions always produced permselectivities lower than unity. Electrodes with junctions also resulted in artificially low permselectivity values for AEMs compared to electrodes without junctions. Thus, we conclude that junctions in reference electrodes introduce two biases into results in the IEM literature: (i) permselectivity values larger than unity for CEMs and (ii) lower permselectivity values for AEMs compared to those for CEMs. These biases can be avoided by using electrodes without a junction.
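A rough sketch of how a residual junction potential could push apparent permselectivity above unity: apparent permselectivity is commonly taken as the measured membrane potential divided by the ideal (Nernstian) potential, so any additive offset at the electrode tip biases the ratio. The concentrations, the 95% "true" selectivity, and the 3 mV junction offset below are illustrative assumptions, not the authors' data; activities are also approximated by concentrations.

```python
# Hedged sketch: apparent permselectivity and the bias from an uncorrected
# junction potential at a reference-electrode tip. Numbers are illustrative only.
import math

R, T, F = 8.314, 298.15, 96485.0

def ideal_potential(c_high, c_low):
    """Nernstian potential (V) of a perfectly permselective membrane between two
    1:1 electrolyte solutions, approximating activities by concentrations."""
    return (R * T / F) * math.log(c_high / c_low)

def apparent_permselectivity(measured_mV, c_high, c_low, junction_mV=0.0):
    """Measured membrane potential (optionally contaminated by a junction potential),
    divided by the ideal potential."""
    return (measured_mV + junction_mV) / (ideal_potential(c_high, c_low) * 1000.0)

e_ideal_mV = ideal_potential(0.5, 0.1) * 1000.0   # ~41 mV for 0.5 M / 0.1 M
true_membrane_mV = 0.95 * e_ideal_mV              # a hypothetical 95% selective CEM
print(apparent_permselectivity(true_membrane_mV, 0.5, 0.1))                  # 0.95
print(apparent_permselectivity(true_membrane_mV, 0.5, 0.1, junction_mV=3.0)) # > 1 is possible
```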
Modeling Rare and Unique Documents: Using FRBR[subscript OO]/CIDOC CRM
ERIC Educational Resources Information Center
Le Boeuf, Patrick
2012-01-01
Both the library and the museum communities have developed conceptual models for the information they produce about the collections they hold: FRBR (Functional Requirements for Bibliographic Records) and CIDOC CRM (Conceptual Reference Model). But neither proves perfectly adequate when it comes to some specific types of rare and unique materials:…
Basic Grammar in Use: Reference and Practice for Students of English.
ERIC Educational Resources Information Center
Murphy, Raymond
This basic grammar book for beginning to low-intermediate level students of English contains 106 units. The units are divided into the following categories: Present; Past; Present Perfect; Passive; Future and Modals; Imperative; "There" and "It"; Verb Forms; Auxiliary Verbs; Negatives; Questions; "To" and "-ing"; Reported Speech; "Get" and "Go";…
12 CFR 201.110 - Goods held by persons employed by owner.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Board has taken into consideration the changes that have occurred in commercial law and practice since 1933. Modern commercial law, embodied in the Uniform Commercial Code, refers to “perfecting security interests” rather than “securing title” to goods. The Board believes that if, under State law, the issuance...
Spectral Anonymization of Data
Lasko, Thomas A.; Vinterbo, Staal A.
2011-01-01
The goal of data anonymization is to allow the release of scientifically useful data in a form that protects the privacy of its subjects. This requires more than simply removing personal identifiers from the data, because an attacker can still use auxiliary information to infer sensitive individual information. Additional perturbation is necessary to prevent these inferences, and the challenge is to perturb the data in a way that preserves its analytic utility. No existing anonymization algorithm provides both perfect privacy protection and perfect analytic utility. We make the new observation that anonymization algorithms are not required to operate in the original vector-space basis of the data, and many algorithms can be improved by operating in a judiciously chosen alternate basis. A spectral basis derived from the data’s eigenvectors is one that can provide substantial improvement. We introduce the term spectral anonymization to refer to an algorithm that uses a spectral basis for anonymization, and we give two illustrative examples. We also propose new measures of privacy protection that are more general and more informative than existing measures, and a principled reference standard with which to define adequate privacy protection. PMID:21373375
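A toy sketch of the general idea only (not the authors' anonymization algorithms or privacy measures): rotate the data into the eigenvector basis of its covariance, perturb there, and rotate back, so the perturbation respects the data's correlation structure. The noise scale and synthetic data below are arbitrary choices.

```python
# Toy sketch of perturbing data in a spectral (eigenvector) basis rather than the raw basis.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))   # synthetic correlated data

Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)                    # spectral basis of the data

scores = Xc @ eigvecs                                     # coordinates in the spectral basis
noisy = scores + rng.normal(scale=0.1 * np.sqrt(eigvals), size=scores.shape)
X_anon = noisy @ eigvecs.T + X.mean(axis=0)               # back to the original basis

# The covariance structure (one proxy for analytic utility) changes only slightly:
print(np.abs(np.cov(X_anon, rowvar=False) - cov).max())
```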
Mens, Petra F; Matelon, Raphael J; Nour, Bakri Y M; Newman, Dave M; Schallig, Henk D F H
2010-07-19
This study describes the laboratory evaluation of a novel diagnostic platform for malaria. The Magneto Optical Test (MOT) is based on the bio-physical detection of haemozoin in clinical samples. With an assay time of around one minute, it offers the potential of high-throughput screening. Blood samples of confirmed malaria patients from different regions of Africa, patients with other diseases, and healthy non-endemic controls were used in the present study. The samples were analysed with two reference tests, i.e. a histidine-rich protein-2-based rapid diagnostic test (RDT) and a conventional Pan-Plasmodium PCR, and with the MOT as index test. Data were entered in 2 x 2 tables and analysed for sensitivity and specificity. The agreement between microscopy, RDT and PCR and the MOT assay was determined by calculating kappa values with a 95% confidence interval. The observed sensitivity and specificity of the MOT test in comparison with clinical description, RDT or PCR ranged from 77.2 to 78.8% (sensitivity) and from 72.5 to 74.6% (specificity). In general, the agreement between MOT and the other assays is around 0.5, indicating moderate agreement between the reference and the index test. However, when RDT and PCR are compared to each other, an almost perfect agreement can be observed (k = 0.97), with a sensitivity and specificity of >95%. Although MOT sensitivity and specificity are not yet at a competitive level compared to other diagnostic tests such as PCR and RDTs, the MOT has the potential to rapidly screen patients for malaria in endemic as well as non-endemic countries.
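For readers unfamiliar with the agreement statistics quoted above, a minimal sketch of sensitivity, specificity and Cohen's kappa from a 2 x 2 table follows; the counts are hypothetical, chosen only to land near the moderate-agreement range reported.

```python
# Minimal sketch of 2 x 2 agreement statistics (counts hypothetical, not study data).
def sens_spec_kappa(tp, fp, fn, tn):
    """Sensitivity, specificity and Cohen's kappa of an index test vs. a reference test."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    p_observed = (tp + tn) / n
    p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (p_observed - p_expected) / (1 - p_expected)
    return sensitivity, specificity, kappa

print(sens_spec_kappa(tp=78, fp=27, fn=22, tn=73))  # kappa ~ 0.51: "moderate" agreement
```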
Sattler, Tatjana; Wodak, Eveline; Revilla-Fernández, Sandra; Schmoll, Friedrich
2014-12-18
In recent years, several new ELISAs for the detection of antibodies against porcine reproductive and respiratory syndrome virus (PRRSV) in pig serum have been developed. To interpret the results, specificity and sensitivity data as well as agreement with a reference ELISA must be available. In this study, three commercial ELISAs (INgezim PRRS 2.0 - ELISA II, Priocheck® PRRSV Ab porcine - ELISA III and CIVTEST suis PRRS E/S PLUS - ELISA IV, detecting PRRSV type 1 antibodies) were compared to a standard ELISA (IDEXX PRRS X3 Ab Test - ELISA I). The serum of three pigs vaccinated with an attenuated PRRSV live vaccine (genotype 2) was tested prior to and several times after the vaccination. Furthermore, serum samples of 245 pigs from PRRSV-positive herds, 309 pigs from monitored PRRSV-negative herds, 256 fatteners from assumed PRRSV-negative herds with unknown herd history, and 92 wild boars were tested with all four ELISAs. ELISAs II and III were able to detect seroconversion of vaccinated pigs with similar reliability. According to the kappa coefficient, the results showed almost perfect agreement between ELISA I as reference and ELISAs II and III (kappa > 0.8), and substantial agreement between ELISA I and ELISA IV (kappa = 0.71). The sensitivity of ELISAs II, III and IV was 96.0%, 100% and 91.5%, respectively. The specificity of the ELISAs determined in samples from monitored PRRSV-negative herds was 99.0%, 95.1% and 96.4%, respectively. In assumed negative farms that were not continually monitored, more positive samples were found with ELISAs II to IV. The reference ELISA I had a specificity of 100% in this study. All tested ELISAs were able to detect a PRRSV-positive herd. The specificity and sensitivity of the tested commercial ELISAs, however, differed. ELISA II had the highest specificity and ELISA III the highest sensitivity in comparison to the reference ELISA. ELISA IV had a lower sensitivity and specificity than the other ELISAs.
Spherical gradient-index lenses as perfect imaging and maximum power transfer devices.
Gordon, J M
2000-08-01
Gradient-index lenses can be viewed from the perspectives of both imaging and nonimaging optics, that is, in terms of both image fidelity and achievable flux concentration. The simple class of gradient-index lenses with spherical symmetry, often referred to as modified Luneburg lenses, is revisited. An alternative derivation for established solutions is offered; the method of Fermat's strings and the principle of skewness conservation are invoked. Then these nominally perfect imaging devices are examined from the additional vantage point of power transfer, and the degree to which they realize the thermodynamic limit to flux concentration is determined. Finally, the spherical gradient-index lens of the fish eye is considered as a modified Luneburg lens optimized subject to material constraints.
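For orientation, the classical (unmodified) Luneburg profile that these modified designs generalize is the standard textbook result below; it is quoted only for context and is not taken from the paper.

```latex
n(r) = \sqrt{2 - \left(\frac{r}{R}\right)^{2}}, \qquad 0 \le r \le R
```

With this index profile, a parallel bundle of rays entering the sphere of radius R is brought to a perfect focus at the diametrically opposite point on its surface.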
X-ray Moiré deflectometry using synthetic reference images
Stutman, Dan; Valdivia, Maria Pia; Finkenthal, Michael
2015-06-25
Moiré fringe deflectometry with grating interferometers is a technique that enables refraction-based x-ray imaging using a single exposure of an object. To obtain the refraction image, the method requires a reference fringe pattern (without the object). Our study shows that, in order to avoid artifacts, the reference pattern must be exactly matched in phase with the object fringe pattern. In experiments, however, it is difficult to produce a perfectly matched reference pattern due to unavoidable interferometer drifts. We present a simple method to obtain matched reference patterns using a phase-scan procedure to generate synthetic Moiré images. As a result, the method will enable deflectometric diagnostics of transient phenomena such as laser-produced plasmas and could improve the sensitivity and accuracy of medical phase-contrast imaging.
Space, myth and cinematography
NASA Astrophysics Data System (ADS)
Hambardzumov, Arsen
2016-12-01
There exist both ancient and modern myths, built from the struggle between good and evil, sanctity, the character of the mythic hero, and so on. The connection between myth and literature, art, and above all cinematography is essential. Hollywood, "the Dream Factory," is a striking example of that connection. The mythic component in American films is obvious: it shows in the structure of the product, which is frequently built according to mythic rules. One striking example is G. Lucas's film "Star Wars: Episode IV - A New Hope" (1977). The plot is built on the struggle between good and evil: on one side stand the representatives of the Empire with Darth Vader, and on the other Princess Leia and her devotees. Space also played a unique role for the Greek philosophers; it was the symbol of perfection and grace. The attempt to approach this perfection, and the desire to see internal as well as external similarity, is reflected in S. Kubrick's film "2001: A Space Odyssey" (1968). By showing the distances of space, the director looks for a perfection in us that lies in the harmony of truth, humanity and nature.
Benchmark radar targets for the validation of computational electromagnetics programs
NASA Technical Reports Server (NTRS)
Woo, Alex C.; Wang, Helen T. G.; Schuh, Michael J.; Sanders, Michael L.
1993-01-01
Results are presented of a set of computational electromagnetics validation measurements referring to three-dimensional perfectly conducting smooth targets, performed for the Electromagnetic Code Consortium. Plots are presented for both the low- and high-frequency measurements of the NASA almond, an ogive, a double ogive, a cone-sphere, and a cone-sphere with a gap.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-17
... website at http://www.nasdaqtrader.com/micro.aspx?id=PHLXRulefilings , at the principal office of the Exchange, on the Commission's website at http://www.sec.gov/ , and at the Commission's Public Reference... designed to promote just and equitable principles of trade, to remove impediments to and perfect the...
From the Secondary Section: Green Pens, Marginal Notes--Rethinking Writing and Student Engagement
ERIC Educational Resources Information Center
Chadwick, Jocelyn A.
2012-01-01
In his foundational work, "English Composition and Rhetoric," Alexander Bain set forth the framework for what students and teachers now routinely refer to as the five-paragraph essay. Teachers were so inculcated with Bain's paradigm for the "perfect" essay format, they in turn have inculcated their students, and they just say now, "Write an…
Saini, V.; Riekerink, R. G. M. Olde; McClure, J. T.; Barkema, H. W.
2011-01-01
Determining the accuracy and precision of a measuring instrument is pertinent in antimicrobial susceptibility testing. This study was conducted to predict the diagnostic accuracy of the Sensititre MIC mastitis panel (Sensititre) and agar disk diffusion (ADD) method with reference to the manual broth microdilution test method for antimicrobial resistance profiling of Escherichia coli (n = 156), Staphylococcus aureus (n = 154), streptococcal (n = 116), and enterococcal (n = 31) bovine clinical mastitis isolates. The activities of ampicillin, ceftiofur, cephalothin, erythromycin, oxacillin, penicillin, the penicillin-novobiocin combination, pirlimycin, and tetracycline were tested against the isolates. Diagnostic accuracy was determined by estimating the area under the receiver operating characteristic curve; intertest essential and categorical agreements were determined as well. Sensititre and the ADD method demonstrated moderate to highly accurate (71 to 99%) and moderate to perfect (71 to 100%) predictive accuracies for 74 and 76% of the isolate-antimicrobial MIC combinations, respectively. However, the diagnostic accuracy was low for S. aureus-ceftiofur/oxacillin combinations and other streptococcus-ampicillin combinations by either testing method. Essential agreement between Sensititre automatic MIC readings and MIC readings obtained by the broth microdilution test method was 87%. Essential agreement between Sensititre automatic and manual MIC reading methods was 97%. Furthermore, the ADD test method and Sensititre MIC method exhibited 92 and 91% categorical agreement (sensitive, intermediate, resistant) of results, respectively, compared with the reference method. However, both methods demonstrated lower agreement for E. coli-ampicillin/cephalothin combinations than for Gram-positive isolates. In conclusion, the Sensititre and ADD methods had moderate to high diagnostic accuracy and very good essential and categorical agreement for most udder pathogen-antimicrobial combinations and can be readily employed in veterinary diagnostic laboratories. PMID:21270215
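As a rough illustration of the agreement statistics used above, the Python sketch below computes essential agreement (MIC within one two-fold dilution), categorical agreement and a rank-based ROC area for a handful of hypothetical isolates; the MIC values and category calls are invented and are not data from the study.

```python
import numpy as np

def essential_agreement(mic_test, mic_ref):
    """Fraction of isolates whose test MIC is within +/- one two-fold
    dilution of the reference MIC (log2 difference of at most 1)."""
    d = np.abs(np.log2(np.asarray(mic_test, float)) -
               np.log2(np.asarray(mic_ref, float)))
    return float(np.mean(d <= 1.0))

def categorical_agreement(cat_test, cat_ref):
    """Fraction of isolates assigned the same S/I/R category."""
    return float(np.mean(np.asarray(cat_test) == np.asarray(cat_ref)))

def auc_mann_whitney(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic;
    labels are 1 for resistant, 0 for susceptible, scores e.g. log2 MICs."""
    labels = np.asarray(labels); scores = np.asarray(scores, float)
    pos, neg = scores[labels == 1], scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()   # resistant scores higher
    ties = (pos[:, None] == neg[None, :]).sum()     # ties count one half
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical MICs (mg/L) for six isolates, test method vs broth microdilution.
mic_test = [0.25, 0.5, 1.0, 8.0, 16.0, 4.0]
mic_ref  = [0.25, 1.0, 1.0, 4.0, 16.0, 16.0]
resistant = [0, 0, 0, 1, 1, 1]                      # reference categorisation
print(essential_agreement(mic_test, mic_ref))
print(categorical_agreement([0, 0, 0, 1, 1, 0], resistant))
print(auc_mann_whitney(resistant, np.log2(mic_test)))
```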
Schot, Marjolein J C; van Delft, Sanne; Kooijman-Buiting, Antoinette M J; de Wit, Niek J; Hopstaken, Rogier M
2015-01-01
Objective Various point-of-care testing (POCT) urine analysers are commercially available for routine urine analysis in general practice. The present study compares analytical performance, agreement and user-friendliness of six different POCT urine analysers for diagnosing urinary tract infection in general practice. Setting All testing procedures were performed at a diagnostic centre for primary care in the Netherlands. Urine samples were collected at four general practices. Primary and secondary outcome measures Analytical performance and agreement of the POCT analysers regarding nitrite, leucocytes and erythrocytes, with the laboratory reference standard, was the primary outcome measure, and analysed by calculating sensitivity, specificity, positive and negative predictive value, and Cohen's κ coefficient for agreement. Secondary outcome measures were the user-friendliness of the POCT analysers, in addition to other characteristics of the analysers. Results The following six POCT analysers were evaluated: Uryxxon Relax (Macherey Nagel), Urisys 1100 (Roche), Clinitek Status (Siemens), Aution 11 (Menarini), Aution Micro (Menarini) and Urilyzer (Analyticon). Analytical performance was good for all analysers. Compared with laboratory reference standards, overall agreement was good, but differed per parameter and per analyser. Concerning the nitrite test, the most important test for clinical practice, all but one showed perfect agreement with the laboratory standard. For leucocytes and erythrocytes specificity was high, but sensitivity was considerably lower. Agreement for leucocytes varied between good to very good, and for the erythrocyte test between fair and good. First-time users indicated that the analysers were easy to use. They expected higher productivity and accuracy when using these analysers in daily practice. Conclusions The overall performance and user-friendliness of all six commercially available POCT urine analysers was sufficient to justify routine use in suspected urinary tract infections in general practice. PMID:25986635
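For readers who want to reproduce this kind of evaluation, the short sketch below shows how sensitivity, specificity, predictive values and Cohen's κ all follow from a single 2x2 table of POCT results against the laboratory reference; the counts are hypothetical, not data from the study.

```python
def two_by_two_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, predictive values and Cohen's kappa
    for a POCT result judged against the laboratory reference."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    po = (tp + tn) / n                                           # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    return sens, spec, ppv, npv, kappa

# Hypothetical nitrite counts: 40 true positives, 3 false positives,
# 5 false negatives, 152 true negatives versus the laboratory standard.
print(two_by_two_metrics(40, 3, 5, 152))
```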
Dental age assessment of southern Chinese using the United Kingdom Caucasian reference dataset.
Jayaraman, Jayakumar; Roberts, Graham J; King, Nigel M; Wong, Hai Ming
2012-03-10
Dental age assessment is one of the most accurate methods for estimating the age of an unknown person. Demirjian's dataset on a French-Canadian population has been widely tested for its applicability on various ethnic groups including southern Chinese. Following inaccurate results from these studies, investigators are now confronted with using alternate datasets for comparison. Testing the applicability of other reliable datasets which result in accurate findings might limit the need to develop population specific standards. Recently, a Reference Data Set (RDS) similar to Demirjian's was prepared in the United Kingdom (UK) and has been subsequently validated. The advantages of the UK Caucasian RDS include versatility from including both the maxillary and mandibular dentitions, involvement of a wide age group of subjects for evaluation and the possibility of precise age estimation with the mathematical technique of meta-analysis. The aim of this study was to evaluate the applicability of the United Kingdom Caucasian RDS on southern Chinese subjects. Dental panoramic tomographs (DPT) of 266 subjects (133 males and 133 females) aged 2-21 years that were previously taken for clinical diagnostic purposes were selected and scored by a single calibrated examiner based on Demirjian's classification of tooth developmental stages (A-H). The ages corresponding to each tooth developmental stage were obtained from the UK dataset. Intra-examiner reproducibility was tested and the Cohen kappa (0.88) showed that the level of agreement was 'almost perfect'. The estimated dental age was then compared with the chronological age using a paired t-test, with statistical significance set at p<0.01. The results showed that the UK dataset underestimated the age of southern Chinese subjects by 0.24 years but the results were not statistically significant. In conclusion, the UK Caucasian RDS may not be suitable for estimating the age of southern Chinese subjects and there is a need for an ethnic specific reference dataset for southern Chinese. Copyright © 2011. Published by Elsevier Ireland Ltd.
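The core statistical step, comparing estimated dental age with chronological age by a paired t-test, can be sketched in a few lines of Python; the ages below are invented for illustration and are not the study data.

```python
import numpy as np
from scipy import stats

# Hypothetical chronological vs dental-age estimates (years) for ten subjects,
# mimicking the paired comparison used to validate a reference dataset.
chronological = np.array([6.2, 7.5, 8.1, 9.4, 10.0, 11.3, 12.8, 14.1, 15.6, 17.0])
estimated     = np.array([6.0, 7.2, 7.9, 9.0,  9.8, 11.0, 12.5, 13.8, 15.2, 16.8])

bias = np.mean(estimated - chronological)      # negative value = underestimation
t_stat, p_value = stats.ttest_rel(estimated, chronological)
print(f"mean difference {bias:+.2f} y, t = {t_stat:.2f}, p = {p_value:.3f}")
```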
ERIC Educational Resources Information Center
To, Son Thanh
2012-01-01
"Belief state" refers to the set of possible world states satisfying the agent's (usually imperfect) knowledge. The use of belief state allows the agent to reason about the world with incomplete information, by considering each possible state in the belief state individually, in the same way as if it had perfect knowledge. However, the…
Enko, Dietmar; Pollheimer, Verena; Németh, Stefan; Pühringer, Helene; Stolba, Robert; Halwachs-Baumann, Gabriele; Kriegshäuser, Gernot
2016-01-01
Genetic testing is a standard technique for the diagnosis of primary adult-type hypolactasia, also referred to as lactase non-persistence. The aim of this study was to compare the lactase gene (LCT) C/T-13910 polymorphism genotyping results of two commercially available real-time (RT)-PCR assays in patients referred to our outpatient clinic for primary lactose malabsorption testing. Furthermore, concomitant conditions of fructose/sorbitol malabsorption were assessed. Samples obtained from 100 patients were tested in parallel using the LCT T-13910C ToolSet for Light Cycler (Roche, Rotkreuz, Switzerland) and the LCT-13910C>T RealFast Assay (ViennaLab Diagnostics GmbH, Vienna, Austria). Additionally, patients were also screened for the presence of fructose/sorbitol malabsorption by functional hydrogen (H2)/methane (CH4) breath testing (HMBT). Cohen's Kappa (κ) was used to calculate the agreement between the two genotyping methods. The exact Chi-Square test was performed to compare fructose/sorbitol HMBT with LCT genotyping results. Twenty-one (21.0%) patients had a LCT C/C-13910 genotype suggestive of lactase non-persistence, and 79 (79.0%) patients were identified with either a LCT T/C-13910 or T/T-13910 genotype (i.e., lactase persistence). In all genotype groups, concordance between the two RT-PCR assays was 100%. Cohen's κ demonstrated perfect observed agreement (p < 0.001, κ = 1). Fructose and sorbitol malabsorption was observed in 13/100 (13.0%) and 25/100 (25.0%) individuals, respectively. Both RT-PCR assays are robust and reliable LCT genotyping tools in a routine clinical setting. Concomitant fructose and/or sorbitol malabsorption should be considered in individuals with suspected lactase-non-persistence. However, standardization of clinical interpretation of laboratory HMBT results is required.
Reliability modelling and analysis of a multi-state element based on a dynamic Bayesian network
NASA Astrophysics Data System (ADS)
Li, Zhiqiang; Xu, Tingxue; Gu, Junyuan; Dong, Qi; Fu, Linyu
2018-04-01
This paper presents a quantitative reliability modelling and analysis method for multi-state elements based on a combination of the Markov process and a dynamic Bayesian network (DBN), taking perfect repair, imperfect repair and condition-based maintenance (CBM) into consideration. The Markov models of elements without repair and under CBM are established, and an absorbing set is introduced to determine the reliability of the repairable element. According to the state-transition relations between the states determined by the Markov process, a DBN model is built. In addition, its parameters for series and parallel systems, namely, conditional probability tables, can be calculated by referring to the conditional degradation probabilities. The power of a control unit in a failure model is then used as an example. A dynamic fault tree (DFT) is translated into a Bayesian network model, and subsequently extended to a DBN. The results show the state probabilities of an element and of the system without repair, with perfect and imperfect repair, and under CBM; the case with an absorbing set is computed from the differential equations and verified. Through forward inference, the reliability of the control unit is determined for the different maintenance modes. Finally, weak nodes in the control unit are identified.
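Stripped of the DBN machinery, the underlying Markov calculation can be illustrated with a small continuous-time model in Python: a three-state element whose failed state is absorbing, so that reliability is the probability of not having been absorbed. The transition rates below are arbitrary illustrative values, and this sketch does not reproduce the paper's DBN or maintenance models.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical 3-state element: 0 = good, 1 = degraded, 2 = failed (absorbing).
# Q is the continuous-time generator; rows sum to zero, state 2 has no exit.
lam01, lam12, mu10 = 0.02, 0.05, 0.01        # transition rates per hour
Q = np.array([[-lam01,           lam01,   0.0],
              [  mu10, -(mu10 + lam12), lam12],
              [   0.0,             0.0,   0.0]])

p0 = np.array([1.0, 0.0, 0.0])               # start in the good state
for t in (100.0, 500.0, 1000.0):
    p = p0 @ expm(Q * t)                     # state probabilities at time t
    print(f"t={t:6.0f} h  P(states)={np.round(p, 4)}  reliability={1 - p[2]:.4f}")
```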
Werner, Liliana; Müller, Matthias; Tetz, Manfred
2008-02-01
To evaluate the microstructure of the edges of currently available square-edged hydrophobic intraocular lenses (IOLs) in terms of their deviation from an ideal square. Berlin Eye Research Institute, Berlin, Germany. Sixteen designs of hydrophobic acrylic or silicone IOLs were studied. For each design, a +20.0 diopter (D) IOL and a +0.0 D IOL (or the lowest available plus dioptric power) were evaluated. The IOL edge was imaged under high-magnification scanning electron microscopy using a standardized technique. The area above the lateral-posterior edge, representing the deviation from a perfect square, was measured in square micrometers (µm²) using reference circles of 40 µm and 60 µm radius and the AutoCAD LT 2000 system (Autodesk). The IOLs were compared with an experimental square-edged poly(methyl methacrylate) (PMMA) IOL (reference IOL) with an edge design that effectively stopped lens epithelial cell growth in culture in a preliminary study. Two round-edged silicone IOLs were used as controls. The hydrophobic IOLs used, labeled as square-edged IOLs, had an area of deviation from a perfect square ranging from 4.8 to 338.4 µm² (40 µm radius reference circle) and from 0.2 to 524.4 µm² (60 µm radius circle). The deviation area for the square-edged PMMA IOL was 34.0 µm² with a 40 µm radius circle and 37.5 µm² with a 60 µm radius circle. The respective values for the +20.0 D control silicone IOL were 729.3 µm² and 1525.3 µm², and for the +0.0 D control silicone IOL, 727.3 µm² and 1512.7 µm². Seven silicone IOLs of 5 designs had area values that were close to those of the reference square-edged PMMA IOL. Several differences in edge finishing between the IOLs analyzed were also observed. There was a large variation in the deviation area from a perfect square as well as in the edge finishing, not only between different IOL designs but also between different powers of the same design. Clinically, factors such as the shrink-wrapping of the IOL by the capsule may even out or modify the influence of these variations in terms of preventing posterior capsule opacification.
A quantitative comparison of corrective and perfective maintenance
NASA Technical Reports Server (NTRS)
Henry, Joel; Cain, James
1994-01-01
This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.
Evaluating the reliability of an injury prevention screening tool: Test-retest study.
Gittelman, Michael A; Kincaid, Madeline; Denny, Sarah; Wervey Arnold, Melissa; FitzGerald, Michael; Carle, Adam C; Mara, Constance A
2016-10-01
A standardized injury prevention (IP) screening tool can identify family risks and allow pediatricians to address behaviors. To assess behavior changes on later screens, the tool must be reliable for an individual and ideally between household members. Little research has examined the reliability of safety screening tool questions. This study utilized test-retest reliability of parent responses on an existing IP questionnaire and also compared responses between household parents. Investigators recruited parents of children 0 to 1 year of age during admission to a tertiary care children's hospital. When both parents were present, one was chosen as the "primary" respondent. Primary respondents completed the 30-question IP screening tool after consent, and they were re-screened approximately 4 hours later to test individual reliability. The "second" parent, when present, only completed the tool once. All participants received a 10-dollar gift card. Cohen's Kappa was used to estimate test-retest reliability and inter-rater agreement. Standard test-retest criteria consider Kappa values: 0.0 to 0.40 poor to fair, 0.41 to 0.60 moderate, 0.61 to 0.80 substantial, and 0.81 to 1.00 as almost perfect reliability. One hundred five families participated, with five lost to follow-up. Thirty-two (30.5%) parent dyads completed the tool. Primary respondents were generally mothers (88%) and Caucasian (72%). Test-retest of the primary respondents showed their responses to be almost perfect; average 0.82 (SD = 0.13, range 0.49-1.00). Seventeen questions had almost perfect test-retest reliability and 11 had substantial reliability. However, inter-rater agreement between household members for 12 objective questions showed little agreement between responses; inter-rater agreement averaged 0.35 (SD = 0.34, range -0.19-1.00). One question had almost perfect inter-rater agreement and two had substantial inter-rater agreement. The IP screening tool used by a single individual had excellent test-retest reliability for nearly all questions. However, when a reporter changes from pre- to postintervention, differences may reflect poor reliability or different subjective experiences rather than true change.
Luginaah, Isaac N; Yiridoe, Emmanuel K; Taabazuing, Mary-Margaret
2005-10-01
This paper examines efforts by some churches in Ghana to reduce the spread of HIV/AIDS. The analysis is based on focus group discussions with two groups of men and two groups of women, along with in-depth interviews with 13 pastors and marriage counsellors in the churches studied. In response to government and public criticisms about human rights violations, churches that previously imposed mandatory HIV testing on members planning to marry now have voluntary testing programmes. However, the results suggest that what the churches refer to as voluntary testing may not be truly voluntary. Cultural values and traditional practices, including traditional courtship and marriage rites (which are performed before church weddings), not only clash with considerations about pre-marital HIV testing but also complicate the contentious issue of confidentiality of information on HIV testing. Associated with these complexities and issues of confidentiality is a reluctance among participants, particularly those from northern Ghana, to test for HIV. The results reveal how broader social impacts of HIV testing for those planning to marry may extend beyond individuals or couples in different cultural contexts. The findings also support the general view that there are no perfect or easy solutions to combating the HIV/AIDS pandemic. Practical solutions and programs for Ghana cannot be neutral to cultural values and need to be tailored for particular (ethnic) populations.
Validation of a Dumbbell Body Sway Test in Olympic Air Pistol Shooting
Mon, Daniel; Zakynthinaki, Maria S.; Cordente, Carlos A.; Monroy Antón, Antonio; López Jiménez, David
2014-01-01
We present and validate a test able to provide reliable body sway measurements in air pistol shooting, without the use of a gun. Forty-six senior male pistol shooters who had competed in Spanish air pistol championships took part in the study. Body sway data of two static bipodal balance tests have been compared: during the first test, shooting was simulated by use of a dumbbell, while during the second test the shooter's own pistol was used. Both tests were performed the day before the competition, during the official training time and at the training stands to simulate competition conditions. The participants' performance was determined as the total score of 60 shots at competition. Apart from the commonly used variables that refer to movements of the shooter's centre of pressure (COP), such as COP displacements on the X and Y axes, maximum and average COP velocities and total COP area, the present analysis also included variables that provide information regarding the axes of the COP ellipse (length and angle with respect to X). A strong statistically significant correlation between the two tests was found (with an interclass correlation varying between 0.59 and 0.92). A statistically significant inverse linear correlation was also found between performance and COP movements. The study concludes that dumbbell tests are perfectly valid for measuring body sway by simulating pistol shooting. PMID:24756067
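The COP-ellipse variables mentioned above (axis lengths and orientation with respect to X) are commonly derived from the eigen-decomposition of the COP covariance matrix; the Python sketch below shows one such computation on synthetic sway data and is not the authors' processing pipeline.

```python
import numpy as np

def cop_ellipse(cop_x, cop_y):
    """Length of the major/minor axes and orientation (vs. the X axis) of the
    95% confidence ellipse of the centre-of-pressure trajectory."""
    xy = np.vstack([cop_x, cop_y])
    eigval, eigvec = np.linalg.eigh(np.cov(xy))          # ascending eigenvalues
    chi2_95 = 5.991                                      # chi-square, 2 dof, 95%
    minor, major = 2.0 * np.sqrt(chi2_95 * eigval)       # full axis lengths
    angle = np.degrees(np.arctan2(eigvec[1, -1], eigvec[0, -1]))
    area = np.pi * major * minor / 4.0                   # ellipse area
    return major, minor, angle, area

rng = np.random.default_rng(0)                           # synthetic sway record
x = rng.normal(0.0, 2.0, 3000)                           # mm, medio-lateral
y = 0.4 * x + rng.normal(0.0, 1.0, 3000)                 # mm, antero-posterior
print(cop_ellipse(x, y))
```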
Lehtola, Susi; Parkhill, John; Head-Gordon, Martin
2016-10-07
Novel implementations based on dense tensor storage are presented here for the singlet-reference perfect quadruples (PQ) [J. A. Parkhill et al., J. Chem. Phys. 130, 084101 (2009)] and perfect hextuples (PH) [J. A. Parkhill and M. Head-Gordon, J. Chem. Phys. 133, 024103 (2010)] models. The methods are obtained as block decompositions of conventional coupled-cluster theory that are exact for four electrons in four orbitals (PQ) and six electrons in six orbitals (PH), but that can also be applied to much larger systems. PQ and PH have storage requirements that scale as the square, and as the cube of the number of active electrons, respectively, and exhibit quartic scaling of the computational effort for large systems. Applications of the new implementations are presented for full-valence calculations on linear polyenes (CnHn+2), which highlight the excellent computational scaling of the present implementations that can routinely handle active spaces of hundreds of electrons. The accuracy of the models is studied in the π space of the polyenes, in hydrogen chains (H50), and in the π space of polyacene molecules. In all cases, the results compare favorably to density matrix renormalization group values. With the novel implementation of PQ, active spaces of 140 electrons in 140 orbitals can be solved in a matter of minutes on a single core workstation, and the relatively low polynomial scaling means that very large systems are also accessible using parallel computing.
NASA Astrophysics Data System (ADS)
Lehtola, Susi; Parkhill, John; Head-Gordon, Martin
2016-10-01
Novel implementations based on dense tensor storage are presented for the singlet-reference perfect quadruples (PQ) [J. A. Parkhill et al., J. Chem. Phys. 130, 084101 (2009)] and perfect hextuples (PH) [J. A. Parkhill and M. Head-Gordon, J. Chem. Phys. 133, 024103 (2010)] models. The methods are obtained as block decompositions of conventional coupled-cluster theory that are exact for four electrons in four orbitals (PQ) and six electrons in six orbitals (PH), but that can also be applied to much larger systems. PQ and PH have storage requirements that scale as the square, and as the cube of the number of active electrons, respectively, and exhibit quartic scaling of the computational effort for large systems. Applications of the new implementations are presented for full-valence calculations on linear polyenes (CnHn+2), which highlight the excellent computational scaling of the present implementations that can routinely handle active spaces of hundreds of electrons. The accuracy of the models is studied in the π space of the polyenes, in hydrogen chains (H50), and in the π space of polyacene molecules. In all cases, the results compare favorably to density matrix renormalization group values. With the novel implementation of PQ, active spaces of 140 electrons in 140 orbitals can be solved in a matter of minutes on a single core workstation, and the relatively low polynomial scaling means that very large systems are also accessible using parallel computing.
Consistency-based rectification of nonrigid registrations
Gass, Tobias; Székely, Gábor; Goksel, Orcun
2015-01-01
We present a technique to rectify nonrigid registrations by improving their group-wise consistency, which is a widely used unsupervised measure to assess pair-wise registration quality. While pair-wise registration methods cannot guarantee any group-wise consistency, group-wise approaches typically enforce perfect consistency by registering all images to a common reference. However, errors in individual registrations to the reference then propagate, distorting the mean and accumulating in the pair-wise registrations inferred via the reference. Furthermore, the assumption that perfect correspondences exist is not always true, e.g., for interpatient registration. The proposed consistency-based registration rectification (CBRR) method addresses these issues by minimizing the group-wise inconsistency of all pair-wise registrations using a regularized least-squares algorithm. The regularization controls the adherence to the original registration, which is additionally weighted by the local postregistration similarity. This allows CBRR to adaptively improve consistency while locally preserving accurate pair-wise registrations. We show that the resulting registrations are not only more consistent, but also have lower average transformation error when compared to known transformations in simulated data. On clinical data, we show improvements of up to 50% target registration error in breathing motion estimation from four-dimensional MRI and improvements in atlas-based segmentation quality of up to 65% in terms of mean surface distance in three-dimensional (3-D) CT. Such improvement was observed consistently using different registration algorithms, dimensionality (two-dimensional/3-D), and modalities (MRI/CT). PMID:26158083
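The consistency idea is easiest to see for pure translations, where composition of transformations reduces to addition and group-wise consistency means t_ik ≈ t_ij + t_jk. The toy Python sketch below rectifies noisy pairwise 1-D translations by a least-squares fit of per-image offsets (with a simple gauge constraint); it illustrates only the consistency principle, not the regularized, similarity-weighted CBRR algorithm.

```python
import numpy as np

# Hypothetical noisy pairwise translations t[i, j] between four images
# (ground-truth offsets 0, 1, 2, 4  =>  t_ij = offset[j] - offset[i]).
true = np.array([0.0, 1.0, 2.0, 4.0])
rng = np.random.default_rng(1)
n = len(true)
t_obs = true[None, :] - true[:, None] + rng.normal(0, 0.3, (n, n))
np.fill_diagonal(t_obs, 0.0)

# Rectify by fitting per-image offsets o_i that best explain all observed
# pairs in the least-squares sense, plus one gauge-fixing equation.
rows, rhs = [], []
for i in range(n):
    for j in range(n):
        if i != j:
            r = np.zeros(n); r[j], r[i] = 1.0, -1.0
            rows.append(r); rhs.append(t_obs[i, j])
rows.append(np.ones(n)); rhs.append(0.0)        # gauge fixing: offsets sum to 0
o, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
t_rect = o[None, :] - o[:, None]                # perfectly consistent pairwise set
print(np.round(t_rect - (true[None, :] - true[:, None]), 3))  # small residuals
```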
NASA Astrophysics Data System (ADS)
Anastassiu, Hristos T.
2003-04-01
The physical optics approximation is employed in the derivation of a closed form expression for the radar cross section (RCS) of a flat, perfectly conducting plate of various shapes, located over a dielectric, possibly lossy half-space. The half-space is assumed to lie in the far field region of the plate. The well-known "four-path model" is invoked in a first-order approximation of the half-space contribution to the scattering mechanisms. Numerical results are compared to a reference, Moment Method solution, and the agreement is investigated, to assess the accuracy of the approximations used. The analytical expressions derived can facilitate very fast RCS calculations for realistic scatterers, such as ships in a sea environment, or aircraft flying low over the ground.
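As a simplified numerical companion, the sketch below evaluates the textbook physical-optics RCS of a perfectly conducting rectangular plate and multiplies it by a first-order four-path ground-bounce factor; the plate size, frequency, reflection coefficient and path difference are illustrative assumptions, and this is not the closed-form expression derived in the paper for arbitrary plate shapes over a lossy half-space.

```python
import numpy as np

def po_rcs_plate(a, b, wavelength, theta):
    """Monostatic physical-optics RCS (m^2) of a perfectly conducting a x b
    plate, incidence angle theta from the normal, in the plane of side a."""
    k = 2.0 * np.pi / wavelength
    u = k * a * np.sin(theta)
    return (4.0 * np.pi * (a * b) ** 2 / wavelength**2) * (np.cos(theta) *
            np.sinc(u / np.pi)) ** 2          # np.sinc(x) = sin(pi x)/(pi x)

def four_path_factor(reflection_coeff, path_difference, wavelength):
    """First-order 'four-path' ground-bounce factor applied to the RCS:
    the two-way field sees (1 + rho*exp(-jk*delta)) twice, hence the 4th power."""
    k = 2.0 * np.pi / wavelength
    f = 1.0 + reflection_coeff * np.exp(-1j * k * path_difference)
    return np.abs(f) ** 4

lam = 0.03                                    # 10 GHz, illustrative
theta = np.radians(10.0)
sigma = po_rcs_plate(0.5, 0.3, lam, theta)
print(sigma, sigma * four_path_factor(-0.8, 0.045, lam))
```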
TriAnnot: A Versatile and High Performance Pipeline for the Automated Annotation of Plant Genomes
Leroy, Philippe; Guilhot, Nicolas; Sakai, Hiroaki; Bernard, Aurélien; Choulet, Frédéric; Theil, Sébastien; Reboux, Sébastien; Amano, Naoki; Flutre, Timothée; Pelegrin, Céline; Ohyanagi, Hajime; Seidel, Michael; Giacomoni, Franck; Reichstadt, Mathieu; Alaux, Michael; Gicquello, Emmanuelle; Legeai, Fabrice; Cerutti, Lorenzo; Numa, Hisataka; Tanaka, Tsuyoshi; Mayer, Klaus; Itoh, Takeshi; Quesneville, Hadi; Feuillet, Catherine
2012-01-01
In support of the international effort to obtain a reference sequence of the bread wheat genome and to provide plant communities dealing with large and complex genomes with a versatile, easy-to-use online automated tool for annotation, we have developed the TriAnnot pipeline. Its modular architecture allows for the annotation and masking of transposable elements, the structural, and functional annotation of protein-coding genes with an evidence-based quality indexing, and the identification of conserved non-coding sequences and molecular markers. The TriAnnot pipeline is parallelized on a 712 CPU computing cluster that can run a 1-Gb sequence annotation in less than 5 days. It is accessible through a web interface for small scale analyses or through a server for large scale annotations. The performance of TriAnnot was evaluated in terms of sensitivity, specificity, and general fitness using curated reference sequence sets from rice and wheat. In less than 8 h, TriAnnot was able to predict more than 83% of the 3,748 CDS from rice chromosome 1 with a fitness of 67.4%. On a set of 12 reference Mb-sized contigs from wheat chromosome 3B, TriAnnot predicted and annotated 93.3% of the genes among which 54% were perfectly identified in accordance with the reference annotation. It also allowed the curation of 12 genes based on new biological evidences, increasing the percentage of perfect gene prediction to 63%. TriAnnot systematically showed a higher fitness than other annotation pipelines that are not improved for wheat. As it is easily adaptable to the annotation of other plant genomes, TriAnnot should become a useful resource for the annotation of large and complex genomes in the future. PMID:22645565
2003-01-01
... what several executives referred to as the "perfect storm" now blowing through the aviation market. With this information many questions remain: Will ...
Air-water analogy and the study of hydraulic models
NASA Technical Reports Server (NTRS)
Supino, Giulio
1953-01-01
The author first sets forth some observations about the theory of models. He then establishes certain general criteria for the construction of dynamically similar models in water and in air, through reference to the perfect-fluid equations and to those pertaining to viscous flow. It is, in addition, pointed out that there are more cases in which the analogy is possible than is commonly supposed.
Absolute calibration of optical flats
Sommargren, Gary E.
2005-04-05
The invention uses the phase shifting diffraction interferometer (PSDI) to provide a true point-by-point measurement of absolute flatness over the surface of optical flats. Beams exiting the fiber optics in a PSDI have perfect spherical wavefronts. The measurement beam is reflected from the optical flat and passed through an auxiliary optic to then be combined with the reference beam on a CCD. The combined beams include phase errors due to both the optic under test and the auxiliary optic. Standard phase extraction algorithms are used to calculate this combined phase error. The optical flat is then removed from the system and the measurement fiber is moved to recombine the two beams. The newly combined beams include only the phase errors due to the auxiliary optic. When the second phase measurement is subtracted from the first phase measurement, the absolute phase error of the optical flat is obtained.
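The two-measurement subtraction at the heart of this calibration can be mimicked numerically: subtract the auxiliary-optic-only phase map from the combined map and re-wrap the result. The Python sketch below uses synthetic phase maps; the aberration shapes are invented for illustration and do not represent the patented instrument.

```python
import numpy as np

def absolute_flat_error(phase_with_flat, phase_aux_only):
    """Subtract the auxiliary-optic phase map from the combined measurement
    and re-wrap, leaving only the phase error of the optical flat (radians)."""
    diff = phase_with_flat - phase_aux_only
    return np.angle(np.exp(1j * diff))        # wrap result to (-pi, pi]

# Toy maps: the 'flat' contributes a small astigmatic error on top of the
# auxiliary optic's larger aberration.
ny, nx = 128, 128
y, x = np.mgrid[-1:1:ny * 1j, -1:1:nx * 1j]
aux = 3.0 * (x**2 + y**2)                     # auxiliary-optic aberration
flat = 0.2 * (x**2 - y**2)                    # true flat error to recover
recovered = absolute_flat_error(aux + flat, aux)
print(float(np.max(np.abs(recovered - flat))))   # essentially zero
```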
Strong correlation in incremental full configuration interaction
NASA Astrophysics Data System (ADS)
Zimmerman, Paul M.
2017-06-01
Incremental Full Configuration Interaction (iFCI) reaches high accuracy electronic energies via a many-body expansion of the correlation energy. In this work, the Perfect Pairing (PP) ansatz replaces the Hartree-Fock reference of the original iFCI method. This substitution captures a large amount of correlation at zero-order, which allows iFCI to recover the remaining correlation energy with low-order increments. The resulting approach, PP-iFCI, is size consistent, size extensive, and systematically improvable with increasing order of incremental expansion. Tests on multiple single bond, multiple double bond, and triple bond dissociations of main group polyatomics using double and triple zeta basis sets demonstrate the power of the method for handling strong correlation. The smooth dissociation profiles that result from PP-iFCI show that FCI-quality ground state computations are now within reach for systems with up to about 10 heavy atoms.
NASA Technical Reports Server (NTRS)
2003-01-01
VANDENBERG AFB, CALIF. In the NASA spacecraft processing facility on North Vandenberg Air Force Base, Dr. Francis Everitt, principal investigator, and Brad Parkinson, co-principal investigator, both from Stanford University, hold one of the small gyroscopes used in the Gravity Probe B spacecraft. The GP-B towers behind them. The Gravity Probe B mission is a relativity experiment developed by NASA's Marshall Space Flight Center, Stanford University and Lockheed Martin. The spacecraft will test two extraordinary predictions of Albert Einstein's general theory of relativity that he advanced in 1916: the geodetic effect (how space and time are warped by the presence of the Earth) and frame dragging (how Earth's rotation drags space and time around with it). Gravity Probe B consists of four sophisticated gyroscopes that will provide an almost perfect space-time reference system. The mission will look with great precision for tiny changes in the direction of spin.
Zhang, Rufan; Zhang, Yingying; Wei, Fei
2017-02-21
Carbon nanotubes (CNTs) have drawn intensive research interest in the past 25 years due to their excellent properties and wide applications. Ultralong CNTs refer to horizontally aligned CNT arrays that are usually grown on flat substrates, parallel to each other and with large intertube distances. They usually have perfect structures, excellent properties, and lengths up to centimeters, even decimeters. Ultralong CNTs are promising candidates as building blocks for transparent displays, nanoelectronics, superstrong tethers, aeronautics and aerospace materials, etc. The controlled synthesis of ultralong CNTs with perfect structures is the key to fully exploiting the extraordinary properties of CNTs. CNTs are typical one-dimensional single-crystal nanomaterials, and it has always been a great challenge to grow macroscale single crystals with no defects. Thus, the synthesis of ultralong CNTs with no defects is of significant importance from both fundamental and industrial aspects. In this Account, we focus on our progress on the controlled synthesis of ultralong CNTs with perfect structures and excellent properties. A deep understanding of the CNT growth mechanism is the first step toward the controlled synthesis of ultralong CNTs with high quality. We first introduce the growth mechanism for ultralong CNTs and the main factor affecting their structures. We then discuss the strategies to control the defects in the as-grown ultralong CNTs. With these approaches, ultralong high-quality CNTs with different structures can be obtained. By completely eliminating the factors which may induce defects in the CNT walls, ultralong CNTs with perfect structures can be obtained. Their chiral indices remain unchanged over several centimeters along the axial direction of the CNTs. The defect-free structures give the ultralong CNTs excellent electrical, mechanical and thermal properties. The as-grown ultralong CNTs exhibit superhigh mechanical strength (>100 GPa), and their breaking strain (>17.5%) reaches the theoretical limit. They also show excellent electrical and thermal properties. In addition, centimeters-long CNTs showed macroscale interwall superlubricity due to their defect-free structures. Ultralong, defect-free CNTs with controlled structures are highly desirable for many high-end applications. We hope that this Account will shed light on the controlled synthesis of ultralong CNTs with perfect structures and excellent properties. Moreover, the growth mechanism and controlled synthesis of ultralong CNTs with perfect structures also offer a good model for other one-dimensional nanomaterials.
Bendable X-ray Optics for High Resolution Imaging
NASA Technical Reports Server (NTRS)
Gubarev, M.; Ramsey, B.; Kilaru, K.; Atkins, C.; Broadway, D.
2014-01-01
Current state-of-the-art for x-ray optics fabrication calls for either the polishing of massive substrates into high-angular-resolution mirrors or the replication of thin, lower-resolution mirrors from perfectly figured mandrels. Future X-ray missions will require a change in this optics fabrication paradigm in order to achieve sub-arcsecond resolution in light-weight optics. One possible approach to this is to start with a perfectly flat, light-weight surface, bend it into a perfect cone, form the desired mirror figure by material deposition, and insert the resulting mirror into a telescope structure. Such an approach is currently being investigated at MSFC, and a status report will be presented detailing the results of finite element analyses, bending tests and differential deposition experiments.
Non-invasive assessment of liver fibrosis
Papastergiou, Vasilios; Tsochatzis, Emmanuel; Burroughs, Andrew K.
2012-01-01
The presence and degree of hepatic fibrosis is crucial in order to make therapeutic decisions and predict clinical outcomes. Currently, the place of liver biopsy as the standard of reference for assessing liver fibrosis has been challenged by the increasing awareness of a number of drawbacks related to its use (invasiveness, sampling error, inter-/intraobserver variability). In parallel with this, noninvasive assessment of liver fibrosis has experienced explosive growth in recent years and a wide spectrum of noninvasive methods ranging from serum assays to imaging techniques have been developed. Some are validated methods, such as the Fibrotest/ Fibrosure and transient elastography in Europe, and are gaining a growing role in routine clinical practice, especially in chronic hepatitis C. Large-scale validation is awaited in the setting of other chronic liver diseases. However, noninvasive tests used to detect significant fibrosis and cirrhosis, the two major clinical endpoints, are not yet at a level of performance suitable for routine diagnostic tests, and there is still no perfect surrogate or method able to completely replace an optimal liver biopsy. This article aims to review current noninvasive tests for the assessment of liver fibrosis and the perspectives for their rational use in clinical practice. PMID:24714123
A novel expert system for objective masticatory efficiency assessment
2018-01-01
Most of the tools and diagnosis models of Masticatory Efficiency (ME) are not well documented or severely limited to simple image processing approaches. This study presents a novel expert system for ME assessment based on automatic recognition of mixture patterns of masticated two-coloured chewing gums using a combination of computational intelligence and image processing techniques. The hypotheses tested were that the proposed system could accurately relate specimens to the number of chewing cycles, and that it could identify differences between the mixture patterns of edentulous individuals prior and after complete denture treatment. This study enrolled 80 fully-dentate adults (41 females and 39 males, 25 ± 5 years of age) as the reference population; and 40 edentulous adults (21 females and 19 males, 72 ± 8.9 years of age) for the testing group. The system was calibrated using the features extracted from 400 samples covering 0, 10, 15, and 20 chewing cycles. The calibrated system was used to automatically analyse and classify a set of 160 specimens retrieved from individuals in the testing group in two appointments. The ME was then computed as the predicted number of chewing strokes that a healthy reference individual would need to achieve a similar degree of mixture measured against the real number of cycles applied to the specimen. The trained classifier obtained a Mathews Correlation Coefficient score of 0.97. ME measurements showed almost perfect agreement considering pre- and post-treatment appointments separately (κ ≥ 0.95). Wilcoxon signed-rank test showed that a complete denture treatment for edentulous patients elicited a statistically significant increase in the ME measurements (Z = -2.31, p < 0.01). We conclude that the proposed expert system proved able and reliable to accurately identify patterns in mixture and provided useful ME measurements. PMID:29385165
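The Matthews correlation coefficient quoted above generalises to more than two classes (Gorodkin's R_K) and can be computed directly from the confusion matrix, as in the Python sketch below; the confusion matrix shown is hypothetical, not the study's result.

```python
import numpy as np

def multiclass_mcc(confusion):
    """Matthews correlation coefficient generalised to K classes
    (Gorodkin's R_K), computed from a K x K confusion matrix."""
    C = np.asarray(confusion, dtype=float)
    t = C.sum(axis=1)          # true counts per class
    p = C.sum(axis=0)          # predicted counts per class
    c, s = np.trace(C), C.sum()
    denom = np.sqrt((s**2 - p @ p) * (s**2 - t @ t))
    return (c * s - t @ p) / denom if denom else 0.0

# Hypothetical confusion matrix for 0/10/15/20 chewing-cycle classes.
cm = [[24, 1, 0, 0],
      [ 1, 23, 1, 0],
      [ 0, 1, 24, 0],
      [ 0, 0, 1, 24]]
print(round(multiclass_mcc(cm), 3))
```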
Serdar, Muhittin A; Turan, Mustafa; Cihan, Murat
2008-06-01
Laboratory specialists currently need to access scientific-based information at anytime and anywhere. A considerable period of time and too much effort are required to access this information through existing accumulated data. Personal digital assistants (PDA) are supposed to provide an effective solution with commercial software for this problem. In this study, 11 commercial software products (UpToDate, ePocrates, Inforetrive, Pepid, eMedicine, FIRST Consult, and 5 laboratory e-books released by Skyscape and/or Isilo) were selected and the benefits of their use were evaluated by seven laboratory specialists. The assessment of the software was performed based on the number of the tests included, the software content of detailed information for each test-like process, method, interpretation of results, reference ranges, critical values, interferences, equations, pathophysiology, supplementary technical details such as sample collection principles, and additional information such as linked references, evidence-based data, test cost, etc. In terms of technique, the following items are considered: the amount of memory required to run the software, the graphical user interface, which is a user-friendly instrument, and the frequency of new and/or up-date releases. There is still no perfect program, as we have anticipated. Interpretation of laboratory results may require software with an integrated program. However, methodological data are mostly not included in the software evaluated. It seems that these shortcomings will be fixed in the near future, and PDAs and relevant medical applications will also become indispensable for all physicians including laboratory specialists in the field of training/education and in patient care.
LITTLE JOE 2 - LAUNCH VEHICLES - VA
1961-04-13
G61-00030 (4 Nov. 1959) --- Launch of Little Joe-2 from Wallops Island carrying Mercury spacecraft test article. The suborbital test flight of the Mercury capsule was to test the escape system. Vehicle functioned perfectly, but escape rocket ignited several seconds too late. Photo credit: NASA
Evaluation of bearing capacity of piles from cone penetration test data : technical summary.
DOT National Transportation Integrated Search
1999-11-01
Among the different in situ tests, cone penetration test (CPT) is considered the most frequently used method for characterization of geomedia. Due to the soft nature of soil deposits in Louisiana, the CPT is considered a perfect tool for characteriza...
Glossary | Efficient Windows Collaborative
Fragmented glossary entries on window terminology (double-hung window, bay, bead stop, blackbody, clerestory); the definitions are truncated and interleaved in the source.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-07
... proposing to assess fines ranging from $2,000 to $4,000 for a first offense and $4,000 to $5,000 for a second offense. Any subsequent offenses will be subject to a fine of $5,000 or referred to C2's Business... principles of trade, to prevent fraudulent and manipulative acts, to remove impediments to and to perfect the...
[The application of radiological image in forensic medicine].
Zhang, Ji-Zong; Che, Hong-Min; Xu, Li-Xiang
2006-04-01
Personal identification, including sex determination and age and stature estimation, is an important task in forensic investigation. Human identification based on the analysis of radiological images is a practical and appropriate method in the forensic sciences. This paper broadly reviews the use of forensic radiology in the field in order to assess its advantages and limitations, and to provide a reference for perfecting the application of forensic radiology in forensic science.
Reliability modelling and analysis of a multi-state element based on a dynamic Bayesian network
Xu, Tingxue; Gu, Junyuan; Dong, Qi; Fu, Linyu
2018-01-01
This paper presents a quantitative reliability modelling and analysis method for multi-state elements based on a combination of the Markov process and a dynamic Bayesian network (DBN), taking perfect repair, imperfect repair and condition-based maintenance (CBM) into consideration. The Markov models of elements without repair and under CBM are established, and an absorbing set is introduced to determine the reliability of the repairable element. According to the state-transition relations between the states determined by the Markov process, a DBN model is built. In addition, its parameters for series and parallel systems, namely, conditional probability tables, can be calculated by referring to the conditional degradation probabilities. The power of a control unit in a failure model is then used as an example. A dynamic fault tree (DFT) is translated into a Bayesian network model, and subsequently extended to a DBN. The results show the state probabilities of an element and of the system without repair, with perfect and imperfect repair, and under CBM; the case with an absorbing set is computed from the differential equations and verified. Through forward inference, the reliability of the control unit is determined for the different maintenance modes. Finally, weak nodes in the control unit are identified. PMID:29765629
Tosun, Ozge Celiker; Solmaz, Ulas; Ekin, Atalay; Tosun, Gokhan; Gezer, Cenk; Ergenoglu, Ahmet Mete; Yeniel, Ahmet Ozgur; Mat, Emre; Malkoc, Mehtap; Askar, Niyazi
2016-01-01
[Purpose] The aim of this study was to evaluate whether the effect of pelvic floor exercises on pelvic floor muscle strength could be detected via ultrasonography in patients with urinary incontinence. [Subjects and Methods] Of 282 incontinent patients, 116 participated in the study and were randomly divided into a pelvic floor muscle training (n=65) group or control group (n=51). The pelvic floor muscle training group was given pelvic floor exercise training for 12 weeks. Both groups were evaluated at the beginning of the study and after 12 weeks. Abdominal ultrasonography measurements in transverse and longitudinal planes, the PERFECT scheme, perineometric evaluation, the stop test, the stress test, and the pad test were used to assess pelvic floor muscle strength in all cases. [Results] After training, the PERFECT, perineometry and transabdominal ultrasonography measurements were found to be significantly improved, and the stop test and pad test results were significantly decreased in the pelvic floor muscle training group, whereas no difference was observed in the control group. There was a positive correlation between the PERFECT force measurement scale and ultrasonography force measurement scale before and after the intervention in the control and pelvic floor muscle training groups (r=0.632 and r=0.642, respectively). [Conclusion] Ultrasonography can be used as a noninvasive method to identify the change in pelvic floor muscle strength with exercise training. PMID:27065519
Moulton, Calum
2014-10-01
Perfect pitch, or absolute pitch (AP), is defined as the ability to identify or produce the pitch of a sound without need for a reference pitch, and is generally regarded as a valuable asset to the musician. However, there has been no recent review of the literature examining its aetiology and its utility taking into account emerging scientific advances in AP research, notably in functional imaging. This review analyses the key empirical research on AP, focusing on genetic and neuroimaging studies. The review concludes that: AP probably has a genetic predisposition, although this is based on limited evidence; early musical training is almost certainly essential for AP acquisition; and, although there is evidence that it may be relevant to speech processing, AP can interfere with relative pitch, an ability on which humans rely to communicate effectively. The review calls into question the value of AP to musicians and non-musicians alike. © 2014 Royal College of Physicians.
Minimal mechanisms for school formation in self-propelled particles
NASA Astrophysics Data System (ADS)
Li, Yue-Xian; Lukeman, Ryan; Edelstein-Keshet, Leah
2008-05-01
In the context of social organisms, a school refers to a cohesive group of organisms that share a common speed and direction of motion, as well as a common axis of body alignment or polarization. Schools are also noted for the relatively fixed nearest-neighbour distances between individuals. The rules of interaction that lead to the formation and maintenance of a school structure have been explored experimentally, analytically, and by simulation. Interest in biological examples, and non-biological “self-propelled particles” such as robots, vehicles, or autonomous agents leads to the question of what are the simplest possible sets of rules that can assure the formation and the stability of the “perfect school”: an aggregate in which the nearest-neighbour distances and speeds are identical. Here we explore mechanisms that lead to a perfect school structure in one and two dimensions. We consider distance-detection as well as velocity-detection between the interacting pairs of self-propelled particles. We construct interaction forces and formulate schooling equations. In the simplest cases, these equations have analytic solutions. In many cases, the stability of the perfect school can be explored. We then investigate how these structures form and evolve over time from various initial configurations using simulations. We study the relationship between the assumed interaction forces and the school patterns that emerge. While true biological schools are far from perfect, the insights gained from this investigation can help to understand some properties of real schools, and to suggest the appropriate properties of artificial schools where coordinated motion is desired.
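A minimal one-dimensional version of such a schooling rule can be simulated in a few lines: each follower adjusts its speed in proportion to the deviation of its front gap from a rest distance, and the spacings relax toward the "perfect school". The parameters below are arbitrary, and the sketch is only a caricature of the interaction forces analysed in the paper.

```python
import numpy as np

# Minimal 1-D "perfect school" sketch: each particle moves at a preferred
# speed and adjusts it according to the distance to its front neighbour,
# so that spacing relaxes toward a common rest distance d0.
n_fish, d0, v0 = 10, 1.0, 1.0
k_spring, dt, n_steps = 2.0, 0.02, 4000

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0.0, 15.0, n_fish))        # initial positions
for _ in range(n_steps):
    v = np.full(n_fish, v0)                        # leader keeps cruise speed
    gaps = x[1:] - x[:-1]                          # distance to front neighbour
    v[:-1] += k_spring * (gaps - d0)               # speed up if gap too large,
    x += v * dt                                    # slow down if too small
print(np.round(np.diff(x), 3))                     # spacings converge to ~d0
```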
Najafi-Yazdi, A.; Mongeau, L.
2012-01-01
The Lattice Boltzmann Method (LBM) is a well established computational tool for fluid flow simulations. This method has been recently utilized for low Mach number computational aeroacoustics. Robust and nonreflective boundary conditions, similar to those used in Navier-Stokes solvers, are needed for LBM-based aeroacoustics simulations. The goal of the present study was to develop an absorbing boundary condition based on the perfectly matched layer (PML) concept for LBM. The derivation of formulations for both two and three dimensional problems are presented. The macroscopic behavior of the new formulation is discussed. The new formulation was tested using benchmark acoustic problems. The perfectly matched layer concept appears to be very well suited for LBM, and yielded very low acoustic reflection factor. PMID:23526050
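For intuition only, the sketch below shows a generic graded absorbing ("sponge") layer damping a one-dimensional advected pulse toward the quiescent state; this is not the perfectly matched layer formulation for LBM developed in the paper, and all parameters are illustrative.

```python
import numpy as np

# Generic absorbing-layer illustration (not the authors' LBM-PML scheme):
# a 1-D advected pulse is damped toward zero inside a "sponge" region whose
# absorption strength sigma(x) ramps up smoothly toward the outflow edge.
nx, c, dt, dx = 400, 1.0, 0.5, 1.0
x = np.arange(nx) * dx
layer = x > 0.8 * x[-1]                               # last 20% is absorbing
sigma = np.zeros(nx)
sigma[layer] = 0.2 * ((x[layer] - 0.8 * x[-1]) / (0.2 * x[-1])) ** 2

u = np.exp(-0.01 * (x - 100.0) ** 2)                  # initial acoustic pulse
for _ in range(600):
    u[1:] = u[1:] - c * dt / dx * (u[1:] - u[:-1])    # upwind advection step
    u *= 1.0 / (1.0 + sigma * dt)                     # graded absorption
print(float(np.max(np.abs(u))))                       # pulse absorbed, small residual
```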
Random Matrix Theory Approach to Chaotic Coherent Perfect Absorbers
NASA Astrophysics Data System (ADS)
Li, Huanan; Suwunnarat, Suwun; Fleischmann, Ragnar; Schanz, Holger; Kottos, Tsampikos
2017-01-01
We employ random matrix theory in order to investigate coherent perfect absorption (CPA) in lossy systems with complex internal dynamics. The loss strength γCPA and energy ECPA, for which a CPA occurs, are expressed in terms of the eigenmodes of the isolated cavity—thus carrying over the information about the chaotic nature of the target—and their coupling to a finite number of scattering channels. Our results are tested against numerical calculations using complex networks of resonators and chaotic graphs as CPA cavities.
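A rough numerical experiment in this spirit can be set up with an effective-Hamiltonian (Heidelberg-type) scattering matrix: draw a random symmetric Hamiltonian, couple it to a few channels, add a localized loss, and search for the (E, γ) pair at which the smallest singular value of S approaches zero. All matrix sizes, couplings and scan ranges below are arbitrary assumptions, and the sketch does not reproduce the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(7)
N, M = 60, 2                                   # cavity modes, scattering channels
H = rng.normal(size=(N, N))
H = (H + H.T) / np.sqrt(8 * N)                 # GOE-like, spectrum roughly in [-1, 1]
W = rng.normal(size=(N, M)) * 0.05             # channel coupling vectors
loss_site = np.zeros(N); loss_site[0] = 1.0    # localized absorber

def s_matrix(E, gamma):
    """Effective-Hamiltonian scattering matrix with a localized loss gamma."""
    H_eff = H - 1j * np.pi * W @ W.T - 1j * gamma * np.diag(loss_site)
    G = np.linalg.inv(E * np.eye(N) - H_eff)
    return np.eye(M) - 2j * np.pi * W.T @ G @ W

# CPA search: the (E, gamma) pair where the smallest singular value of S
# vanishes admits a coherently perfectly absorbed incoming wavefront.
Es = np.linspace(-0.5, 0.5, 81)
gs = np.linspace(0.0, 0.3, 61)
smin = np.array([[np.linalg.svd(s_matrix(E, g), compute_uv=False)[-1]
                  for g in gs] for E in Es])
i, j = np.unravel_index(np.argmin(smin), smin.shape)
print(f"closest-to-CPA point: E={Es[i]:+.3f}, gamma={gs[j]:.3f}, min sv={smin[i, j]:.3g}")
```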
Explaining evolution via constrained persistent perfect phylogeny
2014-01-01
Background The perfect phylogeny is an often used model in phylogenetics since it provides an efficient basic procedure for representing the evolution of genomic binary characters in several frameworks, such as for example in haplotype inference. The model, which is conceptually the simplest, is based on the infinite sites assumption, that is no character can mutate more than once in the whole tree. A main open problem regarding the model is finding generalizations that retain the computational tractability of the original model but are more flexible in modeling biological data when the infinite site assumption is violated because of e.g. back mutations. A special case of back mutations that has been considered in the study of the evolution of protein domains (where a domain is acquired and then lost) is persistency, that is the fact that a character is allowed to return back to the ancestral state. In this model characters can be gained and lost at most once. In this paper we consider the computational problem of explaining binary data by the Persistent Perfect Phylogeny model (referred as PPP) and for this purpose we investigate the problem of reconstructing an evolution where some constraints are imposed on the paths of the tree. Results We define a natural generalization of the PPP problem obtained by requiring that for some pairs (character, species), neither the species nor any of its ancestors can have the character. In other words, some characters cannot be persistent for some species. This new problem is called Constrained PPP (CPPP). Based on a graph formulation of the CPPP problem, we are able to provide a polynomial time solution for the CPPP problem for matrices whose conflict graph has no edges. Using this result, we develop a parameterized algorithm for solving the CPPP problem where the parameter is the number of characters. Conclusions A preliminary experimental analysis shows that the constrained persistent perfect phylogeny model allows to explain efficiently data that do not conform with the classical perfect phylogeny model. PMID:25572381
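The unconstrained starting point, deciding whether a binary matrix admits a classical perfect phylogeny at all, reduces to a pairwise column test. The Python sketch below implements that test under the usual all-zero ancestral-state assumption; it does not implement the persistent or constrained (CPPP) model of the paper.

```python
from itertools import combinations

def admits_perfect_phylogeny(matrix):
    """Binary character matrix (rows = species, columns = characters) admits a
    perfect phylogeny, assuming the all-zero ancestral state, iff no pair of
    columns contains all of the 'gametes' (0,1), (1,0) and (1,1)."""
    cols = list(zip(*matrix))
    for a, b in combinations(cols, 2):
        pairs = set(zip(a, b))
        if {(0, 1), (1, 0), (1, 1)} <= pairs:
            return False          # conflicting columns: some character mutates twice
    return True

compatible = [[1, 0, 0],
              [1, 1, 0],
              [0, 0, 1]]
conflicting = [[1, 0],
               [0, 1],
               [1, 1]]
print(admits_perfect_phylogeny(compatible), admits_perfect_phylogeny(conflicting))
```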
Roelandt, S; Van der Stede, Y; Czaplicki, G; Van Loo, H; Van Driessche, E; Dewulf, J; Hooyberghs, J; Faes, C
2015-06-06
Currently, there are no perfect reference tests for the in vivo detection of Neospora caninum infection. Two commercial N. caninum ELISA tests are currently used in Belgium for bovine sera (TEST A and TEST B). The goal of this study is to evaluate these tests used at their current cut-offs, with a no gold standard approach, for the test purpose of (1) demonstration of freedom of infection at purchase and (2) diagnosis in aborting cattle. Sera of two study populations, Abortion population (n=196) and Purchase population (n=514), were selected and tested with both ELISAs. Test results were entered in a Bayesian model with informative priors on population prevalences only (Scenario 1). As a sensitivity analysis, two more models were used: one with informative priors on test diagnostic accuracy (Scenario 2) and one with all priors uninformative (Scenario 3). The accuracy parameters were estimated from the first model: diagnostic sensitivity (Test A: 93.54 per cent; Test B: 86.99 per cent) and specificity (Test A: 90.22 per cent; Test B: 90.15 per cent) were high and comparable (Bayesian P values >0.05). Based on predictive values in the two study populations, both tests were fit for purpose, despite an expected false negative fraction of ±0.5 per cent in the Purchase population and ±5 per cent in the Abortion population. In addition, a false positive fraction of ±3 per cent in the overall Purchase population and ±4 per cent in the overall Abortion population was found. British Veterinary Association.
NASA Astrophysics Data System (ADS)
Cersullo, Federica; Wildi, François; Chazelas, Bruno; Pepe, Francesco
2017-05-01
Context. The field of exoplanet research is moving towards the detection and characterization of habitable planets. These exo-Earths can be easily found around low-mass stars by using either photometric transit or radial-velocity (RV) techniques. In the latter case the gain is twofold, because the signal induced by a planet of a given mass is higher due to the more favourable planet-star mass ratio and because the habitable zone lies closer to the star. However, late-type stars emit mainly in the infrared (IR) wavelength range, which calls for IR instruments. Aims: SPIRou is a stable RV IR spectrograph addressing these ambitious scientific objectives. As with any other spectrograph, calibration and drift monitoring are fundamental to achieving high precision. However, the IR domain suffers from a lack of suitable reference spectral sources. Our goal was to build, test and finally operate a Fabry-Pérot-based RV-reference module able to provide the needed spectral information over the full wavelength range of SPIRou. Methods: We adapted the existing HARPS Fabry-Pérot calibrator for operation in the IR domain. After manufacturing and assembly, we characterized the FP RV-module in the laboratory before delivering it to the SPIRou integration site. In particular, we measured the finesse, transmittance, and spectral flux of the system. Results: The measured finesse value of F = 12.8 corresponds perfectly to the theoretical value. The total transmittance at peak is of the order of 0.5%, mainly limited by fibre connectors and interfaces. Nevertheless, the provided flux is in line with the requirements set by the SPIRou instrument. Although we could not directly test the long-term stability of the system, we estimated it by comparing the SPIRou Fabry-Pérot with the already operating HARPS system and demonstrated a stability of better than 1 m s-1 during a night. Conclusions: Once installed on SPIRou, we will test the full spectral characteristics and stability of the RV-reference module. The goal will be to prove that the line position and shape stability of all lines is better than 0.3 m s-1 between two calibration sequences (typically 24 h), such that the RV-reference module can be used to monitor instrumental drifts. In principle, the system is also intrinsically stable over longer time scales such that it can also be used for calibration purposes.
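The quoted finesse can be related to mirror reflectivity through the ideal-etalon formulas, as in the Python sketch below; the cavity length, wavelength range and reflectivities are illustrative guesses, not the SPIRou module's actual design values.

```python
import numpy as np

def airy_transmission(wavelength, cavity_length, reflectivity, n_index=1.0):
    """Transmission of an ideal lossless Fabry-Perot etalon (Airy function)."""
    delta = 4.0 * np.pi * n_index * cavity_length / wavelength   # round-trip phase
    F_coef = 4.0 * reflectivity / (1.0 - reflectivity) ** 2      # coefficient of finesse
    return 1.0 / (1.0 + F_coef * np.sin(delta / 2.0) ** 2)

def reflective_finesse(reflectivity):
    """Finesse of an ideal etalon from mirror reflectivity alone."""
    return np.pi * np.sqrt(reflectivity) / (1.0 - reflectivity)

# A finesse near 13 (comparable to the value reported above) corresponds to
# a mirror reflectivity of roughly 0.78-0.79 for an ideal, defect-free etalon.
for R in (0.70, 0.78, 0.79, 0.90):
    print(R, round(reflective_finesse(R), 1))
wl = np.linspace(1.55e-6, 1.5502e-6, 5)          # a few wavelengths near 1550 nm
print(np.round(airy_transmission(wl, 5e-3, 0.78), 3))
```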
Schot, Marjolein J C; van Delft, Sanne; Kooijman-Buiting, Antoinette M J; de Wit, Niek J; Hopstaken, Rogier M
2015-05-18
Various point-of-care testing (POCT) urine analysers are commercially available for routine urine analysis in general practice. The present study compares analytical performance, agreement and user-friendliness of six different POCT urine analysers for diagnosing urinary tract infection in general practice. All testing procedures were performed at a diagnostic centre for primary care in the Netherlands. Urine samples were collected at four general practices. Analytical performance and agreement of the POCT analysers regarding nitrite, leucocytes and erythrocytes, with the laboratory reference standard, was the primary outcome measure, and analysed by calculating sensitivity, specificity, positive and negative predictive value, and Cohen's κ coefficient for agreement. Secondary outcome measures were the user-friendliness of the POCT analysers, in addition to other characteristics of the analysers. The following six POCT analysers were evaluated: Uryxxon Relax (Macherey Nagel), Urisys 1100 (Roche), Clinitek Status (Siemens), Aution 11 (Menarini), Aution Micro (Menarini) and Urilyzer (Analyticon). Analytical performance was good for all analysers. Compared with laboratory reference standards, overall agreement was good, but differed per parameter and per analyser. Concerning the nitrite test, the most important test for clinical practice, all but one showed perfect agreement with the laboratory standard. For leucocytes and erythrocytes specificity was high, but sensitivity was considerably lower. Agreement for leucocytes varied between good to very good, and for the erythrocyte test between fair and good. First-time users indicated that the analysers were easy to use. They expected higher productivity and accuracy when using these analysers in daily practice. The overall performance and user-friendliness of all six commercially available POCT urine analysers was sufficient to justify routine use in suspected urinary tract infections in general practice. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Launch of Little Joe I-B from Wallops Island
1960-01-21
B60-00364 (4 Nov. 1959) --- Launch of Little Joe-2 from Wallops Island carrying Mercury spacecraft test article. The suborbital test flight of the Mercury capsule was to test the escape system. Vehicle functioned perfectly, but escape rocket ignited several seconds too late. Photo credit: NASA
Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de
2012-10-01
Medical images were standardized in 1993 through the DICOM (Digital Imaging and Communications in Medicine) standard. Many examinations use this standard, and it is increasingly necessary to design software applications capable of handling this type of image; however, these software applications are not usually free and open-source, which hinders their adaptation to diverse needs. To develop and validate a free and open-source software application capable of handling DICOM coronary computed tomography angiography images. We developed and tested the ImageLab software in the evaluation of 100 tests randomly selected from a database. We carried out 600 tests divided between two observers using ImageLab and another software application supplied with Philips Brilliance computed tomography scanners in the evaluation of coronary lesions and plaques around the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver and intersoftware agreements, we used simple agreement and kappa statistics. The agreements observed between software applications were generally classified as substantial or almost perfect in most comparisons. The ImageLab software agreed with the Philips software in the evaluation of coronary computed tomography angiography tests, especially in patients without lesions, with lesions < 50% in the LMCA and < 70% in the ADA. The agreement for lesions > 70% in the ADA was lower, but this is also observed when the anatomical reference standard is used.
Factory overload testing of a large power transformer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Douglas, D.H.; Lawrence, C.O.; Templeton, J.B.
1985-09-01
A factory overload test of up to 150% of the nameplate rating was run on a 224 MVA autotransformer. The results of this test were of great value and were used in identifying transformer overload limitations, in evaluating loading guide oil and winding equations, exponents and time constants, and in helping to perfect a factory overload test procedure.
Solomon, Nadia; Fields, Paul J.; Tamarozzi, Francesca; Brunetti, Enrico; Macpherson, Calum N. L.
2017-01-01
Cystic echinococcosis (CE), a parasitic zoonosis, results in cyst formation in the viscera. Cyst morphology depends on developmental stage. In 2003, the World Health Organization (WHO) published a standardized ultrasound (US) classification for CE, for use among experts as a standard of comparison. This study examined the reliability of this classification. Eleven international CE and US experts completed an assessment of eight WHO classification images and 88 test images representing cyst stages. Inter- and intraobserver reliability and observer performance were assessed using Fleiss' and Cohen's kappa. Interobserver reliability was moderate for WHO images (κ = 0.600, P < 0.0001) and substantial for test images (κ = 0.644, P < 0.0001), with substantial to almost perfect interobserver reliability for stages with pathognomonic signs (CE1, CE2, and CE3) for WHO (0.618 < κ < 0.904) and test images (0.642 < κ < 0.768). Comparisons of expert performances against the majority classification for each image were significant for WHO (0.413 < κ < 1.000, P < 0.005) and test images (0.718 < κ < 0.905, P < 0.0001); and intraobserver reliability was significant for WHO (0.520 < κ < 1.000, P < 0.005) and test images (0.690 < κ < 0.896, P < 0.0001). Findings demonstrate moderate to substantial interobserver and substantial to almost perfect intraobserver reliability for the WHO classification, with substantial to almost perfect interobserver reliability for pathognomonic stages. This confirms experts' abilities to reliably identify WHO-defined pathognomonic signs of CE, demonstrating that the WHO classification provides a reproducible way of staging CE. PMID:28070008
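As a minimal illustration of the multi-rater statistic used above, the Python sketch below computes Fleiss' kappa with statsmodels on a synthetic raters-by-images matrix; the array shape mirrors the study design (88 test images, 11 raters), but the ratings are random placeholders, not the study's data.

import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Placeholder ratings: 88 images x 11 raters, each cell a cyst-stage label (0-5).
rng = np.random.default_rng(0)
ratings = rng.integers(0, 6, size=(88, 11))
table, _ = aggregate_raters(ratings)                 # counts per image x category
print(f"Fleiss' kappa = {fleiss_kappa(table):.3f}")  # near 0 for random placeholder data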
[Use of the M.H.Q. test (Middlesex Hospital Questionnaire) in the diagnosis of neuroses].
Cornia, C; Lorenzini, R
1977-05-19
A psychological interview and the MHQ, Koch, Rorschach, TAT, Machover and family drawing psychological tests were administered to pneumopathic patients. The results obtained with the MHQ were compared with those of the other tests with respect to the diagnosis of psychoneurosis. A perfect fit was observed.
Standardized Tests: Purpose Is the Point
ERIC Educational Resources Information Center
Popham, W. James
2016-01-01
"U.S. students are being educated less well these days than they should be," writes W. James Popham. One key contributing factor is that educators often use the wrong tests to make their most important educational decisions. Two recent events have made it a perfect time to change the way we conduct our educational testing: growing…
A finite-difference time-domain electromagnetic solver in a generalized coordinate system
NASA Astrophysics Data System (ADS)
Hochberg, Timothy Allen
A new, finite-difference, time-domain method for the simulation of full-wave electromagnetic wave propagation in complex structures is developed. This method is simple and flexible; it allows for the simulation of transient wave propagation in a large class of practical structures. Boundary conditions are implemented for perfect and imperfect electrically conducting boundaries, perfect magnetically conducting boundaries, and absorbing boundaries. The method is validated with the aid of several different types of test cases. Two types of coaxial cables with helical breaks are simulated and the results are discussed.
Using potential performance theory to test five hypotheses about meta-attribution.
Trafimow, David; Hunt, Gayle; Rice, Stephen; Geels, Kasha
2011-01-01
Based on I. Kant's (1991) distinction between perfect and imperfect duties and the attribution literature pertaining to that distinction, the authors proposed and tested 5 hypotheses about meta-attribution. More specifically, violations of perfect duties have been shown to arouse both more negative affect and stronger correspondent inferences than do violations of imperfect duties (e.g., D. Trafimow, I. K. Bromgard, K. A. Finlay, & T. Ketelaar, 2005). But when it comes to making meta-attributions-that is, guessing the attributions others would make-is the affect differential an advantage or a disadvantage? In addition to the null hypothesis of no effect, the authors proposed and tested additional hypotheses about how negative affect might increase or decrease the effectiveness of people's meta-attribution strategies and how even if there is no effect on strategy effectiveness, negative affect could increase or decrease the consistencies with which these strategies could be used.
A Novel Three-Dimensional Vector Analysis of Axial Globe Position in Thyroid Eye Disease
Guo, Jie; Yuan, Yifei; Zhang, Rui; Huang, Wenhu
2017-01-01
Purpose. To define a three-dimensional (3D) vector method to describe the axial globe position in thyroid eye disease (TED). Methods. CT data from 59 patients with TED were collected and 3D images were reconstructed. A reference coordinate system was established, and the coordinates of the corneal apex and the eyeball center were calculated to obtain the globe vector EC. The measurement reliability was evaluated. The parameters of EC were analyzed and compared with the results of two-dimensional (2D) CT measurement, Hertel exophthalmometry, and strabismus tests. Results. The reliability of the EC measurement was excellent. The difference between EC and 2D CT measurement was significant (p = 0.003), and EC was more consistent with Hertel exophthalmometry than with 2D CT measurement (p < 0.001). There was no significant difference between EC and the Hirschberg test, and a strong correlation was found between EC and the synoptophore test. When one eye had a larger deviation angle than its fellow, its corneal apex shifted in the corresponding direction, but the shift of the eyeball center was not significant. The parameters of EC were almost perfectly consistent with the geometrical equation. Conclusions. The establishment of a 3D globe vector is feasible and reliable, and it could provide more information on the axial globe position. PMID:28491471
A time reference distribution concept for a time division communication network
NASA Technical Reports Server (NTRS)
Stover, H. A.
1973-01-01
Starting with an assumed ideal network having perfect clocks at every node and known fixed transmission delays between nodes, the effects of adding tolerances to both transmission delays and nodal clocks are described. The advantages of controlling tolerances on time rather than frequency are discussed. Then a concept is presented for maintaining these tolerances on time throughout the network. This concept, called time reference distribution, is a systematic technique for distributing a time reference to all nodes of the network. It is reliable, survivable and possesses many other desirable characteristics. Some of its features, such as an excellent self-monitoring capability, are pointed out. Some preliminary estimates of the accuracy that might be expected are developed, and there is a brief discussion of the impact upon communication system costs. Time reference distribution is a concept that appears very attractive. It has not had experimental evaluation and has not yet been endorsed for use in any communication network.
Willis, Rohan; Pierangeli, Silvia S; Jaskowski, Troy D; Malmberg, Elisabeth; Guerra, Marta; Salmon, Jane E; Petri, Michelle; Branch, D Ware; Tebo, Anne E
2016-06-01
To investigate the performance characteristics and impact of newly developed reference calibrators on the commutability between anti-β2 glycoprotein I (anti-β2 GPI) immunoassays in antiphospholipid syndrome (APS) and/or systemic lupus erythematosus (SLE). Immunoglobulin G (IgG) and immunoglobulin M (IgM) anti-β2 GPI immunoassays from four manufacturers were evaluated. Serum samples from 269 patients (APS only, n = 31; SLE and APS, n = 83; SLE only, n = 129; pregnancy-related clinical manifestations without APS, n = 26) and 162 women with histories of successful pregnancies were tested. Results were expressed in kit-specific arbitrary units and in the calibrator reference units (RUs) based on 99th percentile cutoff values. Diagnostic accuracies, correlation between kits, and specific clinical manifestations in APS were investigated. The sensitivities of the assays ranged from 15.8% to 27.2% (IgG) and 12.3% to 15.8% (IgM) while specificities ranged from 79.4% to 86.5% (IgG) and 80.6% to 84.5% (IgM). There was moderate to almost perfect interassay reliability (Cohen κ, 0.69-0.98), and Spearman correlation coefficients were generally improved when results of the IgG determinations were expressed in RUs. Although qualitative agreements between immunoassays for both antibody isotypes are acceptable, correlations with APS clinical manifestations were kit dependent. Only the use of IgG reference material improved quantitative correlations between assays. © American Society for Clinical Pathology, 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Morales, Marco U; Saker, Saker; Wilde, Craig; Pellizzari, Carlo; Pallikaris, Aristophanes; Notaroberto, Neil; Rubinstein, Martin; Rui, Chiara; Limoli, Paolo; Smolek, Michael K; Amoaku, Winfried M
2016-11-01
The purpose of this study was to establish a normal reference database for fixation stability measured with the bivariate contour ellipse area (BCEA) in the Macular Integrity Assessment (MAIA) microperimeter. Subjects were 358 healthy volunteers who had the MAIA examination. Fixation stability was assessed using two BCEA fixation indices (63% and 95% proportional values) and the percentage of fixation points within 1° and 2° from the fovea (P1 and P2). Statistical analysis was performed with linear regression and Pearson's product-moment correlation coefficient. Average areas of 0.80 deg² (min = 0.03, max = 3.90, SD = 0.68) for the index BCEA@63% and 2.40 deg² (min = 0.20, max = 11.70, SD = 2.04) for the index BCEA@95% were found. The average values of P1 and P2 were 95% (min = 76, max = 100, SD = 5.31) and 99% (min = 91, max = 100, SD = 1.42), respectively. Pearson's product-moment test showed an almost perfect correlation index, r = 0.999, between BCEA@63% and BCEA@95%. Index P1 showed a very strong correlation with BCEA@63%, r = -0.924, as well as with BCEA@95%, r = -0.925. Index P2 demonstrated a slightly lower correlation with both BCEA@63% and BCEA@95%, r = -0.874 and -0.875, respectively. The single parameter of the BCEA@95% may be taken as accurately reporting fixation stability and serves as a reference database of normal subjects with a cutoff area of 2.40 ± 2.04 deg² in the MAIA microperimeter. Fixation stability can be measured with different indices. This study provides reference fixation values for the MAIA using a single fixation index.
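To make the BCEA indices concrete, the Python sketch below computes the bivariate contour ellipse area from a fixation trace using the commonly cited formula BCEA = 2kπ·σx·σy·sqrt(1 − ρ²) with k = −ln(1 − P); the fixation data here are simulated placeholders, not MAIA recordings.

import numpy as np

def bcea(x_deg, y_deg, proportion=0.95):
    # Bivariate contour ellipse area (deg^2) enclosing the given proportion of fixations,
    # assuming a bivariate normal distribution of fixation positions.
    sx, sy = np.std(x_deg, ddof=1), np.std(y_deg, ddof=1)
    rho = np.corrcoef(x_deg, y_deg)[0, 1]
    k = -np.log(1.0 - proportion)      # k ~ 1.0 for 63%, ~3.0 for 95%
    return 2.0 * k * np.pi * sx * sy * np.sqrt(1.0 - rho**2)

# Simulated fixation trace (degrees), for illustration only:
rng = np.random.default_rng(1)
x = rng.normal(0.0, 0.25, 1000)
y = rng.normal(0.0, 0.20, 1000)
print(f"BCEA@63% = {bcea(x, y, 0.63):.2f} deg^2, BCEA@95% = {bcea(x, y, 0.95):.2f} deg^2")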
2010-03-01
strong, while the temperatures over Scandinavia and Europe (eastern Arctic) are warmer and winds are weaker than average (Serreze and Barry 2005) ... than the Fram Strait branch than previously thought. This could facilitate an increase in the frequency of storms reaching higher latitudes ... REFERENCES: Ackerman, J. T., 2008: Climate Change, National Security, and the Quadrennial Defense Review: Avoiding the Perfect Storm. Strategic Studies
USSR and Eastern Europe Scientific Abstracts, Physics and Mathematics, Number 34
1977-04-27
[Russian abstract provided by the source] [Text] The relationship of duration and intensity of ultrashort pulses in a mode-locked ruby laser with Q... Excess charge carriers have been found to appear in pure Ge and Si crystals irradiated with short pulses from a CO2 laser. The high purity and perfection... Illustrations 2; References 15: 8 Russian, 7 Western. USSR UDC 621.378.325 CONTROL OF DURATION OF ULTRASHORT PULSES IN MODE-LOCKED LASERS, ZHURNAL
Does Praxis Make Perfect? A Personal Journey through the Praxis II: World Language Test
ERIC Educational Resources Information Center
Moser, Kelly
2012-01-01
For initial certification in French, German, and Spanish, teacher candidates in most states are required to pass one of the Praxis II subject matter tests. As of October 2010, a new test was added to the "Praxis Series." This Praxis II: World Language Test represents a significant change from previous versions and relies heavily upon the…
2012-01-01
Background Staphylococcus aureus is one of the most common causes of intramammary infections in dairy cows at dry off. Reliable identification is important for disease management on herd level and for antimicrobial treatment of infected animals. Our objective was to evaluate the test characteristics of PathoProof ™ Mastitis PCR Assay and bacteriological culture (BC) in diagnosing bovine intramammary infections caused by S. aureus at dry off at different PCR cycle threshold (Ct)-value cut-offs. Methods Sterile quarter samples and non-sterile composite samples from 140 animals in seven herds were collected in connection with the dairy herd improvement (DHI) milk recording. All quarter samples were analyzed using BC whereas all composite samples were analyzed with PathoProof ™ Mastitis PCR Assay. Latent class analysis was used to estimate test properties for PCR and BC in the absence of a perfect reference test. The population was divided into two geographically divided subpopulations and the Hui-Walter 2-test 2-populations model applied to estimate Se, Sp for the two tests, and prevalence for the two subpopulations. Results The Se for PCR increased with increasing Ct-value cut-off, accompanied by a small decrease in Sp. For BC the Se decreased and Sp increased with increasing Ct-value cut-off. Most optimal test estimates for the real-time PCR assay were at a Ct-value cut-off of 37; 0.93 [95% posterior probability interval (PPI) 0.60-0.99] for Se and 0.95 [95% PPI 0.95-0.99] for Sp. At the same Ct-value cut-off, Se and Sp for BC were 0.83 [95% PPI 0.66-0.99] and 0.97 [95% PPI 0.91-0.99] respectively. Depending on the chosen PCR Ct-value cut-off, the prevalence in the subpopulations varied; the prevalence increased with increasing PCR Ct-value cut-offs. Conclusion Neither BC nor real-time PCR is a perfect test in detecting IMI in dairy cows at dry off. The changes in sensitivity and prevalence at different Ct-value cut-offs for both PCR and BC may indicate a change in the underlying disease definition. At low PCR Ct-value cut-offs the underlying disease definition may be a truly/heavily infected cow, whereas at higher PCR Ct-value cut-offs the disease definition may be a S. aureus positive cow. PMID:23164432
Time and Space Partitioning the EagleEye Reference Mission
NASA Astrophysics Data System (ADS)
Bos, Victor; Mendham, Peter; Kauppinen, Panu; Holsti, Niklas; Crespo, Alfons; Masmano, Miguel; de la Puente, Juan A.; Zamorano, Juan
2013-08-01
We discuss experiences gained by porting a Software Validation Facility (SVF) and a satellite Central Software (CSW) to a platform with support for Time and Space Partitioning (TSP). The SVF and CSW are part of the EagleEye Reference mission of the European Space Agency (ESA). As a reference mission, EagleEye is a perfect candidate to evaluate practical aspects of developing satellite CSW for and on TSP platforms. The specific TSP platform we used consists of a simulated LEON3 CPU controlled by the XtratuM separation micro-kernel. On top of this, we run five separate partitions. Each partition runs its own real-time operating system or Ada run-time kernel, which in turn are running the application software of the CSW. We describe issues related to partitioning; inter-partition communication; scheduling; I/O; and fault-detection, isolation, and recovery (FDIR).
Karas, Vlad O; Sinnott-Armstrong, Nicholas A; Varghese, Vici; Shafer, Robert W; Greenleaf, William J; Sherlock, Gavin
2018-01-01
Abstract Much of the within species genetic variation is in the form of single nucleotide polymorphisms (SNPs), typically detected by whole genome sequencing (WGS) or microarray-based technologies. However, WGS produces mostly uninformative reads that perfectly match the reference, while microarrays require genome-specific reagents. We have developed Diff-seq, a sequencing-based mismatch detection assay for SNP discovery without the requirement for specialized nucleic-acid reagents. Diff-seq leverages the Surveyor endonuclease to cleave mismatched DNA molecules that are generated after cross-annealing of a complex pool of DNA fragments. Sequencing libraries enriched for Surveyor-cleaved molecules result in increased coverage at the variant sites. Diff-seq detected all mismatches present in an initial test substrate, with specific enrichment dependent on the identity and context of the variation. Application to viral sequences resulted in increased observation of variant alleles in a biologically relevant context. Diff-Seq has the potential to increase the sensitivity and efficiency of high-throughput sequencing in the detection of variation. PMID:29361139
The Gravity Probe B Experiment
NASA Technical Reports Server (NTRS)
Kolodziejczak, Jeffrey
2008-01-01
This presentation briefly describes the Gravity Probe B (GP-B) Experiment, which is designed to test parts of Einstein's general theory of relativity by monitoring gyroscope orientation relative to a distant guide star. To measure the minuscule angles predicted by Einstein's theory, it was necessary to build near-perfect gyroscopes that were approximately 50 million times more precise than the best navigational gyroscopes. A telescope mounted along the central axis of the dewar and spacecraft provided the experiment's pointing reference to a guide star. The telescope's image divider precisely split the star's beam into x-axis and y-axis components whose brightness could be compared. GP-B's 650-gallon dewar kept the science instrument inside the probe at a cryogenic temperature for 17.3 months and also provided the thruster propellant for precision attitude and translation control. Built around the dewar, the GP-B spacecraft was a fully integrated system, comprising both the space vehicle and payload, dedicated as a single entity to experimentally testing predictions of Einstein's theory.
Editorial: Let's talk about sex - the gender binary revisited.
Oldehinkel, Albertine J
2017-08-01
Sex refers to biological differences and gender to socioculturally delineated masculine and feminine roles. Sex or gender are included as a covariate or effect modifier in the majority of child psychology and psychiatry studies, and differences found between boys and girls have inspired many researchers to postulate underlying mechanisms. Empirical tests of whether including these proposed explanatory variables actually reduces the variance explained by gender are lagging behind somewhat. That is a pity, because a lot can be gained from a greater focus on the active agents of specific gender differences. As opposed to biological sex as such, some of the processes explaining why a specific outcome shows gender differences may be changeable and thus possible prevention targets. Moreover, while the sex binary may be reasonably adequate as a classification variable, the gender binary is far from perfect. Gender is a multidimensional, partly context-dependent factor, and the dichotomy generally used in research does not do justice to the diversity existing within boys and girls. © 2017 Association for Child and Adolescent Mental Health.
Chemical treatment of wastewater from flue gas desulphurisation
NASA Astrophysics Data System (ADS)
Pasiecznik, Iwona; Szczepaniak, Włodzimierz
2017-11-01
The article presents results of laboratory tests of removing boron and arsenic from non-ideal solutions using double-layered magnesium/aluminium hydroxides (Mg/Al Double-Layered Hydroxide - DLH) produced with the nitrate-chloride method. In the research, wastewater from an installation for flue gas desulfurization was examined. Double-layered hydroxides are perfect adsorbents for anionic compounds. The research proved the high effectiveness of the preparation with respect to arsenic, and confirmed the effect of the presence of sulfate and arsenate ions on the effectiveness of boron removal. On the basis of research on adsorption kinetics, a theoretical dose of the DLH/NO3-Cl/M preparation was calculated and compared with a dose that ensures elimination of boron below the limit standardized by the national regulations. Application of double-layered magnesium/aluminium hydroxides for boron elimination from industrial wastewater requires significantly higher doses of the preparation than those calculated in model investigations. This is due to the priority removal of multivalent ions, such as sulfate, arsenate or phosphate ions, by DLH/NO3-Cl/M.
Matrix approach to the simultaneous detection of multiple potato pathogens by real-time PCR.
Nikitin, M M; Statsyuk, N V; Frantsuzov, P A; Dzhavakhiya, V G; Golikov, A G
2018-03-01
Create a method for highly sensitive, selective, rapid and easy-to-use detection and identification of economically significant potato pathogens, including viruses, bacteria and oomycetes, be it single pathogen, or a range of various pathogens occurring simultaneously. Test-systems for real-time PCR, operating in the unified amplification regime, have been developed for Phytophthora infestans, Pectobacterium atrosepticum, Dickeya dianthicola, Dickeya solani, Ralstonia solanacearum, Pectobacterium carotovorum, Clavibacter michiganensis subsp. sepedonicus, potato viruses Y (ordinary and necrotic forms as well as indiscriminative test system, detecting all forms), A, X, S, M, potato leaf roll virus, potato mop top virus and potato spindle tuber viroid. The test-systems (including polymerase and revertase) were immobilized and lyophilized in miniature microreactors (1·2 μl) on silicon DNA/RNA microarrays (micromatrices) to be used with a mobile AriaDNA ® amplifier. Preloaded 30-reaction micromatrices having shelf life of 3 and 6 months (for RNA- and DNA-based pathogens, respectively) at room temperature with no special conditions were successfully tested on both reference and field samples in comparison with traditional ELISA and microbiological methods, showing perfect performance and sensitivity (1 pg). The accurate, rapid and user-friendly diagnostic system in a micromatrix format may significantly contribute to pathogen screening and phytopathological studies. © 2018 The Authors. Journal of Applied Microbiology published by John Wiley & Sons Ltd on behalf of The Society for Applied Microbiology.
NASA Astrophysics Data System (ADS)
Casadei, D.
2014-10-01
The objective Bayesian treatment of a model representing two independent Poisson processes, labelled as ``signal'' and ``background'' and both contributing additively to the total number of counted events, is considered. It is shown that the reference prior for the parameter of interest (the signal intensity) can be well approximated by the widely (ab)used flat prior only when the expected background is very high. On the other hand, a very simple approximation (the limiting form of the reference prior for perfect prior background knowledge) can be safely used over a large portion of the background parameters space. The resulting approximate reference posterior is a Gamma density whose parameters are related to the observed counts. This limiting form is simpler than the result obtained with a flat prior, with the additional advantage of representing a much closer approximation to the reference posterior in all cases. Hence such limiting prior should be considered a better default or conventional prior than the uniform prior. On the computing side, it is shown that a 2-parameter fitting function is able to reproduce extremely well the reference prior for any background prior. Thus, it can be useful in applications requiring the evaluation of the reference prior for a very large number of times.
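As a sketch of the limiting-form posterior described above, the Python snippet below evaluates, on a grid, a posterior proportional to (s + b)^(n − 1/2) exp(−(s + b)) for the signal s with known background b, which is the Gamma-type form obtained when the reference prior reduces to 1/sqrt(s + b); the observed count and background value are hypothetical, and the normalization is done numerically rather than via the closed-form expression in the paper.

import numpy as np

def approx_reference_posterior(n_obs, b, s_grid):
    # Limiting-form posterior for the signal s, assuming a prior ~ 1/sqrt(s + b)
    # (perfect background knowledge) and a Poisson likelihood for n_obs counts.
    log_post = (n_obs - 0.5) * np.log(s_grid + b) - (s_grid + b)
    post = np.exp(log_post - log_post.max())   # subtract max to avoid overflow
    return post / np.trapz(post, s_grid)       # normalise numerically on the grid

# Hypothetical example: 12 observed counts with an expected background of 4.2
s = np.linspace(0.0, 40.0, 4001)
p = approx_reference_posterior(12, 4.2, s)
print(f"posterior mean ~ {np.trapz(s * p, s):.2f}, mode ~ {s[np.argmax(p)]:.2f}")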
Method and simulation to study 3D crosstalk perception
NASA Astrophysics Data System (ADS)
Khaustova, Dar'ya; Blondé, Laurent; Huynh-Thu, Quan; Vienne, Cyril; Doyen, Didier
2012-03-01
To various degrees, all modern 3DTV displays suffer from crosstalk, which can lead to a decrease of both visual quality and visual comfort, and also affect perception of depth. In the absence of a perfect 3D display technology, crosstalk has to be taken into account when studying perception of 3D stereoscopic content. In order to improve 3D presentation systems and understand how to efficiently eliminate crosstalk, it is necessary to understand its impact on human perception. In this paper, we present a practical method to study the perception of crosstalk. The approach consists of four steps: (1) physical measurements of a 3DTV, (2) building of a crosstalk surface based on those measurements and representing specifically the behavior of that 3DTV, (3) manipulation of the crosstalk function and application on reference images to produce test images degraded by crosstalk in various ways, and (4) psychophysical tests. Our approach allows both a realistic representation of the behavior of a 3DTV and the easy manipulation of its resulting crosstalk in order to conduct psycho-visual experiments. Our approach can be used in all studies requiring the understanding of how crosstalk affects perception of stereoscopic content and how it can be corrected efficiently.
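To illustrate step (3) in the simplest possible way, the Python sketch below mixes a fraction of the unintended view into each eye's image using a single scalar leakage coefficient per eye; the measured, display-specific crosstalk surface described in the paper would replace these assumed constants.

import numpy as np

def apply_crosstalk(left, right, c_left=0.05, c_right=0.05):
    # Simple linear leakage model: each eye sees (1 - c) of its intended image
    # plus c of the unintended one. Inputs are float arrays scaled to [0, 1].
    left_seen = (1.0 - c_left) * left + c_left * right
    right_seen = (1.0 - c_right) * right + c_right * left
    return left_seen, right_seen

# Hypothetical stereo pair (grey ramps) degraded by 5% crosstalk:
L = np.tile(np.linspace(0.0, 1.0, 256), (64, 1))
R = np.roll(L, 8, axis=1)                      # small horizontal disparity
L_seen, R_seen = apply_crosstalk(L, R, 0.05, 0.05)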
Laser Opto-Electronic Correlator for Robotic Vision Automated Pattern Recognition
NASA Technical Reports Server (NTRS)
Marzwell, Neville
1995-01-01
A compact laser opto-electronic correlator for pattern recognition has been designed, fabricated, and tested. Specifically it is a translation sensitivity adjustable compact optical correlator (TSACOC) utilizing convergent laser beams for the holographic filter. Its properties and performance, including the location of the correlation peak and the effects of lateral and longitudinal displacements for both filters and input images, are systematically analyzed based on the nonparaxial approximation for the reference beam. The theoretical analyses have been verified in experiments. In applying the TSACOC to important practical problems including fingerprint identification, we have found that the tolerance of the system to the input lateral displacement can be conveniently increased by changing a geometric factor of the system. The system can be compactly packaged using the miniature laser diode sources and can be used in space by the National Aeronautics and Space Administration (NASA) and ground commercial applications which include robotic vision, and industrial inspection of automated quality control operations. The personnel of Standard International will work closely with the Jet Propulsion Laboratory (JPL) to transfer the technology to the commercial market. Prototype systems will be fabricated to test the market and perfect the product. Large production will follow after successful results are achieved.
Guðnadóttir, Unnur; Garðarsdóttir, Ragna B
2014-04-01
Exposure to media images of the 'body-perfect' ideal has been partly blamed for the pursuit of thinness among women and muscularity among men. Research has largely overlooked the materialistic messages frequently associated with these images. We present findings from two studies with Icelandic students aged 18-21, one focusing on young women (n = 303) and one on young men (n = 226), which test associations of materialistic and body-perfect ideals with body dissatisfaction and excessive body shaping behaviors. In both studies, the internalization of materialistic values is strongly linked to the internalization of body-perfect ideals: the thin-ideal for young women, and the muscular-ideal for young men. A materialist value orientation also predicted body dissatisfaction in both studies, and was linked to body shaping behaviors, albeit differently for young women and men. Thus, the research identifies materialism as a further correlate of both body dissatisfaction and excessive body-shaping behaviors. The findings support Dittmar's (2008) Consumer Culture Impact Model, which proposes that the body-perfect and 'material good life' ideals jointly impact well-being. © 2014 Scandinavian Psychological Associations and John Wiley & Sons Ltd.
Schellenberg, François; Humeau, Camille
2017-06-01
CDT is at present the most relevant routinely available biological marker of alcohol use and is widely used for screening and monitoring of patients. The lack of standardization leads to specific reference intervals for each procedure. The IFCC working group devoted to CDT demonstrated that standardization is possible using calibrators assigned to the reference measurement procedure. In this study, we compare the capillary electrophoresis (CE) techniques Capillarys® CDT and Minicap® CDT (Sebia, Lisses, France) to the reference procedure before and after standardization in 126 samples covering the range of CDT measurement. Both capillary electrophoresis procedures show a high correlation (r = 0.997) with the reference procedure and the concordance correlation coefficient evaluated according to McBride is "almost perfect" (>0.997 for both CE procedures). The number of results with a relative difference higher than the acceptable difference limit is only 1 for Capillarys® CDT and 5 for Minicap® CDT. These results demonstrate the efficiency of the standardization of CDT measurements for both CE techniques from Sebia, achieved using calibrators assigned to the reference measurement procedure.
Sabeh, Michael; Duceppe, Marc-Olivier; St-Arnaud, Marc; Mimee, Benjamin
2018-01-01
Relative gene expression analyses by qRT-PCR (quantitative reverse transcription PCR) require an internal control to normalize the expression data of genes of interest and eliminate the unwanted variation introduced by sample preparation. A perfect reference gene should have a constant expression level under all the experimental conditions. However, the same few housekeeping genes selected from the literature or successfully used in previous unrelated experiments are often routinely used in new conditions without proper validation of their stability across treatments. The advent of RNA-Seq and the availability of public datasets for numerous organisms are opening the way to finding better reference genes for expression studies. Globodera rostochiensis is a plant-parasitic nematode that is particularly yield-limiting for potato. The aim of our study was to identify a reliable set of reference genes to study G. rostochiensis gene expression. Gene expression levels from an RNA-Seq database were used to identify putative reference genes and were validated with qRT-PCR analysis. Three genes, GR, PMP-3, and aaRS, were found to be very stable within the experimental conditions of this study and are proposed as reference genes for future work.
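As a minimal sketch of how such reference genes are used downstream, the Python snippet below computes a 2^-ΔΔCt relative expression value normalized against the mean Ct of several reference genes (taken here to stand for GR, PMP-3 and aaRS), assuming roughly 100% PCR efficiency; all Ct values are hypothetical.

import numpy as np

def relative_expression(ct_target, ct_refs, ct_target_ctrl, ct_refs_ctrl):
    # 2^-ddCt fold change of a target gene, normalised against several reference genes.
    # Averaging Ct values corresponds to a geometric mean of the linear quantities,
    # since Ct is on a log2 scale.
    d_ct = ct_target - np.mean(ct_refs)
    d_ct_ctrl = ct_target_ctrl - np.mean(ct_refs_ctrl)
    return 2.0 ** -(d_ct - d_ct_ctrl)

# Hypothetical Ct values (treated sample vs. control), for illustration only:
fold = relative_expression(ct_target=24.1, ct_refs=[20.3, 21.0, 19.8],
                           ct_target_ctrl=26.0, ct_refs_ctrl=[20.5, 21.1, 19.9])
print(f"fold change ~ {fold:.2f}")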
The Linear Bicharacteristic Scheme for Computational Electromagnetics
NASA Technical Reports Server (NTRS)
Beggs, John H.; Chan, Siew-Loong
2000-01-01
The upwind leapfrog or Linear Bicharacteristic Scheme (LBS) has previously been implemented and demonstrated on electromagnetic wave propagation problems. This paper extends the Linear Bicharacteristic Scheme for computational electromagnetics to treat lossy dielectric and magnetic materials and perfect electrical conductors. This is accomplished by proper implementation of the LBS for homogeneous lossy dielectric and magnetic media; the treatment of perfect electrical conductors (PECs) is shown to follow directly in the limit of high conductivity. Heterogeneous media are treated through implementation of surface boundary conditions, and no special extrapolations or interpolations at dielectric material boundaries are required. Results are presented for one-dimensional model problems on both uniform and nonuniform grids, and the FDTD algorithm is chosen as a convenient reference algorithm for comparison. The results demonstrate that the explicit LBS is a dissipation-free, second-order accurate algorithm which uses a smaller stencil than the FDTD algorithm, yet it has approximately one-third the phase velocity error.
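For orientation, the Python sketch below implements the kind of reference scheme the LBS is compared against: a minimal one-dimensional FDTD (Yee) update in normalized vacuum units with PEC terminations at the magic time step c*dt = dx. It is not an implementation of the LBS itself, and the grid size, source position and pulse width are arbitrary choices.

import numpy as np

nx, nt = 400, 900
ez = np.zeros(nx)        # electric field samples (normalised units)
hy = np.zeros(nx - 1)    # magnetic field, staggered half a cell

for n in range(nt):
    hy += ez[1:] - ez[:-1]                            # update H from the curl of E
    ez[1:-1] += hy[1:] - hy[:-1]                      # update E from the curl of H
    ez[nx // 4] += np.exp(-((n - 40) / 12.0) ** 2)    # soft Gaussian source
    ez[0] = ez[-1] = 0.0                              # PEC boundaries (tangential E = 0)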
Pseudo-cat's eye for improved tilt-immune interferometry.
Speake, Clive C; Bradshaw, Miranda J
2015-08-20
We present a new simple optical design for a cat's eye retroreflector. We describe the design of the new optical configuration and its use in tilt-immune interferometry where it enables the tracking of the displacement of a plane target mirror with minimum sensitivity to its tilt about axes orthogonal to the interferometer's optical axis. In this application the new cat's eye does not behave as a perfect retroreflector and we refer to it as a "pseudo"-cat's eye (PCE). The device allows, for the first time, tilt-immune interferometric displacement measurements in cases where the nominal distance to the target mirror is significantly larger than the length of the cat's eye. We describe the general optical characteristics of the PCE and compare its performance in our application with that of a conventional cat's eye optical configuration using ABCD matrices and Zemax analyses. We further suggest a simple modification to the design that would enable the PCE to behave as a perfect cat's eye, and this design may provide an advantageous solution for other applications.
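As a sketch of the kind of ABCD-matrix reasoning mentioned above, the Python snippet below builds the unfolded round-trip ray-transfer matrix of an idealized thin-lens cat's eye (lens of focal length f with a flat mirror at its focal plane) and shows that the output ray angle is reversed regardless of the input; the focal length and stand-off distance are hypothetical, and this idealized cat's eye is not the PCE described in the paper.

import numpy as np

def prop(L):  return np.array([[1.0, L], [0.0, 1.0]])         # free-space propagation
def lens(f):  return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])  # thin lens

def cats_eye_round_trip(f, d):
    # Unfolded path: distance d -> lens -> f -> flat mirror (identity) -> f -> lens -> d.
    return prop(d) @ lens(f) @ prop(f) @ prop(f) @ lens(f) @ prop(d)

M = cats_eye_round_trip(f=25.0, d=25.0)   # hypothetical f = d = 25 mm
print(M)                                  # ~ -identity: each ray retraces itself inverted
ray_in = np.array([1.0, 0.02])            # [height (mm), angle (rad)]
print(M @ ray_in)                         # angle sign reversed -> retroreflection

The algebra gives a round-trip matrix [[-1, 2(f - d)], [0, -1]], so the returned angle is always the negative of the incoming one, which is the retroreflecting property a perfect cat's eye provides and which the PCE deliberately relaxes.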
A Two-Dimensional Linear Bicharacteristic Scheme for Electromagnetics
NASA Technical Reports Server (NTRS)
Beggs, John H.
2002-01-01
The upwind leapfrog or Linear Bicharacteristic Scheme (LBS) has previously been implemented and demonstrated on one-dimensional electromagnetic wave propagation problems. This memorandum extends the Linear Bicharacteristic Scheme for computational electromagnetics to model lossy dielectric and magnetic materials and perfect electrical conductors in two dimensions. This is accomplished by proper implementation of the LBS for homogeneous lossy dielectric and magnetic media and for perfect electrical conductors. Both the Transverse Electric and Transverse Magnetic polarizations are considered. Computational requirements and a Fourier analysis are also discussed. Heterogeneous media are modeled through implementation of surface boundary conditions and no special extrapolations or interpolations at dielectric material boundaries are required. Results are presented for two-dimensional model problems on uniform grids, and the Finite Difference Time Domain (FDTD) algorithm is chosen as a convenient reference algorithm for comparison. The results demonstrate that the two-dimensional explicit LBS is a dissipation-free, second-order accurate algorithm which uses a smaller stencil than the FDTD algorithm, yet it has less phase velocity error.
Fang, Fang; Collins-Emerson, Julie M; Heuer, Cord; Hill, Fraser I; Tisdall, David J; Wilson, Peter R; Benschop, Jackie
2014-11-01
A study was performed to investigate interlaboratory test agreement between a research and a commercial veterinary diagnostic laboratory on blood and urine samples, and to investigate test agreement between blood, urine, and kidney samples (research laboratory) for leptospirosis diagnosis. Samples were sourced from 399 sheep and 146 beef cattle from a local abattoir. Interlaboratory agreement for real-time quantitative polymerase chain reaction (qPCR) results on urine samples was almost perfect (kappa = 0.90), despite the use of different amplification targets (DNA gyrase subunit B gene vs. 16s ribosomal RNA gene), chemistries (SYTO9 vs. TaqMan probe), and pre-PCR processing. Interlaboratory agreement for microscopic agglutination test (MAT) positivity was almost perfect (kappa = 0.93) for Leptospira borgpetersenii serovar Hardjo subtype Hardjobovis (Hardjobovis) but moderate (kappa = 0.53) for Leptospira interrogans serovar Pomona (Pomona). Among animals that had different titers recorded, higher Hardjobovis and lower Pomona titers were reported by the commercial laboratory than by the research laboratory (P < 0.005). These interlaboratory comparisons can assist researchers and diagnosticians in interpreting the sometimes discrepant test results. Within the research laboratory, the comparison of qPCR results on urine and kidney showed almost perfect agreement (kappa = 0.84), suggesting that the qPCR on these 2 specimens can be used interchangeably. The agreement between MAT positivity and urine and kidney qPCR results was fair (kappa = 0.32 and kappa = 0.33, respectively). However, the prevalence ratio of urine and kidney qPCR positivity in Hardjobovis-seropositive versus Hardjobovis-seronegative sheep indicated that Hardjobovis seropositivity found in sheep may be able to predict shedding or renal carriage. © 2014 The Author(s).
Fleischhacker, Sheila E; Rodriguez, Daniel A; Evenson, Kelly R; Henley, Amanda; Gizlice, Ziya; Soto, Dolly; Ramachandran, Gowri
2012-11-22
Most studies on the local food environment have used secondary sources to describe the food environment, such as government food registries or commercial listings (e.g., Reference USA). Most of the studies exploring evidence for validity of secondary retail food data have used on-site verification and have not conducted analysis by data source (e.g., sensitivity of Reference USA) or by food outlet type (e.g., sensitivity of Reference USA for convenience stores). Few studies have explored the food environment in American Indian communities. To advance the science on measuring the food environment, we conducted direct, on-site observations of a wide range of food outlets in multiple American Indian communities, without a list guiding the field observations, and then compared our findings to several types of secondary data. Food outlets located within seven State Designated Tribal Statistical Areas in North Carolina (NC) were gathered from online Yellow Pages, Reference USA, Dun & Bradstreet, local health departments, and the NC Department of Agriculture and Consumer Services. All TIGER/Line 2009 roads (>1,500 miles) were driven in six of the more rural tribal areas and, for the largest tribe, all roads in two of its cities were driven. Sensitivity, positive predictive value, concordance, and kappa statistics were calculated to compare secondary data sources to primary data. 699 food outlets were identified during primary data collection. Match rate for primary data and secondary data differed by type of food outlet observed, with the highest match rates found for grocery stores (97%), general merchandise stores (96%), and restaurants (91%). Reference USA exhibited almost perfect sensitivity (0.89). Local health department data had substantial sensitivity (0.66) and was almost perfect when focusing only on restaurants (0.91). Positive predictive value was substantial for Reference USA (0.67) and moderate for local health department data (0.49). Evidence for validity was comparatively lower for Dun & Bradstreet, online Yellow Pages, and the NC Department of Agriculture. Secondary data sources both over- and under-represented the food environment; they were particularly problematic for identifying convenience stores and specialty markets. More attention is needed to improve the validity of existing data sources, especially for rural local food environments.
2012-01-01
Background Most studies on the local food environment have used secondary sources to describe the food environment, such as government food registries or commercial listings (e.g., Reference USA). Most of the studies exploring evidence for validity of secondary retail food data have used on-site verification and have not conducted analysis by data source (e.g., sensitivity of Reference USA) or by food outlet type (e.g., sensitivity of Reference USA for convenience stores). Few studies have explored the food environment in American Indian communities. To advance the science on measuring the food environment, we conducted direct, on-site observations of a wide range of food outlets in multiple American Indian communities, without a list guiding the field observations, and then compared our findings to several types of secondary data. Methods Food outlets located within seven State Designated Tribal Statistical Areas in North Carolina (NC) were gathered from online Yellow Pages, Reference USA, Dun & Bradstreet, local health departments, and the NC Department of Agriculture and Consumer Services. All TIGER/Line 2009 roads (>1,500 miles) were driven in six of the more rural tribal areas and, for the largest tribe, all roads in two of its cities were driven. Sensitivity, positive predictive value, concordance, and kappa statistics were calculated to compare secondary data sources to primary data. Results 699 food outlets were identified during primary data collection. Match rate for primary data and secondary data differed by type of food outlet observed, with the highest match rates found for grocery stores (97%), general merchandise stores (96%), and restaurants (91%). Reference USA exhibited almost perfect sensitivity (0.89). Local health department data had substantial sensitivity (0.66) and was almost perfect when focusing only on restaurants (0.91). Positive predictive value was substantial for Reference USA (0.67) and moderate for local health department data (0.49). Evidence for validity was comparatively lower for Dun & Bradstreet, online Yellow Pages, and the NC Department of Agriculture. Conclusions Secondary data sources both over- and under-represented the food environment; they were particularly problematic for identifying convenience stores and specialty markets. More attention is needed to improve the validity of existing data sources, especially for rural local food environments. PMID:23173781
Shanmugam, Vedapuri; Azarskova, Marianna; Nguyen, Shon; Hurlston, Mackenzie; Sabatier, Jennifer; Zhang, Guoqing; Osmanov, Saladin; Ellenberger, Dennis; Yang, Chunfu; Vitek, Charles; Liulchuk, Maria; Nizova, Natalya
2015-01-01
An accurate accessible test for early infant diagnosis (EID) is crucial for identifying HIV-infected infants and linking them to treatment. To improve EID services in Ukraine, dried blood spot (DBS) samples obtained from 237 HIV-exposed children (≤18 months of age) in six regions in Ukraine in 2012 to 2013 were tested with the AmpliSens DNA-HIV-FRT assay, the Roche COBAS AmpliPrep/COBAS TaqMan (CAP/CTM) HIV-1 Qual test, and the Abbott RealTime HIV-1 Qualitative assay. In comparison with the paired whole-blood results generated from AmpliSens testing at the oblast HIV reference laboratories in Ukraine, the sensitivity was 0.99 (95% confidence interval [CI], 0.95 to 1.00) for the AmpliSens and Roche CAP/CTM Qual assays and 0.96 (95% CI, 0.90 to 0.98) for the Abbott Qualitative assay. The specificity was 1.00 (95% CI, 0.97 to 1.00) for the AmpliSens and Abbott Qualitative assays and 0.99 (95% CI, 0.96 to 1.00) for the Roche CAP/CTM Qual assay. McNemar analysis indicated that the proportions of positive results for the tests were not significantly different (P > 0.05). Cohen's kappa (0.97 to 0.99) indicated almost perfect agreement among the three tests. These results indicated that the AmpliSens DBS and whole-blood tests performed equally well and were comparable to the two commercially available EID tests. More importantly, the performance characteristics of the AmpliSens DBS test meets the World Health Organization EID test requirements; implementing AmpliSens DBS testing might improve EID services in resource-limited settings. PMID:26447114
The Union’s Naval War in Louisiana, 1861-1863
2006-11-06
Navy,” 2 December 1861, in Appendix to the Congressional Globe, 37th Cong., 2d Sess., 1861, 18. In his report, Welles referred to the blockade as... were pouring into the Forts a perfect storm of shot, shell, grape, Cannister, and spherical can. The roar of the artillery was deafening; the rushing... sound of the descending bombs, the sharp, whizzing noise made by the jagged fragments of exploded shells, the whirring of grape shot and hissing of
NASA Astrophysics Data System (ADS)
Güémez, J.; Fiolhais, M.
2018-05-01
We apply the four-vector formalism of special relativity to describe various interaction processes of photons with a solar sail, in two cases: when the sail’s surface is a perfect mirror, and when it is a body coated with a totally absorbing material. We stress the pedagogical value of implementing simultaneously both the linear momentum and the energy conservation in a covariant fashion, as our formalism inherently does. It also allows for a straightforward change of the description of a certain process in different inertial reference frames.
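The standard textbook result underlying the two cases, written here in LaTeX for the sail's instantaneous rest frame at normal incidence (the paper's covariant four-vector treatment generalises this to arbitrary inertial frames), is:

% Momentum transferred to the sail by a photon of energy E:
\[
  \Delta p_{\text{mirror}} = \frac{2E}{c}, \qquad
  \Delta p_{\text{absorber}} = \frac{E}{c},
\]
% so a perfectly reflecting sail receives twice the impulse of a perfectly
% absorbing one for the same incident radiant energy.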
[Computer simulation by passenger wound analysis of vehicle collision].
Zou, Dong-Hua; Liu, Nning-Guo; Shen, Jie; Zhang, Xiao-Yun; Jin, Xian-Long; Chen, Yi-Jiu
2006-08-15
To reconstruct the course of a vehicle collision and thereby provide a reference for forensic identification and the disposal of traffic accidents. By analyzing the evidence left on both passengers and vehicles, a momentum-impulse technique combined with multi-body dynamics was applied to simulate the motion and injuries of the passengers as well as the trajectories of the vehicles. The computer simulation model reconstructed the phases of the traffic collision, which coincided with details found by the forensic investigation. Computer simulation is helpful and feasible for forensic identification in traffic accidents.
Crystal growth of sulfide materials from alkali polysulfide liquids
NASA Technical Reports Server (NTRS)
White, W. B.
1979-01-01
The fluids experiment system was designed for low temperature solution growth, nominally aqueous solution growth. The alkali polysulfides, compositions in the systems Na2S-S and K2S-S form liquids in the temperature range of 190 C to 400 C. These can be used as solvents for other important classes of materials such as transition metal and other sulfides which are not soluble in aqueous media. Among these materials are luminescent and electroluminescent crystals whose physical properties are sensitive functions of crystal perfection and which could, therefore, serve as test materials for perfection improvement under microgravity conditions.
Li, Qiuying; Pham, Hoang
2017-01-01
In this paper, we propose a software reliability model that considers not only error generation but also fault removal efficiency combined with testing coverage information, based on a nonhomogeneous Poisson process (NHPP). During the past four decades, many software reliability growth models (SRGMs) based on NHPP have been proposed to estimate software reliability measures, most of which share the following assumptions: 1) it is a common phenomenon that the fault detection rate changes during the testing phase; 2) as a result of imperfect debugging, fault removal is accompanied by a fault re-introduction rate. However, few SRGMs in the literature differentiate between fault detection and fault removal, i.e. they seldom consider imperfect fault removal efficiency. In the practical software development process, fault removal efficiency cannot always be perfect: the failures detected might not be removed completely, the original faults might still exist, and new faults might be introduced meanwhile, which is referred to as the imperfect debugging phenomenon. In this study, a model aiming to incorporate the fault introduction rate, fault removal efficiency and testing coverage into software reliability evaluation is developed, using testing coverage to express the fault detection rate and using fault removal efficiency to consider the fault repair. We compare the performance of the proposed model with several existing NHPP SRGMs using three sets of real failure data based on five criteria. The results show that the model gives a better fitting and predictive performance.
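To make the imperfect-debugging idea concrete, the Python sketch below evaluates the mean value function of a generic NHPP model in which faults are detected at a constant rate b, removed with efficiency p, and re-introduced at rate beta (so dm/dt = b[a + beta*m - p*m]); this is an illustrative textbook-style form, not the specific testing-coverage model proposed in the paper, and all parameter values are hypothetical.

import numpy as np

def mean_failures(t, a=100.0, b=0.05, p=0.9, beta=0.1):
    # Closed-form solution of dm/dt = b*(a - (p - beta)*m), m(0) = 0:
    # an imperfect-debugging variant of the exponential NHPP mean value function.
    k = p - beta                       # net removal multiplier (requires p > beta)
    return (a / k) * (1.0 - np.exp(-b * k * t))

t = np.linspace(0.0, 200.0, 5)
print(mean_failures(t))                # saturates towards a/(p - beta) = 125 expected faults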
Translation, reliability, and clinical utility of the Melbourne Assessment 2.
Gerber, Corinna N; Plebani, Anael; Labruyère, Rob
2017-10-12
The aims were to (i) provide a German translation of the Melbourne Assessment 2 (MA2), a quantitative test to measure unilateral upper limb function in children with neurological disabilities and (ii) to evaluate its reliability and aspects of clinical utility. After its translation into German and approval of the back translation by the original authors, the MA2 was performed and videotaped twice with 30 children with neuromotor disorders. For each participant, two raters scored the video of the first test for inter-rater reliability. To determine test-retest reliability, one rater additionally scored the video of the second test while the other rater repeated the scoring of the first video to evaluate intra-rater reliability. Time needed for rater training, test administration, and scoring was recorded. The four subscale scores showed excellent intra-, inter-rater, and test-retest reliability with intraclass correlation coefficients of 0.90-1.00 (95%-confidence intervals 0.78-1.00). Score items revealed substantial to almost perfect intra-rater reliability (weighted kappa k w = 0.66-1.00) for the more affected side. Score item inter-rater and test-retest reliability of the same extremity were, with one exception, moderate to almost perfect (k w = 0.42-0.97; k w = 0.40-0.89). Furthermore, the MA2 was feasible and acceptable for patients and clinicians. The MA2 showed excellent subscale and moderate to almost perfect score item reliability. Implications for Rehabilitation There is a lack of high-quality studies about psychometric properties of upper limb measurement tools in the neuropediatric population. The Melbourne Assessment 2 is a promising tool for reliable measurement of unilateral upper limb movement quality in the neuropediatric population. The Melbourne Assessment 2 is acceptable and practicable to therapists and patients for routine use in clinical care.
Figueroa, Carmen; Johnson, Cheryl; Ford, Nathan; Sands, Anita; Dalal, Shona; Meurant, Robyn; Prat, Irena; Hatzold, Karin; Urassa, Willy; Baggaley, Rachel
2018-06-01
The ability of individuals to use HIV self-tests correctly is debated. To inform the 2016 WHO recommendation on HIV self-testing, we assessed the reliability and performance of HIV rapid diagnostic tests when used by self-testers. In this systematic review and meta-analysis, we searched PubMed, PopLine, and Embase, conference abstracts, and additional grey literature between Jan 1, 1995, and April 30, 2016, for observational and experimental studies reporting on HIV self-testing performance. We excluded studies evaluating home specimen collection because patients did not interpret their own test results. We extracted data independently, using standardised extraction forms. Outcomes of interest were agreement between self-testers and health-care workers, sensitivity, and specificity. We calculated κ to establish the level of agreement and pooled κ estimates using a random-effects model, by approach (directly assisted or unassisted) and type of specimen (blood or oral fluid). We examined heterogeneity with the I² statistic. 25 studies met inclusion criteria (22 to 5662 participants). Quality assessment with QUADAS-2 showed studies had low risk of bias and incomplete reporting in accordance with the STARD checklist. Raw proportion of agreement ranged from 85.4% to 100%, and reported κ ranged from fair (κ 0.277, p<0.001) to almost perfect (κ 0.99, n=25). Pooled κ suggested almost perfect agreement for both types of approaches (directly assisted 0.98, 95% CI 0.96-0.99 and unassisted 0.97, 0.96-0.98; I² = 34.5%, 0-97.8). Excluding two outliers, sensitivity and specificity were higher for blood-based rapid diagnostic tests (4/16) compared with oral fluid rapid diagnostic tests (13/16). The most common error that affected test performance was incorrect specimen collection (oral swab or finger prick). Study limitations included the use of different reference standards and no disaggregation of results by individuals taking antiretrovirals. Self-testers can reliably and accurately do HIV rapid diagnostic tests, as compared with trained health-care workers. Errors in performance might be reduced through the improvement of rapid diagnostic tests for self-testing, particularly to make sample collection easier and to simplify instructions for use. The Bill & Melinda Gates Foundation and Unitaid. © 2018. World Health Organization. Licensee Elsevier. This is an Open Access article published under the CC BY 3.0 IGO license which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. In any use of this article, there should be no suggestion that WHO endorses any specific organisation, products or services. The use of the WHO logo is not permitted. This notice should be preserved along with the article's original URL.
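As a sketch of the random-effects pooling step described above, the Python snippet below applies DerSimonian-Laird pooling to a set of study-level estimates (here standing in for per-study κ values) and reports the pooled estimate, its standard error and I²; the input estimates and variances are hypothetical, not the review's data.

import numpy as np

def pool_random_effects(estimates, variances):
    # DerSimonian-Laird random-effects pooling of study-level estimates.
    y, v = np.asarray(estimates, float), np.asarray(variances, float)
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)                 # Cochran's Q
    df = len(y) - 1
    tau2 = max(0.0, (q - df) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1.0 / (v + tau2)                          # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    return pooled, se, i2

# Hypothetical kappa estimates and within-study variances from five studies:
print(pool_random_effects([0.95, 0.90, 0.99, 0.88, 0.97],
                          [0.0004, 0.0009, 0.0001, 0.0016, 0.0003]))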
College Planning: The Savvy Parent's Guide
ERIC Educational Resources Information Center
Berger, Sandra
2008-01-01
Because college admission has become much more competitive, parents and students need to know that excellent grades and test scores may not be enough to gain placement, especially in highly selective schools. Keep in mind that there are more than 25,000 class valedictorians every year, most with nearly perfect standardized test scores. Also, the…
Delicious Low GL space foods by using Low GI materials -IH and Vacuum cooking -
NASA Astrophysics Data System (ADS)
Katayama, Naomi; Nagasaka, Sanako; Murasaki, Masahiro; Space Agriculture Task Force, J.
Adequate life-support systems are necessary for a long-term stay in space, and the management of meals for astronauts is particularly important: if an astronaut falls seriously ill in outer space, it can mean death. Delicious, well-balanced space foods are therefore essential for astronauts' work. This study aimed at creating a balanced space-food menu for healthy life in space. Kitchen utensils are limited in the space environment, and the only means of heating is an electric heater, without open flame. The purpose of this study was therefore to prepare space foods using a vacuum-cooking device and an IH (induction heating) cooker. We designed a space-food menu with reference to the 2010 Japanese nutritional standards, using brown rice, wheat, soy bean, sweet potato and green vegetables, as well as loach and insects (silkworm pupa, snail, mud snail, termite, fly, grasshopper and bee). Ten healthy adults served as subjects and performed a questionnaire-based sensory test, scoring taste, fragrance, colour and quantity out of ten points. The space foods we devised could be prepared deliciously with vacuum cooking and IH heating: the sensory evaluation gave an average of eight points out of a possible ten, indicating that our space-food menu is palatable. The vacuum-cooked foods can be stored in a refrigerator for 20 days, an important result because it allows surplus food to be preserved. In the future, we intend to develop more delicious space-food menus using vacuum cooking and IH heating.
Ma, Ya-Jun; West, Justin; Nazaran, Amin; Cheng, Xin; Hoenecke, Heinz; Du, Jiang; Chang, Eric Y
2018-02-02
To utilize the 3D inversion recovery prepared ultrashort echo time with cones readout (IR-UTE-Cones) MRI technique for direct imaging of lamellar bone, with comparison to the gold standard of computed tomography (CT). CT and MRI were performed on 11 shoulder specimens and three patients. Five specimens had imaging performed before and after glenoid fracture (osteotomy). 2D and 3D volume-rendered CT images were reconstructed, and conventional T1-weighted and 3D IR-UTE-Cones MRI techniques were performed. Glenoid widths and defects were independently measured by two readers using the circle method. Measurements were compared with those made from 3D CT datasets. Paired-sample Student's t tests and intraclass correlation coefficients were performed. In addition, 2D CT and 3D IR-UTE-Cones MRI datasets were linearly registered, digitally overlaid, and compared in consensus by the two readers. Compared with the reference standard (3D CT), glenoid bone diameter measurements made on 2D CT and 3D IR-UTE-Cones were not significantly different for either reader, whereas T1-weighted images underestimated the diameter (mean difference of 0.18 cm, p = 0.003 and 0.16 cm, p = 0.022 for readers 1 and 2, respectively). However, the mean margin of error for measuring glenoid bone loss was small for all modalities (range, 1.46-3.92%). All measured ICCs were near perfect. Digitally registered 2D CT and 3D IR-UTE-Cones MRI datasets yielded essentially perfect congruity between the two modalities. The 3D IR-UTE-Cones MRI technique selectively visualizes lamellar bone, produces similar contrast to 2D CT imaging, and compares favorably to measurements made using 2D and 3D CT.
NASA Astrophysics Data System (ADS)
Costa-Surós, M.; Calbó, J.; González, J. A.; Long, C. N.
2014-08-01
The cloud vertical distribution and especially the cloud base height, which is linked to cloud type, are important characteristics in order to describe the impact of clouds on climate. In this work, several methods for estimating the cloud vertical structure (CVS) based on atmospheric sounding profiles are compared, considering the number and position of cloud layers, with a ground-based system that is taken as a reference: the Active Remote Sensing of Clouds (ARSCL). All methods establish some conditions on the relative humidity, and differ in the use of other variables, the thresholds applied, or the vertical resolution of the profile. In this study, these methods are applied to 193 radiosonde profiles acquired at the Atmospheric Radiation Measurement (ARM) Southern Great Plains site during all seasons of the year 2009 and endorsed by Geostationary Operational Environmental Satellite (GOES) images, to confirm that the cloudiness conditions are homogeneous enough across their trajectory. The perfect agreement (i.e., when the whole CVS is estimated correctly) for the methods ranges between 26 and 64%; the methods show additional approximate agreement (i.e., when at least one cloud layer is assessed correctly) from 15 to 41%. Further tests and improvements are applied to one of these methods. In addition, we attempt to make this method suitable for low-resolution vertical profiles, like those from the outputs of reanalysis methods or from the World Meteorological Organization's (WMO) Global Telecommunication System. The perfect agreement, even when using low-resolution profiles, can be improved by up to 67% (plus 25% of the approximate agreement) if the thresholds for a moist layer to become a cloud layer are modified to minimize false negatives with the current data set, thus improving overall agreement.
NASA Astrophysics Data System (ADS)
Costa-Surós, M.; Calbó, J.; González, J. A.; Long, C. N.
2014-04-01
The cloud vertical distribution, and especially the cloud base height, which is linked to cloud type, are important characteristics for describing the impact of clouds on climate. In this work, several methods to estimate the cloud vertical structure (CVS) based on atmospheric sounding profiles are compared, considering the number and position of cloud layers, with a ground-based system which is taken as a reference: the Active Remote Sensing of Clouds (ARSCL). All methods establish some conditions on the relative humidity, and differ in the use of other variables, the thresholds applied, or the vertical resolution of the profile. In this study these methods are applied to 193 radiosonde profiles acquired at the ARM Southern Great Plains site during all seasons of the year 2009 and endorsed by GOES images, to confirm that the cloudiness conditions are homogeneous enough across their trajectory. The perfect agreement (i.e., when the whole CVS is correctly estimated) for the methods ranges between 26 and 64%; the methods show additional approximate agreement (i.e., when at least one cloud layer is correctly assessed) from 15 to 41%. Further tests and improvements are applied to one of these methods. In addition, we attempt to make this method suitable for low-resolution vertical profiles, like those from the outputs of reanalysis methods or from the WMO's Global Telecommunication System. The perfect agreement, even when using low-resolution profiles, can be improved up to 67% (plus 25% of approximate agreement) if the thresholds for a moist layer to become a cloud layer are modified to minimize false negatives with the current data set, thus improving overall agreement.
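The CVS methods compared above all reduce to applying relative-humidity conditions to a sounding profile. A minimal, hypothetical version of such a method is sketched below: levels whose RH exceeds a threshold are flagged as moist, contiguous flagged levels are grouped, and only sufficiently thick groups are kept as cloud layers. The threshold and minimum thickness are placeholders, not the values used by any of the compared methods.

```python
import numpy as np

def detect_cloud_layers(height_m, rh_percent, rh_threshold=95.0, min_thickness_m=100.0):
    """Very simple RH-threshold cloud-layer detector for a sounding profile.

    height_m   : monotonically increasing heights (m), numpy array
    rh_percent : relative humidity at each level (%), numpy array
    Returns a list of (base_height, top_height) tuples.
    """
    moist = rh_percent >= rh_threshold          # flag moist levels
    layers = []
    i, n = 0, len(height_m)
    while i < n:
        if moist[i]:
            j = i
            while j + 1 < n and moist[j + 1]:   # extend over contiguous moist levels
                j += 1
            base, top = height_m[i], height_m[j]
            if top - base >= min_thickness_m:   # discard very thin moist layers
                layers.append((base, top))
            i = j + 1
        else:
            i += 1
    return layers

# Toy profile: one moist slab between roughly 2 and 3 km
z = np.arange(0, 12000, 250.0)
rh = np.where((z > 2000) & (z < 3000), 98.0, 60.0)
print(detect_cloud_layers(z, rh))
```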
Kêkê, L M; Samouda, H; Jacobs, J; di Pompeo, C; Lemdani, M; Hubert, H; Zitouni, D; Guinhouya, B C
2015-06-01
This study aims to compare three body mass index (BMI)-based classification systems of childhood obesity: the French, the International Obesity Task Force (IOTF) and the World Health Organization (WHO) references. The study involved 1382 schoolchildren, recruited from the Lille Academic District in France in May 2009, aged 8.4±1.7 years (4.0-12.0 years). Their mean height and body mass were 131.5±10.9 cm and 30.7±9.2 kg, respectively, resulting in a BMI of 17.4±3.2 kg/m². The weight status was defined according to the three systems considered in this study. The agreement between these references was tested using Cohen's kappa coefficient. The prevalence of overweight was higher with the WHO references (20.0%) in comparison with the French references (13.8%; P<0.0001) and the IOTF (16.2%; P≤0.01). A similar result was found with obesity (WHO: 11.6% vs. IOTF: 6.7%; or French references: 6.7%; P<0.0001). Agreement between the three references ranged from "moderate" to "perfect" (0.43≤κ≤1.00; P<0.0001). Kappa coefficients were higher when the three references were used to classify children as obese (0.63≤κ≤1.00; P<0.0001) as compared to classification in the overweight (obesity excluded) category (0.43≤κ≤0.94; P<0.0001). When sex and age categories (4-6 years vs. 7-12 years) were considered to define the overweight status, the lowest kappa coefficient was found between the French and WHO references in boys aged 7-12 years (κ=0.28; P<0.0001), and the highest one in girls aged 7-12 years between the French references and IOTF (κ=0.97; P<0.0001). As for obesity, agreement between the three references ranged from 0.60 to 1.00 (P<0.0001), with the lowest values obtained in the comparison of the WHO references against the French references or IOTF among boys aged 7-12 years (κ=0.60; P<0.0001). Overall, the WHO references yield an overestimation of overweight and/or obesity within this sample of schoolchildren as compared to the French references and the IOTF. The magnitude of the agreement coefficients between the three references depends on both sex and age categories. The French references seem to be in rather close agreement with the IOTF in defining overweight, especially in 7-12-year-old children. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
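Agreement between classification systems in studies like the one above is typically quantified with Cohen's kappa. As a reminder of how that coefficient is computed from two categorical classifications of the same subjects (the labels below are invented, purely for illustration):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two raters/classification systems over the same subjects."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    categories = set(labels_a) | set(labels_b)

    # Observed agreement
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n

    # Expected agreement under independence of the two classifications
    count_a = Counter(labels_a)
    count_b = Counter(labels_b)
    p_e = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)

    return (p_o - p_e) / (1 - p_e)

# Invented example: weight status assigned by two references to six children
ref_1 = ["normal", "overweight", "obese", "normal", "overweight", "normal"]
ref_2 = ["normal", "overweight", "obese", "overweight", "overweight", "normal"]
print(round(cohens_kappa(ref_1, ref_2), 3))
```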
First-Principles Study on the Tensile Properties and Failure Mechanism of the CoSb3/Ti Interface
NASA Astrophysics Data System (ADS)
She, Wuchang; Liu, Qiwen; Mei, Hai; Zhai, Pengcheng; Li, Jun; Liu, Lisheng
2018-06-01
The mechanical properties of the CoSb3/Ti interface play a critical role in the application of thermoelectric devices. To understand the failure mechanism of the CoSb3(001)/Ti(01-10) interface, we investigated its response during tensile deformation by first-principles calculations. By comparing the results for the perfect interface and for the interface after atomic migration, we find that atomic migration at the interface has an obvious influence on the mechanical properties. The tensile tests indicate that the ideal tensile stress of the CoSb3/Ti interface after atomic migration decreases by about 8.1% compared with that of the perfect one. The failure mechanism of the perfect CoSb3/Ti interface is different from that of the migrated CoSb3/Ti interface: for the perfect interface, the breakage of the Co-Sb bond leads to the failure of the system, whereas for the interface after atomic migration, the breakage of the Sb-Sb bond leads to the failure of the system. This is mainly because the new ionic Ti-Sb bonds redistribute the electrons and weaken the stiffness of the Co-Sb bonds.
Tian, Hongmiao; Wang, Chunhui; Shao, Jinyou; Ding, Yucheng; Li, Xiangming
2014-10-28
Electrically induced structure formation (EISF) is an interesting and unique approach for generating a microstructured duplicate from a rheological polymer by a spatially modulated electric field induced by a patterned template. Most of the research on EISF has so far used various dielectric polymers (with an electrical conductivity smaller than 10⁻¹⁰ S/m, which can be considered perfect dielectrics), on which the electric field induces a Maxwell stress only due to the dipoles (or bound charges) in the polymer molecules, leading to structures with a small aspect ratio. This paper presents a different approach for improving the aspect ratio allowed in EISF by doping organic salt into the perfect dielectric polymer, i.e., turning the perfect dielectric into a leaky dielectric, considering the fact that the free space charges enriched in the leaky dielectric polymer can make an additional contribution to the Maxwell stress, i.e., the electrohydrodynamic pressure, which is desirable for high-aspect-ratio structuring. Our numerical simulations and experimental tests have shown that a leaky dielectric polymer, with a small conductivity comparable to that of deionized water, can be much more effectively deformed electrohydrodynamically into a high aspect ratio than a perfect dielectric polymer when both have roughly the same dielectric constant.
Numbers of Beauty: An Innovative Aesthetic Analysis for Orthognathic Surgery Treatment Planning.
Marianetti, Tito Matteo; Gasparini, Giulio; Midulla, Giulia; Grippaudo, Cristina; Deli, Roberto; Cervelli, Daniele; Pelo, Sandro; Moro, Alessandro
2016-01-01
The aim of this study was to validate a new aesthetic analysis and establish the sagittal position of the maxilla in an ideal reference group. We want to demonstrate the usefulness of these findings in the treatment planning of patients undergoing orthognathic surgery. We took a reference group of 81 Italian women participating in a national beauty contest in 2011, on which we performed Arnett's soft tissue cephalometric analysis and our new "Vertical Planning Line" analysis. We used the ideal values to elaborate the surgical treatment planning of a second group of 60 consecutive female patients affected by skeletal class III malocclusion. Finally, we compared both pre- and postoperative pictures with the reference values of the ideal group. The ideal reference group does not perfectly fit Arnett's proposed norms. From the descriptive statistical comparison of the patients' values before and after orthognathic surgery with the reference values, we observed that all the parameters considered moved closer to those of the ideal group. We consider our "Vertical Planning Line" a useful aid for the orthodontist and the surgeon in the treatment planning of patients with skeletal malocclusions, in combination with the clinical facial examination and the classical cephalometric analysis of bone structures.
Model to Test Electric Field Comparisons in a Composite Fairing Cavity
NASA Technical Reports Server (NTRS)
Trout, Dawn; Burford, Janessa
2012-01-01
Evaluating the impact of radio frequency transmission in vehicle fairings is important to sensitive spacecraft. This study shows cumulative distribution function (CDF) comparisons of composite fairing electromagnetic field data obtained by computational electromagnetic 3D full wave modeling and laboratory testing. This work is an extension of the bare aluminum fairing perfect electric conductor (PEC) model. Test and model data correlation is shown.
Model to Test Electric Field Comparisons in a Composite Fairing Cavity
NASA Technical Reports Server (NTRS)
Trout, Dawn H.; Burford, Janessa
2013-01-01
Evaluating the impact of radio frequency transmission in vehicle fairings is important to sensitive spacecraft. This study shows cumulative distribution function (CDF) comparisons of composite fairing electromagnetic field data obtained by computational electromagnetic 3D full wave modeling and laboratory testing. This work is an extension of the bare aluminum fairing perfect electric conductor (PEC) model. Test and model data correlation is shown.
2013-01-01
Background: This study investigates the reliability of muscle performance tests using cost- and time-effective methods similar to those used in clinical practice. When conducting reliability studies, great effort goes into standardising test procedures to facilitate a stable outcome; therefore, several test trials are often performed. However, when muscle performance tests are applied in the clinical setting, clinicians often conduct a muscle performance test only once, as repeated testing may produce fatigue and pain and thus variation in test results. We aimed to investigate whether cervical muscle performance tests, which have shown promising psychometric properties, would remain reliable when examined under conditions similar to those of daily clinical practice. Methods: The intra-rater (between-day) and inter-rater (within-day) reliability was assessed for five cervical muscle performance tests in patients with (n = 33) and without neck pain (n = 30). The five tests were joint position error, the cranio-cervical flexion test, the neck flexor muscle endurance test performed in supine and in a 45°-upright position, and a new neck extensor test. Results: Intra-rater reliability ranged from moderate to almost perfect agreement for joint position error (ICC = 0.48-0.82), the cranio-cervical flexion test (ICC ≥ 0.69), the neck flexor muscle endurance test performed in supine (ICC ≥ 0.68) and in a 45°-upright position (ICC ≥ 0.41), with the exception of the new neck extensor test, which ranged from slight to moderate agreement (ICC = 0.14-0.41). Likewise, inter-rater reliability ranged from moderate to almost perfect agreement for joint position error (ICC = 0.51-0.75), the cranio-cervical flexion test (ICC ≥ 0.85), the neck flexor muscle endurance test performed in supine (ICC ≥ 0.70) and in a 45°-upright position (ICC ≥ 0.56). However, only slight to fair agreement was found for the neck extensor test (ICC = 0.19-0.25). Conclusions: Intra- and inter-rater reliability ranged from moderate to almost perfect agreement, with the exception of the new neck extensor test, which ranged from slight to moderate agreement. The significant variability observed suggests that tests like the neck extensor test and the neck flexor muscle endurance test performed in a 45°-upright position are too unstable to be used when evaluating neck muscle performance. PMID:24299621
Ahlqvist, Margary; Berglund, Britta; Nordström, Gun; Klang, Birgitta; Johansson, Eva
2014-01-01
Nursing students should be given opportunities to participate in clinical audits during their education. However, audit tools are seldom tested for reliability among nursing students. The aim of this study was to present reliability among nursing students using the instrument PVC assess to assess management of peripheral venous catheters (PVCs) and PVC-related signs of thrombophlebitis. PVC assess was used to assess 67 inserted PVCs in 60 patients at ten wards at a university hospital. One group of nursing students (n=4) assessed PVCs at the bedside (inter-rater reliability) and photographs of these PVCs were taken. Another group of students (n=3) assessed the PVCs in the photographs after 4 weeks (test-retest reliability). To determine reliability, proportion of agreement [P(A)] and Cohen's kappa coefficient (κ) were calculated. For bedside assessment of PVCs, P(A) ranged from good to excellent (0.80-1.0) in 55% of the 26 PVC assess items that were tested. P(A) was poor (<0.70) for two items: "adherence of inner dressing to the skin" and "PVC location." In 81% of the items, κ was between moderate and almost perfect: moderate (n=5), substantial (n=3), almost perfect (n=5). For edema at insertion site and two items on PVC dressing, κ was fair (0.21-0.40). Regarding test-retest reliability, P(A) varied between good and excellent (0.81-1) in 85%-95% of the items, and the κ ranged between moderate and almost perfect (0.41-1) in 90%-95%. PVC assess demonstrated satisfactory reliability among nursing students. However, students need training in how to use the instrument before assessing PVCs.
Guideline maintenance and revision. 50 years of the Jones criteria for diagnosis of rheumatic fever.
Shiffman, R N
1995-07-01
To understand better the factors that led to revisions of the Jones criteria, a widely used diagnostic guideline for diagnosis of rheumatic fever. The original publication of the Jones criteria and the four revisions were examined to identify changes. A computer software maintenance paradigm was applied, and modifications were categorized as corrective (error correction), perfective (enhancements in response to user needs), or adaptive (responses to new knowledge). Modifications of the Jones criteria were primarily corrective and perfective. Disease characteristics, originally characterized as major manifestations, were subsequently categorized as minor manifestations and vice versa. Twenty years after the initial publication, a requirement was added to enhance specificity (evidence for antecedent streptococcal infection). Descriptions of rheumatic manifestations became more detailed over time to eliminate ambiguous definitions and provide information to help clinicians decide about borderline cases. This emphasis on corrective and perfective maintenance contrasts with an expectation that adaptive changes would predominate, as with most knowledge-based systems. In fact, despite 50 years of technologic and methodologic advances in medicine, only echocardiography and new antibody testing contributed new knowledge that bears on the diagnosis of rheumatic fever. Corrective and perfective maintenance can be avoided by making effective use of knowledge that exists at the time a guideline is published. Despite the apparent durability of the Jones criteria, carefully structured, evidence-based guidelines should require less corrective and perfective maintenance. Adaptive maintenance can be anticipated if the quality of evidence or the level of consensus that supports each recommendation is explicitly recorded.
Effects of controlled element dynamics on human feedforward behavior in ramp-tracking tasks.
Laurense, Vincent A; Pool, Daan M; Damveld, Herman J; van Paassen, Marinus René M; Mulder, Max
2015-02-01
In real-life manual control tasks, human controllers are often required to follow a visible and predictable reference signal, enabling them to use feedforward control actions in conjunction with feedback actions that compensate for errors. Little is known about human control behavior in these situations. This paper investigates how humans adapt their feedforward control dynamics to the controlled element dynamics in a combined ramp-tracking and disturbance-rejection task. A human-in-the-loop experiment is performed with a pursuit display and vehicle-like controlled elements, ranging from a single integrator through second-order systems with a break frequency at either 3, 2, or 1 rad/s, to a double integrator. Because the potential benefits of feedforward control increase with steeper ramp segments in the target signal, three steepness levels are tested to investigate their possible effect on feedforward control with the various controlled elements. Analyses with four novel models of the operator, fitted to time-domain data, reveal feedforward control for all tested controlled elements and both (nonzero) tested levels of ramp steepness. For the range of controlled element dynamics investigated, it is found that humans adapt to these dynamics in their feedforward response, with a close to perfect inversion of the controlled element dynamics. No significant effects of ramp steepness on the feedforward model parameters are found.
[The benefit of multi-disciplines combination in evidence-based medicine teaching practice].
Fang, Xianghua; Wang, Chunxiu
2016-01-01
In this article, we give a detailed description of our experience teaching evidence-based medicine (EBM) to undergraduate and graduate students as well as in continuing medical education. The staff of the Department of EBM came from a variety of sub-disciplines, including epidemiologists, physicians, surgeons and a librarian. To make the course run smoothly, the members of the department frequently discussed the plan together and conducted test lectures, which helped refine the course. The keys to the development of our department are strong organization and leadership, the pursuit of perfection, keeping up with progress in EBM, and teamwork.
NASA Technical Reports Server (NTRS)
Garai, Anirban; Diosady, Laslo T.; Murman, Scott M.; Madavan, Nateri K.
2016-01-01
The perfectly matched layer (PML) technique is developed in the context of a high-order spectral-element Discontinuous-Galerkin (DG) method. The technique is applied to a range of test cases and is shown to be superior compared to other approaches, such as those based on using characteristic boundary conditions and sponge layers, for treating the inflow and outflow boundaries of computational domains. In general, the PML technique improves the quality of the numerical results for simulations of practical flow configurations, but it also exhibits some instabilities for large perturbations. A preliminary analysis that attempts to understand the source of these instabilities is discussed.
Welding in airplane construction
NASA Technical Reports Server (NTRS)
Rechtlich, A; Schrenk, M
1928-01-01
The present article attempts to explain the principles for the production of a perfect weld and to throw light on the unexplained problems. Moreover, it is intended to elucidate the possibilities of testing the strength and reliability of welded parts.
Performance evaluation of infrared imaging system in field test
NASA Astrophysics Data System (ADS)
Wang, Chensheng; Guo, Xiaodong; Ren, Tingting; Zhang, Zhi-jie
2014-11-01
Infrared imaging systems have been applied widely in both military and civilian fields. Because infrared imagers come in various types with different parameters, system manufacturers and customers have a great demand for evaluating the performance of IR imaging systems with a standard tool or platform. Since the first-generation IR imager was developed, the standard assessment method has been the MRTD or related improved methods, which are not perfectly suited to current linear-scanning imagers or 2D staring imagers based on FPA detectors. To address this problem, this paper describes an evaluation method based on the triangular orientation discrimination (TOD) metric, which is considered an effective and emerging method for evaluating the overall performance of EO systems. To realize the evaluation in field tests, an experimental instrument was developed, and, considering the importance of the operational environment, the field test was carried out in a practical atmospheric environment. The test imagers include a panoramic imaging system and staring imaging systems with different optics and detector parameters (both cooled and uncooled). After describing the instrument and the experimental setup, the experimental results are presented and the target range performance is analyzed and discussed. In the data analysis, the article compares the range prediction values obtained from the TOD method, the MRTD method and the practical experiment. The experimental results prove the effectiveness of this evaluation tool, and it can be used as a platform to provide a uniform performance prediction reference.
Genetic Algorithm Based Multi-Agent System Applied to Test Generation
ERIC Educational Resources Information Center
Meng, Anbo; Ye, Luqing; Roy, Daniel; Padilla, Pierre
2007-01-01
Automatic test generation in a distributed computing context is one of the most important links in an on-line evaluation system. Although the issue has been discussed for a long time, no perfect solution exists so far. This paper proposes an innovative approach to addressing this issue through the seamless integration of genetic…
Examples of Data Analysis with SPSS/PC+ Studentware.
ERIC Educational Resources Information Center
MacFarland, Thomas W.
Intended for classroom use only, these unpublished notes contain computer lessons on descriptive statistics with files previously created in WordPerfect 4.2 and Lotus 1-2-3 Version 1.A for the IBM PC+. The statistical measures covered include Student's t-test with two independent samples; Student's t-test with a paired sample; Chi-square analysis;…
Fusion And Inference From Multiple And Massive Disparate Distributed Dynamic Data Sets
2017-07-01
principled methodology for two-sample graph testing; designed a provably almost-surely perfect vertex clustering algorithm for block model graphs; developed semi-supervised clustering methodology and robust hypothesis testing. Embedding graphs into a low-dimensional Euclidean space allows the full arsenal of statistical and machine learning methodology for multivariate Euclidean data to be deployed for…
Feldman, Steven R; Bagel, Jerry; Namak, Shahla
2018-05-01
The introduction of biologics has revolutionized the treatment of immune-mediated diseases, but high cost and limited patient access remain hurdles, and some physicians are concerned that biosimilars are not similar enough. The purpose of this narrative review is to describe biosimilar safety, efficacy, nomenclature, extrapolation and interchangeability. In the United States, the Biologics Price Competition and Innovation Act created an abbreviated pathway for licensing of a biologic that is biosimilar to another licensed product (i.e., the reference product). This approval pathway differs from that of generic small-molecule drugs because biologics are too complex to be perfectly duplicated, and follows a process designed to demonstrate that any differences between the biosimilar and its reference product have no significant impact on safety and efficacy. The US approval process requires extensive analytical assessments, animal studies and clinical trials, assuring that biosimilar products provide clinical results similar to those of the reference product. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Defect inspection of periodic patterns with low-order distortions
NASA Astrophysics Data System (ADS)
Khalaj, Babak H.; Aghajan, Hamid K.; Paulraj, Arogyaswami; Kailath, Thomas
1994-03-01
A self-reliance technique is developed for detecting defects in repeated-pattern wafers and masks with low-order distortions. If the patterns are located on a perfect rectangular grid, it is possible to estimate the period of the repeated patterns in both directions and then produce a defect-free reference image for comparison with the actual image. In some applications, however, the repeated patterns are shifted from their desired positions on a rectangular grid, and the aforementioned algorithm cannot be directly applied. In these situations, to produce a defect-free reference image and locate the defective cells, it is necessary to estimate the amount of misalignment of each cell beforehand. The proposed technique first estimates the misalignment of the repeated patterns in each row and column. After estimating the location of all cells in the image, a defect-free reference image is generated by averaging over all the cells and is compared with the input image to localize the possible defects.
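The core of the approach above — estimate the repetition period, synthesize a defect-free reference by averaging all cells, then difference against the input — can be sketched in a few lines for the undistorted (rectangular-grid) case. The autocorrelation-based period estimation and the fixed threshold are simplifying assumptions for illustration; the paper additionally estimates per-row and per-column misalignments, which this sketch omits.

```python
import numpy as np

def estimate_period(profile):
    """Estimate the repetition period of a 1-D intensity profile via autocorrelation."""
    p = profile - profile.mean()
    ac = np.correlate(p, p, mode="full")[len(p) - 1:]
    # the first local maximum after lag 0 is taken as the period
    for lag in range(1, len(ac) - 1):
        if ac[lag] > ac[lag - 1] and ac[lag] > ac[lag + 1]:
            return lag
    return len(p)

def detect_defects(image, threshold=0.2):
    """Build a defect-free reference by cell averaging and flag large deviations."""
    py = estimate_period(image.mean(axis=1))          # vertical period from row means
    px = estimate_period(image.mean(axis=0))          # horizontal period from column means
    ny, nx = image.shape[0] // py, image.shape[1] // px
    cells = image[:ny * py, :nx * px].reshape(ny, py, nx, px)

    reference_cell = cells.mean(axis=(0, 2))          # average over all repeated cells
    residual = np.abs(cells - reference_cell[None, :, None, :])
    return residual.max(axis=(1, 3)) > threshold      # boolean map of suspect cells

# Toy example: a 4x4 grid of identical 8x8 cells with one defective pixel
cell = np.zeros((8, 8))
cell[2:6, 2:6] = 1.0
img = np.tile(cell, (4, 4)).astype(float)
img[10, 10] += 0.8                                    # inject a defect
print(detect_defects(img))
```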
NASA Astrophysics Data System (ADS)
Chen, Hsi-Chao; Huang, Chen-Yu; Lin, Ssu-Fan; Chen, Sheng-Hui
2011-09-01
Residual or internal stresses directly affect a variety of phenomena including adhesion, generation of crystalline defects, perfection of epitaxial layers and formation of film surface growths such as hillocks and whiskers. Sputtered oxide films with high density develop high compressive stress, so it would offer researchers a useful reference if the residual stress could be analyzed directly. This study therefore examined the residual stress of SiO2 and Nb2O5 thin films deposited by DC magnetron sputtering on a hard substrate (BK7) and on flexible substrates (PET and PC). A finite element method (FEM) with an equivalent-reference-temperature (ERT) technique has been proposed and used to model and evaluate the intrinsic strains of layered structures. The research improves the ERT simulation of intrinsic strain for the oxide films. The results also yield two models, related to the lattice volume, that predict the residual stress on hard and flexible substrates with errors of 3% and 6%, respectively.
Artifact removal in the context of group ICA: a comparison of single-subject and group approaches
Du, Yuhui; Allen, Elena A.; He, Hao; Sui, Jing; Wu, Lei; Calhoun, Vince D.
2018-01-01
Independent component analysis (ICA) has been widely applied to identify intrinsic brain networks from fMRI data. Group ICA computes group-level components from all data and subsequently estimates individual-level components to recapture inter-subject variability. However, the best approach to handle artifacts, which may vary widely among subjects, is not yet clear. In this work, we study and compare two ICA approaches for artifacts removal. One approach, recommended in recent work by the Human Connectome Project, first performs ICA on individual subject data to remove artifacts, and then applies a group ICA on the cleaned data from all subjects. We refer to this approach as Individual ICA based artifacts Removal Plus Group ICA (IRPG). A second proposed approach, called Group Information Guided ICA (GIG-ICA), performs ICA on group data, then removes the group-level artifact components, and finally performs subject-specific ICAs using the group-level non-artifact components as spatial references. We used simulations to evaluate the two approaches with respect to the effects of data quality, data quantity, variable number of sources among subjects, and spatially unique artifacts. Resting-state test-retest datasets were also employed to investigate the reliability of functional networks. Results from simulations demonstrate GIG-ICA has greater performance compared to IRPG, even in the case when single-subject artifacts removal is perfect and when individual subjects have spatially unique artifacts. Experiments using test-retest data suggest that GIG-ICA provides more reliable functional networks. Based on high estimation accuracy, ease of implementation, and high reliability of functional networks, we find GIG-ICA to be a promising approach. PMID:26859308
Underway Recovery Test 6 (URT-6) - Day 4 Activities
2018-01-20
Astronaut Stephen Bowen checks out the choppy sea conditions as part of Underway Recovery Test 6 aboard the USS Anchorage. Bowen is an observer of the testing—giving an astronaut’s perspective to Kennedy Space Center’s NASA Recovery Team. The purpose of testing is to perfect recovery efforts to minimize impact to astronauts like Bowen once they have splashed down in the Pacific Ocean inside the Orion capsule.
Modified Perfect Harmonics Cancellation Control of a Grid Interfaced SPV Power Generation
NASA Astrophysics Data System (ADS)
Singh, B.; Shahani, D. T.; Verma, A. K.
2015-03-01
This paper deals with a grid-interfaced solar photovoltaic (SPV) power generating system with modified perfect harmonic cancellation (MPHC) control for power quality improvement in terms of mitigation of current harmonics, power factor correction, control of the point of common coupling (PCC) voltage with reactive power compensation, and load balancing in a three-phase distribution system. The proposed grid-interfaced SPV system consists of an SPV array, a dc-dc boost converter and a voltage source converter (VSC) used for the compensation of other connected linear and nonlinear loads at the PCC. The reference grid currents are estimated using the MPHC method, and control signals are derived by a pulse width modulation (PWM) current controller of the VSC. The SPV power is fed to the common dc bus of the VSC and the dc-dc boost converter using maximum power point tracking (MPPT). The dc link voltage of the VSC is regulated by a dc voltage proportional-integral (PI) controller. The analysis of the proposed SPV power generating system is carried out under dc/ac short circuit and severe SPV-SX and SPV-TX intrusion.
Micro-optics: enabling technology for illumination shaping in optical lithography
NASA Astrophysics Data System (ADS)
Voelkel, Reinhard
2014-03-01
Optical lithography has been the engine that has empowered the semiconductor industry to continually reduce the half-pitch for over 50 years. In early mask aligners a simple movie lamp was enough to illuminate the photomask. Illumination started to play a more decisive role when proximity mask aligners appeared in the mid-1970s. Off-axis illumination was introduced to reduce diffraction effects. For early projection lithography systems (wafer steppers), the only challenge was to collect the light efficiently to ensure short exposure time. When projection optics reached the highest level of perfection, further improvement was achieved by optimizing illumination. Shaping the illumination light, also referred to as pupil shaping, allows the optical path from reticle to wafer to be optimized and thus has a major impact on aberrations and diffraction effects. Highly efficient micro-optical components are perfectly suited for this task. Micro-optics for illumination evolved from simple flat-top (fly's-eye) to annular, dipole, quadrupole, multipole and freeform illumination. Today, programmable micro-mirror arrays allow illumination to be changed on the fly. The impact of refractive, diffractive and reflective micro-optics for photolithography will be discussed.
Dendritic platforms for biomimicry and biotechnological applications.
Nagpal, Kalpana; Mohan, Anand; Thakur, Sourav; Kumar, Pradeep
2018-02-15
Dendrimers, commonly referred to as polymeric trees, offer endless opportunities for biotechnological and biomedical applications. By controlling the type, length, and molecular weight of the core, branches and end groups, respectively, the chemical functionality and topology of dendrimeric archetypes can be customized which further can be applied to achieve required solubility, biodegradability, diagnosis and other applications. Given the physicochemical variability of the dendrimers and their hybrids, this review attempts to discuss a full spectrum of recent advances and strides made by these "perfectly designed structures". An extensive biotech/biomimicry application profiling of dendrimers is provided with focus on complex archetypical designs such as protein biomimicry (angiogenic inhibitors, regenerative hydroxyapatite and collagen) and biotechnology applications. In terms of biotechnological advances, dendrimers have provided distinctive advantages in the fields of biocatalysis, microbicides, artificial lights, mitochondrial function modulation, vaccines, tissue regeneration and repair, antigen carriers and even biosensors. In addition, this review provides overview of the extensive chemo-functionalization opportunities available with dendrimers which makes them a perfect candidate for forming drug conjugates, protein hybrids, bio mimics, lipidic derivatives, metal deposits and nanoconjugates thereby making them the most multifunctional platforms for diverse biotechnological applications.
The electronic transport properties of defected bilayer sliding armchair graphene nanoribbons
NASA Astrophysics Data System (ADS)
Mohammadi, Amin; Haji-Nasiri, Saeed
2018-04-01
By applying non-equilibrium Green's functions (NEGF) in combination with the tight-binding (TB) model, we investigate and compare the electronic transport properties of perfect and defected bilayer armchair graphene nanoribbons (BAGNRs) under finite bias. Two typical defects placed in the middle of the top layer (i.e., single vacancy (SV) and Stone-Wales (SW) defects) are examined. The results reveal that in both perfect and defected bilayers, the maximum current corresponds to the β-AB, AA and α-AB stacking orders, in that order, since the intermolecular interactions are stronger in them. Moreover, it is observed that an SV decreases the current in all stacking orders, but the effects of an SW defect are nearly unpredictable. In addition, we introduce a sequential switching behavior, and the effects of defects on the switching performance are studied as well. We found that an SW defect can significantly improve the switching behavior of a bilayer system. The transmission spectrum, band structure, molecular energy spectrum and molecular projected self-consistent Hamiltonian (MPSH) are subsequently analyzed to understand the electronic transport properties of these bilayer devices, which can be used in developing nano-scale bilayer systems.
NASA Astrophysics Data System (ADS)
Carrea, Dario; Abellan, Antonio; Humair, Florian; Matasci, Battista; Derron, Marc-Henri; Jaboyedoff, Michel
2016-03-01
Ground-based LiDAR has been traditionally used for surveying purposes via 3D point clouds. In addition to XYZ coordinates, an intensity value is also recorded by LiDAR devices. The intensity of the backscattered signal can be a significant source of information for various applications in geosciences. Previous attempts to account for the scattering of the laser signal are usually modelled using a perfect diffuse reflection. Nevertheless, experience on natural outcrops shows that rock surfaces do not behave as perfect diffuse reflectors. The geometry (or relief) of the scanned surfaces plays a major role in the recorded intensity values. Our study proposes a new terrestrial LiDAR intensity correction, which takes into consideration the range, the incidence angle and the geometry of the scanned surfaces. The proposed correction equation combines the classical radar equation for LiDAR with the bidirectional reflectance distribution function of the Oren-Nayar model. It is based on the idea that the surface geometry can be modelled by a relief of multiple micro-facets. This model is constrained by only one tuning parameter: the standard deviation of the slope angle distribution (σslope) of micro-facets. Firstly, a series of tests have been carried out in laboratory conditions on a 2 m2 board covered by black/white matte paper (perfect diffuse reflector) and scanned at different ranges and incidence angles. Secondly, other tests were carried out on rock blocks of different lithologies and surface conditions. Those tests demonstrated that the non-perfect diffuse reflectance of rock surfaces can be practically handled by the proposed correction method. Finally, the intensity correction method was applied to a real case study, with two scans of the carbonate rock outcrop of the Dents-du-Midi (Swiss Alps), to improve the lithological identification for geological mapping purposes. After correction, the intensity values are proportional to the intrinsic material reflectance and are independent from range, incidence angle and scanned surface geometry. The corrected intensity values significantly improve the material differentiation.
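As a rough illustration of the kind of correction described above (not the authors' exact formulation), the sketch below combines the 1/R² range dependence of the LiDAR radar equation with the Oren-Nayar reflectance factor evaluated in the monostatic geometry (illumination and observation directions coincide), leaving σslope as the single tuning parameter. The reference range and the example values are placeholders.

```python
import numpy as np

def oren_nayar_factor(incidence_angle_rad, sigma_slope_rad):
    """Oren-Nayar reflectance factor for the monostatic LiDAR geometry,
    where the incidence and observation directions coincide."""
    s2 = sigma_slope_rad ** 2
    A = 1.0 - 0.5 * s2 / (s2 + 0.33)
    B = 0.45 * s2 / (s2 + 0.09)
    theta = incidence_angle_rad
    # with theta_i = theta_r, the angular terms collapse to sin(theta) * tan(theta)
    return np.cos(theta) * (A + B * np.sin(theta) * np.tan(theta))

def corrected_intensity(raw_intensity, range_m, incidence_angle_rad,
                        sigma_slope_rad, reference_range_m=100.0):
    """Normalize raw intensity for range, incidence angle and surface roughness,
    so the result is proportional to the intrinsic material reflectance."""
    range_term = (range_m / reference_range_m) ** 2        # undo the 1/R^2 falloff
    angular_term = oren_nayar_factor(incidence_angle_rad, sigma_slope_rad)
    return raw_intensity * range_term / angular_term

# Placeholder example: the same surface seen at two ranges and incidence angles
print(corrected_intensity(1200.0, 150.0, np.radians(20), np.radians(25)))
print(corrected_intensity(480.0, 250.0, np.radians(50), np.radians(25)))
```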
Benz, Thomas; Lehmann, Susanne; Gantenbein, Andreas R; Sandor, Peter S; Stewart, Walter F; Elfering, Achim; Aeschlimann, André G; Angst, Felix
2018-03-09
The Migraine Disability Assessment (MIDAS) is a brief questionnaire and measures headache-related disability. This study aimed to translate and cross-culturally adapt the original English version of the MIDAS to German and to test its reliability. The standardized translation process followed international guidelines. The pre-final version was tested for clarity and comprehensibility by 34 headache sufferers. Test-retest reliability of the final version was quantified by 36 headache patients completing the MIDAS twice with an interval of 48 h. Reliability was determined by intraclass correlation coefficients and internal consistency by Cronbach's α. All steps of the translation process were followed, documented and approved by the developer of the MIDAS. The expert committee discussed in detail the complex phrasing of the questions that refer to one to another, especially exclusion of headache-days from one item to the next. The German version contains more active verb sentences and prefers the perfect to the imperfect tense. The MIDAS scales intraclass correlation coefficients ranged from 0.884 to 0.994 and was 0.991 (95% CI: 0.982-0.995) for the MIDAS total score. Cronbach's α for the MIDAS as a whole was 0.69 at test and 0.67 at retest. The translation process was challenged by the comprehensibility of the questionnaire. The German version of the MIDAS is a highly reliable instrument for assessing headache related disability with moderate internal consistency. Provided validity testing of the German MIDAS is successful, it can be recommended for use in clinical practice as well as in research.
Li, Qiuying; Pham, Hoang
2017-01-01
In this paper, we propose a software reliability model that considers not only error generation but also fault removal efficiency combined with testing coverage information, based on a nonhomogeneous Poisson process (NHPP). During the past four decades, many software reliability growth models (SRGMs) based on NHPP have been proposed to estimate software reliability measures, most of which share the following assumptions: 1) the fault detection rate commonly changes during the testing phase; 2) as a result of imperfect debugging, fault removal is associated with a fault re-introduction rate. However, few SRGMs in the literature differentiate between fault detection and fault removal, i.e., they seldom consider imperfect fault removal efficiency. In the practical software development process, fault removal efficiency cannot always be perfect: the failures detected might not be removed completely, the original faults might still exist, and new faults might be introduced in the meantime, which is referred to as the imperfect debugging phenomenon. In this study, a model that incorporates the fault introduction rate, fault removal efficiency and testing coverage into software reliability evaluation is developed, using testing coverage to express the fault detection rate and fault removal efficiency to account for fault repair. We compare the performance of the proposed model with several existing NHPP SRGMs on three sets of real failure data using five criteria. The results show that the model gives better fitting and predictive performance. PMID:28750091
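For readers unfamiliar with the NHPP framework the abstract refers to, the sketch below numerically integrates a generic imperfect-debugging mean-value function in which the detection rate is driven by a testing-coverage curve, only a fraction p of detected faults is actually removed, and new faults are introduced at rate α per removed fault. This is a generic illustration of the ingredients named in the abstract, not the specific model proposed by the authors; all parameter values are invented.

```python
import numpy as np

def mean_faults_detected(t_end, a0=100.0, p=0.9, alpha=0.05,
                         c_max=0.95, b=0.08, dt=0.01):
    """Numerically integrate a generic imperfect-debugging NHPP mean-value function.

    a0    : initial fault content
    p     : fault removal efficiency (fraction of detected faults actually removed)
    alpha : fault introduction rate per removed fault
    c(t)  : testing coverage, modelled here as c_max * (1 - exp(-b t))
    """
    t_grid = np.arange(0.0, t_end + dt, dt)
    m = 0.0                      # expected number of detected faults
    a = a0                       # current fault content
    history = []
    for t in t_grid:
        coverage = c_max * (1.0 - np.exp(-b * t))            # c(t)
        coverage_rate = c_max * b * np.exp(-b * t)            # c'(t)
        detection_rate = coverage_rate / (1.0 - coverage)     # coverage-driven rate
        dm = detection_rate * (a - p * m) * dt                # newly detected faults
        m += dm
        a += alpha * p * dm                                   # error generation
        history.append((t, m))
    return history

print(mean_faults_detected(50.0)[-1])   # expected detections after 50 time units
```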
Martín-Martín, G P; García-Armengol, J; Roig-Vila, J V; Espí-Macías, A; Martínez-Sanjuán, V; Mínguez-Pérez, M; Lorenzo-Liñán, M Á; Mulas-Fernández, C; González-Argenté, F X
2017-10-01
The aim of the present study was to evaluate the diagnostic accuracy of magnetic resonance (MR) defecography and compare it with videodefecography in the evaluation of obstructed defecation syndrome. This was a prospective cohort test accuracy study conducted at one major tertiary referral center on patients with a diagnosis of obstructed defecation syndrome who were referred to the colorectal surgery clinic in a consecutive series from 2009 to 2012. All patients underwent a clinical examination, videodefecography, and MR defecography in the supine position. We analyzed diagnostic accuracy for MR defecography and performed an agreement analysis using Cohen's kappa index (κ) for each diagnostic imaging examination performed with videodefecography and MR defecography. We included 40 patients with Rome III diagnostic criteria of obstructed defecation syndrome. The degree of agreement between the two tests was as follows: almost perfect for anismus (κ = 0.88) and rectal prolapse (κ = 0.83), substantial for enterocele (κ = 0.80) and rectocele grade III (κ = 0.65), moderate for intussusception (κ = 0.50) and rectocele grade II (κ = 0.49), and slight for rectocele grade I (κ = 0.30) and excessive perineal descent (κ = 0.22). Eighteen cystoceles and 11 colpoceles were diagnosed only by MR defecography. Most patients (54%) stated that videodefecography was the more uncomfortable test. MR defecography could become the imaging test of choice for evaluating obstructed defecation syndrome.
Relation between perception of vertical axis rotation and vestibulo-ocular reflex symmetry
NASA Technical Reports Server (NTRS)
Peterka, Robert J.; Benolken, Martha S.
1991-01-01
Subjects seated in a vertical axis rotation chair controlled their rotational velocity by adjusting a potentiometer. Their goal was to null out pseudorandom rotational perturbations in order to remain perceptually stationary. Most subjects showed a slow linear drift of velocity (a constant acceleration) to one side when they were deprived of an earth-fixed visual reference. The amplitude and direction of this drift can be considered a measure of a static bias in the subject's perception of rotation. The presence of a perceptual bias is consistent with a small, constant imbalance of vestibular function which could be of either central or peripheral origin. Deviations from perfect vestibulocular reflex (VOR) symmetry are also assumed to be related to imbalances in either peripheral or central vestibular function. Researchers looked for correlations between perceptual bias and various measures of vestibular reflex symmetry that might suggest a common source for both reflective and perceptual imbalances. No correlations were found. Measurement errors could not account for these results since repeated tests on the same subjects of both perceptual bias and VOR symmetry were well correlated.
At the Crossroads of Art and Science: A New Course for University Non-Science Majors
NASA Astrophysics Data System (ADS)
Blatt, S. Leslie
2004-03-01
How much did Seurat know about the physics, physiology, and perceptual science of color mixing when he began his experiments in pointillism? Did Vermeer have a camera obscura built into his studio to create the perfect perspective and luminous effects of his canvases? Early in the 20th century, consequences of the idea that "no single reference point is to be preferred above any other" were worked out in physics by Einstein (special and general relativity), in art by Picasso (early cubism), and in music by Schoenberg (12-tone compositions); did this same paradigm-shifting concept arise, in three disparate fields, merely by coincidence? We are developing a new course, aimed primarily at non-science majors, that addresses questions like these through a combination of hands-on experiments on the physics of light, investigations in visual perception, empirical tests of various drawing and painting techniques, and field trips to nearby museums. We will show a few examples of the kinds of art/science intersections our students will be exploring, and present a working outline for the course.
Breaking the indexing ambiguity in serial crystallography.
Brehm, Wolfgang; Diederichs, Kay
2014-01-01
In serial crystallography, a very incomplete partial data set is obtained from each diffraction experiment (a `snapshot'). In some space groups, an indexing ambiguity exists which requires that the indexing mode of each snapshot needs to be established with respect to a reference data set. In the absence of such re-indexing information, crystallographers have thus far resorted to a straight merging of all snapshots, yielding a perfectly twinned data set of higher symmetry which is poorly suited for structure solution and refinement. Here, two algorithms have been designed for assembling complete data sets by clustering those snapshots that are indexed in the same way, and they have been tested using 15,445 snapshots from photosystem I [Chapman et al. (2011), Nature (London), 470, 73-77] and with noisy model data. The results of the clustering are unambiguous and enabled the construction of complete data sets in the correct space group P63 instead of (twinned) P6322 that researchers have been forced to use previously in such cases of indexing ambiguity. The algorithms thus extend the applicability and reach of serial crystallography.
Interferometric surface mapping with variable sensitivity.
Jaerisch, W; Makosch, G
1978-03-01
In the photolithographic process, presently employed for the production of integrated circuits, sets of correlated masks are used for exposing the photoresist on silicon wafers. Various sets of masks, which are printed in different printing tools, must be aligned correctly with respect to the structures produced on the wafer in previous process steps. Even when perfect alignment is considered, displacements and distortions of the printed wafer patterns occur. They are caused by imperfections of the printing tools and/or wafer deformations resulting from high-temperature processes. Since the electrical properties of the final integrated circuits, and therefore the manufacturing yield, depend to a great extent on the precision with which such patterns are superimposed, simple and fast overlay and flatness measurements are very important in IC manufacturing. A simple optical interference method for flatness measurements will be described which can be used under manufacturing conditions. This method permits testing of surface height variations at nearly grazing light incidence in the absence of a physical reference plane. It can be applied to polished surfaces and rough surfaces as well.
NASA Astrophysics Data System (ADS)
Harris, Brent; Fields, Shelby S.; Neill, Justin L.; Pulliam, Robin; Muckle, Matt; Pate, Brooks
2016-06-01
Recent advances in Fourier transform millimeter-wave spectroscopy techniques have renewed the application reach of molecular rotational spectroscopy for analytical chemistry. We present a sampling method for sub-ppm analysis of low-volatility impurities by thermal evolution from solid powders, using a millimeter-wave Fourier transform molecular rotational resonance (FT-MRR) spectrometer for detection. This application of FT-MRR is relevant to the manufacturing of safe oral pharmaceuticals. Low-volatility impurities can be challenging to detect at 1 ppm levels with chromatographic techniques. One such example of a potentially mutagenic impurity is acetamide (v.p. 1 Torr at 40 °C, m.p. 80 °C). We measured the pure reference spectrum of acetamide by flowing the sublimated vapor of acetamide crystals through the FT-MRR spectrometer. The spectrometer lower detection level (LDL) for a broadband (> 20 GHz, 10 min.) spectrum is 300 nTorr, 30 pmol, or 2 ng. For a 50 mg powder, perfect sample transfer efficiency can yield a w/w % detection limit of 35 ppb. We extended the sampling method for the acetamide reference measurement to an acetaminophen sample spiked with 5000 ppm acetamide in order to test the sample transfer efficiency when liberated from a pharmaceutical powder. A spectral reference matching algorithm detected the presence of several impurities, including acetaldehyde, acetic acid, and acetonitrile, that evolved at the melting point of acetaminophen, demonstrating the capability of FT-MRR for identification without a routine chemical standard. The method detection limit (MDL) without further development is less than 10 ppm w/w %. Resolved FT-MRR mixture spectra will be presented with a description of sampling methods.
A system to measure the data quality of spectral remote-sensing reflectance of aquatic environments
NASA Astrophysics Data System (ADS)
Wei, Jianwei; Lee, Zhongping; Shang, Shaoling
2016-11-01
Spectral remote-sensing reflectance (Rrs, sr-1) is the key for ocean color retrieval of water bio-optical properties. Since Rrs from in situ and satellite systems are subject to errors or artifacts, assessment of the quality of Rrs data is critical. From a large collection of high quality in situ hyperspectral Rrs data sets, we developed a novel quality assurance (QA) system that can be used to objectively evaluate the quality of an individual Rrs spectrum. This QA scheme consists of a unique Rrs spectral reference and a score metric. The reference system includes Rrs spectra of 23 optical water types ranging from purple blue to yellow waters, with an upper and a lower bound defined for each water type. The scoring system is to compare any target Rrs spectrum with the reference and a score between 0 and 1 will be assigned to the target spectrum, with 1 for perfect Rrs spectrum and 0 for unusable Rrs spectrum. The effectiveness of this QA system is evaluated with both synthetic and in situ Rrs spectra and it is found to be robust. Further testing is performed with the NOMAD data set as well as with satellite Rrs over coastal and oceanic waters, where questionable or likely erroneous Rrs spectra are shown to be well identifiable with this QA system. Our results suggest that applications of this QA system to in situ data sets can improve the development and validation of bio-optical algorithms and its application to ocean color satellite data can improve the short-term and long-term products by objectively excluding questionable Rrs data.
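A stripped-down version of the scoring idea described above can be sketched as follows: normalize the target spectrum, pick the closest of the reference water-type spectra, and return the fraction of wavelengths at which the normalized target falls inside that type's upper/lower bounds. The tiny three-type "reference" used here is invented purely to make the sketch runnable; the actual system uses 23 optical water types derived from a large in situ data set.

```python
import numpy as np

def qa_score(rrs_target, reference_types):
    """Score an Rrs spectrum against a library of water-type references.

    rrs_target      : 1-D array of Rrs values at the reference wavelengths
    reference_types : list of dicts with 'mean', 'upper', 'lower' normalized spectra
    Returns (score in [0, 1], index of the matched water type).
    """
    # Normalize so that only spectral shape matters
    target = rrs_target / np.sqrt(np.sum(rrs_target ** 2))

    # Closest water type by Euclidean distance between normalized shapes
    distances = [np.linalg.norm(target - wt["mean"]) for wt in reference_types]
    k = int(np.argmin(distances))

    # Score = fraction of wavelengths falling within the matched type's bounds
    wt = reference_types[k]
    inside = (target >= wt["lower"]) & (target <= wt["upper"])
    return inside.mean(), k

# Invented three-type reference at five wavelengths, for illustration only
def make_type(shape):
    shape = np.asarray(shape, dtype=float)
    shape /= np.sqrt(np.sum(shape ** 2))
    return {"mean": shape, "upper": shape * 1.15, "lower": shape * 0.85}

reference = [make_type(s) for s in ([8, 6, 4, 2, 1],    # blue water
                                    [4, 5, 5, 3, 2],    # green water
                                    [1, 2, 3, 4, 4])]   # turbid water
print(qa_score(np.array([7.5, 6.2, 4.1, 2.2, 0.9]), reference))
```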
Epitaxial Growth of LuAG:Ce and LuAG:Ce,Pr Films and Their Scintillation Properties
NASA Astrophysics Data System (ADS)
Douissard, Paul-Antoine; Martin, Thierry; Riva, Federica; Zorenko, Yuriy; Zorenko, Tetiana; Paprocki, Kazimierz; Fedorov, Alexander; Bilski, Pawel; Twardak, Anna
2016-06-01
We performed the growth by Liquid Phase Epitaxy (LPE) of Ce and Ce-Pr doped Lu3Al5O12 (LuAG) Single Crystalline Films (SCFs) onto LuAG and Y3Al5O12 (YAG) substrates. The structural properties of LuAG:Ce and LuAG:Ce,Pr SCFs were examined by X-ray diffraction. The optical properties of the SCFs were studied through cathodoluminescence (CL) spectra, scintillation Light Yield (LY), decay kinetic under α-particle (Pu239) excitation, X-ray excited luminescence, thermostimulated luminescence (TSL) and afterglow measurements. The SCFs grown on LuAG substrates displayed good surface quality and structural perfection, whereas the SCFs grown on YAG substrates showed a rough surface and poorer crystalline quality, due to a large lattice mismatch between the film and the substrate (0.82%). Under α-particle excitation, the LY of LuAG:Ce SCF exceeded by 2 times that of the best YAG:Ce SCF sample used as reference. Under X-ray excitation, the LuAG:Ce SCF with optimized Ce concentration showed LY close (77%) to a reference YAG:Ce Single Crystal (SC) scintillator. The afterglow of LuAG:Ce and LuAG:Ce,Pr SCFs was lower (by 1 decade) than that of the tested reference LuAG:Ce SC. However there is not a complete suppression of the afterglow at room temperature (RT), despite the fact that the SCFs present much lower concentration of antisite and vacancy type defects than their SC counterparts. This can be explained by the presence in the films of other trap centers responsible for TSL above RT.
The perfect family: decision making in biparental care.
Akçay, Erol; Roughgarden, Joan
2009-10-13
Previous theoretical work on parental decisions in biparental care has emphasized the role of the conflict between evolutionary interests of parents in these decisions. A prominent prediction from this work is that parents should compensate for decreases in each other's effort, but only partially so. However, experimental tests that manipulate parents and measure their responses fail to confirm this prediction. At the same time, the process of parental decision making has remained unexplored theoretically. We develop a model to address the discrepancy between experiments and the theoretical prediction, and explore how assuming different decision making processes changes the prediction from the theory. We assume that parents make decisions in behavioral time. They have a fixed time budget, and allocate it between two parental tasks: provisioning the offspring and defending the nest. The proximate determinant of the allocation decisions are parents' behavioral objectives. We assume both parents aim to maximize the offspring production from the nest. Experimental manipulations change the shape of the nest production function. We consider two different scenarios for how parents make decisions: one where parents communicate with each other and act together (the perfect family), and one where they do not communicate, and act independently (the almost perfect family). The perfect family model is able to generate all the types of responses seen in experimental studies. The kind of response predicted depends on the nest production function, i.e. how parents' allocations affect offspring production, and the type of experimental manipulation. In particular, we find that complementarity of parents' allocations promotes matching responses. In contrast, the relative responses do not depend on the type of manipulation in the almost perfect family model. These results highlight the importance of the interaction between nest production function and how parents make decisions, factors that have largely been overlooked in previous models.
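To make the two decision-making scenarios concrete, the sketch below uses an assumed Cobb-Douglas-style nest production function F(provisioning, defense) and a unit time budget per parent. The "perfect family" jointly maximizes F over both parents' allocations; the "almost perfect family" iterates best responses, each parent optimizing its own allocation while holding the other's fixed. The production function and all numbers are placeholders, not those of the published model.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def nest_production(p_total, d_total):
    """Assumed production function: offspring benefit from both total provisioning
    and total nest defense (placeholder, not the published model's F)."""
    return (p_total ** 0.6) * (d_total ** 0.4)

def best_response(other_provisioning, budget=1.0):
    """One parent's optimal split of its time budget, given the other's allocation."""
    def negative_output(p):
        d = budget - p                              # remaining time goes to defense
        return -nest_production(other_provisioning + p,
                                (budget - other_provisioning) + d)
    res = minimize_scalar(negative_output, bounds=(0.0, budget), method="bounded")
    return res.x

# 'Almost perfect family': iterate best responses until allocations settle
p1, p2 = 0.5, 0.5
for _ in range(50):
    p1 = best_response(p2)
    p2 = best_response(p1)
print("independent parents:", round(p1, 3), round(p2, 3))

# 'Perfect family': jointly choose a common provisioning share to maximize F
def joint_negative_output(p):
    return -nest_production(2 * p, 2 * (1.0 - p))
joint = minimize_scalar(joint_negative_output, bounds=(0.0, 1.0), method="bounded")
print("communicating parents:", round(joint.x, 3))
```

Because both parents share the same objective in this sketch, the independent and joint solutions reach the same total output and differ only in how provisioning is split between the parents; in the published model the interesting differences between the two scenarios emerge in how they respond to experimental manipulations of the production function.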
Lupia, Rodgers; Wabuyia, Peter B; Otiato, Peter; Fang, Chi-Tai; Tsai, Feng-Jen
2017-12-01
This study aimed to evaluate the association between highly active antiretroviral therapy (HAART) adherence and development of Kaposi's sarcoma (KS) in human immunodeficiency virus (HIV)/AIDS patients. We conducted a retrospective nested case-control study of 165 participants (33 cases and 132 controls) receiving HAART care at Maseno Hospital, Kenya, from January 2005 to October 2013. Cases were HIV-positive adults with KS, who were matched with controls in a ratio of 1:4 based on age (±5 years of each case), sex, and KS diagnosis date. Perfect adherence to HAART was assessed at every clinic visit by patients' self-reporting and pill counts. Chi-square tests were performed to compare socioeconomic and clinical statuses between cases and controls. A conditional logistic regression was used to assess the effects of perfect adherence to HAART, the latest CD4 count, education level, distance to health-care facility, initial World Health Organization stage, and number of regular sexual partners on the development of KS. Only 63.6% of participants reported perfect adherence, and the control group had a significantly higher percentage of perfect adherence (75.0%) than did cases (18.2%). After adjustment for potential imbalances in the baseline and clinical characteristics, patients with imperfect HAART adherence had a 20-times greater risk of developing KS than patients with perfect HAART adherence [hazard ratio (HR): 21.0, 95% confidence interval (CI): 4.2-105.1]. Patients with a low latest CD4 count (≤350 cells/mm³) had a seven-times greater risk of developing KS than did their counterparts (HR: 7.1, 95% CI: 1.4-36.2). Imperfect HAART adherence and low latest CD4 count are significantly associated with KS development. Copyright © 2015. Published by Elsevier B.V.
Multipotent mesenchymal stromal cells: A promising strategy to manage alcoholic liver disease
Ezquer, Fernando; Bruna, Flavia; Calligaris, Sebastián; Conget, Paulette; Ezquer, Marcelo
2016-01-01
Chronic alcohol consumption is a major cause of liver disease. The term alcoholic liver disease (ALD) refers to a spectrum of mild to severe disorders including steatosis, steatohepatitis, cirrhosis, and hepatocellular carcinoma. With limited therapeutic options, stem cell therapy offers significant potential for these patients. In this article, we review the pathophysiologic features of ALD and the therapeutic mechanisms of multipotent mesenchymal stromal cells, also referred to as mesenchymal stem cells (MSCs), based on their potential to differentiate into hepatocytes, their immunomodulatory properties, their potential to promote residual hepatocyte regeneration, and their capacity to inhibit hepatic stellate cells. The perfect match between ALD pathogenesis and MSC therapeutic mechanisms, together with encouraging, available preclinical data, allow us to support the notion that MSC transplantation is a promising therapeutic strategy to manage ALD onset and progression. PMID:26755858
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raudsepp, E.
A test is given to determine if an engineer suffers from one of the three barriers to technical success: fear of success, fear of failure, or perfectionism. As in most such tests, the middle way is best. Successful engineers know that perfection cannot be attained, that they don't have time to worry about failure or success, and that by aiming at and persevering in doing things well, success can be achieved.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-04
... WordPerfect, Microsoft Word, PDF, or ASCII file format, and avoid the use of special characters or any... furnaces and boilers is found at 10 CFR 430.23(n) and 10 CFR part 430, subpart B, appendix N, Uniform Test... such as fuel calorific value, weight of condensate, water flow and temperature, voltage, and flue gas...
Nudge for (the Public) Good: How Defaults Can Affect Cooperation
Fosgaard, Toke R.; Piovesan, Marco
2015-01-01
In this paper we test the effect of non-binding defaults on the level of contribution to a public good. We manipulate the default numbers appearing on the decision screen to nudge subjects toward a free-rider strategy or a perfect conditional cooperator strategy. Our results show that the vast majority of our subjects did not adopt the default numbers, but their stated strategy was affected by the default. Moreover, we find that our manipulation spilled over to a subsequent repeated public goods game where default was not manipulated. Here we found that subjects who previously saw the free rider default were significantly less cooperative than those who saw the perfect conditional cooperator default. PMID:26717569
Nudge for (the Public) Good: How Defaults Can Affect Cooperation.
Fosgaard, Toke R; Piovesan, Marco
2015-01-01
In this paper we test the effect of non-binding defaults on the level of contribution to a public good. We manipulate the default numbers appearing on the decision screen to nudge subjects toward a free-rider strategy or a perfect conditional cooperator strategy. Our results show that the vast majority of our subjects did not adopt the default numbers, but their stated strategy was affected by the default. Moreover, we find that our manipulation spilled over to a subsequent repeated public goods game where default was not manipulated. Here we found that subjects who previously saw the free rider default were significantly less cooperative than those who saw the perfect conditional cooperator default.
NASA Technical Reports Server (NTRS)
Bittker, David A.
1996-01-01
A generalized version of the NASA Lewis general kinetics code LSENS, called GLSENS, is described. The new code allows the use of global reactions as well as molecular processes in a chemical mechanism. The code also incorporates the capability of performing sensitivity analysis calculations for a perfectly stirred reactor rapidly and conveniently at the same time that the main kinetics calculations are being done. The GLSENS code has been extensively tested and has been found to be accurate and efficient. Nine example problems are presented and complete user instructions are given for the new capabilities. This report is to be used in conjunction with the documentation for the original LSENS code.
Polarization sensitivity testing of off-plane reflection gratings
NASA Astrophysics Data System (ADS)
Marlowe, Hannah; McEntaffer, Randal L.; DeRoo, Casey T.; Miles, Drew M.; Tutt, James H.; Laubis, Christian; Soltwisch, Victor
2015-09-01
Off-Plane reflection gratings were previously predicted to have different efficiencies when the incident light is polarized in the transverse-magnetic (TM) versus transverse-electric (TE) orientations with respect to the grating grooves. However, more recent theoretical calculations which rigorously account for finitely conducting, rather than perfectly conducting, grating materials no longer predict significant polarization sensitivity. We present the first empirical results for radially ruled, laminar groove profile gratings in the off-plane mount which demonstrate no difference in TM versus TE efficiency across our entire 300-1500 eV bandpass. These measurements together with the recent theoretical results confirm that grazing incidence off-plane reflection gratings using real, not perfectly conducting, materials are not polarization sensitive.
ERIC Educational Resources Information Center
Scott, Paul
2007-01-01
In "Just Perfect: Part 1," the author defined a perfect number N to be one for which the sum of the divisors d (1 less than or equal to d less than N) is N. He gave the first few perfect numbers, starting with those known by the early Greeks. In this article, the author provides an extended list of perfect numbers, with some comments about their…
Locating and parsing bibliographic references in HTML medical articles
Zou, Jie; Le, Daniel; Thoma, George R.
2010-01-01
The set of references that typically appear toward the end of journal articles is sometimes, though not always, a field in bibliographic (citation) databases. But even if references do not constitute such a field, they can be useful as a preprocessing step in the automated extraction of other bibliographic data from articles, as well as in computer-assisted indexing of articles. Automation in data extraction and indexing to minimize human labor is key to the affordable creation and maintenance of large bibliographic databases. Extracting the components of references, such as author names, article title, journal name, publication date and other entities, is therefore a valuable and sometimes necessary task. This paper describes a two-step process using statistical machine learning algorithms, to first locate the references in HTML medical articles and then to parse them. Reference locating identifies the reference section in an article and then decomposes it into individual references. We formulate this step as a two-class classification problem based on text and geometric features. An evaluation conducted on 500 articles drawn from 100 medical journals achieves near-perfect precision and recall rates for locating references. Reference parsing identifies the components of each reference. For this second step, we implement and compare two algorithms. One relies on sequence statistics and trains a Conditional Random Field. The other focuses on local feature statistics and trains a Support Vector Machine to classify each individual word, followed by a search algorithm that systematically corrects low confidence labels if the label sequence violates a set of predefined rules. The overall performance of these two reference-parsing algorithms is about the same: above 99% accuracy at the word level, and over 97% accuracy at the chunk level. PMID:20640222
Locating and parsing bibliographic references in HTML medical articles.
Zou, Jie; Le, Daniel; Thoma, George R
2010-06-01
The set of references that typically appear toward the end of journal articles is sometimes, though not always, a field in bibliographic (citation) databases. But even if references do not constitute such a field, they can be useful as a preprocessing step in the automated extraction of other bibliographic data from articles, as well as in computer-assisted indexing of articles. Automation in data extraction and indexing to minimize human labor is key to the affordable creation and maintenance of large bibliographic databases. Extracting the components of references, such as author names, article title, journal name, publication date and other entities, is therefore a valuable and sometimes necessary task. This paper describes a two-step process using statistical machine learning algorithms, to first locate the references in HTML medical articles and then to parse them. Reference locating identifies the reference section in an article and then decomposes it into individual references. We formulate this step as a two-class classification problem based on text and geometric features. An evaluation conducted on 500 articles drawn from 100 medical journals achieves near-perfect precision and recall rates for locating references. Reference parsing identifies the components of each reference. For this second step, we implement and compare two algorithms. One relies on sequence statistics and trains a Conditional Random Field. The other focuses on local feature statistics and trains a Support Vector Machine to classify each individual word, followed by a search algorithm that systematically corrects low confidence labels if the label sequence violates a set of predefined rules. The overall performance of these two reference-parsing algorithms is about the same: above 99% accuracy at the word level, and over 97% accuracy at the chunk level.
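The word-level classification step described above can be sketched compactly. The example below is a minimal sketch, not the authors' actual system: each token of a reference string is turned into a small feature dictionary and a linear SVM assigns chunk labels. The feature set, label names, and toy training data are invented for illustration; the CRF variant and the rule-based correction search are not reproduced.

```python
# Minimal sketch of word-level reference parsing with an SVM.
# Labels, features and training data are hypothetical illustrations,
# not the feature set or corpus used in the paper.
from sklearn.feature_extraction import DictVectorizer
from sklearn.svm import LinearSVC


def token_features(tokens, i):
    """Local features for token i of a tokenized reference string."""
    tok = tokens[i]
    return {
        "lower": tok.lower(),
        "is_capitalized": tok[:1].isupper(),
        "is_digit": tok.isdigit(),
        "looks_like_year": tok.isdigit() and 1800 <= int(tok) <= 2100,
        "has_period": "." in tok,
        "rel_position": i / len(tokens),
        "prev": tokens[i - 1].lower() if i > 0 else "<START>",
        "next": tokens[i + 1].lower() if i + 1 < len(tokens) else "<END>",
    }


# Toy training data: (tokenized reference, per-token chunk labels).
train = [
    (["Zou", "J", "Locating", "references", "2010"],
     ["AUTHOR", "AUTHOR", "TITLE", "TITLE", "YEAR"]),
    (["Smith", "A", "Parsing", "citations", "2008"],
     ["AUTHOR", "AUTHOR", "TITLE", "TITLE", "YEAR"]),
]

X, y = [], []
for tokens, labels in train:
    X.extend(token_features(tokens, i) for i in range(len(tokens)))
    y.extend(labels)

vec = DictVectorizer()
clf = LinearSVC()
clf.fit(vec.fit_transform(X), y)

test_tokens = ["Doe", "J", "Indexing", "articles", "2009"]
X_test = vec.transform([token_features(test_tokens, i) for i in range(len(test_tokens))])
print(list(zip(test_tokens, clf.predict(X_test))))
```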
Generating perfect fluid spheres in general relativity
NASA Astrophysics Data System (ADS)
Boonserm, Petarpa; Visser, Matt; Weinfurtner, Silke
2005-06-01
Ever since Karl Schwarzschild’s 1916 discovery of the spacetime geometry describing the interior of a particular idealized general relativistic star—a static spherically symmetric blob of fluid with position-independent density—the general relativity community has continued to devote considerable time and energy to understanding the general-relativistic static perfect fluid sphere. Over the last 90 years a tangle of specific perfect fluid spheres has been discovered, with most of these specific examples seemingly independent from each other. To bring some order to this collection, in this article we develop several new transformation theorems that map perfect fluid spheres into perfect fluid spheres. These transformation theorems sometimes lead to unexpected connections between previously known perfect fluid spheres, sometimes lead to new previously unknown perfect fluid spheres, and in general can be used to develop a systematic way of classifying the set of all perfect fluid spheres.
Thorne, John C; Coggins, Truman E; Carmichael Olson, Heather; Astley, Susan J
2007-04-01
To evaluate classification accuracy and clinical feasibility of a narrative analysis tool for identifying children with a fetal alcohol spectrum disorder (FASD). Picture-elicited narratives generated by 16 age-matched pairs of school-aged children (FASD vs. typical development [TD]) were coded for semantic elaboration and reference strategy by judges who were unaware of age, gender, and group membership of the participants. Receiver operating characteristic (ROC) curves were used to examine the classification accuracy of the resulting set of narrative measures for making 2 classifications: (a) for the 16 children diagnosed with FASD, low performance (n = 7) versus average performance (n = 9) on a standardized expressive language task and (b) FASD (n = 16) versus TD (n = 16). Combining the rates of semantic elaboration and pragmatically inappropriate reference perfectly matched a classification based on performance on the standardized language task. More importantly, the rate of ambiguous nominal reference was highly accurate in classifying children with an FASD regardless of their performance on the standardized language task (area under the ROC curve = .863, confidence interval = .736-.991). Results support further study of the diagnostic utility of narrative analysis using discourse level measures of elaboration and children's strategic use of reference.
Vermeeren, G; Gosselin, M C; Kühn, S; Kellerman, V; Hadjem, A; Gati, A; Joseph, W; Wiart, J; Meyer, F; Kuster, N; Martens, L
2010-09-21
The environment is an important parameter when evaluating the exposure to radio-frequency electromagnetic fields. This study numerically investigates the variation of the whole-body and peak spatially averaged specific absorption rate (SAR) in the heterogeneous virtual family male placed in front of a base station antenna in a reflective environment. The SAR values in a reflective environment are also compared to the values obtained when no environment is present (free space). The virtual family male has been placed at four distances (30 cm, 1 m, 3 m and 10 m) in front of six base station antennas (operating at 300 MHz, 450 MHz, 900 MHz, 2.1 GHz, 3.5 GHz and 5.0 GHz, respectively) and in three reflective environments (a perfectly conducting wall, a perfectly conducting ground and a perfectly conducting ground + wall). A total of 72 configurations are examined. The absorption in the heterogeneous body model is determined using the 3D electromagnetic (EM) finite-difference time-domain (FDTD) solver Semcad-X. For the larger simulations, requirements in terms of computer resources are reduced by using a generalized Huygens' box approach. It has been observed that the ratio of the SAR in the virtual family male in a reflective environment to the SAR in the virtual family male in the free-space environment ranged from -8.7 dB up to 8.0 dB. A worst-case reflective environment could not be determined. The ICNIRP reference levels were not always shown to be compliant with the basic restrictions.
Kapil, Aditi; Rai, Piyush Kant; Shanker, Asheesh
2014-01-01
Simple sequence repeats (SSRs) are regions in DNA sequence that contain repeating motifs of length 1–6 nucleotides. These repeats are ubiquitously present and are found in both coding and non-coding regions of the genome. A total of 534 complete chloroplast genome sequences (as of 18 September 2014) of Viridiplantae are available at the NCBI organelle genome resource. This provides an opportunity to mine these genomes for the detection of SSRs and store them in the form of a database. In an attempt to properly manage and retrieve chloroplastic SSRs, we designed ChloroSSRdb, which is a relational database developed using SQL Server 2008 and accessed through ASP.NET. It provides information on all three types (perfect, imperfect and compound) of SSRs. At present, ChloroSSRdb contains 124 430 mined SSRs, with the majority lying in non-coding regions. Of these, PCR primers were designed for 118 249 SSRs. Tetranucleotide repeats (47 079) were found to be the most frequent repeat type, whereas hexanucleotide repeats (6414) were the least abundant. Additionally, for each species, statistical analyses were performed to calculate relative frequency, correlation coefficient and chi-square statistics of perfect and imperfect SSRs. In accordance with the growing interest in SSR studies, ChloroSSRdb will prove to be a useful resource in developing genetic markers, phylogenetic analysis, genetic mapping, etc. Moreover, it will serve as a ready reference for mined SSRs in available chloroplast genomes of green plants. Database URL: www.compubio.in/chlorossrdb/ PMID:25380781
Kapil, Aditi; Rai, Piyush Kant; Shanker, Asheesh
2014-01-01
Simple sequence repeats (SSRs) are regions in DNA sequence that contain repeating motifs of length 1-6 nucleotides. These repeats are ubiquitously present and are found in both coding and non-coding regions of the genome. A total of 534 complete chloroplast genome sequences (as of 18 September 2014) of Viridiplantae are available at the NCBI organelle genome resource. This provides an opportunity to mine these genomes for the detection of SSRs and store them in the form of a database. In an attempt to properly manage and retrieve chloroplastic SSRs, we designed ChloroSSRdb, which is a relational database developed using SQL Server 2008 and accessed through ASP.NET. It provides information on all three types (perfect, imperfect and compound) of SSRs. At present, ChloroSSRdb contains 124 430 mined SSRs, with the majority lying in non-coding regions. Of these, PCR primers were designed for 118 249 SSRs. Tetranucleotide repeats (47 079) were found to be the most frequent repeat type, whereas hexanucleotide repeats (6414) were the least abundant. Additionally, for each species, statistical analyses were performed to calculate relative frequency, correlation coefficient and chi-square statistics of perfect and imperfect SSRs. In accordance with the growing interest in SSR studies, ChloroSSRdb will prove to be a useful resource in developing genetic markers, phylogenetic analysis, genetic mapping, etc. Moreover, it will serve as a ready reference for mined SSRs in available chloroplast genomes of green plants. Database URL: www.compubio.in/chlorossrdb/ © The Author(s) 2014. Published by Oxford University Press.
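The perfect-SSR detection step lends itself to a short illustration. The sketch below scans a sequence for perfect repeats of motif lengths 1-6 nt with a regular expression; the minimum repeat counts and the scanning strategy are illustrative assumptions, not the pipeline actually used to build ChloroSSRdb.

```python
# Minimal sketch of perfect-SSR detection for motif lengths 1-6 nt.
# Thresholds and the scanning strategy are illustrative assumptions,
# not the pipeline used to build ChloroSSRdb.
import re

# Minimum number of repeat units commonly required per motif length (assumed).
MIN_REPEATS = {1: 10, 2: 6, 3: 5, 4: 5, 5: 5, 6: 5}


def find_perfect_ssrs(seq):
    """Return (start, motif, repeat_count) tuples for perfect SSRs in seq."""
    seq = seq.upper()
    hits = []
    for motif_len, min_rep in MIN_REPEATS.items():
        # A motif of length motif_len repeated at least min_rep times in a row.
        pattern = re.compile(r"(([ACGT]{%d})\2{%d,})" % (motif_len, min_rep - 1))
        for m in pattern.finditer(seq):
            motif = m.group(2)
            # Skip motifs that are themselves repeats of a shorter unit (e.g. "ATAT").
            if any(motif == motif[:d] * (motif_len // d)
                   for d in range(1, motif_len) if motif_len % d == 0):
                continue
            hits.append((m.start(), motif, len(m.group(1)) // motif_len))
    return hits


print(find_perfect_ssrs("ggcATATATATATATcgTTAGGTTAGGTTAGGTTAGGTTAGGtt"))
```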
Medical and psychosocial associates of nonadherence in adolescents with cancer.
Hullmann, Stephanie E; Brumley, Lauren D; Schwartz, Lisa A
2015-01-01
The current study examined adherence to medication regimens among adolescents with cancer by applying the Pediatric Self-Management Model. Adolescents and their parents reported on adherence to medication, reasons for nonadherence, and patient-, family-, and community-level psychosocial variables. Adolescent- and parent-reported adherence were significantly correlated, with about half of the sample reporting perfect adherence. The majority reported "just forgot" as the most common reason for missed medication. Patient-, family-, and community-level variables were examined as predictors of adherence. With regard to individual factors, adolescents who endorsed perfect adherence reported a greater proportion of future-orientated goals and spent fewer days in outpatient clinic visits. For family factors, adolescents who endorsed perfect adherence reported greater social support from their family and were more likely to have a second caregiver who they perceived as overprotective. The community-level variable (social support from friends) tested did not emerge as a predictor of adherence. The results of this study provide direction for intervention efforts to target adolescent goals and family support in order to increase adolescent adherence to cancer treatment regimens. © 2014 by Association of Pediatric Hematology/Oncology Nurses.
Medical and Psychosocial Associates of Nonadherence in Adolescents With Cancer
Hullmann, Stephanie E.; Brumley, Lauren D.; Schwartz, Lisa A.
2015-01-01
The current study examined adherence to medication regimens among adolescents with cancer by applying the Pediatric Self-Management Model. Adolescents and their parents reported on adherence to medication, reasons for nonadherence, and patient-, family-, and community-level psychosocial variables. Adolescent- and parent-reported adherence were significantly correlated, with about half of the sample reporting perfect adherence. The majority reported “just forgot” as the most common reason for missed medication. Patient-, family-, and community-level variables were examined as predictors of adherence. With regard to individual factors, adolescents who endorsed perfect adherence reported a greater proportion of future-orientated goals and spent fewer days in outpatient clinic visits. For family factors, adolescents who endorsed perfect adherence reported greater social support from their family and were more likely to have a second caregiver who they perceived as overprotective. The community-level variable (social support from friends) tested did not emerge as a predictor of adherence. The results of this study provide direction for intervention efforts to target adolescent goals and family support in order to increase adolescent adherence to cancer treatment regimens. PMID:25366574
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prabhakaran, SP.; Babu, R. Ramesh, E-mail: rampap2k@yahoo.co.in; Velusamy, P.
2011-11-15
Highlights: Growth of a bulk single crystal of 8-hydroxyquinoline (8-HQ) by the vertical Bridgman technique for the first time. The crystalline perfection is reasonably good. The photoluminescence spectrum shows that the material is suitable for blue light emission. Abstract: A single crystal of the organic nonlinear optical material 8-hydroxyquinoline (8-HQ), of dimensions 52 mm (length) x 12 mm (dia.), was grown from the melt using the vertical Bridgman technique. The crystal system of the material was confirmed by powder X-ray diffraction analysis. The crystalline perfection of the grown crystal was examined by a high-resolution X-ray diffraction study. The low angular spread (around 400'') of the diffraction curve and the low full width at half maximum values show that the crystalline perfection is reasonably good. The recorded photoluminescence spectrum shows that the material is suitable for blue light emission. Optical transmittance in the UV and visible regions was measured, and mechanical strength was estimated from the Vickers microhardness test along the growth face of the grown crystal.
The Sagittarius tidal stream as a gravitational experiment in the Milky Way
NASA Astrophysics Data System (ADS)
Thomas, G. F.; Famaey, B.; Ibata, R.; Lüghausen, F.; Kroupa, P.
2015-12-01
Modified Newtonian Dynamics (MOND, or Milgromian dynamics) gives a successful description of many galaxy properties that are hard to understand in the classical framework. The rotation curves of spiral galaxies are, for instance, perfectly reproduced and understood within this framework. Nevertheless, rotation curves only trace the potential in the galactic plane, and it is thus useful to test the shape of the potential outside the plane. Here we use the Sagittarius tidal stream as a gravitational experiment in the Milky Way, in order to check whether MOND can explain both its characteristics and those of the remnant dwarf spheroidal galaxy progenitor. We show that a MOND model of the Sagittarius stream can perfectly reproduce the observed positions of stars in the stream and, even more strikingly, the observed properties of the remnant. Nevertheless, this first model does not reproduce the observed radial velocities well, which could be a signature of a rotating component in the progenitor or of the presence of a massive hot gaseous halo around the Milky Way.
Tuckerman, B
1971-10-01
The 24th Mersenne prime M_p = 2^p - 1, and currently the largest known prime, is 2^19937 - 1. Primality was shown by the Lucas-Lehmer test on an IBM 360/91 computer. The 24th even perfect number is (2^19937 - 1)·2^19936.
Tuckerman, Bryant
1971-01-01
The 24th Mersenne prime M_p = 2^p - 1, and currently the largest known prime, is 2^19937 - 1. Primality was shown by the Lucas-Lehmer test on an IBM 360/91 computer. The 24th even perfect number is (2^19937 - 1)·2^19936. PMID:16591945
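The Lucas-Lehmer test used above is compact enough to state in full. The sketch below is a straightforward modern implementation, not the 1971 IBM 360/91 program, and also prints the associated even perfect number (2^p - 1)·2^(p-1).

```python
# Lucas-Lehmer primality test for Mersenne numbers M_p = 2**p - 1 (p an odd prime),
# plus the corresponding even perfect number (2**p - 1) * 2**(p - 1).
# A modern sketch, not the 1971 IBM 360/91 implementation.

def lucas_lehmer(p):
    """Return True if M_p = 2**p - 1 is prime (p must be an odd prime)."""
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0


for p in (3, 5, 7, 11, 13, 17, 19):
    if lucas_lehmer(p):
        mp = (1 << p) - 1
        print(f"M_{p} = {mp} is prime; even perfect number: {mp * (1 << (p - 1))}")
# Verifying p = 19937 the same way is possible but slow in pure Python.
```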
Thermal Vacuum Integrated System Test at B-2
NASA Technical Reports Server (NTRS)
Kudlac, Maureen T.; Weaver, Harold F.; Cmar, Mark D.
2012-01-01
The National Aeronautics and Space Administration (NASA) Glenn Research Center (GRC) Plum Brook Station (PBS) Space Propulsion Research Facility, commonly referred to as B-2, is NASA's third largest thermal vacuum facility. It is the largest designed to store and transfer large quantities of liquid hydrogen and liquid oxygen, and is perfectly suited to support developmental testing of chemical propulsion systems as well as fully integrated stages. The facility is also capable of providing thermal-vacuum simulation services to support testing of large lightweight structures, Cryogenic Fluid Management (CFM) systems, electric propulsion test programs, and other In-Space propulsion programs. A recently completed integrated system test demonstrated the refurbished thermal vacuum capabilities of the facility. The test used the modernized data acquisition and control system to monitor the facility during pump down of the vacuum chamber, operation of the liquid nitrogen heat sink (or cold wall) and the infrared lamp array. A vacuum level of 1.3×10⁻⁴ Pa (1×10⁻⁶ torr) was achieved. The heat sink provided a uniform temperature environment of approximately 77 K (140 °R) along the entire inner surface of the vacuum chamber. The recently rebuilt and modernized infrared lamp array produced a nominal heat flux of 1.4 kW/m² at a chamber diameter of 6.7 m (22 ft) and along 11 m (36 ft) of the chamber's cylindrical vertical interior. With the lamp array and heat sink operating simultaneously, the thermal systems produced a heat flux pattern simulating radiation to space on one surface and solar exposure on the other surface. The data acquired matched pretest predictions and demonstrated system functionality.
2017-09-05
The metamaterial perfect absorber behaves as a meta-cavity bounded between a resonant metasurface and a metallic thin-film reflector. The perfect absorption...cavity quantum electrodynamics devices. Subject terms: metamaterial; meta-cavity; metallic thin-film reflector; Fabry-Perot cavity resonance.
NASA Astrophysics Data System (ADS)
Rode, Stefan; Bennett, Robert; Yoshi Buhmann, Stefan
2018-04-01
We discuss the Casimir effect for boundary conditions involving perfect electromagnetic conductors, which interpolate between perfect electric conductors and perfect magnetic conductors. Based on the corresponding reciprocal Green’s tensor we construct the Green’s tensor for two perfectly reflecting plates with magnetoelectric coupling (non-reciprocal media) within the framework of macroscopic quantum electrodynamics. We calculate the Casimir force between two arbitrary perfect electromagnetic conductor plates, resulting in a universal analytic expression that connects the attractive Casimir force with the repulsive Boyer force. We relate the results to a duality symmetry of electromagnetism.
Visual-conformal display format for helicopter guidance
NASA Astrophysics Data System (ADS)
Doehler, H.-U.; Schmerwitz, Sven; Lueken, Thomas
2014-06-01
Helicopter guidance in situations where natural vision is reduced is still a challenging task. Besides newly available sensors, which are able to "see" through darkness, fog and dust, display technology remains one of the key issues of pilot assistance systems. As long as we have pilots within aircraft cockpits, we have to keep them informed about the outside situation. "Situational awareness" of humans is mainly powered by their visual channel. Therefore, display systems which are able to cross-fade seamlessly from natural vision to artificial computer vision and vice versa are of greatest interest within this context. Helmet-mounted displays (HMD) have this property when they apply a head-tracker for measuring the pilot's head orientation relative to the aircraft reference frame. Together with the aircraft's position and orientation relative to the world's reference frame, the on-board graphics computer can generate images which are perfectly aligned with the outside world. We call image elements which match the outside world "visual-conformal". Published display formats for helicopter guidance in degraded visual environments mostly apply 2D-symbologies, which fall far short of what is possible. We propose a perspective 3D-symbology for a head-tracked HMD which shows as many visual-conformal elements as possible. We implemented and tested our proposal within our fixed-base cockpit simulator as well as in our flying helicopter simulator (FHS). Recently conducted simulation trials with experienced helicopter pilots provide first evaluation results of our proposal.
Colliandre, Lionel; Le Guilloux, Vincent; Bourg, Stephane; Morin-Allory, Luc
2012-02-27
High Throughput Screening (HTS) is a standard technique widely used to find hit compounds in drug discovery projects. The high costs associated with such experiments have highlighted the need to carefully design screening libraries in order to avoid wasting resources. Molecular diversity is an established concept that has been used to this end for many years. In this article, a new approach to quantify the molecular diversity of screening libraries is presented. The approach is based on the Delimited Reference Chemical Subspace (DRCS) methodology, a new method that can be used to delimit the densest subspace spanned by a reference library in a reduced 2D continuous space. A total of 22 diversity indices were implemented or adapted to this methodology, which is used here to remove outliers and obtain a relevant cell-based partition of the subspace. The behavior of these indices was assessed and compared in various extreme situations and with respect to a set of theoretical rules that a diversity function should satisfy when libraries of different sizes have to be compared. Some gold standard indices are found inappropriate in such a context, while none of the tested indices behave perfectly in all cases. Five DRCS-based indices accounting for different aspects of diversity were finally selected, and a simple framework is proposed to use them effectively. Various libraries have been profiled with respect to more specific subspaces, which further illustrate the interest of the method.
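Cell-based diversity indices of the kind evaluated above can be illustrated with a toy example. The sketch below computes two generic indices, cell coverage and normalized Shannon entropy over a 2D grid, for a spread-out versus a clustered point cloud; these are illustrative stand-ins, not the 22 DRCS-based indices assessed in the paper.

```python
# Two simple cell-based diversity indices over a 2D reduced chemical space:
# cell coverage (fraction of occupied cells) and normalized Shannon entropy
# of the cell occupancies. Generic illustrations, not the DRCS indices.
import numpy as np


def cell_diversity(points, bins=20, extent=((-1, 1), (-1, 1))):
    """Return (coverage, normalized Shannon entropy) of a 2D point cloud."""
    hist, _, _ = np.histogram2d(points[:, 0], points[:, 1], bins=bins, range=extent)
    occupied = hist > 0
    coverage = occupied.sum() / hist.size
    p = hist[occupied] / hist.sum()
    entropy = -(p * np.log(p)).sum() / np.log(hist.size)  # 1 = perfectly even spread
    return coverage, entropy


rng = np.random.default_rng(2)
diverse = rng.uniform(-1, 1, size=(5000, 2))        # spread-out library
clustered = rng.normal(0.0, 0.1, size=(5000, 2))    # redundant, clustered library
for name, lib in [("diverse", diverse), ("clustered", clustered)]:
    cov, ent = cell_diversity(lib)
    print(f"{name}: coverage={cov:.2f}, entropy={ent:.2f}")
```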
NASA Astrophysics Data System (ADS)
Storm, Emma; Weniger, Christoph; Calore, Francesca
2017-08-01
We present SkyFACT (Sky Factorization with Adaptive Constrained Templates), a new approach for studying, modeling and decomposing diffuse gamma-ray emission. Like most previous analyses, the approach relies on predictions from cosmic-ray propagation codes like GALPROP and DRAGON. However, in contrast to previous approaches, we account for the fact that models are not perfect and allow for a very large number (≳ 10⁵) of nuisance parameters to parameterize these imperfections. We combine methods of image reconstruction and adaptive spatio-spectral template regression in one coherent hybrid approach. To this end, we use penalized Poisson likelihood regression, with regularization functions that are motivated by the maximum entropy method. We introduce methods to efficiently handle the high dimensionality of the convex optimization problem as well as the associated semi-sparse covariance matrix, using the L-BFGS-B algorithm and Cholesky factorization. We test the method both on synthetic data as well as on gamma-ray emission from the inner Galaxy, |l| < 90° and |b| < 20°, as observed by the Fermi Large Area Telescope. We finally define a simple reference model that removes most of the residual emission from the inner Galaxy, based on conventional diffuse emission components as well as components for the Fermi bubbles, the Fermi Galactic center excess, and extended sources along the Galactic disk. Variants of this reference model can serve as basis for future studies of diffuse emission in and outside the Galactic disk.
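The core fitting step, penalized Poisson likelihood regression over a large set of per-pixel nuisance parameters minimized with L-BFGS-B, can be sketched in a few lines. The toy example below uses a single template, mock counts, and a simple quadratic penalty; SkyFACT itself uses many templates and maximum-entropy-motivated regularizers.

```python
# Toy sketch of penalized Poisson likelihood regression with per-pixel
# nuisance (modulation) parameters, minimized with L-BFGS-B. Single template,
# mock data and quadratic penalty are illustrative only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
npix = 200
template = 10.0 + 5.0 * np.sin(np.linspace(0, 4 * np.pi, npix))  # predicted counts
true_mod = 1.0 + 0.2 * rng.standard_normal(npix)                 # model imperfection
counts = rng.poisson(template * np.clip(true_mod, 0.1, None))    # mock data

lam = 5.0  # regularization strength pulling modulations toward 1


def objective(theta):
    mu = template * theta
    nll = np.sum(mu - counts * np.log(mu))       # Poisson negative log-likelihood
    penalty = lam * np.sum((theta - 1.0) ** 2)   # keep nuisance parameters near unity
    return nll + penalty


def gradient(theta):
    return template - counts / theta + 2.0 * lam * (theta - 1.0)


res = minimize(objective, np.ones(npix), jac=gradient, method="L-BFGS-B",
               bounds=[(1e-3, None)] * npix)
print(f"converged: {res.success}, mean fitted modulation: {res.x.mean():.3f}")
```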
Chang, Joy; Tarasova, Tetyana; Shanmugam, Vedapuri; Azarskova, Marianna; Nguyen, Shon; Hurlston, Mackenzie; Sabatier, Jennifer; Zhang, Guoqing; Osmanov, Saladin; Ellenberger, Dennis; Yang, Chunfu; Vitek, Charles; Liulchuk, Maria; Nizova, Natalya
2015-12-01
An accurate accessible test for early infant diagnosis (EID) is crucial for identifying HIV-infected infants and linking them to treatment. To improve EID services in Ukraine, dried blood spot (DBS) samples obtained from 237 HIV-exposed children (≤18 months of age) in six regions in Ukraine in 2012 to 2013 were tested with the AmpliSens DNA-HIV-FRT assay, the Roche COBAS AmpliPrep/COBAS TaqMan (CAP/CTM) HIV-1 Qual test, and the Abbott RealTime HIV-1 Qualitative assay. In comparison with the paired whole-blood results generated from AmpliSens testing at the oblast HIV reference laboratories in Ukraine, the sensitivity was 0.99 (95% confidence interval [CI], 0.95 to 1.00) for the AmpliSens and Roche CAP/CTM Qual assays and 0.96 (95% CI, 0.90 to 0.98) for the Abbott Qualitative assay. The specificity was 1.00 (95% CI, 0.97 to 1.00) for the AmpliSens and Abbott Qualitative assays and 0.99 (95% CI, 0.96 to 1.00) for the Roche CAP/CTM Qual assay. McNemar analysis indicated that the proportions of positive results for the tests were not significantly different (P > 0.05). Cohen's kappa (0.97 to 0.99) indicated almost perfect agreement among the three tests. These results indicated that the AmpliSens DBS and whole-blood tests performed equally well and were comparable to the two commercially available EID tests. More importantly, the performance characteristics of the AmpliSens DBS test meets the World Health Organization EID test requirements; implementing AmpliSens DBS testing might improve EID services in resource-limited settings. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
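The agreement statistics quoted above (sensitivity, specificity, Cohen's kappa) all follow from simple 2×2 comparisons against the reference result. The sketch below shows the calculation with made-up counts, not the Ukrainian EID data.

```python
# Sensitivity, specificity and Cohen's kappa from a 2x2 comparison of an
# index test against a reference result. The counts are made up for
# illustration and are not the data of this study.
def diagnostic_summary(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    observed_agreement = (tp + tn) / n
    # Chance agreement for Cohen's kappa, from the marginal totals.
    p_both_pos = ((tp + fp) / n) * ((tp + fn) / n)
    p_both_neg = ((fn + tn) / n) * ((fp + tn) / n)
    expected_agreement = p_both_pos + p_both_neg
    kappa = (observed_agreement - expected_agreement) / (1 - expected_agreement)
    return sensitivity, specificity, kappa


sens, spec, kappa = diagnostic_summary(tp=70, fp=1, fn=2, tn=164)
print(f"sensitivity={sens:.3f} specificity={spec:.3f} kappa={kappa:.3f}")
```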
2014-09-05
adiabatic expansion of a perfect gas; b. Contains a gas or liquid that would endanger personnel or equipment or create a mishap if released; or c...Guidelines for Liquid Rocket Engines 31. TOR-2013(3213)-6 Acoustic Testing on Production Space Vehicle (The Value of the Test and Deletion...materials used in space vehicles, interstages, payload adapters, payload fairings, motor cases, nozzles, propellant tanks, and over-wrapped pressure vessels
NASA Astrophysics Data System (ADS)
Nakano, T.; Oogane, M.; Furuichi, T.; Ando, Y.
2018-04-01
The automotive industry requires magnetic sensors exhibiting highly linear output within a dynamic range as wide as ±1 kOe. A simple model predicts that the magneto-conductance (G-H) curve in a magnetic tunnel junction (MTJ) is perfectly linear, whereas the magneto-resistance (R-H) curve inevitably contains a finite nonlinearity. We prepared two kinds of MTJs using in-plane or perpendicularly magnetized synthetic antiferromagnetic (i-SAF or p-SAF) reference layers and investigated their sensor performance. In the MTJ with the i-SAF reference layer, the G-H curve did not necessarily show smaller nonlinearities than those of the R-H curve with different dynamic ranges. This is because the magnetizations of the i-SAF reference layer start to rotate at a magnetic field even smaller than the switching field (Hsw) measured by a magnetometer, which significantly affects the tunnel magnetoresistance (TMR) effect. In the MTJ with the p-SAF reference layer, the G-H curve showed much smaller nonlinearities than those of the R-H curve, thanks to a large Hsw value of the p-SAF reference layer. We achieved a nonlinearity of 0.08% FS (full scale) in the G-H curve with a dynamic range of ±1 kOe, satisfying our target for automotive applications. This demonstrated that a reference layer exhibiting a large Hsw value is indispensable in order to achieve a highly linear G-H curve.
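The "simple model" statement, that a perfectly linear G-H curve necessarily implies a nonlinear R-H curve, can be checked numerically. The sketch below assumes an idealized conductance that varies linearly with field over the dynamic range (arbitrary parameters, not the measured devices) and quantifies the residual nonlinearity of R = 1/G in % of full scale.

```python
# Numerical check of the simple model: if the conductance G varies perfectly
# linearly with field H inside the dynamic range, the resistance R = 1/G
# necessarily deviates from a straight line. Parameter values are arbitrary
# illustrations, not those of the measured MTJs.
import numpy as np

H = np.linspace(-1.0, 1.0, 1001)   # field normalized to +/- 1 kOe
G = 1.0 + 0.5 * H                  # perfectly linear G-H curve (arbitrary units)
R = 1.0 / G                        # corresponding R-H curve


def nonlinearity_percent_fs(x, y):
    """Max deviation from the best straight-line fit, in % of full scale."""
    slope, intercept = np.polyfit(x, y, 1)
    residual = y - (slope * x + intercept)
    return 100.0 * np.max(np.abs(residual)) / (y.max() - y.min())


print(f"G-H nonlinearity: {nonlinearity_percent_fs(H, G):.4f}% FS")  # ~0 by construction
print(f"R-H nonlinearity: {nonlinearity_percent_fs(H, R):.2f}% FS")  # finite, unavoidable
```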
Completed Gravity Probe B Undergoes Thermal Vacuum Testing
NASA Technical Reports Server (NTRS)
2000-01-01
The Gravity Probe B (GP-B) is the relativity experiment developed at Stanford University to test two extraordinary predictions of Albert Einstein's general theory of relativity. The experiment will measure, very precisely, the expected tiny changes in the direction of the spin axes of four gyroscopes contained in an Earth-orbiting satellite at a 400-mile altitude. So free are the gyroscopes from disturbance that they will provide an almost perfect space-time reference system. They will measure how space and time are very slightly warped by the presence of the Earth, and, more profoundly, how the Earth's rotation very slightly drags space-time around with it. These effects, though small for the Earth, have far-reaching implications for the nature of matter and the structure of the Universe. In this photograph, the completed space vehicle is undergoing thermal vacuum environment testing. GP-B is among the most thoroughly researched programs ever undertaken by NASA. This is the story of a scientific quest in which physicists and engineers have collaborated closely over many years. Inspired by their quest, they have invented a whole range of technologies that are already enlivening other branches of science and engineering. Launched April 20, 2004, the GP-B program was managed for NASA by the Marshall Space Flight Center. Development of the GP-B is the responsibility of Stanford University along with major subcontractor Lockheed Martin Corporation. (Image credit to Russ Underwood, Lockheed Martin Corporation.)
A left lateral accessory pathway unmasked by rivastigmine.
Guenancia, Charles; Fichot, Marie; Garnier, Fabien; Montoy, Mathieu; Laurent, Gabriel
A 75-year-old woman was referred for advice regarding surface electrocardiographic modifications after the initiation of rivastigmine. In our patient, the baseline ECGs appeared perfectly normal. However, the initiation of a cholinesterase inhibitor unmasked a left lateral accessory pathway that had never been diagnosed before. Although cholinesterase inhibitors are known to increase vagal tone, the PR interval was shortened after rivastigmine administration, thus excluding this hypothesis to explain the appearance of the accessory pathway. Therefore, we hypothesized that cholinesterase inhibitors may have increased conduction velocity in the accessory pathway or in the atria. Copyright © 2017 Elsevier Inc. All rights reserved.
Trentham-Dietz, Amy; Ergun, Mehmet Ali; Alagoz, Oguzhan; Stout, Natasha K; Gangnon, Ronald E; Hampton, John M; Dittus, Kim; James, Ted A; Vacek, Pamela M; Herschorn, Sally D; Burnside, Elizabeth S; Tosteson, Anna N A; Weaver, Donald L; Sprague, Brian L
2018-02-01
Due to limitations in the ability to identify non-progressive disease, ductal carcinoma in situ (DCIS) is usually managed similarly to localized invasive breast cancer. We used simulation modeling to evaluate the potential impact of a hypothetical test that identifies non-progressive DCIS. A discrete-event model simulated a cohort of U.S. women undergoing digital screening mammography. All women diagnosed with DCIS underwent the hypothetical DCIS prognostic test. Women with test results indicating progressive DCIS received standard breast cancer treatment and a decrement to quality of life corresponding to the treatment. If the DCIS test indicated non-progressive DCIS, no treatment was received and women continued routine annual surveillance mammography. A range of test performance characteristics and prevalence of non-progressive disease were simulated. Analysis compared discounted quality-adjusted life years (QALYs) and costs for test scenarios to base-case scenarios without the test. Compared to the base case, a perfect prognostic test resulted in a 40% decrease in treatment costs, from $13,321 to $8005 USD per DCIS case. A perfect test produced 0.04 additional QALYs (16 days) for women diagnosed with DCIS, added to the base case of 5.88 QALYs per DCIS case. The results were sensitive to the performance characteristics of the prognostic test, the proportion of DCIS cases that were non-progressive in the model, and the frequency of mammography screening in the population. A prognostic test that identifies non-progressive DCIS would substantially reduce treatment costs but result in only modest improvements in quality of life when averaged over all DCIS cases.
Text Detection and Translation from Natural Scenes
2001-06-01
is no explicit tags around Chinese words. A module for Chinese word segmentation is included in the system. This segmentor uses a word-frequency list to make segmentation decisions. We tested the EBMT-based method using 50 randomly selected signs from our database, assuming perfect sign
Optimal single-shot strategies for discrimination of quantum measurements
NASA Astrophysics Data System (ADS)
Sedlák, Michal; Ziman, Mário
2014-11-01
We study the discrimination of m quantum measurements in the scenario where the unknown measurement with n outcomes can be used only once. We show that ancilla-assisted discrimination procedures provide a nontrivial advantage over simple (ancilla-free) schemes for perfect distinguishability, and we prove that inevitably m ≤ n. We derive necessary and sufficient conditions for the perfect distinguishability of general binary measurements. We show that the optimization of the discrimination of projective qubit measurements and their mixtures with white noise is equivalent to the discrimination of specific quantum states. In particular, the optimal protocol for discrimination of projective qubit measurements with a fixed failure rate (exploiting a maximally entangled test state) is described. While minimum-error discrimination of two projective qubit measurements can be realized without any need of entanglement, we show that discrimination of three projective qubit measurements requires a bipartite probe state. Moreover, when the measurements are not projective, non-maximally entangled test states can outperform maximally entangled ones. Finally, we rephrase the unambiguous discrimination of measurements as a quantum key distribution protocol.
Dufour, Simon; Latour, Sylvie; Chicoine, Yvan; Fecteau, Gilles; Forget, Sylvain; Moreau, Jean; Trépanier, André
2012-01-01
A script concordance test (SCT) was developed measuring clinical reasoning of food-ruminant practitioners for whom potential clinical competence difficulties were identified by their provincial professional organization. The SCT was designed to be used as part of a broader evaluation procedure. A scoring key was developed based on answers from a reference panel of 12 experts and using the modified aggregate method commonly used for SCTs. A convenient sample of 29 food-ruminant practitioners was constituted to assess the reliability and precision of the SCT and to determine a fair threshold value for success. Cronbach's α coefficients were computed to evaluate internal reliability. To evaluate SCT precision, a test-retest methodology was used and measures of agreement beyond chance were computed at question and test levels. After optimization, the 36-question SCT yielded acceptable internal reliability (Cronbach's α=0.70). Precision of the SCT at question level was excellent with 33 questions (92%) yielding moderate to almost perfect agreement between administrations. At test level, fair agreement (concordance correlation coefficient=0.32) was observed between administrations. A slight SCT score improvement (M=+2.8 points) on the second administration was in part responsible for some of the disagreement and was potentially a result of an adaptation to the SCT format. Scores distribution was used to determine a fair threshold value for success, while considering the underlying objectives of the examination. The data suggest that the developed SCT can be used as a reliable and precise measurement of clinical reasoning of food-ruminant practitioners.
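The modified aggregate scoring commonly used for SCTs gives each answer partial credit proportional to the number of panelists who chose it, relative to the modal panel answer. The sketch below illustrates this with a hypothetical 12-expert panel; it is a generic illustration, not the exact key built for this examination.

```python
# Aggregate scoring commonly used for script concordance tests: each answer
# earns credit equal to the number of panelists who chose it divided by the
# count of the modal panel answer. A generic illustration, not the exact
# 12-expert key built for this examination.
from collections import Counter


def build_scoring_key(panel_answers):
    """Map each Likert anchor to its partial credit from the panel's answers."""
    counts = Counter(panel_answers)
    modal = max(counts.values())
    return {anchor: n / modal for anchor, n in counts.items()}


# Hypothetical panel of 12 experts answering one question on a -2..+2 scale.
panel = [+1, +1, +1, +1, +1, +1, 0, 0, 0, +2, +2, -1]
key = build_scoring_key(panel)
print(key)  # roughly {1: 1.0, 0: 0.5, 2: 0.33, -1: 0.17}
print("a candidate answering 0 scores", key.get(0, 0.0))
```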
A quantitative AFM analysis of nano-scale surface roughness in various orthodontic brackets.
Lee, Gi-Ja; Park, Ki-Ho; Park, Young-Guk; Park, Hun-Kuk
2010-10-01
In orthodontics, the surface roughness of orthodontic archwires and brackets affects the effectiveness of arch-guided tooth movement, corrosion behavior, and the aesthetics of orthodontic components. Atomic force microscopy (AFM) measurements were used to provide quantitative information on the surface roughness of the orthodontic materials. In this study, the changes in surface roughness of various orthodontic bracket slots before and after sliding movement of the archwire, in vitro and in vivo, were observed using AFM. Firstly, we characterized the surfaces of four types of bracket slots: conventional stainless steel (Succes), conventional ceramic (Perfect), self-ligating stainless steel (Damon) and self-ligating ceramic (Clippy-C) brackets. The Succes and Damon brackets showed relatively smooth surfaces, while Perfect had the roughest surface among the four types of brackets used. Secondly, after an in vitro sliding test with beta-titanium wire in the two conventional brackets (Succes and Perfect), there was a significant increase only in the stainless steel bracket, Succes. Thirdly, after clinical orthodontic treatment for a maximum of 2 years, the self-ligating stainless steel bracket, Damon, showed a significant increase in surface roughness, whereas the self-ligating ceramic bracket, Clippy-C, showed less significant changes in roughness parameters than the self-ligating stainless steel one. Based on the results of the AFM measurements, it is suggested that the self-ligating ceramic bracket is likely to exhibit less friction and better biocompatibility than the other tested brackets. This implies that these bracket slots will aid in the effectiveness of arch-guided tooth movement.
Ball bearing vibrations amplitude modeling and test comparisons
NASA Technical Reports Server (NTRS)
Hightower, Richard A., III; Bailey, Dave
1995-01-01
Bearings generate disturbances that, when combined with structural gains of a momentum wheel, contribute to induced vibration in the wheel. The frequencies generated by a ball bearing are defined by the bearing's geometry and defects. The amplitudes at these frequencies are dependent upon the actual geometry variations from perfection; therefore, a geometrically perfect bearing will produce no amplitudes at the kinematic frequencies that the design generates. Because perfect geometry can only be approached, emitted vibrations do occur. The most significant vibration is at the spin frequency and can be balanced out in the build process. Other frequencies' amplitudes, however, cannot be balanced out. Momentum wheels are usually the single largest source of vibrations in a spacecraft and can contribute to pointing inaccuracies if emitted vibrations ring the structure or are in the high-gain bandwidth of a sensitive pointing control loop. It is therefore important to be able to provide an a priori knowledge of possible amplitudes that are singular in source or are a result of interacting defects that do not reveal themselves in normal frequency prediction equations. This paper will describe the computer model that provides for the incorporation of bearing geometry errors and then develops an estimation of actual amplitudes and frequencies. Test results were correlated with the model. A momentum wheel was producing an unacceptable 74 Hz amplitude. The model was used to simulate geometry errors and proved successful in identifying a cause that was verified when the parts were inspected.
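The kinematic frequencies referred to above follow from standard ball-bearing geometry relations. The sketch below evaluates them for an illustrative geometry; the numbers are made up and are not those of the momentum-wheel bearing studied in the paper.

```python
# Standard kinematic defect frequencies of a ball bearing from its geometry.
# The geometry values below are illustrative, not those of the momentum-wheel
# bearing discussed in the paper.
from math import cos, radians


def bearing_frequencies(shaft_hz, n_balls, ball_dia, pitch_dia, contact_angle_deg=0.0):
    """Return fundamental train, ball-pass (outer/inner) and ball-spin frequencies [Hz]."""
    r = (ball_dia / pitch_dia) * cos(radians(contact_angle_deg))
    ftf = 0.5 * shaft_hz * (1.0 - r)                               # cage (fundamental train)
    bpfo = 0.5 * n_balls * shaft_hz * (1.0 - r)                    # ball pass, outer race
    bpfi = 0.5 * n_balls * shaft_hz * (1.0 + r)                    # ball pass, inner race
    bsf = 0.5 * (pitch_dia / ball_dia) * shaft_hz * (1.0 - r * r)  # ball spin
    return {"FTF": ftf, "BPFO": bpfo, "BPFI": bpfi, "BSF": bsf}


# Example: 100 Hz wheel speed, 9 balls, 6 mm balls on a 30 mm pitch diameter.
for name, f in bearing_frequencies(100.0, 9, 6.0, 30.0, 15.0).items():
    print(f"{name}: {f:.1f} Hz")
```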
NASA Astrophysics Data System (ADS)
Tan, Wei; Zhang, Caihong; Li, Chun; Zhou, Xiaoying; Jia, Xiaoqing; Feng, Zheng; Su, Juan; Jin, Biaobing
2017-05-01
We demonstrate that the subradiant mode in ultrathin bi-layer metamaterials can be exclusively excited under two-antisymmetric-beam illumination (or equivalently, at a node of the standing wave field), while the superradiant mode is fully suppressed due to their different mode symmetry. Coherent perfect absorption (CPA) with the Lorentzian lineshape can be achieved corresponding to the subradiant mode. A theoretical model is established to distinguish the different behaviors of these two modes and to elucidate the CPA condition. Terahertz ultrathin bi-layer metamaterials on flexible polyimide substrates are fabricated and tested, exhibiting excellent agreement with theoretical predictions. This work provides physical insight into how to selectively excite the antisymmetric subradiant mode via coherence incidence.
76 FR 49751 - Perfect Fitness, Provisional Acceptance of a Settlement Agreement and Order
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-11
... CONSUMER PRODUCT SAFETY COMMISSION [CPSC Docket No. 11-C0009] Perfect Fitness, Provisional...(e). Published below is a provisionally-accepted Settlement Agreement with Perfect Fitness... accordance with 16 CFR 1118.20, Perfect Fitness and staff (``Staff'') of the United States Consumer Product...
Local-order metric for condensed-phase environments
NASA Astrophysics Data System (ADS)
Martelli, Fausto; Ko, Hsin-Yu; Oǧuz, Erdal C.; Car, Roberto
2018-02-01
We introduce a local order metric (LOM) that measures the degree of order in the neighborhood of an atomic or molecular site in a condensed medium. The LOM maximizes the overlap between the spatial distribution of sites belonging to that neighborhood and the corresponding distribution in a suitable reference system. The LOM takes a value tending to zero for completely disordered environments and tending to one for environments that perfectly match the reference. The site-averaged LOM and its standard deviation define two scalar order parameters, S and δS, that characterize crystals, liquids, and amorphous materials with excellent resolution. We show with molecular dynamics simulations that S, δS, and the LOM provide very insightful information in the study of structural transformations, such as those occurring when ice spontaneously nucleates from supercooled water or when a supercooled water sample becomes amorphous upon progressive cooling.
Li, Su-Yun
2010-06-01
Against the background of the spread of Western medicine into the East in the Ming and Qing Dynasties, Chinese doctors who had accepted Western medicine drew on Western medical knowledge and began to use the methods of anatomical observation and demonstration to explain the objective structure of the meridians and collaterals. They tried to use the arteries and vessels to explain the shape of the meridians, and the blood circulation and pulmonary respiration to explain the circulation of Ying-Wei. When the anatomical structures could not be perfectly equated with the meridians and collaterals, some doctors put forward the gasification (qi transformation) feature of the meridians to explain the discrepancy. These results suggest that there are differences between the meridians and collaterals and purely anatomical concepts, which serve as a significant reference and source of edification for later generations.
The global reference atmospheric model, mod 2 (with two scale perturbation model)
NASA Technical Reports Server (NTRS)
Justus, C. G.; Hargraves, W. R.
1976-01-01
The Global Reference Atmospheric Model was improved to produce more realistic simulations of vertical profiles of atmospheric parameters. A revised two scale random perturbation model using perturbation magnitudes which are adjusted to conform to constraints imposed by the perfect gas law and the hydrostatic condition is described. The two scale perturbation model produces appropriately correlated (horizontally and vertically) small scale and large scale perturbations. These stochastically simulated perturbations are representative of the magnitudes and wavelengths of perturbations produced by tides and planetary scale waves (large scale) and turbulence and gravity waves (small scale). Other new features of the model are: (1) a second order geostrophic wind relation for use at low latitudes which does not "blow up" at low latitudes as the ordinary geostrophic relation does; and (2) revised quasi-biennial amplitudes and phases and revised stationary perturbations, based on data through 1972.
Measurable residual disease testing in acute myeloid leukaemia.
Hourigan, C S; Gale, R P; Gormley, N J; Ossenkoppele, G J; Walter, R B
2017-07-01
There is considerable interest in developing techniques to detect and/or quantify remaining leukaemia cells termed measurable or, less precisely, minimal residual disease (MRD) in persons with acute myeloid leukaemia (AML) in complete remission defined by cytomorphological criteria. An important reason for AML MRD-testing is the possibility of estimating the likelihood (and timing) of leukaemia relapse. A perfect MRD-test would precisely quantify leukaemia cells biologically able and likely to cause leukaemia relapse within a defined interval. AML is genetically diverse and there is currently no uniform approach to detecting such cells. Several technologies focused on immune phenotype or cytogenetic and/or molecular abnormalities have been developed, each with advantages and disadvantages. Many studies report a positive MRD-test at diverse time points during AML therapy identifies persons with a higher risk of leukaemia relapse compared with those with a negative MRD-test even after adjusting for other prognostic and predictive variables. No MRD-test in AML has perfect sensitivity and specificity for relapse prediction at the cohort- or subject levels and there are substantial rates of false-positive and -negative tests. Despite these limitations, correlations between MRD-test results and relapse risk have generated interest in MRD-test result-directed therapy interventions. However, convincing proof that a specific intervention will reduce relapse risk in persons with a positive MRD-test is lacking and needs testing in randomized trials. Routine clinical use of MRD-testing requires further refinements and standardization/harmonization of assay platforms and results reporting. Such data are needed to determine whether results of MRD-testing can be used as a surrogate end point in AML therapy trials. This could make drug-testing more efficient and accelerate regulatory approvals. Although MRD-testing in AML has advanced substantially, much remains to be done.
A multifactorial model of masticatory performance: the Suita study.
Kosaka, T; Ono, T; Kida, M; Kikui, M; Yamamoto, M; Yasui, S; Nokubi, T; Maeda, Y; Kokubo, Y; Watanabe, M; Miyamoto, Y
2016-05-01
Previous studies have identified various factors related to masticatory performance. This study aimed to investigate the variations and impacts of factors related to masticatory performance among different occlusal support areas in a general urban population in Japan. A total of 1875 Japanese subjects (mean age: 66·7 years) were included in the Suita study. Periodontal status was evaluated using the Community Periodontal Index (CPI). The number of functional teeth and the occlusal support areas (OSA) were recorded, and the latter were divided into three categories (perfect, decreased and lost OSA) based on the Eichner Index. Masticatory performance was determined by means of a test gummy jelly. For denture wearers, masticatory performance was measured with the dentures in place. The multiple linear regression analysis showed that, when controlling for other variables, masticatory performance was significantly associated with sex, number of functional teeth, maximum bite force and periodontal status in perfect OSA. Masticatory performance was significantly associated with number of functional teeth, maximum bite force and periodontal status in decreased OSA. In lost OSA, masticatory performance was significantly associated with maximum bite force. Maximum bite force was a factor significantly influencing masticatory performance that was common to all OSA groups. After controlling for possible confounding factors, the number of functional teeth and periodontal status were common factors in the perfect and decreased OSA groups, and only sex was significant in the perfect OSA group. These findings may help in providing dietary guidance to elderly people with tooth loss or periodontal disease. © 2015 John Wiley & Sons Ltd.
Boléo-Tomé, J
2009-01-01
It is difficult to speak of ethical dilemmas in a society that has relativism as its official philosophical and political doctrine, i.e., one in which stable values and behavioural references are denied, both in health care and in every other area of human knowledge. In the field of medical sciences there is even an attempt to pass from an observational methodology to a field of manipulation and manipulability. It is ethics itself that is presented as a dilemma. In these conditions one needs to know the lines of thought advanced to replace and erase the stable ethical references: eclecticism, historicism, scientism, pragmatism, and nihilism itself, which lead to the 'new ethical paradigm' and have created their own pseudo-spirituality. The truth is that we are adrift in an 'ethics of convenience' which changes according to the majorities. In this setting, the way forward is to rediscover the abandoned ethical values: only with an objective ethics, with sound references and foundations, is it possible to re-establish and perfect the patient-physician relationship, for better social health. And this begins with the ethical problem of human life.
Noise Estimation and Quality Assessment of Gaussian Noise Corrupted Images
NASA Astrophysics Data System (ADS)
Kamble, V. M.; Bhurchandi, K.
2018-03-01
Evaluating the exact amount of noise present in an image, and the quality of an image in the absence of a reference image, is a challenging task. We propose a near-perfect noise estimation method and a no-reference image quality assessment method for images corrupted by Gaussian noise. The proposed methods obtain an initial estimate of the noise standard deviation present in an image using the median of the wavelet transform coefficients and then refine it to a nearly exact estimate using curve fitting. The proposed noise estimation method provides the estimate of noise within an average error of ±4%. For quality assessment, this noise estimate is mapped to fit the Differential Mean Opinion Score (DMOS) using a nonlinear function. The proposed methods require minimal training and yield both the noise estimate and an image quality score. Images from the Laboratory for Image and Video Processing (LIVE) database and the Computational Perception and Image Quality (CSIQ) database are used for validation of the proposed quality assessment method. Experimental results show that the performance of the proposed quality assessment method is on par with existing no-reference image quality assessment metrics for Gaussian noise corrupted images.
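For readers who want to see the initial step concretely, below is a minimal sketch (not the authors' code) of the classic robust estimator the abstract describes: the noise standard deviation is estimated from the median absolute value of the finest-scale diagonal wavelet detail coefficients, σ ≈ median(|HH|)/0.6745. The subsequent curve-fitting refinement and the DMOS mapping are not reproduced.

```python
import numpy as np
import pywt  # PyWavelets

def estimate_noise_sigma(image: np.ndarray) -> float:
    """Robust initial estimate of Gaussian noise level: sigma ~ median(|HH|) / 0.6745."""
    _, (_, _, diag) = pywt.dwt2(image.astype(float), "db1")  # HH (diagonal) detail band
    return float(np.median(np.abs(diag)) / 0.6745)

# Quick self-check on synthetic data (the estimate should come out near 10):
rng = np.random.default_rng(0)
noisy = 128.0 + rng.normal(0.0, 10.0, size=(256, 256))
print(round(estimate_noise_sigma(noisy), 2))
```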
Multicultural issues in test interpretation.
Langdon, Henriette W; Wiig, Elisabeth H
2009-11-01
Designing the ideal test or series of tests to assess individuals who speak languages other than English is difficult. This article first describes some of the roadblocks - one of which is the lack of identification criteria for language and learning disabilities in monolingual and bilingual populations in most countries of the non-English-speaking world. This lag exists, in part, because access to general education is often limited. The second section describes tests that have been developed in the United States, primarily for Spanish-speaking individuals, because they now represent the largest first-language majority in the United States (80% of English-language learners [ELLs] speak Spanish at home). We discuss tests developed for monolingual and bilingual English-Spanish speakers in the United States and divide this coverage into two parts: the first addresses assessment of students' first language (L1) and second language (L2), usually English, with different versions of the same test; the second describes assessment of L1 and L2 using the same version of the test, administered in the two languages. Examples of tests that fit a priori-determined criteria are briefly discussed throughout the article. Suggestions on how to develop tests for speakers of languages other than English are also provided. In conclusion, we maintain that there will never be a perfect test or set of tests to adequately assess the communication skills of a bilingual individual. This is not surprising because we have yet to develop an ideal test or set of tests that fits monolingual Anglo speakers perfectly. Tests are tools, and the speech-language pathologist needs to know how to use those tools most effectively and equitably. The goal of this article is to provide such guidance. Thieme Medical Publishers.
Financial Management in the Strategic Systems Project Office.
SSPO, the largest program office in the Navy and in existence for over 20 years, has perfected time-tested financial management procedures which may...serve as a model for the student of program management. This report presents an overview of the SSPO financial management concepts and general
Fosgate, G T; Petzer, I M; Karzis, J
2013-04-01
Screening tests for mastitis can play an important role in proactive mastitis control programs. The primary objective of this study was to compare the sensitivity and specificity of milk electrical conductivity (EC) to the California mastitis test (CMT) in commercial dairy cattle in South Africa using Bayesian methods without a perfect reference test. A total of 1848 quarter milk specimens were collected from 173 cows sampled during six sequential farm visits. Of these samples, 25.8% yielded pathogenic bacterial isolates. The most frequently isolated species were coagulase negative Staphylococci (n=346), Streptococcus agalactiae (n=54), and Staphylococcus aureus (n=42). The overall cow-level prevalence of mastitis was 54% based on the Bayesian latent class (BLC) analysis. The CMT was more accurate than EC for classification of cows having somatic cell counts >200,000/mL and for isolation of a bacterial pathogen. BLC analysis also suggested an overall benefit of CMT over EC but the statistical evidence was not strong (P=0.257). The Bayesian model estimated the sensitivity and specificity of EC (measured via resistance) at a cut-point of >25 mΩ/cm to be 89.9% and 86.8%, respectively. The CMT had a sensitivity and specificity of 94.5% and 77.7%, respectively, when evaluated at the weak positive cut-point. EC was useful for identifying milk specimens harbouring pathogens but was not able to differentiate among evaluated bacterial isolates. Screening tests can be used to improve udder health as part of a proactive management plan. Copyright © 2012 Elsevier Ltd. All rights reserved.
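As a concrete illustration of the kind of latent-class analysis used above, here is a minimal sketch of a Hui-Walter-style model for two imperfect tests (CMT and EC) without a gold standard, written as a penalized maximum (MAP) estimate rather than a full Bayesian posterior. The cross-classified counts and the Beta prior hyperparameters are hypothetical; with one population and two tests the model is not identifiable from the data alone, which is precisely why informative priors (and a fully Bayesian treatment, as in the study) are needed.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import beta

counts = np.array([140, 60, 45, 755])  # hypothetical counts: [CMT+EC+, CMT+EC-, CMT-EC+, CMT-EC-]

def neg_log_posterior(theta):
    prev, se1, sp1, se2, sp2 = theta
    # Cell probabilities assuming the two tests are conditionally independent given true status.
    cells = np.array([
        prev * se1 * se2             + (1 - prev) * (1 - sp1) * (1 - sp2),
        prev * se1 * (1 - se2)       + (1 - prev) * (1 - sp1) * sp2,
        prev * (1 - se1) * se2       + (1 - prev) * sp1 * (1 - sp2),
        prev * (1 - se1) * (1 - se2) + (1 - prev) * sp1 * sp2,
    ])
    loglik = np.sum(counts * np.log(cells))
    # Weakly informative Beta(8, 2) priors on sensitivities/specificities (hypothetical choice).
    logprior = sum(beta.logpdf(x, 8, 2) for x in (se1, sp1, se2, sp2))
    return -(loglik + logprior)

res = minimize(neg_log_posterior, x0=[0.5, 0.9, 0.8, 0.9, 0.8],
               method="L-BFGS-B", bounds=[(0.01, 0.99)] * 5)
print(dict(zip(["prev", "Se_CMT", "Sp_CMT", "Se_EC", "Sp_EC"], np.round(res.x, 3))))
```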
Guerrero-Ramos, Alvaro; Patel, Mauli; Kadakia, Kinjal; Haque, Tanzina
2014-06-01
The Architect EBV antibody panel is a new chemiluminescence immunoassay system used to determine the stage of Epstein-Barr virus (EBV) infection based on the detection of IgM and IgG antibodies to viral capsid antigen (VCA) and IgG antibodies against Epstein-Barr nuclear antigen 1 (EBNA-1). We evaluated its diagnostic accuracy in immunocompetent adolescents and young adults with clinical suspicion of infectious mononucleosis (IM) using the RecomLine EBV IgM and IgG immunoblots as the reference standard. In addition, the use of the antibody panel in a sequential testing algorithm based on initial EBNA-1 IgG analysis was assessed for cost-effectiveness. Finally, we investigated the degree of cross-reactivity of the VCA IgM marker during other primary viral infections that may present with an EBV IM-like picture. High sensitivity (98.3% [95% confidence interval {CI}, 90.7 to 99.7%]) and specificity (94.2% [95% CI, 87.9 to 97.8%]) were found after testing 162 precharacterized archived serum samples. There was perfect agreement between the use of the antibody panel in sequential and parallel testing algorithms, but substantial cost savings (23%) were obtained with the sequential strategy. A high rate of reactive VCA IgM results was found in primary cytomegalovirus (CMV) infections (60.7%). In summary, the Architect EBV antibody panel performs satisfactorily in the investigation of EBV IM in immunocompetent adolescents and young adults, and the application of an EBNA-1 IgG-based sequential testing algorithm is cost-effective in this diagnostic setting. Concomitant testing for CMV is strongly recommended to aid in the interpretation of EBV serological patterns. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
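To make the EBNA-1 IgG-first strategy concrete, here is a hedged sketch of a sequential interpretation routine of the kind evaluated above. The branching rules are a common textbook reading of EBV serology, not the authors' published algorithm: detectable EBNA-1 IgG makes acute primary infection very unlikely, so VCA IgM/IgG are only tested (and interpreted) when EBNA-1 IgG is negative.

```python
def ebv_sequential_interpretation(ebna1_igg: bool, vca_igm=None, vca_igg=None) -> str:
    """Hypothetical sequential algorithm: EBNA-1 IgG first, reflex to VCA markers if negative."""
    if ebna1_igg:
        return "past EBV infection (no reflex VCA testing; consider CMV if an IM-like illness persists)"
    # EBNA-1 IgG negative: reflex to VCA IgM and VCA IgG
    if vca_igm and vca_igg:
        return "acute primary EBV infection (infectious mononucleosis likely)"
    if vca_igm and not vca_igg:
        return "very early primary infection or non-specific IgM reactivity (e.g. during primary CMV); confirm"
    if vca_igg and not vca_igm:
        return "past infection with undetectable EBNA-1 IgG, or late primary infection"
    return "no serological evidence of EBV infection (susceptible)"

print(ebv_sequential_interpretation(False, vca_igm=True, vca_igg=True))
```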
Effect of packing method on the randomness of disc packings
NASA Astrophysics Data System (ADS)
Zhang, Z. P.; Yu, A. B.; Oakeshott, R. B. S.
1996-06-01
The randomness of disc packings, generated by random sequential adsorption (RSA), random packing under gravity (RPG) and Mason packing (MP) which gives a packing density close to that of the RSA packing, has been analysed, based on the Delaunay tessellation, and is evaluated at two levels, i.e. the randomness at individual subunit level which relates to the construction of a triangle from a given edge length distribution and the randomness at network level which relates to the connection between triangles from a given triangle frequency distribution. The Delaunay tessellation itself is also analysed and its almost perfect randomness at the two levels is demonstrated, which verifies the proposed approach and provides a random reference system for the present analysis. It is found that (i) the construction of a triangle subunit is not random for the RSA, MP and RPG packings, with the degree of randomness decreasing from the RSA to MP and then to RPG packing; (ii) the connection of triangular subunits in the network is almost perfectly random for the RSA packing, acceptable for the MP packing and not good for the RPG packing. Packing method is an important factor governing the randomness of disc packings.
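For context, the random sequential adsorption protocol analysed above can be sketched in a few lines: candidate disc centres are proposed uniformly at random and accepted only if the new disc overlaps none of those already placed. The radius and attempt count below are arbitrary, and the Delaunay-based randomness analysis of the paper is not reproduced.

```python
import numpy as np

def rsa_discs(radius=0.025, attempts=20_000, seed=0):
    """Random sequential adsorption of equal discs in the unit square."""
    rng = np.random.default_rng(seed)
    centres = np.empty((0, 2))
    for _ in range(attempts):
        p = rng.uniform(radius, 1.0 - radius, size=2)  # keep the whole disc inside the square
        if centres.size == 0 or np.min(np.sum((centres - p) ** 2, axis=1)) >= (2 * radius) ** 2:
            centres = np.vstack([centres, p])
    return centres

packing = rsa_discs()
fraction = len(packing) * np.pi * 0.025 ** 2
print(len(packing), round(fraction, 3))  # approaches the RSA jamming limit (~0.547) as attempts grow
```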
Beyond Aztec Castles: Toric Cascades in the dP 3 Quiver
NASA Astrophysics Data System (ADS)
Lai, Tri; Musiker, Gregg
2017-12-01
Given one of an infinite class of supersymmetric quiver gauge theories, string theorists can associate a corresponding toric variety (which is a Calabi-Yau 3-fold) as well as an associated combinatorial model known as a brane tiling. In combinatorial language, a brane tiling is a bipartite graph on a torus and its perfect matchings are of interest to both combinatorialists and physicists alike. A cluster algebra may also be associated to such quivers and in this paper we study the generators of this algebra, known as cluster variables, for the quiver associated to the cone over the del Pezzo surface d P 3. In particular, mutation sequences involving mutations exclusively at vertices with two in-coming arrows and two out-going arrows are referred to as toric cascades in the string theory literature. Such toric cascades give rise to interesting discrete integrable systems on the level of cluster variable dynamics. We provide an explicit algebraic formula for all cluster variables that are reachable by toric cascades as well as a combinatorial interpretation involving perfect matchings of subgraphs of the d P 3 brane tiling for these formulas in most cases.
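For readers unfamiliar with cluster mutation, the exchange relation at a vertex k with exactly two incoming arrows (from vertices i1, i2) and two outgoing arrows (to j1, j2), the only vertices mutated in a toric cascade, takes the form below. This is the general cluster-algebra mutation rule, not a formula specific to the dP3 results of the paper.

```latex
x_k \, x_k' \;=\; x_{i_1} x_{i_2} + x_{j_1} x_{j_2}
\qquad\Longrightarrow\qquad
x_k' \;=\; \frac{x_{i_1} x_{i_2} + x_{j_1} x_{j_2}}{x_k}
```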
Intravenous fluid temperature management by infrared thermometer.
Lapostolle, Frédéric; Catineau, Jean; Le Toumelin, Philippe; Proust, Clément; Garrigue, Bruno; Galinski, Michel; Adnet, Frédéric
2006-03-01
The management of intravenous (IV) fluid temperature is a daily challenge in critical care, anesthesiology, and emergency medicine. Infusion of IV fluids at the right temperature partly influences the clinical outcomes of critically ill patients. At present, intravenous fluid temperature is poorly managed, as no suitable device is routinely available. Infrared (IR) thermometers have recently been developed for industrial, personal, and medical purposes. The aim of this study was to evaluate the accuracy of an IR thermometer in measuring the temperature of warmed and cooled infusion fluids in fluid bags. This study compared temperatures simultaneously recorded by an infrared thermometer and a temperature sensor. Temperatures of warmed (41 °C) and cooled (4 °C) infusion fluids in fluid bags were recorded by 2 independent operators every minute until the IV bags' temperature reached ambient temperature. The relation curve was established from 576 measurements. Temperature measurements performed with an IR thermometer were linear and almost perfectly correlated with the reference method (R² = 0.995, P < 10⁻⁵). Infrared thermometers are efficient for measuring IV fluid bag temperature in the range of temperatures used in clinical practice. As these devices are easy to use and inexpensive, they could be widely used in critical care, anesthesiology, or emergency medicine.
Mushquash, Aislin R; Sherry, Simon B
2013-04-01
The perfectionism model of binge eating is an integrative model explaining why perfectionism is tied to binge eating. This study extended and tested this emerging model by proposing daughters' socially prescribed perfectionism (i.e., perceiving one's mother is harshly demanding perfection of oneself) and mothers' psychological control (i.e., a negative parenting style involving control and demandingness) contribute indirectly to daughters' binge eating by generating situations or experiences that trigger binge eating. These binge triggers include discrepancies (i.e., viewing oneself as falling short of one's mother's expectations), depressive affect (i.e., feeling miserable and sad), and dietary restraint (i.e., behaviors aimed at reduced caloric intake). This model was tested in 218 mother-daughter dyads studied using a mixed longitudinal and daily diary design. Daughters were undergraduate students. Results largely supported hypotheses, with bootstrapped tests of mediation suggesting daughters' socially prescribed perfectionism and mothers' psychological control contribute to binge eating through binge triggers. For undergraduate women who believe their mothers rigidly require them to be perfect and whose mothers are demanding and controlling, binge eating may provide a means of coping with or escaping from an unhealthy, unsatisfying mother-daughter relationship. Copyright © 2013 Elsevier Ltd. All rights reserved.
Contextual Factors in the Use of the Present Perfect
ERIC Educational Resources Information Center
Moy, Raymond H.
1977-01-01
In this study the inadequacies of rules governing the present perfect in isolated sentences are discussed and then two contextual factors thought to be connected with current relevance and the use of the present perfect are described. These factors are experimentally shown to influence use of the present perfect significantly. (CHK)
Bimodal distribution of performance in discriminating major/minor modes.
Chubb, Charles; Dickson, Christopher A; Dean, Tyler; Fagan, Christopher; Mann, Daniel S; Wright, Charles E; Guan, Maime; Silva, Andrew E; Gregersen, Peter K; Kowalsky, Elena
2013-10-01
This study investigated the abilities of listeners to classify various sorts of musical stimuli as major vs minor. All stimuli combined four pure tones: low and high tonics (G5 and G6), dominant (D), and either a major third (B) or a minor third (B♭). Especially interesting results were obtained using tone-scrambles, randomly ordered sequences of pure tones presented at ≈15 per second. All tone-scrambles tested comprised 16 G's (G5's + G6's), 8 D's, and either 8 B's or 8 B♭'s. The distribution of proportion correct across 275 listeners tested over the course of three experiments was strikingly bimodal, with one mode very close to chance performance, and the other very close to perfect performance. Testing with tone-scrambles thus sorts listeners fairly cleanly into two subpopulations. Listeners in subpopulation 1 are sufficiently sensitive to major vs minor to classify tone-scrambles nearly perfectly; listeners in subpopulation 2 (comprising roughly 70% of the population) have very little sensitivity to major vs minor. Skill in classifying major vs minor tone-scrambles shows a modest correlation of around 0.5 with years of musical training.
2000-08-01
The Gravity Probe B (GP-B) is the relativity experiment developed at Stanford University to test two extraordinary predictions of Albert Einstein’s general theory of relativity. The experiment will measure, very precisely, the expected tiny changes in the direction of the spin axes of four gyroscopes contained in an Earth-orbiting satellite at a 400-mile altitude. So free are the gyroscopes from disturbance that they will provide an almost perfect space-time reference system. They will measure how space and time are very slightly warped by the presence of the Earth, and, more profoundly, how the Earth’s rotation very slightly drags space-time around with it. These effects, though small for the Earth, have far-reaching implications for the nature of matter and the structure of the Universe. In this photograph, the completed space vehicle is undergoing thermal vacuum environment testing. GP-B is among the most thoroughly researched programs ever undertaken by NASA. This is the story of a scientific quest in which physicists and engineers have collaborated closely over many years. Inspired by their quest, they have invented a whole range of technologies that are already enlivening other branches of science and engineering. Launched April 20, 2004 , the GP-B program was managed for NASA by the Marshall Space Flight Center. Development of the GP-B is the responsibility of Stanford University along with major subcontractor Lockheed Martin Corporation. (Image credit to Russ Underwood, Lockheed Martin Corporation.)
Physical and Technical Profiles of Garuda Basketball Club’s 17 to 18 Years Old Male Group
NASA Astrophysics Data System (ADS)
Rohmat, H. D. N.; Rismayadi, A.; Mustaqim, R.; Kusdinar, Y.
2017-03-01
The background of this research is the attempt to develop the best possible basketball players by identifying the requirements that have to be met at every stage. In fact, there are no data that can serve as a reference target for athletes at each age stage, especially at ages 17 to 18. Thus, data are needed to serve as a reference so that the requirements athletes must fulfil can be identified. Garuda, one of the basketball clubs in the NBL (Indonesia's professional basketball competition), is not only a professional team but also develops youth players for the future. The research problem is what the physical and technical profiles of Garuda basketball club's 17 to 18 years old male group look like. A descriptive method was used in this research, which aims to portray the physical and technical profiles of Garuda basketball club's 17 to 18 years old male group. For the physical profile, the results show that no athlete (0%) is classified in the perfect, very good, or low categories; six (35.3%) and eleven (64.7%) are classified in the good and moderate categories, respectively. For the technical profile, no athlete (0%) is classified in the perfect or low categories; four (23.5%) and eight (47.1%) are classified in the very good and good categories, respectively. Based on the results, it can be concluded that the physical and technical profiles of Garuda basketball club's 17 to 18 years old male group are in the moderate and good categories.
24th Annual National Test and Evaluation Conference
2008-02-28
[Briefing-slide excerpt on robust design: illustration of lower and upper specification limits (LSL, USL) and process means μ1, μ2; "Why Robust Design?"; listed factors include vehicle performance, simulated terrain physics, soil strength, vegetation density, longitudinal force, lateral force, traction, resistance, local vehicle ...]
Working Memory Is (Almost) Perfectly Predicted by "g"
ERIC Educational Resources Information Center
Colom, Roberto; Rebollo, Irene; Palacios, Antonio; Juan-Espinosa, Manuel; Kyllonen, Patrick C.
2004-01-01
This article analyzes whether working memory (WM) is especially important for understanding "g." WM comprises the functions of focusing attention, conscious rehearsal, and transformation and mental manipulation of information, while "g" reflects the component variance that is common to all tests of ability. The centrality of WM in individual differences in…
40 CFR 1060.501 - General testing provisions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... that requires a blend of gasoline and ethanol, blend this grade of gasoline with fuel-grade ethanol... measure the ethanol concentration of such blended fuels and may instead calculate the blended composition by assuming that the ethanol is pure and mixes perfectly with the base fuel. For example, if you mix...
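As a small arithmetic illustration of the "assume the ethanol is pure and mixes perfectly" bookkeeping this provision allows, the sketch below computes the volume percent of ethanol in a blend; the volumes used in the example calls are hypothetical, not taken from the regulation.

```python
def blended_ethanol_percent(v_base_fuel_gal: float, v_ethanol_gal: float) -> float:
    """Volume percent ethanol in the blend, treating fuel-grade ethanol as 100% ethanol."""
    return 100.0 * v_ethanol_gal / (v_base_fuel_gal + v_ethanol_gal)

print(blended_ethanol_percent(9.0, 1.0))   # 10.0 -> an "E10" blend
print(blended_ethanol_percent(8.5, 1.5))   # 15.0 -> an "E15" blend
```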
40 CFR 1060.501 - General testing provisions.
Code of Federal Regulations, 2014 CFR
2014-07-01
... that requires a blend of gasoline and ethanol, blend this grade of gasoline with fuel-grade ethanol... measure the ethanol concentration of such blended fuels and may instead calculate the blended composition by assuming that the ethanol is pure and mixes perfectly with the base fuel. For example, if you mix...
40 CFR 1060.501 - General testing provisions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... that requires a blend of gasoline and ethanol, blend this grade of gasoline with fuel-grade ethanol... measure the ethanol concentration of such blended fuels and may instead calculate the blended composition by assuming that the ethanol is pure and mixes perfectly with the base fuel. For example, if you mix...
40 CFR 1060.501 - General testing provisions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... that requires a blend of gasoline and ethanol, blend this grade of gasoline with fuel-grade ethanol... measure the ethanol concentration of such blended fuels and may instead calculate the blended composition by assuming that the ethanol is pure and mixes perfectly with the base fuel. For example, if you mix...
40 CFR 1060.501 - General testing provisions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... that requires a blend of gasoline and ethanol, blend this grade of gasoline with fuel-grade ethanol... measure the ethanol concentration of such blended fuels and may instead calculate the blended composition by assuming that the ethanol is pure and mixes perfectly with the base fuel. For example, if you mix...
Medical Practice Makes Perfect
NASA Technical Reports Server (NTRS)
1998-01-01
Cedaron Medical Inc., was founded in 1990 as a result of a NASA SBIR (Small Business Innovative Research) grant from Johnson Space Center to develop a Hand Testing and Exercise Unit for use in space. From that research came Dexter, a comprehensive workstation that creates a paperless environment for medical data management.
Experimental demonstration of the anti-maser
NASA Astrophysics Data System (ADS)
Mazzocco, Anthony; Aviles, Michael; Andrews, Jim; Dawson, Nathan; Crescimanno, Michael
2012-10-01
We denote by "anti-maser" a coherent perfect absorption (CPA) process in the radio frequency domain. We demonstrate several experimental realizations of the anti-maser suitable for an advanced undergraduate laboratory. Students designed, assembled and tested these devices, as well as the inexpensive laboratory setup and experimental protocol for displaying various CPA phenomena.
2008-03-01
Conductor; PMC: Perfect Magnetic Conductor; RF: Radio Frequency; RH: Right-handed; SNG: Single Negative; TACAN: Tactical Air Navigation; UAV: Unmanned Aerial... negative (SNG) and double-negative (DNG) materials, and their fascinating properties have driven the interest in MTMs (Engheta and Ziolkowski, 2006
NASA Astrophysics Data System (ADS)
Manha, William D.
2010-09-01
One of the strongest expressions of the demand for quality was made by a well-known rocket scientist, for whom this center was named, Dr. Wernher von Braun, in the Foreword of a book about the design of rocket engines first published by NASA in 1967: "Success in space demands perfection. Many of the brilliant achievements made in this vast, austere environment seem almost miraculous. Behind each apparent miracle, however, stands the flawless performance of numerous highly complex systems. All are important. The failure of only one portion of a launch vehicle or spacecraft may cause failure of an entire mission. But the first to feel this awesome imperative for perfection are the propulsion systems, especially the engines. Unless they operate flawlessly first, none of the other systems will get a chance to perform in space. Perfection begins in the design of space hardware. This book emphasizes quality and reliability in the design of propulsion and engine systems. It draws deeply from the vast know-how and experience which have been the essence of several well-designed, reliable systems of the past and present. And, with a thoroughness and completeness not previously available, it tells how the present high state of reliability, gained through years of research and testing, can be maintained, and perhaps improved, in engines of the future. As man ventures deeper into space to explore the planets, the search for perfection in the design of propulsion systems will continue." Some catastrophes with losses of life will be compared to show lapses in quality and safety, and contrasted with a catastrophe without loss of life because of compliance with safety requirements:
1. October 24, 1960 (USSR): Nedelin catastrophe, "Death on the Steppes", 124 deaths.
2. October 25, 1966 (USA): North American Rockwell, Apollo Block I Service Module (SM) propulsion system fuel tank explosion/fire and destruction of the SM and test cell; test engineer/conductor/author Bill Manha (the presenter); 0 injuries, 0 deaths.
3. March 18, 1980 (USSR): Vostok 8A92M booster pad explosion, 48 deaths.
4. August 22, 2003 (Brazil): Alcantara VLS-1, V03; solid rocket ignited on the pad, 21 deaths.
5. Summer of 2006 (USA): a payload organization inquired about requirements to fly a satellite with a new "safe" SpaceDev hybrid propulsion system using a solid polymer as the fuel and nitrous oxide as the oxidizer. The extensive titanium/nitrous oxide materials compatibility testing that was required discouraged the payload organization from further exploration of using the Shuttle as the launch vehicle.
6. July 26, 2007 (USA): SpaceShipTwo nitrous oxide explosion, 3 seriously injured, 3 deaths.
The catastrophic failures listed above resulted in 210 deaths, but there were none in the Apollo SM explosion because of compliance with CalOSHA. This is an applied lesson learned from the Shuttle: safety was not jeopardized without extensive materials compatibility testing. On the other hand, nitrous oxide was erroneously identified as safe for launch from the Shuttle or ISS, which resulted in a catastrophic explosion with 3 major injuries and 3 deaths. This is the testimony of a survivor of a catastrophic failure where safety rules were followed, and of the application of the lesson learned which confirmed safety and quality; as expressed by von Braun, PERFECTION and SAFETY do MATTER!
The Perfect Aspect as a State of Being.
ERIC Educational Resources Information Center
Moy, Raymond H.
English as second language (ESL) learners often avoid using the present perfect or use it improperly. In contrast with native speakers of English sampled from newspaper editorials, of whom 75 percent used the present perfect, only 22 percent of ESL college students used the present perfect correctly. This avoidance is due in part to lack of…
Delicious Low GL space foods by using Low GI materials -Checked of blood sugar level-
NASA Astrophysics Data System (ADS)
Katayama, Naomi; Kuwayama, Akemi; Space Agriculture Task Force, J.
Reliable life-support systems are necessary for long-term stays in space. The management of meals for astronauts is particularly important: if an astronaut gets sick in outer space, it can mean death. Delicious, well-balanced space foods are essential for astronauts' work. Therefore, this study aimed to evaluate a space food menu for healthy space life by measuring blood sugar levels. We designed a space food menu with reference to the Japanese nutrition standard of 2010, using brown rice, wheat, soy bean, sweet potato and green vegetables, as well as loach and insects (silkworm pupa, snail, mud snail, termite, fly, grasshopper and bee). Ten healthy adults served as subjects. The subjects performed a sensory test by questionnaire, scoring the items "taste, fragrance, colour and quantity" out of ten points each. Blood sugar level was measured in peripheral blood before a meal and every 15 minutes for 120 minutes after the meal. Statistical analysis was performed with Excel statistics. The measurements showed that the space food menu was unlikely to produce high post-meal blood sugar levels. In the sensory evaluation, the taste score exceeded eight out of ten points. Thus, healthy space foods that cause little rise in blood sugar were made deliciously. Measuring blood sugar level allows us to evaluate space foods that support the maintenance of good health and balance. An astronaut must stay healthy during a long-term stay in space, so the development of delicious space foods that promote health is essential. We will devise combinations and cooking methods of ingredients to make an even healthier space food menu.
Walton, David M; Macdermid, Joy C; Nielson, Warren; Teasell, Robert W; Chiasson, Marco; Brown, Lauren
2011-09-01
Clinical measurement. To evaluate the intrarater, interrater, and test-retest reliability of an accessible digital algometer, and to determine the minimum detectable change in normal healthy individuals and a clinical population with neck pain. Pressure pain threshold testing may be a valuable assessment and prognostic indicator for people with neck pain. To date, most of this research has been completed using algometers that are too resource intensive for routine clinical use. Novice raters (physiotherapy students or clinical physiotherapists) were trained to perform algometry testing over 2 clinically relevant sites: the angle of the upper trapezius and the belly of the tibialis anterior. A convenience sample of normal healthy individuals and a clinical sample of people with neck pain were tested by 2 different raters (all participants) and on 2 different days (healthy participants only). Intraclass correlation coefficient (ICC), standard error of measurement, and minimum detectable change were calculated. A total of 60 healthy volunteers and 40 people with neck pain were recruited. Intrarater reliability was almost perfect (ICC = 0.94-0.97), interrater reliability was substantial to near perfect (ICC = 0.79-0.90), and test-retest reliability was substantial (ICC = 0.76-0.79). Smaller change was detectable in the trapezius compared to the tibialis anterior. This study provides evidence that novice raters can perform digital algometry with adequate reliability for research and clinical use in people with and without neck pain.
1980-08-01
EFFECTS OF MATERIAL AND TASK VARIATIONS ON A BRIEF COGNITIVE LEARNING STRATEGIES TRAINING PROGRAM. Introduction: As scholastic achievement scores continue to...variance of the test scores revealed no significant differences among the three treatment conditions on any of the tests. Although these results...tentative because of the group performance patterns. On the first (easier) passage, group means indicated nearly perfect scores for all three of the
The principle of relativity, superluminality and EPR experiments. "Riserratevi sotto coverta ..."
NASA Astrophysics Data System (ADS)
Cocciaro, B.
2015-07-01
The principle of relativity claims the invariance of the results of experiments carried out in inertial reference frames if the system under examination is not in interaction with the outside world. This paper analyses a model suggested by J. S. Bell, and later developed by P. H. Eberhard, D. Bohm and B. Hiley, according to which the EPR correlations would be due to superluminal exchanges between the various parts of the entangled system under examination. In the model, the existence of a privileged reference frame (PF) for the propagation of superluminal signals is hypothesized so that these superluminal signals do not give rise to causal paradoxes. According to this model, in an EPR experiment the entangled system interacts with the outer world, since the result of the experiment depends on an entity (the reference frame PF) that is not prepared by the experimenter. The existence of this privileged reference frame makes the model non-invariant under Lorentz transformations. In this paper, in opposition to what is claimed by the authors mentioned above, the perfect compatibility of the model with the theory of relativity is strongly maintained since, as noted above, the principle of relativity does not require that the results of experiments carried out on systems interacting with the outside world be invariant.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ashcraft, C. Chace; Niederhaus, John Henry; Robinson, Allen C.
We present a verification and validation analysis of a coordinate-transformation-based numerical solution method for the two-dimensional axisymmetric magnetic diffusion equation, implemented in the finite-element simulation code ALEGRA. The transformation, suggested by Melissen and Simkin, yields an equation set perfectly suited for linear finite elements and for problems with large jumps in material conductivity near the axis. The verification analysis examines transient magnetic diffusion in a rod or wire in a very low conductivity background by first deriving an approximate analytic solution using perturbation theory. This approach for generating a reference solution is shown to be not fully satisfactory. A specialized approach for manufacturing an exact solution is then used to demonstrate second-order convergence under spatial refinement and temporal refinement. For this new implementation, a significant improvement relative to previously available formulations is observed. Benefits in accuracy for computed current density and Joule heating are also demonstrated. The validation analysis examines the circuit-driven explosion of a copper wire using resistive magnetohydrodynamics modeling, in comparison to experimental tests. The new implementation matches the accuracy of the existing formulation, with both formulations capturing the experimental burst time and action to within approximately 2%.
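As a reminder of how second-order convergence claims like the one above are typically quantified, the sketch below computes the observed order of accuracy from errors against the manufactured (exact) solution on successively refined meshes; the error values are illustrative only, not results from ALEGRA.

```python
import math

def observed_order(e_coarse: float, e_fine: float, refinement_ratio: float = 2.0) -> float:
    """p = log(e_coarse / e_fine) / log(r); p ~ 2 indicates second-order convergence."""
    return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

errors = [4.1e-3, 1.05e-3, 2.6e-4]  # hypothetical L2 errors at mesh sizes h, h/2, h/4
print([round(observed_order(errors[i], errors[i + 1]), 2) for i in range(len(errors) - 1)])
```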
Analysis of the cadastral data published in the Polish Spatial Data Infrastructure
NASA Astrophysics Data System (ADS)
Izdebski, Waldemar
2017-12-01
The cadastral data, including land parcels, are the basic reference data for presenting various objects collected in spatial databases. Easy access to up-to-date records is very important for the individuals and institutions using spatial data infrastructure. The primary objective of the study was to check the current accessibility of cadastral data as well as to verify how current and complete they are. The author started researching this topic in 2007, i.e. from the moment the Team for National Spatial Data Infrastructure developed documentation on the standard for publishing cadastral data via the WMS. For ten years, the author has monitored the status of cadastral data publishing in various districts and has participated in data publishing in many of them. In 2017, when only half of the districts published WMS services from cadastral data, two questions arise: why is this so, and how can this unfavourable situation be changed? As a result of the tests performed, it was found that the status of publishing cadastral data is still far from perfect. The quality of the offered web services varies and, unfortunately, many services offer poor performance; moreover, there are plenty of services that do not operate at all.
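For reference, checking whether a district actually serves cadastral data over WMS amounts to issuing a standard OGC WMS 1.3.0 GetMap request like the one sketched below and inspecting the response. The endpoint URL and layer name here are hypothetical; only the query parameters are the standard ones.

```python
from urllib.parse import urlencode

ENDPOINT = "https://example-district.example/wms"  # hypothetical district WMS endpoint
params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "cadastral_parcels",                 # hypothetical layer name
    "CRS": "EPSG:2180",                            # Polish national coordinate system (PUWG 1992)
    "BBOX": "480000,630000,481000,631000",         # hypothetical 1 km x 1 km extent
    "WIDTH": "800",
    "HEIGHT": "800",
    "FORMAT": "image/png",
}
print(f"{ENDPOINT}?{urlencode(params)}")  # fetch this URL and check status/Content-Type in an availability test
```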
Child t-shirt size data set from 3D body scanner anthropometric measurements and a questionnaire.
Pierola, A; Epifanio, I; Alemany, S
2017-04-01
A dataset of a fit assessment study in children is presented. Anthropometric measurements of 113 children were obtained using a 3D body scanner. Children tested a t-shirt of different sizes and a different model for boys and girls, and their fit was assessed by an expert. This expert labeled the fit as 0 (correct), -1 (if the garment was small for that child), or 1 (if the garment was large for that child) in an ordered factor called Size-fit. Moreover, the fit was numerically assessed from 1 (very poor fit) to 10 (perfect fit) in a variable called Expert evaluation. This data set contains the differences between the reference mannequin of the evaluated size and the child's anthropometric measurements for 27 variables. Besides these variables, in the data set, we can also find the gender, the size evaluated, and the size recommended by the expert, including if an intermediate, but nonexistent size between two consecutive sizes would have been the right size. In total, there are 232 observations. The analysis of these data can be found in Pierola et al. (2016) [2].
The Perfect Mate for Safe Fueling
NASA Technical Reports Server (NTRS)
2004-01-01
Referred to as the "lifeline for any space launch vehicle" by NASA Space Launch Initiative Program Manager Warren Wiley, an umbilical is a large device that transports power, communications, instrument readings, and fluids such as propellants, pressurization gases, and coolants from one source to another. Numerous launch vehicles, planetary systems, and rovers require umbilical "mating". This process is a driving factor for dependable and affordable space access. With future-generation space vehicles in mind, NASA recently designed a smart, automated method for quickly and reliably mating and demating electrical and fluid umbilical connectors. The new umbilical concept is expected to replace NASA s traditional umbilical systems that release at vehicle lift-off (T-0). The idea is to increase safety by automatically performing hazardous tasks, thus reducing potential failure modes and the time and labor hours necessary to prepare for launch. The new system will also be used as a test bed for quick disconnect development and for advance control and leak detection. It incorporates concepts such as a secondary mate plate, robotic machine vision, and compliant motor motion control, and is destined to advance usage of automated umbilicals in a variety of aerospace and commercial applications.
The Q Continuum: Encounter with the Cloud Mask
NASA Astrophysics Data System (ADS)
Ackerman, S. A.; Frey, R.; Holz, R.; Philips, C.; Dutcher, S.
2017-12-01
We are developing a common cloud mask for MODIS and VIIRS observations, referred to as the MODIS VIIRS Continuity Mask (MVCM). Our focus is on extending the MODIS-heritage cloud detection approach in order to generate appropriate climate data records for clouds and climate studies. The MVCM is based on heritage from the MODIS cloud mask (MOD35 and MYD35) and employs a series of tests on MODIS reflectances and brightness temperatures. Cloud detection is based on contrasts (i.e., cloud versus background surface) at pixel resolution. The MVCM follows the same approach. These cloud masks use multiple cloud detection tests to indicate the confidence level that the observation is of a clear-sky scene. The outcome of a test ranges from 0 (cloudy) to 1 (clear-sky scene). Because of overlap in the sensitivities of the various spectral tests to the type of cloud, each test is considered in one of several groups. The final cloud mask is determined from the product of the minimum confidence of each group and is referred to as the Q value, as defined in Ackerman et al (1998). In MOD35 and MYD35 processing, the Q value is not output; rather, predetermined thresholds on Q determine the result: if Q ≥ 0.99 the scene is clear; if 0.95 ≤ Q < 0.99 the pixel is probably clear; if 0.66 ≤ Q < 0.95 it is probably cloudy; and if Q < 0.66 it is cloudy. Q is thus represented discretely and not as a continuum. For the MVCM, the numerical value of Q is output along with the classification of clear, probably clear, probably cloudy, and cloudy. Through comparisons with collocated CALIOP and MODIS observations, we will assess the categorization of the Q values as a function of scene type. While validation studies have indicated the utility and statistical correctness of the cloud mask approach, the algorithm does not possess immeasurable power and perfection. This comparison will assess the time and space dependence of Q and assure that the laws of physics are followed, at least according to normal human notions. Using CALIOP as truth, a receiver operating characteristic (ROC) curve will be analyzed to determine the optimum Q for various scenes and seasons, thus providing a continuum of discriminating thresholds.
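The confidence combination described above can be sketched in a few lines: each spectral test returns a clear-sky confidence in [0, 1], tests are grouped by the cloud type they are sensitive to, and Q is the product of the per-group minima, which is then thresholded into the four categories. The group names and confidence values below are illustrative, not the actual MOD35/MVCM grouping.

```python
import numpy as np

def combine_q(groups):
    """Q = product over groups of the minimum clear-sky confidence within each group."""
    return float(np.prod([min(confidences) for confidences in groups.values() if confidences]))

def categorize(q):
    if q >= 0.99:
        return "confident clear"
    if q >= 0.95:
        return "probably clear"
    if q >= 0.66:
        return "probably cloudy"
    return "cloudy"

groups = {"thin_cirrus": [0.98, 1.00], "low_cloud": [0.97], "high_cold_cloud": [0.99, 0.96]}
q = combine_q(groups)
print(round(q, 3), categorize(q))  # 0.98 * 0.97 * 0.96 ~= 0.913 -> "probably cloudy"
```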
Lange, Toni; Freiberg, Alice; Dröge, Patrik; Lützner, Jörg; Schmitt, Jochen; Kopkow, Christian
2015-06-01
Systematic literature review. Despite their frequent application in routine care, a systematic review on the reliability of clinical examination tests to evaluate the integrity of the ACL is missing. To summarize and evaluate intra- and interrater reliability research on physical examination tests used for the diagnosis of ACL tears. A comprehensive systematic literature search was conducted in MEDLINE, EMBASE and AMED until May 30th 2013. Studies were included if they assessed the intra- and/or interrater reliability of physical examination tests for the integrity of the ACL. Methodological quality was evaluated with the Quality Appraisal of Reliability Studies (QAREL) tool by two independent reviewers. The search yielded 110 hits, of which seven articles met the inclusion criteria. These studies examined the reliability of four physical examination tests. Intrarater reliability was assessed in three studies and ranged from fair to almost perfect (Cohen's k = 0.22-1.00). Interrater reliability was assessed in all included studies and ranged from slight to almost perfect (Cohen's k = 0.02-0.81). The Lachman test is the physical examination test with the highest intrarater reliability (Cohen's k = 1.00), and the Lachman test performed in the prone position has the highest interrater reliability (Cohen's k = 0.81). Included studies were partly of low methodological quality. A meta-analysis could not be performed due to the heterogeneity in study populations, reliability measures and methodological quality of included studies. Systematic investigations on the reliability of physical examination tests to assess the integrity of the ACL are scarce and of varying methodological quality. Copyright © 2014 Elsevier Ltd. All rights reserved.
Magnetostatic Surface Field Measurement Facility.
1986-12-01
[OCR fragment: appendix table of contents (C.2 Acoustically Hard Prolate ..., C.3 Perfectly Conducting Sphere, C.4 Some Implications, C.5 References, Appendix D) and garbled equations (2.37)-(2.38) involving the complete elliptic integrals K and E under an assumed exp(jwt) time convention; the text then begins analyzing one of eight secondary distribution subsystems, as shown in Fig. D.2.]
[Movement and tranquility in 19th century Aesthetics].
Muñoz, S
1993-01-01
The nineteenth century saw the rise of the bourgeoisie to social and political power. The values of this class increased the attention paid to certain branches of the medical sciences, such as hygiene. These branches were believed to contain a set of rules and methods for achieving better health and, at the same time, for coming closer to the perfect image of man described by writers on aesthetics, who often took classical Greece as a point of reference. In these strategies, physical exercise plays a role that is valued as positively by hygienists as by philosophers, some of whose works are studied in this article.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Storm, Emma; Weniger, Christoph; Calore, Francesca, E-mail: e.m.storm@uva.nl, E-mail: c.weniger@uva.nl, E-mail: francesca.calore@lapth.cnrs.fr
We present SkyFACT (Sky Factorization with Adaptive Constrained Templates), a new approach for studying, modeling and decomposing diffuse gamma-ray emission. Like most previous analyses, the approach relies on predictions from cosmic-ray propagation codes like GALPROP and DRAGON. However, in contrast to previous approaches, we account for the fact that models are not perfect and allow for a very large number (≳ 10⁵) of nuisance parameters to parameterize these imperfections. We combine methods of image reconstruction and adaptive spatio-spectral template regression in one coherent hybrid approach. To this end, we use penalized Poisson likelihood regression, with regularization functions that are motivated by the maximum entropy method. We introduce methods to efficiently handle the high dimensionality of the convex optimization problem as well as the associated semi-sparse covariance matrix, using the L-BFGS-B algorithm and Cholesky factorization. We test the method both on synthetic data as well as on gamma-ray emission from the inner Galaxy, |ℓ| < 90° and |b| < 20°, as observed by the Fermi Large Area Telescope. We finally define a simple reference model that removes most of the residual emission from the inner Galaxy, based on conventional diffuse emission components as well as components for the Fermi bubbles, the Fermi Galactic center excess, and extended sources along the Galactic disk. Variants of this reference model can serve as basis for future studies of diffuse emission in and outside the Galactic disk.
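To give a feel for the fitting machinery named above, here is a toy sketch of penalized Poisson likelihood regression on a one-dimensional mock "sky": counts are modelled as a fixed template scaled by per-bin nuisance parameters, and deviations of those parameters from unity are penalized with a simple quadratic term (a stand-in for the paper's entropy-motivated regularizers). The data, template and penalty weight are synthetic; only the use of L-BFGS-B for the convex optimization mirrors the text.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
template = 50.0 * np.exp(-0.5 * ((np.arange(100) - 50) / 15.0) ** 2) + 5.0
counts = rng.poisson(template * rng.lognormal(0.0, 0.15, size=template.size))  # imperfect model
lam = 10.0  # regularization strength (hyperparameter)

def objective(log_delta):
    delta = np.exp(log_delta)               # positive per-bin nuisance parameters
    mu = template * delta
    nll = np.sum(mu - counts * np.log(mu))  # Poisson negative log-likelihood (up to a constant)
    return nll + lam * np.sum((delta - 1.0) ** 2)

def gradient(log_delta):
    delta = np.exp(log_delta)
    mu = template * delta
    return (mu - counts) + 2.0 * lam * (delta - 1.0) * delta

res = minimize(objective, x0=np.zeros(template.size), jac=gradient, method="L-BFGS-B")
print(res.success, round(float(objective(res.x)), 1))
```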
Wahab, Tara; Birdsell, Dawn N.; Hjertqvist, Marika; Mitchell, Cedar L.; Wagner, David M.; Keim, Paul S.; Hedenström, Ingela; Löfdahl, Sven
2014-01-01
Tularaemia, caused by the bacterium Francisella tularensis, is endemic in Sweden and is poorly understood. The aim of this study was to evaluate the effectiveness of three different genetic typing systems to link a genetic type to the source and place of tularemia infection in Sweden. Canonical single nucleotide polymorphisms (canSNPs), MLVA including five variable number of tandem repeat loci and PmeI-PFGE were tested on 127 F. tularensis positive specimens collected from Swedish case-patients. All three typing methods identified two major genetic groups with near-perfect agreement. Higher genetic resolution was obtained with canSNP and MLVA compared to PFGE; F. tularensis samples were first assigned into ten phylogroups based on canSNPs followed by 33 unique MLVA types. Phylogroups were geographically analysed to reveal complex phylogeographic patterns in Sweden. The extensive phylogenetic diversity found within individual counties posed a challenge to linking specific genetic types with specific geographic locations. Despite this, a single phylogroup (B.22), defined by a SNP marker specific to a lone Swedish sequenced strain, did link genetic type with a likely geographic place. This result suggests that SNP markers, highly specific to a particular reference genome, may be found most frequently among samples recovered from the same location where the reference genome originated. This insight compels us to consider whole-genome sequencing (WGS) as the appropriate tool for effectively linking specific genetic type to geography. Comparing the WGS of an unknown sample to WGS databases of archived Swedish strains maximizes the likelihood of revealing those rare geographically informative SNPs. PMID:25401326
Quantifying the Dependencies of Rooftop Temperatures on Albedo
NASA Technical Reports Server (NTRS)
Dominquez, Anthony; Kleissl, Jan; Luvall, Jeff
2009-01-01
The thermal properties of building materials directly affect the conditions inside buildings. Heat transfer is not a primary design driver in building design. Rooftop modifications lower heat transfer, which lowers energy consumption and costs. The living environmental laboratory attitude at UCSD makes it the perfect place to test the success of these modifications.
Strategies Courtside. Perfect or Perilous: When Is a Teacher Negligent?
ERIC Educational Resources Information Center
Carpenter, Linda Jean
1994-01-01
In today's society, teachers are more apt to be sued than ever before; it is important, therefore, that they know exactly what negligence means to themselves and their students. The framework of a negligence case (duty, breach, cause, and harm) is explained, and a teacher self-test is provided. (SM)
The Hanging Cord with a Real Tip Mass
ERIC Educational Resources Information Center
Deschaine, J. S.; Suits, B. H.
2008-01-01
Normal mode solutions for the perfectly flexible hanging cord problem have been known for over 200 years. More recently, theoretical results for a hanging cord with a point mass attached were presented. Here the theoretical results are tested experimentally using high-precision techniques which are accessible for use in an introductory laboratory.…
Considerations Underlying the Use of Mixed Group Validation
ERIC Educational Resources Information Center
Jewsbury, Paul A.; Bowden, Stephen C.
2013-01-01
Mixed Group Validation (MGV) is an approach for estimating the diagnostic accuracy of tests. MGV is a promising alternative to the more commonly used Known Groups Validation (KGV) approach for estimating diagnostic accuracy. The advantage of MGV lies in the fact that the approach does not require a perfect external validity criterion or gold…
ERIC Educational Resources Information Center
Schwind, Christina; Buder, Jurgen; Cress, Ulrike; Hesse, Friedrich W.
2012-01-01
The Web is a perfect backdrop for opinion formation as a multitude of different opinions is publicly available. However, the different opinions often remain unexploited: Learners prefer preference-consistent over preference-inconsistent information, a phenomenon called confirmation bias. Two experiments were designed to test whether technologies…
Luneburg lens in silicon photonics.
Di Falco, Andrea; Kehr, Susanne C; Leonhardt, Ulf
2011-03-14
The Luneburg lens is an aberration-free lens that focuses light from all directions equally well. We fabricated and tested a Luneburg lens in silicon photonics. Such fully-integrated lenses may become the building blocks of compact Fourier optics on chips. Furthermore, our fabrication technique is sufficiently versatile for making perfect imaging devices on silicon platforms.
The Environmental Technology Verification report discusses the technology and performance of the PerfectPleat Ultra 175-102-863 air filter for dust and bioaerosol filtration manufactured by AAF International. The pressure drop across the filter was 112 Pa clean and 229 Pa dust lo...
Frederick, R I
2000-01-01
Mixed group validation (MGV) is offered as an alternative to criterion group validation (CGV) to estimate the true positive and false positive rates of tests and other diagnostic signs. CGV requires perfect confidence about each research participant's status with respect to the presence or absence of pathology. MGV determines diagnostic efficiencies based on group data; knowing an individual's status with respect to pathology is not required. MGV can use relatively weak indicators to validate better diagnostic signs, whereas CGV requires perfect diagnostic signs to avoid error in computing true positive and false positive rates. The process of MGV is explained, and a computer simulation demonstrates the soundness of the procedure. MGV of the Rey 15-Item Memory Test (Rey, 1958) for 723 pre-trial criminal defendants resulted in higher estimates of true positive rates and lower estimates of false positive rates as compared with prior research conducted with CGV. The author demonstrates how MGV addresses all the criticisms Rogers (1997b) outlined for differential prevalence designs in malingering detection research. Copyright 2000 John Wiley & Sons, Ltd.
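A minimal numerical sketch of the core MGV idea follows: if the sign-positive rate observed in each group is a prevalence-weighted mixture of the unknown true- and false-positive rates, then two groups with different base rates give two linear equations in TPR and FPR. The base rates and observed rates below are hypothetical, chosen only to show the algebra.

```python
import numpy as np

base_rates = np.array([0.60, 0.15])    # assumed prevalence of the condition in two settings
observed_pos = np.array([0.55, 0.20])  # observed proportion with a positive sign in each group

# observed_i = TPR * p_i + FPR * (1 - p_i), solved as a 2x2 linear system
A = np.column_stack([base_rates, 1.0 - base_rates])
tpr, fpr = np.linalg.solve(A, observed_pos)
print(f"TPR ~ {tpr:.2f}, FPR ~ {fpr:.2f}")  # ~0.86 and ~0.08 for these hypothetical inputs
```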
Lee, Hoseok; Ahn, Joong Mo; Kang, Yusuhn; Oh, Joo Han; Lee, Eugene; Lee, Joon Woo; Kang, Heung Sik
2018-01-01
To compare the T1-weighted spectral presaturation with inversion-recovery sequences (T1 SPIR) with T2-weighted turbo spin-echo sequences (T2 TSE) on 3T magnetic resonance arthrography (MRA) in the evaluation of the subscapularis (SSC) tendon tear with arthroscopic findings as the reference standard. This retrospective study included 120 consecutive patients who had undergone MRA within 3 months between April and December 2015. Two musculoskeletal radiologists blinded to the arthroscopic results evaluated T1 SPIR and T2 TSE images in separate sessions for the integrity of the SSC tendon, examining normal/articular-surface partial-thickness tear (PTTa)/full-thickness tear (FTT). Diagnostic performance of T1 SPIR and T2 TSE was calculated with arthroscopic results as the reference standard, and sensitivity, specificity, and accuracy were compared using the McNemar test. Interobserver agreement was measured with kappa (κ) statistics. There were 74 SSC tendon tears (36 PTTa and 38 FTT) confirmed by arthroscopy. Significant differences were found in the sensitivity and accuracy between T1 SPIR and T2 TSE using the McNemar test, with respective rates of 95.9-94.6% vs. 71.6-75.7% and 90.8-91.7% vs. 79.2-83.3% for detecting tear; 55.3% vs. 31.6-34.2% and 85.8% vs. 78.3-79.2%, respectively, for FTT; and 91.7-97.2% vs. 58.3-61.1% and 89% vs. 78-79.3%, respectively, for PTTa. Interobserver agreement for T1 SPIR was almost perfect for T1 SPIR (κ = 0.839) and substantial for T2 TSE (κ = 0.769). T1-weighted spectral presaturation with inversion-recovery sequences is more sensitive and accurate compared to T2 TSE in detecting SSC tendon tear on 3T MRA.
Liu, San-Xu; Hou, Wei; Zhang, Xue-Yan; Peng, Chang-Jun; Yue, Bi-Song; Fan, Zhen-Xin; Li, Jing
2018-07-18
The Tibetan macaque, which is endemic to China, is currently listed as a Near Endangered primate species by the International Union for Conservation of Nature (IUCN). Short tandem repeats (STRs) are repetitive genomic elements whose repeat units range in length from 1-6 bp. They are found in many organisms and are widely applied in population genetic studies. To clarify the distribution characteristics of genome-wide STRs and understand their variation among Tibetan macaques, we conducted a genome-wide survey of STRs with next-generation sequencing of five macaque samples. A total of 1 077 790 perfect STRs were mined from our assembly, with an N50 of 4 966 bp. Mono-nucleotide repeats were the most abundant, followed by tetra- and di-nucleotide repeats. Analysis of GC content and repeats showed results consistent with other macaques. Furthermore, using STR analysis software (lobSTR), we found that the proportion of base pair deletions in the STRs was greater than that of insertions in the five Tibetan macaque individuals (P<0.05, t-test). We also found a greater number of homozygous STRs than heterozygous STRs (P<0.05, t-test), with the Emei and Jianyang Tibetan macaques showing more heterozygous loci than Huangshan Tibetan macaques. The proportion of insertions and mean variation of alleles in the Emei and Jianyang individuals were slightly higher than those in the Huangshan individuals, thus revealing differences in STR allele size between the two populations. The polymorphic STR loci identified based on the reference genome showed good amplification efficiency and could be used to study population genetics in Tibetan macaques. The neighbor-joining tree classified the five macaques into two different branches according to their geographical origin, indicating high genetic differentiation between the Huangshan and Sichuan populations. We elucidated the distribution characteristics of STRs in the Tibetan macaque genome and provided an effective method for screening polymorphic STRs. Our results also lay a foundation for future genetic variation studies of macaques.
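As a toy illustration of what "mining perfect STRs" means operationally, the sketch below finds tandem repeats of a 1-6 bp motif with a backreference regex. Real pipelines (and the assembly-scale survey above) need far more care with minimal motif representation, overlaps, ambiguous bases and performance; this is only meant to show the idea.

```python
import re

def find_perfect_strs(seq: str, min_copies: int = 4):
    """Yield (start, motif, copy_number) for perfect tandem repeats of a 1-6 bp motif."""
    pattern = re.compile(r"(([ACGT]{1,6}?)\2{%d,})" % (min_copies - 1))
    for m in pattern.finditer(seq.upper()):
        tract, motif = m.group(1), m.group(2)
        yield m.start(), motif, len(tract) // len(motif)

demo = "TTAGCACACACACACGGATGATGATGATGCCTAAAAAAAAGG"
for start, motif, copies in find_perfect_strs(demo):
    print(start, motif, copies)  # (5, 'AC', 5), (16, 'GAT', 4), (32, 'A', 8)
```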
Spectrophotovoltaic orbital power generation
NASA Technical Reports Server (NTRS)
Knowles, G.; Carroll, J.
1983-01-01
A subscale model of a photovoltaic power system employing spectral splitting and 1000:1 concentration was fabricated and tested. The 10-in. aperture model demonstrated 15.5% efficiency with 86% of the energy produced by a GaAs solar cell and 14% of the energy produced by an Si cell. The calculated efficiency of the system using the same solar cells, but having perfect optics, would be approximately 20%. The model design, component measurements, test results, and mathematical model are presented.
NPS Solar Cell Array Tester Cubesat Flight Testing and Integration
2014-09-01
with current (I): P = VI (Eq. 2.1). This is significant because the battery discharge test will not line up perfectly with Figure 12 ... accordance with the charging procedures [13]. 3. NPS-SCAT Power Budget: A power budget analysis was performed to determine if the NPS-SCAT is self... using procedures developed by Marissa Brummitt, and with the assistance of Adam Hill, NPS-SCAT Program Manager. 1. ELaNa IV Random Vibration Levels
Whatman, Chris; Hing, Wayne; Hume, Patria
2012-05-01
To investigate physiotherapist agreement in rating movement quality during lower extremity functional tests using two visual rating methods and physiotherapists with differing clinical experience. Clinical measurement. Six healthy individuals were rated by 44 physiotherapists. These raters were in three groups (inexperienced, novice, experienced). Video recordings of all six individuals performing four lower extremity functional tests were visually rated (dichotomous or ordinal scale) using two rating methods (overall or segment) on two occasions separated by 3-4 weeks. Intra and inter-rater agreement for physiotherapists was determined using overall percentage agreement (OPA) and the first order agreement coefficient (AC1). Intra-rater agreement for overall and segment methods ranged from slight to almost perfect (OPA: 29-96%, AC1: 0.01 to 0.96). AC1 agreement was better in the experienced group (84-99% likelihood) and for dichotomous rating (97-100% likelihood). Inter-rater agreement ranged from fair to good (OPA: 45-79%; AC1: 0.22-0.71). AC1 agreement was not influenced by clinical experience but was again better using dichotomous rating. Physiotherapists' visual rating of movement quality during lower extremity functional tests resulted in slight to almost perfect intra-rater agreement and fair to good inter-rater agreement. Agreement improved with increased level of clinical experience and use of dichotomous rating. Copyright © 2011 Elsevier Ltd. All rights reserved.
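To make the two agreement statistics above concrete, here is a minimal sketch for the simplest case of two raters and a dichotomous scale: overall percentage agreement (OPA) and Gwet's first-order agreement coefficient (AC1). The ratings are invented, and the multi-rater and ordinal-scale extensions used in the study are omitted.

```python
def two_rater_agreement(r1, r2):
    """Overall percentage agreement and Gwet's AC1 for two raters, dichotomous ratings (0/1)."""
    assert len(r1) == len(r2)
    n = len(r1)
    # Observed (overall percentage) agreement
    pa = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement for AC1, based on the mean probability of a "1" rating
    pi1 = (sum(r1) / n + sum(r2) / n) / 2.0
    pe = 2.0 * pi1 * (1.0 - pi1)
    ac1 = (pa - pe) / (1.0 - pe)
    return pa, ac1

# Example: two physiotherapists rating 10 trials as good (1) / poor (0) movement quality
rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater2 = [1, 0, 0, 1, 0, 1, 1, 1, 1, 1]
opa, ac1 = two_rater_agreement(rater1, rater2)
print(f"OPA = {opa:.2f}, AC1 = {ac1:.2f}")
```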
Perfect Detection of Spikes in the Linear Sub-threshold Dynamics of Point Neurons
Krishnan, Jeyashree; Porta Mana, PierGianLuca; Helias, Moritz; Diesmann, Markus; Di Napoli, Edoardo
2018-01-01
Spiking neuronal networks are usually simulated with one of three main schemes: the classical time-driven and event-driven schemes, and the more recent hybrid scheme. All three schemes evolve the state of a neuron through a series of checkpoints: equally spaced in the first scheme and determined neuron-wise by spike events in the latter two. The time-driven and the hybrid scheme determine whether the membrane potential of a neuron crosses a threshold at the end of the time interval between consecutive checkpoints. Threshold crossing can, however, occur within the interval even if this test is negative. Spikes can therefore be missed. The present work offers an alternative geometric point of view on neuronal dynamics, and derives, implements, and benchmarks a method for perfect retrospective spike detection. This method can be applied to neuron models with affine or linear subthreshold dynamics. The idea behind the method is to propagate the threshold with a time-inverted dynamics, testing whether the threshold crosses the neuron state to be evolved, rather than vice versa. Algebraically this translates into a set of inequalities necessary and sufficient for threshold crossing. This test is slower than the imperfect one, but can be optimized in several ways. Comparison confirms earlier results that the imperfect tests rarely miss spikes (a missed-spike fraction of less than 1/10^8) in biologically relevant settings. PMID:29379430
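As a small numerical illustration of the missed-spike problem described above (not the authors' algebraic inequality test), the sketch below evaluates the exact subthreshold solution of a leaky integrate-and-fire neuron driven by an exponentially decaying synaptic current and compares an endpoint-only check at the checkpoints with a dense within-interval check. All parameter values are assumptions chosen so that the supra-threshold excursion fits entirely between two checkpoints.

```python
import numpy as np

# Membrane/synaptic parameters (illustrative): leaky integrate-and-fire neuron with
# dV/dt = (-V + I_syn)/tau_m and an exponentially decaying synaptic current I_syn.
tau_m, tau_s = 10.0, 2.0           # ms
V0, I0, theta = 0.0, 120.0, 15.0   # initial potential, initial current, threshold (mV)
h = 10.0                           # interval between consecutive checkpoints (ms)

def V(t):
    """Exact subthreshold solution of the affine dynamics on the interval."""
    A = I0 * tau_s / (tau_s - tau_m)
    return (V0 - A) * np.exp(-t / tau_m) + A * np.exp(-t / tau_s)

# Endpoint ("checkpoint") test, as used by time-driven and hybrid schemes:
endpoint_crossing = V(h) >= theta

# Dense evaluation inside the interval as a stand-in for perfect detection:
t = np.linspace(0.0, h, 10001)
interior_crossing = np.any(V(t) >= theta)

print(f"V(0) = {V(0.0):.1f} mV, V(h) = {V(h):.1f} mV, max V = {V(t).max():.1f} mV")
print(f"endpoint test sees a spike: {endpoint_crossing}")
print(f"dense within-interval test sees a spike: {interior_crossing}")
```

With these values the potential peaks above threshold at about 4 ms but is back below it at the next checkpoint, so the endpoint test reports no spike while the within-interval test does.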
Rescreening of persons with a negative colonoscopy result: results from a microsimulation model.
Knudsen, Amy B; Hur, Chin; Gazelle, G Scott; Schrag, Deborah; McFarland, Elizabeth G; Kuntz, Karen M
2012-11-06
Persons with a negative result on screening colonoscopy are recommended to repeat the procedure in 10 years. To assess the effectiveness and costs of colonoscopy versus other rescreening strategies after an initial negative colonoscopy result. Microsimulation model. Literature and data from the Surveillance, Epidemiology, and End Results program. Persons aged 50 years who had no adenomas or cancer detected on screening colonoscopy. Lifetime. Societal. No further screening or rescreening starting at age 60 years with colonoscopy every 10 years, annual highly sensitive guaiac fecal occult blood testing (HSFOBT), annual fecal immunochemical testing (FIT), or computed tomographic colonography (CTC) every 5 years. Lifetime cases of colorectal cancer, life expectancy, and lifetime costs per 1000 persons, assuming either perfect or imperfect adherence. Rescreening with any method substantially reduced the risk for colorectal cancer compared with no further screening (range, 7.7 to 12.6 lifetime cases per 1000 persons [perfect adherence] and 17.7 to 20.9 lifetime cases per 1000 persons [imperfect adherence] vs. 31.3 lifetime cases per 1000 persons with no further screening). In both adherence scenarios, the differences in life-years across rescreening strategies were small (range, 30 893 to 30 902 life-years per 1000 persons [perfect adherence] vs. 30 865 to 30 869 life-years per 1000 persons [imperfect adherence]). Rescreening with HSFOBT, FIT, or CTC had fewer complications and was less costly than continuing colonoscopy. Results were sensitive to test-specific adherence rates. Data on adherence to rescreening were limited. Compared with the currently recommended strategy of continuing colonoscopy every 10 years after an initial negative examination, rescreening at age 60 years with annual HSFOBT, annual FIT, or CTC every 5 years provides approximately the same benefit in life-years with fewer complications at a lower cost. Therefore, it is reasonable to use other methods to rescreen persons with negative colonoscopy results. National Cancer Institute.
"Whose perfection is it anyway?": a virtuous consideration of enhancement.
Keenan, James F
1999-08-01
Discussions of genetic enhancements often imply deep suspicions about human desires to manipulate or enhance the course of our future. These unspoken assumptions about the arrogance of the quest for perfection are at odds with the normally hopeful resonancy we find in contemporary theology. The author argues that these fears, suspicions and accusations are misplaced. The problem lies not with the question of whether we should pursue perfection, but rather what perfection we are pursuing. The author argues that perfection, properly understood, has an enormously positive function in the Roman Catholic tradition. The author examines three sources: the Scriptures, the scholastic tradition, and ascetical theology. He examines contemporary criticisms of perfectionism and suggests that an adequate virtue theory keeps us from engaging perfectionism as such. The author then shows how a positive, responsible view of perfection is an asset to our discussion on enhancement technology.
NASA Astrophysics Data System (ADS)
Williams, Christopher J.; Moffitt, Christine M.
2003-03-01
An important emerging issue in fisheries biology is the health of free-ranging populations of fish, particularly with respect to the prevalence of certain pathogens. For many years, pathologists focused on captive populations and interest was in the presence or absence of certain pathogens, so it was economically attractive to test pooled samples of fish. Recently, investigators have begun to study individual fish prevalence from pooled samples. Estimation of disease prevalence from pooled samples is straightforward when assay sensitivity and specificity are perfect, but this assumption is unrealistic. Here we illustrate the use of a Bayesian approach for estimating disease prevalence from pooled samples when sensitivity and specificity are not perfect. We also focus on diagnostic plots to monitor the convergence of the Gibbs-sampling-based Bayesian analysis. The methods are illustrated with a sample data set.
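A minimal sketch of the pooled-testing likelihood with imperfect sensitivity and specificity is shown below, using a simple grid approximation of the posterior rather than the Gibbs sampler used in the paper; the pool counts and assay characteristics are invented for the example.

```python
import numpy as np

def pool_positive_prob(p, k, se, sp):
    """P(a pool of k fish tests positive) given prevalence p, sensitivity se, specificity sp.
    A pool is truly positive if at least one of its k fish is infected."""
    p_true_pos = 1.0 - (1.0 - p) ** k
    return se * p_true_pos + (1.0 - sp) * (1.0 - p_true_pos)

def prevalence_posterior(y, m, k, se, sp, grid=np.linspace(0.0, 1.0, 2001)):
    """Posterior of prevalence (uniform prior) given y positive pools out of m,
    evaluated on a grid -- a simple stand-in for the Gibbs sampler."""
    theta = pool_positive_prob(grid, k, se, sp)
    log_lik = y * np.log(theta) + (m - y) * np.log(1.0 - theta)
    post = np.exp(log_lik - log_lik.max())
    return grid, post / np.trapz(post, grid)

# Example: 12 of 40 pools of 5 fish test positive; assumed assay se = 0.95, sp = 0.98
grid, post = prevalence_posterior(y=12, m=40, k=5, se=0.95, sp=0.98)
mean = np.trapz(grid * post, grid)
print(f"posterior mean prevalence ~ {mean:.3f}")
```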
Taverniers, Isabel; Van Bockstaele, Erik; De Loose, Marc
2004-03-01
Analytical real-time PCR technology is a powerful tool for implementation of the GMO labeling regulations enforced in the EU. The quality of analytical measurement data obtained by quantitative real-time PCR depends on the correct use of calibrator and reference materials (RMs). For GMO methods of analysis, the choice of appropriate RMs is currently under debate. So far, genomic DNA solutions from certified reference materials (CRMs) are most often used as calibrators for GMO quantification by means of real-time PCR. However, due to some intrinsic features of these CRMs, errors may be expected in the estimations of DNA sequence quantities. In this paper, two new real-time PCR methods are presented for Roundup Ready soybean, in which two types of plasmid DNA fragments are used as calibrators. Single-target plasmids (STPs) diluted in a background of genomic DNA were used in the first method. Multiple-target plasmids (MTPs) containing both sequences in one molecule were used as calibrators for the second method. Both methods simultaneously detect a promoter 35S sequence as GMO-specific target and a lectin gene sequence as endogenous reference target in a duplex PCR. For the estimation of relative GMO percentages both "delta C(T)" and "standard curve" approaches are tested. Delta C(T) methods are based on direct comparison of measured C(T) values of both the GMO-specific target and the endogenous target. Standard curve methods measure absolute amounts of target copies or haploid genome equivalents. A duplex delta C(T) method with STP calibrators performed at least as well as a similar method with genomic DNA calibrators from commercial CRMs. Besides this, high quality results were obtained with a standard curve method using MTP calibrators. This paper demonstrates that plasmid DNA molecules containing either one or multiple target sequences form perfect alternative calibrators for GMO quantification and are especially suitable for duplex PCR reactions.
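To make the delta-Ct idea concrete, here is a minimal sketch of relative GMO quantification against a calibrator, assuming equal and perfect (100%) amplification efficiencies for both targets (E = 2). The Ct values and calibrator content are invented and this is not the authors' validated protocol.

```python
def gmo_percent_delta_ct(ct_35s_sample, ct_lectin_sample,
                         ct_35s_calibrator, ct_lectin_calibrator,
                         calibrator_gmo_percent, efficiency=2.0):
    """Relative GMO content by a delta-delta-Ct calculation.

    delta_Ct = Ct(GMO-specific 35S target) - Ct(endogenous lectin target);
    assumes both assays amplify with the same efficiency (default 2.0 = 100%).
    """
    d_ct_sample = ct_35s_sample - ct_lectin_sample
    d_ct_cal = ct_35s_calibrator - ct_lectin_calibrator
    ratio = efficiency ** (-(d_ct_sample - d_ct_cal))   # 2^(-ddCt)
    return calibrator_gmo_percent * ratio

# Hypothetical duplex run: calibrator certified at 1% Roundup Ready soybean
print(gmo_percent_delta_ct(ct_35s_sample=31.1, ct_lectin_sample=24.0,
                           ct_35s_calibrator=30.5, ct_lectin_calibrator=24.1,
                           calibrator_gmo_percent=1.0))
```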
Four-point-bending-fatigue behavior of the Zr-based Vitreloy 105 bulk metallic glass
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morrison, M. L.; Buchanan, R. A.; Liaw, Peter K
The purpose of this study was to make a direct comparison between four-point-bending and uniaxial fatigue tests with the Zr52.5Cu17.9Ni14.6Al10.0Ti5.0 (at.%) BMG alloy (Vitreloy 105). The fatigue lifetimes in four-point bending were found to be greater than those reported in uniaxial testing. However, the fatigue-endurance limit found in four-point bending was slightly less than that reported for uniaxial fatigue. Thus, the significant differences between fatigue studies in the literature are not likely due to this difference in testing geometry. On the contrary, the fatigue lifetimes were found to be highly dependent upon surface defects and material quality. The four-point-bending-fatigue performance of the Vit 105 alloy was found to be greater than most BMGs and similar to the 300 M high-strength steel and other crystalline alloys in spite of not being 'perfectly amorphous.' Due to the detrimental effects of these inhomogeneities and wear at the supporting pins, this fatigue behavior can be assumed to be a conservative estimate of the potential fatigue performance of a perfectly amorphous and homogeneous BMG.
NASA Technical Reports Server (NTRS)
Hoelzer, H. D.; Fourroux, K. A.; Rickman, D. L.; Schrader, C. M.
2011-01-01
Figures of Merit (FoMs) and the FoM software provide a method for quantitatively evaluating the quality of a regolith simulant by comparing the simulant to a reference material. FoMs may be used for comparing a simulant to actual regolith material, for specification (by stating the values a simulant's FoMs must attain to be suitable for a given application), and for comparing simulants from different vendors or production runs. FoMs may even be used to compare different simulants to each other. A single FoM is conceptually an algorithm that computes a single number for quantifying the similarity or difference of a single characteristic of a simulant material and a reference material and provides a clear measure of how well a simulant and reference material match or compare. FoMs have been constructed to lie between zero and 1, with zero indicating a poor or no match and 1 indicating a perfect match. FoMs are defined for modal composition, particle size distribution, particle shape distribution (aspect ratio and angularity), and density. This TM covers the mathematics, use, installation, and licensing for the existing FoM code in detail.
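The exact FoM algorithms are defined in the TM itself; as one plausible illustration of the idea for a single characteristic, the sketch below maps the discrepancy between a simulant's and a reference material's cumulative particle-size distributions onto a score in [0, 1] (1 = perfect match). The construction and the sieve data are assumptions, not the NASA implementation.

```python
import numpy as np

def size_distribution_fom(sizes_ref, cdf_ref, sizes_sim, cdf_sim):
    """Illustrative figure of merit in [0, 1] comparing two cumulative particle-size
    distributions: 1 minus the mean absolute difference between the CDFs, evaluated
    on a common log-size grid (1 = identical curves, 0 = maximally different)."""
    grid = np.logspace(np.log10(min(sizes_ref.min(), sizes_sim.min())),
                       np.log10(max(sizes_ref.max(), sizes_sim.max())), 200)
    f_ref = np.interp(grid, sizes_ref, cdf_ref)
    f_sim = np.interp(grid, sizes_sim, cdf_sim)
    return 1.0 - np.mean(np.abs(f_ref - f_sim))

# Hypothetical sieve data: particle size (micrometres) vs cumulative mass fraction
sizes_ref = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])
cdf_ref   = np.array([0.05, 0.30, 0.55, 0.90, 1.00])
sizes_sim = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])
cdf_sim   = np.array([0.10, 0.40, 0.60, 0.95, 1.00])
print(f"FoM ~ {size_distribution_fom(sizes_ref, cdf_ref, sizes_sim, cdf_sim):.3f}")
```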
Estimating daily climatologies for climate indices derived from climate model data and observations
Mahlstein, Irina; Spirig, Christoph; Liniger, Mark A; Appenzeller, Christof
2015-01-01
Climate indices help to describe the past, present, and the future climate. They are usually more closely related to possible impacts and are therefore more illustrative to users than simple climate means. Indices are often based on daily data series and thresholds. It is shown that the percentile-based thresholds are sensitive to the method of computation, and so are the climatological daily mean and the daily standard deviation, which are used for bias corrections of daily climate model data. Sample size issues of either the observed reference period or the model data lead to uncertainties in these estimations. A large number of past ensemble seasonal forecasts, called hindcasts, is used to explore these sampling uncertainties and to compare two different approaches. Based on a perfect model approach, it is shown that a fitting approach can substantially improve the estimates of daily climatologies of percentile-based thresholds over land areas, as well as the mean and the variability. These improvements are relevant for bias removal in long-range forecasts or predictions of climate indices based on percentile thresholds. The method also shows potential for use in climate change studies. Key Points: more robust estimates of daily climate characteristics; statistical fitting approach; based on a perfect model approach. PMID:26042192
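In the spirit of the fitting approach described above, the short sketch below estimates an empirical day-of-year 90th-percentile threshold from a limited sample and smooths it with a low-order harmonic regression; the synthetic data stand in for the hindcast ensemble and the harmonic order is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(0)
n_years, doy = 30, np.arange(365)

# Synthetic daily temperatures: annual cycle plus noise (illustrative only)
clim = 10.0 + 8.0 * np.sin(2.0 * np.pi * (doy - 110) / 365.0)
temps = clim + rng.normal(0.0, 3.0, size=(n_years, 365))

# Empirical day-of-year 90th-percentile threshold (noisy with only 30 samples per day)
q90_empirical = np.percentile(temps, 90, axis=0)

# Smoothed threshold: fit the empirical percentiles with a 2-harmonic regression
t = 2.0 * np.pi * doy / 365.0
X = np.column_stack([np.ones_like(t), np.sin(t), np.cos(t), np.sin(2 * t), np.cos(2 * t)])
coef, *_ = np.linalg.lstsq(X, q90_empirical, rcond=None)
q90_fitted = X @ coef

print(f"std of empirical minus fitted threshold: {np.std(q90_empirical - q90_fitted):.2f} deg C")
```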
Putzulu, Rossana; Piccirillo, Nicola; Orlando, Nicoletta; Massini, Giuseppina; Maresca, Maddalena; Scavone, Fernando; Ricerca, Bianca Maria; Zini, Gina
2017-04-01
Chronic red blood cell transfusions remain an essential part of supportive treatment in patients with thalassaemia and sickle cell disease (SCD). Red blood cell (RBC) transfusions expose patients to the risk of developing antibodies: RBC alloimmunization occurs when the immune system meets foreign antigens. We created a register of extensively genotyped donors to achieve better matched transfusions in order to reduce transfusion alloimmunization. Extended RBC antigen typing was determined and confirmed by molecular biology techniques using the Human Erythrocyte Antigen (HEA) BeadChip (BioArray Solutions Ltd., Warren, NJ) in periodic blood donors and in patients with thalassaemia and SCD. Over 3 years, we extensively typed 1220 periodic blood donors (898 male and 322 female). We also studied 10 hematologic patients affected by thalassaemia and sickle cell disease referred to our institution as candidates for periodic transfusion. Our patients (8 females and 2 males with a median age of 48 years, range 24-76 years), extensively typed using molecular techniques and screened for RBC alloantibodies, were transfused with a median of 33.5 RBC units. After three years of molecular typing, the "perfect match" transfusion strategy avoided the development of new alloantibodies in all studied patients. Copyright © 2017 Elsevier Ltd. All rights reserved.
Klein tunneling in Weyl semimetals under the influence of magnetic field.
Yesilyurt, Can; Tan, Seng Ghee; Liang, Gengchiau; Jalil, Mansoor B A
2016-12-12
Klein tunneling refers to the absence of normal backscattering of electrons even under the case of high potential barriers. At the barrier interface, the perfect matching of electron and hole wavefunctions enables a unit transmission probability for normally incident electrons. It is theoretically and experimentally well understood in two-dimensional relativistic materials such as graphene. Here we investigate the Klein tunneling effect in Weyl semimetals under the influence of magnetic field induced by ferromagnetic stripes placed at barrier boundaries. Our results show that the resonance of Fermi wave vector at specific barrier lengths gives rise to perfect transmission rings, i.e., three-dimensional analogue of the so-called magic transmission angles in two-dimensional Dirac semimetals. Besides, the transmission profile can be shifted by application of magnetic field in the central region, a property which may be utilized in electro-optic applications. When the applied potential is close to the Fermi level, a particular incident vector can be selected by tuning the magnetic field, thus enabling highly selective transmission of electrons in the bulk of Weyl semimetals. Our analytical and numerical calculations obtained by considering Dirac electrons in three regions and using experimentally feasible parameters can pave the way for relativistic tunneling applications in Weyl semimetals.
Noh, Sung Hyun; Zhang, Ho Yeol
2018-01-25
We intended to analyze the efficacy of a new integrated cage and plate device called Perfect-C for anterior cervical discectomy and fusion (ACDF) to cure single-level cervical degenerative disc disease. We enrolled 148 patients who were subjected to single-level ACDF with one of the following three surgical devices: a Perfect-C implant (41 patients), a Zero-P implant (36 patients), or a titanium plate with a polyetheretherketone (PEEK) cage (71 patients). We conducted a retrospective study to compare the clinical and radiological results among the three groups. The length of the operation, intraoperative blood loss, and duration of hospitalization were significantly lower in the Perfect-C group than in the Zero-P and plate-with-cage groups (P < 0.05). At the last follow-up visit, heterotopic ossification (HO) was not observed in any cases (0%) in the Perfect-C and Zero-P groups but was noted in 21 cases (30%) in the plate-with-cage group. The cephalad and caudal plate-to-disc distance (PDD) and the cephalad and caudal PDD/anterior body height (ABH) were significantly greater in the Perfect-C and Zero-P groups than in the plate-with-cage group (P < 0.05). Subsidence occurred in five cases (14%) in the Perfect-C group, in nine cases (25%) in the Zero-P group, and in 15 cases (21%) in the plate-with-cage group. Fusion occurred in 37 cases (90%) in the Perfect-C group, in 31 cases (86%) in the Zero-P group, and in 68 cases (95%) in the plate-with-cage group. The Perfect-C, Zero-P, and plate-with-cage devices are effective for treating single-level cervical degenerative disc disease. However, the Perfect-C implant has many advantages over both the Zero-P implant and conventional plate-cage treatments. The Perfect-C implant was associated with shorter operation times and hospitalization durations, less blood loss, and lower subsidence rates compared with the Zero-P implant or the titanium plate with a PEEK cage.
What grades and achievement tests measure.
Borghans, Lex; Golsteyn, Bart H H; Heckman, James J; Humphries, John Eric
2016-11-22
Intelligence quotient (IQ), grades, and scores on achievement tests are widely used as measures of cognition, but the correlations among them are far from perfect. This paper uses a variety of datasets to show that personality and IQ predict grades and scores on achievement tests. Personality is relatively more important in predicting grades than scores on achievement tests. IQ is relatively more important in predicting scores on achievement tests. Personality is generally more predictive than IQ on a variety of important life outcomes. Both grades and achievement tests are substantially better predictors of important life outcomes than IQ. The reason is that both capture personality traits that have independent predictive power beyond that of IQ.
Simulation of Spiral Slot Antennas on Composite Platforms
NASA Technical Reports Server (NTRS)
Volakis, John L.
1996-01-01
The project goals, plan, and accomplishments up to this point are summarized in the viewgraphs. Among the various accomplishments, the most important have been: the development of the prismatic finite element code for doubly curved platforms and its validation with many different antenna configurations; the design and fabrication of a new slot spiral antenna suitable for automobile cellular, GPS, and PCS communications; the investigation and development of various mesh truncation schemes, including the perfectly matched absorber and various fast integral equation methods; and the introduction of a frequency domain extrapolation technique (AWE) for predicting broadband responses using only a few samples of the response. This report contains several individual reports, most of which have been submitted for publication to refereed journals. For a report on the frequency extrapolation technique, the reader is referred to the UM Radiation Laboratory report. A total of 14 papers have been published or accepted for publication with the full or partial support of this grant. Several more papers are in preparation.
QUAST: quality assessment tool for genome assemblies.
Gurevich, Alexey; Saveliev, Vladislav; Vyahhi, Nikolay; Tesler, Glenn
2013-04-15
Limitations of genome sequencing techniques have led to dozens of assembly algorithms, none of which is perfect. A number of methods for comparing assemblers have been developed, but none is yet a recognized benchmark. Further, most existing methods for comparing assemblies are only applicable to new assemblies of finished genomes; the problem of evaluating assemblies of previously unsequenced species has not been adequately considered. Here, we present QUAST, a quality assessment tool for evaluating and comparing genome assemblies. This tool improves on leading assembly comparison software with new ideas and quality metrics. QUAST can evaluate assemblies both with and without a reference genome. QUAST produces many reports, summary tables and plots to help scientists in their research and in their publications. In this study, we used QUAST to compare several genome assemblers on three datasets. QUAST tables and plots for all of them are available in the Supplementary Material, and interactive versions of these reports are on the QUAST website: http://bioinf.spbau.ru/quast. Supplementary data are available at Bioinformatics online.
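QUAST reports many metrics; as one concrete example, here is a minimal sketch of the standard N50 statistic (the contig length at which the contigs, taken longest first, cover at least half of the total assembly length), not QUAST's own implementation.

```python
def n50(contig_lengths):
    """N50: length L such that contigs of length >= L cover at least half the assembly."""
    lengths = sorted(contig_lengths, reverse=True)
    half_total = sum(lengths) / 2.0
    running = 0
    for length in lengths:
        running += length
        if running >= half_total:
            return length

print(n50([80, 70, 50, 40, 30, 20, 10]))  # total 300, half 150; 80 + 70 = 150 -> N50 = 70
```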
Salticid predation as one potential driving force of ant mimicry in jumping spiders
Huang, Jin-Nan; Cheng, Ren-Chung; Li, Daiqin; Tso, I-Min
2011-01-01
Many spiders possess myrmecomorphy, and species of the jumping spider genus Myrmarachne exhibit nearly perfect ant mimicry. Most salticids are diurnal predators with unusually high visual acuity that prey on various arthropods, including conspecifics. In this study, we tested whether predation pressure from large jumping spiders is one possible driving force of perfect ant mimicry in jumping spiders. The results showed that small non-ant-mimicking jumping spiders were readily treated as prey by large ones (no matter whether heterospecific or conspecific) and suffered high attack and mortality rates. The size difference between small and large jumping spiders significantly affected the outcomes of predatory interactions between them: the smaller the juvenile jumping spiders, the higher the predation risk from large ones. The attack and mortality rates of ant-mimicking jumping spiders were significantly lower than those of non-ant-mimicking jumping spiders, indicating that a resemblance to ants could provide protection against salticid predation. However, results of multivariate behavioural analyses showed that the responses of large jumping spiders to ants and ant-mimicking salticids differed significantly. Results of this study indicate that predation pressure from large jumping spiders might be one selection force driving the evolution of nearly perfect myrmecomorphy in spiders and other arthropods. PMID:20961898
Giordano, Bruno L; McAdams, Stephen
2006-02-01
Identification of the material of struck objects of variable size was investigated. Previous studies on this issue assumed recognition to be based on acoustical measures of damping. This assumption was tested, comparing the power of a damping measure in explaining identification data with that of several other acoustical descriptors. Listeners' performance was perfect with respect to gross material categories (steel-glass and wood-plexiglass) comprising materials of vastly different mechanical properties. Impaired performance was observed for materials within the same gross category, identification being based on the size of the objects alone. The damping descriptor accounted for the identification of the gross categories. However other descriptors such as signal duration explained the results equally well. Materials within the same gross category were identified mainly on the basis of signal frequency. Overall poor support for the relevance of damping to material perception was found. An analysis of the acoustical support for perfect material identification was carried out. Sufficient acoustical information for perfect performance was found. Thus, procedural biases for the origin of the effects of size could be discarded, pointing toward their cognitive, rather than methodological nature. Identification performance was explained in terms of the regularities of the everyday acoustical environment.
Hammond, Simon P; Cross, Jane L; Shepstone, Lee; Backhouse, Tamara; Henderson, Catherine; Poland, Fiona; Sims, Erika; MacLullich, Alasdair; Penhale, Bridget; Howard, Robert; Lambert, Nigel; Varley, Anna; Smith, Toby O; Sahota, Opinder; Donell, Simon; Patel, Martyn; Ballard, Clive; Young, John; Knapp, Martin; Jackson, Stephen; Waring, Justin; Leavey, Nick; Howard, Gregory; Fox, Chris
2017-12-04
Health and social care provision for an ageing population is a global priority. Provision for those with dementia and hip fracture has specific and growing importance. Older people who break their hip are recognised as exceptionally vulnerable to experiencing confusion (including, but not exclusively, dementia and/or delirium and/or cognitive impairment(s)) before, during or after acute admissions. Older people experiencing hip fracture and confusion risk serious complications, linked to delayed recovery and higher mortality post-operatively. Specific care pathways acknowledging the differences in patient presentation and care needs are proposed to improve clinical and process outcomes. This protocol describes a multi-centre, feasibility, cluster-randomised, controlled trial (CRCT) to be undertaken across ten National Health Service hospital trusts in the UK. The trial will explore the feasibility of undertaking a CRCT comparing the multicomponent PERFECTED enhanced recovery intervention (PERFECT-ER), which acknowledges the differences in care needs of confused older patients experiencing hip fracture, with standard care. The trial will also have an integrated process evaluation to explore how PERFECT-ER is implemented and interacts with the local context. The study will recruit 400 hip fracture patients identified as experiencing confusion and will also recruit "suitable informants" (individuals in regular contact with participants who will complete proxy measures). We will also recruit NHS professionals for the process evaluation. This mixed methods design will produce data to inform a definitive evaluation of the intervention via a large-scale pragmatic randomised controlled trial (RCT). The trial will provide a preliminary estimate of potential efficacy of PERFECT-ER versus standard care; assess service delivery variation; inform primary and secondary outcome selection; generate estimates of recruitment and retention rates, data collection difficulties, and completeness of outcome data; and provide an indication of potential economic benefits. The process evaluation will enhance knowledge of implementation delivery and receipt. ISRCTN, 99336264. Registered on 5 September 2016.
Chen, Yue; Fang, Zhao-Xiang; Ren, Yu-Xuan; Gong, Lei; Lu, Rong-De
2015-09-20
Optical vortices are associated with a spatial phase singularity. Such a beam with a vortex is valuable in optical microscopy, hyper-entanglement, and optical levitation. In these applications, vortex beams with a perfect circle shape and a large topological charge are highly desirable. But the generation of perfect vortices with high topological charges is challenging. We present a novel method to create perfect vortex beams with large topological charges using a digital micromirror device (DMD) through binary amplitude modulation and a narrow Gaussian approximation. The DMD with binary holograms encoding both the spatial amplitude and the phase could generate fast switchable, reconfigurable optical vortex beams with significantly high quality and fidelity. With either the binary Lee hologram or the superpixel binary encoding technique, we were able to generate the corresponding hologram with high fidelity and create a perfect vortex with topological charge as large as 90. The physical properties of the perfect vortex beam produced were characterized through measurements of propagation dynamics and the focusing fields. The measurements show good consistency with the theoretical simulation. The perfect vortex beam produced satisfies high-demand utilization in optical manipulation and control, momentum transfer, quantum computing, and biophotonics.
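For a flavour of the binary amplitude encoding mentioned above, the sketch below generates a Lee-type binary hologram of a charge-l vortex phase for a DMD; the grating period, charge, and Gaussian envelope are illustrative, and the additional steps used in the paper to turn the vortex into a perfect (size-invariant) vortex, such as the narrow Gaussian approximation and the Fourier-lens mapping, are omitted.

```python
import numpy as np

def lee_binary_hologram(phase, amplitude, carrier_period_px=8):
    """Binary Lee-type hologram encoding a complex field (amplitude in [0,1], phase in rad).
    The encoded field appears in the first diffraction order of the carrier grating."""
    nx = phase.shape[1]
    x = np.arange(nx)[None, :]
    carrier = 2.0 * np.pi * x / carrier_period_px
    # Local duty cycle sets the first-order amplitude (~ sin(pi * duty)).
    duty = np.arcsin(np.clip(amplitude, 0.0, 1.0)) / np.pi
    # Pixel is ON where the phase-shifted carrier exceeds the duty-cycle threshold.
    return (np.cos(carrier - phase) >= np.cos(np.pi * duty)).astype(np.uint8)

# Charge-40 vortex phase over a 1024x768 DMD-like grid (illustrative values)
ny, nx, charge = 768, 1024, 40
y, x = np.mgrid[-ny // 2:ny // 2, -nx // 2:nx // 2]
phase = charge * np.arctan2(y, x)
amplitude = np.exp(-(x**2 + y**2) / (2 * 200.0**2))   # Gaussian envelope
hologram = lee_binary_hologram(phase, amplitude)
print(hologram.shape, hologram.dtype, hologram.mean())
```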
ERIC Educational Resources Information Center
Woods, Charlotte; Armstrong, Paul; Bragg, Joanna; Pearson, Diana
2013-01-01
This article examines illustrative cases of partnerships from a government-funded programme of experimental projects in England designed to test out the potential of senior business managers to provide leadership across a group of schools. The article places the programme within the context of international public service reforms and, more…
Grob aircraft construction: The G 110 flies
NASA Technical Reports Server (NTRS)
Malzbender, B.
1982-01-01
Description, specifications and test flight performance of the G 110 are provided. The G 110 completely incorporates modern GfK construction techniques which heretofore have been developed and perfected for the construction of sailplanes. The G 110 is a prototype of a GfK constructed motorized aircraft and shows much promise for the future of German aviation.
Retrieval-Induced Forgetting in Perceptually Driven Memory Tests
ERIC Educational Resources Information Center
Bajo, M. Teresa; Gomez-Ariza, Carlos J.; Fernandez, Angel; Marful, Alejandra
2006-01-01
Recent data (T. J. Perfect, C. J. A. Moulin, M. A. Conway, & E. Perry, 2002) have suggested that retrieval-induced forgetting (RIF) depends on conceptual memory because the effect is not found in perceptually driven tasks. In 3 experiments, the authors aimed to show that the presence of RIF depends on whether the procedure induces appropriate…
Making Sense of Phrasal Verbs: A Cognitive Linguistic Account of L2 Learning
ERIC Educational Resources Information Center
Gonzalez, Rafael Alejo
2010-01-01
Phrasal verbs (PVs) have recently been the object of interest by linguists given their status as phraseological units whose meaning is non-compositional and opaque. They constitute a perfect case for theories of language processing and language acquisition to be tested. Cognitive linguists have participated in this debate and shown a certain…
Proof test of the computer program BUCKY for plasticity problems
NASA Technical Reports Server (NTRS)
Smith, James P.
1994-01-01
A theoretical equation describing the elastic-plastic deformation of a cantilever beam subject to a constant pressure is developed. The theoretical result is compared numerically to the computer program BUCKY for the case of an elastic-perfectly plastic specimen. It is shown that the theoretical and numerical results compare favorably in the plastic range. Comparisons are made to another research code to further validate the BUCKY results. This paper serves as a quality test for the computer program BUCKY developed at NASA Johnson Space Center.
NASA Astrophysics Data System (ADS)
Crescimanno, Michael; Dawson, Nathan J.; Andrews, James H.
2012-09-01
Two classes of conservative, linear, optical rotary effects (optical activity and Faraday rotation) are distinguished by their behavior under time reversal. Faraday rotation, but not optical activity, is capable of coherent perfect rotation, by which we mean the complete transfer of counterpropagating coherent light fields into their orthogonal polarization. Unlike coherent perfect absorption, however, this process is explicitly energy conserving and reversible. Our study highlights the necessity of time-reversal-odd processes (not just absorption) and coherence in perfect mode conversion and thus informs the optimization of active multiport optical devices.
Key forecasts shaping nursing's perfect storm.
Yoder-Wise, Patricia S
2007-01-01
Perfect storms abound in nursing and healthcare. How we plan for them and how we forecast effectively which ones will have tremendous impact on how we lead the profession is a challenge to anyone who is or will be a leader. This article focuses on key forecasts that contribute to creating perfect storms of the future. The "perfect storm" is a term found in multiple disciplines. The phrase denotes the condition that exists when events occur simultaneously with the result that this confluence has a greater impact than what could have resulted from a chance combination. Although perfect storms are rare, they have enormous impact when they occur, and if an alteration in any of the events occurs, the overall impact is lessened.
Zhou, Gaochao; Tao, Xudong; Shen, Ze; Zhu, Guanghao; Jin, Biaobing; Kang, Lin; Xu, Weiwei; Chen, Jian; Wu, Peiheng
2016-01-01
We propose a kind of general framework for the design of a perfect linear polarization converter that works in the transmission mode. Using an intuitive picture that is based on the method of bi-directional polarization mode decomposition, it is shown that when the device under consideration simultaneously possesses two complementary symmetry planes, with one being equivalent to a perfect electric conducting surface and the other being equivalent to a perfect magnetic conducting surface, linear polarization conversion can occur with an efficiency of 100% in the absence of absorptive losses. The proposed framework is validated by two design examples that operate near 10 GHz, where the numerical, experimental and analytic results are in good agreements. PMID:27958313
System for objective assessment of image differences in digital cinema
NASA Astrophysics Data System (ADS)
Fliegel, Karel; Krasula, Lukáš; Páta, Petr; Myslík, Jiří; Pecák, Josef; Jícha, Marek
2014-09-01
There is high demand for quick digitization and subsequent image restoration of archived film records. Digitization is very urgent in many cases because various invaluable pieces of cultural heritage are stored on aging media. Only selected records can be reconstructed perfectly using painstaking manual or semi-automatic procedures. This paper aims to answer the question of what quality requirements the restoration process must meet so that the visual perception of the digitally restored film is acceptably close to that of the original analog film copy. This knowledge is very important to preserve the original artistic intention of the movie producers. A subjective experiment with artificially distorted images was conducted to determine the visual impact of common image distortions in digital cinema. Typical color and contrast distortions were introduced and test images were presented to viewers using a digital projector. Based on the outcome of this subjective evaluation, a system for objective assessment of image distortions has been developed and its performance tested. The system utilizes a calibrated digital single-lens reflex camera and subsequent analysis of suitable features of images captured from the projection screen. The evaluation of captured image data has been optimized in order to obtain predicted differences between the reference and distorted images while achieving high correlation with the results of subjective assessment. The system can be used to objectively determine the difference between analog film and digital cinema images on the projection screen.
AN FDTD ALGORITHM WITH PERFECTLY MATCHED LAYERS FOR CONDUCTIVE MEDIA. (R825225)
We extend Berenger's perfectly matched layers (PML) to conductive media. A finite-difference-time-domain (FDTD) algorithm with PML as an absorbing boundary condition is developed for solutions of Maxwell's equations in inhomogeneous, conductive media. For a perfectly matched laye...
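As a rough, self-contained illustration of absorbing terminations of this kind (not Berenger's split-field PML formulation), the 1D FDTD sketch below grades an electric conductivity sigma into the boundary region and adds a magnetic loss chosen to satisfy the impedance-matching condition sigma*/mu0 = sigma/eps0, so a normally incident pulse is absorbed with little reflection; the grading profile and constants are illustrative assumptions.

```python
import numpy as np

# 1D free-space FDTD (Ez, Hy) with a graded, impedance-matched lossy layer at each end.
c0, eps0, mu0 = 299792458.0, 8.8541878128e-12, 4e-7 * np.pi
nx, npml = 400, 40
dx = 1e-3
dt = 0.99 * dx / c0                        # Courant-stable time step

sigma_e = np.zeros(nx)                     # electric conductivity profile
ramp = (np.arange(npml) / npml) ** 3       # cubic grading into the layer
sigma_max = 0.8 * 4.0 / (np.sqrt(mu0 / eps0) * dx)   # common heuristic for cubic grading
sigma_e[:npml] = sigma_max * ramp[::-1]
sigma_e[-npml:] = sigma_max * ramp
sigma_h = sigma_e * mu0 / eps0             # matching condition sigma*/mu0 = sigma/eps0

# Update coefficients including the loss terms
ce1 = (1 - sigma_e * dt / (2 * eps0)) / (1 + sigma_e * dt / (2 * eps0))
ce2 = (dt / (eps0 * dx)) / (1 + sigma_e * dt / (2 * eps0))
ch1 = (1 - sigma_h * dt / (2 * mu0)) / (1 + sigma_h * dt / (2 * mu0))
ch2 = (dt / (mu0 * dx)) / (1 + sigma_h * dt / (2 * mu0))

Ez, Hy = np.zeros(nx), np.zeros(nx - 1)
for n in range(1200):
    Hy = ch1[:-1] * Hy + ch2[:-1] * (Ez[1:] - Ez[:-1])
    Ez[1:-1] = ce1[1:-1] * Ez[1:-1] + ce2[1:-1] * (Hy[1:] - Hy[:-1])
    Ez[nx // 2] += np.exp(-((n - 60) / 20.0) ** 2)    # soft Gaussian source at the centre

print(f"max |Ez| left in the grid after absorption: {np.abs(Ez).max():.2e}")
```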
ERIC Educational Resources Information Center
Landphair, Juliette
2007-01-01
What exactly is perfect? Students describe perfection as a combination of characteristics valued by their peer culture: intelligence, thin and fit physical appearance, social poise. As students chug through their daily lives--morning classes, organization meetings, club sports practice or the gym, dinner, another class, more meetings, library,…
Nonreflective Conditions for Perfectly Matched Layer in Computational Aeroacoustics
NASA Astrophysics Data System (ADS)
Choung, Hanahchim; Jang, Seokjong; Lee, Soogab
2018-05-01
In computational aeroacoustics, boundary conditions such as radiation, outflow, or absorbing boundary conditions are critical issues in that they can affect the entire solution of the computation. Among these types of boundary conditions, the perfectly matched layer boundary condition, which has been widely used in computational fluid dynamics and computational aeroacoustics, is constructed by augmenting the original governing equations with additional terms weighted by an absorption function so as to stably absorb the outgoing waves. Although the perfectly matched layer is analytically a perfectly nonreflective boundary condition, spurious waves occur at the interface because the analysis is performed in discretized space. Hence, this study focuses on the factors that affect numerical errors from the perfectly matched layer in order to find the optimum conditions for a nonreflective PML. Through a mathematical approach, a minimum width of the perfectly matched layer and an optimum absorption coefficient are suggested. To validate the prediction of the analysis, numerical simulations are performed in a generalized coordinate system, as well as in a Cartesian coordinate system.
SU-E-T-234: Daily Quality Assurance for a Six Degrees of Freedom Couch Using a Novel Phantom
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woods, K; Woollard, J; Ayan, A
2015-06-15
Purpose: To test the accuracy and reproducibility of both translational and rotational movements for a couch with six degrees of freedom (6DoF) using a novel phantom design. Methods: An end-to-end test was carried out using two different phantoms. A 6 cm³ cube with a central fiducial BB (WL-QA Sun Nuclear) and a custom fabricated rectangular prism (31 cm x 8 cm x 8 cm), placed on a baseplate with known angular offsets for pitch, roll and yaw with a central fiducial BB and unique surface structures for registration purposes, were used. The end-to-end test included an initial CT simulation for a reference study, setup to an offset mark on each phantom, registration of the reference CT to the acquired cone-beam CT, and final Winston-Lutz delivery at four cardinal gantry angles. Results for both translational and rotational movements were recorded and compared for both phantoms. Results: Translational and rotational measurements were performed with a PerfectPitch (Varian) couch for 10 trials for both phantoms. Distinct translational shifts were [−5.372±0.384mm, −10.183±0.137mm, 14.028±0.155mm] for the cube and [7.520±0.159mm, −9.117±0.101mm, 16.273±0.115mm] for the prototype phantom for lateral, longitudinal, and vertical shifts, respectively. Distinct rotational adjustments were [1.121±0.102°, −1.067±0.235°, −2.662±0.380°] for the cube and [2.534±0.059°, 1.994±0.025°, 2.094±0.076°] for the prototype for pitch, roll, and yaw, respectively. Winston-Lutz test results performed after 6DoF couch correction from each cardinal gantry angle ranged from 0.26–0.72mm for the cube and 0.55–0.86mm for the prototype. Conclusion: The prototype phantom is more precise for both translational and rotational adjustments compared to a commercial phantom. The design of the prototype phantom allows for a more discernible visual confirmation of correct translational and rotational adjustments. Winston-Lutz results are more accurate for the commercial phantom but are still within tolerance for the prototype phantom.
Quantitating Human Optic Disc Topography
NASA Astrophysics Data System (ADS)
Graebel, William P.; Cohan, Bruce E.; Pearch, Andrew C.
1980-07-01
A method is presented for quantitatively expressing the topography of the human optic disc, applicable in a clinical setting to the diagnosis and management of glaucoma. Photographs of the disc illuminated by a pattern of fine, high contrast parallel lines are digitized. From the measured deviation of the lines as they traverse the disc surface, disc topography is calculated, using the principles of optical sectioning. The quantitators applied to express this topography have the following advantages: sensitivity to disc shape; objectivity; going beyond the limits of cup-disc ratio estimates and volume calculations; perfect generality in a mathematical sense; an inherent scheme for determining a non-subjective reference frame to compare different discs or the same disc over time.
Mars global reference atmosphere model (Mars-GRAM)
NASA Technical Reports Server (NTRS)
Justus, C. G.; James, Bonnie F.
1992-01-01
Mars-GRAM is an empirical model that parameterizes the temperature, pressure, density, and wind structure of the Martian atmosphere from the surface through thermospheric altitudes. In the lower atmosphere of Mars, the model is built around parameterizations of height, latitudinal, longitudinal, and seasonal variations of temperature determined from a survey of published measurements from the Mariner and Viking programs. Pressure and density are inferred from the temperature by making use of the hydrostatic and perfect gas laws relationships. For the upper atmosphere, the thermospheric model of Stewart is used. A hydrostatic interpolation routine is used to insure a smooth transition from the lower portion of the model to the Stewart thermospheric model. Other aspects of the model are discussed.
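As a rough illustration of the hydrostatic/perfect-gas step described above, the sketch below integrates dp/dz = -g p/(R T) upward from the surface and recovers density from rho = p/(R T); the constants (CO2 gas constant ~189 J kg^-1 K^-1, Mars surface gravity ~3.71 m s^-2, surface pressure ~610 Pa) and the linear temperature profile are rough assumptions, not Mars-GRAM's parameterizations.

```python
import numpy as np

R_CO2, G_MARS, P_SURF = 188.9, 3.71, 610.0   # J/(kg K), m/s^2, Pa (approximate values)

def hydrostatic_profile(z, T):
    """Integrate dp/dz = -g*p/(R*T) upward from the surface and return (p, rho).
    z in metres (increasing, z[0] = 0), T in kelvin; perfect-gas law rho = p/(R*T)."""
    lnp = np.empty_like(z, dtype=float)
    lnp[0] = np.log(P_SURF)
    for i in range(1, len(z)):
        T_mid = 0.5 * (T[i] + T[i - 1])             # layer-mean temperature
        lnp[i] = lnp[i - 1] - G_MARS * (z[i] - z[i - 1]) / (R_CO2 * T_mid)
    p = np.exp(lnp)
    return p, p / (R_CO2 * T)

z = np.arange(0.0, 50001.0, 1000.0)              # 0-50 km
T = 210.0 - 1.5e-3 * z                           # toy linear temperature profile (K)
p, rho = hydrostatic_profile(z, T)
print(f"p(50 km) ~ {p[-1]:.1f} Pa, rho(50 km) ~ {rho[-1]:.2e} kg/m^3")
```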
Heterodyne interferometer with subatomic periodic nonlinearity.
Wu, C M; Lawall, J; Deslattes, R D
1999-07-01
A new, to our knowledge, heterodyne interferometer for differential displacement measurements is presented. It is, in principle, free of periodic nonlinearity. A pair of spatially separated light beams with different frequencies is produced by two acousto-optic modulators, avoiding the main source of periodic nonlinearity in traditional heterodyne interferometers that are based on a Zeeman split laser. In addition, laser beams of the same frequency are used in the measurement and the reference arms, giving the interferometer theoretically perfect immunity from common-mode displacement. We experimentally demonstrated a residual level of periodic nonlinearity of less than 20 pm in amplitude. The remaining periodic error is attributed to unbalanced ghost reflections that drift slowly with time.
NASA Astrophysics Data System (ADS)
Gaur, Vinod K.
The article begins with a reference to the first rational approaches to explaining the earth's magnetic field, notably Elsasser's application of magnetohydrodynamics, followed by brief outlines of the characteristics of planetary magnetic fields and of the potentially insightful homopolar dynamo in illuminating the basic issues: the theoretical requirements of asymmetry and finite conductivity in sustaining the dynamo process. It concludes with sections on dynamo modeling and, in particular, the geodynamo, but not before explaining some of the evocative physical processes mediated by the Lorentz force, the behaviour of a flux tube embedded in a perfectly conducting fluid (using Alfvén's theorem), and the traditional intermediate approaches to investigating dynamo processes using the more tractable kinematic models.
12 CFR 1511.3 - Law governing other interests.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 1511.3 Banks and Banking DEPARTMENT OF THE TREASURY RESOLUTION FUNDING CORPORATION BOOK-ENTRY PROCEDURE... perfection, effect of perfection or non-perfection and priority of a security interest in a Security... Securities Account is a clearing corporation, and the Participant's interest in a Book-entry Funding...
12 CFR 1511.3 - Law governing other interests.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 1511.3 Banks and Banking DEPARTMENT OF THE TREASURY RESOLUTION FUNDING CORPORATION BOOK-ENTRY PROCEDURE... perfection, effect of perfection or non-perfection and priority of a security interest in a Security... Securities Account is a clearing corporation, and the Participant's interest in a Book-entry Funding...
Allele-sharing models: LOD scores and accurate linkage tests.
Kong, A; Cox, N J
1997-11-01
Starting with a test statistic for linkage analysis based on allele sharing, we propose an associated one-parameter model. Under general missing-data patterns, this model allows exact calculation of likelihood ratios and LOD scores and has been implemented by a simple modification of existing software. Most important, accurate linkage tests can be performed. Using an example, we show that some previously suggested approaches to handling less than perfectly informative data can be unacceptably conservative. Situations in which this model may not perform well are discussed, and an alternative model that requires additional computations is suggested.
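To make the underlying allele-sharing idea concrete, the sketch below compares the likelihood of observed identity-by-descent sharing counts for affected sib pairs at their estimated proportions against the null expectation (1/4, 1/2, 1/4), giving a LOD-type statistic. This is the basic maximum-lod-score calculation, not the one-parameter model proposed in the paper, and the counts are invented.

```python
import math

def sharing_lod(n_ibd):
    """LOD-type allele-sharing statistic for affected sib pairs.

    n_ibd = (n0, n1, n2): numbers of pairs sharing 0, 1, 2 alleles identical by descent.
    Compares the multinomial likelihood at the observed sharing proportions with the
    null expectation (1/4, 1/2, 1/4). (The 'possible triangle' constraint is ignored.)"""
    null = (0.25, 0.50, 0.25)
    n = sum(n_ibd)
    z_hat = [c / n for c in n_ibd]
    return sum(c * math.log10(z / z0) for c, z, z0 in zip(n_ibd, z_hat, null) if c > 0)

# Hypothetical marker: 200 affected sib pairs showing excess sharing
print(f"LOD = {sharing_lod((35, 95, 70)):.2f}")
```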
A numerical relativity scheme for cosmological simulations
NASA Astrophysics Data System (ADS)
Daverio, David; Dirian, Yves; Mitsou, Ermis
2017-12-01
Cosmological simulations involving the fully covariant gravitational dynamics may prove relevant in understanding relativistic/non-linear features and, therefore, in taking better advantage of the upcoming large scale structure survey data. We propose a new 3 + 1 integration scheme for general relativity in the case where the matter sector contains a minimally-coupled perfect fluid field. The original feature is that we completely eliminate the fluid components through the constraint equations, thus remaining with a set of unconstrained evolution equations for the rest of the fields. This procedure does not constrain the lapse function and shift vector, so it holds in arbitrary gauge and also works for arbitrary equation of state. An important advantage of this scheme is that it allows one to define and pass an adaptation of the robustness test to the cosmological context, at least in the case of pressureless perfect fluid matter, which is the relevant one for late-time cosmology.
Tredennick, Andrew T; de Mazancourt, Claire; Loreau, Michel; Adler, Peter B
2017-04-01
Temporal asynchrony among species helps diversity to stabilize ecosystem functioning, but identifying the mechanisms that determine synchrony remains a challenge. Here, we refine and test theory showing that synchrony depends on three factors: species responses to environmental variation, interspecific interactions, and demographic stochasticity. We then conduct simulation experiments with empirical population models to quantify the relative influence of these factors on the synchrony of dominant species in five semiarid grasslands. We found that the average synchrony of per capita growth rates, which can range from 0 (perfect asynchrony) to 1 (perfect synchrony), was higher when environmental variation was present (0.62) rather than absent (0.43). Removing interspecific interactions and demographic stochasticity had small effects on synchrony. For the dominant species in these plant communities, where species interactions and demographic stochasticity have little influence, synchrony reflects the covariance in species' responses to the environment. © 2017 by the Ecological Society of America.
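A minimal sketch of a community synchrony index of the kind referred to above is given below, computed as the variance of the summed series divided by the squared sum of the species' standard deviations (0 = perfect asynchrony, 1 = perfect synchrony); the random growth-rate series are a stand-in for the empirical population models used in the study.

```python
import numpy as np

def community_synchrony(x):
    """Synchrony index in [0, 1] for a (time x species) matrix of per capita growth rates:
    variance of the community total divided by the squared sum of species standard deviations."""
    total_var = np.var(x.sum(axis=1), ddof=1)
    sd_sum = np.std(x, axis=0, ddof=1).sum()
    return total_var / sd_sum ** 2

rng = np.random.default_rng(1)
years, n_species = 50, 4
common = rng.normal(size=(years, 1))                  # shared environmental response
independent = rng.normal(size=(years, n_species))     # species-specific variation
growth = 0.8 * common + 0.6 * independent             # broadcasting mixes the two
print(f"synchrony ~ {community_synchrony(growth):.2f}")
```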
PARC Navier-Stokes code upgrade and validation for high speed aeroheating predictions
NASA Technical Reports Server (NTRS)
Liver, Peter A.; Praharaj, Sarat C.; Seaford, C. Mark
1990-01-01
Applications of the PARC full Navier-Stokes code for hypersonic flowfield and aeroheating predictions around blunt bodies such as the Aeroassist Flight Experiment (AFE) and Aeroassisted Orbital Transfer Vehicle (AOTV) are evaluated. Two-dimensional/axisymmetric and three-dimensional perfect gas versions of the code were upgraded and tested against benchmark wind tunnel cases of hemisphere-cylinder, three-dimensional AFE forebody, and axisymmetric AFE and AOTV aerobrake/wake flowfields. PARC calculations are in good agreement with experimental data and results of similar computer codes. Difficulties encountered in flowfield and heat transfer predictions due to effects of grid density, boundary conditions such as singular stagnation line axis and artificial dissipation terms are presented together with subsequent improvements made to the code. The experience gained with the perfect gas code is being currently utilized in applications of an equilibrium air real gas PARC version developed at REMTECH.
A computational study of the flowfield surrounding the Aeroassist Flight Experiment vehicle
NASA Technical Reports Server (NTRS)
Gnoffo, Peter A.; Greene, Francis A.
1987-01-01
A symmetric total variation diminishing (STVD) algorithm has been applied to the solution of the three-dimensional hypersonic flowfield surrounding the Aeroassist Flight Experiment (AFE) vehicle. Both perfect-gas and chemical nonequilibrium models have been used. The perfect-gas flows were computed at two different Reynolds numbers, including a flight trajectory point at maximum dynamic pressure, and on two different grids. Procedures for coupling the solution of the species continuity equations with the Navier-Stokes equations in the presence of chemical nonequilibrium are reviewed and tested on the forebody of the AFE and on the complete flowfield assuming noncatalytic wall and no species diffusion. Problems with the STVD algorithm unique to flows with variable thermodynamic properties (real gas) are identified and algorithm modifications are suggested. A potential heating problem caused by strong flow impingement on the nozzle lip in the near wake at 0-deg angle of attack has been identified.
System-wide versus component-specific trust using multiple aids.
Keller, David; Rice, Stephen
2010-01-01
Previous research in operator trust toward automated aids has focused primarily on single aids. The current study focuses on how operator trust is affected by the presence of multiple aids. Two competing theories of multiple-trust are presented. A component-specific trust theory predicts that operators will differentially place their trust in automated aids that vary in reliability. A system-wide trust theory predicts that operators will treat multiple imperfect aids as one "system" and merge their trust across aids despite differences in the aids' reliability. A simulated flight task was used to test these theories, whereby operators performed a pursuit tracking task while concurrently monitoring multiple system gauges that were augmented with perfect or imperfect automated aids. The data revealed that a system-wide trust theory best predicted the data; operators merged their trust across both aids, behaving toward a perfectly reliable aid in the same manner as they did towards unreliable aids.
Mahmmod, Yasser S; Toft, Nils; Katholm, Jørgen; Grønbæk, Carsten; Klaas, Ilka C
2013-11-01
Danish farmers can order a real-time PCR mastitis diagnostic test on routinely taken cow-level samples from milk recordings. Validation of its performance in comparison to conventional mastitis diagnostics under field conditions is essential for efficient control of intramammary infections (IMI) with Staphylococcus aureus (S. aureus). Therefore, the objective of this study was to estimate the sensitivity (Se) and specificity (Sp) of real-time PCR, bacterial culture (BC) and the California mastitis test (CMT) for the diagnosis of naturally occurring IMI with S. aureus in routinely collected milk samples using latent class analysis (LCA) to avoid the assumption of a perfect reference test. Using systematic random sampling, a total of 609 lactating dairy cows were selected from 6 dairy herds with a bulk tank milk PCR cycle threshold (Ct) value ≤39 for S. aureus. At routine milk recordings, automatically obtained cow-level (composite) milk samples were analyzed by PCR and, at the same milking, 2436 quarter milk samples were collected aseptically for BC and CMT. Results showed that 140 cows (23%) were positive for S. aureus IMI by BC while 170 cows (28%) were positive by PCR. Estimates of Se and Sp for PCR were higher than test estimates of BC and CMT. SeCMT was higher than SeBC; however, SpBC was higher than SpCMT. SePCR was 91%, while SeBC was 53%, and SeCMT was 61%. SpPCR was 99%, while SpBC was 89%, and SpCMT was 65%. In conclusion, PCR has a higher performance than the conventional diagnostic tests (BC and CMT), suggesting its usefulness as a routine test for accurate diagnosis of S. aureus IMI from dairy cows at routine milk recordings. The use of LCA provided estimates of the test characteristics for two current diagnostic tests (BC, CMT) and a novel technique (real-time PCR) for diagnosing S. aureus IMI under field conditions at routine milk recordings in Denmark. Copyright © 2013 Elsevier B.V. All rights reserved.
10 CFR 611.108 - Perfection of liens and preservation of collateral.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 10 Energy 4 2010-01-01 2010-01-01 false Perfection of liens and preservation of collateral. 611.108 Section 611.108 Energy DEPARTMENT OF ENERGY (CONTINUED) ASSISTANCE REGULATIONS ADVANCED TECHNOLOGY VEHICLES MANUFACTURER ASSISTANCE PROGRAM Direct Loan Program § 611.108 Perfection of liens and preservation...
10 CFR 609.16 - Perfection of liens and preservation of collateral.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 4 2011-01-01 2011-01-01 false Perfection of liens and preservation of collateral. 609.16 Section 609.16 Energy DEPARTMENT OF ENERGY (CONTINUED) ASSISTANCE REGULATIONS LOAN GUARANTEES FOR PROJECTS THAT EMPLOY INNOVATIVE TECHNOLOGIES § 609.16 Perfection of liens and preservation of collateral. (a...
Structure and symmetry in coherent perfect polarization rotation
NASA Astrophysics Data System (ADS)
Crescimanno, Michael; Zhou, Chuanhong; Andrews, James H.; Baker, Michael A.
2015-01-01
Theoretical investigations of different routes to coherent perfect polarization rotation illustrate its phenomenological connection with coherent perfect absorption. Our study of systems with broken parity, layering, combined Faraday rotation and optical activity, or a rotator-loaded optical cavity highlights their similarity and suggests alternate approaches to improving and miniaturizing optical devices.
NASA Astrophysics Data System (ADS)
Crescimanno, Michael; Dawson, Nathan; Andrews, James
2012-04-01
Two classes of conservative, linear, optical rotary effects (optical activity and Faraday rotation) are distinguished by their behavior under time reversal. In analogy with coherent perfect absorption, where counterpropagating light fields are controllably converted into other degrees of freedom, we show that in a linear-conservative medium only time-odd (Faraday) rotation is capable of coherent perfect rotation, by which we mean the complete transfer of counterpropagating coherent light fields into their orthogonal polarization. This highlights the necessity of time reversal odd processes (not just absorption) and coherence in perfect mode conversion and may inform device design.
Cheng, Fei; Yang, Xiaodong; Gao, Jie
2014-06-01
An infrared refractive index sensor based on plasmonic perfect absorbers for glucose concentration sensing is experimentally demonstrated. Utilizing substantial absorption contrast between a perfect absorber (∼98% at normal incidence) and a non-perfect absorber upon the refractive index change, a maximum value of figure of merit (FOM*) about 55 and a bulk wavelength sensitivity about 590 nm/RIU are achieved. The demonstrated sensing platform provides great potential in improving the performance of plasmonic refractive index sensors and developing future surface enhanced infrared spectroscopy.
Mirshafieyan, Seyed Sadreddin; Luk, Ting S.; Guo, Junpeng
2016-03-04
Here, we demonstrated perfect light absorption in optical nanocavities made of ultra-thin percolation aluminum and silicon films deposited on an aluminum surface. The total layer thickness of the aluminum and silicon films is one order of magnitude less than perfect absorption wavelength in the visible spectral range. The ratio of silicon cavity layer thickness to perfect absorption wavelength decreases as wavelength decreases due to the increased phase delays at silicon-aluminum boundaries at shorter wavelengths. It is explained that perfect light absorption is due to critical coupling of incident wave to the fundamental Fabry-Perot resonance mode of the structure where the round trip phase delay is zero. Simulations were performed and the results agree well with the measurement results.
A proposal of a perfect graphene absorber with enhanced design and fabrication tolerance.
Lee, Sangjun; Tran, Thang Q; Heo, Hyungjun; Kim, Myunghwan; Kim, Sangin
2017-07-06
We propose a novel device structure for the perfect absorption of a one-sided lightwave illumination, which consists of a high-contrast grating (HCG) and an evanescently coupled slab with an absorbing medium (graphene). The operation principle and design process of the proposed structure are analyzed using the coupled mode theory (CMT), which is confirmed by the rigorous coupled wave analysis (RCWA). According to the CMT analysis, in the design of the proposed perfect absorber, the HCG, functioning as a broadband reflector, and the lossy slab structure can be optimized separately. In addition, we have more design parameters than conditions to satisfy; that is, we have more than enough degrees of freedom in the device design. This significantly relieves the complexity of the perfect absorber design. Moreover, in the proposed perfect absorber, most of the incident wave is confined in the slab region with strong field enhancement, so that the absorption performance is very tolerant to the variation of the design parameters near the optimal values for the perfect absorption. It has been demonstrated numerically that absorption spectrum tuning over a wider wavelength range of ~300 nm is possible, keeping significantly high maximum absorption (>95%). It is also shown that the proposed perfect absorber outperforms the previously proposed scheme in all aspects.
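For context, the role of critical coupling in such absorber designs can be stated compactly with a generic single-mode, one-port coupled-mode model (a textbook form assumed here for illustration, not the specific CMT equations of this paper). For a resonance at \(\omega_0\) with external coupling rate \(\gamma_e\) and material absorption rate \(\gamma_a\), the absorbed fraction is

\[ A(\omega) = \frac{4\,\gamma_e\,\gamma_a}{(\omega-\omega_0)^2 + (\gamma_e+\gamma_a)^2}, \]

which reaches unity on resonance exactly when \(\gamma_e = \gamma_a\); detuning either rate from this balance degrades the absorption smoothly, consistent with the design-tolerance argument above.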
Tunable dual-band nearly perfect absorption based on a compound metallic grating
NASA Astrophysics Data System (ADS)
Gao, Hua; Zheng, Zhi-Yuan; Feng, Juan
2017-02-01
Traditional metallic gratings and novel metamaterials are two basic kinds of candidates for perfect absorption. Comparatively speaking, a metallic grating is the preferred choice for the same absorption effect because it is structurally simpler and more convenient to fabricate. However, to date, most of the perfect absorption effects achieved with metamaterials are also available using a metallic grating, except for tunable dual(multi)-band perfect absorption. To fill this gap, in this paper, by adding subgrooves on the rear surface as well as inside the grating slits of a free-standing metallic grating, tunable dual-band perfect absorption is obtained. The grooves inside the slits tune the frequency of the cavity mode (CM) resonance, which enhances the transmission and suppresses the reflectance simultaneously. The grooves on the rear surface give rise to the phase resonance, which not only suppresses the transmission but also reinforces the reflectance depression effect. Thus, when the phase resonance and the frequency-tunable CM resonance occur together, transmission and reflection can be suppressed simultaneously, and dual-band nearly perfect absorption with tunable frequencies is obtained. To our knowledge, this perfect absorption phenomenon is achieved for the first time in a designed metallic grating structure.
Lightning testing at the subsystem level
NASA Technical Reports Server (NTRS)
Luteran, Frank
1991-01-01
Testing at the subsystem or black box level for lightning hardness is required if system hardness is to be assured at the system level. The often applied philosophy of lightning testing only at the system level leads to extensive end-of-the-line design changes which result in excessive costs and time delays. In order to perform testing at the subsystem level, two important factors must be defined to make the testing simulation meaningful. The first factor is the definition of the test stimulus appropriate to the subsystem level. Application of system level stimulations to the subsystem level usually leads to significant overdesign of the subsystem which is not necessary and may impair normal subsystem performance. The second factor is the availability of test equipment needed to provide the subsystem level lightning stimulation. Equipment for testing at this level should be portable or at least movable to enable efficient testing in a design laboratory environment. Large fixed test installations for system level tests are not readily available for use by the design engineers at the subsystem level and usually require special operating skills. The two factors, stimulation level and test equipment availability, must be evaluated together in order to produce a practical, workable test standard. The neglect or subordination of either factor will guarantee failure in generating the standard. It is not unusual to hear that test standards or specifications are waived because a specified stimulation level cannot be accomplished by in-house or independent test facilities. Determination of subsystem lightning simulation level requires a knowledge and evaluation of field coupling modes, peak and median levels of voltages and currents, bandwidths, and repetition rates. Practical limitations on test systems may require tradeoffs in lightning stimulation parameters in order to build practical test equipment. Peak power levels that can be generated at specified bandwidths with standard electrical components must be considered in the design and costing of the test system. Stimulation test equipment and test methods are closely related and must be considered a test system for lightning simulation. A non-perfect specification that can be reliably and repeatedly applied at the subsystem test level is more desirable than a perfect specification that cannot be applied at all.
ERIC Educational Resources Information Center
Carr, Margaret; Mitchell, Linda; Rameka, Lesley
2016-01-01
In this article, the authors argue that the use of Organisation for Economic Co-operation and Development (OECD) standardized tests to evaluate the early childhood education sector, while it may be perfectly "scientific", could be disastrous for "Te Whariki", a curriculum that is child-centred and learning-oriented. The basic…
ERIC Educational Resources Information Center
Ortega, Norma E.
2010-01-01
This study examined the relationship between individual and family perfectionism and mental health functioning among two hundred and seven Latino college students. One aim of this study was to test the factor structure of the Almost Perfect Scale-Revised (APS-R; Slaney, Rice, Mobley, Trippi, & Ashby, 2001) with Latino college students by…
The Social Field(s) of Arts Education Today: Living Vulnerability in Neo-Liberal Times
ERIC Educational Resources Information Center
Dimitriadis, Greg; Cole, Emily; Costello, Adrienne
2009-01-01
The arts are often seen as peripheral to the "real business" of school and schooling. While this has been the case for some time now, the increasing pressures of high-stakes testing and ever-more draconian public funding schemes (particularly in the wake of 9/11) have created something of a "perfect storm" for those working in…
Perfect Mirror Design Technology
1999-02-01
... with Prof. Mario Molina, recipient of the 1995 Nobel Prize in Chemistry. The partnership, along with Aerodyne Research Inc., looked at how sulfur ... Corporation is developing standards for nondestructive evaluation (NDE) techniques for industry use. Use of the new standards will result in improved ... novel testing methodology that dramatically improves the accuracy of NDE techniques used to detect flaws. Basic Research: Five years ago, the main ...
Why African American College Students Miss the Perfect Test Score
ERIC Educational Resources Information Center
Gentry, Ruben; Stokes, Dorothy
2016-01-01
Many African Americans were imbued with the cliché that they must work twice as hard as others to be a success in life. Entering college, students with this belief put extensive effort into earning top grades to ensure quality preparation for their chosen career; yet, some fail to earn top scores. Why? This is the million dollar question, but the…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-26
... rewards would go to the market maker with the highest number of winning tests and 20% of the total rewards... to allocate the rewards differently. Instead of a fixed dollar amount, the Exchange would reward the... and to perfect the mechanism for a free and open market and a national market system, and, in general...
DiRoberto, Cole; Lehto, Crystal; Baccei, Steven J
2016-08-01
The purpose of this study was to improve the transcription of patient information from imaging study requisitions to the radiology information database at a single institution. Five hundred radiology reports from adult outpatient radiographic examinations were chosen randomly from the radiology information system (RIS) and categorized according to their degree of concordance with their corresponding clinical order indications. The number and types of grammatical errors and types of order forms were also recorded. Countermeasures centered on the education of the technical staff and referring physician offices and the implementation of a checklist. Another sample of 500 reports was taken after the implementation of the countermeasures and compared with the baseline data using a χ² test. The number of RIS indications perfectly concordant with their corresponding clinical order indications increased from 232 (46.4%) to 314 (62.8%) after the implementation of the countermeasures (P < .0001). The number of partially concordant matches due to inadequate RIS indications dropped from 162 (32.4%) to 114 (22.8%) (P < .001), whereas the number of partially concordant matches due to inadequate clinical order indications increased from 22 (4.4%) to 57 (11.4%) (P < .0001). The number of discordant pairings dropped from 84 (16.8%) to 15 (3%) (P < .0001). Technologists began to input additional patient information obtained from the patients (not present in the image requisitions) in the RIS after the implementation of the countermeasures. The education of technical staff members and the implementation of a checklist markedly improved the information provided to radiologists on image requisitions from referring providers. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Discus: investigating subjective judgment of optic disc damage.
Denniss, Jonathan; Echendu, Damian; Henson, David B; Artes, Paul H
2011-01-01
To describe a software package (Discus) for investigating clinicians' subjective assessment of optic disc damage [diagnostic accuracy in detecting visual field (VF) damage, decision criteria, and agreement with a panel of experts] and to provide reference data from a group of expert observers. Optic disc images were selected from patients with manifest or suspected glaucoma or ocular hypertension who attended the Manchester Royal Eye Hospital. Eighty images came from eyes without evidence of VF loss in at least four consecutive tests (VF negatives), and 20 images from eyes with repeatable VF loss (VF positives). Software was written to display these images in randomized order, for up to 60 s. Expert observers (n = 12) rated optic disc damage on a 5-point scale (definitely healthy, probably healthy, not sure, probably damaged, and definitely damaged). Optic disc damage as determined by the expert observers predicted VF loss with less than perfect accuracy (mean area under receiver-operating characteristic curve, 0.78; range, 0.72 to 0.85). When the responses were combined across the panel of experts, the area under receiver-operating characteristic curve reached 0.87, corresponding to a sensitivity of ∼60% at 90% specificity. Although the observers' performances were similar, there were large differences between the criteria they adopted (p < 0.001), even though all observers had been given identical instructions. Discus provides a simple and rapid means for assessing important aspects of optic disc interpretation. The data from the panel of expert observers provide a reference against which students, trainees, and clinicians may compare themselves. The program and the analyses described in this article are freely accessible from http://www.discusproject.blogspot.com/.
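The panel-level area under the curve reported above is a rank statistic, and can be reproduced from 5-point ratings with the Mann-Whitney estimator. The sketch below uses invented ratings, not the Discus data.

```python
import numpy as np
from scipy.stats import rankdata

def auc_from_ratings(ratings_pos, ratings_neg):
    """Rank-based (Mann-Whitney) estimate of ROC area; ties handled by midranks."""
    ratings_pos = np.asarray(ratings_pos, float)
    ratings_neg = np.asarray(ratings_neg, float)
    ranks = rankdata(np.concatenate([ratings_pos, ratings_neg]))
    n_pos, n_neg = len(ratings_pos), len(ratings_neg)
    rank_sum_pos = ranks[:n_pos].sum()
    return (rank_sum_pos - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Hypothetical 1-5 ratings (definitely healthy ... definitely damaged)
vf_positive = [5, 4, 4, 3, 5, 2, 4]          # eyes with repeatable VF loss
vf_negative = [1, 2, 2, 3, 1, 4, 2, 1, 2, 3] # eyes without VF loss
print(f"AUC = {auc_from_ratings(vf_positive, vf_negative):.2f}")
```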
Shawar, R; Paetznick, V; Witte, Z; Ensign, L G; Anaissie, E; LaRocco, M
1992-01-01
A study was performed in two laboratories to evaluate the effect of growth medium and test methodology on inter- and intralaboratory variations in the MICs of amphotericin B (AMB), flucytosine (5FC), fluconazole (FLU), itraconazole (ITRA), and the triazole Sch 39304 (SCH) against 14 isolates of Candida albicans. Testing was performed by broth microdilution and semisolid agar dilution with the following media, buffered to pH 7.0 with morpholinepropanesulfonic acid (MOPS): buffered yeast nitrogen base (BYNB), Eagle's minimal essential medium (EMEM), RPMI 1640 medium (RPMI), and synthetic amino acid medium for fungi (SAAMF). Inocula were standardized spectrophotometrically, and endpoints were defined by the complete absence of growth for AMB and by no more than 25% of the growth in the drug-free control for all other agents. Comparative analyses of median MICs, as determined by each test method, were made for all drug-medium combinations. Both methods yielded similar (+/- 1 twofold dilution) median MICs for AMB in EMEM and RPMI, 5FC in all media, and FLU in EMEM, RPMI, and SAAMF. In contrast, substantial between-method variations in median MICs were seen for AMB in BYNB and SAAMF, FLU in BYNB, and ITRA and SCH in all media. Interlaboratory concordance of median MICs was good for AMB, 5FC, and FLU but poor for ITRA and SCH in all media. Endpoint determinations were analyzed by use of kappa statistical analyses for evaluating the strength of observer agreement. Moderate to almost perfect interlaboratory agreement occurred with AMB and 5FC in all media and with FLU in EMEM, RPMI, and SAAMF, irrespective of the test method. Slight to almost perfect interlaboratory agreement occurred with ITRA and SCH in EMEM, RPMI, and SAAMF when tested by semisolid agar dilution but not broth microdilution. Kappa values assessing intralaboratory agreement between methods were high for 5FC in all media, for AMB in BYNB, EMEM, and RPMI, and for FLU in EMEM, RPMI, and SAAMF. One laboratory, but not the other, reported substantial to almost perfect agreement between methods for ITRA and SCH in EMEM, RPMI, and SAAMF. Both laboratories reported poor agreement between methods for the azoles in BYNB. Discrepancies noted in azole-BYNB combinations were largely due to the greater inhibitory effect of these agents in BYNB than in other media. These results indicate that the semisolid agar dilution and broth microdilution methods with EMEM or RPMI yield equivalent and reproducible MICs for AMB, 5FC, and FLU but not ITRA and SCH. PMID:1500502
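The agreement grades quoted above (slight, moderate, substantial, almost perfect) correspond to conventional ranges of Cohen's kappa. A minimal sketch of the statistic itself, applied to invented per-isolate calls rather than the study's data:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two raters over the same items (nominal categories)."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    categories = set(labels_a) | set(labels_b)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    count_a, count_b = Counter(labels_a), Counter(labels_b)
    expected = sum(count_a[c] * count_b[c] for c in categories) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical per-isolate calls from two laboratories (not the study's data)
lab1 = ["S", "S", "R", "S", "R", "S", "R", "S", "S", "R", "S", "S", "R", "S"]
lab2 = ["S", "S", "R", "S", "S", "S", "R", "S", "S", "R", "S", "R", "R", "S"]
print(f"kappa = {cohens_kappa(lab1, lab2):.2f}")
```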
Shawar, R; Paetznick, V; Witte, Z; Ensign, L G; Anaissie, E; LaRocco, M
1992-08-01
A study was performed in two laboratories to evaluate the effect of growth medium and test methodology on inter- and intralaboratory variations in the MICs of amphotericin B (AMB), flucytosine (5FC), fluconazole (FLU), itraconazole (ITRA), and the triazole Sch 39304 (SCH) against 14 isolates of Candida albicans. Testing was performed by broth microdilution and semisolid agar dilution with the following media, buffered to pH 7.0 with morpholinepropanesulfonic acid (MOPS): buffered yeast nitrogen base (BYNB), Eagle's minimal essential medium (EMEM), RPMI 1640 medium (RPMI), and synthetic amino acid medium for fungi (SAAMF). Inocula were standardized spectrophotometrically, and endpoints were defined by the complete absence of growth for AMB and by no more than 25% of the growth in the drug-free control for all other agents. Comparative analyses of median MICs, as determined by each test method, were made for all drug-medium combinations. Both methods yielded similar (+/- 1 twofold dilution) median MICs for AMB in EMEM and RPMI, 5FC in all media, and FLU in EMEM, RPMI, and SAAMF. In contrast, substantial between-method variations in median MICs were seen for AMB in BYNB and SAAMF, FLU in BYNB, and ITRA and SCH in all media. Interlaboratory concordance of median MICs was good for AMB, 5FC, and FLU but poor for ITRA and SCH in all media. Endpoint determinations were analyzed by use of kappa statistical analyses for evaluating the strength of observer agreement. Moderate to almost perfect interlaboratory agreement occurred with AMB and 5FC in all media and with FLU in EMEM, RPMI, and SAAMF, irrespective of the test method. Slight to almost perfect interlaboratory agreement occurred with ITRA and SCH in EMEM, RPMI, and SAAMF when tested by semisolid agar dilution but not broth microdilution. Kappa values assessing intralaboratory agreement between methods were high for 5FC in all media, for AMB in BYNB, EMEM, and RPMI, and for FLU in EMEM, RPMI, and SAAMF. One laboratory, but not the other, reported substantial to almost perfect agreement between methods for ITRA and SCH in EMEM, RPMI, and SAAMF. Both laboratories reported poor agreement between methods for the azoles in BYNB. Discrepancies noted in azole-BYNB combinations were largely due to the greater inhibitory effect of these agents in BYNB than in other media. These results indicate that the semisolid agar dilution and broth microdilution methods with EMEM or RPMI yield equivalent and reproducible MICs for AMB, 5FC, and FLU but not ITRA and SCH.
Conservative, special-relativistic smoothed particle hydrodynamics
NASA Astrophysics Data System (ADS)
Rosswog, Stephan
2010-11-01
We present and test a new, special-relativistic formulation of smoothed particle hydrodynamics (SPH). Our approach benefits from several improvements with respect to earlier relativistic SPH formulations. It is self-consistently derived from the Lagrangian of an ideal fluid and accounts for the terms that stem from non-constant smoothing lengths, usually called “grad-h terms”. In our approach, we evolve the canonical momentum and the canonical energy per baryon and thus circumvent some of the problems that have plagued earlier formulations of relativistic SPH. We further use a much improved artificial viscosity prescription which uses the extreme local eigenvalues of the Euler equations and triggers selectively on (a) shocks and (b) velocity noise. The shock trigger accurately monitors the relative density slope and uses it to fine-tune the amount of artificial viscosity that is applied. This procedure substantially sharpens shock fronts while still avoiding post-shock noise. If not triggered, the viscosity parameter of each particle decays to zero. Neither of these viscosity triggers is specific to special relativity; both could also be applied in Newtonian SPH. The performance of the new scheme is explored in a large variety of benchmark tests where it delivers excellent results. Generally, the grad-h terms deliver minor, though worthwhile, improvements. As expected for a Lagrangian method, it performs close to perfectly in supersonic advection tests, but even in strong relativistic shocks, usually considered a particular challenge for SPH, the method yields convincing results. For example, due to its perfect conservation properties, it is able to handle Lorentz factors as large as γ = 50,000 in the so-called wall shock test. Moreover, we find convincing results in a rarely shown, but challenging test that involves so-called relativistic simple waves and also in multi-dimensional shock tube tests.
Duan, Yuetao; Luo, Jie; Wang, Guanghao; Hang, Zhi Hong; Hou, Bo; Li, Jensen; Sheng, Ping; Lai, Yun
2015-01-01
We derive and numerically demonstrate that perfect absorption of elastic waves can be achieved in two types of ultra-thin elastic meta-films: one requires a large value of almost pure imaginary effective mass density and a free space boundary, while the other requires a small value of almost pure imaginary effective modulus and a hard wall boundary. When the pure imaginary density or modulus exhibits certain frequency dispersions, the perfect absorption effect becomes broadband, even in the low frequency regime. Through a model analysis, we find that such almost pure imaginary effective mass density with required dispersion for perfect absorption can be achieved by elastic metamaterials with large damping. Our work provides a feasible approach to realize broadband perfect absorption of elastic waves in ultra-thin films. PMID:26184117
NASA Astrophysics Data System (ADS)
Omrani, H.; Drobinski, P.; Dubos, T.
2009-09-01
In this work, we consider the effect of indiscriminate nudging time on the large and small scales of an idealized limited area model simulation. The limited area model is a two-layer quasi-geostrophic model on the beta-plane driven at its boundaries by its "global" version with periodic boundary conditions. This setup mimics the configuration used for regional climate modelling. Compared to a previous study by Salameh et al. (2009), who investigated the existence of an optimal nudging time minimizing the error on both large and small scales in a linear model, we here use a fully non-linear model which allows us to represent the chaotic nature of the atmosphere: given the perfect quasi-geostrophic model, errors in the initial conditions, concentrated mainly in the smaller scales of motion, amplify and cascade into the larger scales, eventually resulting in a prediction with low skill. To quantify the predictability of our quasi-geostrophic model, we measure the rate of divergence of the system trajectories in phase space (Lyapunov exponent) from a set of simulations initiated with a perturbation of a reference initial state. Predictability of the "global", periodic model is mostly controlled by the beta effect. In the LAM, predictability decreases as the domain size increases. Then, the effect of large-scale nudging is studied by using the "perfect model" approach. Two sets of experiments were performed: (1) the effect of nudging is investigated with a "global" high-resolution two-layer quasi-geostrophic model driven by a low-resolution two-layer quasi-geostrophic model; (2) similar simulations are conducted with the two-layer quasi-geostrophic LAM, where the size of the LAM domain comes into play in addition to the first set of simulations. In the two sets of experiments, the best spatial correlation between the nudged simulation and the reference is observed with a nudging time close to the predictability time.
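The predictability measure used here, the growth rate of the separation between a reference run and a slightly perturbed run, can be estimated generically by fitting the slope of the log separation during its exponential-growth phase. The sketch below uses the Lorenz system purely as a stand-in for the two-layer quasi-geostrophic model; that substitution, and all numerical values, are assumptions for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # A small chaotic system standing in for the quasi-geostrophic model
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t_eval = np.linspace(0, 20, 2001)
s0 = np.array([1.0, 1.0, 1.0])
eps = 1e-8                                    # initial perturbation size
ref = solve_ivp(lorenz, (0, 20), s0, t_eval=t_eval, rtol=1e-9, atol=1e-12)
per = solve_ivp(lorenz, (0, 20), s0 + [eps, 0, 0], t_eval=t_eval, rtol=1e-9, atol=1e-12)

# Largest Lyapunov exponent ~ slope of ln(separation) before it saturates
sep = np.linalg.norm(ref.y - per.y, axis=0)
growth = slice(100, 1200)
lam = np.polyfit(t_eval[growth], np.log(sep[growth]), 1)[0]
print(f"estimated largest Lyapunov exponent ~ {lam:.2f} (Lorenz reference ~0.9)")
```

The inverse of this exponent sets the predictability time against which the nudging time is compared.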
Romero-Franco, Natalia; Jiménez-Reyes, Pedro; Castaño-Zambudio, Adrián; Capelo-Ramírez, Fernando; Rodríguez-Juan, Juan José; González-Hernández, Jorge; Toscano-Bendala, Francisco Javier; Cuadrado-Peñafiel, Víctor; Balsalobre-Fernández, Carlos
2017-05-01
The purpose of this study was to assess validity and reliability of sprint performance outcomes measured with an iPhone application (named: MySprint) and existing field methods (i.e. timing photocells and radar gun). To do this, 12 highly trained male sprinters performed 6 maximal 40-m sprints during a single session which were simultaneously timed using 7 pairs of timing photocells, a radar gun and a newly developed iPhone app based on high-speed video recording. Several split times as well as mechanical outputs computed from the model proposed by Samozino et al. [(2015). A simple method for measuring power, force, velocity properties, and mechanical effectiveness in sprint running. Scandinavian Journal of Medicine & Science in Sports. https://doi.org/10.1111/sms.12490] were then measured by each system, and values were compared for validity and reliability purposes. First, there was an almost perfect correlation between the values of time for each split of the 40-m sprint measured with MySprint and the timing photocells (r = 0.989-0.999, standard error of estimate = 0.007-0.015 s, intraclass correlation coefficient (ICC) = 1.0). Second, almost perfect associations were observed for the maximal theoretical horizontal force (F0), the maximal theoretical velocity (V0), the maximal power (Pmax) and the mechanical effectiveness (DRF - decrease in the ratio of force over acceleration) measured with the app and the radar gun (r = 0.974-0.999, ICC = 0.987-1.00). Finally, when analysing the performance outputs of the six different sprints of each athlete, almost identical levels of reliability were observed as revealed by the coefficient of variation (MySprint: CV = 0.027-0.14%; reference systems: CV = 0.028-0.11%). Results of the present study showed that sprint performance can be evaluated in a valid and reliable way using a novel iPhone app.
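The mechanical outputs above follow from fitting a mono-exponential velocity model to the split times and converting the fitted parameters to force, velocity, and power. The sketch below uses a common form of that model with invented split times and body mass; aerodynamic drag, which the full method accounts for, is neglected here.

```python
import numpy as np
from scipy.optimize import curve_fit

# Mono-exponential sprint model (a common form, assumed here for illustration):
#   v(t) = v_max * (1 - exp(-t / tau))
#   x(t) = v_max * (t + tau * exp(-t / tau) - tau)
def position(t, v_max, tau):
    return v_max * (t + tau * np.exp(-t / tau) - tau)

# Hypothetical split times for 5, 10, 15, 20, 30, 40 m (not the study's data)
distances = np.array([5, 10, 15, 20, 30, 40], dtype=float)
times = np.array([1.38, 2.10, 2.72, 3.30, 4.42, 5.49])
mass = 78.0  # kg, assumed

(v_max, tau), _ = curve_fit(position, times, distances, p0=[9.0, 1.0])

# Neglecting drag, the force-velocity relationship is linear:
F0 = mass * v_max / tau      # maximal theoretical horizontal force (N)
V0 = v_max                   # maximal theoretical velocity (m/s)
P_max = F0 * V0 / 4.0        # maximal horizontal power (W)
print(f"V0 = {V0:.2f} m/s, F0 = {F0:.0f} N, Pmax = {P_max:.0f} W")
```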
2002-10-01
Gravity Probe-B (GP-B) is the relativity experiment being developed at Stanford University to test two extraordinary predictions of Albert Einstein's general theory of relativity. The experiment will measure, very precisely, the expected tiny changes in the direction of the spin axes of four gyroscopes contained in an Earth-orbiting satellite at a 400-mile altitude. So free are the gyroscopes from disturbance that they will provide an almost perfect space-time reference system. They will measure how space and time are very slightly warped by the presence of the Earth, and, more profoundly, how the Earth's rotation very slightly drags space-time around with it. These effects, though small for the Earth, have far-reaching implications for the nature of matter and the structure of the Universe. GP-B is among the most thoroughly researched programs ever undertaken by NASA. This is the story of a scientific quest in which physicists and engineers have collaborated closely over many years. Inspired by their quest, they have invented a whole range of technologies -- technologies that are already enlivening other branches of science and engineering. Scheduled for launch in 2003 and managed for NASA by Marshall Space Flight Center, development of the GP-B is the responsibility of Stanford University, with major subcontractor Lockheed Martin Corporation.
2002-10-01
Gravity Probe-B (GP-B) is the relativity experiment being developed at Stanford University to test two extraordinary predictions of Albert Einstein's general theory of relativity. The experiment will measure, very precisely, the expected tiny changes in the direction of the spin axes of four gyroscopes contained in an Earth-orbiting satellite at a 400-mile altitude. So free are the gyroscopes from disturbance that they will provide an almost perfect space-time reference system. They will measure how space and time are very slightly warped by the presence of the Earth, and, more profoundly, how the Earth's rotation very slightly drags space-time around with it. These effects, though small for the Earth, have far-reaching implications for the nature of matter and the structure of the Universe. GP-B is among the most thoroughly researched programs ever undertaken by NASA. This is the story of a scientific quest in which physicists and engineers have collaborated closely over many years. Inspired by their quest, they have invented a whole range of technologies -- technologies that are already enlivening other branches of science and engineering. Scheduled for launch in 2003 and managed for NASA by Marshall Space Flight Center, development of GP-B is the responsibility of Stanford University, with major subcontractor Lockheed Martin Corporation.
2002-10-01
Gravity Probe-B (GP-B) is the relativity experiment being developed at Stanford University to test two extraordinary predictions of Albert Einstein's general theory of relativity. The experiment will measure, very precisely, the expected tiny changes in the direction of the spin axes of four gyroscopes contained in an Earth-orbiting satellite at a 400-mile altitude. So free are the gyroscopes from disturbance that they will provide an almost perfect space-time reference system. They will measure how space and time are very slightly warped by the presence of the Earth, and, more profoundly, how the Earth's rotation very slightly drags space-time around with it. These effects, though small for the Earth, have far-reaching implications for the nature of matter and the structure of the Universe. GP-B is among the most thoroughly researched programs ever undertaken by NASA. This is the story of a scientific quest in which physicists and engineers have collaborated closely over many years. Inspired by their quest, they have invented a whole range of technologies -- technologies that are already enlivening other branches of science and engineering. Scheduled for launch in 2003 and managed for NASA by the Marshall Space Flight Center, development of GP-B is the responsibility of Stanford University, with major subcontractor Lockheed Martin Corporation.
High-resolution moisture profiles from full-waveform probabilistic inversion of TDR signals
NASA Astrophysics Data System (ADS)
Laloy, Eric; Huisman, Johan Alexander; Jacques, Diederik
2014-11-01
This study presents a novel Bayesian inversion scheme for high-dimensional undetermined TDR waveform inversion. The methodology quantifies uncertainty in the moisture content distribution, using a Gaussian Markov random field (GMRF) prior as a regularization operator. A spatial resolution of 1 cm along a 70-cm long TDR probe is considered for the inferred moisture content. Numerical testing shows that the proposed inversion approach works very well in case of a perfect model and Gaussian measurement errors. Real-world application results are generally satisfying. For a series of TDR measurements made during imbibition and evaporation from a laboratory soil column, the average root-mean-square error (RMSE) between maximum a posteriori (MAP) moisture distribution and reference TDR measurements is 0.04 cm3 cm-3. This RMSE value reduces to less than 0.02 cm3 cm-3 for a field application in a podzol soil. The observed model-data discrepancies are primarily due to model inadequacy, such as our simplified modeling of the bulk soil electrical conductivity profile. Among the important issues that should be addressed in future work are the explicit inference of the soil electrical conductivity profile along with the other sampled variables, the modeling of the temperature-dependence of the coaxial cable properties and the definition of an appropriate statistical model of the residual errors.
NASA Technical Reports Server (NTRS)
Radhakrishnan, Krishnan
1994-01-01
LSENS, the Lewis General Chemical Kinetics and Sensitivity Analysis Code, has been developed for solving complex, homogeneous, gas-phase chemical kinetics problems and contains sensitivity analysis for a variety of problems, including nonisothermal situations. This report is part 1 of a series of three reference publications that describe LSENS, provide a detailed guide to its usage, and present many example problems. Part 1 derives the governing equations and describes the numerical solution procedures for the types of problems that can be solved. The accuracy and efficiency of LSENS are examined by means of various test problems, and comparisons with other methods and codes are presented. LSENS is a flexible, convenient, accurate, and efficient solver for chemical reaction problems such as static system; steady, one-dimensional, inviscid flow; reaction behind incident shock wave, including boundary layer correction; and perfectly stirred (highly backmixed) reactor. In addition, the chemical equilibrium state can be computed for the following assigned states: temperature and pressure, enthalpy and pressure, temperature and volume, and internal energy and volume. For static problems the code computes the sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of the dependent variables and/or the three rate coefficient parameters of the chemical reactions.
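As a toy illustration of the kind of problem such a code addresses (this is not LSENS and not its algorithms), a single first-order reaction and its sensitivity to the rate coefficient can be integrated as an augmented ODE system.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy single-reaction kinetics: dy/dt = -k*y, with the sensitivity s = dy/dk
# integrated alongside: ds/dt = -k*s - y. Exact: y = exp(-k*t), s = -t*exp(-k*t).
k = 2.0

def rhs(t, state):
    y, s = state
    return [-k * y, -k * s - y]

sol = solve_ivp(rhs, (0.0, 2.0), [1.0, 0.0], t_eval=np.linspace(0, 2, 5),
                rtol=1e-10, atol=1e-12)

for t, y, s in zip(sol.t, sol.y[0], sol.y[1]):
    exact_y, exact_s = np.exp(-k * t), -t * np.exp(-k * t)
    print(f"t={t:.1f}  y={y:.6f} (exact {exact_y:.6f})  dy/dk={s:.6f} (exact {exact_s:.6f})")
```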
Kwak, Namhee; Swan, Joshua T; Thompson-Moore, Nathaniel; Liebl, Michael G
2016-08-01
This study aims to develop a systematic search strategy and test its validity and reliability in terms of identifying projects published in peer-reviewed journals as reported by residency graduates through an online survey. This study was a prospective blind comparison to a reference standard. Pharmacy residency projects conducted at the study institution between 2001 and 2012 were included. A step-wise, systematic procedure containing up to 8 search strategies in PubMed and EMBASE for each project was created using the names of authors and abstract keywords. In order to further maximize sensitivity, complex phrases with multiple variations were truncated to the root word. Validity was assessed by obtaining information on publications from an online survey deployed to residency graduates. The search strategy identified 13 publications (93% sensitivity, 100% specificity, and 99% accuracy). Both methods identified a similar proportion achieving publication (19.7% search strategy vs 21.2% survey, P = 1.00). Reliability of the search strategy was affirmed by the perfect agreement between 2 investigators (k = 1.00). This systematic search strategy demonstrated a high sensitivity, specificity, and accuracy for identifying publications resulting from pharmacy residency projects using information available in residency conference abstracts. © The Author(s) 2015.
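The reported sensitivity, specificity, and accuracy follow from a 2x2 comparison of the search strategy against the survey reference standard. The counts in the sketch below are assumptions chosen to be consistent with the reported percentages, not a table stated in the abstract.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and accuracy from a 2x2 comparison."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp) if (tp + fp) else float("nan"),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts: rows = search strategy, columns = survey reference standard
metrics = diagnostic_metrics(tp=13, fp=0, fn=1, tn=52)
for name, value in metrics.items():
    print(f"{name}: {value:.2f}")
```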
NASA Astrophysics Data System (ADS)
Liu, Xiaoxin; Feng, Peilei; Jan, Lisheng; Dai, Xiaozhong; Cai, Pengcheng
2018-01-01
In recent years, Nujiang Prefecture has vigorously developed hydropower, but the grid structure in the northwest of Yunnan Province is not yet well developed, so research on and construction of the power grid lag behind hydropower development. In 2015, in view of the difficulty of evacuating Nujiang hydropower, the company decided to equip the Nujiang outgoing transmission corridor with a series compensation device in order to improve its transmission capacity. The company planned for the main system-level problems but did not study the regional distribution network in detail. With series compensation added, the Nujiang grid takes on a unique structure: the network is effectively divided into two parts, a power delivery channel and a load-supply section, and the overall grid changes fundamentally, so the original network operating strategy is no longer applicable. Of particular concern is independent (islanded) operation of the local network after a failure of the main network; how to avoid local and emergency problems becomes even more urgent and severely tests the regional power grid. Based on analysis of the existing data and on simulation, this paper provides a reference for the stable operation of the power grid after series compensation.
Artist's Concept of Gravity Probe-B
NASA Technical Reports Server (NTRS)
2002-01-01
Gravity Probe-B (GP-B) is the relativity experiment being developed at Stanford University to test two extraordinary predictions of Albert Einstein's general theory of relativity. The experiment will measure, very precisely, the expected tiny changes in the direction of the spin axes of four gyroscopes contained in an Earth-orbiting satellite at a 400-mile altitude. So free are the gyroscopes from disturbance that they will provide an almost perfect space-time reference system. They will measure how space and time are very slightly warped by the presence of the Earth, and, more profoundly, how the Earth's rotation very slightly drags space-time around with it. These effects, though small for the Earth, have far-reaching implications for the nature of matter and the structure of the Universe. GP-B is among the most thoroughly researched programs ever undertaken by NASA. This is the story of a scientific quest in which physicists and engineers have collaborated closely over many years. Inspired by their quest, they have invented a whole range of technologies -- technologies that are already enlivening other branches of science and engineering. Scheduled for launch in 2003 and managed for NASA by Marshall Space Flight Center, development of GP-B is the responsibility of Stanford University, with major subcontractor Lockheed Martin Corporation.
Artist's Concept of Gravity Probe-B
NASA Technical Reports Server (NTRS)
2002-01-01
Gravity Probe-B (GP-B) is the relativity experiment being developed at Stanford University to test two extraordinary predictions of Albert Einstein's general theory of relativity. The experiment will measure, very precisely, the expected tiny changes in the direction of the spin axes of four gyroscopes contained in an Earth-orbiting satellite at a 400-mile altitude. So free are the gyroscopes from disturbance that they will provide an almost perfect space-time reference system. They will measure how space and time are very slightly warped by the presence of the Earth, and, more profoundly, how the Earth's rotation very slightly drags space-time around with it. These effects, though small for the Earth, have far-reaching implications for the nature of matter and the structure of the Universe. GP-B is among the most thoroughly researched programs ever undertaken by NASA. This is the story of a scientific quest in which physicists and engineers have collaborated closely over many years. Inspired by their quest, they have invented a whole range of technologies -- technologies that are already enlivening other branches of science and engineering. Scheduled for launch in 2003 and managed for NASA by Marshall Space Flight Center, development of the GP-B is the responsibility of Stanford University, with major subcontractor Lockheed Martin Corporation.
Artist's Concept of Gravity Probe-B
NASA Technical Reports Server (NTRS)
2002-01-01
Gravity Probe-B (GP-B) is the relativity experiment being developed at Stanford University to test two extraordinary predictions of Albert Einstein's general theory of relativity. The experiment will measure, very precisely, the expected tiny changes in the direction of the spin axes of four gyroscopes contained in an Earth-orbiting satellite at a 400-mile altitude. So free are the gyroscopes from disturbance that they will provide an almost perfect space-time reference system. They will measure how space and time are very slightly warped by the presence of the Earth, and, more profoundly, how the Earth's rotation very slightly drags space-time around with it. These effects, though small for the Earth, have far-reaching implications for the nature of matter and the structure of the Universe. GP-B is among the most thoroughly researched programs ever undertaken by NASA. This is the story of a scientific quest in which physicists and engineers have collaborated closely over many years. Inspired by their quest, they have invented a whole range of technologies -- technologies that are already enlivening other branches of science and engineering. Scheduled for launch in 2003 and managed for NASA by the Marshall Space Flight Center, development of GP-B is the responsibility of Stanford University, with major subcontractor Lockheed Martin Corporation.
Fabrication and Test of an Optical Magnetic Mirror
NASA Technical Reports Server (NTRS)
Hagopian, John G.; Roman, Patrick A.; Shiri, Shahram; Wollack, Edward J.; Roy, Madhumita
2011-01-01
Traditional mirrors at optical wavelengths use thin metalized or dielectric layers of uniform thickness to approximate a perfect electric field boundary condition. The electron gas in such a mirror configuration oscillates in response to the incident photons and subsequently re-emits fields where the propagation and electric field vectors have been inverted and the phase of the incident magnetic field is preserved. We proposed fabrication of sub-wavelength-scale conductive structures that could be used to interact with light at the nanoscale and enable synthesis of the desired perfect magnetic-field boundary condition. In a magnetic mirror, the interaction of light with the nanowires, dielectric layer and ground plate inverts the magnetic field vector, resulting in a zero degree phase shift upon reflection. Geometries such as split ring resonators and sinusoidal conductive strips were shown to demonstrate magnetic mirror behavior in the microwave and then in the visible. Work to design, fabricate and test a magnetic mirror began in 2007 at the NASA Goddard Space Flight Center (GSFC) under an Internal Research and Development (IRAD) award. Our initial nanowire geometry was sinusoidal but orthogonally asymmetric in spatial frequency, which allowed clear indications of its behavior by polarization. We report on the fabrication steps and testing of magnetic mirrors using a phase shifting interferometer and the first far-field imaging of an optical magnetic mirror.
Radio-Optical Reference Frame Link Using the U.S. Naval Observatory Astrograph and Deep CCD Imaging
NASA Astrophysics Data System (ADS)
Zacharias, N.; Zacharias, M. I.
2014-05-01
Between 1997 and 2004 several observing runs were conducted, mainly with the CTIO 0.9 m, to image International Celestial Reference Frame (ICRF) counterparts (mostly QSOs) in order to determine accurate optical positions. Contemporary to these deep CCD images, the same fields were observed with the U.S. Naval Observatory astrograph in the same bandpass. They provide accurate positions on the Hipparcos/Tycho-2 system for stars in the 10-16 mag range used as reference stars for the deep CCD imaging data. Here we present final optical position results of 413 sources based on reference stars obtained by dedicated astrograph observations that were reduced following two different procedures. These optical positions are compared to radio very long baseline interferometry positions. The current optical system is not perfectly aligned to the ICRF radio system with rigid body rotation angles of 3-5 mas (= 3σ level) found between them for all three axes. Furthermore, statistically, the optical-radio position differences are found to exceed the total, combined, known errors in the observations. Systematic errors in the optical reference star positions and physical offsets between the centers of optical and radio emissions are both identified as likely causes. A detrimental, astrophysical, random noise component is postulated to be on about the 10 mas level. If confirmed by future observations, this could severely limit the Gaia to ICRF reference frame alignment accuracy to an error of about 0.5 mas per coordinate axis with the current number of sources envisioned to provide the link. A list of 36 ICRF sources without the detection of an optical counterpart to a limiting magnitude of about R = 22 is provided as well.
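The rotation angles between the optical and radio frames are conventionally obtained by a least-squares fit of a first-order rigid-rotation model to the position differences. The sketch below fits that generic model to synthetic offsets; the model form, angle values, and noise level are assumptions for illustration, not this paper's solution.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 413
alpha = rng.uniform(0, 2 * np.pi, n)           # right ascension (rad)
delta = rng.uniform(-np.pi / 2, np.pi / 2, n)  # declination (rad)

# Usual first-order small-rotation model (assumed here), angles A1..A3 in mas:
#   dalpha*cos(delta) =  A1 cos(a) sin(d) + A2 sin(a) sin(d) - A3 cos(d)
#   ddelta            = -A1 sin(a)        + A2 cos(a)
A_true = np.array([4.0, -3.0, 5.0])            # synthetic "true" angles (mas)
noise = 10.0                                   # mas per coordinate
dac = (A_true[0] * np.cos(alpha) * np.sin(delta)
       + A_true[1] * np.sin(alpha) * np.sin(delta)
       - A_true[2] * np.cos(delta)) + rng.normal(0, noise, n)
dd = (-A_true[0] * np.sin(alpha) + A_true[1] * np.cos(alpha)) + rng.normal(0, noise, n)

# Stack both equations and solve by linear least squares
design = np.vstack([
    np.column_stack([np.cos(alpha) * np.sin(delta),
                     np.sin(alpha) * np.sin(delta),
                     -np.cos(delta)]),
    np.column_stack([-np.sin(alpha), np.cos(alpha), np.zeros(n)]),
])
obs = np.concatenate([dac, dd])
A_fit, *_ = np.linalg.lstsq(design, obs, rcond=None)
print("recovered rotation angles (mas):", np.round(A_fit, 2))
```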
Reference genes for quantitative PCR in the adipose tissue of mice with metabolic disease.
Almeida-Oliveira, Fernanda; Leandro, João G B; Ausina, Priscila; Sola-Penna, Mauro; Majerowicz, David
2017-04-01
Obesity and diabetes are metabolic diseases and they are increasing in prevalence. The dynamics of gene expression associated with these diseases is fundamental to identifying genes involved in related biological processes. qPCR is a sensitive technique for mRNA quantification and the most commonly used method in gene-expression studies. However, the reliability of these results is directly influenced by data normalization. As reference genes are the major normalization method used, this work aims to identify reference genes for qPCR in adipose tissues of mice with type-I diabetes or obesity. We selected 12 genes that are commonly used as reference genes. The expression of these genes in the adipose tissues of mice was analyzed in the context of three different experimental protocols: 1) untreated animals; 2) high-fat-diet animals; and 3) streptozotocin-treated animals. Gene-expression stability was analyzed using four different algorithms. Our data indicate that TATA-binding protein is stably expressed across adipose tissues in control animals. This gene was also a useful reference when the brown adipose tissues of control and obese mice were analyzed. The mitochondrial ATP synthase F1 complex gene exhibits stable expression in subcutaneous and perigonadal adipose tissue from control and obese mice. Moreover, this gene is the best reference for qPCR normalization in adipose tissue from streptozotocin-treated animals. These results show that there is no perfect stable gene suited for use under all experimental conditions. In conclusion, the selection of appropriate genes is a prerequisite to ensure qPCR reliability and must be performed separately for different experimental protocols. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Multicomponent blood lipid analysis by means of near infrared spectroscopy, in geese.
Bazar, George; Eles, Viktoria; Kovacs, Zoltan; Romvari, Robert; Szabo, Andras
2016-08-01
This study provides accurate near infrared (NIR) spectroscopic models on some laboratory determined clinicochemical parameters (i.e. total lipid (5.57±1.95 g/l), triglyceride (2.59±1.36 mmol/l), total cholesterol (3.81±0.68 mmol/l), high density lipoprotein (HDL) cholesterol (2.45±0.58 mmol/l)) of blood serum samples of fattened geese. To increase the performance of multivariate chemometrics, samples significantly deviating from the regression models implying laboratory error were excluded from the final calibration datasets. Reference data of excluded samples having outlier spectra in principal component analysis were not marked as false. Samples deviating from the regression models but having non outlier spectra in PCA were identified as having false reference constituent values. Based on the NIR selection methods, 5% of the reference measurement data were rated as doubtful. The achieved models reached R² of 0.864, 0.966, 0.850, 0.793, and RMSE of 0.639 g/l, 0.232 mmol/l, 0.210 mmol/l, 0.241 mmol/l for total lipid, triglyceride, total cholesterol and HDL cholesterol, respectively, during independent validation. Classical analytical techniques focus on single constituents and often require chemicals, time-consuming measurements, and experienced technicians. NIR technique provides a quick, cost effective, non-hazardous alternative method for analysis of several constituents based on one single spectrum of each sample, and it also offers the possibility for looking at the laboratory reference data critically. Evaluation of reference data to identify and exclude falsely analyzed samples can provide warning feedback to the reference laboratory, especially in the case of analyses where laboratory methods are not perfectly suited to the subjected material and there is an increased chance of laboratory error. Copyright © 2016 Elsevier B.V. All rights reserved.
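A calibration-plus-independent-validation workflow of the kind summarized by the R² and RMSE figures can be sketched with partial least squares regression; PLS, the synthetic "spectra", and all numbers below are assumptions for illustration, since the abstract does not state the specific chemometric algorithm.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)

# Synthetic "spectra": 120 samples x 200 wavelengths, with the analyte
# concentration linearly encoded in a few bands plus noise (illustration only).
n_samples, n_wavelengths = 120, 200
concentration = rng.uniform(2.0, 10.0, n_samples)            # e.g. total lipid, g/l
spectra = rng.normal(0, 0.02, (n_samples, n_wavelengths))
spectra[:, 40:60] += 0.01 * concentration[:, None]
spectra[:, 150:160] -= 0.005 * concentration[:, None]

# Independent validation: calibrate on 80 samples, validate on the remaining 40
train, test = slice(0, 80), slice(80, None)
pls = PLSRegression(n_components=5)
pls.fit(spectra[train], concentration[train])
pred = pls.predict(spectra[test]).ravel()

r2 = r2_score(concentration[test], pred)
rmse = mean_squared_error(concentration[test], pred) ** 0.5
print(f"validation R^2 = {r2:.3f}, RMSE = {rmse:.3f} g/l")
```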
Radio-optical reference frame link using the U.S. Naval observatory astrograph and deep CCD imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zacharias, N.; Zacharias, M. I., E-mail: nz@usno.navy.mil
2014-05-01
Between 1997 and 2004 several observing runs were conducted, mainly with the CTIO 0.9 m, to image International Celestial Reference Frame (ICRF) counterparts (mostly QSOs) in order to determine accurate optical positions. Contemporary to these deep CCD images, the same fields were observed with the U.S. Naval Observatory astrograph in the same bandpass. They provide accurate positions on the Hipparcos/Tycho-2 system for stars in the 10-16 mag range used as reference stars for the deep CCD imaging data. Here we present final optical position results of 413 sources based on reference stars obtained by dedicated astrograph observations that were reduced following two different procedures. These optical positions are compared to radio very long baseline interferometry positions. The current optical system is not perfectly aligned to the ICRF radio system with rigid body rotation angles of 3-5 mas (= 3σ level) found between them for all three axes. Furthermore, statistically, the optical-radio position differences are found to exceed the total, combined, known errors in the observations. Systematic errors in the optical reference star positions and physical offsets between the centers of optical and radio emissions are both identified as likely causes. A detrimental, astrophysical, random noise component is postulated to be on about the 10 mas level. If confirmed by future observations, this could severely limit the Gaia to ICRF reference frame alignment accuracy to an error of about 0.5 mas per coordinate axis with the current number of sources envisioned to provide the link. A list of 36 ICRF sources without the detection of an optical counterpart to a limiting magnitude of about R = 22 is provided as well.
Nearly Perfect Fluidity in a High Temperature Superconductor
Rameau, J. D.; Reber, T. J.; Yang, H. -B.; ...
2014-10-13
Perfect fluids are characterized as having the smallest ratio of shear viscosity to entropy density, η/s, consistent with quantum uncertainty and causality. So far, nearly perfect fluids have only been observed in the quark-gluon plasma and in unitary atomic Fermi gases, exotic systems that are amongst the hottest and coldest objects in the known universe, respectively. We use angle resolved photoemission spectroscopy to measure the temperature dependence of an electronic analog of η/s in an optimally doped cuprate high-temperature superconductor, finding it too is a nearly perfect fluid around, and above, its superconducting transition temperature Tc.
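For reference, the "perfect fluid" benchmark invoked here is the conjectured Kovtun-Son-Starinets lower bound on the shear-viscosity-to-entropy-density ratio,

\[ \frac{\eta}{s} \;\gtrsim\; \frac{\hbar}{4\pi k_B}, \]

and a system is described as a nearly perfect fluid when its measured ratio approaches this bound.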
Nearly perfect fluidity in a high-temperature superconductor
NASA Astrophysics Data System (ADS)
Rameau, J. D.; Reber, T. J.; Yang, H.-B.; Akhanjee, S.; Gu, G. D.; Johnson, P. D.; Campbell, S.
2014-10-01
Perfect fluids are characterized as having the smallest ratio of shear viscosity to entropy density, η /s, consistent with quantum uncertainty and causality. So far, nearly perfect fluids have only been observed in the quark-gluon plasma and in unitary atomic Fermi gases, exotic systems that are amongst the hottest and coldest objects in the known universe, respectively. We use angle resolved photoemission spectroscopy to measure the temperature dependence of an electronic analog of η /s in an optimally doped cuprate high-temperature superconductor, finding it too is a nearly perfect fluid around, and above, its superconducting transition temperature Tc.
Alternative Fuels Data Center: Deploying Clean Buses in Texas through
... money for large expenditures -- new school buses are a perfect example and offer the perfect opportunity ... a potential penalty or fine when that money is put toward an SEP. Municipalities may offset 100% of ...
The Perfect Eye: A Novel Model for Teaching the Theory of Refraction.
ERIC Educational Resources Information Center
Kurtz, Daniel
1999-01-01
The Perfect Eye model simplifies solutions to a wide variety of optometry instructional problems by facilitating student understanding of the interaction among lenses, objects, accommodation, and ametropia. The model is based on the premise that inside every eye is a perfect (emmetropic) eye, and that the physiological eye is a combination of the…
ERIC Educational Resources Information Center
Ozier, Lance
2011-01-01
Pressure for students to produce writing perfection in the classroom often eclipses the emphasis placed on the need for students to practice writing. Occasions for students to choose, challenge, and reflect--to actually risk risking--are too often absent from conversations among students and teachers in countless English classrooms. Tom Romano…
Applications of satellite data relay to problems of field seismology
NASA Technical Reports Server (NTRS)
Webster, W. J., Jr.; Miller, W. H.; Whitley, R.; Allenby, R. J.; Dennison, R. T.
1980-01-01
A seismic signal processor was developed and tested for use with the NOAA-GOES satellite data collection system. Performance tests on recorded, as well as real-time, short-period signals indicate that the event recognition technique used is nearly perfect in its rejection of cultural signals and that data can be acquired in many swarm situations with the use of solid state buffer memories. Detailed circuit diagrams are provided. The design of a complete field data collection platform is discussed and the employment of data collection platforms in seismic networks is reviewed.
Free-flight measurement technique in the free-piston high-enthalpy shock tunnel.
Tanno, H; Komuro, T; Sato, K; Fujita, K; Laurence, S J
2014-04-01
A novel multi-component force-measurement technique has been developed and implemented at the impulse facility JAXA-HIEST, in which the test model is completely unrestrained during the test and thus experiences free-flight conditions for a period on the order of milliseconds. Advantages over conventional free-flight techniques include the complete absence of aerodynamic interference from a model support system and less variation in model position and attitude during the test itself. A miniature on-board data recorder, which was a key technology for this technique, was also developed in order to acquire and store the measured data. The technique was demonstrated in a HIEST wind-tunnel test campaign in which three-component aerodynamic force measurement was performed on a blunted cone of length 316 mm, total mass 19.75 kg, and moment of inertia 0.152 kg·m². During the test campaign, axial force, normal forces, and pitching moment coefficients were obtained at angles of attack from 14° to 32° under two conditions: H0 = 4 MJ/kg, P0 = 14 MPa; and H0 = 16 MJ/kg, P0 = 16 MPa. For the first, low-enthalpy condition, the test flow was considered a perfect gas; measurements were thus directly compared with those obtained in a conventional blow-down wind tunnel (JAXA-HWT2) to evaluate the accuracy of the technique. The second test condition was a high-enthalpy condition in which 85% of the oxygen molecules were expected to be dissociated; high-temperature real-gas effects were therefore evaluated by comparison with results obtained in perfect-gas conditions. The precision of the present measurements was evaluated through an uncertainty analysis, which showed the aerodynamic coefficients in the HIEST low enthalpy test agreeing well with those of JAXA-HWT2. The pitching-moment coefficient, however, showed significant differences between low- and high-enthalpy tests. These differences are thought to result from high-temperature real-gas effects.
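For context, free-flight force coefficients are obtained by normalizing the measured aerodynamic forces (model mass times the measured accelerations) by dynamic pressure and a reference area. The sketch below uses assumed flow conditions, reference area, and accelerations, not the HIEST values.

```python
import numpy as np

# Generic reduction of free-flight accelerations to aerodynamic coefficients
# (illustrative values only; not the HIEST test conditions).
mass = 19.75                 # kg, model mass quoted in the abstract
S_ref = 0.05                 # m^2, assumed reference area (not given in the abstract)
rho, V = 0.02, 3000.0        # kg/m^3 and m/s, assumed freestream conditions
q_inf = 0.5 * rho * V**2     # dynamic pressure, Pa

# Assumed on-board aerodynamic accelerations (gravity contribution removed)
a_axial, a_normal = 95.0, 40.0   # m/s^2

C_A = mass * a_axial / (q_inf * S_ref)    # axial-force coefficient
C_N = mass * a_normal / (q_inf * S_ref)   # normal-force coefficient
print(f"q_inf = {q_inf / 1000:.1f} kPa, C_A = {C_A:.3f}, C_N = {C_N:.3f}")
```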
Free-flight measurement technique in the free-piston high-enthalpy shock tunnel
NASA Astrophysics Data System (ADS)
Tanno, H.; Komuro, T.; Sato, K.; Fujita, K.; Laurence, S. J.
2014-04-01
A novel multi-component force-measurement technique has been developed and implemented at the impulse facility JAXA-HIEST, in which the test model is completely unrestrained during the test and thus experiences free-flight conditions for a period on the order of milliseconds. Advantages over conventional free-flight techniques include the complete absence of aerodynamic interference from a model support system and less variation in model position and attitude during the test itself. A miniature on-board data recorder, which was a key technology for this technique, was also developed in order to acquire and store the measured data. The technique was demonstrated in a HIEST wind-tunnel test campaign in which three-component aerodynamic force measurement was performed on a blunted cone of length 316 mm, total mass 19.75 kg, and moment of inertia 0.152 kg·m². During the test campaign, axial force, normal forces, and pitching moment coefficients were obtained at angles of attack from 14° to 32° under two conditions: H0 = 4 MJ/kg, P0 = 14 MPa; and H0 = 16 MJ/kg, P0 = 16 MPa. For the first, low-enthalpy condition, the test flow was considered a perfect gas; measurements were thus directly compared with those obtained in a conventional blow-down wind tunnel (JAXA-HWT2) to evaluate the accuracy of the technique. The second test condition was a high-enthalpy condition in which 85% of the oxygen molecules were expected to be dissociated; high-temperature real-gas effects were therefore evaluated by comparison with results obtained in perfect-gas conditions. The precision of the present measurements was evaluated through an uncertainty analysis, which showed the aerodynamic coefficients in the HIEST low enthalpy test agreeing well with those of JAXA-HWT2. The pitching-moment coefficient, however, showed significant differences between low- and high-enthalpy tests. These differences are thought to result from high-temperature real-gas effects.
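The HIEST abstracts above quote axial-force, normal-force, and pitching-moment coefficients without spelling out the nondimensionalization; the sketch below shows the standard conversion from measured body-axis loads, with the free-stream dynamic pressure, reference area, and reference length as assumed inputs. All numerical values are illustrative placeholders, not data from the paper.

```python
import math

def dynamic_pressure(rho_inf, v_inf):
    """Free-stream dynamic pressure q_inf = 0.5 * rho * V^2."""
    return 0.5 * rho_inf * v_inf**2

def aero_coefficients(axial_force, normal_force, pitching_moment,
                      q_inf, ref_area, ref_length):
    """Standard nondimensionalization of body-axis loads:
    C_A = A / (q_inf * S), C_N = N / (q_inf * S), C_m = M / (q_inf * S * L_ref)."""
    c_a = axial_force / (q_inf * ref_area)
    c_n = normal_force / (q_inf * ref_area)
    c_m = pitching_moment / (q_inf * ref_area * ref_length)
    return c_a, c_n, c_m

# Illustrative numbers only: a 316 mm diameter blunted cone with made-up loads.
d = 0.316                                  # model diameter used as reference length, m
s_ref = math.pi * d**2 / 4                 # reference area, m^2
q = dynamic_pressure(rho_inf=0.01, v_inf=3000.0)
print(aero_coefficients(120.0, 85.0, 4.2, q, s_ref, d))
```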
D'Angelo, Heather; Fleischhacker, Sheila; Rose, Shyanika W; Ribisl, Kurt M
2014-07-01
Identifying tobacco retail outlets for U.S. FDA compliance checks or calculating tobacco outlet density is difficult in the 13 States without tobacco retail licensing or where licensing lists are unavailable for research. This study uses primary data collection to identify tobacco outlets in three counties in a non-licensing state and validate two commercial secondary data sources. We calculated sensitivity and positive predictive values (PPV) to examine the evidence of validity for two secondary data sources, and conducted a geospatial analysis to determine correct allocation to census tract. ReferenceUSA had almost perfect sensitivity (0.82) while Dun & Bradstreet (D&B) had substantial sensitivity (0.69) for identifying tobacco outlets; combined, sensitivity improved to 0.89. D&B identified fewer "false positives" with a PPV of 0.82 compared to 0.71 for ReferenceUSA. More than 90% of the outlets identified by ReferenceUSA were geocoded to the correct census tract. Combining two commercial data sources resulted in enumeration of nearly 90% of tobacco outlets in a three county area. Commercial databases appear to provide a reasonably accurate way to identify tobacco outlets for enforcement operations and density estimation. Copyright © 2014 Elsevier Ltd. All rights reserved.
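A minimal sketch of the two validity measures used above, sensitivity and positive predictive value, computed from outlet counts; the counts are made-up placeholders, not the study's data.

```python
def sensitivity(true_pos, false_neg):
    """Share of ground-truth tobacco outlets that the commercial list captured."""
    return true_pos / (true_pos + false_neg)

def positive_predictive_value(true_pos, false_pos):
    """Share of listed outlets that actually sell tobacco ('false positives' lower this)."""
    return true_pos / (true_pos + false_pos)

# Placeholder counts for one hypothetical data source.
tp, fp, fn = 410, 90, 90
print(f"sensitivity = {sensitivity(tp, fn):.2f}")                 # 0.82
print(f"PPV         = {positive_predictive_value(tp, fp):.2f}")   # 0.82
```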
Khalatbari, Shokoufeh; Liu, Peter S. C.; Maturen, Katherine E.; Kaza, Ravi K.; Wasnik, Ashish P.; Al-Hawary, Mahmoud M.; Glazer, Daniel I.; Stein, Erica B.; Patel, Jeet; Somashekar, Deepak K.; Viglianti, Benjamin L.; Hussain, Hero K.
2014-01-01
Purpose To determine for expert and novice radiologists repeatability of major diagnostic features and scoring systems (ie, Liver Imaging Reporting and Data System [LI-RADS], Organ Procurement and Transplantation Network [OPTN], and American Association for the Study of Liver Diseases [AASLD]) for hepatocellular carcinoma (HCC) by using magnetic resonance (MR) imaging. Materials and Methods Institutional review board approval was obtained and patient consent was waived for this HIPAA-compliant, retrospective study. The LI-RADS discussed in this article refers to version 2013.1. Ten blinded readers reviewed 100 liver MR imaging studies that demonstrated observations preliminarily assigned LI-RADS scores of LR1–LR5. Diameter and major HCC features (arterial hyperenhancement, washout appearance, pseudocapsule) were recorded for each observation. LI-RADS, OPTN, and AASLD scores were assigned. Interreader agreement was assessed by using intraclass correlation coefficients and κ statistics. Scoring rates were compared by using McNemar test. Results Overall interreader agreement was substantial for arterial hyperenhancement (0.67 [95% confidence interval {CI}: 0.65, 0.69]), moderate for washout appearance (0.48 [95% CI: 0.46, 0.50]), moderate for pseudocapsule (0.52 [95% CI: 0.50, 0.54]), fair for LI-RADS (0.35 [95% CI: 0.34, 0.37]), fair for AASLD (0.39 [95% CI: 0.37, 0.42]), and moderate for OPTN (0.53 [95% CI: 0.51, 0.56]). Agreement for measured diameter was almost perfect (range, 0.95–0.97). There was substantial agreement for most scores consistent with HCC. Experts agreed significantly more than did novices and were significantly more likely than were novices to assign a diagnosis of HCC (P < .001). Conclusion Two of three major features for HCC (washout appearance and pseudocapsule) have only moderate interreader agreement. Experts and novices who assigned scores consistent with HCC had substantial but not perfect agreement. Expert agreement is substantial for OPTN, but moderate for LI-RADS and AASLD. Novices were less consistent and less likely to diagnose HCC than were experts. © RSNA, 2014 Online supplemental material is available for this article. PMID:24555636
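Interreader agreement for the binary major features above is summarized with κ statistics; the snippet below is a generic two-reader Cohen's kappa shown only for illustration, not the authors' multi-reader analysis, and the ratings are hypothetical.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two readers rating the same observations."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical washout-appearance calls (1 = present, 0 = absent) by two readers.
reader1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
reader2 = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(cohens_kappa(reader1, reader2), 2))  # 0.58, "moderate" on the usual scale
```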
Wu, Gang; Xie, Ruyi; Zhang, Xiaoli; Morelli, John; Yan, Xu; Zhu, Xiaolei; Li, Xiaoming
2017-12-01
The aim of this study was to evaluate the diagnostic performance of noncontrast magnetic resonance imaging utilizing sampling perfection with application optimized contrasts using different flip angle evolutions (SPACE) in detecting deep venous thrombus (DVT) of the lower extremity and evaluating clot burden. This prospective study was approved by the institutional review board. Ninety-four consecutive patients (42 men, 52 women; age range, 14-87 years; average age, 52.7 years) suspected of lower extremity DVT underwent ultrasound (US) and SPACE. The venous visualization score for SPACE was determined by 2 radiologists independently according to a 4-point scale (1-4, poor to excellent). The sensitivity and specificity of SPACE in detecting DVT were calculated based on segment, limb, and patient, with US serving as the reference standard. The clot burden for each segment was scored (0-3, patent to entire segment occlusion). The clot burden score obtained with SPACE was compared with US using a Wilcoxon test based on region, limb, and patient. Interobserver agreement in assessing DVT (absent, nonocclusive, or occlusive) with SPACE was determined by calculating Cohen kappa coefficients. The mean venous visualization score for SPACE was 3.82 ± 0.50 for reader 1 and 3.81 ± 0.50 for reader 2. For reader 1, sensitivity/specificity values of SPACE in detecting DVT were 96.53%/99.90% (segment), 95.24%/99.04% (limb), and 95.89%/95.24% (patient). For reader 2, corresponding values were 97.20%/99.90%, 96.39%/99.05%, and 97.22%/95.45%. The clot burden assessed with SPACE was not significantly different from US (P > 0.05 for region, limb, patient). Interobserver agreement of SPACE in assessing thrombosis was excellent (kappa = 0.894 ± 0.014). Non-contrast-enhanced 3-dimensional SPACE magnetic resonance imaging is highly accurate in detecting lower extremity DVT and reliable in the evaluation of clot burden. SPACE could serve as an important alternative for patients in whom US cannot be performed.
From the generalized reflection law to the realization of perfect anomalous reflectors
Díaz-Rubio, Ana; Asadchy, Viktar S.; Elsakka, Amr; Tretyakov, Sergei A.
2017-01-01
The use of the generalized Snell’s law opens wide possibilities for the manipulation of transmitted and reflected wavefronts. However, known structures designed to shape reflection wavefronts suffer from significant parasitic reflections in undesired directions. We explore the limitations of the existing solutions for the design of passive planar reflectors and demonstrate that strongly nonlocal response is required for perfect performance. A new paradigm for the design of perfect reflectors based on energy surface channeling is introduced. We realize and experimentally verify a perfect design of an anomalously reflective surface using an array of rectangular metal patches backed by a metallic plate. This conceptually new mechanism for wavefront manipulation allows the design of thin perfect reflectors, offering a versatile design method applicable to other scenarios, such as focusing reflectors, surface wave manipulations, or metasurface holograms, extendable to other frequencies. PMID:28819642
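For reference, the generalized law of reflection that this work builds on relates the reflected angle to the gradient of the phase discontinuity imposed along the surface (standard form; n_i is the refractive index of the incidence medium and λ0 the free-space wavelength):

```latex
\sin\theta_r - \sin\theta_i = \frac{\lambda_0}{2\pi n_i}\,\frac{d\Phi}{dx}
```

Ordinary specular reflection is recovered when the phase gradient dΦ/dx vanishes.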
NASA Astrophysics Data System (ADS)
Hu, Lin; Wirth, Brian D.; Maroudas, Dimitrios
2017-08-01
We report results on the lattice thermal conductivities of tungsten single crystals containing nanoscale-sized pores or voids and helium (He) nanobubbles as a function of void/bubble size and gas pressure in the He bubbles based on molecular-dynamics simulations. For reference, we calculated lattice thermal conductivities of perfect tungsten single crystals along different crystallographic directions at room temperature and found them to be about 10% of the overall thermal conductivity of tungsten with a weak dependence on the heat flux direction. The presence of nanoscale voids in the crystal causes a significant reduction in its lattice thermal conductivity, which decreases with increasing void size. Filling the voids with He to form He nanobubbles and increasing the bubble pressure leads to further significant reduction of the tungsten lattice thermal conductivity, down to ˜20% of that of the perfect crystal. The anisotropy in heat conduction remains weak for tungsten single crystals containing nanoscale-sized voids and He nanobubbles throughout the pressure range examined. Analysis of the pressure and atomic displacement fields in the crystalline region that surrounds the He nanobubbles reveals that the significant reduction of tungsten lattice thermal conductivity in this region is due to phonon scattering from the nanobubbles, as well as lattice deformation around the nanobubbles and formation of lattice imperfections at higher bubble pressure.
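The abstract does not state which molecular-dynamics route to the lattice thermal conductivity was used; one common equilibrium approach, quoted here only as a reference formula, is the Green-Kubo relation for a system of volume V at temperature T with total heat current J:

```latex
\kappa = \frac{1}{3\,V\,k_B\,T^{2}} \int_{0}^{\infty} \left\langle \mathbf{J}(t)\cdot\mathbf{J}(0) \right\rangle \, dt
```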
Ellison, Kirsten L
2014-04-01
Drawing from a collection of over 160 North American print advertisements for anti-aging skin care products from January to December of 2009, this paper examines the discourse of agelessness, a vision of esthetic perfection and optimal health that is continually referred to by gerontologists, cultural theorists, and scientific researchers as a state of being to which humankind can aspire. Employing critical discourse analysis through the use of semiotics and visual rhetoric, this paper explores the means through which anti-aging skin care advertisements present to their viewers a particular object of desire, looking, more specifically, at how agelessness is presented as a way out and ultimate transcendence of age. Through the analytical tools of semiotics and visual rhetoric, four visions of agelessness are identified and explored in this paper: Agelessness as Scientific Purity, Agelessness as Genetic Impulse, Agelessness as Nature's Essence, and Agelessness as Myth. Whether found in the heights of scientific purity, the inner core of our genetic impulse, the depths of nature's essence, or whether agelessness itself has reached its own, untouchable, mythic status, the advertisements in this study represent one of the most pervasive vehicles through which our current vision(s) of ageless perfection are reflected, reinforced, and suspended in a drop of cream. Copyright © 2013 The Author. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lehtola, Susi; Parkhill, John; Head-Gordon, Martin
Novel implementations based on dense tensor storage are presented here for the singlet-reference perfect quadruples (PQ) [J. A. Parkhill et al., J. Chem. Phys. 130, 084101 (2009)] and perfect hextuples (PH) [J. A. Parkhill and M. Head-Gordon, J. Chem. Phys. 133, 024103 (2010)] models. The methods are obtained as block decompositions of conventional coupled-cluster theory that are exact for four electrons in four orbitals (PQ) and six electrons in six orbitals (PH), but that can also be applied to much larger systems. PQ and PH have storage requirements that scale as the square, and as the cube, of the number of active electrons, respectively, and exhibit quartic scaling of the computational effort for large systems. Applications of the new implementations are presented for full-valence calculations on linear polyenes (CnHn+2), which highlight the excellent computational scaling of the present implementations that can routinely handle active spaces of hundreds of electrons. The accuracy of the models is studied in the π space of the polyenes, in hydrogen chains (H50), and in the π space of polyacene molecules. In all cases, the results compare favorably to density matrix renormalization group values. With the novel implementation of PQ, active spaces of 140 electrons in 140 orbitals can be solved in a matter of minutes on a single-core workstation, and the relatively low polynomial scaling means that very large systems are also accessible using parallel computing.
Analysis and Design of a Digital Controller for a Seismically Stable Platform.
1981-12-01
disturbances are critical in gyroscope and accelerometer sensor evaluations. Disturbances may be either measured, modeled and compensated in test profiles...controller design issues separately. Basically, the control law considerations are investigated assuming the SSP sensors provide perfect state...signal noise effects into sensor measurements through voltages induced in cabling directly or indirectly through ground loops in instrument amplifiers or
ERIC Educational Resources Information Center
Fahey, Johannah; Prosser, Howard
2015-01-01
Elite schools around the world aspire to produce perfect students and yet there are always obstacles to this perfection being achieved. In this paper, we suggest that this process of perfectionism and obstruction can best be understood using a methodology that looks to the creative arts, rather than the usual social science orthodoxies. Our focus…
Numerical simulation of flow field in umbrella wind turbine
NASA Astrophysics Data System (ADS)
Daorina, Bao; Xiaoxue, Wang; Wei, Shang; Yadong, Liu
2018-05-01
Umbrella wind turbines can control the swept area by adjusting the shrinking angle of the rotor so as to keep the output power near the rated value. This is very helpful for the utilization of wind energy in the sandstorm- and typhoon-prone areas of our country. In this paper, Fluent software is used to simulate the velocity field and pressure field of a 5 kW umbrella wind turbine at contraction angles of 0°, 45°, and 60°. The results provide a theoretical basis for further improving the power adjustment mechanism of umbrella wind turbines and also provide a reference for perfecting our country's wind energy utilization in the typhoon environments of coastal areas.
Design and study of water supply system for supercritical unit boiler in thermal power station
NASA Astrophysics Data System (ADS)
Du, Zenghui
2018-04-01
In order to design and optimize the boiler feed-water system of a supercritical unit, establishing a highly accurate model of the controlled object and its dynamic characteristics is a prerequisite for developing a perfect thermal control system. Mechanism-based modeling often leads to large systematic errors. In this paper, a modern intelligent identification method is therefore applied to the information contained in the historical operation data of the boiler's typical thermal systems to establish a high-precision quantitative model. This method avoids the difficulties caused by disturbance-experiment modeling of the actual system in the field, and provides a strong reference for the design and optimization of thermal automation control systems in thermal power plants.
Electron Correlation from the Adiabatic Connection for Multireference Wave Functions
NASA Astrophysics Data System (ADS)
Pernal, Katarzyna
2018-01-01
An adiabatic connection (AC) formula for the electron correlation energy is derived for a broad class of multireference wave functions. The AC expression recovers dynamic correlation energy and assures a balanced treatment of the correlation energy. Coupling the AC formalism with the extended random phase approximation allows one to find the correlation energy only from reference one- and two-electron reduced density matrices. If the generalized valence bond perfect pairing model is employed, a simple closed-form expression for the approximate AC formula is obtained. This results in an overall M^5 scaling of the computational cost, making the method one of the most efficient multireference approaches accounting for dynamic electron correlation, also for strongly correlated systems.
Multimedia Forensics Is Not Computer Forensics
NASA Astrophysics Data System (ADS)
Böhme, Rainer; Freiling, Felix C.; Gloe, Thomas; Kirchner, Matthias
The recent popularity of research on topics of multimedia forensics justifies reflections on the definition of the field. This paper devises an ontology that structures forensic disciplines by their primary domain of evidence. In this sense, both multimedia forensics and computer forensics belong to the class of digital forensics, but they differ notably in the underlying observer model that defines the forensic investigator’s view on (parts of) reality, which itself is not fully cognizable. Important consequences on the reliability of probative facts emerge with regard to available counter-forensic techniques: while perfect concealment of traces is possible for computer forensics, this level of certainty cannot be expected for manipulations of sensor data. We cite concrete examples and refer to established techniques to support our arguments.
ROSAT in-orbit attitude measurement recovery
NASA Astrophysics Data System (ADS)
Kaffer, L.; Boeinghoff, A.; Bruederle, E.; Schrempp, W.; Wullstein, P.
After about 7 months of nearly perfect Attitude Measurement and Control System (AMCS) functioning, the ROSAT mission was influenced by gyro degradations which complicated the operation and after one year the nominal mission could no longer be maintained. The reestablishment of the nominal mission by the redesign of the attitude measurement using inertial reference generation from coarse Sun sensor and magnetometer together with a new star acquisition procedure is described. This success was only possible because sufficient reprogramming provisions in the onboard computer were available. The new software now occupies nearly the complete Random Access Memory (RAM) area and increases the computation time from about 50 msec to 300 msec per 1 sec cycle. This proves that deficiencies of the hardware can be overcome by a more intelligent software.
No Regret Learning in Oligopolies: Cournot vs. Bertrand
NASA Astrophysics Data System (ADS)
Nadav, Uri; Piliouras, Georgios
Cournot and Bertrand oligopolies constitute the two most prevalent models of firm competition. The analysis of Nash equilibria in each model reveals a unique prediction about the stable state of the system. Quite alarmingly, despite the similarities of the two models, their projections expose a stark dichotomy. Under the Cournot model, where firms compete by strategically managing their output quantity, firms enjoy positive profits as the resulting market prices exceed that of the marginal costs. On the contrary, the Bertrand model, in which firms compete on price, predicts that a duopoly is enough to push prices down to the marginal cost level. This suggestion that duopoly will result in perfect competition, is commonly referred to in the economics literature as the "Bertrand paradox".
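The dichotomy described above can be made concrete with the textbook symmetric duopoly under linear inverse demand P(Q) = a − bQ and constant marginal cost c (all symbols and numbers are illustrative assumptions, not taken from the paper): Cournot firms each produce (a − c)/(3b) and the market price stays above c, while Bertrand competition drives the price down to c.

```python
def cournot_duopoly(a, b, c):
    """Symmetric Cournot-Nash equilibrium for P(Q) = a - b*Q with marginal cost c."""
    q_each = (a - c) / (3 * b)          # best-response fixed point for each firm
    price = a - b * (2 * q_each)        # equals (a + 2c) / 3, strictly above c
    profit_each = (price - c) * q_each
    return q_each, price, profit_each

def bertrand_duopoly(c):
    """Bertrand equilibrium with identical constant marginal costs: price = c, zero profit."""
    return c, 0.0

a, b, c = 100.0, 1.0, 20.0
print("Cournot :", cournot_duopoly(a, b, c))   # price 46.67 > marginal cost
print("Bertrand:", bertrand_duopoly(c))        # price 20.0 = marginal cost (the "Bertrand paradox")
```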
Rapid pitch correction in choir singers.
Grell, Anke; Sundberg, Johan; Ternström, Sten; Ptok, Martin; Altenmüller, Eckart
2009-07-01
Highly and moderately skilled choral singers listened to a perfect fifth reference, with the instruction to complement the fifth such that a major triad resulted. The fifth was suddenly and unexpectedly shifted in pitch, and the singers' task was to shift the fundamental frequency of the sung tone accordingly. The F0 curves during the transitions often showed two phases, an initial quick and large change followed by a slower and smaller change, apparently intended to fine-tune voice F0 to complement the fifth. Anesthetizing the vocal folds of moderately skilled singers tended to delay the reaction. The means of the response times varied in the range 197-259 ms depending on direction and size of the pitch shifts, as well as on skill and anesthetization.
GAS payload no. G-025: Study of liquid sloshing behaviour in microgravity
NASA Technical Reports Server (NTRS)
Gilbert, C. R.
1986-01-01
The Get Away Special (GAS) G-025, which flew on shuttle Mission 51-G, examined the behavior of a liquid in a tank under microgravity conditions. The experiment is representative of phenomena occurring in satellite tanks with liquid propellants. A reference fluid in a hemispherical model tank was subjected to linear acceleration inputs of known levels and frequencies, and the dynamic response of the tank-liquid system was recorded. Preliminary analysis of the flight data indicates that the experiment functioned perfectly. The results will validate and refine mathematical models describing the dynamic characteristics of tank-fluid systems. This will in turn support the development of future spacecraft tanks, in particular the design of propellant management devices for surface tension tanks.
3-methyl-1,2,3-butanetricarboxylic acid: An atmospheric tracer for terpene secondary organic aerosol
NASA Astrophysics Data System (ADS)
Szmigielski, Rafal; Surratt, Jason D.; Gómez-González, Yadian; Van der Veken, Pieter; Kourtchev, Ivan; Vermeylen, Reinhilde; Blockhuys, Frank; Jaoui, Mohammed; Kleindienst, Tadeusz E.; Lewandowski, Michael; Offenberg, John H.; Edney, Edward O.; Seinfeld, John H.; Maenhaut, Willy; Claeys, Magda
2007-12-01
Highly oxygenated compounds assigned to be oxidation products of α-pinene have recently been observed in substantial concentrations in ambient aerosols. Here, we confirm the unknown α-pinene tracer compound with molecular weight (MW) 204 as the C8-tricarboxylic acid 3-methyl-1,2,3-butanetricarboxylic acid. Its gas and liquid chromatographic behaviors and its mass spectral characteristics in electron ionization and negative ion electrospray ionization perfectly agree with those of a synthesized reference compound. The formation of this compound is explained by further reaction of cis-pinonic acid involving participation of the OH radical. This study illustrates that complex, multi-generation chemistry holds for the photooxidation of α-pinene in the presence of NOx.
Mamut, Jannathan; Xiong, Ying-Ze; Tan, Dun-Yan; Huang, Shuang-Quan
2017-03-01
It has been hypothesized that two flower types permit flexible allocation of resources to female and male functions, yet empirical evidence for the sex-allocation hypothesis remains scarce in gynomonoecious species. To characterize resource allocation to pistillate and perfect flowers and allocation of perfect flowers between gynomonoecious and hermaphroditic individuals, we examined the flexibility and whether female-biased allocation increases with plant size in the hermaphroditic-gynomonoecious herb Eremurus anisopterus. Frequency of gynomonoecious individuals, flower production, and plant size were investigated in different populations. Floral allocation was compared among the three flower types of E. anisopterus. Frequency of gynomonoecious plants varied from 2% to 17% in nine populations. Only larger plants produced female flowers at the bottom of racemes. Both female and perfect flower production tended to increase proportionately with plant size in gynomonoecious individuals. Female flowers did not produce less biomass than perfect flowers from hermaphroditic or gynomonoecious plants. However, both female and perfect flowers from gynomonoecious individuals had lighter stamen mass, but larger pistil mass, than perfect flowers from hermaphrodites. Although the prediction of an increase in female flower number with plant size was not observed in E. anisopterus, the flexibility of sex allocation in gynomonoecious species was confirmed in that gynomonoecious individuals had a female-biased floral allocation compared to hermaphroditic individuals. Such comparisons of gynomonoecious to hermaphroditic individuals permit us to unveil a sexual adjustment strategy: flexibility of sexual investments within plants. © 2017 Botanical Society of America.
NASA Astrophysics Data System (ADS)
Peltoniemi, Jouni I.; Hakala, Teemu; Suomalainen, Juha; Honkavaara, Eija; Markelin, Lauri; Gritsevich, Maria; Eskelinen, Juho; Jaanson, Priit; Ikonen, Erkki
2014-10-01
The measurement uncertainty and traceability of the Finnish Geodetic Institute's field gonio-spectro-polarimeter FIGIFIGO have been assessed. First, the reference standard (Spectralon sample) was measured at the National Standard Laboratory of MIKES-Aalto. This standard was transferred to FGI's field reference standard (a larger Spectralon sample), and from that to the unmanned aerial vehicle (UAV) reference standards (1 m² plates). The reflectance measurement uncertainty of FIGIFIGO has been estimated to be 0.01 in ideal laboratory conditions, but about 0.02-0.05 in typical field conditions, larger at larger solar or observation zenith angles. Target-specific uncertainties can increase the total uncertainty even to 0.1-0.2. The angular reading uncertainty is between 1° and 3°, depending on user selection, and the polarisation uncertainty is around 0.01. For the UAV, the transferred reflectance uncertainty is about 0.05-0.1, depending on how ideal the measurement conditions are. The design concept of FIGIFIGO has been proved to have a number of advantages, such as a well-adopted user-friendly interface, a high level of automation and excellent suitability for field measurements. It is a perfect instrument for collection of reference data on a given target in natural (and well-recorded) conditions. In addition to the strong points of FIGIFIGO, the current study reveals several issues that need further attention, such as the field of view, illumination quality, polarisation calibration, and Spectralon reflectance and polarisation properties in the 1000-2400 nm range.
[Roaming through methodology. XXXII. False test results].
van der Weijden, T; van den Akker, M
2001-05-12
The number of requests for diagnostic tests is rising. This leads to a higher chance of false test results. The false-negative proportion of a test is the proportion of negative test results among the diseased subjects. The false-positive proportion is the proportion of positive test results among the healthy subjects. The calculation of the false-positive proportion is often incorrect. For example, instead of 1 minus the specificity it is calculated as 1 minus the positive predictive value. This can lead to incorrect decision-making with respect to the application of the test. Physicians must apply diagnostic tests in such a way that the risk of false test results is minimal. The patient should be aware that a perfectly conclusive diagnostic test is rare in medical practice, and should more often be informed of the implications of false-positive and false-negative test results.
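In symbols, with TP, FP, TN, and FN the cells of the two-by-two table of test result against disease status, the proportions discussed above are:

```latex
\text{false-negative proportion} = \frac{FN}{TP+FN} = 1-\text{sensitivity},
\qquad
\text{false-positive proportion} = \frac{FP}{FP+TN} = 1-\text{specificity}.
```

The error described in the abstract amounts to computing 1 − PPV = FP/(TP + FP) instead, which conditions on the test result rather than on the true disease status.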
QUAST: quality assessment tool for genome assemblies
Gurevich, Alexey; Saveliev, Vladislav; Vyahhi, Nikolay; Tesler, Glenn
2013-01-01
Summary: Limitations of genome sequencing techniques have led to dozens of assembly algorithms, none of which is perfect. A number of methods for comparing assemblers have been developed, but none is yet a recognized benchmark. Further, most existing methods for comparing assemblies are only applicable to new assemblies of finished genomes; the problem of evaluating assemblies of previously unsequenced species has not been adequately considered. Here, we present QUAST—a quality assessment tool for evaluating and comparing genome assemblies. This tool improves on leading assembly comparison software with new ideas and quality metrics. QUAST can evaluate assemblies both with a reference genome, as well as without a reference. QUAST produces many reports, summary tables and plots to help scientists in their research and in their publications. In this study, we used QUAST to compare several genome assemblers on three datasets. QUAST tables and plots for all of them are available in the Supplementary Material, and interactive versions of these reports are on the QUAST website. Availability: http://bioinf.spbau.ru/quast Contact: gurevich@bioinf.spbau.ru Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23422339
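As one concrete example of the contiguity metrics such a tool reports, the sketch below computes N50; this is a generic illustration of the metric, not QUAST's implementation.

```python
def n50(contig_lengths):
    """Smallest contig length L such that contigs of length >= L
    together cover at least half of the total assembly length."""
    lengths = sorted(contig_lengths, reverse=True)
    half_total = sum(lengths) / 2
    running = 0
    for length in lengths:
        running += length
        if running >= half_total:
            return length

print(n50([100, 80, 60, 40, 30, 20]))  # total 330, half 165; 100 + 80 = 180 >= 165, so N50 = 80
```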
Possible astronomical references in two megalithic building of ancient Latium
NASA Astrophysics Data System (ADS)
Magli, G.
In the wide area of the ancient Latium Vetus - roughly enclosed within the coast and the Apennines between Rome and Terracina, in Central Italy - there are several examples of town's walls and buildings constructed with the spectacular megalithic technique called polygonal, in which enormous blocks are cut in irregular shapes and perfectly fit together without mortar. In many cases, for instance in Alatri, Arpino, Circei, Norba and Segni, the megalithic size of the blocks and the ingenuity in construction reach the same magnificence and impression of power and pride which characterize the worldwide famous Mycenaean towns of Tiryns and Mycenae, constructed around the XIII century BC. In Italy however, all polygonal walls are currently attributed to the Romans, and dated to the first centuries of the Roman republic (V-III century BC), although for most of these constructions no reliable stratigraphy is available. In the present work, which is part of an ongoing project aiming at a complete study of these buildings, we investigate the possible astronomical references in the planning of two among the most imposing of them, namely the so called Acropolis of Alatri and Circei.
16 CFR 23.12 - Misuse of the words “flawless,” “perfect,” etc.
Code of Federal Regulations, 2012 CFR
2012-01-01
...,” “perfect,” etc. (a) It is unfair or deceptive to use the word “flawless” to describe any diamond that... a person skilled in diamond grading. (b) It is unfair or deceptive to use the word “perfect,” or any representation of similar meaning, to describe any diamond unless the diamond meets the definition of “flawless...
16 CFR 23.12 - Misuse of the words “flawless,” “perfect,” etc.
Code of Federal Regulations, 2011 CFR
2011-01-01
...,” “perfect,” etc. (a) It is unfair or deceptive to use the word “flawless” to describe any diamond that... a person skilled in diamond grading. (b) It is unfair or deceptive to use the word “perfect,” or any representation of similar meaning, to describe any diamond unless the diamond meets the definition of “flawless...
16 CFR 23.12 - Misuse of the words “flawless,” “perfect,” etc.
Code of Federal Regulations, 2013 CFR
2013-01-01
...,” “perfect,” etc. (a) It is unfair or deceptive to use the word “flawless” to describe any diamond that... a person skilled in diamond grading. (b) It is unfair or deceptive to use the word “perfect,” or any representation of similar meaning, to describe any diamond unless the diamond meets the definition of “flawless...
16 CFR 23.12 - Misuse of the words “flawless,” “perfect,” etc.
Code of Federal Regulations, 2014 CFR
2014-01-01
...,” “perfect,” etc. (a) It is unfair or deceptive to use the word “flawless” to describe any diamond that... a person skilled in diamond grading. (b) It is unfair or deceptive to use the word “perfect,” or any representation of similar meaning, to describe any diamond unless the diamond meets the definition of “flawless...
A numerical study of some potential sources of error in side-by-side seismometer evaluations
Holcomb, L. Gary
1990-01-01
This report presents the results of a series of computer simulations of potential errors in test data, which might be obtained when conducting side-by-side comparisons of seismometers. These results can be used as guides in estimating potential sources and magnitudes of errors one might expect when analyzing real test data. First, the derivation of a direct method for calculating the noise levels of two sensors in a side-by-side evaluation is repeated and extended slightly herein. The bulk of this derivation was presented previously (see Holcomb 1989); it is repeated here for easy reference. This method is applied to the analysis of a simulated test of two sensors in a side-by-side test in which the outputs of both sensors consist of white noise spectra with known signal-to-noise ratios (SNR's). This report extends this analysis to high SNR's to determine the limitations of the direct method for calculating the noise levels at signal-to-noise levels which are much higher than presented previously (see Holcomb 1989). Next, the method is used to analyze a simulated test of two sensors in a side-by-side test in which the outputs of both sensors consist of bandshaped noise spectra with known signal-to-noise ratios. This is a much more realistic representation of real world data because the earth's background spectrum is certainly not flat. Finally, the results of the analysis of simulated white and bandshaped side-by-side test data are used to assist in interpreting the analysis of the effects of simulated azimuthal misalignment in side-by-side sensor evaluations. A thorough understanding of azimuthal misalignment errors is important because of the physical impossibility of perfectly aligning two sensors in a real world situation. The analysis herein indicates that alignment errors place lower limits on the levels of system noise which can be resolved in a side-by-side measurement. It also indicates that alignment errors are the source of the fact that real data noise spectra tend to follow the earth's background spectra in shape.
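The report's direct method is not reproduced here; as a rough illustration of the general idea of separating coherent ground signal from incoherent instrument noise in a side-by-side pair, the sketch below subtracts the magnitude of the cross-spectrum from each auto-spectrum. This is a generic two-sensor estimate under the assumptions of identical responses and mutually uncorrelated self-noise, not necessarily Holcomb's exact formulation, and the data are synthetic.

```python
import numpy as np
from scipy.signal import csd, welch

def two_sensor_noise_psd(x1, x2, fs, nperseg=4096):
    """Rough two-sensor noise estimate: auto-spectrum minus coherent (cross-spectral) power.

    Assumes both sensors see the same ground motion with identical response
    and that their self-noise is mutually uncorrelated."""
    f, p11 = welch(x1, fs=fs, nperseg=nperseg)
    _, p22 = welch(x2, fs=fs, nperseg=nperseg)
    _, p12 = csd(x1, x2, fs=fs, nperseg=nperseg)
    coherent = np.abs(p12)                 # estimate of the common (earth) signal power
    n1 = np.clip(p11 - coherent, 0, None)  # self-noise estimates, floored at zero
    n2 = np.clip(p22 - coherent, 0, None)
    return f, n1, n2

# Synthetic check: a shared band-limited "signal" plus independent white self-noise.
rng = np.random.default_rng(0)
fs, n = 100.0, 2**16
signal = np.convolve(rng.normal(size=n), np.ones(25) / 25, mode="same")
x1 = signal + 0.1 * rng.normal(size=n)
x2 = signal + 0.1 * rng.normal(size=n)
f, n1, n2 = two_sensor_noise_psd(x1, x2, fs)
```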
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mirshafieyan, Seyed Sadreddin; Luk, Ting S.; Guo, Junpeng
Here, we demonstrated perfect light absorption in optical nanocavities made of ultra-thin percolation aluminum and silicon films deposited on an aluminum surface. The total layer thickness of the aluminum and silicon films is one order of magnitude less than the perfect absorption wavelength in the visible spectral range. The ratio of silicon cavity layer thickness to perfect absorption wavelength decreases as wavelength decreases due to the increased phase delays at silicon-aluminum boundaries at shorter wavelengths. It is explained that perfect light absorption is due to critical coupling of the incident wave to the fundamental Fabry-Perot resonance mode of the structure, where the round-trip phase delay is zero. Simulations were performed and the results agree well with the measurement results.
Coherent Perfect Rotation: The conservative analogue of CPA
NASA Astrophysics Data System (ADS)
Crescimanno, Michael; Dawson, Nathan; Andrews, James
2012-06-01
The two classes of conservative, linear, optical rotary effects (optical activity and Faraday rotation) are distinguished by their behavior under time reversal. In analogy with coherent perfect absorption (CPA) resonances, where counter-propagating light fields are completely converted into other degrees of freedom, we show that in a linear conservative medium only time-odd (Faraday) rotation is capable of coherent perfect rotation, by which we mean the complete transfer of any arbitrarily oriented polarization of light into the other orthogonal polarization via the application of phased counter-propagating light fields. This contributes to the understanding of the importance of time reversal symmetry in perfect mode conversion that may be of use in optical device design.
Coherent perfect absorber and laser modes in purely imaginary metamaterials
NASA Astrophysics Data System (ADS)
Fu, Yangyang; Cao, Yanyan; Cummer, Steven A.; Xu, Yadong; Chen, Huanyang
2017-10-01
Conjugate metamaterials, in which the permittivity and the permeability are complex conjugates of each other, possess the elements of loss and gain simultaneously. By employing a conjugate metamaterial with a purely imaginary form, we propose a mechanism for realizing both coherent perfect absorber (CPA) and laser modes. Moreover, the general conditions for obtaining CPA and laser modes, including obtaining them simultaneously, are revealed by analyzing the wave scattering properties of a slab made of purely imaginary metamaterials (PIMs). Specifically, in a PIM slab with a subunity effective refractive index, the CPA mode can be simplified as a perfect absorption mode and the incident wave from one side could be perfectly absorbed.
Perfect Circular Dichroism in the Haldane Model
NASA Astrophysics Data System (ADS)
Ghalamkari, Kazu; Tatsumi, Yuki; Saito, Riichiro
2018-06-01
We theoretically show that perfect circular dichroism (CD) occurs in the Haldane model in which the two-dimensional (2D) material absorbs only either left-handed or right-handed circularly polarized light. Perfect CD occurs in the phase diagram of the Haldane model when the zero-field quantum Hall conductivity has a nonzero value. The coincidence of the occurrence of perfect CD and zero-field quantum Hall effect is attributed to the fact that the effect of broken time-reversal symmetry is larger than the effect of broken inversion symmetry. On the other hand, valley polarization and perfect CD occur exclusively in the phase diagram. Further, for the four regions of the phase diagram, pseudospin polarization occurs at the K and K' points in the hexagonal Brillouin zone with either the same sign or opposite sign for the K and K' points and for the valence and conduction bands. This theoretical prediction may have an impact on search for a new optical device that selects circularly polarized light controlled by the electric field.
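For context, the tight-binding Haldane Hamiltonian underlying this phase diagram is usually written (up to sign conventions) with nearest-neighbour hopping t1, complex next-nearest-neighbour hopping t2 e^{iφ}, and a staggered sublattice potential M:

```latex
H = t_1 \sum_{\langle i,j \rangle} c_i^{\dagger} c_j
  + t_2 \sum_{\langle\langle i,j \rangle\rangle} e^{\,i\nu_{ij}\phi}\, c_i^{\dagger} c_j
  + M \sum_i \epsilon_i\, c_i^{\dagger} c_i ,
\qquad \epsilon_i = \pm 1,\ \ \nu_{ij} = \pm 1 .
```

The region with nonzero zero-field Hall conductivity referred to above is conventionally bounded by M = ±3√3 t2 sin φ in the (φ, M/t2) plane.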
NASA Astrophysics Data System (ADS)
Avelino, P. P.; Azevedo, R. P. L.
2018-03-01
In this paper we show that the on-shell Lagrangian of a perfect fluid depends on microscopic properties of the fluid, giving specific examples of perfect fluids with different on-shell Lagrangians but with the same energy-momentum tensor. We demonstrate that if the fluid is constituted by localized concentrations of energy with fixed rest mass and structure (solitons) then the average on-shell Lagrangian of a perfect fluid is given by Lm = T, where T is the trace of the energy-momentum tensor. We show that our results have profound implications for theories of gravity where the matter Lagrangian appears explicitly in the equations of motion of the gravitational and matter fields, potentially leading to observable deviations from a nearly perfect cosmic microwave background black body spectrum: n-type spectral distortions, affecting the normalization of the spectral energy density. Finally, we put stringent constraints on f(R, Lm) theories of gravity using the COBE-FIRAS measurement of the spectral radiance of the cosmic microwave background.
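For reference, the perfect-fluid energy-momentum tensor and its trace entering the Lm = T result read, in the (−,+,+,+) metric signature (standard definitions, not specific to this paper; the trace is ρ − 3p in the opposite signature convention):

```latex
T^{\mu\nu} = (\rho + p)\, u^{\mu} u^{\nu} + p\, g^{\mu\nu},
\qquad
T \equiv g_{\mu\nu} T^{\mu\nu} = 3p - \rho .
```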
The perils of the imperfect expectation of the perfect baby.
Chervenak, Frank A; McCullough, Laurence B; Brent, Robert L
2010-08-01
Advances in modern medicine invite the assumption that medicine can control human biology. There is a perilous logic that leads from expectations of medicine's control over reproductive biology to the expectation of having a perfect baby. This article proposes that obstetricians should take a preventive ethics approach to the care of pregnant women with expectations for a perfect baby. We use Nathaniel Hawthorne's classic short story, "The Birthmark," to illustrate the perils of the logic of control and perfection through science and then identify possible contemporary sources of the expectation of the perfect baby. We propose that the informed consent process should be used as a preventive ethics tool throughout the course of pregnancy to educate pregnant women about the inherent errors of human reproduction, the highly variable clinical outcomes of these errors, the limited capacity of medicine to detect these errors, and the even more limited capacity to correct them. Copyright (c) 2010 Mosby, Inc. All rights reserved.
A hybrid of monopoly and perfect competition model for hi-tech products
NASA Astrophysics Data System (ADS)
Yang, P. C.; Wee, H. M.; Pai, S.; Yang, H. J.; Wee, P. K. P.
2010-11-01
For Hi-tech products, the demand rate, the component cost as well as the selling price usually decline significantly with time. In the case of perfect competition, shortages usually result in lost sales; while in a monopoly, shortages will be completely backordered. However, neither perfect competition nor monopoly exists. Therefore, there is a need to develop a replenishment model considering a hybrid of perfect competition and monopoly when the cost, price and demand are decreasing simultaneously. A numerical example and sensitivity analysis are carried out to illustrate this model. The results show that a higher decline-rate in the component cost leads to a smaller service level and a larger replenishment interval. When the component cost decline rate increases and the selling price decline rate decreases simultaneously, the replenishment interval decreases. In perfect competition it is better to have a high service level, while for the case with monopoly, keeping a low service level is better due to complete backordering.
ERIC Educational Resources Information Center
Jackson, Carolyn; Nyström, Anne-Sofie
2015-01-01
Discourses about the value of effort and hard work are prevalent and powerful in many western societies and educational contexts. Yet, paradoxically, in these same contexts effortless achievement is often lauded, and in certain discourses is heralded as the pinnacle of success and a sign of genius. In this paper we interrogate discourses about…
Standardization of Performance Tests: A Proposal for Further Steps.
1986-07-01
obviously demand substantial attention can sometimes be time shared perfectly. Wickens describes cases in which skilled pianists can time share sight-reading...effects of divided attention on information processing in tracking. Journal of Experimental Psychology, 1, 1-13. Wickens, C.D. (1984). Processing resources... attention he regards focused-divided attention tasks (e.g. dichotic listening, dual task situations) as theoretically useful. From his point of view good
Using arborescences to estimate hierarchicalness in directed complex networks
2018-01-01
Complex networks are a useful tool for the understanding of complex systems. One of the emerging properties of such systems is their tendency to form hierarchies: networks can be organized in levels, with nodes in each level exerting control on the ones beneath them. In this paper, we focus on the problem of estimating how hierarchical a directed network is. We propose a structural argument: a network has a strong top-down organization if we need to delete only few edges to reduce it to a perfect hierarchy—an arborescence. In an arborescence, all edges point away from the root and there are no horizontal connections, both characteristics we desire in our idealization of what a perfect hierarchy requires. We test our arborescence score in synthetic and real-world directed networks against the current state of the art in hierarchy detection: agony, flow hierarchy and global reaching centrality. These tests highlight that our arborescence score is intuitive and we can visualize it; it is able to better distinguish between networks with and without a hierarchical structure; it agrees the most with the literature about the hierarchy of well-studied complex systems; and it is not just a score, but it provides an overall scheme of the underlying hierarchy of any directed complex network. PMID:29381761
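As a rough, runnable illustration of the idea of "few deletions to reach a perfect hierarchy" (not the paper's exact score), one can compare the size of a maximum branching, the largest edge set in which every node has at most one parent and there are no cycles, against the total number of edges:

```python
import networkx as nx
from networkx.algorithms.tree.branchings import maximum_branching

def branching_hierarchy_score(g: nx.DiGraph) -> float:
    """Fraction of edges retained in a maximum branching of g.

    Values near 1 mean few edges must be deleted to leave a forest of
    arborescences; this is an illustrative proxy, not the paper's measure."""
    if g.number_of_edges() == 0:
        return 0.0
    branching = maximum_branching(g)  # unweighted edges default to weight 1, so edge count is maximized
    return branching.number_of_edges() / g.number_of_edges()

tree = nx.DiGraph([(0, 1), (0, 2), (1, 3), (1, 4)])           # already an arborescence
noisy = nx.DiGraph([(0, 1), (0, 2), (1, 3), (3, 1), (2, 1)])  # cycle and extra parents
print(branching_hierarchy_score(tree))   # 1.0
print(branching_hierarchy_score(noisy))  # 0.6
```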
Nearly Perfect Durable Superhydrophobic Surfaces Fabricated by a Simple One-Step Plasma Treatment.
Ryu, Jeongeun; Kim, Kiwoong; Park, JooYoung; Hwang, Bae Geun; Ko, YoungChul; Kim, HyunJoo; Han, JeongSu; Seo, EungRyeol; Park, YongJong; Lee, Sang Joon
2017-05-16
Fabrication of superhydrophobic surfaces is an area of great interest because it can be applicable to various engineering fields. A simple, safe and inexpensive fabrication process is required to fabricate applicable superhydrophobic surfaces. In this study, we developed a facile fabrication method of nearly perfect superhydrophobic surfaces through plasma treatment with argon and oxygen gases. A polytetrafluoroethylene (PTFE) sheet was selected as a substrate material. We optimized the fabrication parameters to produce superhydrophobic surfaces of superior performance using the Taguchi method. The contact angle of the pristine PTFE surface is approximately 111.0° ± 2.4°, with a sliding angle of 12.3° ± 6.4°. After the plasma treatment, nano-sized spherical tips, which looked like crown-structures, were created. This PTFE sheet exhibits the maximum contact angle of 178.9°, with a sliding angle less than 1°. As a result, this superhydrophobic surface requires a small external force to detach water droplets dripped on the surface. The contact angle of the fabricated superhydrophobic surface is almost retained, even after performing an air-aging test for 80 days and a droplet impacting test for 6 h. This fabrication method can provide superb superhydrophobic surface using simple one-step plasma etching.
Blunt Body Near-Wake Flow Field at Mach 10
NASA Technical Reports Server (NTRS)
Horvath, Thomas; Hannemann, Klaus
1997-01-01
Tests were conducted in a Mach 10 air flow to examine the reattachment process of a free shear layer associated with the near wake of a 70 deg half angle, spherically blunted cone having a cylindrical after body. The nominal free-stream Reynolds number based on model diameter ranged from 0.25 x 10(exp 6) to 1 x 10(exp 6) and the angle of incidence was set at 0 and +/- 20 deg. The present study was designed to complement previously reported Mach 6 perfect air tests as well as results obtained in several hypervelocity facilities capable of producing real gas effects. Surface heating rates were inferred from temperature time histories from coaxial surface thermocouples on the model forebody and thin film resistance gages along the model base and cylindrical after body. Limited forebody, base, and support sting surface pressures were obtained with piezoresistive transducers. Experimental results are compared to laminar perfect gas predictions provided by a 3-D Navier-Stokes code (NSHYP). Shear layer impingement on the instrumented cylindrical after body resulted in a localized heating maximum that was 16 to 18 percent of the forebody stagnation-point value and a factor of 2 higher than laminar predictions, suggesting a transitional or turbulent shear layer.
NASA Astrophysics Data System (ADS)
Wootten, A.; Dixon, K. W.; Lanzante, J. R.; Mcpherson, R. A.
2017-12-01
Empirical statistical downscaling (ESD) approaches attempt to refine global climate model (GCM) information via statistical relationships between observations and GCM simulations. The aim of such downscaling efforts is to create added-value climate projections by adding finer spatial detail and reducing biases. The results of statistical downscaling exercises are often used in impact assessments under the assumption that past performance provides an indicator of future results. Given prior research describing the danger of this assumption with regards to temperature, this study expands the perfect model experimental design from previous case studies to test the stationarity assumption with respect to precipitation. Assuming stationarity implies the performance of ESD methods are similar between the future projections and historical training. Case study results from four quantile-mapping based ESD methods demonstrate violations of the stationarity assumption for both central tendency and extremes of precipitation. These violations vary geographically and seasonally. For the four ESD methods tested the greatest challenges for downscaling of daily total precipitation projections occur in regions with limited precipitation and for extremes of precipitation along Southeast coastal regions. We conclude with a discussion of future expansion of the perfect model experimental design and the implications for improving ESD methods and providing guidance on the use of ESD techniques for impact assessments and decision-support.
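The quantile-mapping family of ESD methods referred to above can be illustrated with a minimal empirical quantile-mapping bias correction; this is a generic sketch with synthetic data, not any of the four methods tested in the study.

```python
import numpy as np

def empirical_quantile_map(obs_hist, model_hist, model_future):
    """Map each future model value to the observed value at the same
    empirical quantile of the historical training period.

    Assumes the historical model-to-observation quantile relationship
    still holds in the future, i.e. the stationarity assumption under test."""
    quantiles = np.linspace(0.0, 1.0, 101)
    model_q = np.quantile(model_hist, quantiles)
    obs_q = np.quantile(obs_hist, quantiles)
    # Locate each future value on the historical model CDF, then read off
    # the observed value at that quantile.
    return np.interp(model_future, model_q, obs_q)

rng = np.random.default_rng(1)
obs = rng.gamma(shape=2.0, scale=3.0, size=5000)       # stand-in for observed precipitation
gcm_hist = rng.gamma(shape=2.0, scale=4.0, size=5000)  # biased model climate, historical period
gcm_future = rng.gamma(shape=2.0, scale=4.5, size=5000)
corrected = empirical_quantile_map(obs, gcm_hist, gcm_future)
```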
Phase-shifting point diffraction interferometer
Medecki, H.
1998-11-10
Disclosed is a point diffraction interferometer for evaluating the quality of a test optic. In operation, the point diffraction interferometer includes a source of radiation, the test optic, a beam divider, a reference wave pinhole located at an image plane downstream from the test optic, and a detector for detecting an interference pattern produced between a reference wave emitted by the pinhole and a test wave emitted from the test optic. The beam divider produces separate reference and test beams which focus at different laterally separated positions on the image plane. The reference wave pinhole is placed at a region of high intensity (e.g., the focal point) for the reference beam. This allows reference wave to be produced at a relatively high intensity. Also, the beam divider may include elements for phase shifting one or both of the reference and test beams. 8 figs.
Phase-shifting point diffraction interferometer
Medecki, Hector
1998-01-01
Disclosed is a point diffraction interferometer for evaluating the quality of a test optic. In operation, the point diffraction interferometer includes a source of radiation, the test optic, a beam divider, a reference wave pinhole located at an image plane downstream from the test optic, and a detector for detecting an interference pattern produced between a reference wave emitted by the pinhole and a test wave emitted from the test optic. The beam divider produces separate reference and test beams which focus at different laterally separated positions on the image plane. The reference wave pinhole is placed at a region of high intensity (e.g., the focal point) for the reference beam. This allows reference wave to be produced at a relatively high intensity. Also, the beam divider may include elements for phase shifting one or both of the reference and test beams.
Zeroth Poisson Homology, Foliated Cohomology and Perfect Poisson Manifolds
NASA Astrophysics Data System (ADS)
Martínez-Torres, David; Miranda, Eva
2018-01-01
We prove that, for compact regular Poisson manifolds, the zeroth homology group is isomorphic to the top foliated cohomology group, and we give some applications. In particular, we show that, for regular unimodular Poisson manifolds, top Poisson and foliated cohomology groups are isomorphic. Inspired by the symplectic setting, we define what a perfect Poisson manifold is. We use these Poisson homology computations to provide families of perfect Poisson manifolds.
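For readers unfamiliar with the first object compared above: the zeroth Poisson homology of a Poisson manifold M is the space of smooth functions modulo Poisson brackets (standard definition), while the top foliated cohomology is taken along the symplectic foliation:

```latex
HP_0(M) \;=\; C^{\infty}(M)\,\big/\,\{C^{\infty}(M),\,C^{\infty}(M)\} .
```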
Developments in Coherent Perfect Polarization Rotation
NASA Astrophysics Data System (ADS)
Crescimanno, Michael; Andrews, James; Zhou, Chaunhong; Baker, Michael
2015-05-01
Coherent Perfect Polarization Rotation (CPR) is a useful technique akin to Coherent Perfect Absorption (CPA, also known as the anti-laser) but that results in very high efficiency optical mode conversion. We describe the analysis of recent experimental data from our CPR testbed, the use of CPR in miniaturizing optical isolators and CPR phenomena in non-linear optics. Work supported by the N.S.F. under Grant No. ECCS-1360725.
16 CFR 23.12 - Misuse of the words “flawless,” “perfect,” etc.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Misuse of the words “flawless,” “perfect,”... GUIDES FOR THE JEWELRY, PRECIOUS METALS, AND PEWTER INDUSTRIES § 23.12 Misuse of the words “flawless,” “perfect,” etc. (a) It is unfair or deceptive to use the word “flawless” to describe any diamond that...
16 CFR 23.26 - Misuse of the words “flawless,” “perfect,” etc.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Misuse of the words âflawless,â âperfect,â... GUIDES FOR THE JEWELRY, PRECIOUS METALS, AND PEWTER INDUSTRIES § 23.26 Misuse of the words “flawless,” “perfect,” etc. (a) It is unfair or deceptive to use the word “flawless” as a quality description of any...
The Perfect Glass Paradigm: Disordered Hyperuniform Glasses Down to Absolute Zero
NASA Astrophysics Data System (ADS)
Zhang, G.; Stillinger, F. H.; Torquato, S.
2016-11-01
Rapid cooling of liquids below a certain temperature range can result in a transition to glassy states. The traditional understanding of glasses includes their thermodynamic metastability with respect to crystals. However, here we present specific examples of interactions that eliminate the possibilities of crystalline and quasicrystalline phases, while creating mechanically stable amorphous glasses down to absolute zero temperature. We show that this can be accomplished by introducing a new ideal state of matter called a “perfect glass”. A perfect glass represents a soft-interaction analog of the maximally random jammed (MRJ) packings of hard particles. These latter states can be regarded as the epitome of a glass since they are out of equilibrium, maximally disordered, hyperuniform, mechanically rigid with infinite bulk and shear moduli, and can never crystallize due to configuration-space trapping. Our model perfect glass utilizes two-, three-, and four-body soft interactions while simultaneously retaining the salient attributes of the MRJ state. These models constitute a theoretical proof of concept for perfect glasses and broaden our fundamental understanding of glass physics. A novel feature of equilibrium systems of identical particles interacting with the perfect-glass potential at positive temperature is that they have a non-relativistic speed of sound that is infinite.
Perfect and broadband acoustic absorption by critically coupled sub-wavelength resonators.
Romero-García, V; Theocharis, G; Richoux, O; Merkel, A; Tournat, V; Pagneux, V
2016-01-19
Perfect absorption is an interdisciplinary topic with a large number of applications, the challenge of which consists of broadening its inherently narrow frequency-band performance. We experimentally and analytically report perfect and broadband absorption for audible sound, by the mechanism of critical coupling, with a sub-wavelength multi-resonant scatterer (SMRS) made of a plate-resonator/closed waveguide structure. In order to introduce the role of the key parameters, we first present the case of a single resonant scatterer (SRS) made of a Helmholtz resonator/closed waveguide structure. In both cases the controlled balance between the energy leakage of the several resonances and the inherent losses of the system leads to perfect absorption peaks. In the case of the SMRS we show that systems with large inherent losses can be critically coupled using resonances with large leakage. In particular, we show that in the SMRS system, with a thickness of λ/12 and diameter of λ/7, several perfect absorption peaks overlap to produce absorption bigger than 93% for frequencies that extend over a factor of 2 in audible frequencies. The reported concepts and methodology provide guidelines for the design of broadband perfect absorbers which could contribute to solve the major issue of noise reduction.
Perfect and broadband acoustic absorption by critically coupled sub-wavelength resonators
NASA Astrophysics Data System (ADS)
Romero-García, V.; Theocharis, G.; Richoux, O.; Merkel, A.; Tournat, V.; Pagneux, V.
2016-01-01
Perfect absorption is an interdisciplinary topic with a large number of applications, the challenge of which consists of broadening its inherently narrow frequency-band performance. We experimentally and analytically report perfect and broadband absorption for audible sound, by the mechanism of critical coupling, with a sub-wavelength multi-resonant scatterer (SMRS) made of a plate-resonator/closed waveguide structure. In order to introduce the role of the key parameters, we first present the case of a single resonant scatterer (SRS) made of a Helmholtz resonator/closed waveguide structure. In both cases the controlled balance between the energy leakage of the several resonances and the inherent losses of the system leads to perfect absorption peaks. In the case of the SMRS we show that systems with large inherent losses can be critically coupled using resonances with large leakage. In particular, we show that in the SMRS system, with a thickness of λ/12 and diameter of λ/7, several perfect absorption peaks overlap to produce absorption bigger than 93% for frequencies that extend over a factor of 2 in audible frequencies. The reported concepts and methodology provide guidelines for the design of broadband perfect absorbers which could contribute to solve the major issue of noise reduction.
Perfect and broadband acoustic absorption by critically coupled sub-wavelength resonators
Romero-García, V.; Theocharis, G.; Richoux, O.; Merkel, A.; Tournat, V.; Pagneux, V.
2016-01-01
Perfect absorption is an interdisciplinary topic with a large number of applications, the main challenge of which is broadening its inherently narrow frequency-band performance. We experimentally and analytically report perfect and broadband absorption for audible sound, by the mechanism of critical coupling, with a sub-wavelength multi-resonant scatterer (SMRS) made of a plate-resonator/closed-waveguide structure. To introduce the role of the key parameters, we first present the case of a single resonant scatterer (SRS) made of a Helmholtz resonator/closed-waveguide structure. In both cases, the controlled balance between the energy leakage of the several resonances and the inherent losses of the system leads to perfect absorption peaks. In the case of the SMRS, we show that systems with large inherent losses can be critically coupled using resonances with large leakage. In particular, we show that in the SMRS system, with a thickness of λ/12 and a diameter of λ/7, several perfect absorption peaks overlap to produce absorption greater than 93% over a frequency range spanning a factor of 2 in the audible band. The reported concepts and methodology provide guidelines for the design of broadband perfect absorbers, which could contribute to solving the major issue of noise reduction. PMID:26781863
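As an illustrative aside (standard temporal coupled-mode theory, not reproduced from the paper above), the critical-coupling condition invoked in these abstracts can be stated for a single resonator terminating a closed waveguide. With resonance frequency \omega_0, leakage rate \gamma_{\mathrm{leak}} and inherent loss rate \gamma_{\mathrm{loss}} (notation mine), the absorption is

A(\omega) = 1 - |r(\omega)|^2 = \frac{4\,\gamma_{\mathrm{leak}}\,\gamma_{\mathrm{loss}}}{(\omega-\omega_0)^2 + (\gamma_{\mathrm{leak}}+\gamma_{\mathrm{loss}})^2},

so A(\omega_0) = 1 exactly when \gamma_{\mathrm{leak}} = \gamma_{\mathrm{loss}}; any imbalance leaves a residual reflection |r(\omega_0)|^2 = \left[(\gamma_{\mathrm{leak}}-\gamma_{\mathrm{loss}})/(\gamma_{\mathrm{leak}}+\gamma_{\mathrm{loss}})\right]^2. This is the balance between energy leakage and inherent losses that the work above exploits.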
The Perfect Glass Paradigm: Disordered Hyperuniform Glasses Down to Absolute Zero.
Zhang, G; Stillinger, F H; Torquato, S
2016-11-28
Rapid cooling of liquids below a certain temperature range can result in a transition to glassy states. The traditional understanding of glasses includes their thermodynamic metastability with respect to crystals. However, here we present specific examples of interactions that eliminate the possibilities of crystalline and quasicrystalline phases, while creating mechanically stable amorphous glasses down to absolute zero temperature. We show that this can be accomplished by introducing a new ideal state of matter called a "perfect glass". A perfect glass represents a soft-interaction analog of the maximally random jammed (MRJ) packings of hard particles. These latter states can be regarded as the epitome of a glass since they are out of equilibrium, maximally disordered, hyperuniform, mechanically rigid with infinite bulk and shear moduli, and can never crystallize due to configuration-space trapping. Our model perfect glass utilizes two-, three-, and four-body soft interactions while simultaneously retaining the salient attributes of the MRJ state. These models constitute a theoretical proof of concept for perfect glasses and broaden our fundamental understanding of glass physics. A novel feature of equilibrium systems of identical particles interacting with the perfect-glass potential at positive temperature is that they have a non-relativistic speed of sound that is infinite.
The Perfect Glass Paradigm: Disordered Hyperuniform Glasses Down to Absolute Zero
Zhang, G.; Stillinger, F. H.; Torquato, S.
2016-01-01
Rapid cooling of liquids below a certain temperature range can result in a transition to glassy states. The traditional understanding of glasses includes their thermodynamic metastability with respect to crystals. However, here we present specific examples of interactions that eliminate the possibilities of crystalline and quasicrystalline phases, while creating mechanically stable amorphous glasses down to absolute zero temperature. We show that this can be accomplished by introducing a new ideal state of matter called a “perfect glass”. A perfect glass represents a soft-interaction analog of the maximally random jammed (MRJ) packings of hard particles. These latter states can be regarded as the epitome of a glass since they are out of equilibrium, maximally disordered, hyperuniform, mechanically rigid with infinite bulk and shear moduli, and can never crystallize due to configuration-space trapping. Our model perfect glass utilizes two-, three-, and four-body soft interactions while simultaneously retaining the salient attributes of the MRJ state. These models constitute a theoretical proof of concept for perfect glasses and broaden our fundamental understanding of glass physics. A novel feature of equilibrium systems of identical particles interacting with the perfect-glass potential at positive temperature is that they have a non-relativistic speed of sound that is infinite. PMID:27892452
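For reference, the term "hyperuniform" used in the perfect-glass abstracts above has a standard definition (not quoted from the paper): a configuration of N particles at positions \mathbf{r}_j is hyperuniform when its structure factor, evaluated away from the forward direction, vanishes at long wavelengths,

S(\mathbf{k}) = \frac{1}{N}\Big|\sum_{j=1}^{N} e^{-i\,\mathbf{k}\cdot\mathbf{r}_j}\Big|^{2} \;\longrightarrow\; 0 \quad \text{as } |\mathbf{k}| \to 0,

i.e., long-wavelength density fluctuations are anomalously suppressed relative to an ordinary disordered system, for which S(\mathbf{k}) tends to a positive constant.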
Yong, Zhengdong; Zhang, Senlin; Gong, Chensheng; He, Sailing
2016-01-01
Plasmonics offers an exciting way to mediate the interaction between light and matter, allowing strong field enhancement and confinement, and large absorption and scattering at resonance. However, simultaneous realization of ultra-narrow-band perfect absorption and electromagnetic field enhancement is challenging due to the intrinsic high optical losses and radiative damping in metals. Here, we propose an all-metal plasmonic absorber with an absorption bandwidth of less than 8 nm and polarization-insensitive absorptivity exceeding 99%. Unlike traditional metal-dielectric-metal configurations, we demonstrate that the narrowband perfect absorption and field enhancement are ascribed to the vertical gap plasmonic mode at the deep-subwavelength scale, which has a high quality factor of 120 and a mode volume of about 10^(-4) × (λ_res/n)^3. Based on coupled-mode theory, we verify that the diluted field enhancement is proportional to the absorption, and thus perfect absorption is critical to maximum field enhancement. In addition, the proposed perfect absorber can be operated as a refractive index sensor with a sensitivity of 885 nm/RIU and a figure of merit as high as 110. It provides a new design strategy for narrow-band perfect absorption and local field enhancement, and has potential applications in biosensors, filters and nonlinear optics. PMID:27046540
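As a quick consistency check on the sensing figures quoted above, and assuming the common definition of the figure of merit as sensitivity divided by resonance linewidth (an assumption on my part, not stated in the abstract),

\mathrm{FOM} = \frac{S}{\mathrm{FWHM}} \approx \frac{885\ \text{nm/RIU}}{8\ \text{nm}} \approx 110\ \text{RIU}^{-1},

which agrees with the reported value of about 110.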
NASA Astrophysics Data System (ADS)
Chen, Menglin L. N.; Jiang, Li Jun; Sha, Wei E. I.
2016-02-01
Orbital angular momentum (OAM) is a promising degree of freedom for fundamental studies in electromagnetics and quantum mechanics. The unlimited state space of OAM shows a great potential to enhance the channel capacities of classical and quantum communications. By exploiting the Pancharatnam-Berry phase concept and engineering anisotropic scatterers in a metasurface with spatially varying orientations, a plane wave with zero OAM can be converted to a vortex beam carrying nonzero OAM. In this paper, we propose two types of novel perfect electric conductor-perfect magnetic conductor anisotropic metasurfaces. One is composed of azimuthally continuous loops and the other is constructed from azimuthally discontinuous dipole scatterers. Both types of metasurfaces are mounted on a mushroom-type high-impedance surface. Compared to previous metasurface designs for generating OAM, the proposed ones achieve nearly perfect conversion efficiency. Owing to the eliminated vertical component of the electric field, the continuous metasurface shows a very smooth phase pattern in the near-field region, which cannot be achieved by conventional metasurfaces composed of discrete scatterers. On the other hand, the metasurface with discrete dipole scatterers shows great flexibility in generating OAM with arbitrary topological charges. Our work is fundamentally and practically important to high-performance OAM generation.
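A hedged note on the Pancharatnam-Berry mechanism mentioned above (the textbook geometric-phase relation, not this paper's specific derivation): a half-wave-like anisotropic scatterer rotated by an angle \theta imparts a geometric phase of 2\sigma\theta to circularly polarized light of handedness \sigma = \pm 1, so spatially varying the orientation as \theta(\varphi) = l\varphi/2 around the azimuth \varphi converts an incident plane wave into a vortex beam,

\Phi_{\mathrm{PB}}(\varphi) = 2\sigma\,\theta(\varphi) = \sigma\, l\,\varphi \;\;\Rightarrow\;\; E_{\mathrm{out}} \propto e^{\,i\sigma l \varphi},

i.e., an output carrying OAM of topological charge \sigma l.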
SNV-PPILP: refined SNV calling for tumor data using perfect phylogenies and ILP.
van Rens, Karen E; Mäkinen, Veli; Tomescu, Alexandru I
2015-04-01
Recent studies sequenced tumor samples from the same progenitor at different development stages and showed that single-nucleotide variant (SNV) calling can be improved by taking the phylogeny of this development into account. Accurate SNV calls can better reveal early-stage tumors, identify mechanisms of cancer progression or help in drug targeting. We present SNV-PPILP, a fast and easy-to-use tool for refining GATK's Unified Genotyper SNV calls for multiple samples assumed to form a phylogeny. We tested SNV-PPILP on simulated data, with a varying number of samples, SNVs, read coverage and violations of the perfect phylogeny assumption. We always match or improve the accuracy of GATK, with a significant improvement at low read coverage. SNV-PPILP, available at cs.helsinki.fi/gsa/snv-ppilp/, is written in Python and requires the free ILP solver lp_solve. Supplementary data are available at Bioinformatics online.
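To make the "perfect phylogeny assumption" mentioned above concrete, here is a minimal illustrative sketch (my own, not SNV-PPILP's actual ILP formulation): treating SNV calls across samples as a binary samples-by-SNVs matrix with an all-zero ancestral state, a perfect phylogeny exists exactly when every pair of SNV columns is compatible, i.e., the row patterns (0,1), (1,0) and (1,1) never all co-occur.

```python
from itertools import combinations

def is_perfect_phylogeny(matrix):
    """Check pairwise character compatibility of a binary samples-x-SNVs
    matrix, assuming the ancestral (root) state is all zeros.
    Two columns are incompatible iff the row patterns (0,1), (1,0) and
    (1,1) all occur (the classic three-gamete test)."""
    n_cols = len(matrix[0])
    for i, j in combinations(range(n_cols), 2):
        pairs = {(row[i], row[j]) for row in matrix}
        if {(0, 1), (1, 0), (1, 1)} <= pairs:
            return False
    return True

# Toy example: 3 samples x 3 SNVs; mutation sets nest, so the check passes.
calls = [
    [1, 0, 0],
    [1, 1, 0],
    [0, 0, 1],
]
print(is_perfect_phylogeny(calls))  # True
```

SNV-PPILP itself goes further, using ILP to refine the calls under such phylogeny constraints (per the abstract); the snippet above only illustrates the compatibility test that characterizes a perfect phylogeny for binary characters.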
Quantifying the Effect of Soil Water Repellency on Infiltration Parameters Using a Dry Sand
NASA Astrophysics Data System (ADS)
Shillito, R.; Berli, M.; Ghezzehei, T. A.; Kaminski, E.
2017-12-01
Water infiltration into less than perfectly wettable soils has usually been considered an exceptional case; in fact, it may be the rule. Infiltration into soils exhibiting some degree of water repellency has important implications for agricultural irrigation, post-fire runoff, golf course and landscape management, and spill and contaminant mitigation. Beginning from fundamental principles, we developed a physically based model to quantify the effect of water repellency on infiltration parameters. Experimentally, we used a dry silica sand and treated it to achieve various known degrees of water repellency. The model was verified with data gathered from multiple upward-infiltration (wicking) experiments on the treated sand. The model also allowed us to explore the effect of initial soil moisture conditions on infiltration into water-repellent soils, and the physical interpretation of the simple water drop penetration time test. These results provide a fundamental step in the physically based understanding of how water infiltrates into a less than perfectly wettable porous medium.
The stability of perfect elliptic disks. 1: The maximum streaming case
NASA Technical Reports Server (NTRS)
Levine, Stephen E.; Sparke, Linda S.
1994-01-01
Self-consistent distribution functions are constructed for two-dimensional perfect elliptic disks (for which the potential is exactly integrable) in the limit of maximum streaming; these are tested for stability by N-body integration. To obtain a discrete representation for each model, simulated annealing is used to choose a set of orbits which sample the distribution function and reproduce the required density profile while carrying the greatest possible amount of angular momentum. A quiet start technique is developed to place particles on these orbits uniformly in action-angle space, making the initial conditions as smooth as possible. The roundest models exhibit spiral instabilities similar to those of cold axisymmetric disks; the most elongated models show bending instabilities like those seen in prolate systems. Between these extremes, there is a range of axial ratios 0.25 ≲ b/a ≲ 0.6 within which these models appear to be stable. All the methods developed in this investigation can easily be extended to integrable potentials in three dimensions.
Analysis of Influenza and RSV dynamics in the community using a ‘Local Transmission Zone’ approach
NASA Astrophysics Data System (ADS)
Almogy, Gal; Stone, Lewi; Bernevig, B. Andrei; Wolf, Dana G.; Dorozko, Marina; Moses, Allon E.; Nir-Paz, Ran
2017-02-01
Understanding the dynamics of pathogen spread within urban areas is critical for the effective prevention and containment of communicable diseases. At these relatively small geographic scales, short-distance interactions and tightly knit sub-networks dominate the dynamics of pathogen transmission; yet, the effective boundaries of these micro-scale groups are generally not known and often ignored. Using clinical test results from hospital-admitted patients, we analyze the spatio-temporal distribution of Influenza-Like Illness (ILI) in the city of Jerusalem over a period of three winter seasons. We demonstrate that this urban area is not a single, perfectly mixed ecology, but is in fact composed of a set of more basic, relatively independent pathogen transmission units, which we term Local Transmission Zones (LTZs). By identifying these LTZs, and using the dynamic pathogen-content information contained within them, we are able to differentiate between disease causes at the individual-patient level, often with near-perfect predictive accuracy.
Comparison between overground and dynamometer manual wheelchair propulsion.
Koontz, Alicia M; Worobey, Lynn A; Rice, Ian M; Collinger, Jennifer L; Boninger, Michael L
2012-08-01
Laboratory-based simulators afford many advantages for studying physiology and biomechanics; however, they may not perfectly mimic wheelchair propulsion over natural surfaces. The goal of this study was to compare kinetic and temporal parameters between propulsion overground on a tile surface and on a dynamometer. Twenty-four experienced manual wheelchair users propelled at a self-selected speed on smooth, level tile and a dynamometer while kinetic data were collected using an instrumented wheel. A Pearson correlation test was used to examine the relationship between propulsion variables obtained on the dynamometer and the overground condition. Ensemble resultant force and moment curves were compared using cross-correlation and qualitative analysis of curve shape. User biomechanics were correlated (R ranging from 0.41 to 0.83) between surfaces. Overall, the findings suggest that although the dynamometer does not perfectly emulate overground propulsion, wheelchair users were consistent between conditions in the direction and amount of force applied, the time at which peak force was reached, push angle, and stroke frequency.
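The statistical comparison described above can be illustrated with a short sketch (illustrative only; the arrays below are made-up stand-ins, not the study's data): a Pearson correlation for per-subject propulsion variables across the two surfaces, and a normalized cross-correlation for comparing ensemble curve shapes.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-subject peak resultant forces (N) on each surface.
overground = np.array([62.1, 55.3, 70.8, 48.9, 66.0])
dynamometer = np.array([58.7, 52.0, 73.2, 50.1, 61.4])

r, p = pearsonr(overground, dynamometer)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")

# Shape similarity of ensemble curves via normalized cross-correlation.
t = np.linspace(0.0, 1.0, 101)        # normalized push phase
curve_a = np.sin(np.pi * t)           # stand-ins for ensemble force curves
curve_b = np.sin(np.pi * t) ** 1.2
a = (curve_a - curve_a.mean()) / curve_a.std()
b = (curve_b - curve_b.mean()) / curve_b.std()
xcorr_peak = np.correlate(a, b, mode="full").max() / len(t)
print(f"Peak normalized cross-correlation = {xcorr_peak:.2f}")
```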
NASA Astrophysics Data System (ADS)
Cameron, Gary L.
2012-07-01
Telescopes, reflecting telescopes in particular, underwent considerable development during the eighteenth century. Two classes of telescope maker, the for-profit artisan and the amateur 'gentleman-philosopher,' learned techniques of optical fabrication and testing and produced usable astronomical instruments. One means of disseminating technical knowledge was via the book. The year 1738 saw the publication of a highly influential book, Robert Smith's A Compleat System of Opticks, a work that included detailed information on telescope-making. It was this book that helped spark the astronomical career of William Herschel, and with Smith's information Herschel produced large reflecting telescopes of exquisite quality. However, artisan-opticians, even the renowned James Short, appear to have cut corners on a portion of their production, thus permitting the sale of some instruments of inferior quality. The reasons for this were clearly economic in nature: artisans depending on telescope sales to earn a living simply could not afford the time required for perfection. The mere presence of written works disseminating technical
Simbaqueba, Jaime; Sánchez, Pilar; Sanchez, Erika; Núñez Zarantes, Victor Manuel; Chacon, Maria Isabel; Barrero, Luz Stella; Mariño-Ramírez, Leonardo
2011-01-01
Physalis peruviana, commonly known as Cape gooseberry, is an Andean Solanaceae fruit with high nutritional value and interesting medicinal properties. In the present study we report the development and characterization of microsatellite loci from a P. peruviana commercial Colombian genotype. We identified 932 imperfect and 201 perfect Simple Sequence Repeats (SSR) loci in untranslated regions (UTRs) and 304 imperfect and 83 perfect SSR loci in coding regions from the assembled Physalis peruviana leaf transcriptome. The UTR SSR loci were used for the development of 162 primers for amplification. The efficiency of these primers was tested via PCR in a panel of seven P. peruviana accessions including Colombia, Kenya and Ecuador ecotypes and one closely related species Physalis floridana. We obtained an amplification rate of 83% and a polymorphic rate of 22%. Here we report the first P. peruviana specific microsatellite set, a valuable tool for a wide variety of applications, including functional diversity, conservation and improvement of the species. PMID:22039540
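For illustration of what distinguishes the "perfect" from the "imperfect" SSR loci counted above, here is a minimal sketch (my own, with illustrative thresholds; not the pipeline used in the study) that scans a sequence for perfect SSRs, i.e., uninterrupted tandem repeats of a short motif:

```python
import re

def find_perfect_ssrs(seq, min_unit=2, max_unit=6, min_repeats=4):
    """Scan a DNA sequence for perfect simple sequence repeats:
    a motif of length min_unit..max_unit tandemly repeated at least
    min_repeats times with no interruptions (thresholds are illustrative)."""
    seq = seq.upper()
    hits = []
    for unit in range(min_unit, max_unit + 1):
        pattern = re.compile(r"([ACGT]{%d})\1{%d,}" % (unit, min_repeats - 1))
        for m in pattern.finditer(seq):
            hits.append((m.start(), m.group(1), len(m.group(0)) // unit))
    return hits

print(find_perfect_ssrs("TTGACACACACACAGGT"))  # [(3, 'AC', 5)] -> (start, motif, copies)
```

An imperfect SSR, by contrast, tolerates mismatches or interruptions within the repeat tract, which a plain regular expression like this one will not match.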
NASA Astrophysics Data System (ADS)
Dearing, John A.; Bullock, Seth; Costanza, Robert; Dawson, Terry P.; Edwards, Mary E.; Poppy, Guy M.; Smith, Graham M.
2012-04-01
The `Perfect Storm' metaphor describes a combination of events that causes a surprising or dramatic impact. It lends an evolutionary perspective to how social-ecological interactions change. Thus, we argue that an improved understanding of how social-ecological systems have evolved up to the present is necessary for the modelling, understanding and anticipation of current and future social-ecological systems. Here we consider the implications of an evolutionary perspective for designing research approaches. One desirable approach is the creation of multi-decadal records produced by integrating palaeoenvironmental, instrument and documentary sources at multiple spatial scales. We also consider the potential for improved analytical and modelling approaches by developing system dynamical, cellular and agent-based models, observing complex behaviour in social-ecological systems against which to test systems dynamical theory, and drawing better lessons from history. Alongside these is the need to find more appropriate ways to communicate complex systems, risk and uncertainty to the public and to policy-makers.
Pedersen, Line Bjørnskov; Hess, Stephane; Kjær, Trine
2016-12-01
This study uses a best-worst scaling experiment to test whether general practitioners (GPs) act as perfect agents for the patients in the consultation and, if not, whether this is due to asymmetric information and/or motivations other than user orientation. Survey data were collected from 775 GPs and 1,379 Danish citizens eliciting preferences for a consultation. Sequential models allowing for within-person preference heterogeneity and heteroskedasticity between best and worst choices were estimated. We show that GPs do not always act as perfect agents and that this non-alignment stems from GPs being both unable and unwilling to do so: unable, since GPs have imperfect information about patients' preferences, and unwilling, since they are also motivated by factors other than user orientation. Our findings highlight the need for multi-pronged strategies targeting different motivational factors to ensure that GPs act in correspondence with patients' preferences in areas where alignment is warranted.
Simbaqueba, Jaime; Sánchez, Pilar; Sanchez, Erika; Núñez Zarantes, Victor Manuel; Chacon, Maria Isabel; Barrero, Luz Stella; Mariño-Ramírez, Leonardo
2011-01-01
Physalis peruviana, commonly known as Cape gooseberry, is an Andean Solanaceae fruit with high nutritional value and interesting medicinal properties. In the present study we report the development and characterization of microsatellite loci from a P. peruviana commercial Colombian genotype. We identified 932 imperfect and 201 perfect Simple Sequence Repeats (SSR) loci in untranslated regions (UTRs) and 304 imperfect and 83 perfect SSR loci in coding regions from the assembled Physalis peruviana leaf transcriptome. The UTR SSR loci were used for the development of 162 primers for amplification. The efficiency of these primers was tested via PCR in a panel of seven P. peruviana accessions including Colombia, Kenya and Ecuador ecotypes and one closely related species Physalis floridana. We obtained an amplification rate of 83% and a polymorphic rate of 22%. Here we report the first P. peruviana specific microsatellite set, a valuable tool for a wide variety of applications, including functional diversity, conservation and improvement of the species.
Comparison Between Overground and Dynamometer Manual Wheelchair Propulsion
Worobey, Lynn A.; Rice, Ian M.; Collinger, Jennifer L.; Boninger, Michael L.
2017-01-01
Laboratory-based simulators afford many advantages for studying physiology and biomechanics; however, they may not perfectly mimic wheelchair propulsion over natural surfaces. The goal of this study was to compare kinetic and temporal parameters between propulsion overground on a tile surface and on a dynamometer. Twenty-four experienced manual wheelchair users propelled at a self-selected speed on smooth, level tile and a dynamometer while kinetic data were collected using an instrumented wheel. A Pearson correlation test was used to examine the relationship between propulsion variables obtained on the dynamometer and the overground condition. Ensemble resultant force and moment curves were compared using cross-correlation and qualitative analysis of curve shape. User biomechanics were correlated (R ranging from 0.41 to 0.83) between surfaces. Overall, findings suggest that although the dynamometer does not perfectly emulate overground propulsion, wheelchair users were consistent with the direction and amount of force applied, the time peak force was reached, push angle, and their stroke frequency between conditions. PMID:22085811
High fidelity quantum gates with vibrational qubits.
Berrios, Eduardo; Gruebele, Martin; Shyshlov, Dmytro; Wang, Lei; Babikov, Dmitri
2012-11-26
Physical implementation of quantum gates acting on qubits does not achieve a perfect fidelity of 1. The actual output qubit may not match the targeted output of the desired gate. According to theoretical estimates, intrinsic gate fidelities >99.99% are necessary so that error correction codes can be used to achieve perfect fidelity. Here we test what fidelity can be accomplished for a CNOT gate executed by a shaped ultrafast laser pulse interacting with vibrational states of the molecule SCCl2. This molecule has been used as a test system for low-fidelity calculations before. To make our test more stringent, we include vibrational levels that do not encode the desired qubits but are close enough in energy to interfere with population transfer by the laser pulse. We use two complementary approaches: optimal control theory determines what the best possible pulse can do; a more constrained physical model calculates what an experiment likely can do. Optimal control theory finds pulses with fidelity >0.9999, in excess of the quantum error correction threshold, with 8 × 10^4 iterations. On the other hand, the physical model achieves only 0.9992 after 8 × 10^4 iterations. Both calculations converge as an inverse power law toward unit fidelity after >10^2 iterations/generations. In principle, the fidelities necessary for quantum error correction are reachable with qubits encoded by molecular vibrations. In practice, it will be challenging with current laboratory instrumentation because of slow convergence past fidelities of 0.99.
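To make the fidelity figure of merit concrete, here is a minimal sketch (my own construction; the error generator and epsilon are invented for illustration, and this is the state fidelity for a single input rather than the full gate fidelity the authors optimize):

```python
import numpy as np
from scipy.linalg import expm

# Ideal CNOT on two qubits (control = qubit 0, target = qubit 1).
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# A slightly imperfect gate: compose the ideal CNOT with a small
# spurious unitary rotation (epsilon and H_err are made up).
epsilon = 0.01
H_err = np.diag([0.0, 1.0, -1.0, 0.5])
U_actual = CNOT @ expm(-1j * epsilon * H_err)

psi_in = np.array([1, 0, 1, 0], dtype=complex) / np.sqrt(2)  # (|00> + |10>)/sqrt(2)
psi_target = CNOT @ psi_in
psi_out = U_actual @ psi_in
fidelity = abs(np.vdot(psi_target, psi_out)) ** 2
print(f"State fidelity = {fidelity:.6f}")  # about 0.999975 for epsilon = 0.01
```

For this toy error model the fidelity is cos^2(epsilon/2), i.e., roughly 1 - epsilon^2/4, which is why even small coherent errors must be suppressed well below the percent level to clear a 99.99% threshold.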
FY16 Progress Report on Test Results In Support Of Integrated EPP and SMT Design Methods Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yanli; Jetter, Robert I.; Sham, T. -L.
2016-08-08
The proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology consists of incorporating an SMT data-based approach for creep-fatigue damage evaluation into the EPP methodology to avoid using the creep-fatigue interaction diagram (the D diagram) and to minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed code rules and to verify their applicability, a series of thermomechanical tests has been initiated. This report presents the recent test results for Type 2 SMT specimens on Alloy 617, pressurization SMT on Alloy 617, Type 1 SMT on Gr. 91, and two-bar thermal ratcheting test results on Alloy 617 with a new thermal loading profile.
MacMillan, Donna; Lewandrowski, Elizabeth; Lewandrowski, Kent
2004-01-01
Utilization of outside reference laboratories for selected laboratory testing is common in the United States. However, relatively little data exist in the literature describing the scope and impact of these services. In this study, we reviewed the use of reference laboratory testing at the Massachusetts General Hospital, a large urban academic medical center in Boston, Massachusetts, through a retrospective review of hospital and laboratory administrative records over an 8-year period covering fiscal years (FY) 1995-2002. Over the 8 years studied, reference laboratory expenses increased 4.2-fold and totaled 12.4% of the total laboratory budget in FY 2002. Total reference laboratory test volume increased 4-fold to 68,328 tests in FY 2002 but represented only 1.06% of the total test volume in the hospital. The menu of reference laboratory tests comprised 946 tests (65.7% of the hospital test menu), compared with 494 tests (34.3%) performed in house. The average unit cost of reference laboratory tests was essentially unchanged but was approximately 13 times greater than the average unit cost in the hospital laboratory. Much of the growth in reference laboratory cost can be attributed to the addition of new molecular, genetic, and microbiological assays. Four of the top 10 tests with the highest total cost in 2002 were molecular diagnostic tests that were recently added to the test menu. Reference laboratory testing comprises a major component of hospital clinical laboratory services. Although send-out tests represent a small percentage of the total test volume, these services account for the majority of the hospital laboratory test menu and a disproportionate percentage of laboratory costs.
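As a hedged arithmetic cross-check of the figures above (my own calculation, assuming the volume and cost shares refer to the same FY 2002 totals): if reference tests account for a fraction f_v = 1.06% of test volume and f_c = 12.4% of cost, the implied ratio of average unit costs is

\frac{c_{\text{ref}}}{c_{\text{in-house}}} = \frac{f_c/(1-f_c)}{f_v/(1-f_v)} = \frac{0.124/0.876}{0.0106/0.9894} \approx 13,

consistent with the stated "approximately 13 times greater" unit cost.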
VirSorter: mining viral signal from microbial genomic data.
Roux, Simon; Enault, Francois; Hurwitz, Bonnie L; Sullivan, Matthew B
2015-01-01
Viruses of microbes impact all ecosystems where microbes drive key energy and substrate transformations including the oceans, humans and industrial fermenters. However, despite this recognized importance, our understanding of viral diversity and impacts remains limited by too few model systems and reference genomes. One way to fill these gaps in our knowledge of viral diversity is through the detection of viral signal in microbial genomic data. While multiple approaches have been developed and applied for the detection of prophages (viral genomes integrated in a microbial genome), new types of microbial genomic data are emerging that are more fragmented and larger scale, such as Single-cell Amplified Genomes (SAGs) of uncultivated organisms or genomic fragments assembled from metagenomic sequencing. Here, we present VirSorter, a tool designed to detect viral signal in these different types of microbial sequence data in both a reference-dependent and reference-independent manner, leveraging probabilistic models and extensive virome data to maximize detection of novel viruses. Performance testing shows that VirSorter's prophage prediction capability compares to that of available prophage predictors for complete genomes, but is superior in predicting viral sequences outside of a host genome (i.e., from extrachromosomal prophages, lytic infections, or partially assembled prophages). Furthermore, VirSorter outperforms existing tools for fragmented genomic and metagenomic datasets, and can identify viral signal in assembled sequence (contigs) as short as 3 kb, while providing near-perfect identification (>95% recall and 100% precision) on contigs of at least 10 kb. Because VirSorter scales to large datasets, it can also be used in "reverse" to more confidently identify viral sequence in viral metagenomes by sorting away cellular DNA whether derived from gene transfer agents, generalized transduction or contamination. Finally, VirSorter is made available through the iPlant Cyberinfrastructure that provides a web-based user interface interconnected with the required computing resources. VirSorter thus complements existing prophage prediction software to better leverage fragmented, SAG and metagenomic datasets in a way that will scale to modern sequencing. Given these features, VirSorter should enable the discovery of new viruses in microbial datasets, and further our understanding of uncultivated viral communities across diverse ecosystems.
VirSorter: mining viral signal from microbial genomic data
Roux, Simon; Enault, Francois; Hurwitz, Bonnie L.
2015-01-01
Viruses of microbes impact all ecosystems where microbes drive key energy and substrate transformations including the oceans, humans and industrial fermenters. However, despite this recognized importance, our understanding of viral diversity and impacts remains limited by too few model systems and reference genomes. One way to fill these gaps in our knowledge of viral diversity is through the detection of viral signal in microbial genomic data. While multiple approaches have been developed and applied for the detection of prophages (viral genomes integrated in a microbial genome), new types of microbial genomic data are emerging that are more fragmented and larger scale, such as Single-cell Amplified Genomes (SAGs) of uncultivated organisms or genomic fragments assembled from metagenomic sequencing. Here, we present VirSorter, a tool designed to detect viral signal in these different types of microbial sequence data in both a reference-dependent and reference-independent manner, leveraging probabilistic models and extensive virome data to maximize detection of novel viruses. Performance testing shows that VirSorter's prophage prediction capability compares to that of available prophage predictors for complete genomes, but is superior in predicting viral sequences outside of a host genome (i.e., from extrachromosomal prophages, lytic infections, or partially assembled prophages). Furthermore, VirSorter outperforms existing tools for fragmented genomic and metagenomic datasets, and can identify viral signal in assembled sequence (contigs) as short as 3 kb, while providing near-perfect identification (>95% recall and 100% precision) on contigs of at least 10 kb. Because VirSorter scales to large datasets, it can also be used in "reverse" to more confidently identify viral sequence in viral metagenomes by sorting away cellular DNA whether derived from gene transfer agents, generalized transduction or contamination. Finally, VirSorter is made available through the iPlant Cyberinfrastructure that provides a web-based user interface interconnected with the required computing resources. VirSorter thus complements existing prophage prediction software to better leverage fragmented, SAG and metagenomic datasets in a way that will scale to modern sequencing. Given these features, VirSorter should enable the discovery of new viruses in microbial datasets, and further our understanding of uncultivated viral communities across diverse ecosystems. PMID:26038737
Assessing the Formation of Experience-Based Gender Expectations in an Implicit Learning Scenario
Öttl, Anton; Behne, Dawn M.
2017-01-01
The present study investigates the formation of new word-referent associations in an implicit learning scenario, using a gender-coded artificial language with spoken words and visual referents. Previous research has shown that when participants are explicitly instructed about the gender-coding system underlying an artificial lexicon, they monitor the frequency of exposure to male vs. female referents within this lexicon, and subsequently use this probabilistic information to predict the gender of an upcoming referent. In an explicit learning scenario, the auditory and visual gender cues are necessarily highlighted prior to acquisition, and the effects previously observed may therefore depend on participants' overt awareness of these cues. To assess whether the formation of experience-based expectations is dependent on explicit awareness of the underlying coding system, we present data from an experiment in which gender-coding was acquired implicitly, thereby reducing the likelihood that visual and auditory gender cues are used strategically during acquisition. Results show that even if the gender coding system was not perfectly mastered (as reflected in the number of gender coding errors), participants develop frequency-based expectations comparable to those previously observed in an explicit learning scenario. In line with previous findings, participants are quicker at recognizing a referent whose gender is consistent with an induced expectation than one whose gender is inconsistent with an induced expectation. At the same time, however, eyetracking data suggest that these expectations may surface earlier in an implicit learning scenario. These findings suggest that experience-based expectations are robust against manner of acquisition, and contribute to understanding why similar expectations observed in the activation of stereotypes during the processing of natural language stimuli are difficult or impossible to suppress. PMID:28936186
Quinn, Terence J; Livingstone, Iain; Weir, Alexander; Shaw, Robert; Breckenridge, Andrew; McAlpine, Christine; Tarbert, Claire M
2018-01-01
Visual impairment affects up to 70% of stroke survivors. We designed an app (StrokeVision) to facilitate screening for common post-stroke visual issues (acuity, visual fields, and visual inattention). We sought to describe the test time, feasibility, acceptability, and accuracy of our app-based digital visual assessments against (a) current methods used for bedside screening and (b) gold standard measures. Patients were prospectively recruited from acute stroke settings. Index tests were app-based assessments of fields and inattention performed by a trained researcher. We compared against usual clinical screening practice of visual fields to confrontation, including inattention assessment (simultaneous stimuli). We also compared the app to gold standard assessments of formal kinetic perimetry (Goldmann or Octopus visual field assessment) and pencil-and-paper tests of inattention (Albert's, Star Cancellation, and Line Bisection). Results of inattention and field tests were adjudicated by a specialist neuro-ophthalmologist. All assessors were masked to each other's results. Participants and assessors graded acceptability using a bespoke scale that ranged from 0 (completely unacceptable) to 10 (perfect acceptability). Of 48 stroke survivors recruited, the complete battery of index and reference tests for fields was successfully completed in 45. Similar acceptability scores were observed for app-based testing [assessor median score 10 (IQR: 9-10); patient 9 (IQR: 8-10)] and traditional bedside testing [assessor 10 (IQR: 9-10); patient 10 (IQR: 9-10)]. Median test time was longer for app-based testing [combined time to completion of all digital tests 420 s (IQR: 390-588)] than for conventional bedside testing [70 s (IQR: 40-70)], but shorter than for gold standard testing [1,260 s (IQR: 1,005-1,620)]. Compared with gold standard assessments, usual screening practice demonstrated 79% sensitivity and 82% specificity for detection of a stroke-related field defect. This compares with 79% sensitivity and 88% specificity for StrokeVision digital assessment. StrokeVision shows promise as a screening tool for visual complications in the acute phase of stroke. The app is at least as good as usual screening and offers other functionality that may make it attractive for use in acute stroke. https://ClinicalTrials.gov/ct2/show/NCT02539381.
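The screening-accuracy figures above come from a standard 2 x 2 comparison against the gold standard; the sketch below illustrates the computation with hypothetical cell counts chosen only to reproduce the quoted 79%/88% (they are not the study's actual counts):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 19 patients with a gold-standard field defect,
# 26 without, screened by the index test (illustrative only).
sens, spec = sensitivity_specificity(tp=15, fn=4, tn=23, fp=3)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")  # 79%, 88%
```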