Marine chronometry in the Neuchatel mountains (Switzerland)
NASA Astrophysics Data System (ADS)
Fallet, Estelle
The history and evolution of the Swiss marine chronometer industry are summarized. From the 18th century onwards, Neuchatel watchmakers strove to develop precision horology. First J. F. Houriet and later S. Mairet, L. Richard, W. Dubois and H. Grandjean introduced the marine chronometer in the Neuchatel mountains. Precision having become a necessity for the industry, they helped achieve it by means of a complex system for the distribution and maintenance of exact time, which allowed optimal adjustment. These men of vision called for the building of a cantonal observatory and strove to have their art practiced in modern watchmaking schools. Under the guidance first of Ulysse and then of Paul David Nardin, the manufacture of marine chronometers began in Le Locle in 1876. In La Chaux-de-Fonds at the beginning of the 20th century, Paul Ditisheim built a number of improved marine, ship and pocket chronometers. Together with scientists and watchmakers, the chronometer makers perfected the regulating parts of the timekeepers and solved the problems of adjustment caused by various external influences. The manufacturers, the watchmakers at their branches, the timers and the Neuchatel business community all contributed to strengthening the position of their region's products in the world market.
Evolution of heavy-element abundances in the Galactic halo and disk
NASA Technical Reports Server (NTRS)
Mathews, G. J.; Cowan, J. J.; Schramm, D. N.
1988-01-01
The constraints on the universal energy density and cosmological constant from cosmochronological ages and the Hubble age are reviewed. Observational evidence for the galactic chemical evolution of the heavy-element chronometers is described in the context of numerical models. The viability of the recently discovered Th/Nd stellar chronometer is discussed, along with the suggestion that high r-process abundances in metal-poor stars may have resulted from a primordial r-process, as may be required by some inhomogeneous cosmologies.
NASA Astrophysics Data System (ADS)
Sturm, Monika; Richter, Stephan; Aregbe, Yetunde; Wellum, Roger; Mayer, Klaus; Prohaska, Thomas
2014-05-01
Although the age determination of plutonium has been a pillar of nuclear forensic investigations for many years, additional research in the field of plutonium age dating is still needed and leads to new insights, as the present work shows. Plutonium is commonly dated with the help of the 241Pu/241Am chronometer using gamma spectrometry; in fewer cases the 240Pu/236U chronometer has been used. The 239Pu/235U and 238Pu/234U chronometers are rarely applied in addition to the 240Pu/236U chronometer, although their results can be obtained simultaneously from the same mass spectrometric experiments as those of the latter. The reliability of the result can be tested when the results of different chronometers are compared. The 242Pu/238U chronometer is normally not evaluated at all because of its sensitivity to contamination with natural uranium. This apparent 'weakness', which renders the age dating results of the 242Pu/238U chronometer almost useless for nuclear forensic investigations, turns out to be an advantage when looked at from another perspective: the 242Pu/238U chronometer can be utilized as an indicator for uranium contamination of plutonium samples and can even help to identify the nature of this contamination. To illustrate this, the age dating results of all four Pu/U clocks mentioned above are discussed for one plutonium sample (NBS 946) that shows no signs of uranium contamination and for three additional plutonium samples. If the 242Pu/238U chronometer yields an older 'age' than the other Pu/U chronometers, contamination with a small amount of enriched uranium, or with natural or depleted uranium, is possible. If the age dating result of the 239Pu/235U chronometer is also influenced, the nature of the contamination can be identified; enriched uranium is in this case a likely cause for the mismatch of the age dating results of the Pu/U chronometers.
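All of the Pu/U clocks above reduce to the same decay relation: if the daughter nuclide was completely removed when the material was last purified, and the daughter's own decay is negligible over the interval of interest (true for the long-lived U daughters of Pu), the age follows directly from the measured daughter/parent atom ratio. A minimal sketch in Python; the half-lives are commonly tabulated literature values and the example ratio is purely illustrative, not a value from this work:

```python
import math

# Plutonium half-lives in years (commonly tabulated values; assumptions here).
HALF_LIFE = {
    "Pu-238": 87.7,
    "Pu-239": 24110.0,
    "Pu-240": 6561.0,
    "Pu-242": 373300.0,
}

def model_age(daughter_to_parent: float, parent: str) -> float:
    """Model age in years from a daughter/parent atom ratio, assuming the
    daughter was completely removed at purification (time zero) and that the
    daughter's own decay is negligible on this timescale (valid for the
    long-lived U daughters of Pu)."""
    lam = math.log(2) / HALF_LIFE[parent]        # decay constant, 1/yr
    return math.log(1.0 + daughter_to_parent) / lam

# Illustrative example: a 236U/240Pu atom ratio of about 0.00106
# corresponds to a model age of roughly 10 years.
age = model_age(0.00106, "Pu-240")
```

Comparing the ages returned by several Pu/U pairs on the same sample is exactly the concordance test the abstract describes: discordance of the 242Pu/238U clock flags uranium contamination.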
Iodine-Xenon Dating: Sensitive Chronometer for Reprocessing in the Primitive Solar System
NASA Technical Reports Server (NTRS)
Pravdivtseva, O. V.; Hohenberg, C. M.
1999-01-01
The I-Xe chronometer is based upon the decay of I-129 to Xe-129 in the early Solar System. Recent comparison of the I-Xe system in individual mineral separates from twelve different meteorites with independent Pb-Pb data has demonstrated that the I-Xe clock is a reliable, sensitive chronometer when applied to a single mineral system. Since most iodine hosts are secondary minerals, the I-Xe clock generally records post-formational processing, providing information on early meteorite evolution. Absolute I-Xe ages can be found by normalization using the measured I-Xe and Pb-Pb ages of Acapulco phosphate (4.557 plus or minus 0.002 Ga). Absolute ages for the I-Xe internal standards Shallowater and Bjurbole, 4.566 plus or minus 0.002 Ga and 4.565 plus or minus 0.003 Ga, respectively, provide absolute I-Xe ages for all other samples. The I-Xe age of a bulk meteorite is meaningful and interpretable only when the carrier of primordial iodine is a major mineral phase (e.g., enstatite chondrites). Using the "monomineral" approach, separated phases from the Richardton H5 chondrite provide a case history of post-formational alteration in this object. This work applies the I-Xe chronometer to determine the times of reprocessing of selected minerals in single meteorite types. A preliminary account of this work was recently reported. Additional information is contained in the original extended abstract.
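The normalization step described above can be sketched in two lines of arithmetic: the now-extinct 129I (half-life about 15.7 Myr, a literature value assumed here) decays to 129Xe, so a sample's initial 129I/127I ratio, compared with a standard's, gives a relative closure-time difference, and the standard's independently measured absolute age converts that to an absolute age. The ratios below are illustrative, not measured values:

```python
import math

T_HALF_I129 = 15.7e6                 # yr, 129I half-life (literature value)
LAM = math.log(2) / T_HALF_I129

def relative_age(r_sample: float, r_reference: float) -> float:
    """Closure-time difference in years from initial 129I/127I ratios.
    A lower ratio means later closure: more 129I had already decayed."""
    return math.log(r_reference / r_sample) / LAM

def absolute_age(r_sample: float, r_reference: float, anchor_age: float) -> float:
    """Anchor the relative I-Xe age to an absolutely dated standard."""
    return anchor_age - relative_age(r_sample, r_reference)

# Illustration: a sample whose initial 129I/127I is half the standard's
# closed one half-life (~15.7 Myr) later than the standard.
dt = relative_age(0.5e-4, 1.0e-4)
age = absolute_age(0.5e-4, 1.0e-4, 4.566e9)   # anchored near the 4.566 Ga standard
```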
Radiochronometry in the CMX-4 Exercise-Draft
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kristo, M. J.; Williams, R.; Gaffney, A. M.
In a recent international exercise, 10 international nuclear forensics laboratories successfully performed radiochronometry on 3 low-enriched uranium oxide samples, providing 12 analytical results using 3 different parent-daughter pairs serving as independent chronometers. The vast majority of the results were consistent with one another and consistent with the known processing history of the materials. In general, for these particular samples, mass spectrometry gave more accurate and more precise analytical results than decay counting measurements. In addition, the concordance of the 235U- 231Pa and 234U- 230Th chronometers confirmed the validity of the age dating assumption, increasing our confidence in the resulting conclusions.
American Practical Navigator. An Epitome of Navigation. Volume 1
1977-01-01
may also be provided for bearing observations from repeaters. ...The telescopic alidade is basically similar to the bearing circle, except that it is...with the half-second beats of a chronometer). This interval can then be subtracted from the observed time of each sight. 1506. Reading the sextant.-The...chronometer. Chronometers beat in half seconds, with an audible "tick." Ten seconds before the selected time (perhaps a whole minute), the observer
Model selection using cosmic chronometers with Gaussian Processes
NASA Astrophysics Data System (ADS)
Melia, Fulvio; Yennapureddy, Manoj K.
2018-02-01
The use of Gaussian Processes with a measurement of the cosmic expansion rate based solely on the observation of cosmic chronometers provides a completely cosmology-independent reconstruction of the Hubble parameter H(z) suitable for testing different models. The corresponding dispersion σH is smaller than ~9% over the entire redshift range (0 ≲ z ≲ 2.0) of the observations, rivaling many kinds of cosmological measurements available today. We use the reconstructed H(z) function to test six different cosmologies, and show that it favours the Rh=ct universe, which has only one free parameter (i.e., H0), over other models, including Planck ΛCDM. The parameters of the standard model may be re-optimized to improve the fits to the reconstructed H(z) function, but the results have smaller p-values than one finds with Rh=ct.
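A reconstruction of this kind can be sketched with ordinary Gaussian Process regression: condition a squared-exponential kernel on noisy H(z) measurements and read off the posterior mean and variance on a redshift grid. The kernel, hyperparameters, and mock data below are illustrative assumptions, not the paper's actual pipeline or data:

```python
import numpy as np

def gp_reconstruct(z_data, h_data, h_err, z_grid, amp=100.0, length=2.0):
    """Posterior mean and 1-sigma band of H(z) under a zero-mean GP with a
    squared-exponential kernel, conditioned on noisy H(z) measurements."""
    def kernel(a, b):
        return amp**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / length**2)
    K = kernel(z_data, z_data) + np.diag(h_err**2)   # data covariance + noise
    Ks = kernel(z_grid, z_data)
    Kss = kernel(z_grid, z_grid)
    alpha = np.linalg.solve(K, h_data)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# Mock chronometer-style data drawn from H(z) = 70 * sqrt(0.3 (1+z)^3 + 0.7),
# with 5 km/s/Mpc errors (purely illustrative).
z = np.array([0.1, 0.4, 0.7, 1.0, 1.3, 1.6, 1.9])
h_obs = 70.0 * np.sqrt(0.3 * (1 + z)**3 + 0.7)
err = np.full_like(z, 5.0)
mean, sigma = gp_reconstruct(z, h_obs, err, np.array([0.0]))
# mean[0] is the GP extrapolation of H0 from the data alone.
```

The reconstructed curve (not any fitted model) is then compared against each cosmology's predicted H(z), which is what makes the model selection cosmology-independent.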
NASA Astrophysics Data System (ADS)
Gómez-Valent, Adrià; Amendola, Luca
2018-04-01
In this paper we present new constraints on the Hubble parameter H0 using: (i) the available data on H(z) obtained from cosmic chronometers (CCH); (ii) the Hubble rate data points extracted from the supernovae of Type Ia (SnIa) of the Pantheon compilation and the Hubble Space Telescope (HST) CANDELS and CLASH Multi-Cycle Treasury (MCT) programs; and (iii) the local HST measurement of H0 provided by Riess et al. (2018), H0HST=(73.45±1.66) km/s/Mpc. Various determinations of H0 using the Gaussian processes (GPs) method and the most updated list of CCH data have been recently provided by Yu, Ratra & Wang (2018). Using the Gaussian kernel they find H0=(67.42±4.75) km/s/Mpc. Here we extend their analysis to also include the most recently released and complete set of SnIa data, which allows us to reduce the uncertainty by a factor of ~3 with respect to the result found by considering only the CCH information. We obtain H0=(67.06±1.68) km/s/Mpc, which again favors the lower range of values for H0 and is in tension with H0HST. The tension reaches the 2.71σ level. We also round off the GPs determination by taking into account the error propagation of the kernel hyperparameters when the CCH with and without H0HST are used in the analysis. In addition, we present a novel method to reconstruct functions from data, which consists of a weighted sum of polynomial regressions (WPR). We apply it from a cosmographic perspective to reconstruct H(z) and estimate H0 from CCH and SnIa measurements. The result obtained with this method, H0=(68.90±1.96) km/s/Mpc, is fully compatible with the GPs ones. Finally, a more conservative GPs+WPR value is also provided, H0=(68.45±2.00) km/s/Mpc, which is still almost 2σ away from H0HST.
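The quoted 2.71σ tension is simply the difference between the two independent estimates expressed in units of their combined standard deviation, which can be checked directly from the numbers in the abstract:

```python
import math

def tension_sigma(h1: float, s1: float, h2: float, s2: float) -> float:
    """Gaussian tension between two independent measurements, in units of
    the quadrature-combined standard deviation."""
    return abs(h1 - h2) / math.sqrt(s1**2 + s2**2)

# GPs determination from CCH+SnIa vs. the local HST measurement quoted above.
t = tension_sigma(67.06, 1.68, 73.45, 1.66)   # ~2.71 sigma
```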
NASA Astrophysics Data System (ADS)
Alhilman, Judi
2017-12-01
In the production line of a printing office, the reliability of the printing machine plays a very important role: if the machine fails, it can disrupt the production target, and the company will suffer a large financial loss. One method to calculate the financial loss caused by machine failure is the Cost of Unreliability (COUR) method. The COUR method works from machine downtime and the costs associated with unreliability data. Based on the COUR calculation, the total cost due to printing machine unreliability during active repair time and downtime is 1,003,747.00.
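A minimal form of the COUR idea is to multiply downtime and active-repair hours by their associated cost rates and sum the two terms. The hours and rates below are hypothetical placeholders, not figures from this study:

```python
def cost_of_unreliability(downtime_hours: float, repair_hours: float,
                          downtime_cost_rate: float, repair_cost_rate: float) -> float:
    """Total unreliability cost: lost-production cost accrued during downtime
    plus labor/parts cost accrued during active repair. All inputs are
    assumptions supplied by the analyst."""
    return (downtime_hours * downtime_cost_rate
            + repair_hours * repair_cost_rate)

# Hypothetical figures for a single printing machine over one year:
# 120 h of downtime at 500.0/h lost production, 40 h of repair at 150.0/h.
total = cost_of_unreliability(120, 40, 500.0, 150.0)
```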
System Study: Emergency Power System 1998-2014
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schroeder, John Alton
2015-12-01
This report presents an unreliability evaluation of the emergency power system (EPS) at 104 U.S. commercial nuclear power plants. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10-year period, while yearly estimates for system unreliability are provided for the entire active period. An extremely statistically significant increasing trend was observed for EPS system unreliability for an 8-hour mission. A statistically significant increasing trend was observed for EPS system start-only unreliability.
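The trend statements in these system studies amount to regressing the yearly unreliability estimates on calendar year and asking whether the slope exceeds its standard error by enough to be called significant. A stdlib-only sketch of that test; the yearly values below are synthetic, not data from the report:

```python
import math
import statistics

def trend_slope(years, values):
    """Least-squares slope of unreliability vs. year, with its standard
    error. Roughly, |slope| exceeding ~2 standard errors corresponds to the
    'statistically significant' language used in these system studies."""
    n = len(years)
    xbar, ybar = statistics.fmean(years), statistics.fmean(values)
    sxx = sum((x - xbar)**2 for x in years)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(years, values))
    slope = sxy / sxx
    # Residuals about the fitted line give the slope's standard error.
    resid = [y - (ybar + slope * (x - xbar)) for x, y in zip(years, values)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    return slope, se

# Synthetic yearly unreliability estimates with a mild upward drift.
years = list(range(2005, 2015))
vals = [0.010 + 0.0005 * (y - 2005) for y in years]
slope, se = trend_slope(years, vals)
```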
Sela, Itamar; Ashkenazy, Haim; Katoh, Kazutaka; Pupko, Tal
2015-07-01
Inference of multiple sequence alignments (MSAs) is a critical part of phylogenetic and comparative genomics studies. However, from the same set of sequences different MSAs are often inferred, depending on the methodologies used and the assumed parameters. Much effort has recently been devoted to improving the ability to identify unreliable alignment regions. Detecting such unreliable regions was previously shown to be important for downstream analyses relying on MSAs, such as the detection of positive selection. Here we developed GUIDANCE2, a new integrative methodology that accounts for: (i) uncertainty in the process of indel formation, (ii) uncertainty in the assumed guide tree and (iii) co-optimal solutions in the pairwise alignments, used as building blocks in progressive alignment algorithms. We compared GUIDANCE2 with seven methodologies to detect unreliable MSA regions using extensive simulations and empirical benchmarks. We show that GUIDANCE2 outperforms all previously developed methodologies. Furthermore, GUIDANCE2 also provides a set of alternative MSAs which can be useful for downstream analyses. The novel algorithm is implemented as a web-server, available at: http://guidance.tau.ac.il. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
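The pair-based scoring idea underlying this family of methods can be sketched directly: collect the residue pairs each alignment places in a common column, then score every pair of the base MSA by the fraction of alternative MSAs that reproduce it. This is a simplified illustration of a GUIDANCE-style confidence score, not the GUIDANCE2 algorithm itself, and the toy alignments are invented:

```python
def aligned_pairs(msa):
    """Set of residue pairs co-aligned in some column. msa is a list of
    equal-length gapped strings; a residue is identified by
    (sequence index, ungapped position)."""
    pairs = set()
    counters = [0] * len(msa)        # next ungapped position per sequence
    for c in range(len(msa[0])):
        col = []
        for s, seq in enumerate(msa):
            if seq[c] != '-':
                col.append((s, counters[s]))
                counters[s] += 1
        for i in range(len(col)):
            for j in range(i + 1, len(col)):
                pairs.add((col[i], col[j]))
    return pairs

def guidance_like_score(base_msa, alternative_msas):
    """Fraction of alternative alignments reproducing each residue pair of
    the base alignment (illustrative GUIDANCE-style reliability score)."""
    base = aligned_pairs(base_msa)
    alts = [aligned_pairs(m) for m in alternative_msas]
    return {p: sum(p in a for a in alts) / len(alts) for p in base}

# Toy example: one alternative agrees with the base, one shifts the gap.
scores = guidance_like_score(
    ["AC-GT", "ACGGT"],
    [["AC-GT", "ACGGT"], ["ACG-T", "ACGGT"]],
)
```

Low-scoring pairs mark alignment regions whose placement is sensitive to the perturbations, which is what makes them candidates for masking in downstream analyses.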
System Study: Isolation Condenser 1998-2014
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schroeder, John Alton
2015-12-01
This report presents an unreliability evaluation of the isolation condenser (ISO) system at four U.S. boiling water reactors. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10-year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant increasing trends were identified. A statistically significant decreasing trend was identified for ISO unreliability. The magnitude of the trend indicated a 1.5 percent decrease in system unreliability over the last 10 years.
Settling the half-life of 60Fe: fundamental for a versatile astrophysical chronometer.
Wallner, A; Bichler, M; Buczak, K; Dressler, R; Fifield, L K; Schumann, D; Sterba, J H; Tims, S G; Wallner, G; Kutschera, W
2015-01-30
In order to resolve a recent discrepancy in the half-life of 60Fe, we performed an independent measurement with a new method that determines the 60Fe content of a material relative to 55Fe (t1/2 = 2.744 yr) with accelerator mass spectrometry. Our result of (2.50±0.12)×10^6 yr clearly favors the recently reported value of (2.62±0.04)×10^6 yr, and rules out the older result of (1.49±0.27)×10^6 yr. The present weighted mean half-life value of (2.60±0.05)×10^6 yr substantially improves the reliability of 60Fe as an important chronometer for astrophysical applications in the million-year time range. This includes its use as a sensitive probe for studying the recent chemical evolution of our Galaxy, the formation of the early Solar System, nucleosynthesis processes in massive stars, and as an indicator of a recent nearby supernova.
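The "weighted mean half-life" is the standard inverse-variance combination. Applying it to just the two concordant measurements quoted above already lands close to the paper's value; the published 2.60±0.05 ×10^6 yr may fold in additional measurements and a different error treatment:

```python
import math

def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean and its standard error."""
    weights = [1.0 / s**2 for s in sigmas]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    return mean, 1.0 / math.sqrt(sum(weights))

# The two concordant 60Fe half-life measurements above, in units of 10^6 yr.
mean, err = weighted_mean([2.62, 2.50], [0.04, 0.12])
```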
NASA Astrophysics Data System (ADS)
Hill, V.; Christlieb, N.; Beers, T. C.; Barklem, P. S.; Kratz, K.-L.; Nordström, B.; Pfeiffer, B.; Farouqi, K.
2017-11-01
We report an abundance analysis for the highly r-process-enhanced (r-II) star CS 29497-004, a very metal-poor giant with Teff = 5013 K and [Fe/H] = -2.85, whose nature was initially discovered in the course of the HERES project. Our analysis is based on high signal-to-noise ratio, high-resolution (R ≈ 75 000) VLT/UVES spectra and MARCS model atmospheres under the assumption of local thermodynamic equilibrium, and yields abundance measurements for a total of 46 elements, 31 of which are neutron-capture elements. As is the case for the other 25 r-II stars currently known, the heavy-element abundance pattern of CS 29497-004 matches a scaled solar system second-peak r-process-element abundance pattern well. We confirm our previous detection of Th, and demonstrate that this star does not exhibit an "actinide boost". Uranium is also detected (log ɛ(U) = -2.20 ± 0.30), albeit with a large measurement error that hampers its use as a precision cosmo-chronometer. Combining the various elemental chronometer pairs that are available for this star, we derive a mean age of 12.2 ± 3.7 Gyr using the theoretical production ratios from published waiting-point approximation models. We further explore the high-entropy wind model (Farouqi et al. 2010, ApJ, 712, 1359) production ratios arising from different neutron richnesses of the ejecta (Ye), and derive an age of 13.7 ± 4.4 Gyr for a best-fitting Ye = 0.447. The U/Th chronometer is confirmed to be the most resilient to uncertainties in the theoretical production ratios and yields an age of 16.5 ± 6.6 Gyr. Lead (Pb) is also tentatively detected in CS 29497-004, at a level compatible with a scaled solar r-process, or with the theoretical expectations for a pure r-process in this star.
Based on observations collected at the European Southern Observatory, Paranal, Chile (Proposal Number 170.D-0010). Table B.1 is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/607/A91
System Study: Residual Heat Removal 1998-2014
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schroeder, John Alton
2015-12-01
This report presents an unreliability evaluation of the residual heat removal (RHR) system in two modes of operation (low-pressure injection in response to a large loss-of-coolant accident and post-trip shutdown cooling) at 104 U.S. commercial nuclear power plants. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10-year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant increasing trends were identified in the RHR results. A highly statistically significant decreasing trend was observed for the RHR injection mode start-only unreliability. Statistically significant decreasing trends were observed for RHR shutdown cooling mode start-only unreliability and RHR shutdown cooling mode 24-hour unreliability.
Clark, Jordan; Urióstegui, Stephanie; Bibby, Richard; ...
2016-10-25
The application of the cosmogenic radioisotope sulfur-35 (35S) as a chronometer near spreading basins is evaluated at two well-established Managed Aquifer Recharge (MAR) sites: the Atlantis facility (South Africa) and Orange County Water District's (OCWD's) Kraemer Basin (Northern Orange County, CA, USA). Source water for both of these sites includes recycled wastewater. Despite lying nearer to the outlet end of their respective watersheds than to the headwaters, 35S was detected in most of the water sampled, including from wells found close to the spreading ponds and in the source water. Dilution with 35S-dead continental SO4 was minimal, a surprising finding given its short ~3 month half-life. The initial work at the Atlantis MAR site demonstrated that remote laboratories could be set up and that small-volume samples—saline solutions collected after the resin elution step from the recently developed batch method described below—can be stored and transported to the counting laboratory. This study also showed that the batch method needed to be altered to remove unknown compounds eluted from the resin along with SO4. Using the improved batch method, time series measurements of both source and well water from OCWD's MAR site showed significant temporal variations. Finally, this result indicates that during future studies, monthly to semi-monthly sampling should be conducted. Nevertheless, both of these initial studies suggest the 35S chronometer may become a valuable tool for managing MAR sites where regulations require minimum retention times.
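The chronometer itself is simple first-order decay: with the roughly 87-day half-life of 35S (a literature value assumed here) and negligible dilution by 35S-dead sulfate, the activity drop between the recharge source water and a down-gradient well gives an apparent subsurface travel time. A sketch with illustrative, normalized activities:

```python
import math

T_HALF_S35_DAYS = 87.4               # 35S half-life (~3 months; literature value)
LAM = math.log(2) / T_HALF_S35_DAYS

def retention_time(activity_well: float, activity_source: float) -> float:
    """Apparent travel time in days from the decay of 35S between source
    water and a down-gradient well, assuming no dilution with 35S-dead
    sulfate and a stable source activity."""
    return math.log(activity_source / activity_well) / LAM

# Illustration: a well at one quarter of the source activity
# corresponds to two half-lives of travel time.
t = retention_time(0.25, 1.0)
```

The temporal variability reported above is why the stable-source assumption matters: a fluctuating source activity biases the inferred time, hence the recommendation of monthly to semi-monthly sampling.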
System Study: Emergency Power System 1998–2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schroeder, John Alton
2015-02-01
This report presents an unreliability evaluation of the emergency power system (EPS) at 104 U.S. commercial nuclear power plants. Demand, run hours, and failure data from fiscal year 1998 through 2013 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10-year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant trends were identified in the EPS results.
System Study: Reactor Core Isolation Cooling 1998-2014
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schroeder, John Alton
2015-12-01
This report presents an unreliability evaluation of the reactor core isolation cooling (RCIC) system at 31 U.S. commercial boiling water reactors. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10 year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant trends were identified in the RCIC results.
System Study: Auxiliary Feedwater 1998-2014
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schroeder, John Alton
2015-12-01
This report presents an unreliability evaluation of the auxiliary feedwater (AFW) system at 69 U.S. commercial nuclear power plants. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10 year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant increasing or decreasing trends were identified in the AFW results.
NASA Astrophysics Data System (ADS)
Liu, G. C.; Lu, Y. J.; Xie, L. Z.; Chen, X. L.; Zhao, Y. H.
2016-01-01
Context. Massive luminous red galaxies (LRGs) are believed to be evolving passively and can be used as cosmic chronometers to estimate the Hubble constant (the differential age method). However, different LRGs may be located in different environments. The environmental effects, if any, on the mean ages of LRGs, and the ages of the oldest LRGs at different redshifts, may limit the use of the LRGs as cosmic chronometers. Aims: We aim to investigate the environmental and mass dependence of the formation of "quiescent" LRGs, selected from the Sloan Digital Sky Survey (SDSS) data release 8, and to pave the way for using LRGs as cosmic chronometers. Methods: Using the population synthesis software STARLIGHT, we derive the stellar populations in each LRG through full spectrum fitting and obtain the mean age distribution and the mean star formation history (SFH) of those LRGs. Results: We find that there is no apparent dependence of the mean age and the SFH of quiescent LRGs on their environment, while the ages of those quiescent LRGs depend weakly on their mass. We compare the SFHs of the SDSS LRGs with those obtained from a semi-analytical galaxy formation model and find that they are roughly consistent with each other if we consider the errors in the STARLIGHT-derived ages. We find that a small fraction of later star formation in LRGs leads to a systematic overestimation (~28%) of the Hubble constant by the differential age method, and the systematic errors in the STARLIGHT-derived ages may lead to an underestimation (~16%) of the Hubble constant. However, these errors can be corrected by a detailed study of the mean SFH of those LRGs and by calibrating the STARLIGHT-derived ages with those obtained independently by other methods.
Conclusions: The environmental effects do not play a significant role in the age estimates of quiescent LRGs. The quiescent LRGs as a population can therefore be used securely as cosmic chronometers, and the Hubble constant can be measured with high precision using the differential age method.
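The differential age method estimates the expansion rate from H(z) = -(1/(1+z)) dz/dt, where dt is the age difference of passively evolving galaxies at neighboring redshifts. A sketch with hypothetical redshifts and ages (the unit conversion 1 Gyr^-1 ≈ 977.79 km/s/Mpc is standard):

```python
def hubble_from_ages(z1: float, age1_gyr: float,
                     z2: float, age2_gyr: float) -> float:
    """Differential-age (cosmic chronometer) estimate of H at the mean
    redshift: H = -(1/(1+z)) dz/dt, with dt from the age difference of
    passively evolving galaxies. Returns km/s/Mpc."""
    zbar = 0.5 * (z1 + z2)
    dz_dt = (z2 - z1) / (age2_gyr - age1_gyr)    # 1/Gyr (negative: older at lower z)
    h_per_gyr = -dz_dt / (1.0 + zbar)            # 1/Gyr
    return h_per_gyr * 977.79                    # 1 Gyr^-1 in km/s/Mpc

# Hypothetical pair: galaxy populations at z = 0.35 and z = 0.45 whose
# oldest stellar populations differ in age by 0.8 Gyr.
h = hubble_from_ages(0.35, 9.0, 0.45, 8.2)
```

The systematic biases discussed above enter precisely through the age estimates: late star formation or fitting errors shift age1 and age2, and hence the inferred dt and H.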
Recent results in nucleocosmochronology
NASA Astrophysics Data System (ADS)
Cowan, John J.; Thielemann, F.-K.; Truran, J. W.
Rates for beta-delayed neutron emission and fission have recently been calculated for the mass range 70-100. Using these new rates and the calculated rates for neutron capture, photodisintegration and beta decay, dynamical r-process calculations have been performed. For certain assumed conditions, these r-process calculations give a good fit to the solar system r-process abundance curve. These calculations have been used to obtain new production ratios for the nuclear chronometer pairs used to determine the age of the Galaxy: (Th-232)/(U-238) = 1.60, (U-235)/(U-238) = 1.16, and (Pu-244)/(U-238) = 0.40. Using the new production ratios for (Th-232)/(U-238) and (U-235)/(U-238), with the observed meteoritic values for these nuclei and assuming a model of chemical evolution of the Galaxy, the age of the Galaxy has been determined. The results depend upon the initial nucleosynthesis enrichment in the Galactic disk, S0. While there are uncertainties in the calculations, for a range of S0 from 0.1 to 0.3 (i.e., from 10 to 30 percent) the age of the Galaxy is found to be 12.4-14.7 Gyr.
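Between production and observation each chronometer pair evolves by free decay. Under the simplest (sudden-synthesis) assumption, the Th-232/U-238 ratio grows from its production value because 238U decays faster than 232Th; real age determinations, as above, fold this into a chemical-evolution model. A sketch of just the decay step, using the production ratio quoted above and standard tabulated half-lives:

```python
import math

# Half-lives in Gyr (standard tabulated values; assumptions here).
LAM_TH232 = math.log(2) / 14.05     # 232Th decay constant, 1/Gyr
LAM_U238 = math.log(2) / 4.468      # 238U decay constant, 1/Gyr

def present_ratio(production_ratio: float, age_gyr: float) -> float:
    """Th-232/U-238 abundance ratio after age_gyr of free decay following
    a single (sudden-synthesis) production event."""
    return production_ratio * math.exp((LAM_U238 - LAM_TH232) * age_gyr)

# With the production ratio 1.60 quoted above, 13 Gyr of free decay
# raises Th/U substantially, since 238U decays faster than 232Th.
r = present_ratio(1.60, 13.0)
```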
System Study: High-Pressure Coolant Injection 1998-2014
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schroeder, John Alton
2015-12-01
This report presents an unreliability evaluation of the high-pressure coolant injection system (HPCI) at 25 U.S. commercial boiling water reactors. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10 year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant increasing or decreasing trends were identified in the HPCI results.
System Study: High-Pressure Core Spray 1998-2014
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schroeder, John Alton
2015-12-01
This report presents an unreliability evaluation of the high-pressure core spray (HPCS) at eight U.S. commercial boiling water reactors. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10 year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant increasing or decreasing trends were identified in the HPCS results.
System Study: High-Pressure Safety Injection 1998-2014
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schroeder, John Alton
2015-12-01
This report presents an unreliability evaluation of the high-pressure safety injection system (HPSI) at 69 U.S. commercial nuclear power plants. Demand, run hours, and failure data from fiscal year 1998 through 2014 for selected components were obtained from the Institute of Nuclear Power Operations (INPO) Consolidated Events Database (ICES). The unreliability results are trended for the most recent 10 year period, while yearly estimates for system unreliability are provided for the entire active period. No statistically significant increasing or decreasing trends were identified in the HPSI results.
Army’s Audit Readiness at Risk Because of Unreliable Data in the Appropriation Status Report
2014-06-26
for data reviewed from the December 2012 report. Material differences existed between reported data from the General Fund Enterprise Business System...GFEBS. An independent public accounting firm, KPMG LLP, performed examinations of SBR business processes at Army activities using GFEBS. KPMG LLP...to its processes. A second report from KPMG LLP, issued April 9, 2013, reported that the Army did not meet the FIAR guidance requirements for
Low sulfur content in submarine lavas: an unreliable indicator of subaerial eruption
Davis, A.S.; Clague, D.A.; Schulz, M.S.; Hein, J.R.
1991-01-01
Low S content (<250 ppm) has been used to identify subaerially erupted Hawaiian and Icelandic lavas. Large differences in S content of submarine-erupted lavas from different tectonic settings indicate that the behavior of S is complex. Variations in S abundance in undegassed, submarine-erupted lavas can result from different source compositions, different percentages of partial melting, and crystal fractionation. Low S concentrations in highly vesicular submarine lavas suggest that partial degassing can occur despite great hydrostatic pressure. These processes need to be evaluated before using S content as an indicator of eruption depth. -Authors
I-Xe Dating: The Time Line of Chondrule Formation and Metamorphism in LL Chondrites
NASA Technical Reports Server (NTRS)
Pravdivtseva, O. V.; Hohenberg, C. M.; Meshik, A. P.
2005-01-01
Refractory inclusions, considered to be the oldest solids formed in the solar nebula (4567.2 ± 0.6 Ma) [1], are common in many carbonaceous and in some ordinary and enstatite chondrites. High-precision Pb-Pb ages for CAIs and chondrules (from different meteorites) suggested that chondrule formation started about 2 Ma later than that of CAIs [1]. However, recent 26Al/26Mg data suggest simultaneous formation of CAIs and chondrules in Allende [2]. The I-Xe ages of CAIs in Allende are about 2 Ma younger than the I-Xe ages of Allende chondrules [3] but, like all chronometers, the I-Xe system records the closure time of its particular host phase. In the case of Allende CAIs, the major iodine-bearing phase is sodalite, a secondary phase presumably formed by aqueous alteration, so I-Xe reflects the post-formational processes in these objects. In chondrules the iodine host phases vary and can reflect formation and/or alteration but, to put chondrule ages on a quantitative basis, some problems should first be addressed.
The R-Process Alliance: 2MASS J09544277+5246414, the Most Actinide-enhanced R-II Star Known
NASA Astrophysics Data System (ADS)
Holmbeck, Erika M.; Beers, Timothy C.; Roederer, Ian U.; Placco, Vinicius M.; Hansen, Terese T.; Sakari, Charli M.; Sneden, Christopher; Liu, Chao; Lee, Young Sun; Cowan, John J.; Frebel, Anna
2018-06-01
We report the discovery of a new actinide-boost star, 2MASS J09544277+5246414, originally identified as a very bright (V = 10.1), extremely metal-poor ([Fe/H] = ‑2.99) K giant in the LAMOST survey, and found to be highly r-process-enhanced (r-II; [Eu/Fe] = +1.28) during the snapshot phase of the R-Process Alliance (RPA). Based on a high signal-to-noise ratio (S/N), high-resolution spectrum obtained with the Harlan J. Smith 2.7 m telescope, this star is the first confirmed actinide-boost star found by RPA efforts. With an enhancement of [Th/Eu] = +0.37, 2MASS J09544277+5246414 is also the most actinide-enhanced r-II star yet discovered, and only the sixth metal-poor star with a measured uranium abundance ([U/Fe] = +1.40). Using the Th/U chronometer, we estimate an age of 13.0 ± 4.7 Gyr for this star. The unambiguous actinide-boost signature of this extremely metal-poor star, combined with additional r-process-enhanced and actinide-boost stars identified by the RPA, will provide strong constraints on the nature and origin of the r-process at early times.
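As a quick illustration of the Th/U chronometer invoked above, the following minimal sketch uses the standard relation between the observed and the initial (r-process production) Th/U ratio; the half-lives are the standard values, but the abundance ratios in the example are hypothetical, not taken from this paper.

```python
import math

# Half-lives of 232Th and 238U in Gyr (standard values).
T_HALF_TH232 = 14.05
T_HALF_U238 = 4.468

def th_u_age(thu_observed, thu_initial):
    """Age in Gyr from the Th/U cosmochronometer.

    Since 238U decays faster than 232Th, the Th/U abundance ratio
    grows with time:  t = ln[(Th/U)_obs / (Th/U)_0] / (lam_U - lam_Th).
    """
    lam_th = math.log(2) / T_HALF_TH232
    lam_u = math.log(2) / T_HALF_U238
    return math.log(thu_observed / thu_initial) / (lam_u - lam_th)

# Illustrative (hypothetical) ratios: an observed Th/U four times the
# assumed production ratio corresponds to an age of roughly 13 Gyr.
print(round(th_u_age(4.0 * 0.55, 0.55), 2))
```

The sensitivity of the inferred age to the assumed production ratio is what makes a measured uranium abundance, as in this star, so valuable.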
Asteroidal Differentiation Processes Deduced from Ultramafic Achondrite Ureilite Meteorites
NASA Technical Reports Server (NTRS)
Downes, Hilary; Mittlefehldt, David W.; Hudson, Pierre; Romanek, Christopher S.; Franchi, Ian
2006-01-01
Ureilites are the second largest achondrite group. They are ultramafic achondrites that have experienced igneous processing whilst retaining some degree of nebula-derived chemical heterogeneity. They differ from other achondrites in that they contain abundant carbon and their oxygen isotope compositions are very heterogeneous and similar to those of the carbonaceous chondrite anhydrous mineral line. Their carbonaceous nature and some compositional characteristics indicative of nebular origin suggest that they are primitive materials that form a link between nebular processes and early periods of planetesimal accretion. However, despite numerous studies, the exact origin of ureilites remains unclear. Current opinion is that they represent the residual mantle of an asteroid that underwent silicate and Fe-Ni-S partial melting and melt removal. Recent studies of short-lived chronometers indicate that the parent asteroid of the ureilites differentiated very early in the history of the Solar System. Therefore, they contain important information about processes that formed small rocky planetesimals in the early Solar System. In effect, they form a bridge between nebula processes and differentiation in small planetesimals prior to accretion into larger planets and so a correct interpretation of ureilite petrogenesis is essential for understanding this critical step.
Isotopes as clues to the origin and earliest differentiation history of the Earth.
Jacobsen, Stein B; Ranen, Michael C; Petaev, Michael I; Remo, John L; O'Connell, Richard J; Sasselov, Dimitar D
2008-11-28
Measurable variations in (182)W/(183)W, (142)Nd/(144)Nd, (129)Xe/(130)Xe and Pu-derived (136)Xe/(130)Xe in the Earth and meteorites provide a record of accretion and formation of the core, early crust and atmosphere. These variations are due to the decay of the now extinct nuclides (182)Hf, (146)Sm, (129)I and (244)Pu. The (182)Hf-(182)W system is the best accretion and core-formation chronometer, which yields a mean time of Earth's formation of 10 Myr and a total time scale of 30 Myr. New laser shock data at conditions comparable with those in the Earth's deep mantle subsequent to the giant Moon-forming impact suggest that metal-silicate equilibration was rapid enough for the Hf-W chronometer to reliably record this time scale. The coupled (146)Sm-(147)Sm chronometer is the best system for determining the initial silicate differentiation (magma ocean crystallization and proto-crust formation), which took place at ca. 4.47 Ga or perhaps even earlier. The presence of a large (129)Xe excess in the deep Earth is consistent with a very early atmosphere formation (as early as 30 Myr); however, the interpretation is complicated by the fact that most of the atmospheric Xe may be from a volatile-rich late veneer.
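Extinct-nuclide chronometers like (182)Hf-(182)W yield relative ages from how far the parent ratio had decayed when a sample closed. A minimal sketch of that arithmetic (the 182Hf half-life is the standard ~8.9 Myr value; the isotope ratios in the example are purely illustrative):

```python
import math

HF182_HALF_LIFE_MYR = 8.90  # half-life of the extinct nuclide 182Hf

def relative_age_myr(ratio_initial, ratio_sample):
    """Time after solar-system formation, in Myr, at which a sample's
    182Hf/180Hf ratio froze in, given the initial solar-system ratio:
        dt = ln(R0 / R) / lam,  with lam = ln2 / t_half.
    """
    lam = math.log(2) / HF182_HALF_LIFE_MYR
    return math.log(ratio_initial / ratio_sample) / lam

# One half-life of decay corresponds to one 8.90 Myr interval:
print(round(relative_age_myr(1.0e-4, 0.5e-4), 3))
```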
2018-04-12
non-directional) wave spectra, but we consider the energy at high frequencies to be unreliable, so we only use significant wave height Hs and dominant...spectral density, N = E/s), which is a function of wavenumber or frequency (k or s), direction (θ), space (x, y), and time (t), with spectral density...Elgar 1987). As the spectra are now co-located in time, space, and frequency, the inversion is simply a minimization process.
Constraining the 40K decay constant with 87Rb-87Sr - 40K-40Ca chronometer intercomparison
NASA Astrophysics Data System (ADS)
Naumenko-Dèzes, Maria O.; Nägler, Thomas F.; Mezger, Klaus; Villa, Igor M.
2018-01-01
A literature survey reveals that the K-Ar chronometer gives ages that are ca. 1% younger than U-Pb ages. This offset is generally attributed to an inaccurate 40K decay constant. Three geological samples selected from a shortlist of eight with known U-Pb ages were investigated using detailed petrological methods and subsequently the Rb-Sr and K-Ca chronometers in order (a) to evaluate whether they meet the requirement of a geological history reflecting a 'point-like' event (i.e. isochronous formation and subsequent ideal closure of chronometers) and (b) to narrow down the systematic uncertainty on the 40K decay constant by investigating the metrologically traceable K-Ca decay branch. Lepidolite of the Rubikon pegmatite, Namibia, was dated with Rb-Sr at 504.7 ± 4.2 Ma and the phlogopite and apatite from the Phalaborwa carbonatite complex, South Africa, yielded a Rb-Sr age of 2058.9 ± 5.2 Ma. Both Rb-Sr ages agree with published U-Pb ages. The Rb-Sr age of the late Archean Siilinjärvi carbonatite, Finland, records a later regional metamorphic event at 1869 ± 10 Ma. Only the samples from the Phalaborwa complex represent a 'point-like' magmatic event and meet all the criteria to make them suitable for the 40K decay constant intercalibration. The Phalaborwa K-Ca isochron has a slope of 1.878 ± 0.012. Forcing the K-Ca isochron to coincide with the U-Pb and Rb-Sr ages gives one equation with two unknowns. Assuming that the branching ratio of the K-Ca branch, BCa, lies in the interval (k = 2) of all published references, 0.8925 < BCa < 0.8963, then the most reliable uncertainty interval (k = 2) for the total 40K decay constant, λtot, is calculated as 5.484 × 10-10 a-1 < λtot < 5.498 × 10-10 a-1. This confirms that the currently used IUGS recommendation is inaccurate.
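The one-equation-two-unknowns step can be sketched numerically. For a branched decay, the radiogenic-40Ca isochron slope is m = BCa·(exp(λtot·t) − 1), so fixing the age and a value of BCa determines λtot. Using the abstract's slope, age, and BCa bounds reproduces the published λtot interval only approximately, since this sketch does not propagate the slope and age uncertainties:

```python
import math

def total_k40_decay_constant(slope, b_ca, age_a):
    """Total 40K decay constant (a^-1) from a K-Ca isochron:
        slope = B_Ca * (exp(lam_tot * t) - 1)
    solved for lam_tot, with the age t in years.
    """
    return math.log(1.0 + slope / b_ca) / age_a

# Values from the abstract: Phalaborwa isochron slope 1.878,
# Rb-Sr/U-Pb age 2058.9 Ma, published range of B_Ca.
for b_ca in (0.8925, 0.8963):
    lam = total_k40_decay_constant(1.878, b_ca, 2058.9e6)
    print(f"B_Ca = {b_ca}: lam_tot = {lam:.4e} a^-1")
```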
NASA Astrophysics Data System (ADS)
Cargile, Phillip; James, D. J.; Pepper, J.; Kuhn, R.; Siverd, R. J.; Stassun, K. G.
2012-01-01
The age of a star is one of its most fundamental properties, and yet it is also the one property that is not directly measurable in observations. We must therefore rely on age estimates based on mostly model-dependent or empirical methods. Moreover, there remains a critical need for direct comparison of different age-dating techniques using the same stars analyzed in a consistent fashion. One commonly employed chronometer uses stellar rotation rates to measure stellar ages, i.e., gyrochronology. Although this technique is one of the better-understood chronometers, its calibration relies heavily on the solar datum, as well as on benchmark open clusters with reliable ages, and it still lacks a comprehensive comparative analysis against other stellar chronometers. The age of the nearby (? pc) open cluster Blanco 1 has been estimated using various techniques; it is one of only 7 clusters with a lithium depletion boundary (LDB) age measurement, making it a unique and powerful comparative laboratory for stellar chronometry, including gyrochronology. Here, we present preliminary results from our light-curve analysis of solar-type stars in Blanco 1 in order to identify and measure rotation periods of cluster members. The light-curve data were obtained during the engineering and calibration phase of the KELT-South survey. The large area on the sky and the low number of contaminating field stars make Blanco 1 an ideal target for the extremely wide field and large pixel scale of the KELT telescope. We apply a period-finding technique using the Lomb-Scargle periodogram and false-alarm-probability (FAP) statistics to measure significant rotation periods in the KELT-South light curves for confirmed Blanco 1 members. These new rotation periods allow us to test and inform rotation-evolution models for stellar ages at ? Myr, to determine a rotation age for Blanco 1 using gyrochronology, and to compare this rotation age to other age measurements for this cluster.
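The Lomb-Scargle step above can be sketched with SciPy on a synthetic light curve. Everything here (cadence, period, noise level) is made up for illustration; the real analysis would also compute FAP statistics and handle systematics:

```python
import numpy as np
from scipy.signal import lombscargle

# Hypothetical irregularly sampled light curve of a spotted star with a
# 5-day rotation period.
rng = np.random.default_rng(42)
t = np.sort(rng.uniform(0.0, 60.0, 250))          # observation times [days]
flux = np.sin(2 * np.pi * t / 5.0) + 0.1 * rng.standard_normal(t.size)

# Scan trial periods; scipy's lombscargle expects angular frequencies.
periods = np.linspace(0.5, 30.0, 5000)
omega = 2 * np.pi / periods
power = lombscargle(t, flux - flux.mean(), omega)

best_period = periods[np.argmax(power)]
print(f"recovered rotation period: {best_period:.2f} d")
```

The recovered peak sits at the injected 5-day period; in practice one would then convert such periods to ages with a gyrochronology relation.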
Constraints on a generalized deceleration parameter from cosmic chronometers
NASA Astrophysics Data System (ADS)
Mamon, Abdulla Al
2018-04-01
In this paper, we have proposed a generalized parametrization for the deceleration parameter q in order to study the evolutionary history of the universe. We have shown that the proposed model can reproduce three well known q-parametrized models for some specific values of the model parameter α. We have used the latest compilation of the Hubble parameter measurements obtained from the cosmic chronometer (CC) method (in combination with the local value of the Hubble constant H0) and the Type Ia supernova (SNIa) data to place constraints on the parameters of the model for different values of α. We have found that the resulting constraints on the deceleration parameter and the dark energy equation of state support the ΛCDM model within 1σ confidence level at the present epoch.
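The link between a q(z) parametrization and the observable expansion history is the standard relation H(z) = H0·exp(∫₀ᶻ (1+q(z'))/(1+z') dz'). The abstract does not give the explicit α-parametrization, so the sketch below integrates a generic q(z) numerically and checks it against the analytic constant-q case:

```python
import math

def hubble_from_q(z, h0=70.0, q=lambda z: 0.5, n=2000):
    """H(z) reconstructed from a deceleration-parameter history via
        H(z) = H0 * exp( integral_0^z (1 + q(z')) / (1 + z') dz' ),
    using simple trapezoidal integration.
    """
    zs = [z * i / n for i in range(n + 1)]
    f = [(1.0 + q(zp)) / (1.0 + zp) for zp in zs]
    integral = sum((f[i] + f[i + 1]) / 2.0 * (zs[i + 1] - zs[i]) for i in range(n))
    return h0 * math.exp(integral)

# Sanity check against the analytic matter-dominated case q = 1/2,
# where H(z) = H0 * (1 + z)**1.5:
print(round(hubble_from_q(1.0), 2), round(70.0 * 2**1.5, 2))
```

Any parametrized q(z), including the paper's, can be dropped in as the `q` callable and confronted with the cosmic-chronometer H(z) points.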
Synaptic unreliability facilitates information transmission in balanced cortical populations
NASA Astrophysics Data System (ADS)
Gatys, Leon A.; Ecker, Alexander S.; Tchumatchenko, Tatjana; Bethge, Matthias
2015-06-01
Synaptic unreliability is one of the major sources of biophysical noise in the brain. In the context of neural information processing, it is a central question how neural systems can afford this unreliability. Here we examine how synaptic noise affects signal transmission in cortical circuits, where excitation and inhibition are thought to be tightly balanced. Surprisingly, we find that in this balanced state synaptic response variability actually facilitates information transmission, rather than impairing it. In particular, the transmission of fast-varying signals benefits from synaptic noise, as it instantaneously increases the amount of information shared between presynaptic signal and postsynaptic current. Furthermore we show that the beneficial effect of noise is based on a very general mechanism which contrary to stochastic resonance does not reach an optimum at a finite noise level.
Neutron-Capture Elements in Very Metal-Poor Halo Stars
NASA Astrophysics Data System (ADS)
French, R. S.; Sneden, C.; Cowan, J. J.; Lawler, J. E.; Primas, F.; Beers, T. C.; Truran, J. W.
2000-05-01
Abundances of the most massive stable elements (Os -> Pb or 76 <= Z <= 82) in metal-poor stars can provide crucial information about the so-called ``third neutron-capture peak,'' and are critical to the radioactive-dating technique that uses unstable thorium and uranium as chronometers. As the relevant transitions occur in the UV and are inaccessible to ground-based telescopes, we have obtained high resolution (R ~= 30,000) UV spectra of 10 very metal-poor (-3.0 <= [Fe/H] <= -1.4) halo giants using the Space Telescope Imaging Spectrograph (STIS) aboard the Hubble Space Telescope. Using iterative spectrum synthesis techniques, we derive abundances for some of these heavy elements. We compare our abundances to those predicted for very metal-poor stars based on a scaled solar-system r-process (production in rapid neutron-capture synthesis events, such as occur during supernova explosions). This research is supported by NASA STScI grant GO-08342 and NSF grants AST-9618364 to C.S. and AST-9618332 to J.J.C.
New observational constraints on f ( R ) gravity from cosmic chronometers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nunes, Rafael C.; Pan, Supriya; Saridakis, Emmanuel N.
We use the recently released cosmic chronometer data and the latest measured value of the local Hubble parameter, combined with the latest joint light curves of Supernovae Type Ia, and Baryon Acoustic Oscillation distance measurements, in order to impose constraints on the viable and most used f ( R ) gravity models. We consider four f ( R ) models, namely the Hu-Sawicki, the Starobinsky, the Tsujikawa, and the exponential one, and we parametrize them introducing a distortion parameter b that quantifies the deviation from ΛCDM cosmology. Our analysis reveals that a small but non-zero deviation from ΛCDM cosmology is slightly favored, with the corresponding fittings exhibiting very efficient AIC and BIC Information Criteria values. Clearly, f ( R ) gravity is consistent with observations, and it can serve as a candidate for modified gravity.
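The AIC/BIC comparison mentioned above is simple arithmetic once the best-fit chi-square and parameter counts are known. A minimal sketch with standard definitions; the chi-square values and parameter counts below are hypothetical, not the paper's:

```python
import math

def aic(chi2_min, k):
    """Akaike Information Criterion: best-fit chi2 plus 2 per parameter."""
    return chi2_min + 2 * k

def bic(chi2_min, k, n):
    """Bayesian Information Criterion: the penalty grows with data size n."""
    return chi2_min + k * math.log(n)

# Toy comparison: an f(R)-like model with one extra parameter b vs LambdaCDM.
n_data = 740
chi2_lcdm, k_lcdm = 707.3, 2
chi2_fr, k_fr = 704.9, 3

print("dAIC =", round(aic(chi2_fr, k_fr) - aic(chi2_lcdm, k_lcdm), 2))
print("dBIC =", round(bic(chi2_fr, k_fr, n_data) - bic(chi2_lcdm, k_lcdm, n_data), 2))
```

In this toy example AIC mildly rewards the extra parameter while BIC penalizes it, which is the usual tension when judging small deviations from ΛCDM.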
Constraining the evolution of the Hubble Parameter using cosmic chronometers
NASA Astrophysics Data System (ADS)
Dickinson, Hugh
2017-08-01
Substantial investment is being made in space- and ground-based missions with the goal of revealing the nature of the observed cosmic acceleration. This is one of the most important unsolved problems in cosmology today. We propose here to constrain the evolution of the Hubble parameter [H(z)] between 1.3 < z < 2, using the cosmic chronometer method, based on differential age measurements for passively evolving galaxies. Existing WFC3-IR G102 and G141 grism data obtained by the WISP, 3D-HST+AGHAST, FIGS, and CLEAR surveys will yield a sample of 140 suitable standard clocks, expanding existing samples by a factor of five. These additional data will enable us to improve existing constraints on the evolution of H at high redshift, and in so doing to better understand the fundamental nature of dark energy.
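The differential-age idea reduces to the relation H(z) = −1/(1+z)·dz/dt, since for passively evolving galaxies the age difference tracks cosmic time. A minimal sketch; the redshifts and galaxy ages below are hypothetical, purely for illustration:

```python
# The cosmic-chronometer method: H(z) = -1/(1+z) * dz/dt, estimated from
# the age difference of two passively evolving galaxy samples.

GYR_TO_KM_S_MPC = 977.79  # converts H in 1/Gyr to km/s/Mpc

def hubble_differential_age(z1, age1_gyr, z2, age2_gyr):
    z_mid = 0.5 * (z1 + z2)
    dz_dt = (z2 - z1) / (age2_gyr - age1_gyr)  # ages shrink as z grows
    return -dz_dt / (1.0 + z_mid) * GYR_TO_KM_S_MPC

# Two hypothetical "standard clock" samples:
h = hubble_differential_age(1.3, 4.0, 1.4, 3.7)
print(f"H(z~1.35) ~ {h:.1f} km/s/Mpc")
```

The method's appeal is that it needs only relative ages and redshifts, with no assumed cosmological model.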
Huang, Ai-Mei; Nguyen, Truong
2009-04-01
In this paper, we address the problems of unreliable motion vectors that cause visual artifacts but cannot be detected by high residual energy or bidirectional prediction difference in motion-compensated frame interpolation. A correlation-based motion vector processing method is proposed to detect and correct those unreliable motion vectors by explicitly considering motion vector correlation in the motion vector reliability classification, motion vector correction, and frame interpolation stages. Since our method gradually corrects unreliable motion vectors based on their reliability, we can effectively discover the areas where no motion is reliable to be used, such as occlusions and deformed structures. We also propose an adaptive frame interpolation scheme for the occlusion areas based on the analysis of their surrounding motion distribution. As a result, the interpolated frames using the proposed scheme have clearer structure edges and ghost artifacts are also greatly reduced. Experimental results show that our interpolated results have better visual quality than other methods. In addition, the proposed scheme is robust even for those video sequences that contain multiple and fast motions.
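Detecting unreliable motion vectors by comparing each vector with its spatial neighborhood can be sketched as follows. This is a simple vector-median stand-in for the paper's correlation-based reliability classification, with made-up field values and threshold:

```python
import numpy as np

def unreliable_mv_mask(mv_field, threshold=4.0):
    """Flag motion vectors that deviate strongly from the median of their
    3x3 neighborhood.  mv_field: H x W x 2 array of (dx, dy) vectors.
    """
    h, w, _ = mv_field.shape
    padded = np.pad(mv_field, ((1, 1), (1, 1), (0, 0)), mode="edge")
    mask = np.zeros((h, w), dtype=bool)
    for i in range(h):
        for j in range(w):
            neigh = padded[i:i + 3, j:j + 3].reshape(9, 2)
            med = np.median(neigh, axis=0)
            if np.hypot(*(mv_field[i, j] - med)) > threshold:
                mask[i, j] = True
    return mask

# A smooth 8x8 field (global pan of (2, 1)) with one corrupted vector;
# only the corrupted vector at (3, 4) gets flagged.
field = np.tile(np.array([2.0, 1.0]), (8, 8, 1))
field[3, 4] = [15.0, -9.0]
print(np.argwhere(unreliable_mv_mask(field)))
```

A full scheme would then correct flagged vectors (e.g. with the neighborhood median) before interpolating the frame, handling occlusions separately as the paper describes.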
Cost benefits of advanced software: A review of methodology used at Kennedy Space Center
NASA Technical Reports Server (NTRS)
Joglekar, Prafulla N.
1993-01-01
To assist rational investments in advanced software, a formal, explicit, and multi-perspective cost-benefit analysis methodology is proposed. The methodology can be implemented through a six-stage process which is described and explained. The current practice of cost-benefit analysis at KSC is reviewed in the light of this methodology. The review finds that there is a vicious circle operating. Unsound methods lead to unreliable cost-benefit estimates. Unreliable estimates convince management that cost-benefit studies should not be taken seriously. Then, given external demands for cost-benefit estimates, management encourages software engineers to somehow come up with the numbers for their projects. Lacking the expertise needed to do a proper study, courageous software engineers with vested interests use ad hoc and unsound methods to generate some estimates. In turn, these estimates are unreliable, and the cycle continues. The proposed methodology should help KSC to break out of this cycle.
Implications of clinical trial design on sample size requirements.
Leon, Andrew C
2008-07-01
The primary goal in designing a randomized controlled clinical trial (RCT) is to minimize bias in the estimate of treatment effect. Randomized group assignment, double-blinded assessments, and control or comparison groups reduce the risk of bias. The design must also provide sufficient statistical power to detect a clinically meaningful treatment effect and maintain a nominal level of type I error. An attempt to integrate neurocognitive science into an RCT poses additional challenges. Two particularly relevant aspects of such a design often receive insufficient attention in an RCT. Multiple outcomes inflate type I error, and an unreliable assessment process introduces bias and reduces statistical power. Here we describe how both unreliability and multiple outcomes can increase the study costs and duration and reduce the feasibility of the study. The objective of this article is to consider strategies that overcome the problems of unreliability and multiplicity.
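The cost of unreliability and multiplicity can be made concrete with a standard sample-size calculation: measurement unreliability attenuates the observable effect size by the square root of the reliability coefficient, and multiple outcomes require a stricter alpha. A sketch under the usual normal approximation (Bonferroni used here for simplicity; not necessarily the article's exact procedure):

```python
from math import ceil, sqrt
from statistics import NormalDist

def n_per_arm(effect_size, reliability=1.0, alpha=0.05, power=0.80, n_outcomes=1):
    """Per-arm sample size for a two-arm RCT (two-sided z-test).

    Unreliable measurement attenuates the observable effect size by
    sqrt(reliability); multiple outcomes are handled with a simple
    Bonferroni correction alpha / n_outcomes.
    """
    z = NormalDist()
    alpha_adj = alpha / n_outcomes
    d_obs = effect_size * sqrt(reliability)
    z_a = z.inv_cdf(1 - alpha_adj / 2)
    z_b = z.inv_cdf(power)
    return ceil(2 * ((z_a + z_b) / d_obs) ** 2)

print(n_per_arm(0.5))                    # perfectly reliable outcome
print(n_per_arm(0.5, reliability=0.7))   # same trial, noisier measure
```

Dropping the reliability from 1.0 to 0.7 inflates the required sample size by the factor 1/0.7, which is exactly the cost-and-duration penalty the article highlights.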
30. Engine controls and valve gear, looking aft on main ...
30. Engine controls and valve gear, looking aft on main (promenade) deck level. Threaded admission valve lift rods (two at immediate left of chronometer) permit adjustment of valve timing in lower and upper admission valves of cylinder (left rod controls lower valve, right rod upper valve). Valve rods are lifted by jaw-like "wipers" during operation. Exhaust valve lift rods and wipers are located to right of chronometer. Crank at extreme right drives valve wiper shaft when engaged to end of eccentric rod, shown under "Crank Indicator" dial. Pair of handles to immediate left of admission valve rods control condenser water valves; handles to right of exhaust valve rods control feedwater flow to boilers from pumps. Gauges indicate boiler pressure (left) and condenser vacuum (right); "Crank Indicator" on wall aids engineer in keeping engine crank off "dead-center" at stop so that engine may be easily restarted. - Steamboat TICONDEROGA, Shelburne Museum Route 7, Shelburne, Chittenden County, VT
New observational constraints on f ( T ) gravity from cosmic chronometers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nunes, Rafael C.; Pan, Supriya; Saridakis, Emmanuel N., E-mail: nunes@ecm.ub.edu, E-mail: span@iiserkol.ac.in, E-mail: Emmanuel_Saridakis@baylor.edu
2016-08-01
We use the local value of the Hubble constant recently measured with 2.4% precision, as well as the latest compilation of cosmic chronometers data, together with standard probes such as Supernovae Type Ia and Baryon Acoustic Oscillation distance measurements, in order to impose constraints on the viable and most used f ( T ) gravity models, where T is the torsion scalar in teleparallel gravity. In particular, we consider three f ( T ) models with two parameters, out of which one is independent, and we quantify their deviation from ΛCDM cosmology through a sole parameter. Our analysis reveals that for one of the models a small but non-zero deviation from ΛCDM cosmology is slightly favored, while for the other models the best fit is very close to the ΛCDM scenario. Clearly, f ( T ) gravity is consistent with observations, and it can serve as a candidate for modified gravity.
Strength of Intentional Effort Enhances the Sense of Agency
Minohara, Rin; Wen, Wen; Hamasaki, Shunsuke; Maeda, Takaki; Kato, Motoichiro; Yamakawa, Hiroshi; Yamashita, Atsushi; Asama, Hajime
2016-01-01
Sense of agency (SoA) refers to the feeling of controlling one’s own actions, and the experience of controlling external events with one’s actions. The present study examined the effect of strength of intentional effort on SoA. We manipulated the strength of intentional effort using three types of buttons that differed in the amount of force required to depress them. We used a self-attribution task as an explicit measure of SoA. The results indicate that strength of intentional effort enhanced self-attribution when action-effect congruency was unreliable. We concluded that intentional effort importantly affects the integration of multiple cues affecting explicit judgments of agency when the causal relationship between action and effect was unreliable. PMID:27536267
NASA Astrophysics Data System (ADS)
Bowring, S. A.
2010-12-01
Over the past two decades, U-Pb geochronology by ID-TIMS has been refined to achieve internal (analytical) uncertainties on a single grain analysis of ± ~ 0.1-0.2%, and 0.05% or better on weighted mean dates. This level of precision enables unprecedented evaluation of the rates and durations of geological processes, from magma chamber evolution to mass extinctions and recoveries. The increased precision, however, exposes complexity in magmatic/volcanic systems and highlights the importance of corrections related to disequilibrium partitioning of intermediate daughter products, and raises questions as to how best to interpret the complex spectrum of dates characteristic of many volcanic rocks. In addition, the increased precision requires renewed emphasis on the accuracy of U decay constants, the isotopic composition of U, the calibration of isotopic tracers, and the accurate propagation of uncertainties. It is now commonplace in the high precision dating of volcanic ash-beds to analyze 5-20 single grains of zircon in an attempt to resolve the eruption/depositional age. Data sets with dispersion far in excess of analytical uncertainties are interpreted to reflect Pb-loss, inheritance, and protracted crystallization, often supported with zircon chemistry. In most cases, a weighted mean of the youngest reproducible dates is interpreted as the time of eruption/deposition. Crystallization histories of silicic magmatic systems recovered from plutonic rocks may also be protracted, though may not be directly applicable to silicic eruptions; each sample must be evaluated independently. A key to robust interpretations is the integration of high-spatial-resolution zircon trace element geochemistry with high-precision ID-TIMS analyses. The EARTHTIME initiative has focused on many of these issues, and the larger subject of constructing a timeline for earth history using both U-Pb and Ar-Ar chronometers.
Despite continuing improvements in both, comparing dates for the same rock with both chronometers is not straightforward. Compelling issues range from pre-eruptive magma chamber residence, recognizing open system behavior, accurately correcting for disequilibrium amounts of 230Th and 231Pa, precise and accurate dates of fluence monitors for 40Ar/39Ar, and inter-laboratory biases. At present, despite the level of internal precision achievable by each technique, obstacles remain to combining both chronometers.
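The weighted-mean-and-dispersion test described above is standard arithmetic: an inverse-variance weighted mean with its standard error, and the MSWD (reduced chi-square) to judge whether the grains form a single population. A minimal sketch with hypothetical single-grain dates:

```python
def weighted_mean_mswd(dates, sigmas):
    """Inverse-variance weighted mean date, its standard error, and the
    MSWD used to judge whether a set of single-zircon dates is consistent
    with a single eruption/deposition age.
    """
    weights = [1.0 / s**2 for s in sigmas]
    wsum = sum(weights)
    mean = sum(w * d for w, d in zip(weights, dates)) / wsum
    stderr = (1.0 / wsum) ** 0.5
    mswd = sum(((d - mean) / s) ** 2 for d, s in zip(dates, sigmas)) / (len(dates) - 1)
    return mean, stderr, mswd

# Hypothetical single-grain 206Pb/238U dates (Ma) with analytical errors:
dates = [252.10, 252.12, 252.08]
sigmas = [0.05, 0.04, 0.06]
mean, stderr, mswd = weighted_mean_mswd(dates, sigmas)
print(f"{mean:.4f} +/- {stderr:.4f} Ma, MSWD = {mswd:.2f}")
```

An MSWD near 1 is consistent with purely analytical scatter; values well above 1 signal the Pb-loss, inheritance, or protracted crystallization discussed in the abstract.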
NASA Astrophysics Data System (ADS)
Burkhardt, Christoph; Schönbächler, Maria
2015-09-01
The progressive dissolution of the carbonaceous chondrites Orgueil (CI1), Murchison (CM2) and Allende (CV3) with acids of increasing strength reveals correlated W isotope variations ranging from 3.5 ε182W and 6.5 ε183W in the initial leachate (acetic acid) to -60 ε182W and -40 ε183W in the leachate residue. The observed variations are readily explained by variable mixing of s-process depleted and s-process enriched components. One W s-process carrier is SiC; however, the observed anomaly patterns and mass-balance considerations require at least one additional s-process carrier, possibly a silicate or sulfide. The data reveal well-defined correlations, which provide a test for s-process nucleosynthesis models. The correlations demonstrate that current models need to be revised and highlight the need for more precise W isotope data of SiC grains. Furthermore, the correlations provide a means to disentangle nucleosynthetic and radiogenic contributions to 182W (ε182Wcorrected = ε182Wmeasured - (1.41 ± 0.05) × ε183Wmeasured; ε182Wcorrected = ε182Wmeasured - (-0.12 ± 0.06) × ε184Wmeasured), a prerequisite for the successful application of the Hf-W chronometer to samples with nucleosynthetic anomalies. The overall magnitude of the W isotope variations decreases in the order CI1 > CM2 > CV3. This can be interpreted as the progressive thermal destruction of an initially homogeneous mixture of presolar grains by parent-body processing. However, not only the magnitude but also the W anomaly patterns of the three chondrites are different. In particular leach step 2, which employs nitric acid, reveals an s-deficit signature for Murchison, but an s-excess for Orgueil and Allende. This could be the result of redistribution of anomalous W into a new phase by parent-body alteration, or the fingerprint of dust processing in the solar nebula.
Given that the degree of thermal and aqueous alteration of Murchison is intermediate between those of the CI and CV3 chondrites, parent-body processing is probably not the sole cause of the different patterns. Small-scale nebular redistribution of anomalous W may have played a role as well. Similar nebular processes possibly acted differently on specific carrier phases and elements, resulting in the diverse nucleosynthetic signatures observed in planetary materials today.
A Black-Scholes Approach to Satisfying the Demand in a Failure-Prone Manufacturing System
NASA Technical Reports Server (NTRS)
Chavez-Fuentes, Jorge R.; Gonzalez, Oscar R.; Gray, W. Steven
2007-01-01
The goal of this paper is to use a financial model and a hedging strategy in a systems application. In particular, the classical Black-Scholes model, which was developed in 1973 to find the fair price of a financial contract, is adapted to satisfy an uncertain demand in a manufacturing system when one of two production machines is unreliable. This financial model, together with a hedging strategy, is used to develop a closed-form expression for the production strategies of each machine. The strategy guarantees that the uncertain demand will be met in probability at the final time of the production process. It is assumed that the production efficiency of the unreliable machine can be modeled as a continuous-time stochastic process. Two simple examples illustrate the result.
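For reference, the classical 1973 Black-Scholes formula that the paper adapts prices a European call in closed form. A minimal self-contained sketch (the production-planning adaptation itself is not reproduced here):

```python
from math import erf, exp, log, sqrt

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(s, k, r, sigma, t):
    """Classical Black-Scholes price of a European call option:
    spot s, strike k, risk-free rate r, volatility sigma, maturity t."""
    d1 = (log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

# Textbook check: S=100, K=100, r=5%, sigma=20%, T=1 year -> ~10.45
print(round(black_scholes_call(100, 100, 0.05, 0.2, 1.0), 4))
```

In the paper's setting, the role of the "option" is played by the production commitment hedged against the unreliable machine's stochastic efficiency.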
Application of Gaussian Process Modeling to Analysis of Functional Unreliability
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Youngblood
2014-06-01
This paper applies Gaussian Process (GP) modeling to analysis of the functional unreliability of a “passive system.” GPs have been used widely in many ways [1]. The present application uses a GP for emulation of a system simulation code. Such an emulator can be applied in several distinct ways, discussed below. All applications illustrated in this paper have precedents in the literature; the present paper is an application of GP technology to a problem that was originally analyzed [2] using neural networks (NN), and later [3, 4] by a method called “Alternating Conditional Expectations” (ACE). This exercise enables a multifaceted comparison of both the processes and the results. Given knowledge of the range of possible values of key system variables, one could, in principle, quantify functional unreliability by sampling from their joint probability distribution, and performing a system simulation for each sample to determine whether the function succeeded for that particular setting of the variables. Using previously available system simulation codes, such an approach is generally impractical for a plant-scale problem. It has long been recognized, however, that a well-trained code emulator or surrogate could be used in a sampling process to quantify certain performance metrics, even for plant-scale problems. “Response surfaces” were used for this many years ago. But response surfaces are at their best for smoothly varying functions; in regions of parameter space where key system performance metrics may behave in complex ways, or even exhibit discontinuities, response surfaces are not the best available tool. This consideration was one of several that drove the work in [2].
In the present paper, (1) the original quantification of functional unreliability using NN [2], and later ACE [3], is reprised using GP; (2) additional information provided by the GP about uncertainty in the limit surface, generally unavailable in other representations, is discussed; (3) a simple forensic exercise is performed, analogous to the inverse problem of code calibration, but with an accident management spin: given an observation about containment pressure, what can we say about the system variables? References 1. For an introduction to GPs, see (for example) Gaussian Processes for Machine Learning, C. E. Rasmussen and C. K. I. Williams (MIT, 2006). 2. Reliability Quantification of Advanced Reactor Passive Safety Systems, J. J. Vandenkieboom, PhD Thesis (University of Michigan, 1996). 3. Z. Cui, J. C. Lee, J. J. Vandenkieboom, and R. W. Youngblood, “Unreliability Quantification of a Containment Cooling System through ACE and ANN Algorithms,” Trans. Am. Nucl. Soc. 85, 178 (2001). 4. Risk and Safety Analysis of Nuclear Systems, J. C. Lee and N. J. McCormick (Wiley, 2011). See especially §11.2.4.
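The emulate-then-sample workflow described above can be sketched end to end with a toy "simulation code" and a from-scratch GP regressor. Everything here (the performance function, threshold, kernel settings) is illustrative, not from the paper:

```python
import numpy as np

# Stand-in for an expensive system simulation: a toy performance function
# g(x); the "function fails" when g exceeds a threshold.
def g(x):
    return x[:, 0] + x[:, 1]

THRESHOLD = 2.0

def rbf(a, b, length=1.5):
    """Squared-exponential kernel between two point sets."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)

rng = np.random.default_rng(0)

# Train the GP emulator on a modest number of "simulator runs".
x_train = rng.uniform(-3.0, 3.0, size=(100, 2))
y_train = g(x_train)
k_train = rbf(x_train, x_train) + 1e-6 * np.eye(len(x_train))  # jitter
alpha = np.linalg.solve(k_train, y_train)

def gp_mean(x_new):
    """GP posterior mean at new inputs (zero prior mean)."""
    return rbf(x_new, x_train) @ alpha

# Monte Carlo through the cheap emulator: inputs ~ N(0, I).
x_mc = rng.standard_normal((20000, 2))
p_fail = np.mean(gp_mean(x_mc) > THRESHOLD)
print(f"emulated failure probability ~ {p_fail:.4f}")
# Analytic truth for this toy g: P(N(0,2) > 2) ~ 0.0786
```

A real application would also exploit the GP's predictive variance to quantify uncertainty in the limit surface, which is point (2) of the abstract.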
Insect vision as model for machine vision
NASA Astrophysics Data System (ADS)
Osorio, D.; Sobey, Peter J.
1992-11-01
The neural architecture, neurophysiology and behavioral abilities of insect vision are described, and compared with that of mammals. Insects have a hardwired neural architecture of highly differentiated neurons, quite different from the cerebral cortex, yet their behavioral abilities are in important respects similar to those of mammals. These observations challenge the view that the key to the power of biological neural computation is distributed processing by a plastic, highly interconnected, network of individually undifferentiated and unreliable neurons that has been a dominant picture of biological computation since Pitts and McCulloch's seminal work in the 1940's.
Unreliable evoked responses in autism
Dinstein, Ilan; Heeger, David J.; Lorenzi, Lauren; Minshew, Nancy J.; Malach, Rafael; Behrmann, Marlene
2012-01-01
Summary Autism has been described as a disorder of general neural processing, but the particular processing characteristics that might be abnormal in autism have mostly remained obscure. Here, we present evidence of one such characteristic: poor evoked response reliability. We compared cortical response amplitude and reliability (consistency across trials) in visual, auditory, and somatosensory cortices of high-functioning individuals with autism and controls. Mean response amplitudes were statistically indistinguishable across groups, yet trial-by-trial response reliability was significantly weaker in autism, yielding smaller signal-to-noise ratios in all sensory systems. Response reliability differences were evident only in evoked cortical responses and not in ongoing resting-state activity. These findings reveal that abnormally unreliable cortical responses, even to elementary non-social sensory stimuli, may represent a fundamental physiological alteration of neural processing in autism. The results motivate a critical expansion of autism research to determine whether (and how) basic neural processing properties such as reliability, plasticity, and adaptation/habituation are altered in autism. PMID:22998867
Analysis of a Uranium Oxide Sample Interdicted in Slovakia (FSC 12-3-1)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borg, Lars E.; Dai, Zurong; Eppich, Gary R.
2014-01-17
We provide a concise summary of analyses of a natural uranium sample seized in Slovakia in November 2007. Results are presented for compound identification, water content, U assay, trace element abundances, trace organic compounds, isotope compositions for U, Pb, Sr and O, and age determination using the 234U–230Th and 235U–231Pa chronometers. The sample is a mixture of two common uranium compounds, schoepite and uraninite. The uranium isotope composition is indistinguishable from natural; 236U was not detected. The O, Sr and Pb isotope compositions and trace element abundances are unremarkable. The 234U–230Th chronometer gives an age of 15.5 years relative to the date of analysis, indicating the sample was produced in January 1997. A comparison of the data for this sample with data in the Uranium Sourcing database failed to find a match, indicating the sample was not produced at a facility represented in the database.
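The 234U–230Th model age comes from daughter ingrowth after chemical purification: if all Th is removed when the uranium is processed, the 230Th/234U activity ratio grows back at a rate fixed by the 230Th decay constant. A minimal sketch of that arithmetic (illustrative, not the paper's code; uses the small-t approximation, valid for ages far shorter than the 230Th half-life):

```python
import math

T_HALF_230TH = 75_584.0                    # 230Th half-life, years
LAMBDA_230 = math.log(2) / T_HALF_230TH    # decay constant, 1/yr

def th230_model_age(activity_ratio):
    """Model age (years) from the measured 230Th/234U activity ratio,
    assuming complete Th removal at purification (small-t approximation)."""
    return -math.log(1.0 - activity_ratio) / LAMBDA_230

# A sample purified 15.5 years before analysis would show a ratio of:
ratio = 1.0 - math.exp(-LAMBDA_230 * 15.5)
age = th230_model_age(ratio)   # recovers 15.5 yr by construction
```

Because the ingrown 230Th is so scarce over decadal timescales (a ratio of order 10^-4 here), sub-year age precision requires very clean chemistry and sensitive mass spectrometry.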
Schmidt, Sabine; Cochran, J Kirk
2010-07-01
Radium isotopes have been used extensively to trace the movement of groundwater as well as oceanic water masses, but these radionuclides (and their daughters) are also useful chronometers for the determination of the time scales of other Earth and environmental processes. The purpose of this overview is to present the application of 226Ra and 228Ra and their daughters in the dating of carbonates. We show that the choice of dating method (decay of excess radionuclide or ingrowth of daughter) depends strongly on the parent/daughter activity ratios in the water in which the carbonate was precipitated. Thus freshly precipitated carbonates uniformly show excesses of 226Ra relative to its parent 230Th, and 226Ra decay can provide ages of carbonates over Holocene time scales. In contrast, carbonates are precipitated in waters of greatly varying 210Pb/226Ra. Corals, deep-sea hydrothermal vent clams and the shelled cephalopod Nautilus live in waters with significant dissolved 210Pb and all show excesses of 210Pb in their carbonate. Bivalve molluscs from nearshore and coastal waters, and carbonates deposited from groundwater environments (e.g. travertines) in which 210Pb is efficiently scavenged from solution, show deficiencies of 210Pb relative to 226Ra. Fish otoliths, in contrast, strongly discriminate against 210Pb regardless of the environment in which the fish lives. Deficiencies of 228Th relative to 228Ra are common in all carbonates. Useful time ranges for the 210Pb/226Ra and 228Th/228Ra chronometers are approximately 100 y and approximately 10 y, respectively.
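For a carbonate that incorporated excess 210Pb at precipitation, the age follows from simple decay of that excess; a sketch with hypothetical activities (standard 22.3 yr 210Pb half-life), not data from the paper:

```python
import math

T_HALF_PB210 = 22.3                  # 210Pb half-life, years
LAM = math.log(2) / T_HALF_PB210     # decay constant, 1/yr

def excess_decay_age(initial_excess, measured_excess):
    """Age (years) from decay of excess 210Pb; both activities in the
    same units. Practical only while the excess remains resolvable,
    i.e. roughly the ~100 yr range quoted in the abstract."""
    return math.log(initial_excess / measured_excess) / LAM

age = excess_decay_age(10.0, 2.5)    # two half-lives of decay: 44.6 yr
```

The ingrowth alternative (dating by how much daughter has grown back in) uses the complementary expression, which is why the parent/daughter ratio of the source water dictates the method choice.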
Nyitray, Alan G; Harris, Robin B; Abalos, Andrew T; Nielson, Carrie M; Papenfuss, Mary; Giuliano, Anna R
2010-12-01
Accurate knowledge about human sexual behaviors is important for increasing our understanding of human sexuality; however, there have been few studies assessing the reliability of sexual behavior questionnaires designed for community samples of adult men. A test-retest reliability study was conducted on a questionnaire completed by 334 men who had been recruited in Tucson, Arizona. Reliability coefficients and refusal rates were calculated for 39 non-sexual and sexual behavior questionnaire items. Predictors of unreliable reporting for lifetime number of female sexual partners were also assessed. Refusal rates were generally low, with slightly higher refusal rates for questions related to immigration, income, the frequency of sexual intercourse with women, lifetime number of female sexual partners, and the lifetime number of male anal sex partners. Kappa and intraclass correlation coefficients were substantial or almost perfect for all non-sexual and sexual behavior items. Reliability dropped somewhat, but was still substantial, for items that asked about household income and the men's knowledge of their sexual partners' health, including abnormal Pap tests and prior sexually transmitted diseases (STD). Age and lifetime number of female sexual partners were independent predictors of unreliable reporting while years of education was inversely associated with unreliable reporting. These findings among a community sample of adult men are consistent with other test-retest reliability studies with populations of women and adolescents.
The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1995-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
A multistage motion vector processing method for motion-compensated frame interpolation.
Huang, Ai- Mei; Nguyen, Truong Q
2008-05-01
In this paper, a novel, low-complexity motion vector processing algorithm at the decoder is proposed for motion-compensated frame interpolation or frame rate up-conversion. We address the problems of broken edges and deformed structures in an interpolated frame by hierarchically refining motion vectors on different block sizes. Our method explicitly considers the reliability of each received motion vector and is capable of preserving structure information. This is achieved by analyzing the distribution of residual energies and effectively merging blocks that have unreliable motion vectors. The motion vector reliability information is also used as prior knowledge in motion vector refinement using a constrained vector median filter, to avoid selecting an identical unreliable vector. We also propose using chrominance information in our method. Experimental results show that the proposed scheme gives better visual quality and is robust even in video sequences with complex scenes and fast motion.
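The constrained vector median filter can be sketched as follows: every candidate motion vector contributes to the distance sums, but the winner is drawn only from the subset flagged reliable, so an unreliable vector is never selected even if it sits at the geometric median. This is a simplified illustration under those assumptions, not the authors' exact scheme:

```python
def vector_median(candidates, reliable=None):
    """Return the candidate motion vector minimizing the total L1
    distance to all candidates. `reliable` optionally restricts which
    vectors may be chosen (the 'constrained' part), while every
    candidate still contributes to the distance sums."""
    allowed = reliable if reliable else candidates
    def total_dist(v):
        return sum(abs(v[0] - u[0]) + abs(v[1] - u[1]) for u in candidates)
    return min(allowed, key=total_dist)

mvs = [(1, 1), (1, 2), (9, -7), (1, 1)]
print(vector_median(mvs))                               # (1, 1)
print(vector_median(mvs, reliable=[(1, 2), (9, -7)]))   # (1, 2)
```

In the second call the outlier (9, -7) is still penalized by its distance to the cluster, so the constraint steers the choice to the nearest reliable vector rather than the raw median.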
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ubelaker, D H; Buchholz, B A
2005-04-26
Atmospheric testing of nuclear weapons during the 1950s and early 1960s doubled the level of radiocarbon ({sup 14}C) in the atmosphere. From the peak in 1963, the level of {sup 14}CO{sub 2} has decreased exponentially with a mean life of about 16 years, not due to radioactive decay but due to mixing with large marine and terrestrial carbon reservoirs. Since radiocarbon is incorporated into all living things, the bomb pulse is an isotopic chronometer of the past half century. The absence of bomb radiocarbon in skeletonized human remains generally indicates a date of death before 1950. Comparison of the radiocarbon values with the post-1950 bomb curve may also help elucidate when in the post-1950 era the individual was still alive. Such interpretation, however, must consider the age at death of the individual and the type of tissue sampled.
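The bomb-pulse logic can be sketched numerically: the atmospheric 14C level roughly doubled by 1963 and has relaxed toward the pre-bomb baseline with a ~16 yr mean life. The curve below is a crude illustration of the numbers quoted in the abstract, not a calibrated bomb curve:

```python
import math

F_PEAK, T_PEAK, MEAN_LIFE = 2.0, 1963.0, 16.0   # figures quoted in the text

def bomb_curve(year):
    """Approximate post-peak atmospheric 14C level relative to the
    pre-bomb baseline (1.0); the decline reflects reservoir mixing,
    not radioactive decay."""
    if year < T_PEAK:
        raise ValueError("sketch covers only the post-1963 decline")
    return 1.0 + (F_PEAK - 1.0) * math.exp(-(year - T_PEAK) / MEAN_LIFE)

# Tissue formed in 1979, one mean life after the peak, would record a
# level of 1 + 1/e times the pre-bomb baseline.
```

Because the curve is steep and monotonic after ~1970, a single measured 14C level in tissue can bracket the formation year to within a few years, which is what makes the pulse useful forensically.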
Inter-monitor standard calibration and tests for Ar-Ar biases
NASA Astrophysics Data System (ADS)
Hemming, S. R.; Turrin, B. D.; Swisher, C. C.; Cox, S. E.; Mesko, G. T.; Chang, S.
2010-12-01
A major issue facing the geochronology community is that there are biases between chronometers that have become significant as we interrogate the rock record with ever-increasing levels of precision. Despite much progress, there are still major issues with building a timescale from multiple chronometers and with testing the synchroneity of anomalous events in Earth history. Improvements in methods for determining U-Pb zircon dates have led to their application at precisions of 0.2% or better in rocks even younger than a million years (e.g., Crowley et al., 2007, Geology), and significantly better than 0.1% in some cases (e.g., Bowring et al., 2006, Paleontological Society Papers, Volume 12). Additionally, the inter-calibration experiments for U-Pb using the EARTHTIME tracer have yielded excellent agreement among labs (0.05%), and these values are traceable back to SI units through the EARTHTIME tracer calibration experiment (e.g., Condon et al., in press, Geochimica et Cosmochimica Acta). These advances have greatly extended the need for cross-calibration of the two chronometers and ultimately their seamless integration into the Geologic Time Scale. The direct comparison of ages using different chronometers and laboratories is the central aspect of the quest for a highly resolved and accurate time scale of Earth history. A significant obstacle to high-precision inter-comparison of U-Pb and Ar-Ar age results is the current inability of Ar-Ar labs to achieve agreement on monitor standard ages at the 0.1% level. At the heart of Ar-Ar geochronology is the assumption of a known absolute age of a standard, to which all applicable unknowns are referenced. While individual labs are able to achieve highly precise apparent ages on monitor standards, the lack of a “gold standard” for Ar-Ar dating means that we do not know who is correct, or indeed whether anybody is.
In order to improve our understanding of factors that may lead to biases in our own laboratories at Lamont-Doherty Earth Observatory (AGES) and at Rutgers University, we have begun a concerted effort to test various factors that could lead to biases. AGES uses analogue multiplier peak hopping measurements on a Micromass VG 5400 noble gas mass spectrometer. Rutgers uses ion counting on a MAP 215-50 noble gas mass spectrometer, modified to collect Ar-36 by ion counting and Ar-40 by faraday simultaneously. We will present the results of our internal inter-comparison of monitor standards from each laboratory and will compare them to published results for these standards. We will also present our results from analyzing different sized samples of Fish Canyon sanidine, Alder Creek sanidine, and McClure Mountain hornblende monitor standards.
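The standard-age dependence discussed above is explicit in the 40Ar/39Ar age equation: the irradiation parameter J is computed from a monitor of assumed age, and every unknown is referenced through it. The sketch below uses hypothetical ratios and a Fish Canyon-like standard age; it is an illustration, not the laboratories' data-reduction code:

```python
import math

LAMBDA_K40 = 5.543e-10   # total 40K decay constant, 1/yr (Steiger & Jaeger, 1977)

def j_factor(standard_age_yr, standard_ratio):
    """Irradiation parameter J from a monitor standard of assumed age
    and its measured 40Ar*/39Ar ratio."""
    return (math.exp(LAMBDA_K40 * standard_age_yr) - 1.0) / standard_ratio

def arar_age(unknown_ratio, J):
    """40Ar/39Ar age (yr) of an unknown, referenced to the monitor."""
    return math.log(1.0 + J * unknown_ratio) / LAMBDA_K40

J = j_factor(28.2e6, 10.0)      # hypothetical monitor: 28.2 Myr, ratio 10
age = arar_age(10.0, J)         # identical ratio recovers 28.2 Myr
biased = arar_age(10.0, j_factor(28.2e6 * 1.001, 10.0))   # 0.1% standard bias
```

A 0.1% bias in the assumed standard age shifts the unknown's age by essentially 0.1%, which is exactly why inter-laboratory agreement on monitor ages at that level is the bottleneck for U-Pb/Ar-Ar inter-comparison.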
ERIC Educational Resources Information Center
Yau, Shu Hui; Brock, Jon; McArthur, Genevieve
2016-01-01
It has been proposed that language impairments in children with Autism Spectrum Disorders (ASD) stem from atypical neural processing of speech and/or nonspeech sounds. However, the strength of this proposal is compromised by the unreliable outcomes of previous studies of speech and nonspeech processing in ASD. The aim of this study was to…
Should Secondary Schools Buy Local Area Networks?
ERIC Educational Resources Information Center
Hyde, Hartley
1986-01-01
The advantages of microcomputer networks include resource sharing, multiple user communications, and integrating data processing and office automation. This article nonetheless favors stand-alone computers for Australian secondary school classrooms because of unreliable hardware, software design, and copyright problems, and individual progress…
Software Prototyping: Designing Systems for Users.
ERIC Educational Resources Information Center
Spies, Phyllis Bova
1983-01-01
Reports on major change in computer software development process--the prototype model, i.e., implementation of skeletal system that is enhanced during interaction with users. Expensive and unreliable software, software design errors, traditional development approach, resources required for prototyping, success stories, and systems designer's role…
Travel behavior of U.S. domestic airline passengers and its impacts on infrastructure utilization
DOT National Transportation Integrated Search
2009-09-30
Unexpected and unannounced delays and cancellations of flights have emerged as a quasi-normal phenomenon in recent months and years. The airline unreliability has become unbearable day by day. The volume of airline passengers on domestic routes in...
Army Logistician. Volume 39, Issue 5, September-October 2007
2007-10-01
to amend the situation with record speed. Transfer of Authority: As the TOA neared, another oversight came to light. The 3d Brigade’s equipment was...geographically dispersed units and their supporting S–1s. However, several problems undercut this solution. Connectivity was unreliable, scanners and...the Army’s supply system has empowered organizations by providing near-real-time data to leaders throughout the battlefield. Collaboration among
Late formation of silicon carbide in type II supernovae
Liu, Nan; Nittler, Larry R.; Alexander, Conel M. O’D.; Wang, Jianhua
2018-01-01
We have found that individual presolar silicon carbide (SiC) dust grains from supernovae show a positive correlation between 49Ti and 28Si excesses, which is attributed to the radioactive decay of the short-lived (t½ = 330 days) 49V to 49Ti in the inner highly 28Si-rich Si/S zone. The 49V-49Ti chronometer shows that these supernova SiC dust grains formed at least 2 years after their parent stars exploded. This result supports recent dust condensation calculations that predict a delayed formation of carbonaceous and SiC grains in supernovae. The astronomical observation of continuous buildup of dust in supernovae over several years can, therefore, be interpreted as a growing addition of C-rich dust to the dust reservoir in supernovae. PMID:29376119
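The constraint works through simple decay arithmetic: a grain that condenses t days after the explosion incorporates only the 49V still alive at time t, and the correlated 49Ti excess records that fraction. A sketch (half-life from the text; the two-year figure is the paper's result, used here only as an input):

```python
T_HALF_V49 = 330.0   # 49V half-life, days

def v49_alive(days_after_explosion):
    """Fraction of the initial 49V still undecayed after a given delay."""
    return 0.5 ** (days_after_explosion / T_HALF_V49)

# A grain condensing 2 years (~730 d) after the explosion traps only
# about a fifth of the initial 49V; the rest has already decayed to
# 49Ti in the gas before condensation.
late = v49_alive(730.0)
```

The measured 49Ti/28Si correlation thus sets a lower bound on the condensation delay: the smaller the trapped 49V signature, the later the grain must have formed.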
Unreliability as a Threat to Understanding Psychopathology: The Cautionary Tale of Attentional Bias
Rodebaugh, Thomas L.; Scullin, Rachel B.; Langer, Julia K.; Dixon, David J.; Huppert, Jonathan D.; Bernstein, Amit; Zvielli, Ariel; Lenze, Eric J.
2016-01-01
The use of unreliable measures constitutes a threat to our understanding of psychopathology, because advancement of science using both behavioral and biologically-oriented measures can only be certain if such measurements are reliable. Two pillars of NIMH’s portfolio – the Research Domain Criteria (RDoC) initiative for psychopathology and the target engagement initiative in clinical trials – cannot succeed without measures that possess the high reliability necessary for tests involving mediation and selection based on individual differences. We focus on the historical lack of reliability of attentional bias measures as an illustration of how reliability can pose a threat to our understanding. Our own data replicate previous findings of poor reliability for traditionally-used scores, which suggests a serious problem with the ability to test theories regarding attentional bias. This lack of reliability may also suggest problems with the assumption (in both theory and the formula for the scores) that attentional bias is consistent and stable across time. In contrast, measures accounting for attention as a dynamic process in time show good reliability in our data. The field is sorely in need of research reporting findings and reliability for attentional bias scores using multiple methods, including those focusing on dynamic processes over time. We urge researchers to test and report reliability of all measures, considering findings of low reliability not just as a nuisance but as an opportunity to modify and improve upon the underlying theory. Full assessment of reliability of measures will maximize the possibility that RDoC (and psychological science more generally) will succeed. PMID:27322741
The role of fission in Supernovae r-process nucleosynthesis
NASA Astrophysics Data System (ADS)
Otsuki, Kaori; Kajino, Toshitaka; Sumiyoshi, Kosuke; Ohta, Masahisa; Mathews, J. Grant
2001-10-01
The r-process elements are presumed to be produced in an explosive environment with a short timescale at high entropy, such as a type-II supernova explosion. An intense flux of free neutrons is absorbed successively by seed elements to form the nuclear reaction flow through extremely unstable nuclei on the neutron-rich side. It thus probes our knowledge of the properties of nuclei far from beta stability. It is also important in astronomy, since this process forms the long-lived nuclear chronometers thorium and uranium that are utilised for dating the age of the Milky Way. In our previous work, we showed that successful r-process nucleosynthesis can occur above a young, hot protoneutron star. Although these long-lived heavy elements are produced in amounts comparable to observations in several of the supernova models we constructed, fission and alpha-decay were not included there. The fission products could play an important role in setting the actinide yields that are used as cosmochronometers. In this talk, we report the influence of fission on actinide yields and on the estimate of the Galactic age as well. We also discuss fission yields at lighter elements (Z ~ 50).
NASA Astrophysics Data System (ADS)
Schepen, Andrew; Zhao, Tongtiegang; Wang, Quan J.; Robertson, David E.
2018-03-01
Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.
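The Schaake Shuffle step mentioned above reorders calibrated ensemble members at each lead time so that their rank structure follows a template (typically historical observations), restoring realistic temporal correlation across lead times. A minimal pure-Python sketch of the rank-matching idea, not the RPP-S implementation:

```python
def schaake_shuffle(forecast, template):
    """forecast, template: lists of member trajectories (rows) over
    lead times (columns), same shape. Returns the forecast values
    reordered so that, at every lead time, the member ranks follow
    the ranks of the template."""
    n_members = len(forecast)
    n_leads = len(forecast[0])
    out = [[0.0] * n_leads for _ in range(n_members)]
    for j in range(n_leads):
        fc_sorted = sorted(row[j] for row in forecast)
        # order[rank] = index of the member holding that rank in the template
        order = sorted(range(n_members), key=lambda i: template[i][j])
        for rank, i in enumerate(order):
            out[i][j] = fc_sorted[rank]
    return out

fc = [[3.0, 1.0], [1.0, 3.0], [2.0, 2.0]]
tmpl = [[0.1, 0.9], [0.5, 0.1], [0.9, 0.5]]
shuffled = schaake_shuffle(fc, tmpl)   # values kept, ranks follow tmpl
```

Because only the ordering changes, the calibrated marginal distribution at each lead time is preserved exactly; only the joint (temporal) structure is borrowed from the template.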
From Dates to Rates: The Emergence of Integrated Geochronometry (Invited)
NASA Astrophysics Data System (ADS)
Hodges, K. V.; Adams, B. A.; Bohon, W.; Cooper, F. J.; Tripathy-Lang, A.; Van Soest, M. C.; Watson, E. B.; Young, K. E.
2013-12-01
Many applications of isotope geochemistry to telling time have involved geochronology - the measurement of the crystallization age of a mineral - or thermochronology, the measurement of the time at which a mineral cooled through an estimated closure temperature. The resulting data typically provide one or two points along an evolving temperature-time (Tt) path. Unfortunately, many problems require a richer knowledge of longer portions of the Tt path and thus the integrated application of multiple chronometers to individual minerals or suites of minerals from a particular sample or outcrop. In this presentation, we review some of the most recent advances in geochronometry, the direct dating of rates of a wide range of geologic processes on timescales ranging from seconds (in the case of bolide impact on Earth and elsewhere in the Solar System) to hundreds of millions of years (in the case of very slowly cooled Precambrian terrains). For all chronometers except those based on the production of fission tracks, our capacity to extract precise and accurate Tt paths depends on a good understanding of the kinetics of diffusive loss of radiogenic daughter isotopes. Laboratory experiments have substantially improved our understanding of nominal kinetic parameters in recent years, but our increased use of new methods for their determination (e.g., Rutherford backscattering spectroscopy, nuclear reaction analysis, and laser depth profiling) have demonstrated complexities related to compositional variations and asymmetric diffusion. At the same time, a growing number of geologic applications of these chronometers illustrate the importance of deformation history and radiation damage in modifying effective diffusion parameters. Such factors have two important implications for geochronometry. 
First, they suggest that studies of multiple minerals employing multiple isotopic methods - integrated geochronometry - are likely to produce more robust constraints on Tt paths than those involving the application of a single geochronometer. Second, they suggest that characterization of the chemistry and structure of minerals prior to dating may become standard procedure in most laboratories. Some of the most valuable constraints on the cooling histories of individual crystals come from microanalytical techniques that illuminate natural diffusive loss profiles, either directly (e.g., laser and ion microprobe mapping) or indirectly (e.g., 40Ar/39Ar and 4He/3He incremental heating experimentation). For most materials and most cooling histories, direct microanalytical approaches yield less spatial resolution and thus a poorer resolution of the cooling history. On the other hand, the extraction of cooling histories based on data obtained through indirect techniques requires significant simplifying assumptions regarding the three-dimensional distribution of parent isotopes that are not always warranted. Studies that integrate such techniques, rare in the literature thus far, are ushering in a new era of quantitative geochronometry.
Reliable Radiation Hybrid Maps: An Efficient Scalable Clustering-based Approach
USDA-ARS?s Scientific Manuscript database
The process of mapping markers from radiation hybrid mapping (RHM) experiments is equivalent to the traveling salesman problem and, thereby, has combinatorial complexity. As an additional problem, experiments typically result in some unreliable markers that reduce the overall quality of the map. We ...
Constraint of the 13C(α,n) Cross Section Toward Astrophysical Energies for the Main s-Process
NASA Astrophysics Data System (ADS)
Toomey, Rebecca; Febbraro, Michael T.; Pain, Steven D.; Peters, William A.; Cizewski, Jolie A.; Havener, Charles C.; Bannister, Mark E.; Chipps, Kelly A.; Walter, David G.; Ummel, Chad C.; Sims, Harrison
2017-09-01
The slow neutron capture process (s-process) typically occurs in relatively low neutron flux environments, such as AGB stars, and is a key mechanism in heavy-element synthesis. The dominant source of neutrons for the main s-process is the 13C(α,n) reaction, which proceeds at stellar temperatures (~0.1 GK, ~200 keV) via reactions well below the Coulomb barrier. Direct measurements of the reaction rate in the Gamow window (~140-230 keV) are difficult, complicated by the low yields and high beam currents required. Current measurements have constrained the cross section down to approximately 320 keV - still well above stellar conditions - with significant statistical uncertainties. These uncertainties, and the influence of a near-threshold 1/2+ state at 6.4 MeV, mean that extrapolation of the data into the Gamow window is unreliable. These measurements typically use high-efficiency moderated neutron counters, meaning that energy information about the incident neutrons is lost. A quasi-spectroscopic approach has been used to measure the 13C(α,n) reaction rate at energies between 300-350 keV with the aim of reducing uncertainties in current measurements. Work supported in part by the U.S. D.O.E., the National Science Foundation, and the LDRD Program of ORNL, managed by UT-Battelle, LLC.
China’s Currency: Economic Issues and Options for U.S. Trade Policy
2008-01-09
order to foster economic stability and investor confidence, a policy that is practiced by a variety of developing countries. Chinese officials have...powerful in theory, it has been proven to be unreliable in reality: prices are consistently lower in developing countries than industrialized countries...total U.S. bilateral trade deficits in 2006, indicating that the overall U.S. trade deficit is not caused by the exchange rate policy of one country
Cell Phones and Sun Shadows: Exploring the Equation of Time
ERIC Educational Resources Information Center
Madden, Sean P.
2010-01-01
For thousands of years before the invention of reliable clocks, humans measured their days by the motion of the sun. Astronomically, one day was the length of time it took for the sun to return to the same position in the sky. With the advent of precise mechanical chronometers such as Harrison's timekeepers (Sobel and Andrewes 1998), which ran at…
Gilbert [Gilberd], William (1544-1603)
NASA Astrophysics Data System (ADS)
Murdin, P.
2000-11-01
Doctor and scientist, born in Colchester, England; wrote De Magnete (On the Magnet), published in 1600. The magnetic compass was one of the most useful navigational instruments before the chronometer, but little was known about the lodestone (magnetic iron ore). Gilbert made his own experiments, such as testing the folk belief that garlic destroys the magnetic effect of the compass needle. He dra...
Reducing measurement errors during functional capacity tests in elders.
da Silva, Mariane Eichendorf; Orssatto, Lucas Bet da Rosa; Bezerra, Ewertton de Souza; Silva, Diego Augusto Santos; Moura, Bruno Monteiro de; Diefenthaeler, Fernando; Freitas, Cíntia de la Rocha
2018-06-01
Accuracy is essential to the validity of functional capacity measurements. The aim of this study was to evaluate the error of measurement of functional capacity tests for elders and to suggest the use of the technical error of measurement and the credibility coefficient. Twenty elders (65.8 ± 4.5 years) completed six functional capacity tests that were simultaneously filmed and timed by four evaluators by means of a chronometer. A fifth evaluator timed the tests by analyzing the videos (reference data). The means of most evaluators for most tests differed from the reference (p < 0.05), except for two evaluators on two different tests. The technical error of measurement differed between tests and evaluators. The Bland-Altman test showed a difference in the concordance of results between methods. Short-duration tests showed a higher technical error of measurement than longer tests. In summary, tests timed with a chronometer underestimate the real results of the functional capacity tests. Differences between evaluators' reaction time and their perception of the start and end of the tests would explain the errors of measurement. Calculating the technical error of measurement or using the camera can increase data validity.
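The technical error of measurement the authors recommend is computed from paired trials as sqrt(Σd²/2n), where d is the within-subject difference between the two measurements. A sketch with hypothetical timing data (not the study's values):

```python
import math

def technical_error_of_measurement(trial1, trial2):
    """Absolute TEM for paired measurements of the same subjects:
    sqrt(sum(d_i^2) / 2n), with d_i the within-subject difference."""
    d2 = [(a - b) ** 2 for a, b in zip(trial1, trial2)]
    return math.sqrt(sum(d2) / (2 * len(d2)))

def relative_tem(trial1, trial2):
    """%TEM: the absolute TEM expressed as a percentage of the grand mean,
    which lets tests of different durations be compared."""
    tem = technical_error_of_measurement(trial1, trial2)
    grand_mean = (sum(trial1) + sum(trial2)) / (len(trial1) + len(trial2))
    return 100.0 * tem / grand_mean

# Hypothetical timings (s) of the same test by two evaluators:
t1 = [10.0, 12.0, 11.0, 13.0]
t2 = [10.2, 11.8, 11.4, 12.6]
tem = technical_error_of_measurement(t1, t2)
```

The relative form explains the study's finding that short tests fare worse: a fixed reaction-time error of a few tenths of a second is a much larger fraction of a short test's duration.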
Bollard, Jean; Connelly, James N.; Bizzarro, Martin
2016-01-01
The CB chondrites are metal-rich meteorites with characteristics that sharply distinguish them from other chondrite groups. Their unusual chemical and petrologic features and a young formation age of bulk chondrules dated from the CBa chondrite Gujba are interpreted to reflect a single-stage impact origin. Here, we report high-precision internal isochrons for four individual chondrules of the Gujba chondrite to probe the formation history of CB chondrites and evaluate the concordancy of relevant short-lived radionuclide chronometers. All four chondrules define a brief formation interval with a weighted mean age of 4562.49 ± 0.21 Myr, consistent with its origin from the vapor-melt impact plume generated by colliding planetesimals. Formation in a debris disk mostly devoid of nebular gas and dust sets an upper limit for the solar protoplanetary disk lifetime at 4.8 ± 0.3 Myr. Finally, given the well-behaved Pb-Pb systematics of all four chondrules, a precise formation age and the concordancy of the Mn-Cr, Hf-W, and I-Xe short-lived radionuclide relative chronometers, we propose that Gujba may serve as a suitable time anchor for these systems. PMID:27429545
NASA Astrophysics Data System (ADS)
Wang, Deng
2018-06-01
To explore whether there is new physics beyond the standard cosmological model, we constrain seven cosmological models by combining the latest and largest Pantheon Type Ia supernovae sample with the data combination of baryonic acoustic oscillations, cosmic microwave background radiation, Planck lensing and cosmic chronometers. We find that a spatially flat universe is preferred in the framework of ΛCDM cosmology, that the constrained equation of state of dark energy is very consistent with the cosmological constant hypothesis in the ωCDM model, that there is no evidence of dynamical dark energy in the dark energy density-parametrization model, that there is no hint of interaction between dark matter and dark energy in the dark sector of the universe in the decaying vacuum model, and that there is no evidence for a sterile neutrino in the neutrino sector of the universe in the ΛCDM model. We also give a 95% upper limit on the total mass of the three active neutrinos, Σmν < 0.178 eV, under the assumption of the ΛCDM scenario. It is clear that there is no departure from the standard cosmological model based on current observational datasets.
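Cosmic-chronometer H(z) measurements are compared against model expansion histories such as flat ΛCDM; a sketch of that baseline curve with illustrative parameter values (not the paper's fitted ones):

```python
import math

def hubble_lcdm(z, h0=67.4, omega_m=0.315):
    """Expansion rate H(z) in km/s/Mpc for a spatially flat LCDM
    universe; cosmic-chronometer H(z) points are tested against
    curves like this. Parameter values here are illustrative."""
    omega_l = 1.0 - omega_m   # flatness: densities sum to 1
    return h0 * math.sqrt(omega_m * (1.0 + z) ** 3 + omega_l)
```

Each alternative in the paper (ωCDM, decaying vacuum, sterile neutrinos, ...) modifies this H(z) in a characteristic way, so agreement of the data with the plain ΛCDM curve is what "no departure" means operationally.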
Instrumentation development for In Situ 40Ar/39Ar planetary geochronology
Morgan, Leah; Munk, Madicken; Davidheiser-Kroll, Brett; Warner, Nicholas H.; Gupta, Sanjeev; Slaybaugh, Rachel; Harkness, Patrick; Mark, Darren
2017-01-01
The chronology of the Solar System, particularly the timing of formation of extra-terrestrial bodies and their features, is an outstanding problem in planetary science. Although various chronological methods for in situ geochronology have been proposed (e.g., Rb-Sr, K-Ar), and even applied (K-Ar), the reliability, accuracy, and applicability of the 40Ar/39Ar method make it by far the most desirable chronometer for dating extra-terrestrial bodies. The method, however, relies on the neutron irradiation of samples, and thus a neutron source. Herein, we discuss the challenges and feasibility of deploying a passive neutron source to planetary surfaces for the in situ application of the 40Ar/39Ar chronometer. Requirements for generating and shielding neutrons, as well as for analysing samples, are described, along with an exploration of limitations such as mass, power and cost. Two potential solutions for the in situ extra-terrestrial deployment of the 40Ar/39Ar method are presented. Although this represents a challenging task, developing the technology to apply the 40Ar/39Ar method on planetary surfaces would represent a major advance towards constraining the timescale of solar system formation and evolution.
A hierarchical approach to reliability modeling of fault-tolerant systems. M.S. Thesis
NASA Technical Reports Server (NTRS)
Gossman, W. E.
1986-01-01
A methodology for performing fault-tolerant system reliability analysis is presented. The method decomposes a system into its subsystems, evaluates event rates derived from each subsystem's conditional state probability vector, and incorporates those results into a hierarchical Markov model of the system. This is done in a manner that addresses the failure-sequence dependence associated with the system's redundancy management strategy. The method is derived for application to a specific system definition. Results are presented that compare the hierarchical model's unreliability prediction to that of a more complicated standard Markov model of the system. The results for the example given indicate that the hierarchical method predicts system unreliability to a desirable level of accuracy while achieving significant computational savings relative to a component-level Markov model of the system.
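The kind of Markov reliability model discussed here can be illustrated with a tiny duplex example: states for "both units up", "one up", and "system failed", with the state probability vector integrated forward by a simple explicit Euler step. The failure rates are illustrative, not the thesis's model:

```python
import math

def step(probs, rates, dt):
    """One explicit Euler step of a small Markov reliability model.
    probs: state probability vector; rates[i][j]: transition rate i -> j."""
    new = probs[:]
    n = len(probs)
    for i in range(n):
        for j in range(n):
            if i != j and rates[i][j]:
                flow = probs[i] * rates[i][j] * dt
                new[i] -= flow
                new[j] += flow
    return new

# Duplex system: state 0 = both units up, 1 = one up, 2 = system failed.
lam = 1e-3                                 # per-hour unit failure rate (illustrative)
rates = [[0, 2 * lam, 0],
         [0, 0, lam],
         [0, 0, 0]]
p = [1.0, 0.0, 0.0]
for _ in range(1000):                      # integrate to t = 1000 h with dt = 1 h
    p = step(p, rates, 1.0)
unreliability = p[2]                       # analytic value: (1 - e**-1)**2
```

A hierarchical approach would solve small chains like this per subsystem and feed the resulting event rates upward, rather than enumerating every component-level state combination in one large chain.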
China’s Currency: Economic Issues and Options for U.S. Trade Policy
2007-07-15
a policy that is practiced by a variety of developing countries. Chinese officials have expressed concern that abandoning the current currency...has been proven to be unreliable in reality: prices are consistently lower in developing countries than industrialized countries. Some economists have...not caused by the exchange rate policy of one country, but rather the shortfall between U.S. saving and investment. That being said, there are a
Specification and Verification of Communication Protocols in AFFIRM Using State Transition Models.
1981-03-01
NewQueueOfPacket; theorem PendingInvariant, Remove(Pending(s)) = NewQueueOfPacket; Since the implementation is in keeping with the specification, its salp ...another communication line. The communication lines are unreliable; messages traveling in either direction can be lost, reordered, corrupted, or
Vlastarakos, Petros V; Vasileiou, Alexandra; Nikolopoulos, Thomas P
2017-12-01
We conducted an analysis to assess the relative contribution of auditory brainstem response (ABR) testing and auditory steady-state response (ASSR) testing in providing appropriate hearing aid fitting in hearing-impaired children with difficult or unreliable behavioral audiometry. Of 150 infants and children who had been referred to us for hearing assessment as part of a neonatal hearing screening and cochlear implantation program, we identified 5 who exhibited significant discrepancies between click-ABR and ASSR testing results and difficult or unreliable behavioral audiometry. Hearing aid fitting in pediatric cochlear implant candidates for a trial period of 3 to 6 months is a common practice in many implant programs, but monitoring the progress of the amplified infants and providing appropriate hearing aid fitting can be challenging. If we accept the premise that we can assess the linguistic progress of amplified infants with an acceptable degree of certainty, the auditory behavior that we are monitoring presupposes appropriate bilateral hearing aid fitting. This may become very challenging in young children, or even in older children with difficult or unreliable behavioral audiometry results. This challenge can be addressed by using data from both ABR and ASSR testing. Fitting attempts that employ data from only ABR testing provide amplification that covers the range of spoken language but is not frequency-specific. Hearing aid fitting should also incorporate and take into account ASSR data because reliance on ABR testing alone might compromise the validity of the monitoring process. In conclusion, we believe that ASSR threshold-based bilateral hearing aid fitting is necessary to provide frequency-specific amplification of hearing and appropriate progress in the prelinguistic vocalizations of monitored infants.
When Is Information Sufficient for Action? Search with Unreliable Yet Informative Intelligence
2016-03-30
information: http://pubsonline.informs.org When Is Information Sufficient for Action? Search with Unreliable yet Informative Intelligence Michael Atkinson... Search with Unreliable yet Informative Intelligence. Operations Research. Published online in Articles in Advance, 30 Mar 2016. http://dx.doi.org/10.1287...print) · ISSN 1526-5463 (online) http://dx.doi.org/10.1287/opre.2016.1488 © 2016 INFORMS When Is Information Sufficient for Action? Search with
The complete project will greatly increase the sustainability of small gasoline and/or diesel powered generators that are currently used to supplement or replace an unreliable power grid. This phase will develop the feedstock processing equipment needed to produce syngas bio-...
Ending Conflicts and Vandalism in Knowledge Collaboration of Social Media
ERIC Educational Resources Information Center
Zhao, Haifeng
2013-01-01
Social media provide a multitude of opportunities for knowledge contribution and sharing. However, the issue of content reliability has attracted widespread attention, especially on social media intended to be credible, such as Wikipedia. Despite Wikipedia's success with the open editing model, dissenting voices give rise to unreliable content due to two…
Temperature and humidity control in indirect calorimeter chambers
USDA-ARS?s Scientific Manuscript database
A three-chamber, indirect calorimeter has been a part of the Environmental Laboratory at the U.S. Meat Animal Research Center (MARC) for over 25 yr. Corrosion of the animal chambers and unreliable temperature control forced either major repairs or complete replacement. There is a strong demand for...
NASA Astrophysics Data System (ADS)
Landis, Joshua D.; Renshaw, Carl E.; Kaste, James M.
2016-05-01
Soil systems are known to be repositories for atmospheric carbon and metal contaminants, but the complex processes that regulate the introduction, migration and fate of atmospheric elements in soils are poorly understood. This gap in knowledge is attributable, in part, to the lack of an established chronometer that is required for quantifying rates of relevant processes. Here we develop and test a framework for adapting atmospheric lead-210 chronometry (210Pb; half-life 22 years) to soil systems. We propose a new empirical model, the Linked Radionuclide aCcumulation model (LRC, aka "lark"), that incorporates measurements of beryllium-7 (7Be; half-life 54 days) to account for 210Pb penetration of the soil surface during initial deposition, a process which is endemic to soils but omitted from conventional 210Pb models (e.g., the Constant Rate of Supply, CRS model) and their application to sedimentary systems. We validate the LRC model using the 1963-1964 peak in bomb-fallout americium-241 (241Am; half-life of 432 years) as an independent, corroborating time marker. In three different soils we locate a sharp 241Am weapons horizon at disparate depths ranging from 2.5 to 6 cm, but with concordant ages averaging 1967 ± 4 via the LRC model. Similarly, at one site contaminated with mercury (HgT) we find that the LRC model is consistent with the recorded history of Hg emission. The close agreement of Pb, Am and Hg behavior demonstrated here suggests that organo-metallic colloid formation and migration incorporates many trace metals in universal soil processes and that these processes may be described quantitatively using atmospheric 210Pb chronometry. The 210Pb models evaluated here show that migration rates of soil colloids on the order of 1 mm yr-1 are typical, but also that these rates vary systematically with depth and are attributable to horizon-specific processes of leaf-litter decay, eluviation and illuviation. 
We thus interpret 210Pb models to quantify (i) exposure of the soil system to atmospheric aerosol deposition in the context of (ii) organic carbon assimilation, colloid production, and advection through the soil profile. The behavior of some other elements, such as Cs, diverges from the conservative colloid behavior exemplified by Pb and Am, and in these cases the value of empirical 210Pb chronometry models like LRC and CRS is as a comparator rather than as an absolute chronometer. We conclude that 210Pb chronometry is valuable for tracing colloidally-mediated transport of Pb and similarly-refractory metals, as well as the mobile pool of carbon in soils.
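For reference, the conventional CRS model mentioned above dates a soil or sediment horizon from the unsupported 210Pb inventory beneath it. A minimal sketch follows, with hypothetical inventory values chosen only to show the shape of the calculation.

```python
import math

LAMBDA_PB210 = math.log(2.0) / 22.3   # 210Pb decay constant (half-life ~22.3 yr)

def crs_ages(inventory_below):
    """Constant Rate of Supply (CRS) ages: t(z) = (1/lambda) * ln(I_total / I(z)),
    where I(z) is the cumulative unsupported 210Pb inventory below depth z and
    I_total is the inventory below the surface."""
    total = inventory_below[0]
    return [math.log(total / i) / LAMBDA_PB210 for i in inventory_below]

# Hypothetical inventories (Bq m^-2) below the surface and two deeper horizons.
ages = crs_ages([500.0, 250.0, 100.0])   # [0.0, ~22.3, ~51.8] years
```

The LRC model proposed in the abstract modifies this framework by using 7Be to account for 210Pb penetrating the surface at deposition, a term the plain CRS calculation above omits.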
Legal Cynicism and Parental Appraisals of Adolescent Violence
Soller, Brian; Jackson, Aubrey L.; Browning, Christopher R.
2014-01-01
Research suggests that legal cynicism—a cultural frame in which the law is viewed as illegitimate and ineffective—encourages violence to maintain personal safety when legal recourse is unreliable. But no study has tested the impact of legal cynicism on appraisals of violence. Drawing from symbolic interaction theory and cultural sociology, we tested whether neighbourhood legal cynicism alters the extent to which parents appraise their children’s violence as indicative of aggressive or impulsive temperaments using data from the Project on Human Development in Chicago Neighborhoods. We find that legal cynicism attenuates the positive association between adolescent violence and parental assessments of aggression and impulsivity. Our study advances the understanding of micro-level processes through which prevailing cultural frames in the neighbourhood shape violence appraisals. PMID:24932013
The role of impact cratering for Mars sample return
NASA Technical Reports Server (NTRS)
Schultz, P. H.
1988-01-01
The preserved cratering record of Mars indicates that impacts play an important role in deciphering Martian geologic history, whether as a mechanism to modify the lithosphere and atmosphere or as a tool to sample the planet. The various roles of impact cratering in building a broader understanding of Mars through returned samples are examined. Five broad roles include impact craters as: (1) a process in response to a different planetary environment; (2) a probe for excavating crustal/mantle materials; (3) a possible localizer of magmatic and hydrothermal processes; (4) a chronicle of changes in the volcanic, sedimentary, atmospheric, and cosmic flux history; and (5) a chronometer for extending the geologic time scale to unsampled regions. The evidence for Earth-like processes and very nonlunar styles of volcanism and tectonism may shift the emphasis of a sampling strategy away from equally fundamental issues including crustal composition, unit ages, and climate history. Impact cratering not only played an important active role in early Martian geologic history, it also provides an important tool for addressing such issues.
2013-12-01
Safe Drinking Water Act and the Clean Water Act. • Potable water: According to Waterworks officials, Guam’s potable water system currently is in...noncompliance with the Safe Drinking Water Act. The unreliable drinking water distribution system has historically resulted in bacterial...Protection Consolidated Grants program, provided Guam with almost $6.8 million in fiscal year 2012 to fund drinking water and wastewater system
[How valid are student self-reports of bullying in schools?].
Morbitzer, Petra; Spröber, Nina; Hautzinger, Martin
2009-01-01
In this study we examine the reliability and validity of students' self-reports about bullying and victimization in schools. 208 fifth-grade students from four "middle schools" in Southern Germany filled in the Bully-Victim Questionnaire (Olweus, 1989, adapted by Lösel, Bliesener, & Averbeck, 1997) and the School Climate Survey (Brockenborough, 2001) to assess the prevalence of bullying/victimization and to evaluate attitudes towards aggression and support for victims. By using reliability and validity criteria, one third (31%) of the questionnaires was classified as "unreliable/invalid". Mean comparisons of the "unreliable/invalid" group and the "valid" group on the subscales concerning bullying/victimization found significant differences. The "unreliable/invalid" group stated higher values of bullying and victimization. Based on the "unreliable/invalid" questionnaires, more students could be identified as bullies, victims, or bully-victims. The prevalence of bullying/victimization in the whole sample was reduced if "unreliable/invalid" questionnaires were excluded. The results are discussed in the framework of theories about the presentation of the self ("impression management", "social desirability") and systematic response patterns ("extreme response bias").
Nearshore coastal mapping [in Lake Michigan and Puerto Rico]
NASA Technical Reports Server (NTRS)
Polcyn, F. C.; Lyzenga, D. R.
1975-01-01
Two test sites of different water quality and bottom topography were used to test for maximum water depth penetration using the Skylab S-192 MSS for measurement of nearshore coastal bathymetry. Sites under investigation lie along the Lake Michigan coastline, where littoral transport acts to erode sand bluffs and endangers developments along 1,200 miles of shore, and on the west coast of Puerto Rico, where unreliable shoal location and depth information constitutes a safety hazard to navigation. The S-192 and S-190A and B provide data on underwater features because of water transparency in the blue/green portion of the spectrum. Depths of 20 meters were measured with the S-192 at the Puerto Rico test site. The S-190B photography, with its improved spatial resolution, clearly delineates the triple sand bar topography at the Lake Michigan test site. Several processing techniques were employed to test for maximum depth measurement with least error. The results are useful for helping to determine an optimum spectral bandwidth for future space sensors that will increase depth measurements for different water attenuation conditions where a bottom reflection is detectable.
Gannoun, Abdelmouhcine; Boyet, Maud; Rizo, Hanika; El Goresy, Ahmed
2011-05-10
The short-lived (146)Sm-(142)Nd chronometer (T(1/2) = 103 Ma) is used to constrain the early silicate evolution of planetary bodies. The composition of bulk terrestrial planets is then considered to be similar to that of primitive chondrites, which represent the building blocks of rocky planets. However, for many elements chondrites preserve small isotope differences. In this case it is not always clear to what extent these variations reflect the isotope heterogeneity of the protosolar nebula rather than being produced by the decay of parent isotopes. Here we present Sm-Nd isotope data measured in a comprehensive suite of enstatite chondrites (EC). The EC preserve (142)Nd/(144)Nd ratios that range from those of ordinary chondrites to values similar to terrestrial samples. The EC having terrestrial (142)Nd/(144)Nd ratios are also characterized by small excesses of (144)Sm, which is a pure p-process nuclide. The correlation between (144)Sm and (142)Nd for chondrites may indicate a heterogeneous distribution in the solar nebula of p-process matter synthesized in supernovae. However, to explain the difference in (142)Nd/(144)Nd ratios, a 20% p-process contribution to (142)Nd is required, at odds with the value of 4% currently proposed in stellar models. This study highlights the necessity of obtaining high-precision (144)Sm measurements to interpret properly measured (142)Nd signatures. Another explanation could be that the chondrites sample material formed in different pulses during the lifetime of asymptotic giant branch stars. In that case, the isotope signature measured in presolar SiC would not represent the unique s-process signature of the material present in the solar nebula during accretion.
Mars chronology: Assessing techniques for quantifying surficial processes
Doran, P.T.; Clifford, S.M.; Forman, S.L.; Nyquist, Larry; Papanastassiou, D.A.; Stewart, B.W.; Sturchio, N.C.; Swindle, T.D.; Cerling, T.; Kargel, J.; McDonald, G.; Nishiizumi, K.; Poreda, R.; Rice, J.W.; Tanaka, K.
2004-01-01
Currently, the absolute chronology of Martian rocks, deposits and events is based mainly on crater counting and remains highly imprecise with epoch boundary uncertainties in excess of 2 billion years. Answers to key questions concerning the comparative origin and evolution of Mars and Earth will not be forthcoming without a rigid Martian chronology, enabling the construction of a time scale comparable to Earth's. Priorities for exploration include calibration of the cratering rate, dating major volcanic and fluvial events and establishing chronology of the polar layered deposits. If extinct and/or extant life is discovered, the chronology of the biosphere will be of paramount importance. Many radiometric and cosmogenic techniques applicable on Earth and the Moon will apply to Mars after certain baselines (e.g. composition of the atmosphere, trace species, chemical and physical characteristics of Martian dust) are established. The high radiation regime may pose a problem for dosimetry-based techniques (e.g. luminescence). The unique isotopic composition of nitrogen in the Martian atmosphere may permit a Mars-specific chronometer for tracing the time-evolution of the atmosphere and of lithic phases with trapped atmospheric gases. Other Mars-specific chronometers include measurement of gas fluxes and accumulation of platinum group elements (PGE) in the regolith. Putting collected samples into geologic context is deemed essential, as is using multiple techniques on multiple samples. If in situ measurements are restricted to a single technique it must be shown to give consistent results on multiple samples, but in all cases, using two or more techniques (e.g. on the same lander) will reduce error. While there is no question that returned samples will yield the best ages, in situ techniques have the potential to be flown on multiple missions providing a larger data set and broader context in which to place the more accurate dates. © 2004 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Grocke, S. B.; Andrews, B. J.; Manga, M.; Quinn, E. T.
2015-12-01
Dacite lavas from Chaos Crags, Lassen Volcanic Center, CA contain inclusions of more mafic magmas, suggesting that mixing or mingling of magmas occurred just prior to lava dome extrusion, and perhaps triggered the eruption. The timescales between the mixing event and eruption are unknown, but reaction rims on biotite grains hosted in the Chaos Crags dacite may provide a record of the timescale (i.e., chronometer) between mixing and eruption. To quantify the effect of pre-eruptive heating on the formation of reaction rims on biotite, we conducted isobaric (150 MPa), H2O-saturated, heating experiments on the dacite end-member. In heating experiments, we held the natural dacite at 800°C and 150 MPa for 96 hours and then isobarically heated the experiments to 825 and 850°C (temperatures above the biotite liquidus, <815°C at 150 MPa) for durations ≤96 hours. We analyzed run products using high-resolution SEM imaging and synchrotron-based X-ray tomography, which provides a 3-dimensional rendering of biotite breakdown reaction products and textures. X-ray tomography images of experimental run products reveal that in all heating experiments, biotite breakdown occurs and reaction products include orthopyroxenes, Fe-Ti oxides, and vapor (inferred from the presence of bubbles). Experiments heated to 850°C for 96 h show extensive breakdown, consisting of large orthopyroxene crystals, Fe-Ti oxide laths (<100 μm), and bubbles. When the process of biotite breakdown goes to completion, the resulting H2O bubble comprises roughly the equivalent volume of the original biotite crystal. This observation suggests that biotite breakdown can add significant water to the melt and lead to extensive bubble formation. Although bubble expansion and magma flow may disrupt the reaction products in some magmas, our experiments suggest that biotite breakdown textures in natural samples can be used as a chronometer for pre-eruptive magma mixing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moresco, Michele; Cimatti, Andrea; Citro, Annalisa
2016-05-01
Deriving the expansion history of the Universe is a major goal of modern cosmology. To date, the most accurate measurements have been obtained with Type Ia Supernovae (SNe) and Baryon Acoustic Oscillations (BAO), providing evidence for the existence of a transition epoch at which the expansion rate changes from decelerated to accelerated. However, these results have been obtained within the framework of specific cosmological models that must be implicitly or explicitly assumed in the measurement. It is therefore crucial to obtain measurements of the accelerated expansion of the Universe independently of assumptions on cosmological models. Here we exploit the unprecedented statistics provided by the Baryon Oscillation Spectroscopic Survey (BOSS, [1-3]) Data Release 9 to provide new constraints on the Hubble parameter H(z) using the cosmic chronometers approach. We extract a sample of more than 130000 of the most massive and passively evolving galaxies, obtaining five new cosmology-independent H(z) measurements in the redshift range 0.3 < z < 0.5, with an accuracy of ∼11–16% incorporating both statistical and systematic errors. Once combined, these measurements yield a 6% accuracy constraint of H(z = 0.4293) = 91.8 ± 5.3 km/s/Mpc. The new data are crucial to provide the first cosmology-independent determination of the transition redshift at high statistical significance, measuring z_t = 0.4 ± 0.1, and to significantly disfavor the null hypothesis of no transition between decelerated and accelerated expansion at the 99.9% confidence level. This analysis highlights the wide potential of the cosmic chronometers approach: it permits deriving constraints on the expansion history of the Universe that are competitive with standard probes, and most importantly, because the estimates are independent of the cosmological model, it can constrain cosmologies beyond, and including, the ΛCDM model.
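The core of the cosmic-chronometer estimator, H(z) = -(1/(1+z)) dz/dt, can be sketched as follows; the galaxy ages used are illustrative placeholders, not BOSS measurements.

```python
# Minimal sketch of the cosmic-chronometer (differential age) estimator.

KM_PER_MPC = 3.0857e19   # kilometres in a megaparsec
SEC_PER_GYR = 3.156e16   # seconds in a gigayear

def hubble_from_chronometers(z1, z2, age1_gyr, age2_gyr):
    """H(z) = -(1/(1+z)) * dz/dt from the differential age of passively
    evolving galaxies at two nearby redshifts, evaluated at the midpoint."""
    z_mid = 0.5 * (z1 + z2)
    dz = z2 - z1
    dt_gyr = age2_gyr - age1_gyr        # galaxies are older at higher z, so dt < 0
    h_inv_gyr = -dz / ((1.0 + z_mid) * dt_gyr)
    return h_inv_gyr * KM_PER_MPC / SEC_PER_GYR   # convert 1/Gyr to km/s/Mpc

h = hubble_from_chronometers(0.4, 0.45, 8.0, 7.63)   # roughly 90 km/s/Mpc here
```

Because only an age difference enters, the estimate needs no assumed cosmological model, which is exactly the model independence the abstract emphasizes.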
Holst, Jesper C.; Olsen, Mia B.; Paton, Chad; Nagashima, Kazuhide; Schiller, Martin; Wielandt, Daniel; Larsen, Kirsten K.; Connelly, James N.; Jørgensen, Jes K.; Krot, Alexander N.; Nordlund, Åke; Bizzarro, Martin
2013-01-01
Refractory inclusions [calcium–aluminum-rich inclusions, (CAIs)] represent the oldest Solar System solids and provide information regarding the formation of the Sun and its protoplanetary disk. CAIs contain evidence of now extinct short-lived radioisotopes (e.g., 26Al, 41Ca, and 182Hf) synthesized in one or multiple stars and added to the protosolar molecular cloud before or during its collapse. Understanding how and when short-lived radioisotopes were added to the Solar System is necessary to assess their validity as chronometers and constrain the birthplace of the Sun. Whereas most CAIs formed with the canonical abundance of 26Al corresponding to 26Al/27Al of ∼5 × 10−5, rare CAIs with fractionation and unidentified nuclear isotope effects (FUN CAIs) record nucleosynthetic isotopic heterogeneity and 26Al/27Al of <5 × 10−6, possibly reflecting their formation before canonical CAIs. Thus, FUN CAIs may provide a unique window into the earliest Solar System, including the origin of short-lived radioisotopes. However, their chronology is unknown. Using the 182Hf–182W chronometer, we show that a FUN CAI recording a condensation origin from a solar gas formed coevally with canonical CAIs, but with 26Al/27Al of ∼3 × 10−6. The decoupling between 182Hf and 26Al requires distinct stellar origins: steady-state galactic stellar nucleosynthesis for 182Hf and late-stage contamination of the protosolar molecular cloud by a massive star(s) for 26Al. Admixing of stellar-derived 26Al to the protoplanetary disk occurred during the epoch of CAI formation and, therefore, the 26Al–26Mg systematics of CAIs cannot be used to define their formation interval. In contrast, our results support 182Hf homogeneity and chronological significance of the 182Hf–182W clock. PMID:23671077
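For context, if the two initial 26Al/27Al ratios quoted above were read with a homogeneous-reservoir assumption, which this work argues is invalid for FUN CAIs, they would imply an apparent formation interval of roughly 2.9 Myr:

```python
import math

# 26Al half-life in Myr; the two initial ratios are those quoted in the text.
# This computes the *apparent* interval under a homogeneous-26Al assumption,
# precisely the assumption the study above shows to fail for FUN CAIs.
LAMBDA_AL26 = math.log(2.0) / 0.717   # Myr^-1

def apparent_interval_myr(ratio_early, ratio_late):
    """Apparent formation interval implied by two initial 26Al/27Al ratios."""
    return math.log(ratio_early / ratio_late) / LAMBDA_AL26

dt = apparent_interval_myr(5e-5, 3e-6)   # ~2.9 Myr apparent interval
```

The coeval 182Hf-182W ages for canonical and FUN CAIs show that this ~2.9 Myr is an artifact of 26Al heterogeneity rather than a real age difference.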
Process Research on Semix Silicon Materials (PROSSM)
NASA Astrophysics Data System (ADS)
Wohlgemuth, J. H.; Warfield, D. B.
1982-02-01
A cost-effective process sequence was identified, equipment was designed to implement a 6.6 MW per year automated production line, and a cost analysis projected a $0.56 per watt cell add-on cost for this line. Four process steps were developed for this program: glass beads back clean-up, hot spray antireflective coating, wave soldering of fronts, and ion milling for edging. While spray dopants were advertised as an off-the-shelf, fully developed product, they were unreliable, with a shorter than advertised shelf life.
Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T
2007-03-01
Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.
Planning effectiveness may grow on fault trees.
Chow, C W; Haddad, K; Mannino, B
1991-10-01
The first step of a strategic planning process--identifying and analyzing threats and opportunities--requires subjective judgments. By using an analytical tool known as a fault tree, healthcare administrators can reduce the unreliability of subjective decision making by creating a logical structure for problem solving and decision making. A case study of 11 healthcare administrators showed that an analysis technique called prospective hindsight can add to a fault tree's ability to improve a strategic planning process.
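A fault tree reduces a top event to basic events combined through logic gates. A minimal sketch follows, with made-up events and probabilities that are not taken from the 11-administrator case study.

```python
# Minimal fault-tree sketch: independent basic events combined via AND/OR gates.

def and_gate(*probs):
    """All inputs must occur: product of probabilities (independence assumed)."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(*probs):
    """Any input suffices: complement of the product of complements."""
    out = 1.0
    for p in probs:
        out *= 1.0 - p
    return 1.0 - out

# Top event "plan fails" if (forecast wrong AND no contingency) OR key staff lost.
p_top = or_gate(and_gate(0.30, 0.50), 0.05)   # = 0.1925
```

Decomposing the top event this way is what gives the planning process its logical structure: each subjective judgment is confined to one basic-event probability instead of the whole outcome.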
Worthy of His Sufferings: How Strategic Leaders Learned from Failure
2013-03-01
his career during America’s war with Mexico. In the course of the war, the accidental soldier proved to be a steady, introspective, yet aggressive...calm deportment under fire. Grant’s performance in Mexico earned him a reputation as a “man of fire” among his fellow soldiers, but more importantly...terrain, weather, unreliable logistics and a cholera outbreak. Even though many dependants died during the journey due to these circumstances
Analytic score distributions for a spatially continuous tridirectional Monte Carlo transport problem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Booth, T.E.
1996-01-01
The interpretation of the statistical error estimates produced by Monte Carlo transport codes is still somewhat of an art. Empirically, there are variance reduction techniques whose error estimates are almost always reliable, and there are variance reduction techniques whose error estimates are often unreliable. Unreliable error estimates usually result from inadequate large-score sampling from the score distribution's tail. Statisticians believe that more accurate confidence interval statements are possible if the general nature of the score distribution can be characterized. Here, the analytic score distribution for the exponential transform applied to a simple, spatially continuous Monte Carlo transport problem is provided. Anisotropic scattering and implicit capture are included in the theory. In large part, the analytic score distributions that are derived provide the basis for the ten new statistical quality checks in MCNP.
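The statistical quality checks alluded to can be illustrated with a toy tally: the sample mean, the relative error R, and the variance-of-the-variance (VOV), computed here for a well-behaved exponential score distribution. The thresholds quoted in the comment are the commonly cited MCNP guidance, used purely as an illustration.

```python
import random

def mc_tally(scores):
    """Sample mean, relative error R, and variance-of-the-variance (VOV):
    three of the statistical quality indicators used by codes such as MCNP.
    A heavy, undersampled tail inflates VOV long before it shows up in R."""
    n = len(scores)
    mean = sum(scores) / n
    c2 = sum((s - mean) ** 2 for s in scores)   # 2nd central sum
    c4 = sum((s - mean) ** 4 for s in scores)   # 4th central sum
    var_of_mean = c2 / (n * (n - 1))            # variance of the sample mean
    r = (var_of_mean ** 0.5) / mean             # relative error of the tally
    vov = c4 / (c2 * c2) - 1.0 / n              # variance of the variance estimate
    return mean, r, vov

random.seed(0)
scores = [random.expovariate(1.0) for _ in range(10000)]
mean, r, vov = mc_tally(scores)   # common MCNP guidance: R < 0.10, VOV < 0.10
```

For the exponential-transform problems analyzed in the paper, it is precisely the fourth-moment quantity VOV that degrades when the transform undersamples large scores.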
Social control of unreliable signals of strength in male but not female crayfish, Cherax destructor.
Walter, Gregory M; van Uitregt, Vincent O; Wilson, Robbie S
2011-10-01
The maintenance of unreliable signals within animal populations remains a highly controversial subject in studies of animal communication. Crustaceans are an ideal group for studying unreliable signals of strength because their chela muscles are cryptically concealed beneath an exoskeleton, making it difficult for competitors to visually assess an opponent's strength. In this study, we examined the importance of social avenues for mediating the possible advantages gained by unreliable signals of strength in crustaceans. To do this, we investigated the factors that determine social dominance and the relative importance of signalling and fighting during aggressive encounters in male and female freshwater crayfish, Cherax destructor. As in other species of crayfish, we expected substantial variation in weapon force for a given weapon size, making the assessment of actual fighting ability of an opponent difficult from signalling alone. In addition, we expected fighting would be used to ensure that individuals that are weak for their signal (i.e. chela) size would not achieve higher than expected dominance. For both male and female C. destructor, we found large variation in the actual force of their chela for any given weapon size, indicating that it is difficult for competitors to accurately assess an opponent's force on signal size alone. For males, these unreliable signals of strength were controlled socially through increased levels of fighting and a decreased reliance on signalling, thus directly limiting the benefits accrued to individuals employing high-quality signals (large chelae) with only low resource holding potential. However, in contrast to our predictions, we found that females primarily relied on signalling to settle disputes, resulting in unreliable signals of strength being routinely used to establish dominance.
The reliance by females on unreliable signals to determine dominance highlights our poor current understanding of the prevalence and distribution of dishonesty in animal communication.
Determination of Moulting Events in Rock Lobsters from Pleopod Clipping
Gardner, Caleb; Mills, David J.
2013-01-01
Rock lobster growth is routinely measured for research to optimise management measures such as size limits and quotas. The process of estimating growth is complicated in crustaceans as growth only occurs when the animal moults. As data are typically collected by tag-recapture methods, the timing of moulting events can bias results. For example, if annual moulting events take place within a very short time-at-large after tagging, or if time-at-large is long and no moulting occurs. Classifying data into cases where moulting has / has not occurred during time-at-large can be required and can generally be determined by change in size between release and recapture. However, in old or slow growth individuals the moult increment can be too small to provide surety that moulting has occurred. A method that has been used since the 1970’s to determine moulting in rock lobsters involves clipping the distal portion of a pleopod so that any regeneration observed at recapture can be used as evidence of a moult. We examined the use of this method in both tank and long-duration field trials within a marine protected area, which provided access to large animals with smaller growth increments. Our results emphasised that determination of moulting by change in size was unreliable with larger lobsters and that pleopod clipping can assist in identifying moulting events. However, regeneration was an unreliable measure of moulting if clipping occurred less than three months before the moult. PMID:24009769
Wet meadow ecosystems and the longevity of biologically-mediated geomorphic features
NASA Astrophysics Data System (ADS)
Nash, C.; Grant, G.; O'Connor, J. E.
2016-12-01
Upland meadows represent a ubiquitous feature of montane landscapes in the U.S. West and beyond. Characterized by flat valley floors flanked by higher-gradient hillslopes, these meadows are important features, both for the diverse ecosystems they support and because they represent depositional features in what is primarily an erosional environment. As such, they serve as long-term chronometers of both geological and ecological processes in a portion of the landscape where such records are rare, and provide a useful microcosm for exploring many of the questions motivating critical zone science. Specifically, meadows can offer insights into questions regarding the longevity of these biologically-mediated landscapes, and the geomorphic thresholds associated with transitions between metastable landscape states. Though categorically depositional, wet meadows have been shown to rapidly shift into erosional landscapes characterized by deep arroyos, declining water tables, and sparse, semi-arid ecosystems. Numerous hypotheses have been proposed explaining this shift: intensive ungulate usage, removal of beaver, climatic shifts, and intrinsic geomorphic evolution. Even less is known about the mechanisms controlling the construction of these meadow features. Evidence seems to suggest these channels oscillate between two metastable conditions: deeply incised, single-threaded channels and sheet-flow dominated valley-spanning wetlands. We present new evidence exploring the subsurface architecture of wet meadows and the bidirectional process cascades potentially responsible for their temporal evolution. Using a combination of near surface geophysical techniques and detailed stratigraphic descriptions of incised and un-incised meadows throughout the Silvies River Basin, OR, we examine mechanisms responsible both for the construction of these features and their apparently rapid transition from depositional to erosional.
Our investigation focuses specifically on potential interactions between biogenic and geomorphic features and processes: beaver meadow complexes, downed wood, and the accumulation of senescent vegetation to form thick peat mounds. These observations have broad potential utility to help guide meadow restoration efforts across the Western U.S.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kayzar, Theresa M.; Williams, Ross W.
2015-09-26
The model age or ‘date of purification’ of a nuclear material is an important nuclear forensic signature. In this study, chemical separation and MC-ICP-MS measurement techniques were developed for 226Ra and 227Ac: granddaughter nuclides in the 238U and 235U decay chains, respectively. The 230Th-234U, 226Ra-238U, 231Pa-235U, and 227Ac-235U radiochronometers were used to calculate model ages for CRM-U100 standard reference material and two highly-enriched pieces of uranium metal from the International Technical Working Group Round Robin 3 Exercise. The results demonstrate the accuracy of the 226Ra-238U and 227Ac-235U chronometers and provide information about nuclide migration during uranium processing.
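The model-age calculation behind these radiochronometers can be sketched with the standard two-member ingrowth model: assume the daughter was completely removed at purification, then solve the parent-daughter ratio for elapsed time. The function name is illustrative, the half-lives are literature values, and the example ratio is hypothetical.

```python
import math

LN2 = math.log(2.0)

def model_age(ratio, half_life_parent, half_life_daughter):
    """Model age (years) from a daughter/parent atom ratio, assuming complete
    daughter removal at purification (the standard radiochronometric model).
    Ingrowth: N_d/N_p = lp/(ld-lp) * (1 - exp(-(ld-lp)*t)); solve for t."""
    lp = LN2 / half_life_parent
    ld = LN2 / half_life_daughter
    k = ld - lp
    return -math.log(1.0 - ratio * k / lp) / k

# 230Th/234U chronometer (half-lives in years); the ratio here is made up
# and corresponds to a material purified roughly 20 years ago.
t = model_age(ratio=5.65e-5, half_life_parent=245_500, half_life_daughter=75_690)
```

The same function applies to the other pairs named in the abstract once the appropriate half-lives and measured ratios are substituted; a real analysis also propagates decay-constant and measurement uncertainties.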
Reflectometry diagnostics on TCV
NASA Astrophysics Data System (ADS)
Molina Cabrera, Pedro; Coda, Stefano; Porte, Laurie; Offeddu, Nicola; Tcv Team
2017-10-01
Both profile reflectometer and Doppler back-scattering (DBS) diagnostics are being developed for the TCV Tokamak using a steerable quasi-optical launcher and universal polarizers. First results will be presented. A pulse reflectometer is being developed to complement Thomson Scattering measurements of electron density, greatly increasing temporal resolution and also effectively enabling fluctuation measurements. Pulse reflectometry consists of sending short pulses of varying frequency and measuring the roundtrip group-delay with precise chronometers. A fast arbitrary waveform generator is used as a pulse source feeding frequency multipliers that bring the pulses to V-band. A DBS diagnostic is currently operational in TCV. DBS may be used to infer the perpendicular velocity and wave number spectrum of electron density fluctuations in the 3-15 cm-1 wave-number range. Off-the-shelf transceiver modules, originally used for VNA measurements, are being used in a Doppler radar configuration. See author list of S. Coda et al., 2017 Nucl. Fusion 57 102011.
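As a rough sketch of the physics behind pulse reflectometry (assumed textbook relations, not the TCV implementation): an O-mode wave reflects at the layer where the local plasma frequency equals the wave frequency, and the measured roundtrip group delay maps to a reflecting-layer range. Function names and the naive vacuum-delay conversion are illustrative; a real profile inversion must account for plasma dispersion along the path.

```python
import math

def o_mode_cutoff_density(f_hz):
    """Electron density (cm^-3) at which an O-mode wave of frequency f_hz is
    reflected: the plasma-frequency cutoff f_pe ≈ 8980 * sqrt(n_e) Hz."""
    return (f_hz / 8980.0) ** 2

# A V-band pulse at 60 GHz reflects where n_e ≈ 4.5e13 cm^-3,
# a typical tokamak core-range density.
n_c = o_mode_cutoff_density(60e9)

def vacuum_equivalent_distance(group_delay_s):
    """Naive roundtrip range from a measured group delay, ignoring dispersion
    (which the actual inversion to a density profile must include)."""
    c = 2.998e8  # m/s
    return c * group_delay_s / 2.0
```

Sweeping the pulse frequency through the band then probes successively deeper density layers, which is what lets short-pulse timing reconstruct a density profile.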
Model-based pH monitor for sensor assessment.
van Schagen, Kim; Rietveld, Luuk; Veersma, Alex; Babuska, Robert
2009-01-01
Owing to the nature of the treatment processes, monitoring the processes based on individual online measurements is difficult or even impossible. However, the measurements (online and laboratory) can be combined with a priori process knowledge, using mathematical models, to objectively monitor the treatment processes and measurement devices. The pH measurement is a commonly used measurement at different stages in the drinking water treatment plant, although it is an unreliable instrument requiring significant maintenance. It is shown that, using a grey-box model, it is possible to assess the measurement devices effectively, even if detailed information of the specific processes is unknown.
Challenges and opportunities with spin-based logic
NASA Astrophysics Data System (ADS)
Perricone, Robert; Niemier, Michael; Hu, X. Sharon
2017-09-01
In this paper, we provide a short overview of efforts to process information with spin as a state variable. We highlight initial efforts in spintronics where device concepts such as spin waves, field-coupled nanomagnets, etc. were considered as vehicles for processing information. We also highlight more recent work where spintronic logic and memory devices are considered in the context of information processing hardware for the internet of things (IoT), and where the ability to constantly "checkpoint" processor state can support computing in environments with unreliable power supplies.
Direct costs of unintended pregnancy in the Russian federation.
Lowin, Julia; Jarrett, James; Dimova, Maria; Ignateva, Victoria; Omelyanovsky, Vitaly; Filonenko, Anna
2015-02-01
In 2010, almost every third pregnancy in Russia was terminated, indicating that unintended pregnancy (UP) is a public health problem. The aim of this study was to estimate the direct cost of UP to the healthcare system in Russia and the proportion attributable to using unreliable contraception. A cost model was built, adopting a generic payer perspective with a 1-year time horizon. The analysis cohort was defined as women of childbearing age between 18 and 44 years actively seeking to avoid pregnancy. Model inputs were derived from published sources or government statistics with a 2012 cost base. To estimate the number of UPs attributable to unreliable methods, the model combined annual typical use failure rates and age-adjusted utilization for each contraceptive method. Published survey data were used to adjust the total cost of UP by the number of UPs that were mistimed rather than unwanted. Scenario analysis considered alternate allocation of methods to the reliable and unreliable categories and an estimate of the burden of UP in the target sub-group of women aged 18-29 years. The model estimated 1,646,799 UPs in the analysis cohort (women aged 18-44 years) with an associated annual cost of US$783 million. The model estimated 1,019,371 UPs in the target group of 18-29 years, of which 88 % were attributable to unreliable contraception. The total cost of UPs in the target group was estimated at approximately US$498 million, of which US$441 million could be considered attributable to the use of unreliable methods. The cost of UP attributable to use of unreliable contraception in Russia is substantial. Policies encouraging use of reliable contraceptive methods could reduce the burden of UP.
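The core of such a cost model (combining per-method utilization with typical-use failure rates, then applying a unit cost) can be sketched in a few lines. All numbers below are hypothetical placeholders, not the study's inputs or results.

```python
def unintended_pregnancies(cohort_by_method):
    """cohort_by_method: list of (n_users, annual typical-use failure rate).
    Expected annual unintended pregnancies is the utilization-weighted sum."""
    return sum(n * rate for n, rate in cohort_by_method)

# Hypothetical cohort; rates are illustrative typical-use failure rates.
cohort = [
    (5_000_000, 0.09),   # e.g. oral contraceptives, typical use
    (3_000_000, 0.18),   # e.g. condoms, typical use
    (2_000_000, 0.24),   # e.g. withdrawal / other unreliable methods
]
ups = unintended_pregnancies(cohort)
cost = ups * 475.0       # hypothetical mean direct cost per UP, US$
```

A full payer-perspective model would further split the unit cost by outcome (birth, abortion, miscarriage) and adjust for mistimed versus unwanted pregnancies, as the abstract describes.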
Connecting the U-Th and U-Pb Chronometers: New Algorithms and Applications
NASA Astrophysics Data System (ADS)
McLean, N. M.; Smith, C. J. M.; Roberts, N. M. W.; Richards, D. A.
2016-12-01
The U-Th and U-Pb geochronometers are important clocks for separate intervals of the geologic timescale. U-Th dates exploit disequilibrium in the 238U intermediate daughter isotopes 234U and 230Th, and are often used to date corals and speleothems that are zero age through 800 ka. The U-Pb system relies on secular equilibrium decay of 238U to 206Pb and 235U to 207Pb over longer timescales, and can be used to date samples from <1 Ma to 4.5 Ga. Disequilibrium plays a role in young U-Pb dates, but only as a nuisance correction. Both chronometers can produce dates with uncertainties <0.1% near the center of their applicable age ranges, but become less precise at their intersection, when the 238U decay chain approaches secular equilibrium and there has been little time for ingrowth of radiogenic Pb. However, if measurements or assumptions about both chronometers can be made, then they can be combined into a single, more informed date. Coupling the datasets can improve their precision and accuracy and help interrogate the assumptions that underpin each. Working with this data is difficult for two reasons. The Bateman equations are long and cumbersome for U decay chains that include 238U, 234U, 230Th, 226Ra, 206Pb and 235U, 231Pa, and 207Pb. Also, Pb measurements often comprise varying amounts of radiogenic Pb from locally heterogeneous U concentrations mixed with varying amounts of common Pb. At present there is no established, flexible computational framework to combine information from measurements and/or assumptions of these parameters, and no way to visualize and interpret the results. We present new algorithms to quickly and accurately solve the system of differential equations defined by both of the uranium decay chains and the linear regression through the U-Pb isochron. The results are illustrated on a new concordia diagram, where the concordia curve is determined by measured and/or assumed U-series disequilibrium and can have unfamiliar topologies. 
We demonstrate this approach using data collected by solution and laser ablation ICPMS on carbonates with measurable 230Th and 234U disequilibrium, measurable disequilibrium for only 234U, and when only assumptions can be made about initial U-series disequilibrium. Potential applications include refining chronologies at ca. 1 Ma, an important period in Earth history.
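The abstract notes that the Bateman equations for these chains are long and cumbersome; for a single chain with distinct decay constants, though, the closed form is compact. The sketch below (illustrative chain and time, not the paper's algorithm, which couples both uranium chains with the isochron regression) evolves a 238U → 234U → (stable sink) toy chain and shows 234U/238U approaching secular equilibrium.

```python
import math

def bateman(n1_0, lambdas, t):
    """Abundances of each member of a linear chain N1 -> N2 -> ... at time t,
    for distinct decay constants (classic Bateman solution); the last member
    may be stable (lambda = 0)."""
    out = []
    for k in range(1, len(lambdas) + 1):
        lams = lambdas[:k]
        coef = n1_0
        for lam in lams[:-1]:
            coef *= lam                      # product of feeding constants
        total = 0.0
        for i, li in enumerate(lams):
            denom = 1.0
            for j, lj in enumerate(lams):
                if j != i:
                    denom *= lj - li
            total += math.exp(-li * t) / denom
        out.append(coef * total)
    return out

LN2 = math.log(2.0)
# 238U -> 234U -> (stable sink); decay constants per year from the half-lives
lam = [LN2 / 4.468e9, LN2 / 2.455e5, 0.0]
n = bateman(1.0, lam, t=1.0e6)
# after 1 Myr the 234U/238U atom ratio is close to (but below) the secular
# equilibrium value lambda_238/lambda_234 ≈ 5.49e-5
```

Extending the chain to include 230Th, 226Ra, and 206Pb only means appending decay constants, which is why a general linear-ODE solver is the natural computational framework the authors describe.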
Swindle, T.D.; Grossman, J.N.; Olinger, C.T.; Garrison, D.H.
1991-01-01
We have performed INAA, petrographic, and noble gas analyses on seventeen chondrules from the Semarkona meteorite (LL3.0) primarily to study the relationship of the I-Xe system to other measured properties. We observe a range of ~10 Ma in apparent I-Xe ages. The three latest apparent ages fall in a cluster, suggesting the possibility of a common event. The initial 129I/127I ratio (R0) is apparently related to chondrule type and/or mineralogy, with nonporphyritic and pyroxene-rich chondrules showing evidence for lower R0 values (later apparent I-Xe ages) than porphyritic and olivine-rich chondrules. In addition, chondrules with sulfides on or near the surface have lower R0 values than other chondrules. The 129Xe/132Xe ratio in the trapped Xe component anticorrelates with R0, consistent with evolution of a chronometer in a closed system or in multiple similar systems. On the basis of these correlations, we conclude that the variations in R0 represent variations in ages, and that later event(s), possibly aqueous alteration, preferentially affected chondrules with nonporphyritic textures and/or sulfide-rich exteriors about 10 Ma after the formation of the chondrules.
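The link between initial 129I/127I ratios and relative ages rests on simple decay arithmetic: a sample that closed later has had more 129I decay, so a lower R0 implies a positive age offset relative to a reference. A minimal sketch (function name and example ratios are illustrative; the 129I half-life is the standard literature value):

```python
import math

T_HALF_129I = 15.7e6  # years

def relative_iodine_xenon_age(r0_sample, r0_reference):
    """Time difference implied by two initial 129I/127I ratios:
    R0_sample = R0_reference * exp(-lambda * dt), solved for dt."""
    lam = math.log(2.0) / T_HALF_129I
    return math.log(r0_reference / r0_sample) / lam

# An R0 lower by a factor of ~1.55 corresponds to ~10 Ma later closure,
# matching the spread of apparent ages reported in the abstract.
dt = relative_iodine_xenon_age(0.645e-4, 1.0e-4)
```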
Extrinsic and Intrinsic Motivation at 30: Unresolved Scientific Issues
ERIC Educational Resources Information Center
Reiss, Steven
2005-01-01
The undermining effect of extrinsic reward on intrinsic motivation remains unproven. The key unresolved issues are construct invalidity (all four definitions are unproved and two are illogical); measurement unreliability (the free-choice measure requires unreliable, subjective judgments to infer intrinsic motivation); inadequate experimental…
Coman, Emil N; Iordache, Eugen; Dierker, Lisa; Fifield, Judith; Schensul, Jean J; Suggs, Suzanne; Barbour, Russell
2014-05-01
The advantages of modeling the unreliability of outcomes when evaluating the comparative effectiveness of health interventions are illustrated. Adding an action-research intervention component to a regular summer job program for youth was expected to help in preventing risk behaviors. A series of simple two-group alternative structural equation models are compared to test the effect of the intervention on one key attitudinal outcome in terms of model fit and statistical power with Monte Carlo simulations. Some models presuming parameters equal across the intervention and comparison groups were underpowered to detect the intervention effect, yet modeling the unreliability of the outcome measure increased their statistical power and helped in the detection of the hypothesized effect. Comparative Effectiveness Research (CER) could benefit from flexible multi-group alternative structural models organized in decision trees, and modeling unreliability of measures can be of tremendous help for both the fit of statistical models to the data and their statistical power.
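The mechanism at work, measurement error attenuating a standardized effect, can be illustrated without a full SEM. The toy simulation below (an illustration in the spirit of the paper, not its models) adds error of a chosen reliability to an outcome, observes the attenuated standardized group difference, and applies the classical disattenuation correction.

```python
import random, statistics

def observed_effect(n_per_group, effect, reliability, seed=7):
    """Simulate a two-group outcome measured with the given reliability;
    error inflates the observed SD, attenuating the standardized difference.
    True-score SD is fixed at 1, so error SD = sqrt((1 - rel) / rel)."""
    rng = random.Random(seed)
    err_sd = ((1.0 - reliability) / reliability) ** 0.5
    g0 = [rng.gauss(0.0, 1.0) + rng.gauss(0.0, err_sd) for _ in range(n_per_group)]
    g1 = [rng.gauss(effect, 1.0) + rng.gauss(0.0, err_sd) for _ in range(n_per_group)]
    pooled_sd = statistics.pstdev(g0 + g1)   # crude pooling, fine for a sketch
    return (statistics.mean(g1) - statistics.mean(g0)) / pooled_sd

d_obs = observed_effect(20_000, effect=0.5, reliability=0.6)
d_corrected = d_obs / 0.6 ** 0.5   # classical disattenuation by sqrt(reliability)
```

A latent-variable model achieves the same end more generally, which is why modeling outcome unreliability restores statistical power in the comparisons the abstract describes.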
USDA-ARS?s Scientific Manuscript database
Natural pond spawning of channel catfish is unreliable, unpredictable, and is dependent on environmental conditions. Male and female broodfish are typically held in the same pond for 2 or 3 years. Approximately 30-50% of the females and 10 percent of the males present in the pond participate in the...
USDA-ARS?s Scientific Manuscript database
In the past, several techniques have been developed as diagnostic tools for the differential diagnosis of tumours produced by Marek’s disease virus (MDV) from those induced by avian leukosis virus (ALV) and reticuloendotheliosis virus (REV). However, most current techniques are unreliable using form...
Army Science Board 2001 AD HOC Study Knowledge Management
2001-11-01
dissemination, Army, Army culture, information dominance, knowledge dominance, information sharing, situational awareness, network-centric, infosphere...proposed effort and the emerging Army ICT for Information Dominance are all excellent foundation efforts for KM and Information Assurance. The panel’s...level is critical to survivability and lethality. – Unreliable information will quickly reverse the advantages of “Information Dominance” essential to
U.S. Navy Ships Food Service Divisions: Modernizing Inventory Management
2010-06-01
management procedures for receipt, inventory, stowage, and issue of provisions onboard ships have remained relatively unchanged for decades. Culinary ...improve the quality of life for Culinary Specialists 15. NUMBER OF PAGES 87 14. SUBJECT TERMS Inventory management, records keeper, stores onload...remained relatively unchanged for decades. Culinary Specialists are utilizing an antiquated and unreliable inventory management program (the Food
A Simple Equation to Predict a Subscore's Value
ERIC Educational Resources Information Center
Feinberg, Richard A.; Wainer, Howard
2014-01-01
Subscores are often used to indicate test-takers' relative strengths and weaknesses and so help focus remediation. But a subscore is not worth reporting if it is too unreliable to believe or if it contains no information that is not already contained in the total score. It is possible, through the use of a simple linear equation provided in…
A high performance totally ordered multicast protocol
NASA Technical Reports Server (NTRS)
Montgomery, Todd; Whetten, Brian; Kaplan, Simon
1995-01-01
This paper presents the Reliable Multicast Protocol (RMP). RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service such as IP Multicasting. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communication load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority resilient, and totally resilient atomic delivery. These QoS guarantees are selectable on a per packet basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, an implicit naming service, mutually exclusive handlers for messages, and mutually exclusive locks. It has commonly been held that a large performance penalty must be paid in order to implement total ordering -- RMP discounts this. On SparcStation 10's on a 1250 KB/sec Ethernet, RMP provides totally ordered packet delivery to one destination at 842 KB/sec throughput and with 3.1 ms packet latency. The performance stays roughly constant independent of the number of destinations. For two or more destinations on a LAN, RMP provides higher throughput than any protocol that does not use multicast or broadcast.
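The total-ordering guarantee can be sketched with a minimal fixed-sequencer model over a reordering datagram layer. This is a simplification: RMP itself rotates the ordering role around a token ring and uses NAK-based retransmission for loss, none of which is modelled here; class names are illustrative.

```python
import itertools, random

class Sequencer:
    """Assigns a single global sequence number to every multicast message."""
    def __init__(self):
        self._counter = itertools.count()
    def order(self, msg):
        return next(self._counter), msg

class Receiver:
    """Buffers out-of-order arrivals and delivers only the contiguous prefix,
    so every receiver that gets all packets delivers the same total order."""
    def __init__(self):
        self._next = 0
        self._buffer = {}
        self.delivered = []
    def receive(self, seqno, msg):
        self._buffer[seqno] = msg
        while self._next in self._buffer:        # deliver contiguous prefix
            self.delivered.append(self._buffer.pop(self._next))
            self._next += 1

seq = Sequencer()
packets = [seq.order(f"m{i}") for i in range(5)]
rx = Receiver()
for seqno, msg in random.Random(3).sample(packets, len(packets)):  # reordered "net"
    rx.receive(seqno, msg)
# rx.delivered is now ["m0", "m1", "m2", "m3", "m4"] regardless of arrival order
```

A gap in `_next` is exactly what would trigger a NAK in a real protocol; distributing and rotating the sequencing role is what removes the single-site bottleneck the abstract alludes to.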
Horn, P L; Neil, H L; Paul, L J; Marriott, P
2010-11-01
Age validation of bluenose Hyperoglyphe antarctica was sought using the independent bomb chronometer procedure. Radiocarbon (14C) levels were measured in core micro-samples from 12 otoliths that had been aged using a zone count method. The core 14C measurement for each fish was compared with the value on a surface water reference curve for the calculated birth year of the fish. There was good agreement, indicating that the line-count ageing method described here is not substantially biased. A second micro-sample was also taken near the edge of nine of the otolith cross-sections to help define a bomb-carbon curve for waters deeper than 200-300 m. There appears to be a 10 to 15 year lag in the time it takes the 14C to reach the waters where adult H. antarctica are concentrated. The maximum estimated age of this species was 76 years, and females grow significantly larger than males. Von Bertalanffy growth curves were estimated, and although they fit the available data reasonably well, the lack of aged juvenile fish results in the K and t(0) parameters being biologically meaningless. Consequently, curves that are likely to better represent population growth were estimated by forcing t(0) to be -0.5. © 2010 NIWA. Journal of Fish Biology © 2010 The Fisheries Society of the British Isles.
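The von Bertalanffy curve with t(0) forced to -0.5, as the study does, is straightforward to evaluate. The parameter values below are hypothetical placeholders, not the fitted estimates from the paper.

```python
import math

def von_bertalanffy(age, l_inf, k, t0=-0.5):
    """Von Bertalanffy length-at-age: L(t) = L_inf * (1 - exp(-K*(t - t0))).
    t0 is fixed at -0.5, mirroring the constraint described in the study."""
    return l_inf * (1.0 - math.exp(-k * (age - t0)))

# Hypothetical parameters (L_inf = 90 cm, K = 0.06 / yr) evaluated at ages
# spanning the reported maximum age of 76 years.
lengths = [von_bertalanffy(a, l_inf=90.0, k=0.06) for a in (5, 20, 76)]
```

Fixing t0 anchors the curve near a plausible size at age zero, which is what compensates for the missing aged juveniles that would otherwise leave K and t0 unconstrained.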
How Do Households Respond to Unreliable Water Supplies? A Systematic Review.
Majuru, Batsirai; Suhrcke, Marc; Hunter, Paul R
2016-12-09
Although the Millennium Development Goal (MDG) target for drinking water was met, in many developing countries water supplies are unreliable. This paper reviews how households in developing countries cope with unreliable water supplies, including coping costs, the distribution of coping costs across socio-economic groups, and effectiveness of coping strategies in meeting household water needs. Structured searches were conducted in peer-reviewed and grey literature in electronic databases and search engines, and 28 studies were selected for review, out of 1643 potentially relevant references. Studies were included if they reported on strategies to cope with unreliable household water supplies and were based on empirical research in developing countries. Common coping strategies include drilling wells, storing water, and collecting water from alternative sources. The choice of coping strategies is influenced by income, level of education, land tenure and extent of unreliability. The findings of this review highlight that low-income households bear a disproportionate coping burden, as they often engage in coping strategies such as collecting water from alternative sources, which is labour and time-intensive, and yields smaller quantities of water. Such alternative sources may be of lower water quality, and pose health risks. In the absence of dramatic improvements in the reliability of water supplies, a critical avenue of enquiry should be what coping strategies are effective and can be readily adopted by low-income households.
False Beliefs in Unreliable Knowledge Networks
NASA Astrophysics Data System (ADS)
Ioannidis, Evangelos; Varsakelis, Nikos; Antoniou, Ioannis
2017-03-01
The aims of this work are: (1) to extend knowledge dynamics analysis in order to assess the influence of false beliefs and unreliable communication channels, (2) to investigate the impact of selection rule-policy for knowledge acquisition, (3) to investigate the impact of targeted link attacks ("breaks" or "infections") of certain "healthy" communication channels. We examine the knowledge dynamics analytically, as well as by simulations on both artificial and real organizational knowledge networks. The main findings are: (1) False beliefs have no significant influence on knowledge dynamics, while unreliable communication channels result in non-monotonic knowledge updates ("wild" knowledge fluctuations may appear) and in significant elongation of knowledge attainment. Moreover, false beliefs may emerge during knowledge evolution, due to the presence of unreliable communication channels, even if they were not present initially, (2) Changing the selection rule-policy, by raising the awareness of agents to avoid the selection of unreliable communication channels, results in monotonic knowledge upgrade and in faster knowledge attainment, (3) "Infecting" links is more harmful than "breaking" links, due to "wild" knowledge fluctuations and due to the elongation of knowledge attainment. Moreover, attacking even a "small" percentage of links (≤5%) with high knowledge transfer, may result in dramatic elongation of knowledge attainment (over 100%), as well as in delays of the onset of knowledge attainment. Hence, links of high knowledge transfer should be protected, because in Information Warfare and Disinformation, these links are the "best targets".
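A toy version of such knowledge dynamics can make the non-monotonic behaviour concrete. The model below is an illustration in the spirit of the paper, not its exact dynamics: an agent copies a better-informed neighbour, but an unreliable channel sometimes corrupts the transfer, so knowledge can fluctuate downward before eventually settling.

```python
import random

def knowledge_step(k, edges, reliability, rng):
    """One asynchronous update: learner i copies teacher j if j knows more.
    With probability 1 - reliability the channel corrupts the transfer to a
    random level, which is what produces non-monotonic knowledge updates."""
    i, j = rng.choice(edges)
    if k[j] > k[i]:
        if rng.random() < reliability[(i, j)]:
            k[i] = k[j]                       # faithful transfer
        else:
            k[i] = rng.uniform(0.0, k[j])     # corrupted transfer

rng = random.Random(0)
k = {0: 1.0, 1: 0.2, 2: 0.1, 3: 0.4}          # node 0 is the knowledge source
edges = [(1, 0), (2, 1), (3, 0), (2, 3)]      # (learner, teacher) channels
reliability = {e: 0.9 for e in edges}
for _ in range(300):
    knowledge_step(k, edges, reliability, rng)
```

Lowering a channel's reliability (an "infected" link) prolongs the fluctuation phase and delays attainment, while deleting the edge (a "broken" link) merely removes one path, which is one intuition for the paper's finding that infecting links is more harmful than breaking them.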
Oxidation Ditch Technology for Upgrading Army Sewage Treatment Facilities.
1983-08-01
expensive and unreliable anaerobic digestion. Because of these advantages, oxidation ditch technology should be considered when planning wastewater...eliminates the need for further sludge treatment (e.g., anaerobic digestion can be eliminated). Does not need primary clarifier. Few moving parts in...four Army plants (see Chapter 2) use the anaerobic digestion process for sludge treatment. There are often problems in operating these digestors, and
Additive Manufacturing in the Marine Corps
2015-06-01
commonly referred to as 3D printing. This thesis answers the question of how additive manufacturing can improve the effectiveness of Marine Corps...analysis of current and future 3D-printing processes, examination of several civilian and military examples, and examination of the impact across...fully integrating 3D printers, such as the lack of certification and qualification standards, unreliable end product results, and determining ownership
DOE Office of Scientific and Technical Information (OSTI.GOV)
Byerly, Benjamin L.; Stanley, Floyd; Spencer, Khal
In our study, a certified plutonium metal reference material (CRM 126) with a known production history is examined using analytical methods that are commonly employed in nuclear forensics for provenancing and attribution. Moreover, the measured plutonium isotopic composition and actinide assay are consistent with values reported on the reference material certificate. Model ages from U/Pu and Am/Pu chronometers agree with the documented production timeline. Finally, these results confirm the utility of these analytical methods and highlight the importance of a holistic approach for forensic study of unknown materials.
The GAMMA Ray Sky as Seen by Fermi: Opening a New Window on the High Energy Space Environment
2009-01-01
pulsars, stars whose repeating emissions can be used as ultra-precise chronometers. Measurement of gamma radiation provides unique insight...diffuse glow are a number of bright point sources, mostly gamma ray pulsars — rotating, magnetized neutron stars — as discussed below. The bright sources...important early discoveries of Fermi have been from objects in our galaxy. The LAT has discovered 12 new pulsars that seem to be visible only in gamma
The Effects of Source Unreliability on Prior and Future Word Learning
ERIC Educational Resources Information Center
Faught, Gayle G.; Leslie, Alicia D.; Scofield, Jason
2015-01-01
Young children regularly learn words from interactions with other speakers, though not all speakers are reliable informants. Interestingly, children will revert to trusting a reliable speaker when a previously endorsed speaker proves unreliable. When later asked to identify the referent of a novel word, children who revert trust are less willing…
NASA Technical Reports Server (NTRS)
Neufeld, David A.; Melnick, Gary J.; Harwit, Martin
1998-01-01
We have detected the S(1), S(2), S(3), S(4), and S(5) pure rotational lines of molecular hydrogen toward the outflow source HH 54 using the Short Wavelength Spectrometer on board the Infrared Space Observatory. The observed H2 line ratios indicate the presence of warm molecular gas with an H2 density of at least 10^5 cm^-3 and a temperature approximately 650 K in which the ratio of ortho- to para-H2 is only 1.2 +/- 0.4, significantly smaller than the equilibrium ratio of 3 expected in gas at that temperature. These observations imply that the measured ratio of ortho- to para-H2 is the legacy of an earlier stage in the thermal history of the gas when the gas had reached equilibrium at a temperature approximately less than 90 K. Based upon the expected timescale for equilibration, we argue that the nonequilibrium ratio of ortho- to para-H2 observed in HH 54 serves as a chronometer that places a conservative upper limit of approximately 5000 yr on the period for which the emitting gas has been warm. The S(2)/S(1) and S(3)/S(1) H2 line ratios measured toward HH 54 are consistent with recent theoretical models of Timmermann for the conversion of para- to ortho-H2 behind slow, C-type shocks, but only if the preshock ratio of ortho- to para-H2 was approximately less than 0.2.
Water-in-Olivine Magma Ascent Chronometry: Every Crystal is a Clock
NASA Astrophysics Data System (ADS)
Newcombe, M. E.; Asimow, P. D.; Ferriss, E.; Barth, A.; Lloyd, A. S.; Hauri, E.; Plank, T. A.
2017-12-01
The syneruptive decompression rate of basaltic magma in volcanic conduits is thought to be a critical control on eruptive vigor. Recent efforts have constrained decompression rates using models of diffusive water loss from melt embayments (Lloyd et al. 2014; Ferguson et al. 2016), olivine-hosted melt inclusions (Chen et al. 2013; Le Voyer et al. 2014), and clinopyroxene phenocrysts (Lloyd et al. 2016). However, these techniques are difficult to apply because of the rarity of melt embayments and clinopyroxene phenocrysts suitable for analysis and the complexities associated with modeling water loss from melt inclusions. We are developing a new magma ascent chronometer based on syneruptive diffusive water loss from olivine phenocrysts. We have found water zonation in every olivine phenocryst we have measured, from explosive eruptions of Pavlof, Seguam, Fuego, Cerro Negro and Kilauea volcanoes. Phenocrysts were polished to expose a central plane normal to the crystallographic `b' axis and volatile concentration profiles were measured along `a' and `c' axes by SIMS or nanoSIMS. Profiles are compared to 1D and 3D finite-element models of diffusive water loss from olivine, with or without melt inclusions, whose boundaries are in equilibrium with a melt undergoing closed-system degassing. In every case, we observe faster water diffusion along the `a' axis, consistent with the diffusion anisotropy observed by Kohlstedt and Mackwell (1998) for the so-called `proton-polaron' mechanism of H-transport. Water concentration gradients along `a' match the 1D diffusion model with a diffusivity of 10^-10 m^2/s (see Plank et al., this meeting), olivine-melt partition coefficient of 0.0007-0.002 (based on melt inclusion-olivine pairs), and decompression rates equal to the best-fit values from melt embayment studies (Lloyd et al. 2014; Ferguson et al. 2016).
Agreement between the melt embayment and water-in-olivine ascent chronometers at Fuego, Seguam, and Kilauea Iki demonstrates the potential of this new technique, which can be applied to any olivine-bearing mafic-intermediate eruption using common analytical tools (SIMS and FTIR). In theory, each crystal is a clock, with the potential to record variable ascent in the conduit, over the course of an eruption, and between eruptions.
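The diffusion modeling described in this record can be illustrated with a toy calculation. The sketch below is not the authors' finite-element model: it is a minimal 1D explicit finite-difference solver for water loss from a crystal interior, assuming a uniform initial concentration and a rim held at a fixed value (a crude stand-in for equilibrium with a degassing melt); all dimensions, time steps, and concentrations are illustrative.

```python
import numpy as np

def water_loss_profile(length_m=1e-3, n=101, D=1e-10, dt=0.5, steps=200,
                       c0=1.0, c_rim=0.0):
    """Explicit finite-difference sketch of 1D diffusive water loss from
    an olivine crystal: uniform initial concentration c0, rim held at
    c_rim. Returns positions (m) and normalized concentrations."""
    x = np.linspace(0.0, length_m, n)
    dx = x[1] - x[0]
    r = D * dt / dx**2                  # stability requires r <= 0.5
    assert r <= 0.5, "reduce dt or n for a stable explicit scheme"
    c = np.full(n, c0)
    c[0] = c[-1] = c_rim                # rim equilibrated with degassed melt
    for _ in range(steps):
        c[1:-1] = c[1:-1] + r * (c[2:] - 2.0 * c[1:-1] + c[:-2])
        c[0] = c[-1] = c_rim
    return x, c
```

Fitting such a modeled profile to a measured SIMS traverse, with the decompression history folded into the boundary condition, is the essence of turning each zoned crystal into a clock.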
The Proper Sequence for Correcting Correlation Coefficients for Range Restriction and Unreliability.
ERIC Educational Resources Information Center
Stauffer, Joseph M.; Mendoza, Jorge L.
2001-01-01
Uses classical test theory to show that it is the nature of the range restriction, rather than the nature of the available reliability coefficient, that determines the sequence for applying corrections for range restriction and unreliability. Shows how the common rule of thumb for choosing the sequence is tenable only when the correction does not…
Method for removal of random noise in eddy-current testing system
Levy, Arthur J.
1995-01-01
Eddy-current response voltages, generated during inspection of metallic structures for anomalies, are often replete with noise. Therefore, analysis of the inspection data and results is difficult or near impossible, resulting in inconsistent or unreliable evaluation of the structure. This invention processes the eddy-current response voltage, removing the effect of random noise, to allow proper identification of anomalies within and associated with the structure.
Army General Fund Adjustments Not Adequately Documented or Supported
2016-07-26
compilation process. Finding The Office of the Assistant Secretary of the Army (Financial Management & Comptroller) (OASA[FM&C]) and the Defense Finance and...statements were unreliable and lacked an adequate audit trail. Furthermore, DoD and Army managers could not rely on the data in their accounting...systems when making management and resource decisions. Until the Army and DFAS Indianapolis correct these control deficiencies, there is considerable
Autotune Calibrates Models to Building Use Data
None
2018-01-16
Models of existing buildings are currently unreliable unless calibrated manually by a skilled professional. Autotune, as the name implies, automates this process by calibrating the model of an existing building to measured data, and is now available as open source software. This enables private businesses to incorporate Autotune into their products so that their customers can more effectively estimate cost savings of reduced energy consumption measures in existing buildings.
Direct measurement of neon production rates by (α,n) reactions in minerals
NASA Astrophysics Data System (ADS)
Cox, Stephen E.; Farley, Kenneth A.; Cherniak, Daniele J.
2015-01-01
The production of nucleogenic neon from alpha particle capture by 18O and 19F offers a potential chronometer sensitive to temperatures higher than the more widely used (U-Th)/He chronometer. The accuracy depends on the cross sections and the calculated stopping power for alpha particles in the mineral being studied. Published 18O(α,n)21Ne production rates are in poor agreement and were calculated from contradictory cross sections, and therefore demand experimental verification. Similarly, the stopping powers for alpha particles are calculated from SRIM (Stopping and Range of Ions in Matter) software based on a limited experimental dataset. To address these issues we used a particle accelerator to implant alpha particles at precisely known energies into slabs of synthetic quartz (SiO2) and barium tungstate (BaWO4) to measure 21Ne production from capture by 18O. Within experimental uncertainties the observed 21Ne production rates compare favorably to our predictions using published cross sections and stopping powers, indicating that ages calculated using these quantities are accurate at the ∼3% level. In addition, we measured the 22Ne/21Ne ratio and (U-Th)/He and (U-Th)/Ne ages of Durango fluorapatite, which is an important model system for this work because it contains both oxygen and fluorine. Finally, we present 21Ne/4He production rate ratios for a variety of minerals of geochemical interest along with software for calculating neon production rates and (U-Th)/Ne ages.
230Th-234U Model-Ages of Some Uranium Standard Reference Materials
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, R W; Gaffney, A M; Kristo, M J
The 'age' of a sample of uranium is an important aspect of a nuclear forensic investigation and of the attribution of the material to its source. To the extent that the sample obeys the standard rules of radiochronometry, the production ages of even very recent material can be determined using the 230Th-234U chronometer. These standard rules may be summarized as (a) the daughter/parent ratio at time=zero must be known, and (b) there has been no daughter/parent fractionation since production. For most samples of uranium, the 'ages' determined using this chronometer are semantically 'model-ages' because (a) some assumption of the initial 230Th content in the sample is required and (b) closed-system behavior is assumed. The uranium standard reference materials originally prepared and distributed by the former US National Bureau of Standards and now distributed by New Brunswick Laboratory as certified reference materials (NBS SRM = NBL CRM) are good candidates for samples where both rules are met. The U isotopic standards have known purification and production dates, and closed-system behavior in the solid form (U3O8) may be assumed with confidence. We present here 230Th-234U model-ages for several of these standards, determined by isotope dilution mass spectrometry using a multicollector ICP-MS, and compare these ages with their known production history.
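As a worked illustration of the 230Th-234U chronometer under rules (a) and (b) above: with zero initial 230Th and closed-system behavior, the standard ingrowth equation can be inverted for a model age. This sketch is not the authors' procedure; the half-life values are commonly used ones included for illustration, not numbers taken from the abstract.

```python
import math

# Half-lives in years (commonly used values; treat as illustrative inputs)
T_HALF_U234 = 245_500.0
T_HALF_TH230 = 75_380.0

LAM_234 = math.log(2) / T_HALF_U234   # decay constant of 234U (1/yr)
LAM_230 = math.log(2) / T_HALF_TH230  # decay constant of 230Th (1/yr)

def th230_u234_ratio(t_years):
    """Atom ratio 230Th/234U grown in after t years of closed-system
    decay, assuming zero 230Th at the time of purification."""
    return (LAM_234 / (LAM_230 - LAM_234)) * \
        (1.0 - math.exp(-(LAM_230 - LAM_234) * t_years))

def model_age(ratio):
    """Invert the ingrowth equation: model age (years) from a measured
    230Th/234U atom ratio."""
    return -math.log(1.0 - ratio * (LAM_230 - LAM_234) / LAM_234) / \
        (LAM_230 - LAM_234)
```

Inverting `th230_u234_ratio` recovers the elapsed time since purification, which is the model-age radiochronometry reports when both standard rules hold.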
ERIC Educational Resources Information Center
Knight, Megan E.
2017-01-01
Today's grading practices mirror those of the early 1900s, and despite myriad research suggesting they are invalid, unreliable, and a hindrance to student learning, many teachers continue detrimental practices such as using 100-point percentage scales averaging all academic and nonacademic factors together into a single grade, and using grades to…
Self-mixing instrument for simultaneous distance and speed measurement
NASA Astrophysics Data System (ADS)
Norgia, Michele; Melchionni, Dario; Pesatori, Alessandro
2017-12-01
A novel instrument based on self-mixing interferometry is proposed to simultaneously measure absolute distance and velocity. The measurement method is designed to work directly on any kind of surface in an industrial environment, overcoming problems due to the speckle pattern effect. The laser pump current is modulated at a relatively high frequency (40 kHz), and estimation of the induced fringe frequency allows an almost instantaneous measurement (measurement time equal to 25 μs). Real-time digital processing of the measurement data discards unreliable measurements. The simultaneous measurement reaches a relative standard deviation of about 4·10^-4 in absolute distance and 5·10^-3 in velocity. Three different laser sources are tested and compared. The instrument also shows good performance in harsh environments, for example measuring the movement of an opaque iron tube rotating under a running water flow.
Saving-enhanced memory: the benefits of saving on the learning and remembering of new information.
Storm, Benjamin C; Stone, Sean M
2015-02-01
With the continued integration of technology into people's lives, saving digital information has become an everyday facet of human behavior. In the present research, we examined the consequences of saving certain information on the ability to learn and remember other information. Results from three experiments showed that saving one file before studying a new file significantly improved memory for the contents of the new file. Notably, this effect was not observed when the saving process was deemed unreliable or when the contents of the to-be-saved file were not substantial enough to interfere with memory for the new file. These results suggest that saving provides a means to strategically off-load memory onto the environment in order to reduce the extent to which currently unneeded to-be-remembered information interferes with the learning and remembering of other information.
ERIC Educational Resources Information Center
Steiner, Peter M.; Cook, Thomas D.; Shadish, William R.
2011-01-01
The effect of unreliability of measurement on propensity score (PS) adjusted treatment effects has not been previously studied. The authors report on a study simulating different degrees of unreliability in the multiple covariates that were used to estimate the PS. The simulation uses the same data as two prior studies. Shadish, Clark, and Steiner…
Navarro, Jordan; Yousfi, Elsa; Deniel, Jonathan; Jallais, Christophe; Bueno, Mercedes; Fort, Alexandra
2016-12-01
In the past, lane departure warnings (LDWs) were demonstrated to improve driving behaviours during lane departures but little is known about the effects of unreliable warnings. This experiment focused on the influence of false warnings alone or in combination with missed warnings and warning onset on assistance effectiveness and acceptance. Two assistance unreliability levels (33 and 17%) and two warning onsets (partial and full lane departure) were manipulated in order to investigate interaction. Results showed that assistance, regardless of unreliability levels and warning onsets, improved driving behaviours during lane departure episodes and outside of these episodes by favouring better lane-keeping performances. Full lane departure and highly unreliable warnings, however, reduced assistance efficiency. Drivers' assistance acceptance was better for the most reliable warnings and for the subsequent warnings. The data indicate that imperfect LDWs (false warnings or false and missed warnings) still improve driving behaviours compared to no assistance. Practitioner Summary: This study revealed that imperfect lane departure warnings are able to significantly improve driving performances and that warning onset is a key element for assistance effectiveness and acceptance. The conclusion may be of particular interest for lane departure warning designers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sperling, Joshua; Fisher, Stephen; Reiner, Mark B.
The term 'leapfrogging' has been applied to cities and nations that have adopted a new form of infrastructure by bypassing the traditional progression of development, e.g., from no phones to cell phones, bypassing landlines altogether. However, leapfrogging from unreliable infrastructure systems to 'smart' cities is too large a jump, resulting in unsustainable and unhealthy infrastructure systems. In the Global South, a baseline of unreliable infrastructure is a prevalent problem. The push for sustainable and 'smart' [re]development tends to ignore many of those already living with failing, unreliable infrastructure. Without awareness of baseline conditions, uninformed projects run the risk of returning conditions to the status quo, keeping many urban populations below targets of the United Nations' Sustainable Development Goals. A key part of understanding the baseline is to identify how citizens have long learned to adjust their expectations of basic services. To compensate for poor infrastructure, most residents in the Global South invest in remedial secondary infrastructure (RSI) at the household and business levels. The authors explore three key 'smart' city transformations that address RSI within a hierarchical planning pyramid known as the comprehensive resilient and reliable infrastructure systems (CRISP) planning framework.
NASA Astrophysics Data System (ADS)
Li, Zhengxiang; Gonzalez, J. E.; Yu, Hongwei; Zhu, Zong-Hong; Alcaniz, J. S.
2016-02-01
We apply two methods, i.e., the Gaussian processes and the nonparametric smoothing procedure, to reconstruct the Hubble parameter H (z ) as a function of redshift from 15 measurements of the expansion rate obtained from age estimates of passively evolving galaxies. These reconstructions enable us to derive the luminosity distance to a certain redshift z , calibrate the light-curve fitting parameters accounting for the (unknown) intrinsic magnitude of type Ia supernova (SNe Ia), and construct cosmological model-independent Hubble diagrams of SNe Ia. In order to test the compatibility between the reconstructed functions of H (z ), we perform a statistical analysis considering the latest SNe Ia sample, the so-called joint light-curve compilation. We find that, for the Gaussian processes, the reconstructed functions of Hubble parameter versus redshift, and thus the following analysis on SNe Ia calibrations and cosmological implications, are sensitive to prior mean functions. However, for the nonparametric smoothing method, the reconstructed functions are not dependent on initial guess models, and consistently require high values of H0, which are in excellent agreement with recent measurements of this quantity from Cepheids and other local distance indicators.
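The sensitivity to the prior mean function reported in this abstract can be seen in a toy Gaussian-process regression. The numpy-only sketch below is not the authors' pipeline; the squared-exponential kernel, the hyperparameters, and the mock H(z) points in the usage example are all illustrative assumptions.

```python
import numpy as np

def rbf(a, b, ell=1.0, sf=100.0):
    """Squared-exponential kernel with length scale ell and signal
    standard deviation sf."""
    d = a[:, None] - b[None, :]
    return sf**2 * np.exp(-0.5 * (d / ell)**2)

def gp_reconstruct(z_obs, H_obs, sigma, z_star, mean=0.0, ell=1.0, sf=100.0):
    """GP posterior mean for H(z) at z_star, given noisy measurements
    (H_obs, sigma) and a constant prior mean. Illustrates how the
    reconstruction depends on the assumed prior mean function."""
    K = rbf(z_obs, z_obs, ell, sf) + np.diag(sigma**2)
    Ks = rbf(z_star, z_obs, ell, sf)
    alpha = np.linalg.solve(K, H_obs - mean)
    return mean + Ks @ alpha
```

Far from the data the posterior relaxes to the assumed prior mean, so two different mean functions yield different reconstructions exactly where the cosmic-chronometer data are sparse, which is the behavior the abstract reports for the Gaussian-process method.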
NASA Technical Reports Server (NTRS)
Swindle, T. D.; Grossman, J. N.; Olinger, C. T.; Garrison, D. H.
1991-01-01
The relationship of the I-Xe system of the Semarkona meteorite to other measured properties is investigated via INAA, petrographic, and noble-gas analyses on 17 chondrules from the meteorite. A range of not less than 10 Ma in apparent I-Xe ages is observed. The three latest apparent ages fall in a cluster, suggesting the possibility of a common event. It is argued that the initial I-129/I-127 ratio (R0) is related to chondrule type and/or mineralogy, with nonporphyritic and pyroxene-rich chondrules showing evidence for lower R0s than porphyritic and olivine-rich chondrules. Chondrules with sulfides on or near the surface have lower R0s than other chondrules. The Xe-129/Xe-132 ratio in the trapped Xe component anticorrelates with R0, consistent with the evolution of a chronometer in a closed system or in multiple systems. It is concluded that the variations in R0 represent variations in ages, and that later events, possibly aqueous alteration, preferentially affected chondrules with nonporphyritic textures and/or sulfide-rich exteriors about 10 Ma after the formation of the chondrules.
Unknown Risks: Parental Hesitation about Vaccination.
Blaisdell, Laura L; Gutheil, Caitlin; Hootsmans, Norbert A M; Han, Paul K J
2016-05-01
This qualitative study of a select sample of vaccine-hesitant parents (VHPs) explores perceived and constructed personal judgments about the risks and uncertainties associated with vaccines and vaccine-preventable diseases (VPDs) and how these subjective risk judgments influence parents' decisions about childhood vaccination. The study employed semistructured focus group interviews with 42 VHPs to elicit parents' perceptions and thought processes regarding the risks associated with vaccination and nonvaccination, the sources of these perceptions, and their approach to decision making about vaccination for their children. VHPs engage in various reasoning processes and tend to perceive risks of vaccination as greater than the risks of VPDs. At the same time, VHPs engage in other reasoning processes that lead them to perceive ambiguity in information about the harms of vaccination-citing concerns about the missing, conflicting, changing, or otherwise unreliable nature of information. VHPs' refusal of vaccination may reflect their aversion to both the risk and ambiguity they perceive to be associated with vaccination. Mitigating this vaccine hesitancy likely requires reconstructing the risks and ambiguities associated with vaccination-a challenging task that requires providing parents with meaningful evidence-based information on the known risks of vaccination versus VPDs and explicitly acknowledging the risks that remain truly unknown.
Hybrid Rendering with Scheduling under Uncertainty
Tamm, Georg; Krüger, Jens
2014-01-01
As scientific data of increasing size is generated by today’s simulations and measurements, utilizing dedicated server resources to process the visualization pipeline becomes necessary. In a purely server-based approach, requirements on the client-side are minimal as the client only displays results received from the server. However, the client may have a considerable amount of hardware available, which is left idle. Further, the visualization is put at the whim of possibly unreliable server and network conditions. Server load, bandwidth and latency may substantially affect the response time on the client. In this paper, we describe a hybrid method, where visualization workload is assigned to server and client. A capable client can produce images independently. The goal is to determine a workload schedule that enables a synergy between the two sides to provide rendering results to the user as fast as possible. The schedule is determined based on processing and transfer timings obtained at runtime. Our probabilistic scheduler adapts to changing conditions by shifting workload between server and client, and accounts for the performance variability in the dynamic system. PMID:25309115
Efficient Solar Concentrators: Affordable Energy from Water and Sunlight
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2010-01-01
Broad Funding Opportunity Announcement Project: Teledyne is developing a liquid prism panel that tracks the position of the sun to help efficiently concentrate its light onto a solar cell to produce power. Typically, solar tracking devices have bulky and expensive mechanical moving parts that require a lot of power and are often unreliable. Teledyne’s liquid prism panel has no bulky and heavy supporting parts—instead it relies on electrowetting. Electrowetting is a process where an electric field is applied to the liquid to control the angle at which it meets the sunlight above and to control the angle of the sunlight to the focusing lens: the more direct the angle to the focusing lens, the more efficiently the light can be concentrated to solar panels and converted into electricity. This allows the prism to be tuned like a radio to track the sun across the sky and steer sunlight into the solar cell without any moving mechanical parts. This process uses very little power and requires no expensive supporting hardware or moving parts, enabling efficient and quiet rooftop operation for integration into buildings.
Forensic investigation of plutonium metal: a case study of CRM 126
Byerly, Benjamin L.; Stanley, Floyd; Spencer, Khal; ...
2016-11-01
In our study, a certified plutonium metal reference material (CRM 126) with a known production history is examined using analytical methods that are commonly employed in nuclear forensics for provenancing and attribution. The measured plutonium isotopic composition and actinide assay are consistent with values reported on the reference material certificate, and model ages from the U/Pu and Am/Pu chronometers agree with the documented production timeline. These results confirm the utility of these analytical methods and highlight the importance of a holistic approach for the forensic study of unknown materials.
I-Pu-Xe dating and the relative ages of the earth and moon
NASA Technical Reports Server (NTRS)
Swindle, T. D.; Caffee, M. W.; Hohenberg, C. M.; Taylor, S. R.
1986-01-01
The ages of the earth and moon as determined by various chronometric systems are discussed with primary emphasis placed on the development of an I-Pu-Xe chronometer. Data on excess fission xenon are reviewed with attention given to the strengths and weaknesses of the assumptions required for lunar I-Pu-Xe chronometry. Using I-Pu-Xe dating, it is estimated that the retention of excess fission xenon in lunar samples began no more than 63 ± 42 m.y. after the time of primitive meteorite formation.
Radioactive dating of the elements
NASA Technical Reports Server (NTRS)
Cowan, John J.; Thielemann, Friedrich-Karl; Truran, James W.
1991-01-01
The extent to which an accurate determination of the age of the Galaxy, and thus a lower bound on the age of the universe, can be obtained from radioactive dating is discussed. Emphasis is given to the use of the long-lived radioactive nuclei Re-187, Th-232, U-238, and U-235. The nature of the production sites of these and other potential Galactic chronometers is examined along with their production ratios. Age determinations from models of nucleocosmochronology are reviewed and compared with age determinations from stellar sources and age constraints from cosmological considerations.
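The core idea of this record, a model age from a production ratio and a present-day abundance ratio, reduces to one line in the limiting case of a single synthesis event. The sketch below is that textbook limit, not the paper's models; the half-lives, production ratio, and present-day ratio in the usage example are illustrative round numbers.

```python
import math

# Half-lives in Gyr (commonly quoted values; illustrative inputs)
GYR_HALF = {"U235": 0.7038, "U238": 4.468, "Th232": 14.05, "Re187": 41.2}

def lam(nuc):
    """Decay constant in 1/Gyr."""
    return math.log(2) / GYR_HALF[nuc]

def single_event_age(pair, prod_ratio, present_ratio):
    """Model age (Gyr) for a pair of radionuclides made in a single
    nucleosynthesis event: R(T) = K * exp(-(lam_a - lam_b) * T), where
    K is the production ratio and R the present-day ratio."""
    a, b = pair
    return math.log(prod_ratio / present_ratio) / (lam(a) - lam(b))
```

For example, an assumed r-process production ratio U235/U238 of about 1.35 and a present-day ratio of about 0.00725 give a single-event age near 6.3 Gyr; because production was in fact extended over Galactic history, such single-event ages only bracket the true age, which is why the review compares full nucleocosmochronology models against stellar and cosmological constraints.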
White dwarf stars: cosmic chronometers and dark matter probes
NASA Astrophysics Data System (ADS)
Salaris, Maurizio; Cassisi, Santi
2018-04-01
White dwarfs (WD) are the endpoint of the evolution of the large majority of stars formed in our galaxy. In the last two decades observations and theory have improved to a level that makes it possible to employ WD for determining ages of the stellar populations in the disk of the Milky Way and in the nearest star clusters, and constrain the existence and properties of dark matter (DM) candidates. This review is centred on WD models, age-dating, and DM identification methods, recent results and future developments of the field.
2009-12-02
an overarching regional information technology architecture to synchronize the vessel arrival and departure schedules with marine terminal, short... wakeup call for the rest of the nation’s ports and regions. The impacts of these disruptions are felt throughout the supply chain - in the national...and equipment repositioning. Supply chain unreliability wreaks havoc with planning, scheduling , purchasing, sales, and distribution. Suppliers are
MASQOT: a method for cDNA microarray spot quality control
Bylesjö, Max; Eriksson, Daniel; Sjödin, Andreas; Sjöström, Michael; Jansson, Stefan; Antti, Henrik; Trygg, Johan
2005-01-01
Background: cDNA microarray technology has emerged as a major player in the parallel detection of biomolecules, but still suffers from fundamental technical problems. Identifying and removing unreliable data is crucial to prevent the risk of receiving illusive analysis results. Visual assessment of spot quality is still a common procedure, despite the time-consuming work of manually inspecting spots in the range of hundreds of thousands or more. Results: A novel methodology for cDNA microarray spot quality control is outlined. Multivariate discriminant analysis was used to assess spot quality based on existing and novel descriptors. The presented methodology displays high reproducibility and was found superior in identifying unreliable data compared to other evaluated methodologies. Conclusion: The proposed methodology for cDNA microarray spot quality control generates non-discrete values of spot quality which can be utilized as weights in subsequent analysis procedures as well as to discard spots of undesired quality using the suggested threshold values. The MASQOT approach provides a consistent assessment of spot quality and can be considered an alternative to the labor-intensive manual quality assessment process. PMID:16223442
Developing Reliable Telemedicine Platforms with Unreliable and Limited Communication Bandwidth
2017-10-01
hospital health care, the benefit of high-resolution medical data is greatly limited in battlefield or natural disaster areas, where communication to...sampling rate. For high-frequency data like waveforms, the downsampling approach could directly reduce the amount of data. Therefore, it could be used...AFRL-SA-WP-TR-2017-0019 Developing Reliable Telemedicine Platforms with Unreliable and Limited Communication Bandwidth Peter F
Estimating Economic and Logistic Utility of Connecting to Unreliable Power Grids
2016-06-17
the most unreliable host nation grids almost always have a higher availability than solar photovoltaics (PV), which for most parts of the world will...like solar, and still design a facility energy architecture that benefits from that source when available. Index Terms—facilities management, energy...Maintenance PV Photovoltaic SAIDI System Average Interruption Duration Index SAIFI System Average Interruption Frequency Index SHP Simplified Host
Winter, S; Smith, A; Lappin, D; McDonagh, G; Kirk, B
2017-12-01
Dental handpieces are required to be sterilized between patient use. Vacuum steam sterilization processes with fractionated pre/post-vacuum phases or unique cycles for specified medical devices are required for hollow instruments with internal lumens to assure successful air removal. Entrapped air will compromise achievement of required sterilization conditions. Many countries and professional organizations still advocate non-vacuum sterilization processes for these devices. To investigate non-vacuum downward/gravity displacement, type-N steam sterilization of dental handpieces, using thermometric methods to measure time to achieve sterilization temperature at different handpiece locations. Measurements at different positions within air turbines were undertaken with thermocouples and data loggers. Two examples of widely used UK benchtop steam sterilizers were tested: a non-vacuum benchtop sterilizer (Little Sister 3; Eschmann, Lancing, UK) and a vacuum benchtop sterilizer (Lisa; W&H, Bürmoos, Austria). Each sterilizer cycle was completed with three handpieces and each cycle in triplicate. A total of 140 measurements inside dental handpiece lumens were recorded. The non-vacuum process failed (time range: 0-150 s) to reliably achieve sterilization temperatures within the time limit specified by the international standard (15 s equilibration time). The measurement point at the base of the handpiece failed in all test runs (N = 9) to meet the standard. No failures were detected with the vacuum steam sterilization type B process with fractionated pre-vacuum and post-vacuum phases. Non-vacuum downward/gravity displacement, type-N steam sterilization processes are unreliable in achieving sterilization conditions inside dental handpieces, and the base of the handpiece is the site most likely to fail.
Use of multivariate statistics to identify unreliable data obtained using CASA.
Martínez, Luis Becerril; Crispín, Rubén Huerta; Mendoza, Maximino Méndez; Gallegos, Oswaldo Hernández; Martínez, Andrés Aragón
2013-06-01
In order to identify unreliable data in a dataset of motility parameters obtained from a pilot study acquired by a veterinarian with experience in boar semen handling, but without experience in the operation of a computer-assisted sperm analysis (CASA) system, a multivariate graphical and statistical analysis was performed. Sixteen boar semen samples were aliquoted and then incubated with varying concentrations of progesterone, from 0 to 3.33 µg/ml, and analyzed in a CASA system. After standardization of the data, Chernoff faces were drawn for each measurement, and principal component analysis (PCA) was used to reduce the dimensionality and pre-process the data before hierarchical clustering. The first twelve individual measurements showed abnormal features when the Chernoff faces were drawn. PCA revealed that principal components 1 and 2 explained 63.08% of the variance in the dataset. Values of the principal components for each individual measurement were mapped to identify differences among treatments or among boars. Twelve individual measurements presented low values of principal component 1. Confidence ellipses on the map of principal components showed no statistically significant effects of treatment or boar. Hierarchical clustering performed on the first two principal components produced three clusters. Cluster 1 contained the evaluations of the first two samples in each treatment, each from a different boar. With the exception of one individual measurement, all other measurements in cluster 1 were the same as those observed as abnormal Chernoff faces. The unreliable data in cluster 1 are probably related to the operator's inexperience with the CASA system. These findings could be used to objectively evaluate the skill level of a CASA system operator, which may be particularly useful in the quality control of semen analysis using CASA systems.
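The screening workflow described above (standardize, run PCA, then cluster hierarchically on the leading components and inspect the cluster holding the suspect measurements) can be sketched as follows. This is a minimal illustration on synthetic data, not the boar-semen measurements; the 48×8 matrix, the shift applied to the first 12 rows, and the choice of three clusters are assumptions that merely mirror the shape of the study's result.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)

# Synthetic stand-in for a CASA dataset: 48 measurements x 8 motility
# variables, with the first 12 rows shifted to mimic unreliable measurements.
data = rng.normal(size=(48, 8))
data[:12] += 3.0

# Standardize each variable (z-scores), as is done before drawing Chernoff faces.
z = (data - data.mean(axis=0)) / data.std(axis=0)

# PCA via SVD of the standardized matrix; `explained` is each PC's variance share.
u, s, vt = np.linalg.svd(z, full_matrices=False)
explained = s**2 / np.sum(s**2)
scores = z @ vt.T  # principal-component scores, one row per measurement

# Ward hierarchical clustering on the first two principal components.
labels = fcluster(linkage(scores[:, :2], method="ward"), t=3, criterion="maxclust")

# Flag the cluster that captures most of the suspect (shifted) measurements.
suspect = np.bincount(labels[:12]).argmax()
flagged = np.where(labels == suspect)[0]
```

With a clear separation between reliable and unreliable measurements, the flagged cluster recovers essentially all of the shifted rows, which is the pattern the authors used to attribute cluster 1 to operator inexperience.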
ERIC Educational Resources Information Center
Cook, Thomas D.; Steiner, Peter M.; Pohl, Steffi
2009-01-01
This study uses within-study comparisons to assess the relative importance of covariate choice, unreliability in the measurement of these covariates, and whether regression or various forms of propensity score analysis are used to analyze the outcome data. Two of the within-study comparisons are of the four-arm type, and many more are of the…
Chen, Hui-Ya; Chang, Hsiao-Yun; Ju, Yan-Ying; Tsao, Hung-Ting
2017-06-01
Rhythmic gymnasts specialise in dynamic balance under sensory conditions involving numerous somatosensory, visual, and vestibular stimulations. This study investigated whether adolescent rhythmic gymnasts are superior to their peers in Sensory Organisation Test (SOT) performance, which quantifies the ability to maintain standing balance in six sensory conditions, and explored whether they plateaued faster during familiarisation with the SOT. Three and six sessions of SOTs were administered to 15 female rhythmic gymnasts (15.0 ± 1.8 years) and matched peers (15.1 ± 2.1 years), respectively. The gymnasts were superior to their peers on fitness measures, and their SOT equilibrium scores were better when visual information was unreliable. SOT learning effects appeared in the more challenging sensory conditions between Sessions 1 and 2 and were equivalent in both groups; over time, however, the gymnasts gained a marginally significant advantage in visual ability and relied less on vision when it was unreliable. In conclusion, adolescent rhythmic gymnasts generally have the same sensory organisation ability and learning rates as their peers. However, when visual information is unreliable, they have superior sensory organisation ability and learn faster to rely less on visual sense.
NASA Astrophysics Data System (ADS)
Costa, K.; Faith, E. S.; McManus, J. F.
2017-12-01
Deep-sea sediment mixing by bioturbation is ubiquitous on the seafloor, and it can be an important influence on the fidelity of paleoceanographic records. Bioturbation can be difficult to quantify, especially in the past, but diffusive models based on radioactive tracer profiles have provided a relatively successful approach. Stable isotope and radiocarbon data from five different foraminiferal species from sediment on the Juan de Fuca Ridge, Northeast Pacific, have previously identified age plateaus that correspond to peak foraminiferal abundances, related to assemblage shifts and carbonate preservation changes since the last glacial period. Here we present size-specific foraminiferal assemblages and over 100 radiocarbon dates to better constrain the effects of bioturbation on fossil chronometers. N. pachyderma is the dominant species in the 150-212 µm fraction, while G. bulloides is the dominant species in all other size fractions (212-250 µm, 250-300 µm, 300-355 µm). The foraminiferal assemblage of the 212-300 µm fraction is found to be representative of the entire adult foraminiferal population >150 µm. Size-specific radiocarbon analyses on G. bulloides demonstrate that larger specimens are generally younger than smaller specimens, but all sizes are susceptible to abundance-peak age plateaus. The bias toward younger ages in larger specimens may reflect their greater susceptibility to fragmentation during prolonged bioturbation, so that the influence of abundance peaks is shorter-lived in those size fractions. When foraminiferal abundance peaks are unavoidable, e.g. due to large shifts in carbonate preservation, we suggest that larger foraminifera may provide a more accurate chronometer for sediment age.
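Diffusive mixing models of the kind mentioned above are often summarized, in their simplest steady-state form with no sediment accumulation term, by the textbook profile A(z) = A0·exp(−z·√(λ/Db)). The sketch below evaluates that relation with illustrative values: the decay constant is 210Pb's, but Db and A0 are assumptions for demonstration, not values from this study.

```python
import math

lam = math.log(2) / 22.3   # 210Pb decay constant, 1/yr (half-life 22.3 yr)
Db = 10.0                  # assumed biodiffusion coefficient, cm^2/yr
A0 = 50.0                  # assumed surface excess activity, arbitrary units

def excess_activity(z_cm):
    """Steady-state excess activity at depth z within a bioturbated layer,
    ignoring sedimentation: A(z) = A0 * exp(-z * sqrt(lam / Db))."""
    return A0 * math.exp(-z_cm * math.sqrt(lam / Db))

# Tracer profile at 0, 2, ..., 10 cm; fitting such a decline to measured
# activities is how a mixing coefficient Db is typically estimated.
profile = [excess_activity(z) for z in range(0, 11, 2)]
```

In practice the fit runs the other way: measured activity-depth data constrain Db, which then quantifies how strongly bioturbation has smeared the fossil chronometers discussed above.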
Application of Methods of Numerical Analysis to Physical and Engineering Data.
1980-10-15
directed algorithm would seem to be called for. However, I(0) is itself a random process, making its gradient too unreliable for such a sensitive algorithm...radiation energy on the detector. Active laser systems, on the other hand, have now created the possibility of extremely narrow passband systems...emitted by the earth and its atmosphere. The broad spectral range was selected so that the field of view of the detector could be narrowed to obtain
Engineering Amorphous Systems, Using Global-to-Local Compilation
NASA Astrophysics Data System (ADS)
Nagpal, Radhika
Emerging technologies are making it possible to assemble systems that incorporate myriad information-processing units at almost no cost: smart materials, self-assembling structures, vast sensor networks, pervasive computing. How does one engineer robust, prespecified global behavior from the local interactions of immense numbers of unreliable parts? We discuss organizing principles and programming methodologies that have emerged from Amorphous Computing research and that allow us to compile a specification of global behavior into a robust program for local behavior.
The Search for, Recovery, and Positive Identification of a Vietnam-Era U.S. Army Soldier
2010-02-01
Criminal Justice, College at Brockport, SUNY Brockport, NY 14420. Central Identification Laboratory (CIL), Joint POW/MIA Accounting Command (JPAC)...most consistent with Mongoloid ancestry during the subsequent joint forensic review (JFR), which involved American and Vietnamese forensic experts...described by analysts as "unreliable," apparently because of his nervous behavior. One of the other four witnesses turned over a wristwatch and a dog
Information Collection using Handheld Devices in Unreliable Networking Environments
2014-06-01
different types of mobile devices that connect wirelessly to a database server. The actual backend database is not important to the mobile clients...Google's infrastructure and local servers with MySQL and PostgreSQL on the backend (ODK 2014b). (2) Google Fusion Tables are used to do basic link...how we conduct business. Our requirements to share information do not change simply because there is little or no existing infrastructure in our
Manufacturing process scale-up of optical grade transparent spinel ceramic at ArmorLine Corporation
NASA Astrophysics Data System (ADS)
Spilman, Joseph; Voyles, John; Nick, Joseph; Shaffer, Lawrence
2013-06-01
While transparent spinel ceramic's mechanical and optical characteristics are ideal for many ultraviolet (UV), visible, short-wave infrared (SWIR), mid-wave infrared (MWIR), and multispectral sensor window applications, commercial adoption of the material has been hampered because it has historically been available only in relatively small sizes (one square foot per window or less) and low volumes, with unreliable supply and quality. Recent efforts, most notably by Technology Assessment and Transfer (TA&T), have scaled up manufacturing processes and demonstrated the capability to produce larger windows on the order of two square feet, but with limited output not suitable for production-type programs. ArmorLine Corporation licensed the hot-pressed spinel manufacturing know-how of TA&T in 2009 with the goal of building the world's first dedicated full-scale spinel production facility, enabling the supply of a reliable and sufficient volume of large transparent-armor and optical-grade spinel plates. With over $20 million of private investment by J.F. Lehman and Company, ArmorLine has installed and commissioned the largest vacuum hot press in the world, the largest high-temperature/high-pressure hot isostatic press in the world, and supporting manufacturing processes within 75,000 square feet of manufacturing space. ArmorLine's equipment is capable of producing window blanks as large as 50" x 30", and the facility is capable of producing substantial volumes of material with its Lean configuration and 24/7 operation. Initial production capability was achieved in 2012. ArmorLine will discuss the challenges encountered during scale-up of the manufacturing processes, ArmorLine optical-grade spinel optical performance, and provide an overview of the facility and its capabilities.
Lebel, Etienne P; Paunonen, Sampo V
2011-04-01
Implicit measures have contributed to important insights in almost every area of psychology. However, various issues and challenges remain concerning their use, one of which is their considerable variation in reliability, with many implicit measures having questionable reliability. The goal of the present investigation was to examine an overlooked consequence of this liability with respect to replication, when such implicit measures are used as dependent variables in experimental studies. Using a Monte Carlo simulation, the authors demonstrate that a higher level of unreliability in such dependent variables is associated with substantially lower levels of replicability. The results imply that this overlooked consequence can have far-reaching repercussions for the development of a cumulative science. The authors recommend the routine assessment and reporting of the reliability of implicit measures and also urge the improvement of implicit measures with low reliability.
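A minimal version of such a Monte Carlo can be sketched as follows. The effect size, group size, and reliability values here are illustrative assumptions, not the parameters the authors used. Observed scores follow the classical true-score model, so lowering reliability attenuates the observed effect and, with it, the fraction of simulated replications that reach significance.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)

def replication_rate(reliability, d=0.5, n=50, sims=2000, alpha=0.05):
    """Fraction of simulated two-group experiments reaching p < alpha when
    the dependent variable has the given reliability (classical true-score
    model: observed = sqrt(r)*true + sqrt(1-r)*error, keeping variance 1)."""
    hits = 0
    for _ in range(sims):
        true_a = rng.normal(d, 1, n)   # treatment group, true effect d
        true_b = rng.normal(0, 1, n)   # control group
        obs_a = np.sqrt(reliability) * true_a + np.sqrt(1 - reliability) * rng.normal(0, 1, n)
        obs_b = np.sqrt(reliability) * true_b + np.sqrt(1 - reliability) * rng.normal(0, 1, n)
        if ttest_ind(obs_a, obs_b).pvalue < alpha:
            hits += 1
    return hits / sims

rate_high = replication_rate(0.90)  # a highly reliable implicit measure
rate_low = replication_rate(0.30)   # a measure with questionable reliability
```

Even with an identical true effect, the unreliable measure replicates far less often, which is the paper's core point about cumulative science.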
HIV/AIDS and access to water: A case study of home-based care in Ngamiland, Botswana
NASA Astrophysics Data System (ADS)
Ngwenya, B. N.; Kgathi, D. L.
This case study investigates access to potable water in HIV/AIDS related home-based care households in five rural communities in Ngamiland, Botswana. Primary data collected from five villages consisted of two parts. The first survey collected household data on demographic and rural livelihood features and impacts of HIV/AIDS. A total of 129 households were selected using a two-stage stratified random sampling method. In the second survey, a total of 39 family primary and community care givers of continuously ill, bed-ridden or non-bed-ridden HIV/AIDS patients were interviewed. A detailed questionnaire, with closed and open-ended questions, was used to collect household data. In addition to using the questionnaire, data were also collected through participant observation, informal interviews and secondary sources. The study revealed that there are several sources of water for communities in Ngamiland such as off-plot, outdoor (communal) and on-plot outdoor and/or indoor (private) water connections, as well as other sources such as bowsed water, well-points, boreholes and open perennial/ephemeral water from river channels and pans. There was a serious problem of unreliable water supply caused by, among other things, the breakdown of diesel-powered water pumps, high frequency of HIV/AIDS related absenteeism, and the failure of timely delivery of diesel fuel. Some villages experienced chronic supply disruptions while others experienced seasonal or occasional water shortages. Strategies for coping with unreliability of water supply included economizing on water, reserve storage, buying water, and collection from river/dug wells or other alternative sources such as rain harvesting tanks in government institutions. The unreliability of water supply resulted in an increase in the use of water of poor quality and other practices of poor hygiene as well as a high opportunity cost of water collection. 
In such instances, bathing of patients was cut from twice daily to once or not at all. Depending on the severity of HIV/AIDS-related symptoms, e.g. diarrhoea, 20-80 additional litres of water could be required daily. The case study demonstrates that, at the individual level, access to water is an integral element of the patient's holistic healing process and psychosocial well-being. At the household and community levels, access to sufficient supplies of potable water when and where it is needed is central to the mitigation of HIV/AIDS impacts. Access to water should therefore not be treated strictly as an economic good, given its importance as a basic human need, a social good and indeed a human right.
Exploring the Feasibility of Reputation Models for Improving P2P Routing under Churn
NASA Astrophysics Data System (ADS)
Sànchez-Artigas, Marc; García-López, Pedro; Herrera, Blas
Reputation mechanisms help peer-to-peer (P2P) networks to detect and avoid unreliable or uncooperative peers. Recently, it has been discussed that routing protocols can be improved by conditioning routing decisions to the past behavior of forwarding peers. However, churn — the continuous process of node arrival and departure — may severely hinder the applicability of rating mechanisms. In particular, short lifetimes mean that reputations are often generated from a small number of transactions.
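The point about small transaction counts can be made concrete with a simple Bayesian sketch (our illustration, not the authors' model): under a uniform Beta prior, a peer's estimated success probability after a handful of transactions carries a far wider credible interval than the same observed success rate over many transactions.

```python
from scipy.stats import beta

def reputation_interval(successes, failures, level=0.95):
    """Credible interval for a peer's forwarding-success probability
    under a uniform Beta(1, 1) prior."""
    return beta.interval(level, 1 + successes, 1 + failures)

# Same 75% observed success rate, very different transaction histories:
short_lo, short_hi = reputation_interval(3, 1)    # short-lived peer, 4 transactions
long_lo, long_hi = reputation_interval(60, 20)    # long-lived peer, 80 transactions
```

The short-lived peer's interval spans most of [0, 1], which is why ratings built from a few transactions under churn are poor predictors of future forwarding behavior.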
Russia and NATO Missile Defense: The European Phased Adaptive Approach Experience, 2009-2017
2018-03-01
the Manhattan Project, the U.S. effort to develop the atomic bomb. Donald L. Hafner observes, "In 1941, two years after crucial scientific work by...Szilard and Fermi suggested an atomic bomb might be feasible, the Manhattan Project began its task with promise from the project advocates that a...German V-2 rocket. The V-1 flying bomb preceded the V-2 rocket and was only somewhat unreliable but was able to save on both fuel and air crews.
Local Free-Space Mapping and Path Guidance for Mobile Robots.
1988-03-01
Technical Document 1227, March 1988. Local Free-Space Mapping and Path Guidance for Mobile Robots. William T. Gex and Nancy L. Campbell. Contents include: Description of Robot System; Free-Space Mapping; Map Construction; Mapping Example; Sensor Unreliability; Path Guidance.
Is Q for Quantum? From Quantum Mechanics to Formation of the Solar System
NASA Technical Reports Server (NTRS)
Wilson, T. L.; Mittlefehldt, D. W.
2006-01-01
The realization in 1985 that fullerenes exist in nature [1] as a third form of carbon-carbon clustering, continues to inspire new areas of research. In particular, the study of closed-cage endohedral fullerenes [2-6] is of scientific interest because of its potential application in a number of promising fields from medical imaging to astrophysics. One of these is to provide a possible chronometer for studying the age and origin of certain astromaterials in the solar system. Fullerenes are closed carbon cages that are fundamentally related to a long-standing debate over the "Q-Phase" origin of planetary noble gases in carbonaceous chondrites [7]. Although Q-phase has been identified as the carrier of planetary noble gases [8- 10], its physical nature has not been explained. Our limited understanding of it is based primarily on the laboratory chemical processing which it survives as well as the fact that it must have been widely distributed in the solar nebula [11]. Yet as important as it might be while preoccupying some 30 years of research, the question of what actually is Q-phase remains unresolved.
Sm-Nd, Rb-Sr, and Mn-Cr Ages of Yamato 74013
NASA Technical Reports Server (NTRS)
Nyquist, L. E.; Shih, C.-Y.; Reese, Y. D.
2009-01-01
Yamato 74013 is one of 29 paired diogenites having granoblastic textures. The Ar-39 - Ar-40 age of Y-74097 is approximately 1100 Ma. Rb-Sr and Sm-Nd analyses of Y-74013, -74037, -74097, and -74136 suggested that multiple young metamorphic events disturbed their isotopic systems. Masuda et al. reported that REE abundances were heterogeneous even within the same sample (Y-74010) for sample sizes less than approximately 2 g. Both they and Nyquist et al. reported data for some samples showing significant LREE enrichment. In addition to its granoblastic texture, Y-74013 is characterized by large, isolated clots of chromite up to 5 mm in diameter. Takeda et al. suggested that these diogenites originally represented a single or very small number of coarse orthopyroxene crystals that were recrystallized by shock processes. They further suggested that initial crystallization may have occurred very early within the deep crust of the HED parent body. Here we report the chronology of Y-74013 as recorded in chronometers based on long-lived Rb-87 and Sm-147, intermediate-lived Sm-146, and short-lived Mn-53.
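The long-lived chronometers named here all rest on the same standard decay relation; as textbook background (not taken from this abstract), the Rb-Sr isochron equation reads:

```latex
\left(\frac{^{87}\mathrm{Sr}}{^{86}\mathrm{Sr}}\right)_{\mathrm{meas}}
  = \left(\frac{^{87}\mathrm{Sr}}{^{86}\mathrm{Sr}}\right)_{0}
  + \frac{^{87}\mathrm{Rb}}{^{86}\mathrm{Sr}}\left(e^{\lambda t}-1\right)
```

Cogenetic samples with different Rb/Sr ratios define a line (the isochron) whose slope, e^{λt} − 1, yields the closure age t; the Sm-147/Nd-143 system follows the same form with its own decay constant.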
NASA Technical Reports Server (NTRS)
Tatsunori, T.; Misawa, K.; Okano, O.; Shih, C.-Y.; Nyquist, L. E.; Simon, J. I.; Tappa, M. J.; Yoneda, S.
2015-01-01
Radiogenic ingrowth of Ca-40 due to decay of K-40 occurred early in solar system history, causing the Ca-40 abundance to vary among different early-formed reservoirs. Marshall and DePaolo demonstrated that the K-40/Ca-40 decay system could be a useful radiogenic tracer for studies of terrestrial rocks. Shih et al. [3,4] determined K-40/Ca-40 ages of lunar granitic rock fragments and discussed the chemical characteristics of their source materials. Recently, Yokoyama et al. [5] showed the application of the K-40/Ca-40 chronometer to high-K/Ca materials in ordinary chondrites (OCs). High-precision calcium isotopic data are needed to constrain mixing processes among early solar system materials and the time of planetesimal formation. To better constrain the solar system calcium isotopic compositions among astromaterials, we have determined the calcium isotopic compositions of OCs and an angrite. We further estimated a source K/Ca ratio for alkali-rich fragments in a chondritic breccia using the estimated solar system initial Ca-40/Ca-44.
Contamination and Radiation Effects on Nonlinear Crystals for Space Laser Systems
NASA Technical Reports Server (NTRS)
Abdeldayem, Hossain A.; Dowdye, Edward; Jamison, Tracee; Canham, John; Jaeger, Todd
2005-01-01
Space lasers are vital tools for NASA's space missions and military applications. Although lasers are highly reliable on the ground, several past space laser missions proved to be short-lived and unreliable. In this communication, we shed more light on the contamination and radiation issues that are the most common causes of optical damage and laser failure in space. First, we present results based on the study of liquids and then correlate these results to the particulates of the laser system environment. We present a model explaining how the laser beam traps contaminants against optical surfaces and causes optical damage, and the role of gravity in the process. We also report the second-harmonic-generation efficiency of nonlinear optical crystals irradiated with high-energy proton beams. In addition, we propose employing the technique of adsorption to minimize the presence of adsorbing molecules in the laser compartment.
Kühn, Simone; Fernyhough, Charles; Alderson-Day, Benjamin; Hurlburt, Russell T.
2014-01-01
To provide full accounts of human experience and behavior, research in cognitive neuroscience must be linked to inner experience, but introspective reports of inner experience have often been found to be unreliable. The present case study aimed at providing proof of principle that introspection using one method, descriptive experience sampling (DES), can be reliably integrated with fMRI. A participant was trained in the DES method, followed by nine sessions of sampling within an MRI scanner. During moments where the DES interview revealed ongoing inner speaking, fMRI data reliably showed activation in classic speech processing areas including left inferior frontal gyrus. Further, the fMRI data validated the participant’s DES observations of the experiential distinction between inner speaking and innerly hearing her own voice. These results highlight the precision and validity of the DES method as a technique of exploring inner experience and the utility of combining such methods with fMRI. PMID:25538649
Kidd, Celeste; Palmeri, Holly; Aslin, Richard N
2013-01-01
Children are notoriously bad at delaying gratification to achieve later, greater rewards (e.g., Piaget, 1970), and some are worse at waiting than others. Individual differences in the ability to wait have been attributed to self-control, in part because of evidence that long-delayers are more successful in later life (e.g., Shoda, Mischel, & Peake, 1990). Here we provide evidence that, in addition to self-control, children's wait-times are modulated by an implicit, rational decision-making process that considers environmental reliability. We tested children (M = 4;6, N = 28) using a classic paradigm, the marshmallow task (Mischel, 1974), in an environment demonstrated to be either unreliable or reliable. Children in the reliable condition waited significantly longer than those in the unreliable condition (p < 0.0005), suggesting that children's wait-times reflected reasoned beliefs about whether waiting would ultimately pay off. Thus, wait-times on sustained delay-of-gratification tasks (e.g., the marshmallow task) may not only reflect differences in self-control abilities, but also beliefs about the stability of the world. Copyright © 2012 Elsevier B.V. All rights reserved.
Bellucci, L G; Frignani, M; Cochran, J K; Albertazzi, S; Zaggia, L; Cecconi, G; Hopkins, H
2007-01-01
Five salt marsh sediment cores from different parts of the Venice Lagoon were studied to determine their depositional history and its relationship with the environmental changes that occurred during the past approximately 100 years. X-radiographs of the cores show no disturbance related to particle mixing. Accretion rates were calculated using a constant-flux model applied to excess (210)Pb distributions in the cores. The record of (137)Cs fluxes to the sites, determined from (137)Cs profiles and the (210)Pb chronologies, shows inputs from the global fallout of (137)Cs in the late 1950s to early 1960s and from the Chernobyl accident in 1986. Average accretion rates in the cores are comparable to the long-term average rate of mean sea-level rise in the Venice Lagoon (approximately 0.25 cm y(-1)), except for a core collected in a marsh presumably affected by inputs from the Dese River. Short-term variations in accretion rate are correlated with the cumulative frequency of flooding, as determined from records of Acqua Alta, in four of the five cores, suggesting that variations in the phenomena causing flooding (such as wind patterns, storm frequency and the NAO) are short-term driving forces for variations in marsh accretion rate.
Cooperation Survives and Cheating Pays in a Dynamic Network Structure with Unreliable Reputation
NASA Astrophysics Data System (ADS)
Antonioni, Alberto; Sánchez, Angel; Tomassini, Marco
2016-06-01
In a networked society like ours, reputation is an indispensable tool to guide decisions about social or economic interactions with individuals otherwise unknown. Usually, information about prospective counterparts is incomplete, often being limited to an average success rate. Uncertainty on reputation is further increased by fraud, which is increasingly becoming a cause of concern. To address these issues, we have designed an experiment based on the Prisoner’s Dilemma as a model for social interactions. Participants could spend money to have their observable cooperativeness increased. We find that the aggregate cooperation level is practically unchanged, i.e., global behavior does not seem to be affected by unreliable reputations. However, at the individual level we find two distinct types of behavior, one of reliable subjects and one of cheaters, where the latter artificially fake their reputation in almost every interaction. Cheaters end up being better off than honest individuals, who not only keep their true reputation but are also more cooperative. In practice, this results in honest subjects paying the costs of fraud as cheaters earn the same as in a truthful environment. These findings point to the importance of ensuring the truthfulness of reputation for a more equitable and fair society.
NASA Technical Reports Server (NTRS)
Yin, Qingzhu; Jacobsen, Stein B.
2003-01-01
The utility of the Hf-182-W-182 chronometer (Hf-182 mean life approximately 13 x 10^6 yr) for early solar system processes is now well established. At the 2002 LPSC meeting we first reported new Hf-W data for chondritic meteorites showing that some crucial data, as well as interpretations of Lee and Halliday for chondrites, were incorrect. Our results were confirmed by reports of two other groups. These new data imply a much shorter timescale for early Solar System evolution and the formation of the Earth's core, more consistent with the original conclusions of Harper and Jacobsen. Thus, the chondritic Hf-W evolution is now well established as beginning with epsilon_W(0) = -3.45 +/- 0.25 at the time of origin of the solar system and evolving to -2.2 by 20 Myr and -1.9 +/- 0.20 at present. However, a number of iron meteorite data suggest the existence of initial W lower than that measured for chondrites. If the low epsilon_W(0) values of -4 to -5 are correct, then we face an embarrassing dilemma: either differentiated iron meteorites are older than the primitive chondrites, or we would have to conclude that there is an additional pre-history of 5-10 Myr in primitive chondritic meteorites prior to the closure of the Hf-182 - W-182 system. Such a prolonged early time does not seem reasonable to us. We have therefore initiated a study to resolve this issue.
The success and complementarity of Sm-Nd and Lu-Hf garnet geochronology
NASA Astrophysics Data System (ADS)
Baxter, E. F.; Scherer, E. E.
2013-12-01
Garnet's potential as a direct chronometer of tectonometamorphic processes and conditions was first realized over 30 years ago. Since then, the Sm-Nd and Lu-Hf systems have emerged as the most effective, with both permitting age precision of better than ±1 Myr. Both have proven successful not merely in dating garnet growth itself, but rather in constraining the ages, durations, and rates of particular earth processes or conditions that can be directly linked to garnet growth via chemical, thermodynamic, or petrographic means. Appreciating important differences between Sm-Nd and Lu-Hf in terms of contaminant phases, partitioning, daughter-element diffusivity, and isotopic analysis makes these two systems powerfully complementary when used and interpreted in concert. Well-established, robust analytical methods mitigate the effects of ubiquitous mineral inclusions (monazite is most significant for Sm-Nd; zircon is most significant for Lu-Hf), improving the precision and accuracy of garnet dates from both systems. Parent-daughter ratios tend to be higher for Lu-Hf, leading to the potential for better age precision in general. The Lu-176 decay rate is faster than that of Sm-147, meaning that Lu-Hf provides better age-precision potential for young (Cenozoic) samples. However, Sm-Nd provides better precision potential for older (Precambrian) samples, primarily because of the higher precisions on the parent-daughter ratios (i.e., 147Sm/144Nd) that can be achieved by ID-TIMS analysis. For dating microsampled zones or growth rings in single garnet crystals, Sm-Nd has proven most successful owing to the more uniform distribution of Sm and to established methods for measuring <10 ng quantities of Nd at high precision via TIMS. However, new MC-ICP-MS sample-introduction technologies are closing this gap for small samples. For analyses of bulk garnet that grew over a protracted interval, Lu-Hf dates are expected to be older than Sm-Nd dates owing to differences in Lu and Sm zonation (i.e., Lu tends to be strongly sequestered in garnet cores, whereas Sm is more evenly distributed). Thus, Lu-Hf is often useful for targeting nucleation times (and earliest growth zones), whereas Sm-Nd is preferable when targeting the mid- to later stages (and outermost growth zones) of garnet growth. Depending on grain size and heating duration, most garnets can retain their primary growth chronology up to about 700 °C, and thus they are one of the few metamorphic minerals that faithfully record prograde metamorphic processes and conditions. For granulite-facies rocks (e.g., above about 700 °C), the higher retentivity (i.e., slower diffusivity) of Hf relative to Nd can lead to older Lu-Hf (growth) ages compared with Sm-Nd (partially reset) dates for the same sample. Finally, as with all geochronometers, decay-constant uncertainties and sources of systematic error in methods (e.g., spike calibrations) should be considered when comparing absolute Lu-Hf and Sm-Nd dates to each other or to other chronometers.
A view into crustal evolution at mantle depths
NASA Astrophysics Data System (ADS)
Kooijman, Ellen; Smit, Matthijs A.; Ratschbacher, Lothar; Kylander-Clark, Andrew R. C.
2017-05-01
Crustal foundering is an important mechanism in the differentiation and recycling of continental crust. Nevertheless, little is known about the dynamics of the lower crust, the temporal scale of foundering, and its role in the dynamics of active margins and orogens. This particularly applies to active settings where the lower crust is typically still buried and direct access is not possible. Crustal xenoliths derived from mantle depth in the Pamir provide a unique exception to this. The rocks are well-preserved and comprise a diverse set of lithologies, many of which re-equilibrated at high-pressure conditions before being erupted in their ultrapotassic host lavas. In this study, we explore the petrological and chronological record of eclogite and felsic granulite xenoliths. We utilized accessory minerals - zircon, monazite and rutile - for coupled in-situ trace-element analysis and U-(Th-)Pb chronology by laser-ablation (split-stream) inductively coupled plasma mass spectrometry. Each integrated analysis was done on single mineral zones and was performed in-situ in thin section to maintain textural context and the ability to interpret the data in this framework. Rutile thermo-chronology exclusively reflects eruption (11.17 ± 0.06 Ma), which demonstrates the reliability of the U-Pb rutile thermo-chronometer and its ability to date magmatic processes. Conversely, zircon and monazite reveal a series of discrete age clusters between 55-11 Ma, with the youngest being identical to the age of eruption. Matching age populations between samples, despite a lack of overlapping ages for different chronometers within samples, exhibit the effectiveness of our multi-mineral approach. The REE systematics and age data for zircon and monazite, and Ti-in-zircon data together track the history of the rocks at a million-year resolution.
The data reveal that the rocks resided at 30-40 km depth along a stable continental geotherm at 720-750 °C until 24-20 Ma, and were subsequently melted, densified, and buried to 80-90 km depth - 20 km deeper than the present-day Moho - at 930 ± 35 °C. The material descended rapidly, accelerating from 0.9-1.7 mm yr-1 to 4.7-5.8 mm yr-1 within 10-12 Myr, and continued descending after reaching mantle depth at 14-13 Ma. The data reflect the foundering of differentiated deep-crustal fragments (2.9-3.5 g cm-3) into a metasomatized and less dense mantle wedge. Through this new approach to constraining the burial history of rocks, we provide the first time-resolved record of this crustal-recycling process. Foundering introduced vestiges of old, evolved crust into the mantle wedge over a relatively short period (c. 10 Myr). The recycling process could explain the variability in the degree of crustal contamination of mantle-derived magmatic rocks in the Pamir and neighboring Tibet during the Cenozoic without requiring a change in plate dynamics or source region.
NASA Astrophysics Data System (ADS)
Peters, Stefan T. M.; Münker, Carsten; Pfeifer, Markus; Elfers, Bo-Magnus; Sprung, Peter
2017-02-01
Some nuclides that were produced in supernovae are heterogeneously distributed between different meteoritic materials. In some cases these heterogeneities have been interpreted as the result of interaction between ejecta from a nearby supernova and the nascent solar system. Particularly in the case of the oldest objects that formed in the solar system - Ca-Al rich inclusions (CAIs) - this view has been taken to confirm the hypothesis that a nearby supernova event facilitated or even triggered solar system formation. We present Hf isotope data for bulk meteorites, terrestrial materials and CAIs, for the first time including the low-abundance isotope 174Hf (∼0.16%). This rare isotope was likely produced during explosive O/Ne shell burning in massive stars (i.e., the classical "p-process"), and therefore its abundance potentially provides a sensitive tracer for putative heterogeneities within the solar system that were introduced by supernova ejecta. For CAIs and one LL chondrite, complementary W isotope data are also reported for the same sample cuts. Once corrected for small neutron capture effects, different chondrite groups, eucrites, a silicate inclusion of a IAB iron meteorite, and terrestrial materials display homogeneous Hf isotope compositions including 174Hf. Hafnium-174 was thus uniformly distributed, at the <50 ppm level, in the inner solar system when planetesimals formed. This finding is in good agreement with the evidently homogeneous distributions of the p-process isotopes 180W, 184Os and possibly 190Pt between different iron meteorite groups. In contrast to bulk meteorite samples, CAIs show variable depletions in p-process 174Hf with respect to the inner solar system composition, and also variable r-process (or s-process) Hf and W contributions. Based on combined Hf and W isotope compositions, we show that CAIs sampled at least one component in which the proportion of r- and s-process derived Hf and W deviates from that of supernova ejecta.
The Hf and W isotope anomalies in CAIs are therefore best explained by selective processing of presolar carrier phases prior to CAI formation, and not by a late injection of supernova materials. Likewise, other isotope anomalies in additional elements in CAIs relative to the bulk solar system may reflect the same process. The isotopic heterogeneities between the first refractory condensates may have been eradicated partially during CAI formation, because W isotope anomalies in CAIs appear to decrease with increasing W concentrations as inferred from time-integrated 182W/184W. Importantly, the 176Lu-176Hf and 182Hf-182W chronometers are not significantly affected by nucleosynthetic heterogeneity of Hf isotopes in bulk meteorites, but may be affected in CAIs.
K-Ca Dating of Alkali-Rich Fragments in the Y-74442 and Bhola LL-Chondritic Breccias
NASA Technical Reports Server (NTRS)
Yokoyama, T; Misawa, K.; Okano, O; Shih, C. -Y.; Nyquist, L. E.; Simon, J. I.; Tappa, M. J.; Yoneda, S.
2013-01-01
Alkali-rich igneous fragments in the brecciated LL-chondrites, Krahenberg (LL5) [1], Bhola (LL3-6) [2], Siena (LL5) [3] and Yamato (Y)-74442 (LL4) [4-6], show characteristic fractionation patterns of alkali and alkaline elements [7]. The alkali-rich fragments in Krahenberg, Bhola and Y-74442 are very similar in mineralogy and petrography, suggesting that they could have come from related precursor materials [6]. Recently we reported Rb-Sr isotopic systematics of alkali-rich igneous rock fragments in Y-74442: nine fragments from Y-74442 yield the Rb-Sr age of 4429 plus or minus 54 Ma (2 sigma) for lambda(Rb-87) = 0.01402 Ga(exp -1) [8] with the initial ratio of Sr-87/Sr-86 = 0.7144 plus or minus 0.0094 (2 sigma) [9]. The Rb-Sr age of the alkali-rich fragments of Y-74442 is younger than the primary Rb-Sr age of 4541 plus or minus 14 Ma for LL-chondrite whole-rock samples [10], implying that they formed after accumulation of LL-chondrite parental bodies, although enrichment may have happened earlier. Marshall and DePaolo [11,12] demonstrated that the K-40 - Ca-40 decay system could be an important chronometer as well as a useful radiogenic tracer for studies of terrestrial rocks. Shih et al. [13,14] and more recently Simon et al. [15] determined K-Ca ages of lunar granitic rocks, and showed the application of the K-Ca chronometer for K-rich planetary materials. Since alkali-rich fragments in the LL-chondritic breccias are highly enriched in K, we can expect enhancements of radiogenic Ca-40. Here, we report preliminary results of K-Ca isotopic systematics of alkali-rich fragments in the LL-chondritic breccias, Y-74442 and Bhola.
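The arithmetic connecting an isochron slope to an age in long-lived systems like Rb-Sr (and, by analogy, K-Ca) is compact enough to sketch. A minimal illustration in Python: the decay constant and the 4429 Ma age are the values quoted in the abstract above, and the round-trip is only a consistency check, not part of the authors' analysis.

```python
import math

LAMBDA_RB87 = 0.01402  # Ga^-1, decay constant quoted in the abstract

def isochron_slope(age_ga, lam=LAMBDA_RB87):
    """Isochron slope implied by an age: slope = exp(lam * t) - 1."""
    return math.expm1(lam * age_ga)

def isochron_age(slope, lam=LAMBDA_RB87):
    """Invert an isochron slope back to an age in Ga."""
    return math.log1p(slope) / lam

slope = isochron_slope(4.429)         # Rb-Sr age reported for the Y-74442 fragments
print(round(slope, 5))                # → 0.06406 (87Sr/86Sr vs 87Rb/86Sr slope)
print(round(isochron_age(slope), 3))  # → 4.429 (recovers the input age)
```

The same two functions apply to the K-40 - Ca-40 system once its decay constant and branching to Ca-40 are substituted for `LAMBDA_RB87`.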
DOE Office of Scientific and Technical Information (OSTI.GOV)
Getman, Konstantin V.; Feigelson, Eric D.; Kuhn, Michael A.
2014-06-01
A major impediment to understanding star formation in massive star-forming regions (MSFRs) is the absence of a reliable stellar chronometer to unravel their complex star formation histories. We present a new method for estimating stellar ages, Age_JX, that employs near-infrared (NIR) and X-ray photometry. Stellar masses are derived from X-ray luminosities using the L_X-M relation from the Taurus cloud. J-band luminosities are compared to mass-dependent pre-main-sequence (PMS) evolutionary models to estimate ages. Age_JX is sensitive to a wide range of evolutionary stages, from disk-bearing stars embedded in a cloud to widely dispersed older PMS stars. The Massive Young Star-Forming Complex Study in Infrared and X-ray (MYStIX) project characterizes 20 OB-dominated MSFRs using X-ray, mid-infrared, and NIR catalogs. The Age_JX method has been applied to 5525 out of 31,784 MYStIX Probable Complex Members. We provide a homogeneous set of median ages for over 100 subclusters in 15 MSFRs; median subcluster ages range between 0.5 Myr and 5 Myr. The important science result is the discovery of age gradients across MYStIX regions. The wide MSFR age distribution appears as spatially segregated structures with different ages. The Age_JX ages are youngest in obscured locations in molecular clouds, intermediate in revealed stellar clusters, and oldest in distributed populations. The NIR color index J - H, a surrogate measure of extinction, can serve as an approximate age predictor for young embedded clusters.
Disturbances in the Isotopic Record of Asuka 881394
NASA Technical Reports Server (NTRS)
Nyquist, L. E.; Bogard, D. D.
2011-01-01
Asuka 881394 is a unique achondrite with a granulitic texture, very calcic (approximately An(sub 98)) plagioclase, and pigeonite that has not inverted to orthopyroxene. First thought to be a eucrite, recent O-isotopic studies show it has a closer affinity to angrites. Initial isotopic studies provided evidence for now-extinct Al-26, Mn-53, and Sm-146. A recent study confirmed an early chronology with an absolute Pb-207 - Pb-206 age of 4566.5 +/- 0.2 Ma, a new measurement of the Al-Mg formation interval as 3.7 +/- 0.1 Ma since Al-26/Al-27 = approximately 4.63 x 10(exp -5) for the E60 CAI, and a Mn-Cr formation interval of -6.0 +/- 0.2 Ma relative to LEW86010 ("LEW"). Absolute ages relative to previously presented age anchors were 4563.4 +/- 0.2 Ma by Al-Mg and 4564.6 +/- 0.5 Ma by Mn-Cr. These ages are in good, but not perfect, agreement with the Pb-207 - Pb-206 age. Perhaps the most direct comparison of the early chronology of A881394 as determined by various workers is provided by reported Al-26/Al-27 values of (1.18 +/- 0.14), (1.28 +/- 0.07), and (2.1 +/- 0.4) x 10(exp -6). Analyses of mineral separates by TIMS and MC-ICPMS [6] agree well, but the higher value obtained by in situ SIMS analysis is significant in light of the slight inconsistency between absolute ages inferred from the short-lived chronometers and the Pb-207 - Pb-206 age. We examine the possibility that inconsistencies in the earliest fine-scale chronology of Asuka 881394 may be related to isotopic "disturbances" observed in the Ar-39 - Ar-40, Rb-87 - Sr-87, and Sm-147 - Nd-143 chronometers.
An Examination of the Neural Unreliability Thesis of Autism
Butler, John S.; Molholm, Sophie; Andrade, Gizely N.; Foxe, John J.
2017-01-01
An emerging neuropathological theory of Autism, referred to here as "the neural unreliability thesis," proposes greater variability in moment-to-moment cortical representation of environmental events, such that the system shows general instability in its impulse response function. Leading evidence for this thesis derives from functional neuroimaging, a methodology ill-suited for detailed assessment of sensory transmission dynamics occurring at the millisecond scale. Electrophysiological assessments of this thesis, however, are sparse and unconvincing. We conducted detailed examination of visual and somatosensory evoked activity using high-density electrical mapping in individuals with autism (N = 20) and precisely matched neurotypical controls (N = 20), recording large numbers of trials that allowed for exhaustive time-frequency analyses at the single-trial level. Measures of intertrial coherence and event-related spectral perturbation revealed no convincing evidence for an unreliability account of sensory responsivity in autism. Indeed, results point to robust, highly reproducible response functions marked for their exceedingly close correspondence to those in neurotypical controls. PMID:27923839
NASA Astrophysics Data System (ADS)
Lin, Yi-Kuei; Huang, Cheng-Fu
2015-04-01
From a quality of service viewpoint, the transmission packet unreliability and transmission time are both critical performance indicators in a computer system when assessing the Internet quality for supervisors and customers. A computer system is usually modelled as a network topology where each branch denotes a transmission medium and each vertex represents a station of servers. Almost every branch has multiple capacities/states due to failure, partial failure, maintenance, etc. This type of network is known as a multi-state computer network (MSCN). This paper proposes an efficient algorithm that computes the system reliability, i.e., the probability that a specified amount of data can be sent through k (k ≥ 2) disjoint minimal paths within both the tolerable packet unreliability and time threshold. Furthermore, two routing schemes are established in advance to indicate the main and spare minimal paths to increase the system reliability (referred to as spare reliability). Thus, the spare reliability can be readily computed according to the routing scheme.
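The paper's own algorithm is not reproduced above, but the core idea of combining disjoint, multi-state paths can be sketched under simplifying assumptions: demand splits evenly over the k disjoint paths, each path is reduced to a capacity distribution, and the paths are treated as statistically independent. All numbers below are illustrative, not from the paper.

```python
def path_success_prob(states, demand, t_max):
    """Probability that one path (list of (capacity, probability) states)
    can carry `demand` units within `t_max` time units."""
    return sum(p for cap, p in states if cap > 0 and demand / cap <= t_max)

def system_reliability(paths, demand, t_max):
    """Simplified sketch: demand is split evenly over k disjoint paths,
    which succeed or fail independently; all shares must be delivered."""
    share = demand / len(paths)
    r = 1.0
    for states in paths:
        r *= path_success_prob(states, share, t_max)
    return r

# Two disjoint paths, each with a multi-state capacity distribution
p1 = [(0, 0.05), (2, 0.15), (4, 0.80)]  # (capacity units/sec, probability)
p2 = [(0, 0.10), (3, 0.90)]
print(system_reliability([p1, p2], demand=6, t_max=2.0))
```

A spare routing scheme, as in the paper, would re-evaluate this product with the spare path substituted whenever a main path's share cannot be met.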
Selective social learning in infancy: looking for mechanisms.
Crivello, Cristina; Phillips, Sara; Poulin-Dubois, Diane
2018-05-01
Although there is mounting evidence that selective social learning begins in infancy, the psychological mechanisms underlying this ability are currently a controversial issue. The purpose of this study is to investigate whether theory of mind abilities and statistical learning skills are related to infants' selective social learning. Seventy-seven 18-month-olds were first exposed to a reliable or an unreliable speaker and then completed a word learning task, two theory of mind tasks, and a statistical learning task. If domain-general abilities are linked to selective social learning, then infants who demonstrate superior performance on the statistical learning task should perform better on the selective learning task, that is, should be less likely to learn words from an unreliable speaker. Alternatively, if domain-specific abilities are involved, then superior performance on theory of mind tasks should be related to selective learning performance. Findings revealed that, as expected, infants were more likely to learn a novel word from a reliable speaker. Importantly, infants who passed a theory of mind task assessing knowledge attribution were significantly less likely to learn a novel word from an unreliable speaker compared to infants who failed this task. No such effect was observed for the other tasks. These results suggest that infants who possess superior social-cognitive abilities are more apt to reject an unreliable speaker as informant. A video abstract of this article can be viewed at: https://youtu.be/zuuCniHYzqo. © 2017 John Wiley & Sons Ltd.
The beta(+) decay and cosmic-ray half-life of Mn-54
NASA Astrophysics Data System (ADS)
Dacruz, M. T. F.; Norman, E. B.; Chan, Y. D.; Garcia, A.; Larimer, R. M.; Lesko, K. T.; Stokstad, R. G.; Wietfeldt, F. E.
1993-03-01
We performed a search for the beta(+) branch of Mn-54 decay. As a cosmic ray, Mn-54, deprived of its atomic electrons, can decay only via beta(+) and beta(-) decay, with a half-life of the order of 10^6 yr. This makes Mn-54 a suitable cosmic chronometer for the study of cosmic-ray confinement times. We searched for coincident back-to-back 511-keV gamma-rays using two germanium detectors inside a NaI(Tl) annulus. An upper limit of 2 x 10^-8 was found for the beta(+) decay branch, corresponding to a lower limit of 13.7 for the log ft value.
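An upper limit on a branching ratio translates directly into a lower limit on the partial half-life of that branch. A minimal sketch of this arithmetic, assuming the laboratory (electron-capture) half-life of Mn-54 of about 312 days, which is a standard value not stated in the abstract:

```python
T_HALF_LAB_D = 312.1     # assumed laboratory (EC) half-life of Mn-54, in days
BR_BETA_PLUS_MAX = 2e-8  # upper limit on the beta+ branch from the abstract

# Partial half-life of a branch: t_partial = t_total / branching_ratio,
# so an upper limit on the branch gives a lower limit on t_partial.
t_partial_yr = (T_HALF_LAB_D / BR_BETA_PLUS_MAX) / 365.25
print(f"beta+ partial half-life > {t_partial_yr:.2e} yr")  # → > 4.27e+07 yr
```

The cosmic-ray half-life quoted in the abstract also folds in the beta(-) branch, so this single-branch bound is only one ingredient of the full confinement-time argument.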
Mkoka, Dickson Ally; Goicolea, Isabel; Kiwara, Angwara; Mwangu, Mughwira; Hurtig, Anna-Karin
2014-03-19
Provision of quality emergency obstetric care relies upon the presence of skilled health attendants working in an environment where drugs and medical supplies are available when needed and in adequate quantity and of assured quality. This study aimed to describe the experience of rural health facility managers in ensuring the timely availability of drugs and medical supplies for emergency obstetric care (EmOC). In-depth interviews were conducted with a total of 17 health facility managers: 14 from dispensaries and three from health centers. Two members of the Council Health Management Team and one member of the Council Health Service Board were also interviewed. A survey of health facilities was conducted to supplement the data. All the materials were analysed using a qualitative thematic analysis approach. Participants reported on the unreliability of obtaining drugs and medical supplies for EmOC; this was supported by the absence of essential items observed during the facility survey. The unreliability of obtaining drugs and medical supplies was reported to result in the provision of untimely and suboptimal EmOC services. An insufficient budget for drugs from central government, lack of accountability within the supply system and a bureaucratic process of accessing the locally mobilized drug fund were reported to contribute to the current situation. The unreliability of obtaining drugs and medical supplies compromises the timely provision of quality EmOC. Multiple approaches should be used to address challenges within the health system that prevent access to essential drugs and supplies for maternal health. There should be a special focus on improving the governance of the drug delivery system so that it promotes the accountability of key players, transparency in the handling of information and drug funds, and the participation of key stakeholders in decision making over the allocation of locally collected drug funds.
PMID:24646098
Clinical height measurements are unreliable: a call for improvement.
Mikula, A L; Hetzel, S J; Binkley, N; Anderson, P A
2016-10-01
Height measurements are currently used to guide imaging decisions that assist in osteoporosis care, but their clinical reliability is largely unknown. We found both clinical height measurements and electronic health record height data to be unreliable. Improvement in height measurement is needed to improve osteoporosis care. The aim of this study is to assess the accuracy and reliability of clinical height measurement in a university healthcare clinical setting. Electronic health record (EHR) review, direct measurement of clinical stadiometer accuracy, and observation of staff height measurement technique were carried out at outpatient facilities of the University of Wisconsin Hospital and Clinics. We examined 32 clinical stadiometers for reliability and observed 34 clinic staff perform height measurements at 12 outpatient primary care and specialty clinics. An EHR search identified 4711 men and women aged 43 to 89 with no known metabolic bone disease who had more than one height measurement over 3 months. The short study period and exclusion criteria were selected so that changes in recorded height would not reflect pathologic processes. Mean EHR-recorded height change (first to last measurement) was -0.02 cm (SD 1.88 cm). Eighteen percent of patients had height measurement differences noted in the EHR of ≥2 cm over 3 months. The technical error of measurement (TEM) was 1.77 cm with a relative TEM of 1.04 %. None of the staff observed performing height measurements followed all recommended height measurement guidelines. Fifty percent of clinic staff reported that they occasionally enter patient-reported height into the EHR rather than performing a measurement. When performing direct measurements on stadiometers, the mean difference from a gold standard length was 0.24 cm (SD 0.80). Nine percent of stadiometers examined had an error of >1.5 cm. Clinical height measurements and EHR-recorded height results are unreliable. Improvement in this measure is needed as an adjunct to improve osteoporosis care.
238U-Series in Fe Oxy/Hydroxides by LA-MC-ICP-MS, New Insights Into Weathering Geochronology
NASA Astrophysics Data System (ADS)
Bernal, J.; McCulloch, M.; Eggins, S.; Grun, R.; Eggleton, R.
2003-12-01
The establishment of a geochronological framework for weathering processes is essential for an understanding of the evolution of the regolith and its dynamics. However, there are few robust answers regarding the absolute age of weathering and its rates. To date, 40Ar/39Ar analyses of Mn-oxides (cryptomelane) and K-bearing secondary sulphates have provided one of the few generally reliable chronometers (e.g., [1]), but they are restricted to high-K secondary phases. This work presents a different approach to obtaining geochronological information from weathering minerals, namely measurement of 238U-series disequilibria in authigenic Fe oxy/hydroxides. These are potentially useful recorders of weathering processes as they commonly occur as weathering products and have high affinity towards dissolved uranyl complexes. Furthermore, U-Th fractionation during weathering has been extensively reported [2], effectively resetting the U/230Th geochronometer. LA-MC-ICP-MS facilitates in situ measurement of 238U-series disequilibria in authigenic microcrystalline iron oxy/hydroxides (precipitated between cracks and veins in partially and heavily weathered chlorite-muscovite schist) and pisoliths (ferruginous concretions). Contrary to previous studies [e.g., 3], in situ measurement of 238U-series nuclides enables selective analysis of iron oxy/hydroxide phases, minimizes contributions from allogenic phases, and reduces the need for mathematical corrections to obtain the activity ratios for the authigenic phase [4, 5]. The results suggest that supergene iron oxy/hydroxides are good recorders of weathering processes; they precipitate during the early stages of weathering, reflect the U-isotopic composition of the groundwater, appear to act as closed systems in weathering-conservative environments, and behave in a predictable fashion when subjected to intense weathering and leaching conditions.
The 230Th ages of the iron oxy/hydroxides indicate that the timing and intensity of weathering appear to be largely controlled by global climatic changes, suggesting that weathering rates have not been constant during the last 300 ka in Northern Australia. References: [1] P.M. Vasconcelos, Annual Review of Earth and Planetary Sciences 27(1), 183-229 (1999). [2] M. Ivanovich and R.S. Harmon, Uranium-Series Disequilibrium: Applications to Earth, Marine, and Environmental Science, Oxford University Press, Oxford (1992). [3] S.A. Short, R.T. Lowson, J. Ellis and D.M. Price, Geochimica et Cosmochimica Acta 53, 1379-1389 (1989). [4] K.R. Ludwig and D.M. Titterington, Geochimica et Cosmochimica Acta 58(22), 5031-5042 (1994). [5] S. Luo and T.L. Ku, Geochimica et Cosmochimica Acta 55(2), 555-564 (1991).
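For orientation, the simplest closed-system form of the 230Th ingrowth chronometer mentioned above can be written down directly. This sketch assumes no initial 230Th and (234U/238U) in secular equilibrium, idealizations that real U-series dating relaxes with the corrections the abstract alludes to:

```python
import math

LAMBDA_230TH = math.log(2) / 75_690.0  # yr^-1, from a 230Th half-life of ~75.69 kyr

def th230_age(activity_ratio):
    """Age (yr) from the (230Th/238U) activity ratio, assuming no initial
    230Th and (234U/238U) = 1: AR(t) = 1 - exp(-lambda_230 * t)."""
    return -math.log(1.0 - activity_ratio) / LAMBDA_230TH

# An activity ratio of 0.5 corresponds to one 230Th half-life
print(round(th230_age(0.5)))  # → 75690
```

The usable range of the chronometer follows from the same equation: as the activity ratio approaches 1 (secular equilibrium), the age becomes unbounded, which is why the method saturates after a few hundred kyr.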
Event-Driven Messaging for Offline Data Quality Monitoring at ATLAS
NASA Astrophysics Data System (ADS)
Onyisi, Peter
2015-12-01
During LHC Run 1, the information flow through the offline data quality monitoring in ATLAS relied heavily on chains of processes polling each other's outputs for handshaking purposes. This resulted in a fragile architecture with many possible points of failure and an inability to monitor the overall state of the distributed system. We report on the status of a project undertaken during the LHC shutdown to replace the ad hoc synchronization methods with a uniform message queue system. This enables the use of standard protocols to connect processes on multiple hosts; reliable transmission of messages between possibly unreliable programs; easy monitoring of the information flow; and the removal of inefficient polling-based communication.
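The design shift described above, from chains of polling processes to a uniform message queue, can be illustrated generically. This is a toy sketch using Python's standard-library queue, not the ATLAS implementation or its actual protocol; the job names and message text are made up:

```python
import queue
import threading
import time

# Producers push status messages onto a shared queue; the consumer
# blocks on the queue instead of repeatedly polling outputs.
msgq = queue.Queue()

def worker(name):
    time.sleep(0.01)                       # simulate processing work
    msgq.put(f"{name}: histograms ready")  # handshake via message, not polling

threads = [threading.Thread(target=worker, args=(f"job{i}",)) for i in range(3)]
for t in threads:
    t.start()

# get() blocks until a message arrives; a timeout bounds the wait,
# so a crashed producer surfaces as an exception rather than a hang.
done = [msgq.get(timeout=1.0) for _ in range(3)]
for t in threads:
    t.join()
print(len(done))  # → 3
```

In a distributed deployment the in-process queue would be replaced by a broker speaking a standard protocol, which is what also makes the overall information flow observable from one place.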
Parametric Design and Mechanical Analysis of Beams based on SINOVATION
NASA Astrophysics Data System (ADS)
Xu, Z. G.; Shen, W. D.; Yang, D. Y.; Liu, W. M.
2017-07-01
In engineering practice, engineers must carry out complicated calculations when the loads on a beam are complex. These analysis and calculation processes take a lot of time and their results can be unreliable. VS2005 and the ADK were therefore used, with the C++ programming language, to develop beam-design software based on the 3D CAD software SINOVATION. The software performs mechanical analysis and parameterized design of various types of beams and outputs a design report in HTML format. The efficiency and reliability of beam design are thereby improved.
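As an example of the kind of calculation such a tool automates, the textbook maximum deflection of a simply supported beam under a uniformly distributed load is delta = 5 w L^4 / (384 E I). A minimal sketch (Python rather than the paper's C++, and all inputs hypothetical):

```python
def max_deflection_udl(w, length, e_mod, inertia):
    """Maximum mid-span deflection (m) of a simply supported beam under a
    uniformly distributed load w (N/m): delta = 5 w L^4 / (384 E I)."""
    return 5.0 * w * length**4 / (384.0 * e_mod * inertia)

# Hypothetical steel beam: E = 210 GPa, I = 8e-6 m^4, span 4 m, w = 10 kN/m
delta = max_deflection_udl(10e3, 4.0, 210e9, 8e-6)
print(f"{delta * 1000:.2f} mm")  # → 19.84 mm
```

A parametric design tool generalizes this by iterating over section parameters (hence I) until deflection and stress checks pass, then emitting the results into a report.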
1990-05-01
…of the less than adequate reliability of the earlier Exploding Foil Initiator (EFI) design… Action and Alternatives… EFI: Exploding Foil Initiator… Environmental Assessment (EA): a concise public document in which a Federal agency provides… When the Ground Based Interceptor (GBI) firing unit (the Exploding Foil Initiator) was built and tested, it operated unreliably. Many hardware development problems were…
1980-11-01
…chalk) (Evans 1978:67). Bone may not be preserved in soils whose acidity is too high (pH 6.3) (Heizer and Graham 1968:125-126). Within the project… goals were directed toward discerning the patterns of interaction among the components of the system (Hole and Heizer 1973:315). Archeologists realized… unreliable (Hole and Heizer 1973:140). They believe that surface artifacts can serve as only a rough guide to the site's contents. No random sampling…
Diffendorfer, James E.; Beston, Julie A.; Merrill, Matthew D.; Stanton, Jessica C.; Corum, Margo D.; Loss, Scott R.; Thogmartin, Wayne E.; Johnson, Douglas H.; Erickson, Richard A.; Heist, Kevin W.
2015-01-01
Components of the methodology are based on simplifying assumptions and require information that, for many species, may be sparse or unreliable. These assumptions are presented in the report and should be carefully considered when using output from the methodology. In addition, this methodology can be used to recommend species for more intensive demographic modeling or highlight those species that may not require any additional protection because effects of wind energy development on their populations are projected to be small.
2016-10-01
Reports an error in "Unreliability as a threat to understanding psychopathology: The cautionary tale of attentional bias" by Thomas L. Rodebaugh, Rachel B. Scullin, Julia K. Langer, David J. Dixon, Jonathan D. Huppert, Amit Bernstein, Ariel Zvielli and Eric J. Lenze (Journal of Abnormal Psychology, 2016[Aug], Vol 125[6], 840-851). There was an error in the Author Note concerning the support of the MacBrain Face Stimulus Set. The correct statement is provided. (The following abstract of the original article appeared in record 2016-30117-001.) The use of unreliable measures constitutes a threat to our understanding of psychopathology, because advancement of science using both behavioral and biologically oriented measures can only be certain if such measurements are reliable. Two pillars of the National Institute of Mental Health's portfolio - the Research Domain Criteria (RDoC) initiative for psychopathology and the target engagement initiative in clinical trials - cannot succeed without measures that possess the high reliability necessary for tests involving mediation and selection based on individual differences. We focus on the historical lack of reliability of attentional bias measures as an illustration of how reliability can pose a threat to our understanding. Our own data replicate previous findings of poor reliability for traditionally used scores, which suggests a serious problem with the ability to test theories regarding attentional bias. This lack of reliability may also suggest problems with the assumption (in both theory and the formula for the scores) that attentional bias is consistent and stable across time. In contrast, measures accounting for attention as a dynamic process in time show good reliability in our data. The field is sorely in need of research reporting findings and reliability for attentional bias scores using multiple methods, including those focusing on dynamic processes over time.
We urge researchers to test and report reliability of all measures, considering findings of low reliability not just as a nuisance but as an opportunity to modify and improve upon the underlying theory. Full assessment of reliability of measures will maximize the possibility that RDoC (and psychological science more generally) will succeed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
The Verification-based Analysis of Reliable Multicast Protocol
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1996-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
Monitoring outcomes with relational databases: does it improve quality of care?
Clemmer, Terry P
2004-12-01
There are 3 key ingredients in improving quality of medical care: 1) using a scientific process of improvement, 2) executing the process at the lowest possible level in the organization, and 3) measuring the results of any change reliably. Relational databases, when used within these guidelines, are of great value in these efforts if they contain reliable information that is pertinent to the project and is used in a scientific process of quality improvement by a front-line team. Unfortunately, the data are frequently unreliable and/or not pertinent to the local process, and are used by persons at very high levels in the organization without a scientific process and without reliable measurement of the outcome. Under these circumstances the effectiveness of relational databases in improving care is marginal at best, frequently wasteful, and potentially harmful. This article explores examples of these concepts.
Abundances in the Uranium-rich Star CS 31082-001
NASA Astrophysics Data System (ADS)
Qian, Y.-Z.; Wasserburg, G. J.
2001-05-01
The recent discovery by Cayrel et al. of U in CS 31082-001 along with Os and Ir at greatly enhanced abundances but with [Fe/H]=-2.9 strongly reinforces the argument that there are at least two kinds of Type II supernova (SN II) sources for r-nuclei. One source is the high-frequency H events responsible for heavy r-nuclei (A>135) but not Fe. The H-yields calculated from data on other ultra-metal-poor stars and the Sun provide a template for quantitatively predicting the abundances of all other r-elements. In CS 31082-001 these should show a significant deficiency at A<135 relative to the solar r-pattern. It is proposed that CS 31082-001 should have had a companion that exploded as an SN II H event. If the binary survived the explosion, this star should now have a compact companion, most likely a stellar-mass black hole. Comparison of abundance data with predicted values and a search for a compact companion should provide a stringent test of the proposed r-process model. The U-Th age determined by Cayrel et al. for CS 31082-001 is, to within substantial uncertainties, in accord with the r-process age determined from solar system data. The time gap between the big bang and the onset of normal star formation allows r-process chronometers to provide only a lower limit on the age of the universe.
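As background for the U-Th age mentioned above: both nuclides decay, but 238U decays faster than 232Th, so the U/Th ratio shrinks from its r-process production value at a known combined rate. The sketch below uses standard half-lives, but the production and observed ratios are illustrative placeholders, not the actual CS 31082-001 measurements.

```python
import math

# Sketch of the U-Th chronometer logic.  Half-lives are standard values;
# the U/Th ratios below are invented for illustration.

T_HALF_U238 = 4.468   # Gyr
T_HALF_TH232 = 14.05  # Gyr

def u_th_age(ratio_produced, ratio_observed):
    """Time (Gyr) for an initial U/Th abundance ratio to decay to the
    observed one; U decays faster, so the ratio decreases with time."""
    lam_u = math.log(2) / T_HALF_U238
    lam_th = math.log(2) / T_HALF_TH232
    return math.log(ratio_produced / ratio_observed) / (lam_u - lam_th)

# A freshly produced ratio gives age zero; a strongly decayed ratio
# gives an age of several Gyr.
print(u_th_age(0.55, 0.55))              # 0.0
print(round(u_th_age(0.55, 0.30), 1))    # ~5.7
```

The large quoted uncertainties in such ages come mainly from the uncertainty in the assumed production ratio, which is why the abstract stresses comparison with solar-system r-process data.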
Memory, metamemory, and social cues: Between conformity and resistance.
Zawadzka, Katarzyna; Krogulska, Aleksandra; Button, Roberta; Higham, Philip A; Hanczakowski, Maciej
2016-02-01
When presented with responses of another person, people incorporate these responses into memory reports: a finding termed memory conformity. Research on memory conformity in recognition reveals that people rely on external social cues to guide their memory responses when their own ability to respond is at chance. In this way, conforming to a reliable source boosts recognition performance but conforming to a random source does not impair it. In the present study we assessed whether people would conform indiscriminately to reliable and unreliable (random) sources when they are given the opportunity to exercise metamemory control over their responding by withholding answers in a recognition test. In Experiments 1 and 2, we found the pattern of memory conformity to reliable and unreliable sources in 2 variants of a free-report recognition test, yet at the same time the provision of external cues did not affect the rate of response withholding. In Experiment 3, we provided participants with initial feedback on their recognition decisions, facilitating the discrimination between the reliable and unreliable source. This led to the reduction of memory conformity to the unreliable source, and at the same time modulated metamemory decisions concerning response withholding: participants displayed metamemory conformity to the reliable source, volunteering more responses in their memory report, and metamemory resistance to the random source, withholding more responses from the memory report. Together, the results show how metamemory decisions dissociate various types of memory conformity and that memory and metamemory decisions can be independent of each other. PsycINFO Database Record (c) 2016 APA, all rights reserved.
Effects of imperfect automation on decision making in a simulated command and control task.
Rovira, Ericka; McGarry, Kathleen; Parasuraman, Raja
2007-02-01
Effects of four types of automation support and two levels of automation reliability were examined. The objective was to examine the differential impact of information and decision automation and to investigate the costs of automation unreliability. Research has shown that imperfect automation can lead to differential effects of stages and levels of automation on human performance. Eighteen participants performed a "sensor to shooter" targeting simulation of command and control. Dependent variables included accuracy and response time of target engagement decisions, secondary task performance, and subjective ratings of mental workload, trust, and self-confidence. Compared with manual performance, reliable automation significantly reduced decision times. Unreliable automation led to greater cost in decision-making accuracy under the higher automation reliability condition for three different forms of decision automation relative to information automation. At low automation reliability, however, there was a cost in performance for both information and decision automation. The results are consistent with a model of human-automation interaction that requires evaluation of the different stages of information processing to which automation support can be applied. If fully reliable decision automation cannot be guaranteed, designers should provide users with information automation support or other tools that allow for inspection and analysis of raw data.
Thomas, Thaddeus P.; Van Hofwegen, Christopher J.; Anderson, Donald D.; Brown, Thomas D.; Marsh, J. Lawrence
2010-01-01
The pathophysiology of post-traumatic osteoarthritis (PTOA) after intra-articular fractures is poorly understood. Pursuit of a better understanding of this disease is complicated by inability to accurately monitor its onset, progression and severity. Common radiographic methods used to assess PTOA do not provide sufficient image quality for precise cartilage measurements. Double-contrast MDCT is an alternative method that may be useful, since it produces high-quality images in normal ankles. The purpose of this study was to assess this technique’s performance in assessing cartilage maintenance in ankles with an intra-articular fracture. Thirty-six tibial plafond fractures were followed over two years, with thirty-one MDCTs being obtained four months after injury, and twenty-two MDCTs after two years. Unfortunately, clinical results with this technique were unreliable due to pathology (presumed arthrofibrosis) and technical problems (pooling of contrast). The arthrofibrosis that developed in many patients inhibited proper joint access and contrast infiltration, although high-quality images were obtained in eleven patients. In this patient subset, in which focal regions of cartilage degeneration could be visualized, thickness could be measured with a high degree of fidelity. While thus useful in selected instances, double-contrast MDCT was too unreliable to be recommended to assess these particular types of injuries. PMID:20634971
Optimizing Air Transportation Service to Metroplex Airports. Part 1; Analysis of Historical Data
NASA Technical Reports Server (NTRS)
Donohue, George; Hoffman, Karla; Sherry, Lance; Ferguson, John; Kara, Abdul Qadar
2010-01-01
The air transportation system is a significant driver of the U.S. economy, providing safe, affordable, and rapid transportation. During the past three decades airspace and airport capacity has not grown in step with demand for air transportation (+4% annual growth), resulting in unreliable service and systemic delays. Estimates of the impact of delays and unreliable air transportation service on the economy range from $32B to $41B per year. This report describes the results of an analysis of airline strategic decision-making with regards to: (1) geographic access, (2) economic access, and (3) airline finances. This analysis evaluated markets-served, scheduled flights, aircraft size, airfares, and profit from 2005-2009. During this period, airlines experienced changes in costs of operation (due to fluctuations in hedged fuel prices), changes in travel demand (due to changes in the economy), and changes in infrastructure capacity (due to the capacity limits at EWR, JFK, and LGA). This analysis captures the impact of the implementation of capacity limits at airports, as well as the effect of increased costs of operation (i.e. hedged fuel prices). The increases in costs of operation serve as a proxy for increased costs per flight that might occur if auctions or congestion pricing are imposed.
Representation and Re-Presentation in Litigation Science
Jasanoff, Sheila
2008-01-01
Federal appellate courts have devised several criteria to help judges distinguish between reliable and unreliable scientific evidence. The best known are the U.S. Supreme Court’s criteria offered in 1993 in Daubert v. Merrell Dow Pharmaceuticals, Inc. This article focuses on another criterion, offered by the Ninth Circuit Court of Appeals, that instructs judges to assign lower credibility to “litigation science” than to science generated before litigation. In this article I argue that the criterion-based approach to judicial screening of scientific evidence is deeply flawed. That approach buys into the faulty premise that there are external criteria, lying outside the legal process, by which judges can distinguish between good and bad science. It erroneously assumes that judges can ascertain the appropriate criteria and objectively apply them to challenged evidence before litigation unfolds, and before methodological disputes are sorted out during that process. Judicial screening does not take into account the dynamics of litigation itself, including gaming by the parties and framing by judges, as constitutive factors in the production and representation of knowledge. What is admitted through judicial screening, in other words, is not precisely what a jury would see anyway. Courts are sites of repeated re-representations of scientific knowledge. In sum, the screening approach fails to take account of the wealth of existing scholarship on the production and validation of scientific facts. An unreflective application of that approach thus puts courts at risk of relying upon a “junk science” of the nature of scientific knowledge. PMID:18197311
Riotous assemblage and the materials of regulation.
Bulstrode, Jenny
2018-06-01
In the stores of the British Museum are three exquisite springs, made in the late 1820s and 1830s, to regulate the most precise timepieces in the world. Barely the thickness of a hair, they are exquisite because they are made entirely of glass. Combining new documentary evidence, funded by the Antiquarian Horological Society, with the first technical analysis of the springs, undertaken in collaboration with the British Museum, the research presented here uncovers their extraordinary significance to the global extension of nineteenth century capitalism through the repeal of the Corn Laws. In the 1830s and 1840s the Astronomer Royal, George Biddell Airy; the Hydrographer to the Admiralty, Francis Beaufort; and the Prime Minister, Sir Robert Peel, collaborated with the virtuoso chronometer-maker, Edward John Dent, to mobilize the specificity of particular forms of glass, the salience of the Glass Tax, and the significance of state standards, as means to reform. These protagonists looked to glass and its properties to transform the fiscal military state into an exquisitely regulated machine with the appearance of automation and the gloss of the free-trade liberal ideal. Surprising but significant connections, linking Newcastle mobs to tales of Cinderella and the use of small change, demonstrate why historians must attend to materials and how such attention exposes claims to knowledge, the interests behind such claims, and the impact they have had upon the design and architecture of the modern world. Through the pivotal role of glass, this paper reveals the entangled emergence of state and market capitalism, and how the means of production was transformed in vitreous proportions.
Replication Unreliability in Psychology: Elusive Phenomena or “Elusive” Statistical Power?
Tressoldi, Patrizio E.
2012-01-01
The focus of this paper is to analyze whether the unreliability of results related to certain controversial psychological phenomena may be a consequence of their low statistical power. Under Null Hypothesis Statistical Testing (NHST), still the most widely used statistical approach, unreliability derives from the failure to refute the null hypothesis, in particular when exact or quasi-exact replications of experiments are carried out. Taking as examples the results of meta-analyses of four controversial phenomena (subliminal semantic priming, the incubation effect in problem solving, unconscious thought theory, and non-local perception), it was found that, except for semantic priming on categorization, the statistical power to detect the expected effect size (ES) of the typical study is low or very low. The low power in most studies undermines the use of NHST to study phenomena with moderate or low ESs. We conclude by providing some suggestions on how to increase the statistical power or use different statistical approaches to help discriminate whether the results obtained may or may not be used to support or to refute the reality of a phenomenon with small ES. PMID:22783215
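The power argument can be made concrete with a normal-approximation power calculation for a two-sided, two-sample design; the effect size and sample sizes below are illustrative round numbers, not figures taken from the meta-analyses discussed.

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def power_two_sample(d, n_per_group, z_crit=1.96):
    """Approximate power of a two-sided two-sample z-test for effect
    size d (Cohen's d) with n participants per group."""
    ncp = d * math.sqrt(n_per_group / 2)  # noncentrality parameter
    return phi(ncp - z_crit) + phi(-ncp - z_crit)

# A small effect (d = 0.2) with n = 30 per group is badly underpowered...
print(round(power_two_sample(0.2, 30), 2))   # 0.12
# ...while n = 400 per group approaches the conventional 0.80 target.
print(round(power_two_sample(0.2, 400), 2))  # 0.81
```

This is exactly the situation the abstract describes: with power near 0.12, "failed replications" are the statistically expected outcome even if the effect is real.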
Cortactin Tyrosine Phosphorylation Promotes Its Deacetylation and Inhibits Cell Spreading
Meiler, Eugenia; Nieto-Pelegrín, Elvira; Martinez-Quiles, Narcisa
2012-01-01
Background Cortactin is a classical Src kinase substrate that participates in actin cytoskeletal dynamics by activating the Arp2/3 complex and interacting with other regulatory proteins, including FAK. Cortactin has various domains that may contribute to the assembly of different protein platforms to achieve process specificity. Though the protein is known to be regulated by post-translational modifications such as phosphorylation and acetylation, how tyrosine phosphorylation regulates cortactin activity is poorly understood. Since the basal level of tyrosine phosphorylation is low, this question must be studied using stimulated cell cultures, which are physiologically relevant but unreliable and difficult to work with. In fact, their unreliability may be the cause of some contradictory findings about the dynamics of tyrosine phosphorylation of cortactin in different processes. Methodology/Principal Findings In the present study, we try to overcome these problems by using a Functional Interaction Trap (FIT) system, which involves cotransfecting cells with a kinase (Src) and a target protein (cortactin), both of which are fused to complementary leucine-zipper domains. The FIT system allowed us to control precisely the tyrosine phosphorylation of cortactin and explore its relationship with cortactin acetylation. Conclusions/Significance Using this system, we provide definitive evidence that a competition exists between acetylation and tyrosine phosphorylation of cortactin and that phosphorylation inhibits cell spreading. We confirmed the results from the FIT system by examining endogenous cortactin in different cell types. Furthermore, we demonstrate that cell spreading promotes the association of cortactin and FAK and that tyrosine phosphorylation of cortactin disrupts this interaction, which may explain how it inhibits cell spreading. PMID:22479425
Improving the quantification of contrast enhanced ultrasound using a Bayesian approach
NASA Astrophysics Data System (ADS)
Rizzo, Gaia; Tonietto, Matteo; Castellaro, Marco; Raffeiner, Bernd; Coran, Alessandro; Fiocco, Ugo; Stramare, Roberto; Grisan, Enrico
2017-03-01
Contrast Enhanced Ultrasound (CEUS) is a sensitive imaging technique to assess tissue vascularity that can be useful in the quantification of different perfusion patterns. This can be particularly important in the early detection and staging of arthritis. In a recent study we have shown that a Gamma-variate can accurately quantify synovial perfusion and is flexible enough to describe many heterogeneous patterns. Moreover, we have shown that the quantitative information gathered through a pixel-by-pixel analysis characterizes the perfusion more effectively. However, the low SNR of the data and the nonlinearity of the model make parameter estimation difficult. With a classical non-linear least-squares (NLLS) approach, the number of unreliable estimates (those with an asymptotic coefficient of variation greater than a user-defined threshold) is significant, affecting the overall description of the perfusion kinetics and of its heterogeneity. In this work we propose to solve the parameter estimation at the pixel level within a Bayesian framework using Variational Bayes (VB), with an automatic and data-driven prior initialization. When evaluating the pixels for which both VB and NLLS provided reliable estimates, we demonstrated that the parameter values provided by the two methods are well correlated (Pearson's correlation between 0.85 and 0.99). Moreover, the mean number of unreliable pixels drastically reduces from 54% (NLLS) to 26% (VB), without increasing the computational time (0.05 s/pixel for NLLS and 0.07 s/pixel for VB). When considering the efficiency of the algorithms as computational time per reliable estimate, VB outperforms NLLS (0.11 versus 0.25 seconds per reliable estimate, respectively).
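For readers unfamiliar with the model being fitted, a minimal sketch of a Gamma-variate bolus curve follows; the parameter names (A, alpha, beta, t0) follow common convention and are not necessarily the paper's exact notation.

```python
import math

# Minimal sketch of the Gamma-variate bolus model used in CEUS
# quantification; parameters are illustrative, not fitted values.

def gamma_variate(t, A, alpha, beta, t0):
    """Echo power at time t for appearance time t0, shape alpha,
    time constant beta, and scale A; zero before the bolus arrives."""
    if t <= t0:
        return 0.0
    dt = t - t0
    return A * dt ** alpha * math.exp(-dt / beta)

# The curve peaks at t0 + alpha * beta, a handy sanity check when
# judging whether a pixel-level fit is physiologically plausible.
peak_t = 2.0 + 1.5 * 4.0
print(gamma_variate(peak_t, 1.0, 1.5, 4.0, 2.0) >
      gamma_variate(peak_t + 1.0, 1.0, 1.5, 4.0, 2.0))  # True
```

The estimation problem the abstract describes is fitting these four parameters per pixel from noisy samples of this curve, which is where NLLS becomes fragile and the Bayesian (VB) approach helps.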
Kuang, Li; Yu, Long; Huang, Lan; Wang, Yin; Ma, Pengju; Li, Chuanbin; Zhu, Yujia
2018-05-14
With the rapid development of cyber-physical systems (CPS), building cyber-physical systems with high quality of service (QoS) has become an urgent requirement in both academia and industry. During the building of cyber-physical systems, it has been found that a large number of functionally equivalent services exist, so recommending suitable services from the many available in CPS becomes an urgent task. However, since it is time-consuming, and even impractical, for a single user to invoke all of the services in CPS to experience their QoS, a robust method is needed to predict unknown QoS values. A commonly used method in QoS prediction is collaborative filtering; however, it is hard to deal with the data sparsity and cold-start problems, and most existing methods ignore the issue of data credibility. Hence, in order to solve both of these challenging problems, in this paper we design a framework of QoS prediction for CPS services and propose a personalized QoS prediction approach based on reputation and location-aware collaborative filtering. Our approach first calculates the reputation of users by using the Dirichlet probability distribution, so as to identify untrusted users and process their unreliable data; it then digs out the geographic neighborhood at three levels to improve the similarity calculation of users and services. Finally, the data from geographical neighbors of users and services are fused to predict the unknown QoS values. Experiments using real datasets show that our proposed approach outperforms other existing methods in terms of accuracy, efficiency, and robustness.
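The Dirichlet-based reputation step can be sketched in its two-outcome (Beta) special case: the reputation is the posterior mean probability that a user's next QoS report is trustworthy, given counts of consistent and inconsistent past reports. The threshold below is an illustrative choice, not the paper's calibrated value.

```python
# Hedged sketch of Dirichlet (here Beta, its two-outcome case)
# reputation scoring for QoS reports; numbers are illustrative.

def reputation(consistent, inconsistent):
    """Posterior mean of Beta(consistent + 1, inconsistent + 1):
    expected probability that the user's next report is trustworthy."""
    return (consistent + 1) / (consistent + inconsistent + 2)

def is_untrusted(consistent, inconsistent, threshold=0.4):
    """Flag users whose reputation falls below an (assumed) threshold,
    so their QoS data can be down-weighted or discarded."""
    return reputation(consistent, inconsistent) < threshold

print(round(reputation(8, 2), 2))  # 0.75
print(is_untrusted(1, 9))          # True
```

The "+1" pseudo-counts give new users a neutral prior reputation of 0.5 rather than an undefined or extreme score.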
NASA Astrophysics Data System (ADS)
Mitchell, R. N.; Thissen, C.; Kirschvink, J. L.; Schrag, D. P.; Montanari, A.; Coccioni, R.; Slotznick, S. P.; Yamazaki, T.; Penserini, B. D.; Abrahams, J. N. H.; Cruz-Heredia, M.; Evans, D. A.
2015-12-01
High-resolution paleomagnetism of Cretaceous-aged limestone in Italy reveals evidence for a previously unrecognized ~10° directional variation, or "wobble", of either the geographic or magnetic pole on a 10^6-year, "Milankovitch" time scale. Ten ~1-million-year (Myr) wobbles of magnetic inclination can be identified and correlated across Italy from 87-74 Myr ago, potentially refining the global polarity time scale and seafloor spreading rates. Milankovitch wobble is an omnipresent geophysical process that represents, irrespective of its mechanism, a new chronometer for age calibration with paleomagnetism. If Milankovitch wobble is interpreted as a geomagnetic artifact—the long-considered but still unproven idea that astronomical variations influence the geodynamo—the geocentric-axial-dipole hypothesis would only be viable when averaged over time scales 100 times greater than currently thought, making present-day geocentricity largely coincidental. If interpreted as true geographic change, Milankovitch wobble implies an unrecognized, rapid time scale (~10° per Myr) of true polar wander, possibly due to ice-sheet dynamics driven by the 1.2 Myr modulation of Earth's rotational obliquity. Stable isotope data co-vary with the Milankovitch wobble, possibly favoring the polar wander mechanism, which predicts rapid environmental change where the geomagnetic-artifact hypothesis does not.
Isotope production and target preparation for nuclear astrophysics data
NASA Astrophysics Data System (ADS)
Schumann, Dorothea; Dressler, Rugard; Maugeri, Emilio Andrea; Heinitz, Stephan
2017-09-01
Targets are in many cases an indispensable ingredient for successful experiments aimed at producing nuclear data. With the recently observed shift to studying nuclear reactions on radioactive targets, this task can become extremely challenging. Concerted actions by laboratories able to produce isotopes and manufacture radioactive targets are urgently needed. We present here some examples of successful isotope and target production at PSI, in particular the production of 60Fe samples used for half-life measurements and neutron capture cross section experiments; the chemical processing and fabrication of lanthanide targets for capture cross section experiments at n_TOF (European Organization for Nuclear Research (CERN), Switzerland); and the recently performed manufacturing of highly radioactive 7Be targets for the measurement of the 7Be(n,α)4He cross section in the energy range of interest for Big-Bang nucleosynthesis, contributing to the solution of the cosmological Li problem. Two future projects, "Determination of the half-life and experiments on neutron capture cross sections of 53Mn" and "32Si - a new chronometer for nuclear dating", are briefly described. Moreover, we propose to work on the establishment of a dedicated network of isotope- and target-producing laboratories.
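The "32Si chronometer" idea rests on simple exponential decay: a measured surviving fraction of the nuclide translates directly into an elapsed time once the half-life is known (which is precisely why the half-life measurements above matter). The ~150 yr half-life below is an approximate literature value used for illustration only.

```python
import math

# Sketch of radiometric dating from a surviving fraction:
# N(t) = N0 * 2^(-t / T_half), so t = T_half * log2(N0 / N).

def decay_age(surviving_fraction, half_life):
    """Elapsed time for a nuclide population to decay to the given
    fraction of its initial amount."""
    return half_life * math.log2(1.0 / surviving_fraction)

# One half-life leaves 50%; two half-lives leave 25%.
print(decay_age(0.5, 150.0))          # 150.0
print(round(decay_age(0.25, 150.0)))  # 300
```

A ~150 yr half-life makes 32Si attractive for dating on century time scales, between the reach of 210Pb and 14C; the accuracy of any such date is limited by the half-life uncertainty, hence the proposed measurements.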
Concentration variance decay during magma mixing: a volcanic chronometer.
Perugini, Diego; De Campos, Cristina P; Petrelli, Maurizio; Dingwell, Donald B
2015-09-21
The mixing of magmas is a common phenomenon in explosive eruptions. Concentration variance is a useful metric of this process and its decay (CVD) with time is an inevitable consequence during the progress of magma mixing. In order to calibrate this petrological/volcanological clock we have performed a time-series of high temperature experiments of magma mixing. The results of these experiments demonstrate that compositional variance decays exponentially with time. With this calibration the CVD rate (CVD-R) becomes a new geochronometer for the time lapse from initiation of mixing to eruption. The resultant novel technique is fully independent of the typically unknown advective history of mixing - a notorious uncertainty which plagues the application of many diffusional analyses of magmatic history. Using the calibrated CVD-R technique we have obtained mingling-to-eruption times for three explosive volcanic eruptions from Campi Flegrei (Italy) in the range of tens of minutes. These in turn imply ascent velocities of 5-8 meters per second. We anticipate the routine application of the CVD-R geochronometer to the eruptive products of active volcanoes in future in order to constrain typical "mixing to eruption" time lapses such that monitoring activities can be targeted at relevant timescales and signals during volcanic unrest.
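The CVD clock described above amounts to inverting an exponential: if concentration variance decays as sigma^2(t) = sigma0^2 * exp(-k t), a measured variance ratio plus the experimentally calibrated rate k yields the mixing time. The rate constant below is a placeholder for illustration, not the paper's calibration.

```python
import math

# Sketch of the concentration-variance-decay (CVD) chronometer:
# invert the exponential decay of compositional variance to get the
# time elapsed since mixing began.  k is an invented placeholder.

def mixing_time(var_initial, var_observed, k):
    """Time since mixing onset, from sigma^2(t) = sigma0^2 * exp(-k*t)."""
    return math.log(var_initial / var_observed) / k

# If variance dropped to 10% of its initial value with k = 0.08 per
# minute, mixing began roughly half an hour before quenching.
print(round(mixing_time(1.0, 0.1, 0.08), 1))  # 28.8
```

Note the clock only needs the variance ratio, not the absolute concentrations, which is one reason it is insensitive to the unknown advective history the abstract mentions.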
Isotopic and Chemical Evidence for Primitive Aqueous Alteration in the Tagish Lake Meteorite
NASA Astrophysics Data System (ADS)
Sakuma, Keisuke; Hidaka, Hiroshi; Yoneda, Shigekazu
2018-01-01
Aqueous alteration is one of the primitive activities that occurred on meteorite parent bodies in the early solar system. The Tagish Lake meteorite is known to show an intense parent-body aqueous alteration signature. In this study, quantitative analyses of the alkaline elements and isotopic analyses of Sr and Ba from acid leachates of TL (C2-ungrouped) were performed to investigate the effects of aqueous alteration. The main purpose of this study is to search for isotopic evidence of extinct 135Cs from the Ba isotopic analyses in the chemical separates from the Tagish Lake meteorite. Barium isotopic data from the leachates show variable 135Ba isotopic anomalies (ε = -2.6 to +3.6) which correlate with 137Ba and 138Ba, suggesting a heterogeneous distribution of s- and r-rich nucleosynthetic components in the early solar system. The 87Rb-87Sr and 135Cs-135Ba decay systems in TL in this study do not provide any chronological information. The disturbance of the TL chronometers likely reflects the selective dissolution of Cs and Rb, given the relatively higher mobility of Cs and Rb compared to Ba and Sr, respectively, during fluid-mineral interactions.
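The epsilon values quoted above follow the standard isotope-geochemistry convention: parts-per-10^4 deviations of a measured isotope ratio from a terrestrial standard. The helper below just encodes that convention; the ratio values in the demo are invented.

```python
# Epsilon notation: deviation of a sample isotope ratio from a standard
# in parts per 10^4.  Ratio values here are invented for the demo.

def epsilon(ratio_sample, ratio_standard):
    return (ratio_sample / ratio_standard - 1.0) * 1.0e4

# A sample ratio 0.026% above the standard is a +2.6 epsilon anomaly,
# the upper end of the 135Ba range reported above.
print(round(epsilon(1.00026, 1.0), 1))  # 2.6
```

Anomalies of a few epsilon units are tiny in absolute terms, which is why the correlated behavior across 135Ba, 137Ba, and 138Ba is needed to tell nucleosynthetic heterogeneity from measurement noise.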
Age of metamorphic events : petrochronology and hygrochronology
NASA Astrophysics Data System (ADS)
Bosse, Valerie; Villa, Igor M.
2017-04-01
Geodynamic models of the lithosphere require quantitative data from natural samples. Time is a key parameter: it allows one to calculate the rates and durations of geological processes and provides information about the physical processes involved (Vance et al. 2003). Large-scale orogenic models require linking geochronological data with other parameters: structures, kinematics, magmatic and metamorphic petrology (P-T-A-X conditions), the thermobarometric evolution of the lithosphere, and chemical dynamics (Muller, 2003). This requires geochronometers that are also powerful chemical and petrological tracers. In-situ techniques allow dating a mineral in its petrological-microstructural environment. Getting a "date" has become quite easy... But what do we date in the end? What is the link between the numbers obtained from the mass spectrometer and the age of the metamorphic event we are trying to date? How can we transform the date into a geologically meaningful age? What do we learn about the behavior of the geochronometer minerals? Now that we can perform precise dating on very small samples directly in the studied rock, it is important to improve the way we interpret the ages, to give them more pertinence in the geodynamic context. We propose to discuss the isotopic closure of the Th/U/Pb system in various metamorphic contexts using our published examples of in situ dating on monazite and zircon (Bosse et al. 2009; Didier et al. 2014, 2015).
The studied examples show that (i) fluid-assisted dissolution-precipitation processes, rather than temperature-dependent solid diffusion, predominantly govern the closure of the Th/U/Pb system; (ii) monazite and zircon are sensitive to interaction with fluids of specific composition (F, CO2, K ...), even at low temperature; (iii) in the absence of fluids, monazite is able to record HT events and to retain this information in poly-orogenic contexts or during partial melting events; and (iv) complex chemical and isotopic zonations, well known in monazite, reflect interaction with the surrounding mineral assemblages. An often neglected observation is that the K-Ar chronometer minerals show similar patterns of isotopic inheritance closely tied to relict patches and heterochemical retrogression phases (Villa and Williams 2013). Isotopic closure in the U-Pb and K-Ar systems follows the same principle: thermal diffusion is very slow, while dissolution and reprecipitation are several orders of magnitude faster. This means that both U-Pb and K-Ar mineral chronometers are hygrochronometers. The interpretation of the ages of the different domains cannot be decoupled from the geochemical and petrological context. The focus on petrology also requires, following Villa (1998, 2016), that the ages measured in metamorphic rocks can no longer be used in geodynamic models according to the "closure temperature" concept as originally defined by Dodson (1973). Bosse et al. (2009) Chem Geol 261: 286 Didier et al. (2014) Chem Geol 381: 206 Didier et al. (2015) Contrib Mineral Petrol 170: 45 Dodson (1973) Contrib Mineral Petrol 40: 259 Muller (2003) EPSL, 206: 237 Villa (1998) Terra Nova 10: 42 Villa (2016) Chem Geol 420: 1 Villa & Williams (2013) In: Harlov & Austrheim (eds.), Metasomatism and the Chemical Transformation of Rock. Springer, p171
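For reference, the diffusion-based closure-temperature concept the authors critique is Dodson's (1973) implicit relation, reproduced here from the standard formulation (symbols: E activation energy, R gas constant, D_0 diffusion pre-exponential factor, a effective grain radius, A a geometry factor, dT/dt cooling rate):

```latex
T_c \;=\; \frac{E/R}{\ln\!\left(\dfrac{A\,\tau\,D_0}{a^{2}}\right)},
\qquad
\tau \;=\; \frac{R\,T_c^{2}}{E\,\left|\mathrm{d}T/\mathrm{d}t\right|}
```

The authors' point is that this relation presupposes thermally activated volume diffusion; when dissolution-reprecipitation in the presence of fluids is orders of magnitude faster, the measured date records fluid events rather than cooling through T_c.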
Low-thrust mission risk analysis, with application to a 1980 rendezvous with the comet Encke
NASA Technical Reports Server (NTRS)
Yen, C. L.; Smith, D. B.
1973-01-01
A computerized failure-process simulation procedure is used to evaluate the risk in a solar electric space mission. The procedure uses currently available thrust-subsystem reliability data and performs approximate simulations of the thrust-subsystem burn operation, the system failure processes, and the retargeting operations. The method is applied to assess the risks in carrying out a 1980 rendezvous mission to the comet Encke. Analysis of the results and evaluation of the effects of various risk factors on the mission show that system component failure rates are the limiting factors in attaining high mission reliability. It is also shown that a well-designed trajectory and system operation mode can be used effectively to partially compensate for unreliable thruster performance.
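A toy Monte Carlo in the spirit of such a failure-process simulation is sketched below: each of several redundant thrusters fails exponentially at a given rate, and the mission succeeds if enough thrusters survive the burn. All rates, durations, and redundancy levels are invented for illustration, not the Encke mission's actual figures.

```python
import math
import random

# Toy Monte Carlo mission-risk simulation: n_thrusters identical units,
# each surviving a burn of burn_hours with probability exp(-lam * t),
# where lam is the failure rate per 1000 hours.  Illustrative only.

def mission_success_prob(n_thrusters, needed, lam, burn_hours,
                         trials=20000, seed=1):
    rng = random.Random(seed)
    p_survive = math.exp(-lam * burn_hours / 1000.0)
    successes = 0
    for _ in range(trials):
        alive = sum(1 for _ in range(n_thrusters)
                    if rng.random() < p_survive)
        if alive >= needed:
            successes += 1
    return successes / trials

# Carrying spare thrusters raises mission reliability substantially
# even when individual units are unreliable.
print(mission_success_prob(8, 6, 0.05, 2000) >
      mission_success_prob(6, 6, 0.05, 2000))  # True
```

The same structure extends naturally to the retargeting logic the abstract mentions: after a simulated failure, re-plan the trajectory and continue the trial rather than scoring it as a loss.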
Astronomy, Data and the Problem of Large File Transfers
NASA Astrophysics Data System (ADS)
Bryant, J.; Sutorius, E.; Bunclark, P. S.
2008-08-01
During the lifetime of a survey a considerable amount of data is produced, and this data needs to be moved between processing centres quickly and efficiently. Over time there has been a jostling as to which method of transferring data from one location to another is best: tape, disk or network; as they say, "Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway." With the VDFS the most painless solution seems to have been using network transfers. We started by using JANET to transfer our data, but the rising monetary cost and unreliable transfer speeds when dealing with terabyte-scale data volumes led us to investigate alternatives. Here we describe our work on connecting to the UKLight network, run by UKERNA, alongside JANET. This network provided us with a dedicated 1 Gbit/s dark fibre for our sole use that allows us to transfer astronomical data between CASU and WFAU at speeds that are limited more by end server hardware than by the network (so maybe we can beat that station wagon).
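The station-wagon quip is worth a back-of-envelope check: for a large enough payload, physically shipping media beats even a dedicated link. The capacities and travel time below are invented round numbers, not the actual CASU-WFAU figures.

```python
# Back-of-envelope "station wagon versus network" comparison.
# Payload sizes and travel times are illustrative round numbers.

def link_hours(terabytes, gbit_per_s):
    """Hours to move the payload over a fully saturated link."""
    bits = terabytes * 8e12
    return bits / (gbit_per_s * 1e9) / 3600

def shipping_hours(travel_hours):
    """Hours to physically ship the media; capacity rides along free."""
    return travel_hours

# 10 TB over a saturated 1 Gbit/s link takes almost a day...
print(round(link_hours(10, 1.0), 1))  # 22.2
# ...so a 6-hour drive "beats" the network for that payload.
print(shipping_hours(6) < link_hours(10, 1.0))  # True
```

The catch, which the abstract's conclusion reflects, is latency and handling: the network delivers data continuously as it is produced, while the wagon delivers it in day-sized batches.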
NASA Technical Reports Server (NTRS)
Basu, J. P. (Principal Investigator); Dragich, S. M.; Mcguigan, D. P.
1978-01-01
The author has identified the following significant results. The stratification procedure in the new sampling strategy for LACIE included: (1) correlation test results indicating that an agrophysical stratum may be homogeneous with respect to agricultural density, but not with respect to wheat density; and (2) agrophysical unit homogeneity test results indicating that with respect to agricultural density many agrophysical units are not homogeneous, but removal of one or more refined strata from any such current agrophysical unit can make the strata homogeneous. The apportioning procedure results indicated that the current procedure is not performing well and that the apportioned estimates of refined strata wheat area are often unreliable.
Semi-Markov adjunction to the Computer-Aided Markov Evaluator (CAME)
NASA Technical Reports Server (NTRS)
Rosch, Gene; Hutchins, Monica A.; Leong, Frank J.; Babcock, Philip S., IV
1988-01-01
The rule-based Computer-Aided Markov Evaluator (CAME) program was expanded in its ability to incorporate the effect of fault-handling processes into the construction of a reliability model. The fault-handling processes are modeled as semi-Markov events, and CAME constructs an appropriate semi-Markov model. To solve the model, the program outputs it in a form which can be directly solved with the Semi-Markov Unreliability Range Evaluator (SURE) program. As a means of evaluating the alterations made to the CAME program, the program is used to model the reliability of portions of the Integrated Airframe/Propulsion Control System Architecture (IAPSA 2) reference configuration. The reliability predictions are compared with a previous analysis. The results bear out the feasibility of utilizing CAME to generate appropriate semi-Markov models of fault-handling processes.
Combating the Reliability Challenge of GPU Register File at Low Supply Voltage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Jingweijia; Song, Shuaiwen; Yan, Kaige
Supply voltage reduction is an effective approach to significantly reduce GPU energy consumption. As the largest on-chip storage structure, the GPU register file becomes the reliability hotspot that prevents further supply voltage reduction below the safe limit (Vmin) due to process variation effects. This work addresses the reliability challenge of the GPU register file at low supply voltages, which is an essential first step for aggressive supply voltage reduction of the entire GPU chip. We propose GR-Guard, an architectural solution that leverages long register dead time to enable reliable operations from unreliable register file at low voltages.
Tolerancing aspheres based on manufacturing knowledge
NASA Astrophysics Data System (ADS)
Wickenhagen, S.; Kokot, S.; Fuchs, U.
2017-10-01
A standard way of tolerancing optical elements or systems is to perform a Monte Carlo based analysis within a common optical design software package. Although different weightings and distributions are assumed, these approaches all rely on statistics, which usually means several hundreds or thousands of systems for reliable results. Thus, employing these methods for small batch sizes is unreliable, especially when aspheric surfaces are involved. The huge database of asphericon was used to investigate the correlation between the given tolerance values and measured data sets. The resulting probability distributions of these measured data were analyzed, aiming for a robust optical tolerancing process.
Tolerancing aspheres based on manufacturing statistics
NASA Astrophysics Data System (ADS)
Wickenhagen, S.; Möhl, A.; Fuchs, U.
2017-11-01
A standard way of tolerancing optical elements or systems is to perform a Monte Carlo based analysis within a common optical design software package. Although different weightings and distributions are assumed, these approaches all rely on statistics, which usually means several hundreds or thousands of systems for reliable results. Thus, employing these methods for small batch sizes is unreliable, especially when aspheric surfaces are involved. The huge database of asphericon was used to investigate the correlation between the given tolerance values and measured data sets. The resulting probability distributions of these measured data were analyzed, aiming for a robust optical tolerancing process.
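The small-batch problem both of these abstracts describe can be sketched in a few lines: a Monte Carlo tolerance loop perturbs a parameter within its band and reports the spread of a merit function, and with only a handful of trials that spread estimate is unstable. The merit function and tolerance value below are illustrative stand-ins, not taken from either paper.

```python
import random
import statistics

# Minimal sketch of a Monte Carlo tolerancing loop: perturb one surface
# parameter within its tolerance band and look at the spread of a merit
# function. Merit function and tolerance are illustrative assumptions.

def merit(radius_error_um):
    # toy merit: wavefront error growing quadratically with the radius error
    return 0.01 * radius_error_um ** 2

def mc_spread(n_systems, tol_um=5.0, seed=1):
    rng = random.Random(seed)
    samples = [merit(rng.uniform(-tol_um, tol_um)) for _ in range(n_systems)]
    return statistics.stdev(samples)

# ten systems give an unstable spread estimate; thousands are needed before
# the statistic settles -- the small-batch problem noted in the abstract
print(round(mc_spread(10), 4), round(mc_spread(10000), 4))
```

Replacing the assumed uniform distribution with measured manufacturing distributions, as the papers propose, changes exactly the `rng.uniform` line.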
Smith, David E; Raswyck, Glenn E; Davidson, Leigh Dickerson
2014-01-01
Since the discovery of its psychedelic properties in 1943, lysergic acid diethylamide (LSD) has been explored by psychiatric/therapeutic researchers, military/intelligence agencies, and a significant portion of the general population. Promising early research was halted by LSD's placement as a Schedule I drug in the early 1970s. The U.S. Army and CIA dropped their research after finding it unreliable for their purposes. NSDUH estimates that more than 22 million (9.1% of the population) have used LSD at least once in their lives. Recently, researchers have been investigating the therapeutic use of LSD and other psychedelics for end-of-life anxiety, post-traumatic stress disorder (PTSD), cancer, and addiction treatment. Adverse psychedelic reactions can be managed using talkdown techniques developed and in use since the 1960s.
Two-Step Fair Scheduling of Continuous Media Streams over Error-Prone Wireless Channels
NASA Astrophysics Data System (ADS)
Oh, Soohyun; Lee, Jin Wook; Park, Taejoon; Jo, Tae-Chang
In wireless cellular networks, streaming of continuous media (with strict QoS requirements) over wireless links is challenging due to their inherent unreliability, characterized by location-dependent, bursty errors. To address this challenge, we present a two-step scheduling algorithm for a base station to provide streaming of continuous media to wireless clients over error-prone wireless links. The proposed algorithm is capable of minimizing the packet loss rate of individual clients in the presence of error bursts, by transmitting packets in a round-robin manner and also adopting a mechanism for channel prediction and swapping.
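The two steps named in the abstract can be sketched as follows: serve clients round-robin, and when the current client's channel is predicted bad for this slot, swap its turn with the next client whose channel is predicted good. The channel-state oracle and client names below are toy placeholders, not the paper's prediction mechanism.

```python
# Sketch of the two-step idea: round-robin order plus predicted-channel
# swapping. channel_good(client, slot) stands in for the paper's channel
# predictor and is an illustrative assumption.

def schedule(clients, channel_good, n_slots):
    """Return the client transmitted to in each slot."""
    order = list(clients)
    tx = []
    for slot in range(n_slots):
        j = slot % len(order)                 # step 1: round-robin candidate
        if not channel_good(order[j], slot):  # step 2: swap with a good one
            for k in range(1, len(order)):
                alt = (j + k) % len(order)
                if channel_good(order[alt], slot):
                    order[j], order[alt] = order[alt], order[j]
                    break
        tx.append(order[j])
    return tx

# client 'b' suffers an error burst during slots 0-1, so others cover for it
good = lambda c, s: not (c == "b" and s < 2)
print(schedule(["a", "b", "c"], good, 4))
```

The swap preserves long-run fairness: 'b' is not skipped, only deferred until its channel recovers.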
Lord, Dominique
2006-07-01
There has been considerable research conducted on the development of statistical models for predicting crashes on highway facilities. Despite numerous advancements made for improving the estimation tools of statistical models, the most common probabilistic structure used for modeling motor vehicle crashes remains the traditional Poisson and Poisson-gamma (or Negative Binomial) distribution; when crash data exhibit over-dispersion, the Poisson-gamma model is usually the model favored by transportation safety modelers. Crash data collected for safety studies often have the unusual attribute of being characterized by low sample mean values. Studies have shown that the goodness-of-fit of statistical models produced from such datasets can be significantly affected. This issue has been defined as the "low mean problem" (LMP). Despite recent developments on methods to circumvent the LMP and test the goodness-of-fit of models developed using such datasets, no work has so far examined how the LMP affects the fixed dispersion parameter of Poisson-gamma models used for modeling motor vehicle crashes. The dispersion parameter plays an important role in many types of safety studies and should, therefore, be reliably estimated. The primary objective of this research project was to verify whether the LMP affects the estimation of the dispersion parameter and, if so, to determine the magnitude of the problem. The secondary objective consisted of determining the effects of an unreliably estimated dispersion parameter on common analyses performed in highway safety studies. To accomplish the objectives of the study, a series of Poisson-gamma distributions were simulated using different values describing the mean, the dispersion parameter, and the sample size.
Three estimators commonly used by transportation safety modelers for estimating the dispersion parameter of Poisson-gamma models were evaluated: the method of moments, the weighted regression, and the maximum likelihood method. In an attempt to complement the outcome of the simulation study, Poisson-gamma models were fitted to crash data collected in Toronto, Ont., characterized by a low sample mean and small sample size. The study shows that a low sample mean combined with a small sample size can seriously affect the estimation of the dispersion parameter, no matter which estimator is used within the estimation process. The probability that the dispersion parameter is unreliably estimated increases significantly as the sample mean and sample size decrease. Consequently, the results show that an unreliably estimated dispersion parameter can significantly undermine empirical Bayes (EB) estimates as well as the estimation of confidence intervals for the gamma mean and predicted response. The paper ends with recommendations about minimizing the likelihood of producing Poisson-gamma models with an unreliable dispersion parameter for modeling motor vehicle crashes.
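Of the three estimators named, the method of moments is the simplest to sketch: for a Poisson-gamma (negative binomial) model with Var = mu + alpha·mu², solving for alpha from the sample mean and variance gives the estimate. The simulation settings below are illustrative, not those used in the paper.

```python
import math
import random
import statistics

# Minimal sketch of the method-of-moments estimator for the dispersion
# parameter alpha of a Poisson-gamma (negative binomial) model, for which
# Var = mu + alpha * mu^2. Settings are illustrative, not the paper's.

def simulate_nb(mu, alpha, n, seed=0):
    rng = random.Random(seed)
    shape = 1.0 / alpha                            # gamma shape parameter
    counts = []
    for _ in range(n):
        lam = rng.gammavariate(shape, mu * alpha)  # gamma draw with mean mu
        # Poisson draw via the multiplication method (fine for modest lam)
        limit, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                break
            k += 1
        counts.append(k)
    return counts

def mom_alpha(xs):
    m = statistics.fmean(xs)
    v = statistics.variance(xs)
    return (v - m) / (m * m)                       # solve Var = mu + alpha*mu^2

big = simulate_nb(mu=5.0, alpha=0.5, n=20000)
small = simulate_nb(mu=0.5, alpha=0.5, n=50)       # low mean, small sample
print(round(mom_alpha(big), 2), round(mom_alpha(small), 2))
```

With a large sample and moderate mean the estimate sits near the true alpha = 0.5; the low-mean, small-sample draw illustrates how erratic (even negative) the estimate can become, which is the paper's LMP concern.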
NASA Astrophysics Data System (ADS)
Liou, Cheng-Dar
2015-09-01
This study investigates an infinite-capacity Markovian queue with a single unreliable service station, in which customers may balk (not enter) and renege (leave the queue after entering). The service station is subject to working breakdowns even when no customers are in the system. The matrix-analytic method is used to compute the steady-state probabilities for the number of customers, the rate matrix and the stability condition of the system. A single-objective model for cost and a bi-objective model for cost and expected waiting time are derived to fit practical applications. The particle swarm optimisation algorithm is implemented to find the optimal combinations of parameters in pursuit of minimum cost. Two different approaches to identifying the Pareto optimal set are used and compared: the epsilon-constraint method and the non-dominated sorting genetic algorithm. The comparison supports using the traditional epsilon-constraint method, which is computationally faster and permits a direct sensitivity analysis of the solution under constraint or parameter perturbation. The Pareto front and the set of non-dominated solutions are obtained and illustrated. Decision makers can use these to improve their decision-making quality.
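The balking and reneging mechanics can be illustrated with a much simpler birth-death model than the paper's: arrivals in state n join with a state-dependent probability, and each waiting customer reneges at rate delta, so the departure rate grows with queue length. This sketch omits the paper's working-breakdown dynamics entirely, and all rates are invented for illustration.

```python
# Birth-death sketch of an M/M/1 queue with balking and reneging (the
# unreliable-server dynamics of the paper are omitted). In state n, the
# effective arrival rate is lam*balk(n) and the departure rate is
# mu + n*delta (service plus reneging by the n waiting customers).
# All rates below are illustrative assumptions.

def steady_state(lam, mu, delta, balk, n_max=200):
    """Return truncated steady-state probabilities p[0..n_max]."""
    p = [1.0]
    for n in range(n_max):
        arrival = lam * balk(n)
        departure = mu + n * delta
        p.append(p[-1] * arrival / departure)
    z = sum(p)
    return [x / z for x in p]

p = steady_state(lam=3.0, mu=4.0, delta=0.5, balk=lambda n: 1.0 / (n + 1))
mean_n = sum(n * pn for n, pn in enumerate(p))
print(round(mean_n, 3))
```

The matrix-analytic method of the paper generalizes this scalar recursion to block-structured generators that track the server's breakdown state alongside the queue length.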
Gao, Xiang; Lin, Huaiying; Revanna, Kashi; Dong, Qunfeng
2017-05-10
Species-level classification for 16S rRNA gene sequences remains a serious challenge for microbiome researchers, because existing taxonomic classification tools for 16S rRNA gene sequences either do not provide species-level classification, or their classification results are unreliable. The unreliable results are due to the limitations in the existing methods which either lack solid probabilistic-based criteria to evaluate the confidence of their taxonomic assignments, or use nucleotide k-mer frequency as the proxy for sequence similarity measurement. We have developed a method that shows significantly improved species-level classification results over existing methods. Our method calculates true sequence similarity between query sequences and database hits using pairwise sequence alignment. Taxonomic classifications are assigned from the species to the phylum levels based on the lowest common ancestors of multiple database hits for each query sequence, and further classification reliabilities are evaluated by bootstrap confidence scores. The novelty of our method is that the contribution of each database hit to the taxonomic assignment of the query sequence is weighted by a Bayesian posterior probability based upon the degree of sequence similarity of the database hit to the query sequence. Our method does not need any training datasets specific for different taxonomic groups. Instead only a reference database is required for aligning to the query sequences, making our method easily applicable for different regions of the 16S rRNA gene or other phylogenetic marker genes. Reliable species-level classification for 16S rRNA or other phylogenetic marker genes is critical for microbiome research. Our software shows significantly higher classification accuracy than the existing tools and we provide probabilistic-based confidence scores to evaluate the reliability of our taxonomic classification assignments based on multiple database matches to query sequences. 
Despite its higher computational costs, our method is still suitable for analyzing large-scale microbiome datasets for practical purposes. Furthermore, our method can be applied for taxonomic classification of any phylogenetic marker gene sequences. Our software, called BLCA, is freely available at https://github.com/qunfengdong/BLCA .
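The core classification step can be caricatured as similarity-weighted voting over the lineages of the database hits, stopping at the lowest rank where one taxon dominates. The toy below is in the spirit of BLCA but is not its algorithm: the weighting is plain normalized similarity rather than the paper's Bayesian posterior, the confidence threshold is an arbitrary choice, and the lineages are invented.

```python
from collections import defaultdict

# Toy sketch of similarity-weighted taxonomic voting: each database hit
# contributes its lineage with a weight from its alignment similarity,
# and a rank is assigned only when one taxon clearly dominates. The
# weighting and threshold are illustrative, not BLCA's posterior scheme.

def assign(hits, threshold=0.8):
    """hits: list of (similarity, [phylum, ..., species]) tuples."""
    total = sum(s for s, _ in hits)
    lineage = []
    for r in range(len(hits[0][1])):
        votes = defaultdict(float)
        for s, lin in hits:
            votes[lin[r]] += s / total
        taxon, frac = max(votes.items(), key=lambda kv: kv[1])
        if frac < threshold:
            break                      # stop at the lowest confident rank
        lineage.append(taxon)
    return lineage

hits = [
    (0.99, ["Firmicutes", "Lactobacillus", "L. nodensis"]),
    (0.98, ["Firmicutes", "Lactobacillus", "L. tucceti"]),
    (0.97, ["Firmicutes", "Lactobacillus", "L. versmoldensis"]),
]
print(assign(hits))  # species-level votes are split, so assignment stops at genus
```

BLCA additionally bootstraps the alignment to attach confidence scores to each assigned rank, which this sketch does not attempt.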
NASA Astrophysics Data System (ADS)
Baksi, Ajoy K.
2018-04-01
40Ar/39Ar step-heating analyses were carried out on seven rocks (five basalts, an andesite and a rhyolite) from the southern Paraná Province (28°S-30°S); they yield plateau/isochron ages of 135-134 Ma, in good agreement with published step-heating data on rocks from the same area. Critical review of laser spot isochron ages for rocks from the Province, ranging from 140 to 130 Ma, shows them to be unreliable estimates of crystallization ages, as the rocks were substantially altered; step-heating results on three of these rocks thought to yield good plateau ages are shown to be incorrect, as a result of a technicality in the dating procedures followed. U-Pb ages on zircon and baddeleyite separated from a variety of rock types (30°S-23°S) fall in the range 135 to 134 Ma. All reliable 40Ar/39Ar and U-Pb ages indicate volcanism was sharply focused, initiated at 135 Ma, and 1 Myr in duration; no variation of age with either latitude or longitude is noted. Scrutiny of published 40Ar/39Ar ages on the Florianopolis dykes shows they cannot be used as reliable crystallization ages. U-Pb work shows that this dyke swarm was formed coevally with the main part of the Paraná Province. Most of the published 40Ar/39Ar ages on the Ponta Grossa dyke swarm are unreliable; a few ages appear reliable and suggest the magmatic event in this area may have postdated the main Paraná pulse by 1-2 Myr. A single 40Ar/39Ar age from a high-Nb basalt in the southernmost part (34°S) of the Paraná, at 135 Ma, highlights the need for further radiometric work on other areas of this flood basalt province. The Paraná Province postdates the Jurassic-Cretaceous boundary by 10 Myr.
Evaluation of Ages in the Lunar Highlands with Implications for the Evolution of the Moon
NASA Astrophysics Data System (ADS)
Borg, L. E.; Gaffney, A. M.; Carlson, R. W.
2012-12-01
The lunar highlands are composed of rocks from the ferroan anorthosite (FAN) and Mg-suites. These samples have been extensively studied because they record most of the major events associated with the formation and evolution of the Earth's Moon. Despite their potential to constrain the timing of these events, chronologic investigations are often ambiguous, in most cases because absolute ages and/or initial isotopic compositions are inconsistent with stratigraphic and petrologic relationships of the various rock suites inferred from mineralogical and geochemical studies. The problem is exacerbated by the fact that most samples are difficult to date due to their small size and nearly monomineralic nature, as well as isotopic disturbances associated with impacts. Here several criteria are used to assess the reliability of lunar ages, including: (1) concordance between multiple chronometers, (2) linearity of individual isochrons, (3) resistance of the chronometers to disruption by impact or contamination, (4) consistency between initial isotopic compositions and the petrogenesis of samples, and (5) reasonableness of the elemental concentrations of mineral fractions. If only those samples that meet 4 out of 5 of these criteria are used to constrain lunar chronology, many of the apparent conflicts between chronometry and petrology disappear. For example, this analysis demonstrates that the most ancient ages reported for lunar samples are some of the least reliable. The oldest ages determined on both FAN and Mg-suite highland rocks with confidence are in fact ~4.35 Ga. This age is concordant with 142Nd mare source formation ages and a peak in zircon ages, suggesting it represents a major event at ~4.35 Ga. In contrast, several apparently reliable KREEP model ages are older at ~4.48 Ga.
If these older model ages are correct, they may represent the solidification age of the Moon, whereas the 4.35 Ga event could reflect secondary magmatism and cumulate re-equilibration associated with density overturn of primordial magma ocean cumulates.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moresco, M.; Cimatti, A.; Jimenez, R.
2012-08-01
We present new improved constraints on the Hubble parameter H(z) in the redshift range 0.15 < z < 1.1, obtained from the differential spectroscopic evolution of early-type galaxies as a function of redshift. We extract a large sample of early-type galaxies ( ∼ 11000) from several spectroscopic surveys, spanning almost 8 billion years of cosmic lookback time (0.15 < z < 1.42). We select the most massive, red elliptical galaxies, passively evolving and without signature of ongoing star formation. Those galaxies can be used as standard cosmic chronometers, as first proposed by Jimenez and Loeb (2002), whose differential age evolution as a function of cosmic time directly probes H(z). We analyze the 4000 Å break (D4000) as a function of redshift, use stellar population synthesis models to theoretically calibrate the dependence of the differential age evolution on the differential D4000, and estimate the Hubble parameter taking into account both statistical and systematic errors. We provide 8 new measurements of H(z), determined to a precision of 5–12% and mapping homogeneously the redshift range up to z ∼ 1.1; for the first time, we place a constraint on H(z) at z ≠ 0 with a precision comparable to the one achieved for the Hubble constant (about 5–6% at z ∼ 0.2), and cover a redshift range (0.5 < z < 0.8) which is crucial to distinguish many different quintessence cosmologies. These measurements have been tested to best match a ΛCDM model, clearly providing a statistically robust indication that the Universe is undergoing an accelerated expansion. This method shows the potential to open a new avenue for constraining a variety of alternative cosmologies, especially when future surveys (e.g. Euclid) open the possibility of extending it up to z ∼ 2.
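The cosmic-chronometer relation underlying this measurement is H(z) = -1/(1+z) · dz/dt: the differential ages of passively evolving galaxies at nearby redshifts give dt/dz directly. The sketch below evaluates it for a pair of invented (redshift, age) values; the numbers are for illustration only, not data from the paper.

```python
# Sketch of H(z) = -1/(1+z) * dz/dt from two (redshift, age) pairs of
# passively evolving galaxies. Input ages and redshifts are invented.

GYR_TO_S = 3.156e16          # seconds per gigayear
MPC_TO_KM = 3.086e19         # kilometres per megaparsec

def hubble(z1, age1_gyr, z2, age2_gyr):
    """H at the mean redshift, in km/s/Mpc, from two (z, age) pairs."""
    zbar = 0.5 * (z1 + z2)
    dz_dt = (z2 - z1) / ((age2_gyr - age1_gyr) * GYR_TO_S)   # 1/s
    h_per_s = -dz_dt / (1.0 + zbar)
    return h_per_s * MPC_TO_KM

# galaxies 0.8 Gyr apart in age across a redshift step of 0.1
print(round(hubble(0.40, 9.0, 0.50, 8.2), 1))
```

Note the sign convention: age decreases as redshift increases, so dz/dt is negative and H comes out positive.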
Boysen, Angela K; Heal, Katherine R; Carlson, Laura T; Ingalls, Anitra E
2018-01-16
The goal of metabolomics is to measure the entire range of small organic molecules in biological samples. In liquid chromatography-mass spectrometry-based metabolomics, formidable analytical challenges remain in removing the nonbiological factors that affect chromatographic peak areas. These factors include sample matrix-induced ion suppression, chromatographic quality, and analytical drift. The combination of these factors is referred to as obscuring variation. Some metabolomics samples can exhibit intense obscuring variation due to matrix-induced ion suppression, rendering large amounts of data unreliable and difficult to interpret. Existing normalization techniques have limited applicability to these sample types. Here we present a data normalization method to minimize the effects of obscuring variation. We normalize peak areas using a batch-specific normalization process, which matches measured metabolites with isotope-labeled internal standards that behave similarly during the analysis. This method, called best-matched internal standard (B-MIS) normalization, can be applied to targeted or untargeted metabolomics data sets and yields relative concentrations. We evaluate and demonstrate the utility of B-MIS normalization using marine environmental samples and laboratory grown cultures of phytoplankton. In untargeted analyses, B-MIS normalization allowed for inclusion of mass features in downstream analyses that would have been considered unreliable without normalization due to obscuring variation. B-MIS normalization for targeted or untargeted metabolomics is freely available at https://github.com/IngallsLabUW/B-MIS-normalization .
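The matching step at the heart of B-MIS can be sketched compactly: for each metabolite, try normalizing its peak areas by every internal standard and keep the standard that minimizes the relative standard deviation across replicate injections of a pooled sample. The data and standard names below are invented for illustration; the published tool adds batch handling and acceptance criteria not shown here.

```python
import statistics

# Sketch of the B-MIS matching idea: pick, per metabolite, the internal
# standard (IS) whose normalization minimizes the relative standard
# deviation (RSD) across pooled-sample replicate injections. Data and IS
# names are illustrative assumptions.

def rsd(xs):
    return statistics.stdev(xs) / statistics.fmean(xs)

def best_matched_is(metabolite_areas, standards):
    """standards: dict name -> per-injection IS peak areas (same length)."""
    scores = {}
    for name, is_areas in standards.items():
        normalized = [m / s for m, s in zip(metabolite_areas, is_areas)]
        scores[name] = rsd(normalized)
    return min(scores, key=scores.get)

# the metabolite's drift tracks "d3-alanine" but not "d5-glycine"
met = [100, 120, 80, 110]
standards = {
    "d3-alanine": [50, 60, 40, 55],     # same relative pattern as met
    "d5-glycine": [50, 50, 50, 50],     # flat, leaves the drift in place
}
print(best_matched_is(met, standards))
```

Dividing by the best-matched standard cancels the shared obscuring variation, which is why the matched choice drives the replicate RSD toward zero.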
Hill, Hugh G. M.; Grady, Carol A.; Nuth, Joseph A.; Hallenbeck, Susan L.; Sitko, Michael L.
2001-01-01
Understanding dynamic conditions in the Solar Nebula is the key to prediction of the material to be found in comets. We suggest that a dynamic, large-scale circulation pattern brings processed dust and gas from the inner nebula back out into the region of cometesimal formation—extending possibly hundreds of astronomical units (AU) from the sun—and that the composition of comets is determined by a chemical reaction network closely coupled to the dynamic transport of dust and gas in the system. This scenario is supported by laboratory studies of Mg silicates and the astronomical data for comets and for protoplanetary disks associated with young stars, which demonstrate that annealing of nebular silicates must occur in conjunction with a large-scale circulation. Mass recycling of dust should have a significant effect on the chemical kinetics of the outer nebula by introducing reduced, gas-phase species produced in the higher temperature and pressure environment of the inner nebula, along with freshly processed grains with “clean” catalytic surfaces to the region of cometesimal formation. Because comets probably form throughout the lifetime of the Solar Nebula and processed (crystalline) grains are not immediately available for incorporation into the first generation of comets, an increasing fraction of dust incorporated into a growing comet should be crystalline olivine and this fraction can serve as a crude chronometer of the relative ages of comets. The formation and evolution of key organic and biogenic molecules in comets are potentially of great consequence to astrobiology. PMID:11226213
Robust detection of heartbeats using association models from blood pressure and EEG signals.
Jeon, Taegyun; Yu, Jongmin; Pedrycz, Witold; Jeon, Moongu; Lee, Boreom; Lee, Byeongcheol
2016-01-15
The heartbeat is fundamental cardiac activity which is straightforwardly detected with a variety of measurement techniques for analyzing physiological signals. Unfortunately, unexpected noise or contaminated signals can distort or cut out electrocardiogram (ECG) signals in practice, misleading heartbeat detectors into reporting a false heart rate or, in the worst case, suspending themselves for a considerable length of time. To deal with the problem of unreliable heartbeat detection, PhysioNet/CinC proposed a challenge in 2014 for developing robust heartbeat detectors using multimodal signals. This article proposes a multimodal data association method that supplements ECG as a primary input signal with blood pressure (BP) and electroencephalogram (EEG) as complementary input signals when input signals are unreliable. If the current signal quality index (SQI) qualifies ECG as a reliable input signal, our method applies QRS detection to ECG and reports heartbeats. Otherwise, the current SQI selects the best supplementary input signal between BP and EEG after evaluating the current SQI of BP. When BP is chosen as a supplementary input signal, our association model between ECG and BP enables us to compute their regular intervals, detect characteristic BP signals, and estimate the locations of the heartbeat. When both ECG and BP are not qualified, our fusion method resorts to the association model between ECG and EEG that allows us to apply an adaptive filter to ECG and EEG, extract the QRS candidates, and report heartbeats. The proposed method achieved an overall score of 86.26 % for the test data when the input signals are unreliable. Our method outperformed the traditional method, which achieved 79.28 % using the QRS and BP detectors from PhysioNet. Our multimodal signal processing method outperforms the conventional unimodal method of taking ECG signals alone for both training and test data sets.
To detect the heartbeat robustly, we have proposed a novel multimodal data association method of supplementing ECG with a variety of physiological signals and accounting for the patient-specific lag between different pulsatile signals and ECG. Multimodal signal detectors and data-fusion approaches such as those proposed in this article can reduce false alarms and improve patient monitoring.
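The signal-selection cascade the abstract describes reduces to a small decision function: use ECG when its SQI qualifies it, otherwise fall back to BP, then to EEG. The SQI threshold and detector labels below are illustrative placeholders, not values from the paper.

```python
# Sketch of the SQI-driven fallback: ECG first, then BP, then EEG.
# The threshold 0.7 is an illustrative assumption, not the paper's value.

SQI_OK = 0.7

def pick_source(sqi_ecg, sqi_bp):
    if sqi_ecg >= SQI_OK:
        return "ECG"          # QRS detection on the primary signal
    if sqi_bp >= SQI_OK:
        return "BP"           # ECG-BP association model
    return "EEG"              # ECG-EEG association model + adaptive filter

print(pick_source(0.9, 0.2), pick_source(0.4, 0.8), pick_source(0.3, 0.3))
```

The paper's contribution sits inside each branch (the association models and patient-specific lag handling); the cascade itself is this simple.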
A Comparative Study of Shaping Ability of four Rotary Systems
Zarzosa, José Ignacio; Pallarés, Antonio
2015-01-01
Purpose This study compared the cutting area, instrumentation time, root canal anatomy preservation and non-instrumented areas obtained by F360®, Mtwo®, RaCe® and Hyflex® files with ISO size 35. Material and Methods 120 teeth with a single straight root and root canal were divided into 4 groups. Working length was calculated by using X-rays. The teeth were sectioned with a handpiece and a diamond disc, and the sections were observed with a Nikon SMZ-2T stereoscopic microscope and an Intralux 4000-1 light source. The groups were adjusted with a preoperative analysis in AutoCAD. The teeth were reconstructed by a #10 K-File and epoxy glue. Each group was instrumented with one of the four file systems. The instrumentation time was calculated with a 1/100 second chronometer. The area of the thirds and root canal anatomy preservation were analyzed with AutoCAD 2013 and the non-instrumented areas with AutoCAD 2013 and the SMZ-2T stereoscopic microscope. The statistical analysis was made with Levene's Test, ANOVA, the Bonferroni Test and Pearson's Chi-square. Results Equal variances were shown by Levene's Test (P > 0.05). ANOVA (P > 0.05) showed the absence of significant differences. There were significant differences in the instrumentation time (P < 0.05). For root canal anatomy preservation and non-instrumented areas, there were no significant differences between all systems (P > 0.05). Conclusions The 4 different rotary systems produced similar cutting area, root canal anatomy preservation and non-instrumented areas. Regarding instrumentation time, F360® was statistically the fastest system. PMID:27688412
Lactobacillus insicii sp. nov., isolated from fermented raw meat.
Ehrmann, Matthias A; Kröckel, Lothar; Lick, Sonja; Radmann, Pia; Bantleon, Annegret; Vogel, Rudi F
2016-01-01
The analysis of the bacterial microbiota of retain samples of pork salami revealed an isolate (strain TMW 1.2011T) that could neither be assigned to typical genera of starter organisms nor to any other known meat-associated species. Cells were Gram-stain-positive, short, straight rods occurring singly, in pairs or short chains. Phylogenetic analysis of the 16S rRNA gene sequence and specific phenotypic characteristics showed that strain TMW 1.2011T belonged to the phylogenetic Lactobacillus alimentarius group, and the closest neighbours were Lactobacillus nodensis JCM 14932T (97.8 % 16S rRNA gene sequence similarity), Lactobacillus tucceti DSM 20183T (97.4 %), 'Lactobacillus ginsenosidimutans' EMML 3041 (97.3 %), Lactobacillus versmoldensis DSM 14857T (96.9 %) and Lactobacillus furfuricola JCM 18764T (97.2 %). Similarities using partial gene sequences of the alternative chronometers pheS, dnaK and rpoA also support these relationships. DNA-DNA relatedness between the novel isolate and L. nodensis JCM 14932T, L. versmoldensis DSM 14857T and L. tucceti DSM 20183T, L. furfuricola JCM 18764T and 'L. ginsenosidimutans' EMML 3041 were below 70 % and the DNA G+C content was 36.3 mol%. The cell-wall peptidoglycan type is l-Lys-Gly-d-Asp. Based on phylogenetic, chemotaxonomic and physiological evidence, strain TMW 1.2011T represents a novel species of the genus Lactobacillus, for which the name Lactobacillus insicii sp. nov. is proposed. The type strain is TMW 1.2011T ( = CECT 8802T = DSM 29801T).
Hart-Smith, Gene; Yagoub, Daniel; Tay, Aidan P.; Pickford, Russell; Wilkins, Marc R.
2016-01-01
All large scale LC-MS/MS post-translational methylation site discovery experiments require methylpeptide spectrum matches (methyl-PSMs) to be identified at acceptably low false discovery rates (FDRs). To meet estimated methyl-PSM FDRs, methyl-PSM filtering criteria are often determined using the target-decoy approach. The efficacy of this methyl-PSM filtering approach has, however, yet to be thoroughly evaluated. Here, we conduct a systematic analysis of methyl-PSM FDRs across a range of sample preparation workflows (each differing in their exposure to the alcohols methanol and isopropyl alcohol) and mass spectrometric instrument platforms (each employing a different mode of MS/MS dissociation). Through 13CD3-methionine labeling (heavy-methyl SILAC) of Saccharomyces cerevisiae cells and in-depth manual data inspection, accurate lists of true positive methyl-PSMs were determined, allowing methyl-PSM FDRs to be compared with target-decoy approach-derived methyl-PSM FDR estimates. These results show that global FDR estimates produce extremely unreliable methyl-PSM filtering criteria; we demonstrate that this is an unavoidable consequence of the high number of amino acid combinations capable of producing peptide sequences that are isobaric to methylated peptides of a different sequence. Separate methyl-PSM FDR estimates were also found to be unreliable due to prevalent sources of false positive methyl-PSMs that produce high peptide identity score distributions. Incorrect methylation site localizations, peptides containing cysteinyl-S-β-propionamide, and methylated glutamic or aspartic acid residues can partially, but not wholly, account for these false positive methyl-PSMs. Together, these results indicate that the target-decoy approach is an unreliable means of estimating methyl-PSM FDRs and methyl-PSM filtering criteria. We suggest that orthogonal methylpeptide validation (e.g. 
heavy-methyl SILAC or its offshoots) should be considered a prerequisite for obtaining high confidence methyl-PSMs in large scale LC-MS/MS methylation site discovery experiments and make recommendations on how to reduce methyl-PSM FDRs in samples not amenable to heavy isotope labeling. Data are available via ProteomeXchange with the data identifier PXD002857. PMID:26699799
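The target-decoy estimate whose calibration this paper questions is itself one line of arithmetic: above a score threshold, FDR ≈ (#decoy hits) / (#target hits). The scores below are invented; the paper's point is that for methyl-PSMs this estimate can be badly miscalibrated because isobaric non-methylated peptides inflate the target distribution.

```python
# Minimal sketch of target-decoy FDR estimation for peptide-spectrum
# matches: FDR above a score threshold is estimated as decoys/targets.
# Scores are illustrative.

def estimated_fdr(target_scores, decoy_scores, threshold):
    t = sum(1 for s in target_scores if s >= threshold)
    d = sum(1 for s in decoy_scores if s >= threshold)
    return d / t if t else 0.0

targets = [55, 48, 42, 40, 33, 30, 28, 22, 20, 15]
decoys = [41, 31, 25, 18, 12, 10, 9, 8, 7, 5]
print(estimated_fdr(targets, decoys, threshold=30))
```

The estimate assumes false target hits and decoy hits share a score distribution; the paper's finding is precisely that this assumption breaks down for methyl-PSMs.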
Effect of food processing on plant DNA degradation and PCR-based GMO analysis: a review.
Gryson, Nicolas
2010-03-01
The applicability of a DNA-based method for GMO detection and quantification depends on the quality and quantity of the DNA. Important food-processing conditions, for example temperature and pH, may lead to degradation of the DNA, rendering PCR analysis impossible or GMO quantification unreliable. This review discusses the effect of several food processes on DNA degradation and subsequent GMO detection and quantification. The data show that, although many of these processes do indeed lead to the fragmentation of DNA, amplification of the DNA may still be possible. Length and composition of the amplicon may, however, affect the result, as also may the method of extraction used. Also, many techniques are used to describe the behaviour of DNA in food processing, which occasionally makes it difficult to compare research results. Further research should be aimed at defining ingredients in terms of their DNA quality and PCR amplification ability, and elaboration of matrix-specific certified reference materials.
Dual processing model of medical decision-making
2012-01-01
Background Dual processing theory of human cognition postulates that reasoning and decision-making can be described as a function of both an intuitive, experiential, affective system (system I) and/or an analytical, deliberative (system II) processing system. To date no formal descriptive model of medical decision-making based on dual processing theory has been developed. Here we postulate such a model and apply it to a common clinical situation: whether treatment should be administered to the patient who may or may not have a disease. Methods We developed a mathematical model in which we linked a recently proposed descriptive psychological model of cognition with the threshold model of medical decision-making and show how this approach can be used to better understand decision-making at the bedside and explain the widespread variation in treatments observed in clinical practice. Results We show that physicians' beliefs about whether to treat at higher (lower) probability levels than the prescriptive therapeutic thresholds obtained via system II processing are moderated by system I and by the ratio of benefits and harms as evaluated by both systems. Under some conditions, the system I decision maker's threshold may drop dramatically below the expected utility threshold derived by system II. This can explain the overtreatment often seen in contemporary practice. The opposite can also occur, as in situations where empirical evidence is considered unreliable, or when cognitive processes of decision-makers are biased through recent experience: the threshold will increase relative to the normative threshold value derived via system II using the expected utility threshold. This inclination toward higher diagnostic certainty may, in turn, explain the undertreatment also documented in current medical practice.
Conclusions We have developed the first dual processing model of medical decision-making, which has the potential to enrich the medical decision-making field, still dominated to a large extent by expected utility theory. The model also provides a platform for reconciling two groups of competing dual processing theories (parallel-competitive and default-interventionist theories). PMID:22943520
ARTiiFACT: a tool for heart rate artifact processing and heart rate variability analysis.
Kaufmann, Tobias; Sütterlin, Stefan; Schulz, Stefan M; Vögele, Claus
2011-12-01
The importance of appropriate handling of artifacts in interbeat interval (IBI) data must not be underestimated. Even a single artifact may cause unreliable heart rate variability (HRV) results. Thus, a robust artifact detection algorithm and the option for manual intervention by the researcher form key components for confident HRV analysis. Here, we present ARTiiFACT, a software tool for processing electrocardiogram and IBI data. Both automated and manual artifact detection and correction are available in a graphical user interface. In addition, ARTiiFACT includes time- and frequency-based HRV analyses and descriptive statistics, thus offering the basic tools for HRV analysis. Notably, all program steps can be executed separately and allow for data export, thus offering high flexibility and interoperability with a whole range of applications.
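The kind of processing such a tool automates can be illustrated with a minimal sketch. The code below is not ARTiiFACT's actual algorithm; it is a generic illustration of threshold-based artifact detection on interbeat intervals (IBIs, in ms), interpolation-based correction, and one time-domain HRV index (RMSSD). The window size and the deviation tolerance `tol` are hypothetical choices.

```python
import statistics

def detect_artifacts(ibi, tol=0.3):
    """Flag beats deviating more than tol (as a fraction) from a local median."""
    flags = []
    for i in range(len(ibi)):
        window = ibi[max(0, i - 5):i + 6]          # 11-beat local window
        med = statistics.median(window)
        flags.append(abs(ibi[i] - med) > tol * med)
    return flags

def correct_artifacts(ibi, flags):
    """Replace each flagged beat by the mean of its nearest clean neighbours."""
    out = list(ibi)
    for i, bad in enumerate(flags):
        if not bad:
            continue
        left = next((out[j] for j in range(i - 1, -1, -1) if not flags[j]), None)
        right = next((ibi[j] for j in range(i + 1, len(ibi)) if not flags[j]), None)
        if left is not None and right is not None:
            out[i] = (left + right) / 2.0
    return out

def rmssd(ibi):
    """Root mean square of successive differences, a time-domain HRV index."""
    d = [b - a for a, b in zip(ibi, ibi[1:])]
    return (sum(x * x for x in d) / len(d)) ** 0.5
```

On the series [800, 810, 795, 1600, 805, 790], the single spurious 1600 ms interval is flagged and interpolated to 800 ms, after which RMSSD reflects only genuine beat-to-beat variability; this is precisely why a single uncorrected artifact can distort HRV results.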
Renormalization group theory for percolation in time-varying networks.
Karschau, Jens; Zimmerling, Marco; Friedrich, Benjamin M
2018-05-22
Motivated by multi-hop communication in unreliable wireless networks, we present a percolation theory for time-varying networks. We develop a renormalization group theory for a prototypical network on a regular grid, where individual links switch stochastically between active and inactive states. The question whether a given source node can communicate with a destination node along paths of active links is equivalent to a percolation problem. Our theory maps the temporal existence of multi-hop paths on an effective two-state Markov process. We show analytically how this Markov process converges towards a memoryless Bernoulli process as the hop distance between source and destination node increases. Our work extends classical percolation theory to the dynamic case and elucidates temporal correlations of message losses. Quantification of temporal correlations has implications for the design of wireless communication and control protocols, e.g. in cyber-physical systems such as self-organized swarms of drones or smart traffic networks.
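The convergence toward a memoryless process can be illustrated with a toy calculation (this is not the paper's renormalization scheme, only an elementary analogue): for a path of n independent two-state Markov links, the lag-1 autocorrelation of the "all links active" indicator shrinks toward zero as n grows, so the path process approaches a Bernoulli process. The switching probabilities below are hypothetical.

```python
def path_lag1_correlation(n, p_up=0.2, p_down=0.1):
    """Lag-1 autocorrelation of the up/down state of an n-link path.

    Each link is an independent two-state Markov chain with
    p_up = P(inactive -> active) and p_down = P(active -> inactive);
    the path is up only when all n links are active.
    """
    pi = p_up / (p_up + p_down)         # stationary P(link active)
    a = 1.0 - p_down                    # P(link stays active over one step)
    up_n, a_n = pi ** n, a ** n
    # Corr(X_t, X_{t+1}) for the indicator X = "all n links active"
    return (a_n - up_n) / (1.0 - up_n)
```

For a single link this reduces to the familiar two-state autocorrelation 1 - p_up - p_down; as the hop count n increases the correlation decays toward zero, i.e. successive path states become effectively independent, which is the memoryless Bernoulli limit described in the abstract.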
Progress in the Early Solar System Chronology: A Sketch of an Ever-Changing Landscape
NASA Technical Reports Server (NTRS)
Amelin, Yuri; Yin, Q.-Z.; Krot, A. N.; Bouvier, A.; Wadhwa, M.; Kleine, T.; Nyquist, L. E.
2011-01-01
The years since the Workshop on the Chronology of Meteorites and the Early Solar System have been marked by ongoing progress in cosmochronology. Rapid improvements in techniques, the discovery of new meteorites unlike any previously known, and findings that what were deemed well-established constants are actually variables will be reflected in an updated review of solar system chronology that we are currently preparing. Along with updating the database of meteorite ages, it will involve developing a set of criteria for evaluating the accuracy and consistency of isotopic dates across the entire range of meteorite classes and isotope chronometer systems. Here we present some ideas on what we think is important in meteorite chronology, and invite the cosmochemistry community to discuss them.
HOW OLD IS IT? - 241PU/241AM NUCLEAR FORENSIC CHRONOLOGY REFERENCE MATERIALS
Fitzgerald, Ryan; Inn, Kenneth G.W.; Horgan, Christopher
2018-01-01
One material attribute for nuclear forensics is material age. 241Pu is almost always present in uranium- and plutonium-based nuclear weapons, which pose the greatest threat to our security. The in-growth of 241Am due to the decay of 241Pu provides an excellent chronometer of the material. A well-characterized 241Pu/241Am standard is needed to validate measurement capability, as a basis for between-laboratory comparability, and as material for verifying laboratory performance. This effort verifies the certification of a 38 year old 241Pu Standard Reference Material (SRM4340) through alpha-gamma anticoincidence counting, and also establishes the separation date to two weeks of the documented date. PMID:29720779
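The principle of the 241Pu/241Am chronometer can be sketched as a two-member decay chain: assuming 241Am was completely removed at purification (t = 0), the atom ratio N(241Am)/N(241Pu) grows with a known Bateman form, so the observed ratio dates the separation. The code below is an illustration of that relation, not the study's metrology; the half-lives are nominal literature values.

```python
import math

T_HALF_PU241 = 14.329   # years (nominal literature value)
T_HALF_AM241 = 432.6    # years (nominal literature value)

def ratio_from_age(t):
    """Atom ratio N(241Am)/N(241Pu) after t years of ingrowth (pure Am-free start)."""
    lam_pu = math.log(2) / T_HALF_PU241
    lam_am = math.log(2) / T_HALF_AM241
    d = lam_pu - lam_am
    return lam_pu * (math.exp(d * t) - 1.0) / d

def age_from_ratio(r):
    """Invert the Bateman ingrowth relation to recover the separation age (years)."""
    lam_pu = math.log(2) / T_HALF_PU241
    lam_am = math.log(2) / T_HALF_AM241
    d = lam_pu - lam_am
    return math.log(1.0 + r * d / lam_pu) / d
```

A measured Am/Pu atom ratio thus yields the time since the last chemical purification, which is what the reference material must anchor.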
Distributed parameter estimation in unreliable sensor networks via broadcast gossip algorithms.
Wang, Huiwei; Liao, Xiaofeng; Wang, Zidong; Huang, Tingwen; Chen, Guo
2016-01-01
In this paper, we present an asynchronous algorithm to estimate an unknown parameter over an unreliable network that allows new sensors to join and old sensors to leave, and that can tolerate link failures. Each sensor has access to partially informative measurements when it is awakened. In addition, the proposed algorithm can avoid interference among messages and effectively reduce the accumulated measurement and quantization errors. Based on the theory of stochastic approximation, we prove that the proposed algorithm almost surely converges to the unknown parameter. Finally, we present a numerical example to assess the performance and the communication cost of the algorithm.
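A toy broadcast-gossip sketch conveys the communication pattern (this is not the paper's stochastic-approximation scheme): at each step a randomly awakened node broadcasts its value and its neighbours move halfway toward it, with some broadcasts lost on unreliable links. The ring topology and the 80% link reliability below are hypothetical; note that plain broadcast gossip reaches consensus but does not preserve the exact network average.

```python
import random

def broadcast_gossip(values, neighbours, steps=3000, seed=0):
    """Run `steps` broadcast-gossip updates over an unreliable network.

    values:     initial scalar value held by each node
    neighbours: mapping node index -> list of neighbour indices
    """
    rng = random.Random(seed)
    x = list(values)
    for _ in range(steps):
        i = rng.randrange(len(x))          # randomly awakened broadcaster
        for j in neighbours[i]:
            if rng.random() < 0.8:         # unreliable link: 80% delivery
                x[j] = 0.5 * (x[j] + x[i]) # convex move toward broadcaster
    return x
```

Because every update is a convex combination, all values stay within the initial range while the spread between nodes contracts toward consensus, even under random link failures.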
Effects of antineoplastic drugs on Lactobacillus casei and radioisotopic assays for serum folate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carmel, R.
1978-02-01
Microbiologic assay, usually employing Lactobacillus casei, remains the standard assay for serum folate to date. Among its disadvantages have been falsely low results in patients receiving bacteriostatic agents such as antibiotics. This study examined whether commonly used antineoplastic drugs had similar effects. Methotrexate and 5-fluorouracil depressed microbiologic serum folate levels. No effect was found for adriamycin, bleomycin, BCNU, cyclophosphamide, cytosine arabinoside, vincristine, vinblastine, mechlorethamine, mithramycin, hydroxyurea, and hydrocortisone. None of the drugs affected radioassay except methotrexate, which produced falsely high folate results. Thus, it appears that the L. casei assay for folate becomes unreliable in patients receiving 5-fluorouracil, and the radioisotopic assay becomes unreliable in those receiving methotrexate.
NASA Astrophysics Data System (ADS)
Lloyd, Alexander S.; Ruprecht, Philipp; Hauri, Erik H.; Rose, William; Gonnermann, Helge M.; Plank, Terry
2014-08-01
The explosivity of volcanic eruptions is governed in part by the rate at which magma ascends and degasses. Because the time scales of eruptive processes can be exceptionally fast relative to standard geochronometers, magma ascent rate remains difficult to quantify. Here we use as a chronometer concentration gradients of volatile species along open melt embayments within olivine crystals. Continuous degassing of the external melt during magma ascent results in diffusion of volatile species from embayment interiors to the bubble located at their outlets. The novel aspect of this study is the measurement of concentration gradients in five volatile elements (CO2, H2O, S, Cl, F) at fine-scale (5-10 μm) using the NanoSIMS. The wide range in diffusivity and solubility of these different volatiles provides multiple constraints on ascent timescales over a range of depths. We focus on four 100-200 μm, olivine-hosted embayments erupted on October 17, 1974 during the sub-Plinian eruption of Volcán de Fuego. H2O, CO2, and S all decrease toward the embayment outlet bubble, while F and Cl increase or remain roughly constant. Compared to an extensive melt inclusion suite from the same day of the eruption, the embayments have lost both H2O and CO2 throughout the entire length of the embayment. We fit the profiles with a 1-D numerical diffusion model that allows varying diffusivities and external melt concentrations as a function of pressure. Assuming a constant decompression rate from the magma storage region at approximately 220 MPa to the surface, H2O, CO2 and S profiles for all embayments can be fit with a relatively narrow range in decompression rates of 0.3-0.5 MPa/s, equivalent to 11-17 m/s ascent velocity and an 8 to 12 minute duration of magma ascent from ~ 10 km depth. 
A two stage decompression model takes advantage of the different depth ranges over which CO2 and H2O degas, and produces good fits given an initial stage of slow decompression (0.05-0.3 MPa/s) at high pressure (> 145 MPa), with similar decompression rates to the single-stage model for the shallower stage. The magma ascent rates reported here are among the first for explosive basaltic eruptions and demonstrate the potential of the embayment method for quantifying magmatic timescales associated with eruptions of different vigor.
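The fitting approach rests on a 1-D diffusion calculation along the embayment. The sketch below is an illustrative finite-difference version only, not the authors' full model (which varies diffusivity and solubility with pressure): a single volatile with constant diffusivity, a no-flux boundary at the embayment interior, and an outlet concentration that falls as the external melt degasses. All parameter values are hypothetical placeholders.

```python
def diffuse_embayment(n=50, L=150e-6, D=1e-11, t_total=600.0,
                      c0=4.0, c_out_final=0.5):
    """Explicit 1-D diffusion in an embayment of length L (m) during degassing.

    n: grid points, D: diffusivity (m^2/s), t_total: ascent duration (s),
    c0: initial concentration (wt%), c_out_final: final outlet concentration.
    """
    dx = L / (n - 1)
    dt = 0.4 * dx * dx / D                  # explicit stability: dt <= dx^2/(2D)
    c = [c0] * n
    t = 0.0
    while t < t_total:
        # outlet boundary falls linearly as the external melt degasses
        c[-1] = c0 + (c_out_final - c0) * min(t / t_total, 1.0)
        new = c[:]
        for i in range(1, n - 1):
            new[i] = c[i] + D * dt / dx**2 * (c[i+1] - 2*c[i] + c[i-1])
        new[0] = new[1]                     # no-flux at the embayment interior
        c = new
        t += dt
    return c
```

Running this produces a concentration profile decreasing toward the outlet bubble, the same qualitative gradient measured in the embayments; in the real inversion, the decompression rate is the parameter adjusted until modeled and measured profiles match for all volatiles simultaneously.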
Detecting incapacity of a quantum channel.
Smith, Graeme; Smolin, John A
2012-06-08
Using unreliable or noisy components for reliable communication requires error correction. But which noise processes can support information transmission, and which are too destructive? For classical systems any channel whose output depends on its input has the capacity for communication, but the situation is substantially more complicated in the quantum setting. We find a generic test for incapacity based on any suitable forbidden transformation--a protocol for communication with a channel passing our test would also allow one to implement the associated forbidden transformation. Our approach includes both known quantum incapacity tests--positive partial transposition and antidegradability (no cloning)--as special cases, putting them both on the same footing.
Copilot: Monitoring Embedded Systems
NASA Technical Reports Server (NTRS)
Pike, Lee; Wegmann, Nis; Niller, Sebastian; Goodloe, Alwyn
2012-01-01
Runtime verification (RV) is a natural fit for ultra-critical systems, where correctness is imperative. In ultra-critical systems, even if the software is fault-free, because of the inherent unreliability of commodity hardware and the adversity of operational environments, processing units (and their hosted software) are replicated, and fault-tolerant algorithms are used to compare the outputs. We investigate both software monitoring in distributed fault-tolerant systems, as well as implementing fault-tolerance mechanisms using RV techniques. We describe the Copilot language and compiler, specifically designed for generating monitors for distributed, hard real-time systems. We also describe two case-studies in which we generated Copilot monitors in avionics systems.
Cook, Thomas D; Steiner, Peter M
2010-03-01
In this article, we note the many ontological, epistemological, and methodological similarities between how Campbell and Rubin conceptualize causation. We then explore 3 differences in their written emphases about individual case matching in observational studies. We contend that (a) Campbell places greater emphasis than Rubin on the special role of pretest measures of outcome among matching variables; (b) Campbell is more explicitly concerned with unreliability in the covariates; and (c) for analyzing the outcome, only Rubin emphasizes the advantages of using propensity score over regression methods. To explore how well these 3 factors reduce bias, we reanalyze and review within-study comparisons that contrast experimental and statistically adjusted nonexperimental causal estimates from studies with the same target population and treatment content. In this context, the choice of covariates counts most for reducing selection bias, and the pretest usually plays a special role relative to all the other covariates considered singly. Unreliability in the covariates also influences bias reduction but by less. Furthermore, propensity score and regression methods produce comparable degrees of bias reduction, though these within-study comparisons may not have met the theoretically specified conditions most likely to produce differences due to analytic method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun Xudong; Hoeksema, J. Todd; Liu Yang
The solar active region photospheric magnetic field evolves rapidly during major eruptive events, suggesting appreciable feedback from the corona. Previous studies of these “magnetic imprints” are mostly based on line-of-sight-only or lower-cadence vector observations; a temporally resolved depiction of the vector field evolution is hitherto lacking. Here, we introduce the high-cadence (90 s or 135 s) vector magnetogram data set from the Helioseismic and Magnetic Imager, which is well suited for investigating the phenomenon. These observations allow quantitative characterization of the permanent, step-like changes that are most pronounced in the horizontal field component (B_h). A highly structured pattern emerges from analysis of an archetypical event, SOL2011-02-15T01:56, where B_h near the main polarity inversion line increases significantly during the earlier phase of the associated flare with a timescale of several minutes, while B_h in the periphery decreases at later times with smaller magnitudes and a slightly longer timescale. The data set also allows effective identification of the “magnetic transient” artifact, where enhanced flare emission alters the Stokes profiles and the inferred magnetic field becomes unreliable. Our results provide insights on the momentum processes in solar eruptions. The data set may also be useful to the study of sunquakes and data-driven modeling of the corona.
Concentration variance decay during magma mixing: a volcanic chronometer
Perugini, Diego; De Campos, Cristina P.; Petrelli, Maurizio; Dingwell, Donald B.
2015-01-01
The mixing of magmas is a common phenomenon in explosive eruptions. Concentration variance is a useful metric of this process and its decay (CVD) with time is an inevitable consequence during the progress of magma mixing. In order to calibrate this petrological/volcanological clock we have performed a time-series of high temperature experiments of magma mixing. The results of these experiments demonstrate that compositional variance decays exponentially with time. With this calibration the CVD rate (CVD-R) becomes a new geochronometer for the time lapse from initiation of mixing to eruption. The resultant novel technique is fully independent of the typically unknown advective history of mixing – a notorious uncertainty which plagues the application of many diffusional analyses of magmatic history. Using the calibrated CVD-R technique we have obtained mingling-to-eruption times for three explosive volcanic eruptions from Campi Flegrei (Italy) in the range of tens of minutes. These in turn imply ascent velocities of 5-8 meters per second. We anticipate the routine application of the CVD-R geochronometer to the eruptive products of active volcanoes in future in order to constrain typical “mixing to eruption” time lapses such that monitoring activities can be targeted at relevant timescales and signals during volcanic unrest. PMID:26387555
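The clock described above can be written down compactly: if compositional variance decays exponentially, var(t) = var0 · exp(-k·t), then a decay rate k calibrated from time-series mixing experiments converts an observed variance ratio into a mingling-to-eruption time. The sketch below illustrates the inversion only; the calibration constant is hypothetical, not the paper's experimentally determined CVD-R.

```python
import math

def mixing_time(var_obs, var0, k):
    """Time since mixing began, from the decayed concentration variance.

    Assumes exponential decay: var_obs = var0 * exp(-k * t).
    """
    return math.log(var0 / var_obs) / k

# Hypothetical calibration: compositional variance halves every ~8 minutes.
k = math.log(2) / (8 * 60)   # s^-1
```

With this (made-up) rate, a sample whose variance has dropped to one quarter of its initial value would record two half-lives, i.e. about 16 minutes of mixing before quenching, the same order as the tens of minutes reported for Campi Flegrei.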
Isotopic, Chemical and Mineralogical Investigations of Extraterrestrial Materials
NASA Technical Reports Server (NTRS)
Lugmair, G. W.
2003-01-01
During the grant period we have concentrated on the following main topics: 1. Enstatite meteorites and original heterogeneity of Mn-53 distribution in the solar nebula. We have completed our studies of the enstatite chondrites. 2. Processes of planetary differentiation. We have completed our study of silicate clasts from the mesosiderite Vaca Muerta and found that the global Mn/Cr fractionation event that established mantle source reservoirs on the parent body of the Vaca Muerta silicate clasts occurred approx. 2 Ma after a similar event on the howardite-eucrite-diogenite (HED) parent body. 3. Carbonaceous chondrites. Much effort has been devoted during the last three years to the investigation of this important class of meteorites. 4. Early solar system timescales. Based on the studies of the Mn-53 - Cr-53 isotope system in various meteorites and using results obtained with other isotope chronometers we constructed an absolute time-scale for events in the early solar system. 5. Unusual meteorites. We have studied the anomalous pallasite Eagle Station. 6. The chromium isotopic composition as a tracer for extraterrestrial material on Earth. Based on the observed difference in the Cr-53/Cr-52 ratios between Earth and the other solar system objects we developed a method for detecting cosmic materials on Earth using the Cr-53/Cr-52 ratio as a tracer.
Latourette, Matthew T; Siebert, James E; Barto, Robert J; Marable, Kenneth L; Muyepa, Anthony; Hammond, Colleen A; Potchen, Michael J; Kampondeni, Samuel D; Taylor, Terrie E
2011-08-01
As part of an NIH-funded study of malaria pathogenesis, a magnetic resonance (MR) imaging research facility was established in Blantyre, Malaŵi to enhance the clinical characterization of pediatric patients with cerebral malaria through application of neurological MR methods. The research program requires daily transmission of MR studies to Michigan State University (MSU) for clinical research interpretation and quantitative post-processing. An intercontinental satellite-based network was implemented for transmission of MR image data in Digital Imaging and Communications in Medicine (DICOM) format, research data collection, project communications, and remote systems administration. Satellite Internet service costs limited the bandwidth to symmetrical 384 kbit/s. DICOM routers deployed at both the Malaŵi MRI facility and MSU manage the end-to-end encrypted compressed data transmission. Network performance between DICOM routers was measured while transmitting both mixed clinical MR studies and synthetic studies. Effective network latency averaged 715 ms. Within a mix of clinical MR studies, the average transmission time for a 256 × 256 image was ~2.25 and ~6.25 s for a 512 × 512 image. Using synthetic studies of 1,000 duplicate images, the interquartile range for 256 × 256 images was [2.30, 2.36] s and [5.94, 6.05] s for 512 × 512 images. Transmission of clinical MRI studies between the DICOM routers averaged 9.35 images per minute, representing an effective channel utilization of ~137% of the 384-kbit/s satellite service as computed using uncompressed image file sizes (including the effects of image compression, protocol overhead, channel latency, etc.). Power unreliability was the primary cause of interrupted operations in the first year, including an outage exceeding 10 days.
New frontiers in coral geochronology: advancing the state of the art
NASA Astrophysics Data System (ADS)
Thompson, William G.
2010-05-01
New developments in mass spectrometry and a better understanding of open-system processes are ushering in a new era of precision and accuracy in coral geochronology. An effort is underway to develop a uniform set of reference materials and reporting standards to assure age comparability between laboratories and eliminate inter-laboratory and age-interpretation biases. PALSEA is a PAGES/IMAGES working group that aims to extract information about ice sheet response to temperature change by examining the history of sea level during past interglacials. As reef-building corals are one of the primary archives of past sea levels, the U-series coral dating community is well represented in this group. During workshop discussions, it became clear that further progress on the sea level problem requires engaging the coral dating community in a cooperative standardization effort. Improvements in analytical precision continue to extend the potential precision and range of the U-Th chronometer. As a result, assuring comparability of ages reported by different labs becomes a crucial issue. Ideally, all measurements should be traceable to the same set of reference standards. Unfortunately, internationally recognized standards are not currently available. A widely used U/Th uraninite standard, HU-1, is no longer suitable, as different aliquots have different isotope ratios and the assumption of radioactive equilibrium no longer appears valid when measured at current levels of precision. The time is ripe for the development of new reference standards. A strategy for their production and distribution has been initiated in collaboration with the NERC Isotope Geosciences Laboratory, UK, drawing on the experience of the EARTHTIME initiative (http://www.earth-time.org). Quaternary sea level data are presently scattered across the scientific literature with widely varying reporting formats, screening and correction criteria, and decay constants.
Stratigraphic information is often incomplete, and elevations are not tied to consistent benchmarks. It would be highly desirable to compile existing data in a uniform format that can be made available to the wider community, and to adopt a uniform set of standards for future data reporting. While best practices for sample screening and/or age correction are still keenly debated, reported ages depend heavily on assumptions about the 234U/238U history of seawater over the last 800 thousand years. A standard history of ocean 234U/238U, with associated error estimates, for use in screening and correction criteria would make ages reported by different labs more directly comparable. Data reduction and archiving software has been developed as part of the EARTHTIME project, and discussions are underway to adapt this software for the U-Th chronometer. Standardized reporting through data reduction and databasing software has great potential to make U-series dating of coral sea-level indicators more useful and accessible to the wider paleoclimate community.
NASA Astrophysics Data System (ADS)
Ke, Jyh-Bin; Lee, Wen-Chiung; Wang, Kuo-Hsiung
2007-07-01
This paper presents the reliability and sensitivity analysis of a system with M primary units, W warm standby units, and R unreliable service stations, in which warm standby units switching to the primary state may fail. Failure times of primary and warm standby units are assumed to have exponential distributions, and service times of the failed units are exponentially distributed. In addition, breakdown times and repair times of the service stations also follow exponential distributions. Expressions for the system reliability, R_Y(t), and the mean time to system failure, MTTF, are derived. Sensitivity and relative sensitivity analyses of the system reliability and the mean time to failure with respect to the system parameters are also investigated.
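A heavily simplified Monte Carlo sketch illustrates the MTTF quantity (this omits the paper's repair process, switching failures, and unreliable service stations; all rates are hypothetical). With exponential lifetimes, each event either consumes a standby or, when none remain, brings the system down; for this reduced chain the exact MTTF is the sum of the expected sojourn times, MTTF = Σ_{s=0}^{W} 1/(Mλ + sα).

```python
import random

def simulate_mttf(M=3, W=2, lam=1.0, alpha=0.3, runs=20000, seed=1):
    """Monte Carlo MTTF for M primaries plus W warm standbys, no repair.

    lam: primary failure rate; alpha: warm-standby failure rate.
    System fails when a primary fails and no standby is left to switch in.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        t, standbys = 0.0, W
        while True:
            rate = M * lam + standbys * alpha     # total exponential event rate
            t += rng.expovariate(rate)
            primary_failed = rng.random() < M * lam / rate
            if primary_failed and standbys == 0:
                break                             # no spare left: system failure
            standbys -= 1                         # standby failed, or one promoted
        total += t
    return total / runs
```

The simulated mean agrees with the closed-form sum above, and adding standby units lengthens the MTTF, the qualitative behaviour the paper's sensitivity analysis quantifies.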
Extrinsic and intrinsic motivation at 30: Unresolved scientific issues.
Reiss, Steven
2005-01-01
The undermining effect of extrinsic reward on intrinsic motivation remains unproven. The key unresolved issues are construct invalidity (all four definitions are unproved and two are illogical); measurement unreliability (the free-choice measure requires unreliable, subjective judgments to infer intrinsic motivation); inadequate experimental controls (negative affect and novelty, not cognitive evaluation, may explain "undermining" effects); and biased metareviews (studies with possible floor effects excluded, but those with possible ceiling effects included). Perhaps the greatest error with the undermining theory, however, is that it does not adequately recognize the multifaceted nature of intrinsic motivation (Reiss, 2004a). Advice to limit the use of applied behavior analysis based on "hidden" undermining effects is ideologically inspired and is unsupported by credible scientific evidence.
Biological Conditions and Economic Development: Nineteenth-Century Stature on the U.S. Great Plains.
Carson, Scott Alan
2015-06-01
Average stature is now a well-accepted measure of material and economic well-being in development studies when traditional measures are sparse or unreliable, but little work has been done on the biological conditions for individuals on the nineteenth-century U.S. Great Plains. Records of 14,427 inmates from the Nebraska state prison are used to examine the relationship between stature and economic conditions. Statures of both black and white prisoners in Nebraska increased through time, indicating that biological conditions improved as Nebraska's output market and agricultural sectors developed. The effect of rural environments on stature is illustrated by the fact that farm laborers were taller than common laborers. Urbanization and industrialization had significant impacts on stature, and proximity to trade routes and waterways was inversely related to stature.
Watershed Models for Decision Support for Inflows to Potholes Reservoir, Washington
Mastin, Mark C.
2009-01-01
A set of watershed models for four basins (Crab Creek, Rocky Ford Creek, Rocky Coulee, and Lind Coulee), draining into Potholes Reservoir in east-central Washington, was developed as part of a decision support system to aid the U.S. Department of the Interior, Bureau of Reclamation, in managing water resources in east-central Washington State. The project is part of the U.S. Geological Survey and Bureau of Reclamation collaborative Watershed and River Systems Management Program. A conceptual model of hydrology is outlined for the study area that highlights the significant processes that are important to accurately simulate discharge under a wide range of conditions. The conceptual model identified the following factors as significant for accurate discharge simulations: (1) influence of frozen ground on peak discharge, (2) evaporation and ground-water flow as major pathways in the system, (3) channel losses, and (4) influence of irrigation practices on reducing or increasing discharge. The Modular Modeling System was used to create a watershed model for the four study basins by combining standard Precipitation Runoff Modeling System modules with modified modules from a previous study and newly modified modules. The model proved unreliable in simulating peak-flow discharge because the index used to track frozen ground conditions was not reliable. Mean monthly and mean annual discharges were more reliable when simulated. Data from seven USGS streamflow-gaging stations were used to compare with simulated discharge for model calibration and evaluation. Mean annual differences between simulated and observed discharge varied from 1.2 to 13.8 percent for all stations used in the comparisons except one station on a regional ground-water discharge stream. 
Two thirds of the mean monthly percent differences between the simulated mean and the observed mean discharge for these six stations were between -20 and 240 percent, or in absolute terms, between -0.8 and 11 cubic feet per second. A graphical user interface was developed for the user to easily run the model, make runoff forecasts, and evaluate the results. The models, however, are not reliable for managing short-term operations because of their demonstrated inability to match individual storm peaks and individual monthly discharge values. Short-term forecasting may be improved with real-time monitoring of the extent of frozen ground and the snow-water equivalent in the basin. Despite the models' unreliability for short-term runoff forecasts, they are useful in providing long-term, time-series discharge data where no observed data exist.
Integration of nanoscale memristor synapses in neuromorphic computing architectures
NASA Astrophysics Data System (ADS)
Indiveri, Giacomo; Linares-Barranco, Bernabé; Legenstein, Robert; Deligeorgis, George; Prodromakis, Themistoklis
2013-09-01
Conventional neuro-computing architectures and artificial neural networks have often been developed with no or loose connections to neuroscience. As a consequence, they have largely ignored key features of biological neural processing systems, such as their extremely low power consumption or their ability to carry out robust and efficient computation using massively parallel arrays of limited-precision, highly variable, and unreliable components. Recent developments in nano-technologies are making available extremely compact and low-power, but also variable and unreliable, solid-state devices that can potentially extend the offerings of prevailing CMOS technologies. In particular, memristors are regarded as a promising solution for modeling key features of biological synapses due to their nanoscale dimensions, their capacity to store multiple bits of information per element, and the low energy required to write distinct states. In this paper, we first review the neuro- and neuromorphic computing approaches that can best exploit the properties of memristive nanoscale devices, and then propose a novel hybrid memristor-CMOS neuromorphic circuit which represents a radical departure from conventional neuro-computing approaches, as it uses memristors to directly emulate the biophysics and temporal dynamics of real synapses. We point out the differences between the use of memristors in conventional neuro-computing architectures and in the hybrid memristor-CMOS circuit proposed, and argue that this circuit represents an ideal building block for implementing brain-inspired probabilistic computing paradigms that are robust to variability and fault tolerant by design.
Pin, Christian; Gannoun, Abdelmouhcine
2017-02-21
A fast and efficient sample preparation method in view of isotope ratio measurements is described, allowing the separation of 11 elements involved, either as "parent" or as "daughter" isotopes, in six radiogenic isotope systems used as chronometers and tracers in earth, planetary, and environmental sciences. The protocol is based on small extraction chromatographic columns, used either alone or in tandem, through which a single nitric acid solution is passed, without any intervening evaporation step. The columns use commercially available extraction resins (Sr resin, TRU resin, Ln resin, RE resin, and again Ln resin for isolating Sr and Pb, LREE then La-Ce-Nd-Sm, Lu(Yb), and Hf, Th, and U, respectively) along with an additional, in-house prepared resin for separating Rb. A simplified scheme is proposed for samples requiring the separation of Sr, Pb, Nd, and Hf only. Adverse effects of troublesome major elements (Fe3+, Ti) are circumvented by masking with ascorbic acid and hydrofluoric acid, respectively. Typical recoveries in the 85-95% range are achieved, with procedural blanks of 10-100 pg, negligible with regard to the amounts of analytes processed. The fractions separated are suitable for high precision isotope ratio measurements by TIMS or MC-ICP-MS, as demonstrated by the repeat analyses of several international reference materials of basaltic composition for 87Sr/86Sr, 208,207,206Pb/204Pb, 143Nd/144Nd, 176Hf/177Hf, and 230Th/232Th. Concentration data could be obtained by spiking and equilibrating the sample with appropriate isotopic tracers before the onset of the separation process and, finally, measuring the isotope ratios modified by the isotope dilution process.
Stein, H.J.; Sundblad, K.; Markey, R.J.; Morgan, J.W.; Motuza, G.
1998-01-01
Seven 187Re-187Os ages were determined for molybdenite and pyrite samples from two well-dated Precambrian intrusions in Fennoscandia to examine the sustainability of the Re-Os chronometer in a metamorphic and metasomatic setting. Using a new 187Re decay constant (1.666 × 10⁻¹¹ y⁻¹) with a much improved uncertainty (±0.31%), we determined replicate Re-Os ages for molybdenite and pyrite from the Kuittila and Kivisuo prospects in easternmost Finland and for molybdenite from the Kabeliai prospect in southernmost Lithuania. These two localities contain some of the oldest and youngest plutonic activity in Fennoscandia and are associated with newly discovered economic Au mineralization (Ilomantsi, Finland) and a Cu-Mo prospect (Kabeliai, Lithuania). Two Re-Os ages for vein-hosted Kabeliai molybdenite average 1486 ± 5 Ma, in excellent agreement with a 1505 ± 11 Ma U-Pb zircon age for the hosting Kabeliai granite pluton. The slightly younger age suggests the introduction of Cu-Mo mineralization by a later phase of the Kabeliai magmatic system. Mean Re-Os ages of 2778 ± 8 Ma and 2781 ± 8 Ma for Kuittila and Kivisuo molybdenites, respectively, are in reasonable agreement with a 2753 ± 5 Ma weighted mean U-Pb zircon age for hosting Kuittila tonalite. These Re-Os ages agree well with less precise ages of 2789 ± 290 Ma for a Rb-Sr whole-rock isochron and 2771 ± 75 Ma for the average of six Sm-Nd T(DM) model ages for Kuittila tonalite. Three Re-Os analyses of a single pyrite mineral separate, from the same sample of Kuittila pluton that yielded a molybdenite separate, provide individual model ages of 2710 ± 27, 2777 ± 28, and 2830 ± 28 Ma (Re = 17.4, 12.1, and 8.4 ppb, respectively), with a mean value of 2770 ± 120 Ma in agreement with the Kuittila molybdenite age. The Re and 187Os abundances in these three pyrite splits are highly correlated (r = 0.9994), and provide a 187Re-187Os isochron age of 2607 ± 47 Ma with an intercept of 21 ppt 187Os (MSWD = 1.1).
It appears that the Re-Os isotopic system in pyrite has been reset on the millimeter scale and that the 21 ppt 187Os intercept reflects the in situ decay of 187Re during the ~160 to 170 m.y. interval from ~2778 Ma (time of molybdenite ± pyrite deposition) to ~2607 Ma (time of pyrite resetting). When the Re-Os data for molybdenites from the nearby Kivisuo prospect are plotted together with the Kuittila molybdenite and pyrite data, a well-constrained five-point isochron with an age of 2780 ± 8 Ma and a 187Os intercept (-2.4 ± 3.8 ppt) of essentially zero results (MSWD = 1.5). We suggest that the pyrite isochron age records a regional metamorphic and/or hydrothermal event, possibly the time of Au mineralization. A proposed Re-Os age of ~2607 Ma for Au mineralization is in good agreement with radiometric ages by other methods that address the timing of Archean Au mineralization in deposits worldwide (so-called 'late Au model'). Molybdenite, in contrast, provides a robust Re-Os chronometer, retaining its original formation age of ~2780 Ma, despite subsequent metamorphic disturbances in Archean and Proterozoic time.
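The model-age arithmetic behind these molybdenite dates is compact. As an illustrative sketch (the function name is ours; the decay constant is the value quoted in the abstract), a Re-Os model age assumes all 187Os is radiogenic and inverts the decay equation:

```python
import math

LAMBDA_RE187 = 1.666e-11  # 187Re decay constant from the abstract, yr^-1

def re_os_model_age(os187_re187: float) -> float:
    """Re-Os model age in years from the radiogenic 187Os/187Re atom ratio,
    assuming no initial radiogenic Os: t = ln(1 + 187Os*/187Re) / lambda."""
    return math.log(1.0 + os187_re187) / LAMBDA_RE187

# Round trip: a ~2780 Ma molybdenite implies 187Os*/187Re = exp(lambda*t) - 1
ratio = math.exp(LAMBDA_RE187 * 2.780e9) - 1.0
age_ma = re_os_model_age(ratio) / 1e6
print(f"{age_ma:.0f} Ma")  # 2780 Ma
```

An isochron age, by contrast, is the slope of 187Os versus 187Re across several splits, which is why the pyrite isochron (2607 Ma) can differ from the individual pyrite model ages.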
18 CFR 806.23 - Standards for water withdrawals.
Code of Federal Regulations, 2013 CFR
2013-04-01
... of groundwater or stream flow levels; rendering competing supplies unreliable; affecting other water... reasonably foreseeable water needs from available groundwater or surface water without limitation: (i...
NASA Astrophysics Data System (ADS)
Tang, Wenqing; Yueh, Simon H.; Fore, Alexander G.; Hayashi, Akiko
2014-09-01
We validate sea surface salinity (SSS) retrieved from the Aquarius instrument on the SAC-D satellite against in situ measurements by Argo floats and moored buoy arrays. We assess the error structure of three Aquarius SSS products: the standard product processed by the Aquarius Data Processing System (ADPS) and two data sets produced at the Jet Propulsion Laboratory (JPL): the Combined Active-Passive algorithm without and with rain correction (CAP and CAP_RC, respectively). We examine the effect of various filters used to prevent unreliable point retrievals, such as those affected by land or ice contamination, radio frequency interference (RFI), or cold water, from entering Level 3 averaging. Our analyses show that Aquarius SSS agrees well with Argo in a monthly average sense between 40°S and 40°N, except in the Eastern Pacific Fresh Pool and the Amazon River outflow. Buoy data within these regions show excellent agreement with Aquarius but have discrepancies with the Argo gridded products. Possible reasons include strong near-surface stratification and Argo sampling problems in regions with significant western boundary currents. We observe a large root-mean-square (RMS) difference and a systematic negative bias between ADPS and Argo in the tropical Indian Ocean and along the Southern Pacific Convergence Zone. Excluding these regions removes the suspicious seasonal peak in the monthly RMS difference between the Aquarius SSS products and Argo. Between 40°S and 40°N, the RMS difference for CAP is less than 0.22 PSU for all 28 months, CAP_RC has essentially met the monthly 0.2 PSU accuracy requirement, and that for ADPS fluctuates between 0.22 and 0.3 PSU.
Reisner, Andrew T; Chen, Liangyou; McKenna, Thomas M; Reifman, Jaques
2008-10-01
Prehospital severity scores can be used in routine prehospital care, mass casualty care, and military triage. If computers could reliably calculate clinical scores, new clinical and research methodologies would be possible. One obstacle is that vital signs measured automatically can be unreliable. We hypothesized that Signal Quality Indices (SQIs), computer algorithms that differentiate between reliable and unreliable monitored physiologic data, could improve the predictive power of computer-calculated scores. In a retrospective analysis of trauma casualties transported by air ambulance, we computed the Triage Revised Trauma Score (RTS) from archived travel monitor data. We compared the areas under the curve (AUCs) of receiver operating characteristic curves for prediction of mortality and red blood cell transfusion for 187 subjects with comparable quantities of good-quality and poor-quality data. Vital signs deemed reliable by SQIs led to significantly more discriminatory severity scores than vital signs deemed unreliable. We also compared automatically computed RTS (using the SQIs) versus RTS computed from vital signs documented by medics. For the subjects in whom the SQI algorithms identified 15 consecutive seconds of reliable vital signs data (n = 350), the automatically computed scores' AUCs were the same as the medic-based scores' AUCs. Using the Prehospital Index in place of RTS led to very similar results, corroborating our findings. SQI algorithms improve automatically computed severity scores, and automatically computed scores using SQIs are equivalent to medic-based scores.
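For context, the triage form of the Revised Trauma Score referenced above is conventionally computed by coding each of Glasgow Coma Scale (GCS), systolic blood pressure (SBP), and respiratory rate (RR) onto a 0-4 scale and summing, giving a 0-12 score. A minimal sketch using the standard coded intervals (helper names are ours; this is the generic T-RTS, not necessarily the authors' exact implementation):

```python
def code_gcs(gcs: int) -> int:
    """Map Glasgow Coma Scale (3-15) to the 0-4 RTS coded value."""
    if gcs >= 13: return 4
    if gcs >= 9:  return 3
    if gcs >= 6:  return 2
    if gcs >= 4:  return 1
    return 0

def code_sbp(sbp: float) -> int:
    """Map systolic blood pressure (mmHg) to the 0-4 RTS coded value."""
    if sbp > 89:  return 4
    if sbp >= 76: return 3
    if sbp >= 50: return 2
    if sbp >= 1:  return 1
    return 0

def code_rr(rr: float) -> int:
    """Map respiratory rate (breaths/min) to the 0-4 RTS coded value."""
    if 10 <= rr <= 29: return 4
    if rr > 29:        return 3
    if rr >= 6:        return 2
    if rr >= 1:        return 1
    return 0

def triage_rts(gcs: int, sbp: float, rr: float) -> int:
    """Triage Revised Trauma Score: sum of the three coded values (0-12)."""
    return code_gcs(gcs) + code_sbp(sbp) + code_rr(rr)

print(triage_rts(15, 120, 16))  # 12: unremarkable vital signs
print(triage_rts(8, 70, 35))    # 7: coded 2 + 2 + 3
```

The SQI layer described in the abstract sits upstream of this calculation, deciding which monitored vital-sign samples are trustworthy enough to feed into the score.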
A portable W-band radar system for enhancement of infrared vision in fire fighting operations
NASA Astrophysics Data System (ADS)
Klenner, Mathias; Zech, Christian; Hülsmann, Axel; Kühn, Jutta; Schlechtweg, Michael; Hahmann, Konstantin; Kleiner, Bernhard; Ulrich, Michael; Ambacher, Oliver
2016-10-01
In this paper, we present a millimeter-wave radar system which will enhance the performance of infrared cameras used for fire-fighting applications. The radar module is compact and lightweight, so the system can be combined with inertial sensors and integrated into a hand-held infrared camera. This allows for precise distance measurements in harsh environmental conditions, such as tunnel or industrial fires, where optical sensors are unreliable or fail. We discuss the design of the RF front-end, the antenna, and a quasi-optical lens for beam shaping, as well as signal processing, and demonstrate the performance of the system by in situ measurements in a smoke-filled environment.
Faraday Rotation Measurement with the SMAP Radiometer
NASA Technical Reports Server (NTRS)
Le Vine, D. M.; Abraham, S.
2016-01-01
Faraday rotation is an issue that needs to be taken into account in remote sensing of parameters such as soil moisture and ocean salinity at L-band. This is especially important for SMAP because Faraday rotation varies with azimuth around the conical scan. SMAP retrieves Faraday rotation in situ using the ratio of the third and second Stokes parameters, a procedure demonstrated successfully by Aquarius. This manuscript reports the performance of this algorithm on SMAP. Over ocean, the process works reasonably well and results compare favorably with expected values. But over land, the inhomogeneous nature of the scene results in much noisier, and in some cases unreliable, estimates of Faraday rotation.
Beyond Open Big Data: Addressing Unreliable Research
Moseley, Edward T; Hsu, Douglas J; Stone, David J
2014-01-01
The National Institutes of Health invests US $30.9 billion annually in medical research. However, the subsequent impact of this research output on society and the economy is amplified dramatically as a result of the actual medical treatments, biomedical innovations, and various commercial enterprises that emanate from and depend on these findings. It is therefore of great concern to discover that much published research is unreliable. We propose extending the open data concept to the culture of the scientific research community. By dialing down unproductive features of secrecy and competition, while ramping up cooperation and transparency, we make a case that what is published would then be less susceptible to the sometimes corrupting and confounding pressures to be first or journalistically attractive, which can compromise the more fundamental need to be robustly correct. PMID:25405277
Exposure assessment of process-related contaminants in food by biomarker monitoring
Rietjens, Ivonne M. C. M.; Dussort, P.; Gunther, Helmut; ...
2018-01-04
Exposure assessment is a fundamental part of the risk assessment paradigm, but can often present a number of challenges and uncertainties. This is especially the case for process contaminants formed during the processing, e.g. heating, of food, since they are in part highly reactive and/or volatile, thus making exposure assessment by analysing contents in food unreliable. New approaches are therefore required to accurately assess consumer exposure and thus better inform the risk assessment. Such novel approaches may include the use of biomarkers, physiologically based kinetic (PBK) modelling-facilitated reverse dosimetry, and/or duplicate diet studies. This review focuses on the state of the art with respect to the use of biomarkers of exposure for the process contaminants acrylamide, 3-MCPD esters, glycidyl esters, furan and acrolein. From the overview presented, it becomes clear that the field of assessing human exposure to process-related contaminants in food by biomarker monitoring is promising and strongly developing. The current state of the art as well as the existing data gaps and challenges for the future were defined. They include (1) using PBK modelling and duplicate diet studies to establish, preferably in humans, correlations between external exposure and biomarkers; (2) elucidation of the possible endogenous formation of the process-related contaminants and the resulting biomarker levels; (3) the influence of inter-individual variations and how to include that in the biomarker-based exposure predictions; (4) the correction for confounding factors; (5) the value of the different biomarkers in relation to exposure scenarios and risk assessment; and (6) the possibilities of novel methodologies. In spite of these challenges, it can be concluded that biomarker-based exposure assessment provides a unique opportunity to more accurately assess consumer exposure to process-related contaminants in food and thus to better inform risk assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Humberto E. Garcia
This paper illustrates the safeguards benefits that process monitoring (PM) can have as a diversion deterrent and as a complementary safeguards measure to nuclear material accountancy (NMA). Whereas the objective of NMA-based methods is often to statistically evaluate materials unaccounted for (MUF), computed by solving a mass balance equation for a material balance area (MBA) at every material balance period (MBP), a particular objective for a PM-based approach may be to statistically infer and evaluate anomalies unaccounted for (AUF) that may have occurred within a MBP. Although possibly indicative of proliferation-driven activities, the detection and tracking of anomaly patterns is not trivial because some executed events may be unobservable, or observed less reliably than others. The proposed similarity between NMA- and PM-based approaches is important because performance metrics utilized for evaluating NMA-based methods, such as detection probability (DP) and false alarm probability (FAP), can also be applied to assessing PM-based safeguards solutions. To this end, AUF count estimates can be translated into significant quantity (SQ) equivalents that may have been diverted within a given MBP. A diversion alarm is reported if this mass estimate is greater than or equal to the selected alarm level (AL), appropriately chosen to optimize DP and FAP based on the particular characteristics of the monitored MBA, the sensors utilized, and the data processing method employed for integrating and analyzing collected measurements. To illustrate the application of the proposed PM approach, a protracted diversion of Pu in a waste stream, via incomplete fuel dissolution in a dissolver unit operation, was selected, as this diversion scenario is considered problematic for detection using NMA-based methods alone.
Results demonstrate the benefits of conducting PM under a system-centric strategy that utilizes data collected from a system of sensors and effectively exploits known characterizations of sensors and facility operations in order to significantly improve anomaly detection, reduce false alarms, and enhance assessment robustness under unreliable, partial sensor information.
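The DP/FAP trade-off described above can be illustrated with a toy Monte Carlo in which the per-MBP diverted-mass estimate carries Gaussian measurement error and an alarm fires at or above the alarm level AL. All numbers below are hypothetical and not from the paper:

```python
import random

def diversion_alarm(mass_estimate_kg: float, alarm_level_kg: float) -> bool:
    """Alarm decision rule from the abstract: alarm iff estimate >= AL."""
    return mass_estimate_kg >= alarm_level_kg

def estimate_dp_fap(true_diversion_kg: float, noise_sd_kg: float,
                    alarm_level_kg: float, trials: int = 50_000,
                    seed: int = 0) -> tuple:
    """Monte Carlo estimates of detection probability (diversion present)
    and false alarm probability (no diversion), assuming Gaussian error
    on the per-MBP mass estimate."""
    rng = random.Random(seed)
    dp = sum(diversion_alarm(true_diversion_kg + rng.gauss(0.0, noise_sd_kg),
                             alarm_level_kg) for _ in range(trials)) / trials
    fap = sum(diversion_alarm(rng.gauss(0.0, noise_sd_kg),
                              alarm_level_kg) for _ in range(trials)) / trials
    return dp, fap

dp, fap = estimate_dp_fap(true_diversion_kg=8.0, noise_sd_kg=2.0,
                          alarm_level_kg=4.0)
print(f"DP ~ {dp:.2f}, FAP ~ {fap:.2f}")
```

Raising AL lowers FAP at the cost of DP; the abstract's point is that the same two metrics score both NMA- and PM-based schemes.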
NASA Astrophysics Data System (ADS)
Kavetski, Dmitri; Clark, Martyn P.
2010-10-01
Despite the widespread use of conceptual hydrological models in environmental research and operations, they remain frequently implemented using numerically unreliable methods. This paper considers the impact of the time stepping scheme on model analysis (sensitivity analysis, parameter optimization, and Markov chain Monte Carlo-based uncertainty estimation) and prediction. It builds on the companion paper (Clark and Kavetski, 2010), which focused on numerical accuracy, fidelity, and computational efficiency. Empirical and theoretical analysis of eight distinct time stepping schemes for six different hydrological models in 13 diverse basins demonstrates several critical conclusions. (1) Unreliable time stepping schemes, in particular fixed-step explicit methods, suffer from troublesome numerical artifacts that severely deform the objective function of the model. These deformations are not rare isolated instances but can arise in any model structure, in any catchment, and under common hydroclimatic conditions. (2) Sensitivity analysis can be severely contaminated by numerical errors, often to the extent that it becomes dominated by the sensitivity of truncation errors rather than the model equations. (3) Robust time stepping schemes generally produce "better behaved" objective functions, free of spurious local optima, and with sufficient numerical continuity to permit parameter optimization using efficient quasi-Newton methods. When implemented within a multistart framework, modern Newton-type optimizers are robust even when started far from the optima and provide valuable diagnostic insights not directly available from evolutionary global optimizers. (4) Unreliable time stepping schemes lead to inconsistent and biased inferences of the model parameters and internal states.
(5) Even when interactions between hydrological parameters and numerical errors provide "the right result for the wrong reason" and the calibrated model performance appears adequate, unreliable time stepping schemes make the model unnecessarily fragile in predictive mode, undermining validation assessments and operational use. Erroneous or misleading conclusions of model analysis and prediction arising from numerical artifacts in hydrological models are intolerable, especially given that robust numerics are accepted as mainstream in other areas of science and engineering. We hope that the vivid empirical findings will encourage the conceptual hydrological community to close its Pandora's box of numerical problems, paving the way for more meaningful model application and interpretation.
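The kind of artifact attributed above to fixed-step explicit methods can be reproduced on a one-parameter linear reservoir, dS/dt = P - kS: forward Euler oscillates and diverges once k·Δt exceeds 2, while backward Euler is unconditionally stable. This is a toy sketch, not one of the models analyzed in the paper:

```python
def simulate_reservoir(k: float, dt: float, n_steps: int,
                       p: float = 0.0, s0: float = 100.0,
                       implicit: bool = False) -> float:
    """Integrate the linear reservoir dS/dt = P - k*S with fixed-step Euler.
    Forward Euler amplifies by (1 - k*dt) per step, so |1 - k*dt| > 1
    (k*dt > 2) diverges; backward Euler damps by 1/(1 + k*dt), always stable."""
    s = s0
    for _ in range(n_steps):
        if implicit:
            s = (s + dt * p) / (1.0 + dt * k)   # backward Euler, solved exactly
        else:
            s = s + dt * (p - k * s)            # forward Euler
    return s

# k*dt = 2.5: explicit storage oscillates with growing magnitude,
# implicit storage decays smoothly toward the true solution (0)
print(abs(simulate_reservoir(k=0.5, dt=5.0, n_steps=50)))                 # huge
print(abs(simulate_reservoir(k=0.5, dt=5.0, n_steps=50, implicit=True)))  # ~0
```

Because such blow-ups depend discontinuously on k and Δt, they roughen the objective function over parameter space, which is the mechanism behind the paper's conclusions (1)-(4).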
Representation and re-presentation in litigation science.
Jasanoff, Sheila
2008-01-01
Federal appellate courts have devised several criteria to help judges distinguish between reliable and unreliable scientific evidence. The best known are the U.S. Supreme Court's criteria offered in 1993 in Daubert v. Merrell Dow Pharmaceuticals, Inc. This article focuses on another criterion, offered by the Ninth Circuit Court of Appeals, that instructs judges to assign lower credibility to "litigation science" than to science generated before litigation. In this article I argue that the criterion-based approach to judicial screening of scientific evidence is deeply flawed. That approach buys into the faulty premise that there are external criteria, lying outside the legal process, by which judges can distinguish between good and bad science. It erroneously assumes that judges can ascertain the appropriate criteria and objectively apply them to challenged evidence before litigation unfolds, and before methodological disputes are sorted out during that process. Judicial screening does not take into account the dynamics of litigation itself, including gaming by the parties and framing by judges, as constitutive factors in the production and representation of knowledge. What is admitted through judicial screening, in other words, is not precisely what a jury would see anyway. Courts are sites of repeated re-representations of scientific knowledge. In sum, the screening approach fails to take account of the wealth of existing scholarship on the production and validation of scientific facts. An unreflective application of that approach thus puts courts at risk of relying upon a "junk science" of the nature of scientific knowledge.
Carter, Joanne L; Lane, Catherine E; Fan, Stanley L; Lamb, Edmund J
2011-11-01
Measuring glomerular filtration rate (GFR) is an important assessment in peritoneal dialysis patients. In clinical practice, it is commonly measured by calculating the mean of the urinary clearances of urea and creatinine (GFR(UrCl)), but this process is time-consuming and unreliable. We wished to compare several estimates of GFR, including residual GFR estimated from cystatin C (GFR(CysC)) using a published equation (Hoek), GFR(UrCl) and (51)Cr-ethylenediaminetetraacetic acid (EDTA) clearance, in peritoneal dialysis patients. GFR(CysC), GFR(UrCl) and (51)Cr-EDTA clearance were measured in 28 patients undergoing peritoneal dialysis in a single dialysis unit. GFR(CysC) was related to GFR(UrCl) (Spearman's rank correlation coefficient r(s) = 0.44; P = 0.0185) and to (51)Cr-EDTA clearance (r(s) = 0.48; P = 0.0099). GFR(CysC) values were significantly (P = 0.0077) lower than (51)Cr-EDTA clearance results (mean bias -19.7%). However, GFR(CysC) did not differ significantly (P > 0.05) from GFR(UrCl). GFR(CysC) is related to GFR(UrCl) but has a significant negative bias against (51)Cr-EDTA. Given the known limitations of (51)Cr-EDTA in estimating GFR in renal failure, this study provides additional validation suggesting that cystatin C-estimated rGFR (GFR(CysC)) gives a reasonable estimation of GFR without the clinical problems associated with 24 h urine collections.
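As a sketch of the clearance arithmetic involved (C = U·V/(P·T), averaged over urea and creatinine), together with the Hoek cystatin C equation as it is commonly quoted (GFR = -4.32 + 80.35/CysC). All sample values below are hypothetical, and the helper names are ours:

```python
def clearance_ml_min(u_conc: float, urine_vol_ml: float,
                     p_conc: float, minutes: float = 1440.0) -> float:
    """Solute clearance from a timed urine collection: C = U*V / (P*T).
    U and P must share the same concentration units (they cancel)."""
    return (u_conc * urine_vol_ml) / (p_conc * minutes)

def gfr_urcl(urea_u: float, crea_u: float, urine_vol_ml: float,
             urea_p: float, crea_p: float, minutes: float = 1440.0) -> float:
    """Residual GFR as the mean of urea and creatinine urinary clearances,
    the routine practice described in the abstract."""
    return 0.5 * (clearance_ml_min(urea_u, urine_vol_ml, urea_p, minutes)
                  + clearance_ml_min(crea_u, urine_vol_ml, crea_p, minutes))

def gfr_cysc_hoek(cystatin_c_mg_l: float) -> float:
    """Hoek equation for cystatin C-estimated GFR, commonly quoted as
    GFR = -4.32 + 80.35 / CysC (mL/min/1.73 m^2)."""
    return -4.32 + 80.35 / cystatin_c_mg_l

# Hypothetical 24 h collection of 800 mL:
# urea 120 (urine) vs 20 (plasma) mmol/L; creatinine 4.0 vs 0.5 mmol/L
print(round(gfr_urcl(120, 4.0, 800, 20, 0.5), 2))  # 3.89 mL/min
print(round(gfr_cysc_hoek(1.8), 1))                # 40.3
```

The 24 h collection term T (1440 min) is where the practical unreliability noted in the abstract enters: incomplete collections bias V and hence the clearance.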
Guide to luminescence dating techniques and their application for paleoseismic research
Gray, Harrison J.; Mahan, Shannon; Rittenour, Tammy M.; Nelson, Michelle Summa; Lund, William R.
2015-01-01
Over the past 25 years, luminescence dating has become a key tool for dating sediments of interest in paleoseismic research. The data obtained from luminescence dating have been used to determine the timing of fault displacement, calculate slip rates, and estimate earthquake recurrence intervals. The flexibility of luminescence dating makes it a key complement to other chronometers such as radiocarbon or cosmogenic nuclides. Careful sampling and correct selection of sample sites exert two of the strongest controls on obtaining an accurate luminescence age. Factors such as partial bleaching and post-depositional mixing should be avoided during sampling, and special measures may be needed to help correct for associated problems. Like all geochronologic techniques, context is necessary for interpreting and calculating luminescence results, and this can be achieved by supplying participating labs with associated trench logs, photos, and stratigraphic locations of sample sites.
NASA Technical Reports Server (NTRS)
Kelly, W. R.; Wasserburg, G. J.
1978-01-01
Measurements of the concentration and isotopic composition of Ag and Pd in the Santa Clara iron meteorite suggest that in situ decay of Pd-107 occurred in the meteorite or its parent body. The initial solar ratio of Pd-107/Pd-110 is estimated from the observed ratio of excess Ag-107/Pd-110, and the value of the Pd ratio is incompatible with an interval of approximately 100,000,000 years between the end of nucleosynthesis and the formation of planetary objects but is compatible with a later injection of material. The inferred existence of Pd-107 and Al-26 indicates that the late injection included freshly synthesized material of both intermediate and low atomic weight on a similar time scale. The significance of the Pd-107/Ag-107 chronometer is considered.
Effects of neutrino mass hierarchies on dynamical dark energy models
NASA Astrophysics Data System (ADS)
Yang, Weiqiang; Nunes, Rafael C.; Pan, Supriya; Mota, David F.
2017-05-01
We investigate how three different possibilities of neutrino mass hierarchies, namely normal, inverted, and degenerate, can affect the observational constraints on three well-known dynamical dark energy models, namely the Chevallier-Polarski-Linder, logarithmic, and the Jassal-Bagla-Padmanabhan parametrizations. In order to impose the observational constraints on the models, we performed a robust analysis using Planck 2015 temperature and polarization data, supernovae type Ia from the joint light curve analysis, baryon acoustic oscillation distance measurements, redshift space distortion characterized by f (z )σ8(z ) data, weak gravitational lensing data from the Canada-France-Hawaii Telescope Lensing Survey, and cosmic chronometer data plus the local value of the Hubble parameter. We find that different neutrino mass hierarchies return similar fits on almost all model parameters and mildly change the dynamical dark energy properties.
Evaluation of the 129I Half-Life Value Through Analyses of Primitive Meteorites
NASA Astrophysics Data System (ADS)
Pravdivtseva, Olga; Meshik, Alex; Hohenberg, Charles M.
The preserved record of decay of now-extinct 129I into 129Xe forms the basis of the I-Xe chronometer. Comparison of high-precision I-Xe and Pb-Pb ages of chondrules and pure mineral phases separated from eight meteorites suggests a 17.5-14.6 Ma range for the 129I half-life, assuming that the 235U and 238U half-lives are correct. The mean value of 16 Ma indicates that the 15.7 Ma half-life of 129I used here for the I-Xe age calculations is most probably correct. Since the 129I half-life value only affects the relative I-Xe ages, which differ from the Shallowater standard by only a few Ma, the absolute I-Xe ages are almost immune to this uncertainty in the 129I half-life.
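The sensitivity argument can be made concrete: relative I-Xe ages follow from initial 129I/127I ratios via Δt = ln(R_std/R_sample)/λ, so Δt scales linearly with the adopted half-life while anchored absolute ages do not. A minimal sketch (the ratios are hypothetical; 15.7 Ma is the half-life quoted in the abstract):

```python
import math

T_HALF_129I_MA = 15.7  # adopted 129I half-life, Ma (value used in the abstract)
LAMBDA_129I = math.log(2.0) / T_HALF_129I_MA  # decay constant, Ma^-1

def relative_i_xe_age(r_sample: float, r_standard: float) -> float:
    """Closure-time difference in Ma relative to a standard (e.g. Shallowater)
    from initial 129I/127I ratios: dt = ln(R_std / R_sample) / lambda.
    Positive means the sample closed later than the standard."""
    return math.log(r_standard / r_sample) / LAMBDA_129I

# A sample that preserves half the standard's 129I/127I closed exactly
# one half-life later
print(round(relative_i_xe_age(0.5e-4, 1.0e-4), 1))  # 15.7
```

Rescaling T_HALF_129I_MA by x% rescales every Δt by the same x%, which is why only the relative ages, a few Ma at most, carry the half-life uncertainty.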
The white dwarf mass-radius relation with Gaia, Hubble and FUSE
NASA Astrophysics Data System (ADS)
Joyce, Simon R. G.; Barstow, Martin A.; Casewell, Sarah L.; Holberg, Jay B.; Bond, Howard E.
2018-04-01
White dwarfs are becoming useful tools for many areas of astronomy. They can be used as accurate chronometers over Gyr timescales. They are also clues to the history of star formation in our galaxy. Many of these studies require accurate estimates of the mass of the white dwarf. The theoretical mass-radius relation is often invoked to provide these mass estimates. While the theoretical mass-radius relation is well developed, observational tests of this relation show a much larger scatter in the results than expected. High-precision observational tests to confirm this relation are required. Gaia is providing distance measurements which will remove one of the main sources of uncertainty affecting most previous observations. We combine Gaia distances with spectra from the Hubble and FUSE satellites to make precise tests of the white dwarf mass-radius relation.
Glaucoma history and risk factors.
McMonnies, Charles W
Apart from the risk of developing glaucoma, there is also the risk that it goes undetected and irreversible loss of vision ensues. Some studies of methods of glaucoma diagnosis have examined the results of instrument-based examinations with great, if not complete, reliance on objective findings in arriving at a diagnosis. The very valuable advances in glaucoma detection instrument technologies, and the apparent increasing dependence on them, may have led to reduced consideration of information available from a patient history in those studies. Dependence on objective evidence of glaucomatous pathology may reduce the possibility of detecting glaucoma suspects or patients at risk of becoming glaucoma suspects. A valid positive family history of glaucoma is very valuable information. However, negative family histories can often be unreliable because large numbers of glaucoma cases remain undiagnosed; recording "no evidence of family history" is therefore more appropriate than "no family history". In addition, the unreliability of a negative family history is increased when patients with glaucoma fail to inform their family members. A finding of no family history can only be stated as no known family history. In examining the potential diagnostic contribution of a patient history, this review considers age, frailty, race, type and degree of refractive error, systemic hyper- and hypotension, vasospasm, migraine, pigmentary dispersion syndrome, pseudoexfoliation syndrome, obstructive sleep apnea syndrome, diabetes, medication interactions and side effects, the degree of exposure to intraocular and intracranial pressure elevations and fluctuations, smoking, and symptoms, in addition to genetics and family history of the disease. Copyright © 2016 Spanish General Council of Optometry. Published by Elsevier España, S.L.U. All rights reserved.
Reliability Assessment of Reconfigurable Flight Control Systems Using Sure and Assist
NASA Technical Reports Server (NTRS)
Wu, N. Eva
1992-01-01
This paper presents a reliability assessment of Reconfigurable Flight Control Systems using Semi-Markov Unreliability Range Evaluator (SURE) and Abstract Semi-Markov Specification Interface to the SURE Tool (ASSIST).
Field Study of Stress: Psychophysiological Measures During Project Supex.
1978-10-01
recordings proved to be unreliable utilizing the current procedures. The perceived scales evaluated the current state of the individual, but they were not good predictors of performance or heart rate activity. (Author)
Reliable quantum communication over a quantum relay channel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gyongyosi, Laszlo, E-mail: gyongyosi@hit.bme.hu; Imre, Sandor
2014-12-04
We show that reliable quantum communication over unreliable quantum relay channels is possible. The coding scheme combines results on the superadditivity of quantum channels with efficient quantum coding approaches.
Surface code quantum communication.
Fowler, Austin G; Wang, David S; Hill, Charles D; Ladd, Thaddeus D; Van Meter, Rodney; Hollenberg, Lloyd C L
2010-05-07
Quantum communication typically involves a linear chain of repeater stations, each capable of reliable local quantum computation and connected to their nearest neighbors by unreliable communication links. The communication rate of existing protocols is low as two-way classical communication is used. By using a surface code across the repeater chain and generating Bell pairs between neighboring stations with probability of heralded success greater than 0.65 and fidelity greater than 0.96, we show that two-way communication can be avoided and quantum information can be sent over arbitrary distances with arbitrarily low error at a rate limited only by the local gate speed. This is achieved by using the unreliable Bell pairs to measure nonlocal stabilizers and feeding heralded failure information into post-transmission error correction. Our scheme also applies when the probability of heralded success is arbitrarily low.
The uranium-isotopic composition of Saharan dust collected over the central Atlantic Ocean
NASA Astrophysics Data System (ADS)
Aciego, Sarah M.; Aarons, Sarah M.; Sims, Kenneth W. W.
2015-06-01
Uranium isotopic compositions, (234U/238U)activity, are utilized by earth surface disciplines as chronometers and source tracers, including in soil science, where aeolian dust is a significant source to the total nutrient pool. However, the (234U/238U)activity composition of dust is under-characterized due to material and analytical constraints. Here we present new uranium isotope data measured by high-precision MC-ICP-MS on ten airborne dust samples collected on the M55 trans-Atlantic cruise in 2002. Two pairs of samples are presented with different size fractions, coarse (1-30 μm) and fine (<1 μm), and all samples were processed to separate the water-soluble component in order to assess the controls on the (234U/238U)activity of mineral aerosols transported from the Sahara across the Atlantic. Our results indicate (234U/238U)activity above one for both the water-soluble (1.13-1.17) and the residual solid (1.06-1.18) fractions of the dust; no significant correlation is found between isotopic composition and travel distance. Residual solids indicate a slight dependence of (234U/238U)activity on particle size. Future modeling work that incorporates dust isotopic compositions into mixing or isotopic fractionation models will need to account for the wide variability in dust (234U/238U)activity.
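Activity ratios relate to measured atom ratios through the decay constants: an activity ratio of exactly 1 corresponds to secular equilibrium, and values above 1 (as reported here) indicate 234U excess. A minimal sketch using commonly adopted half-lives, not values from this study:

```python
T_HALF_U238_YR = 4.468e9  # 238U half-life, commonly adopted value (yr)
T_HALF_U234_YR = 2.455e5  # 234U half-life, commonly adopted value (yr)

def activity_ratio_234_238(atom_ratio_234_238: float) -> float:
    """Convert a measured 234U/238U atom ratio to an activity ratio:
    (234U/238U)_act = atom ratio * (lambda_234 / lambda_238)
                    = atom ratio * (T_half_238 / T_half_234)."""
    return atom_ratio_234_238 * (T_HALF_U238_YR / T_HALF_U234_YR)

# At secular equilibrium the atom ratio equals T_half_234 / T_half_238,
# and the activity ratio is 1 by definition
eq_atom_ratio = T_HALF_U234_YR / T_HALF_U238_YR
print(round(activity_ratio_234_238(eq_atom_ratio), 6))  # 1.0
```

The fine-fraction values of 1.13-1.17 thus reflect preferential 234U mobility (alpha-recoil damage) rather than a mass-dependent fractionation of the atom ratio.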
Reconstruction of cosmological matter perturbations in modified gravity
NASA Astrophysics Data System (ADS)
Gonzalez, J. E.
2017-12-01
The analysis of perturbative quantities is a powerful tool to distinguish between different dark energy models and gravity theories degenerate at the background level. In this work, we generalize the integral solution of the matter density contrast for general relativity [V. Sahni and A. Starobinsky, Int. J. Mod. Phys. D 15, 2105 (2006), 10.1142/S0218271806009704; U. Alam, V. Sahni, and A. A. Starobinsky, Astrophys. J. 704, 1086 (2009), 10.1088/0004-637X/704/2/1086] to a wide class of modified gravity (MG) theories. To calculate this solution, it is necessary to have prior knowledge of the Hubble rate, the density parameter at the present epoch (Ωm0), and the functional form of the effective Newton's constant that characterizes the gravity theory. We estimate the Hubble expansion rate in a model-independent way by applying a nonparametric reconstruction method to model-independent cosmic chronometer data and high-z quasar data. In order to compare our generalized solution of the matter density contrast, using the nonparametric reconstruction of H(z) from observational data, with a purely theoretical one, we choose a parametrization of screened modified gravity and the Ωm0 value from the WMAP-9 Collaboration. Finally, we calculate the growth index for the analyzed cases, finding very good agreement between the theoretical values and those obtained using the approach presented in this work.
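For reference, the linear growth equation that such MG generalizations start from, and the GR integral solution being generalized, are commonly written as follows (a sketch of the standard formalism; notation may differ from the paper's):

```latex
% Linear matter density contrast with an effective Newton's constant:
\ddot{\delta} + 2H\dot{\delta} - 4\pi G_{\mathrm{eff}}(a,k)\,\bar{\rho}_m\,\delta = 0 ,
% whose growing-mode integral solution in GR (G_eff -> G) is
\delta_{\mathrm{GR}}(a) \;\propto\; \frac{H(a)}{H_0}
    \int_0^a \frac{\mathrm{d}a'}{\bigl[a'\,H(a')/H_0\bigr]^{3}} .
```

Setting G_eff = G recovers the Sahni-Starobinsky GR solution cited above; the reconstructed H(z) from cosmic chronometers supplies the expansion history this integral needs.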
Effectiveness of Mentha piperita in the Treatment of Infantile Colic: A Crossover Study
Alves, João Guilherme Bezerra; de Brito, Rita de Cássia Coelho Moraes; Cavalcanti, Telma Samila
2012-01-01
Background. Infantile colic is a distressing and common condition for which there is no proven standard treatment. Objective. To compare the efficacy of Mentha piperita with simethicone in the treatment of infantile colic. Methods. A double-blind crossover study was performed with 30 infants attending IMIP, Recife, Brazil. They were randomized to use Mentha piperita or simethicone in the treatment of infantile colic during 7 days with each drug. Primary outcomes were the mother's opinion about response to treatment, the number of daily episodes of colic, and time spent crying, measured by a chronometer. Mann-Whitney and chi-square tests were used to compare the results. This study was previously approved by the Ethical Committee in Research at IMIP. Results. At baseline the mean number of daily episodes of infantile colic was 3.9 (±1.1) and the mean crying time per day was 192 minutes (±51.6). At the end of the study daily episodes of colic fell to 1.6 (±0.6) and crying duration decreased to 111 (±28) minutes. All mothers reported a decrease in the frequency and duration of episodes of infantile colic, and there were no differences between responses to Mentha piperita and simethicone. Conclusions. These findings suggest that Mentha piperita may be used to help control infantile colic. However, these results must be confirmed by other studies. PMID:22844342
Ar/Ar Dating Independent of Monitor Standard Ages
NASA Astrophysics Data System (ADS)
Boswell, S.; Hemming, S. R.
2015-12-01
Because the reported age of an analyzed sample is dependent on the age of the co-irradiated monitor standard(s), Ar/Ar dating is a relative dating technique. There is disagreement at the 1% scale in the age of commonly used monitor standards, and there is a great need to improve the inter-laboratory calibrations. Additionally, new approaches and insights are needed to meet the challenge of bringing the Ar/Ar chronometer to the highest possible precision and accuracy. In this spirit, we present a conceptual framework for Ar/Ar dating that does not depend on the age of monitor standards, but only on the K content of a solid standard. The concept is demonstrated by introducing a re-expressed irradiation parameter (JK) that depends on the ratio of 39ArK to 40Ar* rather than the 40Ar*/39ArK ratio. JK is equivalent to the traditional irradiation parameter J and is defined as JK = (39Ar/40K) • (λ/λe). The ultimate precision and accuracy of the method will depend on how precisely and accurately the 39Ar and 40K can be estimated, and will require isotope dilution measurements of both from the same aliquot. We are testing the workability of our technique at the 1% level by measuring weighed and irradiated hornblende and biotite monitor standards using GLO-1 glauconite to define a calibration curve for argon signals versus abundance.
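Since JK is equivalent to the traditional irradiation parameter J, the familiar age equation t = ln(1 + J · 40Ar*/39ArK)/λ applies with JK in its place. A minimal numerical sketch: the decay constants are the standard Steiger and Jäger values, but the molar amounts and isotope ratio below are hypothetical, chosen only for illustration.

```python
import math

# Decay constants (Steiger & Jäger 1977 convention)
LAM = 5.543e-10     # total 40K decay constant, 1/yr
LAM_E = 0.581e-10   # electron-capture branch to 40Ar, 1/yr

def jk_parameter(ar39_mol, k40_mol):
    """JK = (39Ar/40K) * (lambda/lambda_e), measured on the irradiated standard."""
    return (ar39_mol / k40_mol) * (LAM / LAM_E)

def age_years(jk, ar40_star_over_ar39):
    """Age equation with JK playing the role of J: t = ln(1 + JK * 40Ar*/39ArK) / lambda."""
    return math.log(1.0 + jk * ar40_star_over_ar39) / LAM

# Hypothetical numbers for illustration only
jk = jk_parameter(ar39_mol=2.5e-15, k40_mol=1.0e-12)
t = age_years(jk, ar40_star_over_ar39=12.0)
print(f"JK = {jk:.5f}, apparent age = {t / 1e6:.1f} Ma")
```

The practical burden, as the abstract notes, shifts from knowing the standard's age to measuring its 39Ar and 40K precisely from the same aliquot.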
Navarro, Aaron; Martínez-Murcia, Antonio
2018-04-19
The phylogenies derived from housekeeping gene sequence alignments, although mere evolutionary hypotheses, have increased our knowledge of Aeromonas genetic diversity, providing a robust species delineation framework invaluable for reliable, easy and fast species identification. Previous classifications of Aeromonas have been fully surpassed by the recently developed phylogenetic (natural) classification obtained from the analysis of so-called "molecular chronometers". Although ribosomal RNAs cannot resolve all known Aeromonas species, the conserved nature of 16S rRNA offers reliable alignments containing mosaics of sequence signatures which may serve as targets of genus-specific oligonucleotides for subsequent identification/detection tests in samples without culturing. In contrast, some housekeeping genes coding for proteins show a much better chronometric capacity to discriminate highly related strains. Although species and loci do not all evolve at exactly the same rate, published Aeromonas phylogenies are congruent with each other, indicating that these phylogenetic markers are synchronized and that a concatenated multi-gene phylogeny may mirror the entire genomic relationships. Thanks to MLPA approaches, the discovery of new Aeromonas species and of strains of rarely isolated species is today more frequent; consequently, such approaches should be extensively promoted for isolate screening and species identification, although accumulated data should be carefully catalogued to build a reliable database. This article is protected by copyright. All rights reserved.
The use of mineral crystals as bio-markers in the search for life on Mars
NASA Technical Reports Server (NTRS)
Schwartz, D. E.; Mancinelli, R. L.; Kaneshiro, E. S.
1992-01-01
It is proposed that minerals resulting from biologically controlled mineralization processes be utilized as biomarkers because of their favorable qualities. Universal signatures of life (biomarkers) are discussed in terms of their terrestrial forms and hypothetical Martian counterparts including organics, suites of specific inorganic and organic compounds, and isotopic ratios. It is emphasized that minerals produced under biologic control have morphological and isotopic compositions that are not found in their abiotic counterparts. Other biomarkers are not necessarily indicative of biological origin and are therefore unreliable resources for scientific study. Mineral crystals are also stable over long geological periods, and the minerals from Martian fluvial features can therefore be employed to search for fossils and biomarkers of early biological activity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saaban, Azizan; Zainudin, Lutfi; Bakar, Mohd Nazari Abu
This paper examines the ability of the linear interpolation method to predict missing values in solar radiation time series. A reliable analysis depends on a complete observed time series: the absence of radiation data alters the long-term variation of solar radiation measurements, increasing the chance of biased output in modelling and in the validation process. The completeness of the observed dataset is therefore significantly important for data analysis. Gaps and unreliable records in solar radiation time series are widespread and remain the main problematic issue, yet only a limited number of studies have focused on estimating missing values in solar radiation datasets.
How accurate are Lexile text measures?
Stenner, A Jackson; Burdick, Hal; Sanford, Eleanor E; Burdick, Donald S
2006-01-01
The Lexile Framework for Reading models comprehension as the difference between a reader measure and a text measure. Uncertainty in comprehension rates results from unreliability in reader measures and inaccuracy in text readability measures. Whole-text processing eliminates sampling error in text measures. However, Lexile text measures are imperfect due to misspecification of the Lexile theory. The standard deviation component associated with theory misspecification is estimated at 64L for a standard-length passage (approximately 125 words). A consequence is that standard errors for longer texts (2,500 to 150,000 words) are measured on the Lexile scale with uncertainties in the single digits. Uncertainties in expected comprehension rates are largely due to imprecision in reader ability and not inaccuracies in text readabilities.
Tsaltas, G; Ford, C H
1993-02-01
Methods for following the binding and internalization of antibodies to cell surface antigens have often employed low-pH isoosmolar buffers in order to dissociate surface antigen-antibody complexes. One of the most widely used buffers is a 0.05 M glycine-HCl buffer, pH 2.8. Since the efficacy of this buffer was critical to a series of internalization experiments employing monoclonal antibodies (Mabs) to carcinoembryonic antigen (CEA) expressing cancer cell lines in this laboratory, we tested its performance in a number of different assays. Our results indicate that this buffer only partially dissociates antigen-antibody bonds and can therefore introduce major inaccuracies in internalization experiments.
ERIC Educational Resources Information Center
Lumsden, James
1977-01-01
Person changes can be of three kinds: developmental trends, swells, and tremors. Person unreliability in the tremor sense (momentary fluctuations) can be estimated from person characteristic curves. Average person reliability for groups can be compared from item characteristic curves. (Author)
Vibrational energy transfer and relaxation in O2 and H2O.
Huestis, David L
2006-06-01
Near-resonant vibrational energy exchange between oxygen and water molecules is an important process in the Earth's atmosphere, combustion chemistry, and the chemical oxygen iodine laser (COIL). The reactions in question are (1) O2(1) + O2(0) --> O2(0) + O2(0); (2) O2(1) + H2O(000) --> O2(0) + H2O(000); (3) O2(1) + H2O(000) <--> O2(0) + H2O(010); (4) H2O(010) + H2O(000) --> H2O(000) + H2O(000); and (5) H2O(010) + O2(0) --> H2O(000) + O2(0). Reanalysis of the data available in the chemical kinetics literature provides reliable values for the rate coefficients of reactions 1 and 4 and strong evidence that reactions 2 and 5 are slow in comparison with reaction 3. Analytical solution of the chemical rate equations shows that previous attempts to measure the rate of reaction 3 are unreliable unless the water mole fraction is higher than 1%. Reanalysis of data from the only experiment satisfying this constraint provides a rate coefficient of (5.5 ± 0.4) × 10^-13 cm3/s at room temperature, between the values favored by the atmospheric and laser modeling communities.
An Intersensory Interaction Account of Priming Effects-and Their Absence.
Klatzky, Roberta L; Creswell, J David
2014-01-01
Psychological researchers have found that exposure to stimuli (primes) can subsequently influence people's behavior through pathways that would seem to be quite remote. For example, people exposed to words associated with older adults may walk more slowly. Recently, priming studies, particularly those showing dramatic effects on social behavior, have come under scrutiny because of the unreliability of empirical results. In this article, we shed light on the issue by describing a general model of intersensory interaction, in which two or more sources of information provide an estimate or "bid" on a property of the world, with the perceptual outcome being a weighted combination of the bids. When it is extended by adding bids that stem from memory or inference, the model identifies systematic factors that might undermine priming, including random variation in estimates, contextual influences on memory retrieval and inference, competition among information sources, and cognitive control. These factors are not only explanatory but predictive of when priming effects can be expected. Our hope is that by promoting understanding of the underlying processes that may explain how primes can influence behavior, the bidding model and the general approach that it represents offer novel insights into the hotly debated area of priming research. © The Author(s) 2013.
Determination of solute descriptors by chromatographic methods.
Poole, Colin F; Atapattu, Sanka N; Poole, Salwa K; Bell, Andrea K
2009-10-12
The solvation parameter model is now well established as a useful tool for obtaining quantitative structure-property relationships for chemical, biomedical and environmental processes. The model correlates a free-energy related property of a system to six free-energy derived descriptors describing molecular properties. These molecular descriptors are defined as L (gas-liquid partition coefficient on hexadecane at 298K), V (McGowan's characteristic volume), E (excess molar refraction), S (dipolarity/polarizability), A (hydrogen-bond acidity), and B (hydrogen-bond basicity). McGowan's characteristic volume is trivially calculated from structure and the excess molar refraction can be calculated for liquids from their refractive index and easily estimated for solids. The remaining four descriptors are derived by experiment using (largely) two-phase partitioning, chromatography, and solubility measurements. In this article, the use of gas chromatography, reversed-phase liquid chromatography, micellar electrokinetic chromatography, and two-phase partitioning for determining solute descriptors is described. A large database of experimental retention factors and partition coefficients is constructed after first applying selection tools to remove unreliable experimental values and an optimized collection of varied compounds with descriptor values suitable for calibrating chromatographic systems is presented. These optimized descriptors are demonstrated to be robust and more suitable than other groups of descriptors characterizing the separation properties of chromatographic systems.
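The solvation parameter model described above is the linear free-energy relationship log k = c + eE + sS + aA + bB + vV, so calibrating a chromatographic system amounts to a multiple linear regression of measured retention on known solute descriptors. A minimal least-squares sketch, with invented descriptor and retention values rather than any real calibration set:

```python
import numpy as np

# Solvation parameter model: log k = c + e*E + s*S + a*A + b*B + v*V
# Hypothetical calibration set: descriptor rows [E, S, A, B, V] and measured log k
X_desc = np.array([
    [0.610, 0.52, 0.00, 0.14, 0.716],   # illustrative values, not tabulated descriptors
    [0.803, 0.87, 0.00, 0.20, 0.916],
    [0.246, 0.42, 0.37, 0.48, 0.590],
    [0.820, 1.01, 0.26, 0.33, 0.816],
    [0.613, 0.90, 0.57, 0.36, 0.777],
    [0.000, 0.00, 0.00, 0.00, 0.465],
])
logk = np.array([0.55, 0.78, -0.21, 0.40, 0.05, 0.30])

# Design matrix with an intercept column for the system constant c
A = np.hstack([np.ones((len(logk), 1)), X_desc])
coef, *_ = np.linalg.lstsq(A, logk, rcond=None)
print("system constants:", dict(zip("cesabv", np.round(coef, 3))))

# Predict log k for a new solute from its (hypothetical) descriptors
new = np.array([1.0, 0.40, 0.60, 0.00, 0.45, 0.688])
print("predicted log k:", float(new @ coef))
```

In real calibrations many more solutes than coefficients are used, so the regression is overdetermined and descriptor-selection tools of the kind the article describes matter for removing unreliable retention values.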
NASA Astrophysics Data System (ADS)
Selby, D.
2011-12-01
Geochronology is fundamental to understanding the age, rates and durations of Earth processes. This concerned Arthur Holmes who, for much of his career, attempted to define a geological time scale. This topic is still important to Earth scientists today, specifically the chronostratigraphy of sedimentary rocks. Here I explore the Re-Os geochronology of marine and lacustrine sedimentary rocks and its application to yield absolute time constraints for stratigraphy. The past decade has seen the pioneering research of Re-Os organic-rich sedimentary rock geochronology blossom into a tool that can now be used to accurately and precisely determine depositional ages of organic-rich rock units that have experienced up to low-grade greenschist metamorphism. This direct dating of sedimentary rocks is critical where volcanic horizons are absent. As a result, this tool has been applied to timescale calibration, basin correlation, formation duration and the timing of key Earth events (e.g., Neoproterozoic glaciations). The application of the Re-Os chronometer to the Devonian-Mississippian boundary contained within the Exshaw Formation, Canada, determined an age of 361.3 ± 2.4 Ma. This age is in accord with U-Pb dates of interbedded tuff horizons and also with the U-Pb zircon date for the type Devonian-Mississippian Hasselbachtal section, Germany. The agreement of the biostratigraphic and U-Pb constraints of the Exshaw Formation with the Re-Os date illustrated the potential of the Re-Os chronometer to yield age determinations for sedimentary packages, especially in the absence of interbedded tuff horizons and biozones. A Re-Os date for the proposed type section of the Oxfordian-Kimmeridgian boundary, Staffin Bay, Isle of Skye, U.K., gave an age of 154.1 ± 2.2 Ma. This Re-Os age represents a 45% (1.8 Ma) improvement in precision for the basal Kimmeridgian. It also demonstrated that the duration of the Kimmeridgian is nominally 3.3 Ma and thus 1.6 Ma shorter than previously indicated.
In addition to these examples, several studies have presented precise dates for Phanerozoic marine organic-rich units that are in excellent agreement with biostratigraphic determinations. A recent Re-Os study of the Woodford Shale (deposited throughout the Frasnian and Famennian) has provided important time markers as well as suggesting that the sedimentation rate of the formation was relatively constant for ~20 Ma. To date, only marine organic-rich sedimentary rocks have been utilized for Re-Os geochronology. However, lacustrine sedimentary rocks provide an invaluable archive of continental geological processes responding to tectonic, climatic and magmatic influences. Correlating these rocks to global geological phenomena requires accurate geochronological frameworks. The organic-rich lacustrine sedimentary units of the Eocene Green River Formation (GRF) are enriched in Re and Os at levels comparable to those of marine units. The Re-Os dates for the GRF from the Uinta basin are 48.5 ± 0.6 Ma and 49.2 ± 1.0 Ma. These dates are in excellent agreement with Ar/Ar and U/Pb dates of interbedded tuffs in the GRF, demonstrating that lacustrine units can be used for Re-Os geochronology in addition to marine organic-rich units.
Traffic congestion and reliability : linking solutions to problems.
DOT National Transportation Integrated Search
2004-07-19
The Traffic Congestion and Reliability: Linking Solutions to Problems Report provides : a snapshot of congestion in the United States by summarizing recent trends in : congestion, highlighting the role of unreliable travel times in the effects of con...
Megatrends: Megahype, Megabad.
ERIC Educational Resources Information Center
Goldman, Louis
1983-01-01
Criticizes John Naisbitt's bestselling book, "Megatrends," for reifying constructs (industrial society and information society), treating these entities as mutually exclusive, and endowing them with a life cycle. In addition, claims the book is marred by faddish jargon and is statistically unreliable. (MLF)
QUALITY ASSESSMENT OF CONFOCAL MICROSCOPY SLIDE-BASED SYSTEMS: INSTABILITY
Background: All slide-based fluorescence cytometry detection systems basically include an excitation light source, intermediate optics, and a detection device (CCD or PMT). Occasionally, this equipment becomes unstable, generating unreliable and inferior data. Methods: A num...
Development and Evaluation of Smart Bus System
DOT National Transportation Integrated Search
2016-12-13
Due to stochastic traffic conditions and fluctuated demand, transit passengers often suffer from unreliable services. Especially for buses, keeping on-time schedules is challenging as they share the right of way with non-transit traffic. With the adv...
The Reliability of Psychiatric Diagnosis Revisited
Rankin, Eric; France, Cheryl; El-Missiry, Ahmed; John, Collin
2006-01-01
Background: The authors reviewed the topic of reliability of psychiatric diagnosis from the turn of the 20th century to the present. The objectives of this paper are to explore the reasons for the unreliability of psychiatric diagnosis and to propose ways to improve it. Method: The authors reviewed the literature on the concept of reliability of psychiatric diagnosis, with emphasis on the impact of interviewing skills, use of diagnostic criteria, and structured interviews. Results: Causes of diagnostic unreliability are attributed to the patient, the clinician and psychiatric nomenclature. The reliability of psychiatric diagnosis can be enhanced by using diagnostic criteria, defining psychiatric symptoms and structuring the interviews. Conclusions: The authors propose the acronym 'DR.SED', which stands for diagnostic criteria, reference definitions, structuring the interview, clinical experience, and data. The authors recommend that clinicians use the DR.SED paradigm to improve the reliability of psychiatric diagnoses. PMID:21103149
NASA Astrophysics Data System (ADS)
Li, Cong; Jing, Hui; Wang, Rongrong; Chen, Nan
2018-05-01
This paper presents a robust control scheme for vehicle lateral motion regulation under unreliable communication links via controller area network (CAN). The communication links between the system plant and the controller are assumed to be imperfect, so data packet dropouts occur frequently. The paper takes the form of parallel distributed compensation and treats the dropouts as random binary variables following a Bernoulli distribution. Both tire cornering stiffness uncertainty and external disturbances are considered to enhance the robustness of the controller. In addition, a robust H∞ static output-feedback control approach is proposed to realize lateral motion control with relatively low-cost sensors. The stochastic stability of the closed-loop system and preservation of the guaranteed H∞ performance are investigated. Simulation results based on the CarSim platform using a high-fidelity full-car model verify the effectiveness of the proposed control approach.
When Reputation Enforces Evolutionary Cooperation in Unreliable MANETs.
Tang, Changbing; Li, Ang; Li, Xiang
2015-10-01
In self-organized mobile ad hoc networks (MANETs), network functions rely on the cooperation of self-interested nodes, and a key challenge is to enforce their mutual cooperation. In this paper, we study cooperative packet forwarding over a one-hop unreliable channel, where unreliability results from loss of packets and noisy observation of transmissions. We propose an indirect reciprocity framework based on evolutionary game theory, and enforce cooperation of packet forwarding strategies in both structured and unstructured MANETs. Furthermore, we analyze the evolutionary dynamics of cooperative strategies and derive the threshold of the benefit-to-cost ratio that guarantees the convergence of cooperation. The numerical simulations verify that the proposed evolutionary game theoretic solution enforces cooperation when the benefit-to-cost ratio of altruistic behavior exceeds the critical condition. In addition, the network throughput of our proposed strategy in structured MANETs is measured, and is in close agreement with that of the fully cooperative strategy.
Dipstick measurements of urine specific gravity are unreliable.
de Buys Roessingh, A S; Drukker, A; Guignard, J P
2001-08-01
To evaluate the reliability of dipstick measurements of urine specific gravity (U-SG). Fresh urine specimens were tested for urine pH and osmolality (U-pH, U-Osm) by a pH meter and an osmometer, and for U-SG by three different methods (refractometry, automatic readout of a dipstick (Clinitek-50), and visual change of colour of the dipstick). The correlation between the visual U-SG dipstick measurements and U-SG determined by refractometry, and the comparison of Clinitek-50 dipstick U-SG measurements with U-Osm, were less than optimal, showing very wide scatter of values. Only the U-SG refractometer values and U-Osm had a good linear correlation. The tested dipstick was unreliable for the bedside determination of U-SG, even after correction for U-pH as recommended by the manufacturer. Among the bedside determinations, only refractometry gives reliable U-SG results. Dipstick U-SG measurements should be abandoned.
Behavior-Based Cleaning for Unreliable RFID Data Sets
Fan, Hua; Wu, Quanyuan; Lin, Yisong
2012-01-01
Radio Frequency IDentification (RFID) technology promises to revolutionize the way we track items and assets, but in RFID systems misreading is a common phenomenon that poses an enormous challenge to RFID data management; accurate data cleaning is therefore an essential task for the successful deployment of such systems. In this paper, we present the design and development of an RFID data cleaning system, the first declarative, behavior-based smoothing system for unreliable RFID data. We take advantage of the kinematic characteristics of tags to assist in RFID data cleaning. In order to establish the conversion relationship between RFID data and the kinematic parameters of the tags, we propose a movement behavior detection model. Moreover, a Reverse Order Filling Mechanism is proposed to ensure more complete access to the movement behavior characteristics of tags. Finally, we validate our solution with a common RFID application and demonstrate the advantages of our approach through extensive simulations. PMID:23112595
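For contrast with the kinematics-based approach of this paper, the baseline that such cleaning systems improve upon is simple temporal smoothing of missed reads. A hedged sketch of windowed smoothing (this is not the authors' model; the window size and read data are arbitrary):

```python
# Sliding-window smoothing for unreliable RFID read streams: a tag is reported
# present in an epoch if it was read in at least one of the last `window` epochs,
# papering over transient missed reads at the cost of some temporal lag.
def smooth_reads(epochs, window=3):
    """epochs: list of sets of tag IDs read per epoch -> smoothed presence sets."""
    out = []
    for i in range(len(epochs)):
        lo = max(0, i - window + 1)
        out.append(set().union(*epochs[lo:i + 1]))
    return out

# Raw reads with two dropped epochs (empty sets) despite tags being present
raw = [{"t1"}, set(), {"t1"}, {"t1", "t2"}, set(), {"t2"}]
print(smooth_reads(raw))
```

Behavior-based cleaning of the kind described above replaces this fixed window with a model of how the tag is actually moving, which is why it can outperform purely temporal smoothing.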
Benefits of Imperfect Conflict Resolution Advisory Aids for Future Air Traffic Control.
Trapsilawati, Fitri; Wickens, Christopher D; Qu, Xingda; Chen, Chun-Hsien
2016-11-01
The aim of this study was to examine the human-automation interaction issues and the interacting factors in the context of conflict detection and resolution advisory (CRA) systems. The issues of imperfect automation in air traffic control (ATC) have been well documented in previous studies, particularly in conflict-alerting systems. The extent to which the prior findings can be applied to an integrated conflict detection and resolution system in future ATC remains unknown. Twenty-four participants were evenly divided into two groups corresponding to a medium- and a high-traffic density condition, respectively. In each traffic density condition, participants were instructed to perform simulated ATC tasks under four automation conditions, including reliable, unreliable with short time allowance to secondary conflict (TAS), unreliable with long TAS, and manual conditions. Dependent variables accounted for conflict resolution performance, workload, situation awareness, and trust in and dependence on the CRA aid, respectively. Imposing the CRA automation did increase performance and reduce workload as compared with manual performance. The CRA aid did not decrease situation awareness. The benefits of the CRA aid were manifest even when it was imperfectly reliable and were apparent across traffic loads. In the unreliable blocks, trust in the CRA aid was degraded but dependence was not influenced, yet the performance was not adversely affected. The use of CRA aid would benefit ATC operations across traffic densities. CRA aid offers benefits across traffic densities, regardless of its imperfection, as long as its reliability level is set above the threshold of assistance, suggesting its application for future ATC. © 2016, Human Factors and Ergonomics Society.
Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events
NASA Astrophysics Data System (ADS)
DeChant, C. M.; Moradkhani, H.
2014-12-01
Hydrometeorological events (e.g., floods, droughts, extreme precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomena. In these forecasts, the probability of the event, over some lead time, is estimated from model simulations or predictive indicators. By issuing probabilistic forecasts, agencies can communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform risk management based on the potential damages resulting from the event. Conversely, an unreliable forecast may give a false impression of the actual risk, leading to improper decision making when protecting resources from extreme events. Because effective risk management requires reliable forecasts, this study takes a renewed look at reliability assessment for event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and an unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and an unreliable forecast. Application of this verification system is also examined within a real forecasting case study, highlighting the additional statistical power provided by the Poisson-Binomial distribution.
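The Poisson-Binomial idea can be sketched concretely: if the issued probabilities are correct, the number of observed events follows a Poisson-Binomial distribution with those probabilities as parameters, and an exact p-value follows from its PMF. A small illustration with invented forecast probabilities; the dynamic-programming PMF construction is standard, but the two-sided p-value definition here is one common choice, not necessarily the study's.

```python
import numpy as np

def poisson_binomial_pmf(probs):
    """Exact PMF of the number of successes among independent,
    non-identically distributed Bernoulli trials (DP over trials)."""
    pmf = np.array([1.0])
    for p in probs:
        pmf = np.convolve(pmf, [1.0 - p, p])
    return pmf

def two_sided_pvalue(probs, k_observed):
    """Probability, under a reliable forecast, of an event count at least
    as unlikely as the one observed (sum of PMF values <= PMF[k])."""
    pmf = poisson_binomial_pmf(probs)
    return float(pmf[pmf <= pmf[k_observed]].sum())

# Hypothetical forecast probabilities for 10 flood events, 4 of which occurred
probs = [0.1, 0.2, 0.3, 0.5, 0.6, 0.4, 0.2, 0.7, 0.1, 0.3]
p = two_sided_pvalue(probs, k_observed=4)
print(f"p-value = {p:.3f}")  # a large value is consistent with a reliable forecast
```

A small p-value would indicate the observed event count is improbable under the issued probabilities, i.e. evidence of an unreliable forecast.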
Staugaard, Benjamin; Christensen, Peer Brehm; Mössner, Belinda; Hansen, Janne Fuglsang; Madsen, Bjørn Stæhr; Søholm, Jacob; Krag, Aleksander; Thiele, Maja
2016-11-01
Transient elastography (TE) is hampered in some patients by failures and unreliable results. We hypothesized that real time two-dimensional shear wave elastography (2D-SWE), the FibroScan XL probe, and repeated TE exams, could be used to obtain reliable liver stiffness measurements in patients with an invalid TE examination. We reviewed 1975 patients with 5764 TE exams performed between 2007 and 2014, to identify failures and unreliable exams. Fifty-four patients with an invalid TE at their latest appointment entered a comparative feasibility study of TE vs. 2D-SWE. The initial TE exam was successful in 93% (1835/1975) of patients. Success rate increased from 89% to 96% when the XL probe became available (OR: 1.07, 95% CI 1.06-1.09). Likewise, re-examining those with a failed or unreliable TE led to a reliable TE in 96% of patients. Combining availability of the XL probe with TE re-examination resulted in a 99.5% success rate on a per-patient level. When comparing the feasibility of TE vs. 2D-SWE, 96% (52/54) of patients obtained a reliable TE, while 2D-SWE was reliable in 63% (34/54, p < 0.001). The odds of a successful 2D-SWE exam decreased with higher skin-capsule distance (OR = 0.77, 95% CI 0.67-0.98). Transient elastography can be accomplished in nearly all patients by use of the FibroScan XL probe and repeated examinations. In difficult-to-scan patients, the feasibility of TE is superior to 2D-SWE.
Constraining the physics of carbon crystallization through pulsations of a massive DAV BPM37093
NASA Astrophysics Data System (ADS)
Nitta, Atsuko; Kepler, S. O.; Chené, André-Nicolas; Koester, D.; Provencal, J. L.; Kleinman, S. J.; Sullivan, D. J.; Chote, Paul; Sefako, Ramotholo; Kanaan, Antonio; Romero, Alejandra; Corti, Mariela; Kilic, Mukremin; Montgomery, M. H.; Winget, D. E.
We are trying to reduce the largest uncertainties in using white dwarf stars as Galactic chronometers by understanding the details of carbon crystallization, which currently result in a 1-2 Gyr uncertainty in the ages of the oldest white dwarf stars. We expect the coolest white dwarf stars to have crystallized interiors, but theory also predicts that hotter white dwarf stars, if they are massive enough, will have some core crystallization. BPM 37093 is the first discovered of only a handful of known massive white dwarf stars that are also pulsating DAV, or ZZ Ceti, variables. Our approach is to use the pulsations to constrain the core composition and the amount of crystallization. Here we report our analysis of 4 hours of continuous time series spectroscopy of BPM 37093 with Gemini South combined with simultaneous time-series photometry from Mt. John (New Zealand), SAAO, PROMPT, and Complejo Astronomico El Leoncito (CASLEO, Argentina).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eppich, Gary R.; Williams, Ross W.; Gaffney, Amy M.
Here, age dating of nuclear material can provide insight into source and suspected use in nuclear forensic investigations. We report here a method for the determination of the date of most recent chemical purification for uranium materials using the 235U-231Pa chronometer. Protactinium is separated from uranium and neptunium matrices using anion exchange resin, followed by sorption of Pa to an SiO2 medium. The concentration of 231Pa is measured by isotope dilution mass spectrometry using 233Pa spikes prepared from an aliquot of 237Np and calibrated in-house using the rock standard Table Mountain Latite and the uranium isotopic standard U100. Combined uncertainties of age dates using this method are 1.5 to 3.5%, an improvement over alpha spectrometry measurement methods. Model ages of five uranium standard reference materials are presented; all standards have concordant 235U-231Pa and 234U-230Th model ages.
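The 235U-231Pa model age rests on daughter in-growth after an assumed complete purification: with zero initial 231Pa and negligible 235U decay over the interval, the activity ratio grows as A(231Pa)/A(235U) = 1 - exp(-λ231·t). A sketch under those standard assumptions (the measured activity ratio below is invented, not taken from the paper):

```python
import math

T_HALF_PA231 = 32760.0  # years, 231Pa half-life
LAM_PA231 = math.log(2) / T_HALF_PA231

def model_age_years(activity_ratio_pa231_u235):
    """Time since last complete Pa/U separation, assuming zero initial 231Pa
    and negligible 235U decay over the interval:
        A(231Pa)/A(235U) = 1 - exp(-lambda_231 * t)
    inverted for t."""
    return -math.log(1.0 - activity_ratio_pa231_u235) / LAM_PA231

# Hypothetical measured activity ratio for a uranium sample purified decades ago
t = model_age_years(1.25e-3)
print(f"model age ≈ {t:.1f} years")
```

The concordance check in the abstract applies the analogous in-growth equation to the 234U-230Th pair and compares the two model ages.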
Latest astronomical constraints on some non-linear parametric dark energy models
NASA Astrophysics Data System (ADS)
Yang, Weiqiang; Pan, Supriya; Paliathanasis, Andronikos
2018-04-01
We consider non-linear redshift-dependent equation of state parameters as dark energy models in a spatially flat Friedmann-Lemaître-Robertson-Walker universe. To depict the expansion history of the universe in such cosmological scenarios, we take into account the large-scale behaviour of such parametric models and fit them using a set of the latest observational data of distinct origin, including cosmic microwave background radiation, Type Ia Supernovae, baryon acoustic oscillations, redshift space distortion, weak gravitational lensing, Hubble parameter measurements from cosmic chronometers, and finally the local Hubble constant from the Hubble Space Telescope. The fitting uses the publicly available Cosmological Monte Carlo code (CosmoMC) to extract the cosmological information from these parametric dark energy models. Our analysis shows that these models can describe the late-time accelerating phase of the universe while remaining distinguishable from the Λ-cosmology.
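The cosmic-chronometer data mentioned above rest on the differential-age relation H(z) = -(1/(1+z)) dz/dt, applied to passively evolving galaxies. A minimal numerical sketch, checked here against a matter-only toy universe (the H0 value and the Einstein-de Sitter check are illustrative, not from the paper):

```python
import numpy as np

H0 = 70.0  # km/s/Mpc, illustrative value only

def hubble_from_ages(z, t):
    """Cosmic-chronometer estimator H(z) = -1/(1+z) * dz/dt.

    z : redshifts of age-dated galaxy samples (increasing array)
    t : their cosmic ages, in time units consistent with 1/H0
    Returns (H estimates, midpoint redshifts) from consecutive pairs.
    """
    z = np.asarray(z)
    t = np.asarray(t)
    zm = 0.5 * (z[1:] + z[:-1])
    dz_dt = np.diff(z) / np.diff(t)
    return -dz_dt / (1.0 + zm), zm

# Toy check in a matter-only (Einstein-de Sitter) universe, where
# t(z) = (2 / (3 H0)) (1+z)^(-3/2) and the exact answer is H(z) = H0 (1+z)^(3/2).
z = np.linspace(0.0, 1.0, 201)
t = 2.0 / (3.0 * H0) * (1.0 + z) ** -1.5
H_est, zm = hubble_from_ages(z, t)
```

The finite-difference estimate recovers the analytic expansion rate to well under a percent on this grid, which is the sense in which galaxy ages serve as "chronometers" for H(z).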
Shearer, C. K.; Elardo, S. M.; Petro, N. E.; ...
2014-12-23
The Mg-suite represents an enigmatic episode of lunar highlands magmatism that presumably represents the first stage of crustal building following primordial differentiation. This review examines the mineralogy, geochemistry, petrology, chronology, and the planetary-scale distribution of this suite of highlands plutonic rocks, presents models for their origin, examines petrogenetic relationships to other highlands rocks, and explores the link between this style of magmatism and early stages of lunar differentiation. Of the models considered for the origin of the parent magmas for the Mg-suite, the data best fit a process in which hot (solidus temperature at ≥2 GPa = 1600 to 1800 °C) and less dense (ρ ≈ 3100 kg/m³) early lunar magma ocean cumulates rise to the base of the crust during cumulate pile overturn. Some decompressional melting would occur, but placing a hot cumulate horizon adjacent to the plagioclase-rich primordial crust and KREEP-rich lithologies (at temperatures of <1300 °C) would result in the hybridization of these divergent primordial lithologies, producing Mg-suite parent magmas. As urKREEP (primeval KREEP) is not the “petrologic driver” of this style of magmatism, outside of the Procellarum KREEP Terrane (PKT) Mg-suite magmas are not required to have a KREEP signature. Evaluation of the chronology of this episode of highlands evolution indicates that Mg-suite magmatism was initiated soon after primordial differentiation (<10 m.y.). Alternatively, the thermal event associated with the mantle overturn may have disrupted the chronometers utilized to date the primordial crust. Petrogenetic relationships between the Mg-suite and other highlands suites (e.g., alkali-suite and magnesian anorthositic granulites) are consistent with both fractional crystallization processes and melting of distinctly different hybrid sources.
Mammalian species - Neotoma magister
Steven B. Castleberry; Michael T. Mengak; W. Mark Ford
2006-01-01
External morphology of N. magister (Fig. 1) is similar to that of N. floridana, the only parapatric Neotoma. Although N. magister generally is larger in mass and with longer vibrissae, identification based on single measurements is unreliable because of morphometric overlap (Ray 2000)....
Automatic Refraction: How It Is Done: Some Clinical Results
ERIC Educational Resources Information Center
Safir, Aran; And Others
1973-01-01
Compared are methods of determining the visual refraction needs of young children or other unreliable observers by means of retinoscopy or the Ophthalmetron, an automatic instrument that can be operated by a technician with no knowledge of refraction. (DB)
High-Temperature Optical Sensor
NASA Technical Reports Server (NTRS)
Adamovsky, Grigory; Juergens, Jeffrey R.; Varga, Donald J.; Floyd, Bertram M.
2010-01-01
A high-temperature optical sensor (see Figure 1) has been developed that can operate at temperatures up to 1,000 °C. The sensor development process consists of two parts: packaging of a fiber Bragg grating into a housing that allows a more sturdy, thermally stable device, and a technological process to which the device is subjected in order to meet environmental requirements of several hundred °C. This technology uses a newly discovered phenomenon, the formation of thermally stable secondary Bragg gratings in communication-grade fibers at high temperatures, to construct robust, optical, high-temperature sensors. Testing and performance evaluation (see Figure 2) of packaged sensors demonstrated operability of the devices at 1,000 °C for several hundred hours, and during numerous thermal cycles from 400 to 800 °C with different heating rates. The technology significantly extends the applicability of optical sensors to high-temperature environments including ground testing of engines, flight propulsion control, thermal protection monitoring of launch vehicles, etc. It may also find applications in such non-aerospace arenas as monitoring of nuclear reactors, furnaces, chemical processes, and other high-temperature environments where other measurement techniques are either unreliable, dangerous, undesirable, or unavailable.
Bi-alkali antimonide photocathode growth: An X-ray diffraction study
Schubert, Susanne; Wong, Jared; Feng, Jun; ...
2016-07-21
Bi-alkali antimonide photocathodes are one of the best known sources of electrons for high current and/or high bunch charge applications like Energy Recovery Linacs or Free Electron Lasers. Despite their high quantum efficiency in visible light and low intrinsic emittance, the surface roughness of these photocathodes prohibits their use as low emittance cathodes in high accelerating gradient superconducting and normal conducting radio frequency photoguns and limits the minimum possible intrinsic emittance near the threshold. Also, the growth process for these materials is largely based on recipes obtained by trial and error and is very unreliable. In this paper, using X-ray diffraction, we investigate the different structural and chemical changes that take place during the growth process of the bi-alkali antimonide material K2CsSb. Our measurements give us a deeper understanding of the growth process of alkali-antimonide photocathodes, allowing us to optimize it with the goal of minimizing the surface roughness to preserve the intrinsic emittance at high electric fields and increasing its reproducibility.
Li, Jun; Lin, Qiu-Hua; Kang, Chun-Yu; Wang, Kai; Yang, Xiu-Ting
2018-03-18
Direction of arrival (DOA) estimation is the basis for underwater target localization and tracking using towed line array sonar devices. A method of DOA estimation for underwater wideband weak targets based on coherent signal subspace (CSS) processing and compressed sensing (CS) theory is proposed. Under the CSS processing framework, wideband frequency focusing is accomplished by a two-sided correlation transformation, allowing the DOA of underwater wideband targets to be estimated based on the spatial sparsity of the targets and a compressed sensing reconstruction algorithm. Through analysis and processing of simulation data and marine trial data, it is shown that this method can accomplish the DOA estimation of underwater wideband weak targets. Results also show that this method can considerably improve the spatial spectrum of weak target signals, enhancing the ability to detect them. It can solve the problems of low directional resolution and unreliable weak-target detection in traditional beamforming technology. Compared with the conventional minimum variance distortionless response (MVDR) beamformer, this method has many advantages, such as higher directional resolution, wider detection range, fewer required snapshots, and more accurate detection of weak targets.
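For context on the baseline the authors compare against, the MVDR spatial spectrum P(θ) = 1 / (aᴴ R⁻¹ a) for a narrowband uniform linear array can be sketched in numpy. The array size, source angle, SNR, and snapshot count below are illustrative assumptions, not the sea-trial configuration:

```python
import numpy as np

rng = np.random.default_rng(0)
N, snap = 8, 2000       # sensors and snapshots (illustrative)
d = 0.5                 # element spacing in wavelengths (half-wavelength ULA)
theta_true = 20.0       # assumed source DOA in degrees

def steering(theta_deg):
    """ULA steering vector for a plane wave arriving from theta (broadside = 0 deg)."""
    k = 2.0 * np.pi * d * np.sin(np.radians(theta_deg))
    return np.exp(1j * k * np.arange(N))

# Simulate one narrowband source in white noise.
s = (rng.standard_normal(snap) + 1j * rng.standard_normal(snap)) / np.sqrt(2.0)
noise = 0.1 * (rng.standard_normal((N, snap)) + 1j * rng.standard_normal((N, snap)))
X = np.outer(steering(theta_true), s) + noise

# Sample covariance and MVDR spatial spectrum P(theta) = 1 / (a^H R^-1 a).
R = X @ X.conj().T / snap
Rinv = np.linalg.inv(R)
grid = np.arange(-90.0, 90.0, 0.5)
P = np.array([1.0 / np.real(steering(th).conj() @ Rinv @ steering(th)) for th in grid])
theta_hat = grid[np.argmax(P)]
```

The spectrum peaks at the simulated bearing; the sparse CS reconstruction described in the abstract replaces this covariance-inversion scan with a sparsity-constrained recovery over the same angular grid.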
Drury, J P; Grether, G F; Garland, T; Morlon, H
2018-05-01
Much ecological and evolutionary theory predicts that interspecific interactions often drive phenotypic diversification and that species phenotypes in turn influence species interactions. Several phylogenetic comparative methods have been developed to assess the importance of such processes in nature; however, the statistical properties of these methods have gone largely untested. Focusing mainly on scenarios of competition between closely-related species, we assess the performance of available comparative approaches for analyzing the interplay between interspecific interactions and species phenotypes. We find that many currently used statistical methods often fail to detect the impact of interspecific interactions on trait evolution, that sister-taxa analyses are particularly unreliable in general, and that recently developed process-based models have more satisfactory statistical properties. Methods for detecting predictors of species interactions are generally more reliable than methods for detecting character displacement. In weighing the strengths and weaknesses of different approaches, we hope to provide a clear guide for empiricists testing hypotheses about the reciprocal effect of interspecific interactions and species phenotypes and to inspire further development of process-based models.
Hendriks, Friederike; Kienhues, Dorothe; Bromme, Rainer
2015-01-01
Given their lack of background knowledge, laypeople require expert help when dealing with scientific information. To decide whose help is dependable, laypeople must judge an expert’s epistemic trustworthiness in terms of competence, adherence to scientific standards, and good intentions. Online, this may be difficult due to the often limited and sometimes unreliable source information available. To measure laypeople’s evaluations of experts (encountered online), we constructed an inventory to assess epistemic trustworthiness on the dimensions expertise, integrity, and benevolence. Exploratory (n = 237) and confirmatory factor analyses (n = 345) showed that the Muenster Epistemic Trustworthiness Inventory (METI) is composed of these three factors. A subsequent experimental study (n = 137) showed that all three dimensions of the METI are sensitive to variation in source characteristics. We propose using this inventory to measure assignments of epistemic trustworthiness, that is, all judgments laypeople make when deciding whether to place epistemic trust in, and defer to, an expert in order to solve a scientific informational problem that is beyond their understanding. PMID:26474078
Lessons Learned from Monitoring Drought in Data Sparse Regions in the United States
NASA Astrophysics Data System (ADS)
Edwards, L. M.; Redmond, K. T.
2011-12-01
Drought monitoring in the geographic domain represented by the Western Regional Climate Center (WRCC) in the United States can serve as an example of many of the challenges that face a global drought early warning system (GDEWS). The WRCC area includes numerous climate regions, such as the Pacific coast of the continental U.S., the lowest elevation in North America, arid and alpine environments, temperate rainforest, Alaska, Hawaii, and the Pacific territories of the U.S. in the tropics. This area is quite diverse in its climatological regimes, from rainforest to high desert to tundra, and covers a large area of land and water. Drought in the WRCC domain affects a wide range of constituents and interests, and the complex interplay between "human-caused" and natural drought cannot be overstated. Data to support a GDEWS, as in the WRCC region, are often non-existent or unreliable in remote locations. Even in the continental U.S., data are not as dense as the topography and climate zones demand for accurate drought assessment. Challenges and efforts to address drought monitoring at the WRCC will be presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Levin, Barnaby D. A.; Zachman, Michael J.; Werner, Jörg G.
Abstract Lithium sulfur (Li–S) batteries have the potential to provide higher energy storage density at lower cost than conventional lithium ion batteries. A key challenge for Li–S batteries is the loss of sulfur to the electrolyte during cycling. This loss can be mitigated by sequestering the sulfur in nanostructured carbon–sulfur composites. The nanoscale characterization of the sulfur distribution within these complex nanostructured electrodes is normally performed by electron microscopy, but sulfur sublimates and redistributes in the high-vacuum conditions of conventional electron microscopes. The resulting sublimation artifacts render characterization of sulfur in conventional electron microscopes problematic and unreliable. Here, we demonstrate two techniques, cryogenic transmission electron microscopy (cryo-TEM) and scanning electron microscopy in air (airSEM), that enable the reliable characterization of sulfur across multiple length scales by suppressing sulfur sublimation. We use cryo-TEM and airSEM to examine carbon–sulfur composites synthesized for use as Li–S battery cathodes, noting several cases where the commonly employed sulfur melt infusion method is highly inefficient at infiltrating sulfur into porous carbon hosts.
The pharmacotherapy of male hypogonadism besides androgens.
Corona, Giovanni; Rastrelli, Giulia; Maggi, Mario
2015-02-01
Adulthood male hypogonadism (HG) is the most common form of HG. Although testosterone (T) replacement therapy (TRT) is the most common way of treating HG, other options are available depending on the patient's needs and expectations. We analyze alternative options to TRT as a medical intervention in treating HG. Gonadotropin (Gn) therapy is the treatment of choice in men with secondary HG (sHG) who require fertility. Gonadotropin-releasing hormone therapy represents an alternative to Gn for inducing spermatogenesis in patients with sHG; however, its use is limited by poor patient compliance and high cost. In obese HG men, lifestyle modifications and, in particular, weight loss should be the first step. Recent data suggest that antiestrogens represent a successful treatment for sHG. Other potential therapeutic options include the stimulation of hypothalamic activity (i.e., kisspeptin and neurokinin-B agonists). Conversely, the possibility of increasing Leydig cell steroid production independently from Gn stimulation seems unreliable. Understanding the nature of male HG and the patient's needs is mandatory before choosing among treatment options. For primary HG only TRT is advisable, whereas for the secondary form several alternative possibilities can be offered.
Evaluation of a follow-up protocol for patients on chloroquine and hydroxychloroquine treatment.
Sanabria, M R; Toledo-Lucho, S C
2016-01-01
To review the problems found after a new follow-up protocol for patients on chloroquine and hydroxychloroquine treatment, a retrospective study was conducted between May 2012 and January 2013 on the clinical files, retinographies, fundus autofluorescence (FAF) images, and central 10-degree visual fields (VF) of patients who were referred to the Ophthalmology Department after starting treatment with hydroxychloroquine. One hundred twenty-six patients were included; 94.4% were referred from the Rheumatology Department and 5.6% from Dermatology. Mean age was 59.7 years, and 73.8% were women. All were on hydroxychloroquine treatment, with 300 mg the most frequent daily dose. Rheumatoid arthritis was the most common diagnosis (40.5%), followed by systemic lupus erythematosus (15.9%). The mean Snellen visual acuity was 0.76, and 26 patients had lens opacities. The VF were normal in 97 patients, 8 had mild to moderate defects with no definite pattern, and in 9 the results were unreliable. Of the 51 patients older than 65 years, 16 (31.4%) had altered or unreliable VF. The FAF was normal in 104 patients (82.5%), and abnormal but consistent with ophthalmoscopic features in 12 patients (pathological myopia, age-related changes, and early, intermediate or late age-related macular degeneration). Visual fields as a reference test for the diagnosis of AP toxicity are not quite reliable for patients over 65. Therefore, FAF is recommended as the primary test, perhaps combined with another objective test such as SD-OCT, instead of VF. Copyright © 2015 Sociedad Española de Oftalmología. Published by Elsevier España, S.L.U. All rights reserved.
Effect of tilt on strong motion data processing
Graizer, V.M.
2005-01-01
In the near-field of an earthquake the effects of the rotational components of ground motion may not be negligible compared to the effects of translational motions. Analyses of the equations of motion of horizontal and vertical pendulums show that horizontal sensors are sensitive not only to translational motion but also to tilts. Ignoring this tilt sensitivity may produce unreliable results, especially in calculations of permanent displacements and long-period calculations. In contrast to horizontal sensors, vertical sensors do not have these limitations, since they are less sensitive to tilts. In general, only six-component systems measuring rotations and accelerations, or three-component systems similar to systems used in inertial navigation assuring purely translational motion of accelerometers, can be used to calculate residual displacements. © 2004 Elsevier Ltd. All rights reserved.
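The tilt cross-sensitivity described above can be illustrated numerically: a horizontal pendulum records a_rec = a_true + g·θ(t), so double-integrating a record contaminated by even a small permanent tilt produces a spurious, quadratically growing "displacement". The tilt magnitude and timing below are illustrative values, not data from the paper:

```python
import numpy as np

g = 9.81            # gravitational acceleration, m/s^2
dt = 0.01           # sample interval, s
t = np.arange(0.0, 20.0, dt)

# True translational acceleration: zero (the ground ends up at rest).
a_true = np.zeros_like(t)

# An assumed permanent tilt of 1 milliradian appearing at t = 5 s.
tilt = np.where(t >= 5.0, 1e-3, 0.0)

# What the horizontal sensor actually records.
a_rec = a_true + g * tilt

def double_integrate(a, dt):
    """Naive rectangle-rule double integration, as in routine record processing."""
    v = np.cumsum(a) * dt
    return np.cumsum(v) * dt

disp = double_integrate(a_rec, dt)
# The apparent displacement grows as ~0.5 * g * tilt * (t - 5)^2 after the
# tilt step, even though no translation occurred.
```

After 15 s of a 1 mrad tilt the spurious displacement already exceeds a meter, which is why permanent-displacement estimates from uncorrected horizontal records are unreliable.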
More on the lambda 2800 A 'interstellar extinction' feature
NASA Astrophysics Data System (ADS)
McLachlan, A.; Nandy, K.
1985-02-01
In a response made to a recent letter by Karim et al. (1984), it is shown that the examples of interstellar absorption at 2800 A that they attribute to proteinaceous material can all be attributed to overexposure of IUE detectors. It is pointed out that stars in the Large Magellanic Cloud show pronounced absorption at 2800 A which cannot be due to interstellar protein since there is no associated absorption at 2200 A; this lack of absorption cannot be due to the presence of graphite, whose absorption is weak in the Cloud. The claim by Karim et al. that the spectra of eight stars show 2800 A absorption and that these spectra are saturation-free is considered, and it is shown that data processing problems at IUE ground stations make these spectra unreliable.
A review of the impact of the environment on aerogenerator materials
NASA Astrophysics Data System (ADS)
Mortimer, A. R.
Factors which contribute to the unreliability of wind-powered generators are examined, with specific regard to the availability of materials, durability, cost, ease of production, and ease of repair. The effects of wind loading and methods for testing salt air contaminants are discussed, along with the effects of moisture, of salt air on lubricants, of microbiological attack, of UV radiation, rain erosion, and icing. The probability of bird strikes is statistically defined, and consideration is given to electrostatic charging, lightning strikes, temperature changes, the corrosion of rubber by ozone, the effects of guano, and wet and dry pollution. The visual, EM, and acoustic effects of wind turbines are explored, and production processes which may affect the integrity of the structure are outlined. Finally, failure mechanisms due to salt-air environments are detailed.
Bemis, David A; Greenacre, Cheryl B; Bryant, Mary Jean; Jones, Rebekah D; Kania, Stephen A
2011-01-01
Isolates of gram-negative anaerobic bacteria from reptiles have only occasionally been identified to the genus and species level in the veterinary medical literature. In particular, reports identifying Porphyromonas spp. from infections in reptiles are scarce. The present report describes unique Porphyromonas isolates obtained from necrosuppurative infections in central bearded dragons (Pogona vitticeps). The isolates grew in the presence of oxygen, were strongly hemolytic, and did not produce detectable black, iron porphyrin pigment. Biochemical identification kit numeric biocodes gave high but unreliable probabilities (>99.9%) for identification as Porphyromonas gingivalis. Partial 16S ribosomal RNA gene sequences of the isolates were identical to each other and shared 91% identity with those of Porphyromonas gulae. The isolates may represent a new reptile-associated Porphyromonas species.
32 CFR Appendix D to Part 154 - Reporting of Nonderogatory Cases
Code of Federal Regulations, 2010 CFR
2010-07-01
... abuse of drugs or alcohol, theft or dishonesty, unreliability, irresponsibility, immaturity, instability... promiscuity, aberrant, deviant, or bizarre sexual conduct or behavior, transvestitism, transsexualism, indecent exposure, rape, contributing to the delinquency of a minor, child molestation, wife-swapping...
Social media and health care professionals: benefits, risks, and best practices.
Ventola, C Lee
2014-07-01
Health care professionals can use a variety of social media tools to improve or enhance networking, education, and other activities. However, these tools also present some potential risks, such as unreliable information and violations of patients' privacy rights.
DOT National Transportation Integrated Search
2014-01-01
The second Strategic Highway Research Program (SHRP 2) Reliability program aims to improve trip time reliability by reducing the frequency and effects of events that cause travel times to fluctuate unpredictably. Congestion caused by unreliable, or n...
Alternatives to Piloting Textbooks.
ERIC Educational Resources Information Center
Muther, Connie
1985-01-01
Using short-term pilot programs to evaluate textbooks can lead to unreliable results and interfere with effective education. Alternative methods for evaluating textbook-based programs include obtaining documented analyses of competitors' products from sales agents, visiting districts using programs being considered, and examining publishers' own…
An impoverished machine: challenges to human learning and instructional technology.
Taraban, Roman
2008-08-01
Many of the limitations to human learning and processing identified by cognitive psychologists over the last 50 years still hold true, including computational constraints, low learning rates, and unreliable processing. Instructional technology can be used in classrooms and in other learning contexts to address these limitations to learning. However, creating technological innovations is not enough. As part of psychological science, the development and assessment of instructional systems should be guided by theories and practices within the discipline. The technology we develop should become an object of research like other phenomena that are studied. In the present article, I present an informal account of my own work in assessing instructional technology for engineering thermodynamics to show not only the benefits, but also the limitations, in studying the technology we create. I conclude by considering several ways of advancing the development of instructional technology within the SCiP community, including interdisciplinary research and envisioning learning contexts that differ radically from traditional learning focused on lectures and testing.
Improved cancer diagnostics by different image processing techniques on OCT images
NASA Astrophysics Data System (ADS)
Kanawade, Rajesh; Lengenfelder, Benjamin; Marini Menezes, Tassiana; Hohmann, Martin; Kopfinger, Stefan; Hohmann, Tim; Grabiec, Urszula; Klämpfl, Florian; Gonzales Menezes, Jean; Waldner, Maximilian; Schmidt, Michael
2015-07-01
Optical-coherence tomography (OCT) is a promising non-invasive, high-resolution imaging modality which can be used for cancer diagnosis and its therapeutic assessment. However, speckle noise makes detection of cancer boundaries and image segmentation problematic and unreliable. Therefore, to improve the image analysis for a precise cancer border detection, the performance of different image processing algorithms such as mean, median, hybrid median filter and rotational kernel transformation (RKT) for this task is investigated. This is done on OCT images acquired from an ex-vivo human cancerous mucosa and in vitro by using cultivated tumour applied on organotypical hippocampal slice cultures. The preliminary results confirm that the border between the healthy and the cancer lesions can be identified precisely. The obtained results are verified with fluorescence microscopy. This research can improve cancer diagnosis and the detection of borders between healthy and cancerous tissue. Thus, it could also reduce the number of biopsies required during screening endoscopy by providing better guidance to the physician.
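As a minimal illustration of the kind of smoothing compared in the study, a median filter applied to synthetic multiplicative speckle reduces the error against the clean image while preserving the lesion boundary better than linear averaging would. The synthetic image, exponential noise model, and kernel size below are assumptions for illustration, not the study's data or its RKT method:

```python
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(1)

# Synthetic "tissue" image: a brighter square lesion on a darker background.
clean = np.full((64, 64), 0.3)
clean[20:44, 20:44] = 0.8

# Multiplicative exponential speckle, a common model for coherent-imaging noise.
speckled = clean * rng.exponential(1.0, clean.shape)

# 5x5 median filter, one of the denoising approaches compared in the paper.
filtered = median_filter(speckled, size=5)

# Mean squared error against the clean reference, before and after filtering.
err_noisy = np.mean((speckled - clean) ** 2)
err_filtered = np.mean((filtered - clean) ** 2)
```

The filtered error drops well below the raw speckled error, which is the basic effect the paper exploits before attempting border detection between healthy and cancerous regions.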
Primary path reservation using enhanced slot assignment in TDMA for session admission.
Koneri Chandrasekaran, Suresh; Savarimuthu, Prakash; Andi Elumalai, Priya; Ayyaswamy, Kathirvel
2015-01-01
A mobile ad hoc network (MANET) is a self-organized collection of nodes that communicate without any infrastructure. Providing quality of service (QoS) in such networks is a challenging task due to unreliable wireless links, mobility, lack of centralized coordination, and channel contention. The success of many real time applications depends entirely on QoS, which can be achieved by quality aware routing (QAR) and admission control (AC). Recently proposed QoS mechanisms focus entirely on either reservation or admission control and remain insufficient. In MANETs, high mobility causes frequent path breaks, forcing the source node to rediscover the route each time; in such cases the QoS session is affected. To admit a QoS session, admission control protocols must ensure the bandwidth of the relaying path before transmission starts; reserving such bandwidth noticeably improves admission control performance. Many TDMA based reservation mechanisms have been proposed but need improvement in their slot reservation procedures. To overcome this specific issue, we propose a framework, PRAC (primary path reservation admission control protocol), which achieves improved QoS by making use of a backup route combined with resource reservation. A network topology has been simulated, and our approach proves to be a mechanism that admits sessions effectively.
Feliciano, David V
2017-11-01
Although abdominal trauma has been described since antiquity, formal laparotomies for trauma were not performed until the 1800s. Even with the introduction of general anesthesia in the United States during the years 1842 to 1846, laparotomies for abdominal trauma were not performed during the Civil War. The first laparotomy for an abdominal gunshot wound in the United States was finally performed in New York City in 1884. An aggressive operative approach to all forms of abdominal trauma until the establishment of formal trauma centers (where data were analyzed) resulted in extraordinarily high rates of nontherapeutic laparotomies from the 1880s to the 1960s. More selective operative approaches to patients with abdominal stab wounds (1960s), blunt trauma (1970s), and gunshot wounds (1990s) were then developed. Current adjuncts to the diagnosis of abdominal trauma when serial physical examinations are unreliable include the following: 1) diagnostic peritoneal tap/lavage; 2) surgeon-performed ultrasound examination; 3) contrast-enhanced CT of the abdomen and pelvis; and 4) diagnostic laparoscopy. Operative techniques for injuries to the liver, spleen, duodenum, and pancreas have been refined considerably since World War II. These need to be emphasized repeatedly in an era when fewer patients undergo laparotomy for abdominal trauma. Finally, abdominal trauma damage control is a valuable operative approach in patients with physiologic exhaustion and multiple injuries.
Recognition memory: a review of the critical findings and an integrated theory for relating them.
Malmberg, Kenneth J
2008-12-01
The development of formal models has aided theoretical progress in recognition memory research. Here, I review the findings that are critical for testing them, including behavioral and brain imaging results of single-item recognition, plurality discrimination, and associative recognition experiments under a variety of testing conditions. I also review the major approaches to measurement and process modeling of recognition. The review indicates that several extant dual-process measures of recollection are unreliable, and thus they are unsuitable as a basis for forming strong conclusions. At the process level, however, the retrieval dynamics of recognition memory and the effect of strengthening operations suggest that a recall-to-reject process plays an important role in plurality discrimination and associative recognition, but not necessarily in single-item recognition. A new theoretical framework proposes that the contribution of recollection to recognition depends on whether the retrieval of episodic details improves accuracy, and it organizes the models around the construct of efficiency. Accordingly, subjects adopt strategies that they believe will produce a desired level of accuracy in the shortest amount of time. Several models derived from this framework are shown to account for the accuracy, latency, and confidence with which the various recognition tasks are performed.
Genetic screening in sporadic ALS and FTD.
Turner, Martin R; Al-Chalabi, Ammar; Chio, Adriano; Hardiman, Orla; Kiernan, Matthew C; Rohrer, Jonathan D; Rowe, James; Seeley, William; Talbot, Kevin
2017-12-01
The increasing complexity of the genetic landscape in amyotrophic lateral sclerosis (ALS) and frontotemporal dementia (FTD) presents a significant resource and physician training challenge. At least 10% of those diagnosed with ALS or FTD are known to carry an autosomal dominant genetic mutation. There is no consensus on what constitutes a positive family history, and ascertainment is unreliable for many reasons. However, symptomatic individuals often wish to understand as much as possible about the cause of their disease, and to share this knowledge with their family. While the right of an individual not to know is a key aspect of patient autonomy, and despite the absence of definitive therapy, many newly diagnosed individuals are likely to elect for genetic testing if offered. It is incumbent on the practitioner to ensure that they are adequately informed, counselled and supported in this decision. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
CASAROTO, ANA REGINA; SAMPIERI, MARCELO BONIFACIO DA SILVA; SOARES, CLEVERSON TEIXEIRA; SANTOS, PAULO SERGIO DA SILVA; YAEDU, RENATO YASSUTAKA FARIA; DAMANTE, JOSÉ HUMBERTO; LARA, VANESSA SOARES
2017-01-01
Due to the low incidence of the Ewing’s Sarcoma (ES) family tumors, the available epidemiology is likely to be unreliable, and at present, there are no standard diagnostic or clinical guidelines outlining their management. This report describes a case of peripheral primitive neuroectodermal tumor (ES/pPNET) that initially mimicked cystic lesions, and presents a comparison between ES and ES/pPNET in the jaws based on the World Health Organization classification. This review addressed 63 cases published in the English literature between 1950 and 2016. The majority of cases were ES. Both ES and ES/pPNET mimicked other benign entities such as traumatic, cystic and inflammatory lesions. The patients who died of their disease had a history of metastatic tumors, and primary tumor located in the mandible and maxilla for ES and ES/pPNET, respectively. The differentiation of the ES family tumors from other small blue-cell tumors may be difficult and requires familiarity with histological and immunohistochemical features. PMID:28438883
Estimates of runoff using water-balance and atmospheric general circulation models
Wolock, D.M.; McCabe, G.J.
1999-01-01
The effects of potential climate change on mean annual runoff in the conterminous United States (U.S.) are examined using a simple water-balance model and output from two atmospheric general circulation models (GCMs). The two GCMs are from the Canadian Centre for Climate Prediction and Analysis (CCC) and the Hadley Centre for Climate Prediction and Research (HAD). In general, the CCC GCM climate results in decreases in runoff for the conterminous U.S., and the HAD GCM climate produces increases in runoff. These estimated changes in runoff primarily are the result of estimated changes in precipitation. The changes in mean annual runoff, however, mostly are smaller than the decade-to-decade variability in GCM-based mean annual runoff and errors in GCM-based runoff. The differences in simulated runoff between the two GCMs, together with decade-to-decade variability and errors in GCM-based runoff, cause the estimates of changes in runoff to be uncertain and unreliable.
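The water-balance logic the study relies on can be sketched in a few lines: runoff is what remains of precipitation after actual evapotranspiration, which is capped by both supply and demand. This is a deliberately minimal annual sketch (the authors' model and the GCM inputs are not reproduced here); the numbers are illustrative only.

```python
def annual_runoff(precip_mm, pet_mm):
    """Minimal annual water balance: runoff = precipitation minus actual
    evapotranspiration (AET). AET cannot exceed either the water supplied
    (P) or the atmospheric demand (PET); soil-moisture storage is ignored."""
    aet = min(pet_mm, precip_mm)
    return precip_mm - aet

# Illustrative perturbations in the spirit of the two GCM climates:
baseline = annual_runoff(800.0, 600.0)  # 200.0 mm
drier = annual_runoff(720.0, 600.0)     # CCC-like precipitation decrease -> 120.0 mm
wetter = annual_runoff(880.0, 600.0)    # HAD-like precipitation increase -> 280.0 mm
```

Because AET is demand-limited in this sketch, every millimetre change in precipitation maps directly to runoff, mirroring the abstract's point that the estimated runoff changes are primarily the result of changes in precipitation.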
Quipus and System of Coordinated Precession
NASA Astrophysics Data System (ADS)
Campos, T. C.
2004-05-01
The Incas of ancient Peru possessed no writing. Instead, they developed a unique system of spatial arrays of colored knotted cords, called quipus, to record and transmit information throughout their vast empire. In their thorough description of quipus, Ascher & Ascher observed that in two cases the numbers registered in the strings bear a very special relationship to each other: for this to occur, the numbers must have been obtained by multiplying whole numbers by fractions or decimals, operations apparently beyond the arithmetic knowledge of the Incas. The quipus AS120 and AS143, which come from Ica (Peru) and are conserved in the Museum of Berlin, show exactly these characteristics. In AS143 there is a relationship with the parameters of a system of coordinated precession: the tilt of Earth's spin axis (40036), the eccentricity of Earth's orbit (97357), and the precession of the equinoxes (between 18504 and 23098). For the history of the Earth, a natural chronometer is needed to coordinate and classify observations, and this chronometer is the vernal point, defined as "a sensitive axis of maximum conductivity". Its stability is demonstrated by the stability of the geomagnetic equator (where the inclination of the field is zero degrees), calculated for 1939 with the IGRF from 1900 to 2004 and confirmed by tabulated data of the Geophysical Institute of Huancayo (Peru) from that date up to this year (2004): the geomagnetic equator fluctuates between 12° and 14° South, whereas over Brazil it has advanced very quickly toward the north. At an altitude of approximately 108 km lies the equatorial electrojet, which in South America is most intense at the equinoxes.
From the point of view of the precession of the equinoxes, this stability coincides with the entrance of the apparent Sun into the constellation of Aquarius. This mechanism is the basis for establishing a system of coordinated precession that also takes into account the tilt of Earth's spin axis, the eccentricity of Earth's orbit, and the precession of the equinoxes. Together these yield a complex curve for the solar constant at different latitudes, as first suggested by Croll (1875), Milankovitch (1920, 1930), Zeuner (e.g. 1945), and others.
Guo, Qi; Wei, Hai-Zhen; Jiang, Shao-Yong; Hohl, Simon; Lin, Yi-Bo; Wang, Yi-Jing; Li, Yin-Chuan
2017-12-19
Beyond extensive studies of core formation and volatile-element depletion processes using radiogenic Ag isotopes (i.e., the Pd-Ag chronometer), recent research has revealed that the mass fractionation of silver isotopes is in principle controlled by physicochemical processes (e.g., evaporation, diffusion, chemical exchange) during magmatic emplacement and hydrothermal alteration. As these geologic processes produce only very minor variations of δ109Ag, from -0.5 to +1.1‰, more accurate and precise measurements are required. In this work, a robust linear relationship between the instrumental mass discrimination of Ag and Pd isotopes was obtained at an Ag/Pd molar ratio of 1:20. In Au-Ag ore deposits, silver minerals have complex paragenetic relationships with other minerals (e.g., chalcopyrite, sphalerite, galena, pyrite). It is difficult to remove such abundant impurities completely because the other metals are tens to thousands of times richer than silver. Both a quantitative evaluation of matrix effects and a modification of the chemical chromatography were carried out to deal with these problems. Isobaric interferences (e.g., 65Cu40Ar+ on 105Pd, 208Pb2+ on 104Pd, and 67Zn40Ar+ on 107Ag+) and space-charge effects dramatically shift the measured δ109Ag values. The selection of alternative Pd isotope pairs is effective in eliminating spectral matrix effects, ensuring accurate analysis over the largest possible ranges of metal impurities: Cu/Ag ≤ 50:1, Fe/Ag ≤ 600:1, Pb/Ag ≤ 10:1, and Zn/Ag ≤ 1:1, respectively. With the modified procedure, we report silver isotope compositions (δ109Ag) in geological standard materials and typical Au-Ag ore deposit samples varying from -0.029 to +0.689‰, with an external reproducibility of ±0.009-0.084‰. A systematic survey of δ109Ag (or ε109Ag) variations in rocks, ore deposits, and environmental materials in nature is discussed.
Prospecting for Precious Metals in Ultra-Metal-Poor Stars
NASA Astrophysics Data System (ADS)
French, R. S.
2000-05-01
The chemical compositions of the most metal-poor halo stars are living records of the very early nucleosynthetic history of the Galaxy. Only a few prior generations, if not a single one, of element-donating supernovae could have been responsible for the heavy elements observed in ultra-metal-poor (UMP; [Fe/H] < -2.5) stars. Abundances of the heavy neutron-capture elements (Z > 30) can yield direct information about the supernova progenitors of UMP stars, and abundances of unstable thorium and uranium (Z = 90, 92) can potentially provide age estimates for the Galactic halo. Many studies have already demonstrated that abundances of rare-earth elements (56 <= Z <= 72) in UMP stars are completely consistent with their production in rapid neutron-capture synthesis (r-process) events, usually believed to occur during supernova explosions. Therefore, mapping the entire abundance pattern of UMP stars is of significant interest. In particular, abundances of the most massive stable elements (Os -> Pb, or 76 <= Z <= 82) could provide crucial information about the so-called ``third r-process peak,'' and are critical to the radioactive-dating technique that uses unstable thorium as a chronometer. Until recently, abundance determinations for these elements have been virtually non-existent, as the strongest relevant transitions lie in the vacuum UV, inaccessible to ground-based observation. The availability of high-resolution space-based spectrometers has opened up new regions of spectral coverage, including precisely the wavelength range needed to make these sensitive measurements. We have undertaken a study of about 10 metal-poor halo giants to determine the abundances of several of the heaviest neutron-capture elements, including platinum, osmium, lead, and gold. Preliminary results indicate that the abundance pattern of heavy neutron-capture elements (56 <= Z <= 82) in UMP stars does mimic a scaled solar system r-process.
Thus, the ability to estimate the initial abundances of thorium and uranium is greatly reinforced.
Age validation of quillback rockfish (Sebastes maliger) using bomb radiocarbon
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerr, L A; Andrews, A H; Munk, K
2005-01-05
Rockfishes (Sebastes spp.) support one of the most economically important fisheries of the Pacific Northwest, and it is essential for sustainable management that age estimation procedures be validated for these species. Atmospheric testing of thermonuclear devices during the 1950s and 1960s created a global radiocarbon (14C) signal in the ocean environment that scientists have identified as a useful tracer and chronological marker in natural systems. In this study, we first demonstrated that fewer samples are necessary for age validation using the bomb-generated 14C signal by emphasizing the utility of the time-specific marker created by the initial rise of bomb-14C. Second, the bomb-generated 14C signal retained in fish otoliths was used to validate the age and age estimation methodology of the quillback rockfish (Sebastes maliger) in the waters of southeast Alaska. Radiocarbon values from the first year's growth of quillback rockfish otoliths were plotted against estimated birth year, producing a 14C time series spanning 1950 to 1985. The initial rise of bomb-14C from pre-bomb levels (~ -90‰) occurred in 1959 ± 1 year, and 14C levels rose relatively rapidly to peak Δ14C values in 1967 (+105.4‰), with a subsequent declining trend through the end of the record in 1985 (+15.4‰). The agreement between the year of initial rise of 14C levels from the quillback rockfish record and the chronometer determined for the waters of southeast Alaska from yelloweye rockfish (S. ruberrimus) otoliths validated the ageing methodology for the quillback rockfish.
The concordance of the entire quillback rockfish 14C record with the yelloweye rockfish time series demonstrated the effectiveness of this age validation technique, confirmed the longevity of the quillback rockfish to a minimum of 43 years, and strongly supports higher age estimates of up to 90 years.
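The time-specific marker the study exploits, the year a Δ14C record first rises above pre-bomb levels, can be located with a simple threshold scan. This is a hedged illustration, not the authors' procedure; the threshold and the example values below are hypothetical, merely shaped like the series described in the abstract.

```python
def year_of_initial_rise(years, d14c, prebomb=-90.0, threshold=20.0):
    """Return the first birth year whose otolith Δ14C (per mil) exceeds the
    pre-bomb level by more than `threshold` -- a stand-in for the
    time-specific marker used in bomb-radiocarbon age validation."""
    for year, value in sorted(zip(years, d14c)):
        if value > prebomb + threshold:
            return year
    return None

# Hypothetical Δ14C series: pre-bomb plateau, rise, 1967 peak, decline.
years = [1950, 1955, 1958, 1959, 1963, 1967, 1975, 1985]
d14c = [-92.0, -90.5, -88.0, -60.0, 40.0, 105.4, 60.0, 15.4]
rise_year = year_of_initial_rise(years, d14c)  # 1959
```

If the estimated ages were biased, every inferred birth year would shift, and the reconstructed rise year would disagree with the regional reference chronometer; comparing against the yelloweye rockfish record is exactly how the study validates the ageing methodology.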
Three essays on environmental and natural resource economics
NASA Astrophysics Data System (ADS)
Wang, Qiong (Juliana)
The doctoral dissertation is composed of three chapters on the governance of water and electricity infrastructure in China. All three chapters focus on the nexus of economy, environment, and energy. The first chapter studies the relationship between decentralization policies and the provision of public goods in the context of urban water services in China. Different degrees of externality of the public goods may affect the efficacy of decentralization policies. Using a comprehensive 2004 dataset for all 661 cities, I measure how the clean water supply coverage rate and the wastewater treatment rate respond to these policies respectively. Results show that cities respond positively in their piped water supply coverage but not as well in their wastewater treatment, whereas both respond positively to the mandatory information disclosure policy. The efficacy of decentralization policy is indeed compromised when externalities extend beyond the jurisdiction, as suggested by the case of wastewater. Information disclosure policy, a motivational tool tied to the promotion of local officials, is shown to provide strong incentives for water services irrespective of their externalities. Private sector participation lowers the amount of government grants in the water sector but increases the tariff charged to customers. The second chapter of the dissertation examines whether competition reduces cost in the restructuring of the Chinese power sector. Although competition may reduce cost through technological innovation and advancement and diversification of ownership, higher transaction costs and price control may hinder its effectiveness. In this chapter, I describe the various restructuring programs over the years that affect the power plants. Then, I evaluate their impacts on cost efficiency, measured by the factor demand of the power plants: labor, energy and materials.
Using an industrial dataset from 1997 to 2004 of energy consuming coal power plants from the National Statistics Bureau, I first estimate the factor demand equations following the model developed in Fabrizio et al. (2007) to compare with the results from similar studies in the United States. Further, I model the cost structure of Chinese power plants using a more flexible translog specification. The results from these two models confirm the validity of the assumptions made based on the industry characteristics. The power plants located in the South reduced their labor demand after the Southern Grid separated from the National Grid in 2002. The third chapter examines how the unreliability of inputs affects productivity. Specifically, it studies how Chinese industrial enterprises respond to the unreliability of electric power. Since 2002, electricity blackouts have been hampering the industrial customers in China. Using a survey dataset of the National Statistics Bureau on eleven industries across the nation from 1999 to 2004 and an electricity dataset compiled from Electricity Yearbooks, my co-authors and I estimate the cost of power unreliability by quantifying the factor-neutral and the factor-biased productivity effects. Incorporating unreliability proxies into a flexible translog cost function and the value share equations, we estimate the whole system using seemingly unrelated regressions (SUREG) with cross equation constraints. We also calculate the marginal effect of factor unreliability on cost and on carbon emissions based on these estimates.
18 CFR 806.23 - Standards for water withdrawals.
Code of Federal Regulations, 2010 CFR
2010-04-01
... of groundwater or stream flow levels; rendering competing supplies unreliable; affecting other water..., at its own expense, an alternate water supply or other mitigating measures. (iii) Require the project... deficiencies, identify alternative water supply options, and support existing and proposed future withdrawals. ...
ERIC Educational Resources Information Center
Coen, Frank
1969-01-01
The unreliability of first impressions and subjective judgments is the subject of both Jane Austen's "Pride and Prejudice" and Lionel Trilling's "Of This Time, Of That Place"; consequently, the works are worthwhile parallel studies for high school students. Austen, by means of irony and subtle characterization, dramatizes the…
Morphology delimits more species than molecular genetic clusters of invasive Pilosella
USDA-ARS?s Scientific Manuscript database
Premise of the study: Reliable identifications of invasive species are essential for effective management. Several species of Pilosella (syn. Hieracium, Asteraceae) hawkweeds invade North America, where unreliable identification hinders their control. Here we ask (i) do morphological traits dependab...
Procedure for Failure Mode, Effects, and Criticality Analysis (FMECA)
NASA Technical Reports Server (NTRS)
1966-01-01
This document provides guidelines for the accomplishment of Failure Mode, Effects, and Criticality Analysis (FMECA) on the Apollo program. It is a procedure for analysis of hardware items to determine those items contributing most to system unreliability and crew safety problems.
Second Thoughts at Women's Colleges.
ERIC Educational Resources Information Center
Gose, Ben
1995-01-01
Despite a rise in enrollments at women's colleges nationwide, there is concern that the applicant pool is weakening. Average college entrance test scores of freshmen have dropped considerably since 1968. Some see research comparing women's performance at single-sex and coeducational colleges as unreliable. (MSE)
FabricS: A user-friendly, complete and robust software for particle shape-fabric analysis
NASA Astrophysics Data System (ADS)
Moreno Chávez, G.; Castillo Rivera, F.; Sarocchi, D.; Borselli, L.; Rodríguez-Sedano, L. A.
2018-06-01
Shape-fabric is a textural parameter related to the spatial arrangement of elongated particles in geological samples. Its usefulness spans a range from sedimentary petrology to igneous and metamorphic petrology. Independently of the process being studied, when a material flows, elongated particles are oriented with their major axis in the direction of flow. In sedimentary petrology this information has been used in studies of the paleo-flow direction of turbidites, the origin of quartz sediments, and the location of ignimbrite vents, among others. In addition to flow direction and its polarity, the method enables flow rheology to be inferred. The use of shape-fabric has been limited by the difficulty of measuring particles automatically and analyzing them with reliable circular statistics programs, which dampened interest in the method for a long time. Shape-fabric measurement has increased in popularity since the 1980s thanks to the development of new image analysis techniques and circular statistics software. However, the programs currently available are unreliable, outdated, incompatible with newer operating systems, or require programming skills. The goal of our work is to develop a user-friendly program, in the MATLAB environment, with a graphical user interface, that can process images and includes editing functions and thresholds (elongation and size) for selecting a particle population and analyzing it with reliable circular statistics algorithms. Moreover, the method also has to produce rose diagrams, orientation vectors, and a complete series of statistical parameters. All these requirements are met by our new software. In this paper, we briefly explain the methodology, from the collection of oriented samples in the field to the minimum number of particles needed to obtain reliable fabric data. We obtained the data using specific statistical tests, taking into account the degree of iso-orientation of the samples and the required degree of reliability.
The program has been verified by means of several simulations performed using appropriately designed features and by analyzing real samples.
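The core circular-statistics step behind shape-fabric analysis, a mean orientation and a degree of iso-orientation for axial (0-180°) data, can be sketched with the standard angle-doubling trick. This is a generic illustration of the technique, not FabricS code.

```python
import math

def mean_orientation(angles_deg):
    """Mean orientation (degrees, 0-180) and resultant length r for axial
    data. Angles are doubled so that, e.g., 10° and 190° count as the same
    orientation, averaged as unit vectors, then halved back."""
    n = len(angles_deg)
    c = sum(math.cos(math.radians(2 * a)) for a in angles_deg) / n
    s = sum(math.sin(math.radians(2 * a)) for a in angles_deg) / n
    r = math.hypot(c, s)                      # 0 = uniform, 1 = perfectly aligned
    mean = (math.degrees(math.atan2(s, c)) / 2) % 180
    return mean, r

# Strongly iso-oriented particle long axes (degrees from a reference line):
mean, r = mean_orientation([10, 12, 8, 11, 9])
```

The resultant length r, compared against a critical value from a uniformity test such as Rayleigh's, is what justifies reporting a flow direction at all, which is why the minimum particle count discussed in the paper matters.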
Elderly dendritic cells respond to LPS/IFN-γ and CD40L stimulation despite incomplete maturation
Musk, Arthur W.; Alvarez, John; Mamotte, Cyril D. S.; Jackaman, Connie; Nowak, Anna K.; Nelson, Delia J.
2018-01-01
There is evidence that dendritic cells (DCs) undergo age-related changes that modulate their function with their key role being priming antigen-specific effector T cells. This occurs once DCs develop into antigen-presenting cells in response to stimuli/danger signals. However, the effects of aging on DC responses to bacterial lipopolysaccharide (LPS), the pro-inflammatory cytokine interferon (IFN)-γ and CD40 ligand (CD40L) have not yet been systematically evaluated. We examined responses of blood myeloid (m)DC1s, mDC2s, plasmacytoid (p)DCs, and monocyte-derived DCs (MoDCs) from young (21–40 years) and elderly (60–84 years) healthy human volunteers to LPS/IFN-γ or CD40L stimulation. All elderly DC subsets demonstrated comparable up-regulation of co-stimulatory molecules (CD40, CD80 and/or CD86), intracellular pro-inflammatory cytokine levels (IFN-γ, tumour necrosis factor (TNF)-α, IL-6 and/or IL-12), and/or secreted cytokine levels (IFN-α, IFN-γ, TNF-α, and IL-12) to their younger counterparts. Furthermore, elderly-derived LPS/IFN-γ or CD40L-activated MoDCs induced similar or increased levels of CD8+ and CD4+ T cell proliferation, and similar T cell functional phenotypes, to their younger counterparts. However, elderly LPS/IFN-γ-activated MoDCs were unreliable in their ability to up-regulate chemokine (IL-8 and monocyte chemoattractant protein (MCP)-1) and IL-6 secretion, implying an inability to dependably induce an inflammatory response. A key age-related difference was that, unlike young-derived MoDCs that completely lost their ability to process antigen, elderly-derived MoDCs maintained their antigen processing ability after LPS/IFN-γ maturation, measured using the DQ-ovalbumin assay; this response implies incomplete maturation that may enable elderly DCs to continuously present antigen. These differences may impact on the efficacy of anti-pathogen and anti-tumour immune responses in the elderly. PMID:29652910
The early differentiation of Mars inferred from Hf–W chronometry
Kruijer, Thomas S.; Kleine, Thorsten; Borg, Lars E.; ...
2017-07-20
Mars probably accreted within the first 10 million years of Solar System formation and likely underwent magma ocean crystallization and crust formation soon thereafter. In this study, to assess the nature and timescales of these large-scale mantle differentiation processes, we applied the short-lived 182Hf–182W and 146Sm–142Nd chronometers to a comprehensive suite of martian meteorites, including several shergottites, augite basalt NWA 8159, orthopyroxenite ALH 84001 and polymict breccia NWA 7034. Compared to previous studies, the 182W data are significantly more precise and have been obtained for a more diverse suite of martian meteorites, ranging from samples from highly depleted to highly enriched mantle and crustal sources. Our results show that martian meteorites exhibit widespread 182W/184W variations that are broadly correlated with 142Nd/144Nd, implying that silicate differentiation (and not core formation) is the main cause of the observed 182W/184W differences. The combined 182W–142Nd systematics are best explained by magma ocean crystallization on Mars within ~20–25 million years after Solar System formation, followed by crust formation ~15 million years later. Finally, these ages are indistinguishable from the I–Pu–Xe age for the formation of Mars' atmosphere, indicating that the major differentiation of Mars into mantle, crust, and atmosphere occurred between 20 and 40 million years after Solar System formation and, hence, earlier than previously inferred based on Sm–Nd chronometry alone.
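The arithmetic behind a short-lived chronometer such as 182Hf–182W is ordinary exponential decay: with a half-life of roughly 8.9 Myr, the 182Hf/180Hf ratio recorded by a differentiation event dates that event relative to Solar System formation. The sketch below uses hypothetical ratios purely to show the relation; it is not the paper's multi-isotope inversion.

```python
import math

HALF_LIFE_HF182_MYR = 8.9                      # approximate 182Hf half-life
DECAY_CONST = math.log(2) / HALF_LIFE_HF182_MYR

def model_age_myr(initial_ratio, event_ratio):
    """Time after Solar System formation (Myr) at which 182Hf/180Hf had
    decayed from its initial value to the value recorded by an event."""
    return math.log(initial_ratio / event_ratio) / DECAY_CONST

# Hypothetical example: the ratio has fallen to one fifth of its start value.
age = model_age_myr(1.0e-4, 2.0e-5)            # ~20.7 Myr
```

An event at ~20 Myr sits right in the ~20–25 Myr magma-ocean window inferred in the abstract; after a few half-lives 182Hf is effectively extinct, which is what confines such chronometers to the first tens of millions of years.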
[Do you measure gait speed in your daily clinical practice? A review].
Inzitari, Marco; Calle, Alicia; Esteve, Anna; Casas, Álvaro; Torrents, Núria; Martínez, Nicolás
Gait speed (GS), measured at usual pace, is an easy, quick, reliable, inexpensive and informative measurement. With a standard chronometer, like those currently found in mobile phones, and two marks on the floor, trained health professionals obtain a more objective and quicker measurement than many geriatric scales used in daily practice. GS is one of the pillars of the frailty phenotype and is closely related to sarcopenia. It is a powerful marker of falls incidence, disability and death, most useful in the screening of older adults living in the community. In recent years, evidence has reinforced the usefulness of GS in acute care and post-surgical patients. Its use in patients with cognitive impairment is also suggested, given the strong link between cognitive and physical function. Although GS meets the criteria for a good geriatric screening tool, it is little used in clinical practice. Why? This review has several aims: (i) disentangling the relationship between GS and frailty; (ii) reviewing the protocols to measure GS and the reference values; (iii) reviewing the evidence in different clinical groups (older adults with frailty, cognitive impairment, cancer or other pathologies) and in different settings (community, acute care, rehabilitation); and (iv) speculating about the reasons for its poor uptake in clinical practice and the gaps to be filled.
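The measurement itself reduces to distance over time, which is part of the review's argument for its clinical simplicity. A minimal sketch, assuming a 4 m walking course and a ~0.8 m/s screening cutoff; both are common choices but vary by protocol and population, so treat them as assumptions:

```python
def gait_speed_ms(distance_m, time_s):
    """Usual-pace gait speed in m/s: two marks on the floor plus a chronometer."""
    return distance_m / time_s

def flags_slow(speed_ms, cutoff_ms=0.8):
    """True if gait speed falls below a screening cutoff associated with
    frailty and adverse outcomes (the cutoff value is an assumption here)."""
    return speed_ms < cutoff_ms

speed = gait_speed_ms(4.0, 6.5)   # 4 m walked in 6.5 s -> ~0.62 m/s
slow = flags_slow(speed)          # flagged for further geriatric assessment
```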
Measuring trends in age at first sex and age at marriage in Manicaland, Zimbabwe.
Cremin, I; Mushati, P; Hallett, T; Mupambireyi, Z; Nyamukapa, C; Garnett, G P; Gregson, S
2009-04-01
To identify reporting biases and to determine the influence of inconsistent reporting on observed trends in the timing of age at first sex and age at marriage, longitudinal data from three rounds of a population-based cohort in eastern Zimbabwe were analysed. Reports of age at first sex and age at marriage from 6837 individuals attending multiple rounds were classified according to consistency. Survival analysis was used to identify trends in the timing of first sex and marriage. In this population, women initiate sex and enter marriage at younger ages than men but spend much less time between first sex and marriage. Among those surveyed between 1998 and 2005, median ages at first sex and first marriage were 18.5 years and 21.4 years for men and 18.2 years and 18.5 years, respectively, for women aged 15-54 years. A high proportion of reports of both age at first sex and age at marriage among those attending multiple surveys were found to be unreliable. Excluding reports identified as unreliable from these analyses did not alter the observed trends in either age at first sex or age at marriage. Tracing birth cohorts as they aged revealed reporting biases, particularly among the youngest cohorts. Comparisons across birth cohorts, which span a period of >40 years, indicate that median age at first sex has remained constant over time for women but has declined gradually for men. Although many reports of age at first sex and age at marriage were found to be unreliable, including such reports did not artificially generate or suppress trends.
Tattini, Lorenzo; Olmi, Simona; Torcini, Alessandro
2012-06-01
In this article, we investigate the role of connectivity in promoting coherent activity in excitatory neural networks. In particular, we would like to understand whether the onset of collective oscillations can be related to a minimal average connectivity and how this critical connectivity depends on the number of neurons in the networks. For these purposes, we consider an excitatory random network of leaky integrate-and-fire pulse-coupled neurons. The neurons are connected as in a directed Erdős–Rényi graph with average connectivity
Clinical Diagnosis among Diverse Populations: A Multicultural Perspective.
ERIC Educational Resources Information Center
Solomon, Alison
1992-01-01
Discusses four ways in which clinical diagnosis can be detrimental to minority clients: (1) cultural expressions of symptomatology; (2) unreliable research instruments; (3) clinician bias; and (4) institutional racism. Recommendations to avoid misdiagnosis begin with accurate assessment of a client's history and cultural background. (SLD)
Preschoolers Mistrust Ignorant and Inaccurate Speakers
ERIC Educational Resources Information Center
Koenig, Melissa A.; Harris, Paul L.
2005-01-01
Being able to evaluate the accuracy of an informant is essential to communication. Three experiments explored preschoolers' (N=119) understanding that, in cases of conflict, information from reliable informants is preferable to information from unreliable informants. In Experiment 1, children were presented with previously accurate and inaccurate…
Diesel Powered School Buses: An Update.
ERIC Educational Resources Information Center
Gresham, Robert
1984-01-01
Because diesel engines are more economical and longer-lasting than gasoline engines, school districts are rapidly increasing their use of diesel buses. Dependence on diesel power, however, entails vulnerability to cost increases due to the unreliability of crude oil supplies and contributes to air pollution. (MCG)
Simple Experiments in Psychology.
ERIC Educational Resources Information Center
Ray, Wilbert S.
This material, developed for use in secondary schools, is a programmed-type learning package consisting of an "Instructor's Manual", a "Student's Introduction", and a "Laboratory Manual". The general goal of the program is to teach students to distinguish between reliable and unreliable information. The "Laboratory Manual" contains nine simple…
Effect of Training on Reasoning in Moral Choice.
ERIC Educational Resources Information Center
Kaplan, Martin F.
Moral development is viewed as a matter of progression in the cognitive reasoning and rationale underlying choices and judgments. Traditionally, retrospective reports of rationales have been used to measure moral development levels, resulting in unreliable information. Information Integration Theory attempts to assess individual differences in…
USDA-ARS?s Scientific Manuscript database
Sparganothis sulfureana Clemens, is a severe insect pest of cranberries in the Midwest and Northeast. Timing for insecticide applications has relied primarily on calendar dates and pheromone trap-catch. However, abiotic conditions can vary greatly, rendering such methods unreliable indicators of opt...
Disordered Eating among Female Adolescents: Prevalence, Risk Factors, and Consequences
ERIC Educational Resources Information Center
Bryla, Karen Y.
2003-01-01
Disordered eating among American adolescent females represents a significant health issue in our current cultural climate. Disordered eating receives insufficient attention, however, due to the public's unfamiliarity with symptoms and consequences, absence of treatment options, and unreliable instrumentation to detect disordered eating. Disordered…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-19
... is overfished. However, the SSC rejected as unreliable the absolute values that resulted in the... establish a stock ACL of zero, would result in the largest profit reductions to both the commercial sector...
IMPROVING WILLINGNESS-TO-ACCEPT RESPONSES USING ALTERNATE FORMS OF COMPENSATION
The purpose of this project is to design a pilot survey to investigate why surveys that ask willingness-to-accept compensation questions so often yield unreliable data and whether respondents would find alternate modes of compensation (specifically, public goods) more acceptab...
Murdoch, Maureen; Pryor, John B; Griffin, Joan M; Ripley, Diane Cowper; Gackstetter, Gary D; Polusny, Melissa A; Hodges, James S
2011-01-01
The Department of Defense's "gold standard" sexual harassment measure, the Sexual Harassment Core Measure (SHCore), is based on an earlier measure that was developed primarily in college women. Furthermore, the SHCore requires a reading grade level of 9.1. This may be higher than some troops' reading abilities and could generate unreliable estimates of their sexual harassment experiences. Results from 108 male and 96 female soldiers showed that the SHCore's temporal stability and alternate-forms reliability were significantly worse (a) in soldiers without college experience compared to soldiers with college experience and (b) in men compared to women. For men without college experience, almost 80% of the temporal variance in SHCore scores was attributable to error. A plain-language version of the SHCore had mixed effects on temporal stability depending on education and gender. The SHCore may be particularly ill suited for evaluating population trends of sexual harassment in military men without college experience.
A pragmatic decision model for inventory management with heterogeneous suppliers
NASA Astrophysics Data System (ADS)
Nakandala, Dilupa; Lau, Henry; Zhang, Jingjing; Gunasekaran, Angappa
2018-05-01
For enterprises, it is imperative that the trade-off between the cost of inventory and risk implications is managed in the most efficient manner. To explore this, we use the common example of a wholesaler operating in an environment where suppliers demonstrate heterogeneous reliability. The wholesaler has partial orders with dual suppliers and uses lateral transshipments. While supplier reliability is a key concern in inventory management, reliable suppliers are more expensive, and investment in strategic approaches that improve supplier performance carries a high cost. Here we consider the operational strategy of dual sourcing with reliable and unreliable suppliers and model the total inventory cost for the likely scenario in which the lead-time of the unreliable suppliers extends beyond the scheduling period. We then develop a Customized Integer Programming Optimization Model to determine the optimum size of partial orders with multiple suppliers. In addition to the objective of total cost optimization, this study takes into account the volatility of the cost associated with the uncertainty of an inventory system.
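The trade-off this abstract describes can be made concrete with a deliberately simplified sketch. The cost model, the convex shortage penalty, and all numbers below are illustrative assumptions, not the paper's Customized Integer Programming Optimization Model; the sketch only shows how an optimal partial order emerges when the unreliable supplier's lateness carries a convex cost.

```python
def expected_cost(q_rel, q_unrel, demand, price_rel, price_unrel,
                  p_late, penalty):
    """Toy expected cost of splitting an order between a reliable and a
    cheaper, unreliable supplier. With probability p_late the unreliable
    portion arrives after the scheduling period; the shortage penalty is
    assumed convex in the uncovered quantity (larger gaps hurt
    disproportionately, e.g. through emergency lateral transshipments)."""
    purchase = q_rel * price_rel + q_unrel * price_unrel
    shortage = p_late * penalty * (demand - q_rel) ** 2 / demand
    return purchase + shortage

def best_split(demand, price_rel, price_unrel, p_late, penalty):
    """Enumerate all integer splits q_rel + q_unrel = demand and return
    the cheapest one (brute force stands in for the integer program)."""
    return min(((q, demand - q) for q in range(demand + 1)),
               key=lambda s: expected_cost(*s, demand, price_rel,
                                           price_unrel, p_late, penalty))

# reliable supplier at 10/unit, unreliable at 7/unit, 20% chance of lateness
print(best_split(100, 10, 7, 0.2, 25))  # a partial order: some of each supplier
```

With a purely linear shortage cost the optimum collapses to a single supplier; the convexity is what makes a split order optimal, which is the situation the paper's richer model addresses.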
Humans treat unreliable filled-in percepts as more real than veridical ones
Ehinger, Benedikt V; Häusser, Katja; Ossandón, José P; König, Peter
2017-01-01
Humans often evaluate sensory signals according to their reliability for optimal decision-making. However, how do we evaluate percepts generated in the absence of direct input that are, therefore, completely unreliable? Here, we utilize the phenomenon of filling-in occurring at the physiological blind-spots to compare partially inferred and veridical percepts. Subjects chose between stimuli that elicit filling-in, and perceptually equivalent ones presented outside the blind-spots, looking for a Gabor stimulus without a small orthogonal inset. In ambiguous conditions, when the stimuli were physically identical and the inset was absent in both, subjects behaved opposite to optimal, preferring the blind-spot stimulus as the better example of a collinear stimulus, even though no relevant veridical information was available. Thus, a percept that is partially inferred is paradoxically considered more reliable than a percept based on external input. In other words: Humans treat filled-in inferred percepts as more real than veridical ones. DOI: http://dx.doi.org/10.7554/eLife.21761.001 PMID:28506359
Foetal Alcohol Spectrum Disorders: A consideration of sentencing and unreliable confessions.
Douglas, Heather
2015-12-01
While Foetal Alcohol Spectrum Disorders (FASDs) are now a strong focus of policy-makers throughout Australia, they have received strikingly little consideration in Australian criminal courts. Many people who have an FASD are highly suggestible, have difficulty linking their actions to consequences, controlling impulses and remembering things, and thus FASD raises particular issues for appropriate sentencing and the admissibility of evidence. This article considers the approach of Australian criminal courts to FASD. It reviews the recent case of AH v Western Australia which exemplifies the difficulties associated with appropriate sentencing in cases where the accused is likely to have an FASD. The article also considers the implications for Australian courts of the New Zealand case of Pora v The Queen, recently heard by the Privy Council. In this case, the Privy Council accepted expert evidence that people with FASD may confabulate evidence, potentially making their testimony unreliable. The article concludes with an overview of developments in criminal policy and legal response in relation to FASD in the United States, Canada and Australia.
Multi stage unreliable retrial Queueing system with Bernoulli vacation
NASA Astrophysics Data System (ADS)
Radha, J.; Indhira, K.; Chandrasekaran, V. M.
2017-11-01
In this work we considered Bernoulli vacations in group-arrival retrial queues with an unreliable server. Here, the server provides service in k stages. If an arriving group of units finds the server free, one unit from the group enters the first stage of service and the rest join the orbit. After completion of the ith (i = 1,2,…,k) stage of service, the customer may proceed to the (i+1)th stage with probability θi, or leave the system with probability qi = 1 − θi for i = 1,2,…,k − 1, with qk = 1. After finishing a service, the server may take a vacation (whether the orbit is empty or not) with probability v, or continue serving with probability 1 − v. After finishing the vacation, the server searches for a customer in the orbit with probability θ or remains idle until a new arrival with probability 1 − θ. We analyzed the system using the method of supplementary variables.
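The stage-progression rule in this abstract is easy to check numerically. Below is a minimal Monte-Carlo sketch of just that part of the model (the retrial orbit, vacations and orbit search are omitted); the continuation probabilities θ1 = 0.6, θ2 = 0.5 for a k = 3 stage server are illustrative assumptions, not values from the paper.

```python
import random

def simulate_service(thetas, trials=100_000, seed=1):
    """Monte-Carlo estimate of the mean number of service stages a
    customer completes, given continuation probabilities theta_1..theta_{k-1}
    (after the final stage k everyone leaves, i.e. q_k = 1)."""
    random.seed(seed)
    total = 0
    for _ in range(trials):
        stage = 1
        # after stage i the customer continues with probability theta_i
        while stage < len(thetas) + 1 and random.random() < thetas[stage - 1]:
            stage += 1
        total += stage
    return total / trials

thetas = [0.6, 0.5]              # theta_1, theta_2 for a k = 3 stage server
analytic = 1 + 0.6 + 0.6 * 0.5   # E[stages] = 1 + theta1 + theta1*theta2 = 1.9
print(simulate_service(thetas), analytic)  # simulated mean ≈ analytic value
```

The expected number of stages is the sum of survival probabilities across stages, which the simulation reproduces; the paper's supplementary-variable analysis handles the full queueing dynamics around this service mechanism.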
Dipstick measurements of urine specific gravity are unreliable
Roessingh, A; Drukker, A; Guignard, J
2001-01-01
AIM—To evaluate the reliability of dipstick measurements of urine specific gravity (U-SG). METHODS—Fresh urine specimens were tested for urine pH and osmolality (U-pH, U-Osm) by a pH meter and an osmometer, and for U-SG by three different methods (refractometry, automatic readout of a dipstick (Clinitek-50), and (visual) change of colour of the dipstick). RESULTS—The correlations between the visual U-SG dipstick measurements and U-SG determined by a refractometer and the comparison of Clinitek®-50 dipstick U-SG measurements with U-Osm were less than optimal, showing very wide scatter of values. Only the U-SG refractometer values and U-Osm had a good linear correlation. The tested dipstick was unreliable for the bedside determination of U-SG, even after correction for U-pH, as recommended by the manufacturer. CONCLUSIONS—Among the bedside determinations, only refractometry gives reliable U-SG results. Dipstick U-SG measurements should be abandoned. PMID:11466191
The (un)reliability of item-level semantic priming effects.
Heyman, Tom; Bruninx, Anke; Hutchison, Keith A; Storms, Gert
2018-04-05
Many researchers have tried to predict semantic priming effects using a myriad of variables (e.g., prime-target associative strength or co-occurrence frequency). The idea is that relatedness varies across prime-target pairs, which should be reflected in the size of the priming effect (e.g., cat should prime dog more than animal does). However, it is only insightful to predict item-level priming effects if they can be measured reliably. Thus, in the present study we examined the split-half and test-retest reliabilities of item-level priming effects under conditions that should discourage the use of strategies. The resulting priming effects proved extremely unreliable, and reanalyses of three published priming datasets revealed similar cases of low reliability. These results imply that previous attempts to predict semantic priming were unlikely to be successful. However, one study with an unusually large sample size yielded more favorable reliability estimates, suggesting that big data, in terms of items and participants, should be the future for semantic priming research.
Multifuel industrial steam generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mesko, J.E.
An inefficient, unreliable steam generation and distribution system at the Red River Army Depot (Texarkana, Tex.), a major industrial facility of the federal government, was replaced with a modern, multifuel-burning steam plant. In the new plant, steam is generated by three high-pressure field-erected boilers burning 100 percent coal, 100 percent refuse, or any combination of the two, while maintaining particulate emissions, SO2 concentration, and NOx and chlorine levels at or better than clean air standards. The plant, which has been in operation since 1986, is now part of the Army's Energy/Environment Showcase for demonstrating innovative technology to public and private operators. When the project began, the Red River depot faced several operational problems. Existing No. 2 oil- and gas-fired boilers in three separate boiler plants were inefficient, unreliable, and difficult to maintain. Extra boilers often had to be leased to provide for needed capacity. In addition, the facility had large quantities of waste to dispose of.
Kothari, Darshan; Ketwaroo, Gyanprakash; Sawhney, Mandeep S; Freedman, Steven D; Sheth, Sunil G
2017-07-01
We aimed to determine the feasibility and accuracy of combined endoscopic ultrasonography (EUS) with shortened pancreatic function testing (sEUS) for structural and functional assessment using a single instrument in patients with suspected chronic pancreatitis (CP). We completed a prospective crossover study, enrolling patients with suspected CP. Patients who underwent both the traditional 1-hour secretin pancreatic function test (sPFT) and sEUS were included in the analysis. We compared study results for test concordance and for correlation of peak bicarbonate concentrations. Eleven (64.7%) of 17 patients had concordant sPFT and sEUS findings when the cutoff for peak bicarbonate was 80 mEq/L. Six patients had discordant findings with a negative sPFT and positive sEUS. This poor concordance suggests that sEUS is an unreliable functional test. Lowering the sEUS cutoff to 70 mEq/L resulted in improved concordance (64.7% vs 70.6%). Finally, there was no significant correlation between peak bicarbonate concentrations (r = 0.47; 95% confidence interval, -0.02 to 0.79) in these 2 functional tests. We demonstrate poor concordance between sPFT and sEUS, suggesting that a combined shortened functional and structural test using a single instrument may not be a feasible test for diagnosis of suspected CP when a cutoff of 80 mEq/L is used.
NASA Astrophysics Data System (ADS)
Kasamatsu, Hiroki; Jahja, Mohamad; Sakakibara, Masayuki
2017-06-01
As a solution to economic poverty, we investigated the possibility of developing pearl farming in the Kabupaten Gorontalo Utara region of Indonesia. The approximate income of farmers is 15 - 28 million rupiah/year, and that of fishers is 51 million rupiah/year. As these incomes are low and unreliable, it is important that food processing and other industries are developed and that agriculture and fisheries are promoted. Pearl farming minimizes environmental load, and the coastal environmental conditions are good for pearl shell growth. From 2001 till 2013, pearl farming was undertaken by a Japanese company in Kabupaten Gorontalo Utara, but the operation ceased because of conflict with local people and issues with the actions of its manager. To establish successful pearl farming businesses, it is more important to address social issues than environmental factors.
Ranking Reputation and Quality in Online Rating Systems
Liao, Hao; Zeng, An; Xiao, Rui; Ren, Zhuo-Ming; Chen, Duan-Bing; Zhang, Yi-Cheng
2014-01-01
How to design an accurate and robust ranking algorithm is a fundamental problem with wide applications in many real systems. It is especially significant in online rating systems due to the existence of some spammers. In the literature, many well-performed iterative ranking methods have been proposed. These methods can effectively recognize the unreliable users and reduce their weight in judging the quality of objects, and finally lead to a more accurate evaluation of the online products. In this paper, we design an iterative ranking method with high performance in both accuracy and robustness. More specifically, a reputation redistribution process is introduced to enhance the influence of highly reputed users and two penalty factors enable the algorithm resistance to malicious behaviors. Validation of our method is performed in both artificial and real user-object bipartite networks. PMID:24819119
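The kind of iterative scheme this abstract describes can be sketched in a few lines. The code below is a simplified illustration in the spirit of reputation-based iterative ranking, not the paper's exact algorithm (the reputation-redistribution process and the two penalty factors are replaced by a plain inverse-error weighting), and the rating matrix is invented for the demo.

```python
import numpy as np

def iterative_rank(R, iters=50, eps=1e-2):
    """Iteratively estimate object quality and user reputation from a
    user-by-object rating matrix R (np.nan = no rating). Quality is the
    reputation-weighted mean of ratings; a user's reputation is the
    inverse of their mean squared deviation from the current qualities,
    so unreliable raters (e.g. spammers) lose weight in the next round."""
    mask = ~np.isnan(R)
    rep = np.ones(R.shape[0])           # start with uniform reputation
    for _ in range(iters):
        w = mask * rep[:, None]
        quality = (np.where(mask, R, 0) * w).sum(axis=0) / w.sum(axis=0)
        sq_err = np.where(mask, (R - quality) ** 2, np.nan)
        rep = 1.0 / (np.nanmean(sq_err, axis=1) + eps)
    return quality, rep

# three consistent raters around true qualities [5, 1]; one spammer rates all 3
R = np.array([[5.0, 1.0], [4.9, 1.1], [5.1, 0.9], [3.0, 3.0]])
quality, rep = iterative_rank(R)
print(quality.round(2), rep.round(2))  # quality near [5, 1]; spammer weight collapses
```

The design point is the feedback loop: each pass re-weights users by how well their past ratings match the consensus, which is what makes such rankings robust to malicious raters.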
Convective Radio Occultations Final Campaign Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biondi, R.
2016-03-01
Deep convective systems are destructive weather phenomena that annually cause many deaths and injuries as well as much damage, thereby accounting for major economic losses in several countries. The number and intensity of such phenomena have increased over the last decades in some areas of the globe. Damage is mostly caused by strong winds and heavy rain, parameters that are strongly connected to the structure of the particular storm. Convection over land is usually stronger and deeper than over the ocean, and some convective systems, known as supercells, also develop tornadoes through processes that remain mostly unclear. The intensity forecast and monitoring of convective systems is one of the major challenges for meteorology because in situ measurements during extreme events are too sparse or unreliable and most ongoing satellite missions do not provide suitable time/space coverage.
The multi-queue model applied to random access protocol
NASA Astrophysics Data System (ADS)
Fan, Xinlong
2013-03-01
Combining engineered cell-sensors with multi-agent systems to realize smart environment
NASA Astrophysics Data System (ADS)
Chen, Mei
2013-03-01
The connection of everything in a sensory and an intelligent way is a pursuit in smart environment. This paper introduces the engineered cell-sensors into the multi-agent systems to realize the smart environment. The seamless interface with the natural environment and strong information-processing ability of cell with the achievements of synthetic biology make the construction of engineered cell-sensors possible. However, the engineered cell-sensors are only simple-functional and unreliable computational entities. Therefore how to combine engineered cell-sensors with digital device is a key problem in order to realize the smart environment. We give the abstract structure and interaction modes of the engineered cell-sensors in order to introduce engineered cell-sensors into multi-agent systems. We believe that the introduction of engineered cell-sensors will push forward the development of the smart environment.
NASA Astrophysics Data System (ADS)
Gyftakis, Konstantinos N.; Marques Cardoso, Antonio J.; Antonino-Daviu, Jose A.
2017-09-01
The Park's Vector Approach (PVA), together with its variations, has been one of the most widespread diagnostic methods for electrical machines and drives. Regarding broken rotor bar fault diagnosis in induction motors, the common practice is to rely on the width increase of the Park's Vector (PV) ring and then apply more sophisticated signal processing methods. It is shown in this paper that this method can be unreliable and is strongly dependent on the magnetic pole and rotor slot numbers. To overcome this constraint, the novel Filtered Park's/Extended Park's Vector Approach (FPVA/FEPVA) is introduced. The investigation is carried out with FEM simulations and experimental testing. The simulation and experimental results coincide satisfyingly, and the proposed FPVA method proves desirably reliable.
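For reference, the basic (unfiltered) Park's Vector computation can be sketched as follows. This is the textbook transform that the paper takes as its starting point, not the proposed FPVA/FEPVA, and the synthetic balanced 50 Hz currents are an illustrative stand-in for measured stator currents.

```python
import numpy as np

def park_vector(ia, ib, ic):
    """Park's Vector components of three-phase stator currents. A healthy,
    balanced machine traces a circle of constant radius in the (i_d, i_q)
    plane; faults thicken this ring, which is what PVA-based diagnosis
    inspects."""
    i_d = np.sqrt(2 / 3) * ia - np.sqrt(1 / 6) * ib - np.sqrt(1 / 6) * ic
    i_q = np.sqrt(1 / 2) * (ib - ic)
    return i_d, i_q

# synthetic balanced three-phase currents at 50 Hz
t = np.linspace(0, 0.1, 5000)
w = 2 * np.pi * 50
ia = np.cos(w * t)
ib = np.cos(w * t - 2 * np.pi / 3)
ic = np.cos(w * t + 2 * np.pi / 3)

i_d, i_q = park_vector(ia, ib, ic)
radius = np.hypot(i_d, i_q)
print(radius.min(), radius.max())  # constant radius sqrt(3/2) for a healthy machine
```

A broken-bar fault modulates the phase currents and makes this radius oscillate; the paper's point is that judging the ring width alone can mislead, depending on pole and slot numbers.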
Preschoolers show less trust in physically disabled or obese informants
Ma, Lili
2015-01-01
This research examined whether preschool-aged children show less trust in physically disabled or obese informants. In Study 1, when learning about novel physical activities and facts, 4- and 5-year-olds preferred to endorse the testimony of a physically abled, non-obese informant rather than a physically disabled or obese one. In Study 2, after seeing that the physically disabled or obese informant was previously reliable whereas the physically abled, non-obese one was unreliable, 4- and 5-year-olds did not show a significant preference for either informant. We conclude that in line with the literature on children’s negative stereotypes of physically disabled or obese others, preschoolers are biased against these individuals as potential sources of new knowledge. This bias is robust in that past reliability might undermine its effect on children, but cannot reverse it. PMID:25610413
Marchesi, Matteo; Battistini, Alessio; Pellegrinelli, Moira; Gentile, Guendalina; Zoja, Riccardo
2016-01-01
Fatal air embolism related to endoscopic retrograde cholangiopancreatography is a very rare phenomenon. The authors describe the case of a 51-year-old female patient who developed this mortal complication; a computed tomography (CT) examination was performed in articulo mortis by the physicians. Autopsy was unreliable because of bizarre post-mortem changes (reabsorption of intra-cardiac gas vs. conservation of intra-cranial gas) and the lack of strong diagnostic value of the histological findings. The right diagnosis was possible only thanks to the CT examination, which allowed this cause of death to be suspected before the autopsy and the necessary procedures for recognising and probing the air embolism to be prepared. This case exemplifies how early post-mortem imaging can be crucial to avoid a wrong diagnosis. © The Author(s) 2015.
Mechanisms for Robust Cognition
ERIC Educational Resources Information Center
Walsh, Matthew M.; Gluck, Kevin A.
2015-01-01
To function well in an unpredictable environment using unreliable components, a system must have a high degree of robustness. Robustness is fundamental to biological systems and is an objective in the design of engineered systems such as airplane engines and buildings. Cognitive systems, like biological and engineered systems, exist within…
Excess biomass accumulation and activity loss in vapor-phase bioreactors (VPBs) can lead to unreliable long-term operation. In this study, temporal and spatial variations in biomass accumulation, distribution and activity in VPBs treating toluene-contaminated air were monitored o...
POSTERIOR PREDICTIVE MODEL CHECKS FOR DISEASE MAPPING MODELS. (R827257)
Disease incidence or disease mortality rates for small areas are often displayed on maps. Maps of raw rates, disease counts divided by the total population at risk, have been criticized as unreliable due to non-constant variance associated with heterogeneity in base population si...
Comparison of Maxilla Mandibular Transverse Ratios With Class II Anteroposterior Discrepancies
2014-03-20
the structure points has been shown to be at best unreliable (Jacobson 1995). "2D landmarks may be hindered by rotational, geometric, and head positioning...deficiency in Class II and Class III malocclusions: a cephalometric and morphometric study on postero-anterior films. Orthodontics & Craniofacial
Divorce: An Unreliable Predictor of Children's Emotional Predispositions.
ERIC Educational Resources Information Center
Bernard, Janine M.; Nesbitt, Sally
1981-01-01
Used the Children's Emotion Projection Instrument to investigate the emotional predispositions of children from divorce or disruption and children from intact families. Results indicated that children of divorce or disruption are not more hampered emotionally than children from intact families. Discusses implications for family therapists.…
How do dentists and their teams incorporate evidence about preventive care? An empirical study.
Sbaraini, Alexandra; Carter, Stacy Marie; Evans, Robin Wendell; Blinkhorn, Anthony
2013-10-01
To identify how dentists and their teams adopt evidence-based preventive care. A qualitative study using grounded theory methodology was conducted. We interviewed 23 participants working in eight dental practices about their experience and work processes, while adopting evidence-based preventive care. During the study, Charmaz's grounded theory methodology was employed to examine the social process of adopting preventive dental care in dental practices. Charmaz's iteration of the constant comparative method was used during the data analysis. This involved coding of interview transcripts, detailed memo-writing and drawing diagrams. The transcripts were analyzed as soon as possible after each round of interviews in each dental practice. Coding was conducted primarily by AS, supported by team meetings and discussions when researchers compared their interpretations. Participants engaged in a slow process of adapting evidence-based protocols and guidelines to the existing logistics of the practices. This process was influenced by practical, philosophical, and historical aspects of dental care, and a range of barriers and facilitators. In particular, dentists spoke spontaneously about two deeply held 'rules' underpinning continued restorative treatment, which acted as barriers to providing preventive care: (i) dentists believed that some patients were too 'unreliable' to benefit from prevention; and (ii) dentists believed that patients thought that only tangible restorative treatment offered 'value for money'. During the adaptation process, some dentists and teams transitioned from their initial state - selling restorative care - through an intermediary stage - learning by doing and educating patients about the importance of preventive care - and finally to a stage where they were offering patients more than just restorative care.
Resources were needed for the adaptation process to occur, including: the ability to maintain the financial viability of the practice, appropriate technology, time, and supportive dental team relationships. The findings from this study show that with considerable effort, motivation and coordination, it is possible for dental practices to work against the dental 'mainstream' and implement prevention as their clinical norm. This study has shown that dental practice is not purely scientific, but it includes cultural, social, and economic resources that interfere with the provision of preventive care. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Ka Hana `Imi Na`auao: A Science Curriculum Project
NASA Astrophysics Data System (ADS)
Napeahi, K.; Roberts, K. D.; Galloway, L. M.; Stodden, R. A.; Akuna, J.; Bruno, B.
2005-12-01
In antiquity, the first people to step foot on what are now known as the Hawaiian islands skillfully traversed the Pacific Ocean using celestial navigation and learned observations of scientific phenomena. Long before the Western world ventured beyond the horizon, Hawaiians had invented the chronometer, built aqueduct systems (awai) that continue to amaze modern engineers, and had preventive health systems as well as a comprehensive knowledge of medicinal plants (including antivirals) which only now are working their way through trials for use in modern pharmacopia. Yet, today, Native Hawaiians are severely underrepresented in science-related fields, reflecting (in part) a failure of the Western educational system to nurture the potential of these resourceful students, particularly the many "at-risk" students who are presently over-represented in special education. A curriculum which draws from and incorporates traditional Hawaiian values and knowledge is needed to reinforce links to the inquiry process which nurtured creative thinking during the renaissance of Polynesian history. The primary goal of the Ka Hana `Imi Na`auao Project (translation: 'science' or 'work in which you seek enlightenment, knowledge or wisdom') is to increase the number of Native Hawaiian adults in science-related postsecondary education and employment fields. Working closely with Native Hawaiian cultural experts and our high school partners, we will develop and implement a culturally responsive 11th and 12th grade high school science curriculum, infused with math, literacy and technology readiness skills. Software and assistive technology will be used to adapt instruction to individual learners' reading levels, specific disabilities and learning styles. To ease the transition from secondary to post-secondary education, selected grade 12 students will participate in planned project activities that link high school experiences with college science-related programs of study.
Ka Hana `Imi Na`auao is funded through a grant awarded to the University of Hawaii Center on Disability Studies (R.A. Stodden, PI) from the U.S. Department of Education, Native Hawaiian Education Act. Project information and curricula are available at http://www.scihi.hawaii.edu/.
Measuring the speed of light with ultra-compact radio quasars
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cao, Shuo; Biesiada, Marek; Jackson, John
In this paper, based on a 2.29 GHz VLBI all-sky survey of 613 milliarcsecond ultra-compact radio sources with 0.0035 < z < 3.787, we describe a method of identifying the sub-sample which can serve as individual standard rulers in cosmology. If the linear size of the compact structure is assumed to depend on source luminosity and redshift as l_m = l·L^β·(1+z)^n, only intermediate-luminosity quasars (10^27 W/Hz < L < 10^28 W/Hz) show negligible dependence (|n| ≅ 10^−3, |β| ≅ 10^−4), and thus represent a population of such rulers with fixed characteristic length l = 11.42 pc. With a sample of 120 such sources covering the redshift range 0.46 < z < 2.8, we confirm the existence of dark energy in the Universe with high significance under the assumption of a flat universe, and obtain stringent constraints on both the matter density Ω_m = 0.323 (+0.245/−0.145) and the Hubble constant H_0 = 66.30 (+7.00/−8.50) km s^−1 Mpc^−1. Finally, with the angular diameter distances D_A measured for quasars extending to high redshifts (z ∼ 3), we reconstruct the D_A(z) function using the technique of Gaussian processes. This allows us to identify the redshift corresponding to the maximum of the D_A(z) function, z_m = 1.7, and the corresponding angular diameter distance D_A(z_m) = 1719.01 ± 43.46 Mpc. Similar reconstruction of the expansion rate function H(z) based on the data from cosmic chronometers and BAO gives us H(z_m) = 176.77 ± 6.11 km s^−1 Mpc^−1. These measurements are used to estimate the speed of light: c = 3.039 (±0.180) × 10^5 km/s. This is the first measurement of the speed of light in a cosmological setting referring to the distant past.
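The headline numbers in this abstract are internally consistent: in a flat FLRW model the angular-diameter distance peaks where dD_A/dz = 0, which implies c = D_A(z_m)·H(z_m). A quick check with the quoted central values (uncertainties ignored):

```python
# Central values quoted in the abstract
D_A_zm = 1719.01        # angular diameter distance at z_m = 1.7, in Mpc
H_zm = 176.77           # expansion rate at z_m, in km s^-1 Mpc^-1

# At the maximum of D_A(z), dD_A/dz = 0 gives c = D_A(z_m) * H(z_m)
c_est = D_A_zm * H_zm   # km/s, since the Mpc units cancel
print(f"c ≈ {c_est:.3e} km/s")  # ≈ 3.039e+05 km/s, matching the quoted result
```

This is why the reconstruction of both D_A(z) and H(z) at the same redshift is the crux of the method: the product of the two at the peak yields the speed of light directly.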
Window and Overlap Processing Effects on Power Estimates from Spectra
NASA Astrophysics Data System (ADS)
Trethewey, M. W.
2000-03-01
Fast Fourier transform (FFT) spectral processing is based on the assumption of stationary ergodic data. In engineering practice, the assumption is often violated and non-stationary data processed. Data windows are commonly used to reduce leakage by decreasing the signal amplitudes near the boundaries of the discrete samples. With certain combinations of non-stationary signals and windows, the temporal weighting may attenuate important signal characteristics to adversely affect any subsequent processing. In other words, the window artificially reduces a significant section of the time signal. Consequently, spectra and overall power estimated from the affected samples are unreliable. FFT processing can be particularly problematic when the signal consists of randomly occurring transients superimposed on a more continuous signal. Overlap processing is commonly used in this situation to improve the estimates. However, the results again depend on the temporal character of the signal in relation to the window weighting. A worst-case scenario, a short-duration half sine pulse, is used to illustrate the relationship between overlap percentage and resulting power estimates. The power estimates are shown to depend on the temporal behaviour of the square of overlapped window segments. An analysis shows that power estimates may be obtained to within 0.27 dB for the following windows and overlap combinations: rectangular (0% overlap), Hanning (62.5% overlap), Hamming (60.35% overlap) and flat-top (82.25% overlap).
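The dependence on the squared, overlapped window profile can be demonstrated directly. The sketch below (illustrative, not the paper's analysis) tiles squared Hanning windows across a record at 0% and at the suggested 62.5% overlap and measures the flatness of the resulting coverage; a transient landing in a coverage notch is under-weighted in the power estimate.

```python
import numpy as np

def squared_window_coverage(window, hop, total):
    """Sum of squared copies of `window`, advanced by `hop` samples,
    across a record of `total` samples. A flat profile means a short
    transient contributes the same power wherever it happens to land."""
    w2 = window ** 2
    cov = np.zeros(total)
    for start in range(0, total - len(window) + 1, hop):
        cov[start:start + len(window)] += w2
    return cov

N = 256
hann = np.hanning(N)
for overlap in (0.0, 0.625):              # 0% vs the 62.5% suggested for Hanning
    hop = int(N * (1 - overlap))
    cov = squared_window_coverage(hann, hop, 16 * N)
    mid = cov[N:-N]                       # ignore partial coverage at the edges
    print(f"overlap {overlap:.1%}: ripple {mid.std() / mid.mean():.3f}")
# deep notches at 0% overlap; near-flat coverage at 62.5%
```

At 0% overlap the Hanning window's zero-weighted segment boundaries can swallow a short pulse entirely, while at 62.5% overlap the squared-window coverage is nearly uniform, which is the mechanism behind the quoted 0.27 dB bound.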
Using real-time problem solving to eliminate central line infections.
Shannon, Richard P; Frndak, Diane; Grunden, Naida; Lloyd, Jon C; Herbert, Cheryl; Patel, Bhavin; Cummins, Daniel; Shannon, Alexander H; O'Neill, Paul H; Spear, Steven J
2006-09-01
An estimated 200,000 Americans suffer central line-associated bloodstream infections (CLABs) each year, with 15%-20% mortality. Two intensive care units (ICUs) redefined the processes of care through system redesign to deliver reliable outcomes free of the variations that created the breeding ground for infection. The ICUs, comprising 28 beds at Allegheny General Hospital, employed the principles of the Toyota Production System adapted to health care--Perfecting Patient Care--and applied them to central line placement and maintenance. Intensive observations, which revealed multiple variances from established practices, and root cause analyses of all CLABs empowered the workers to implement countermeasures designed to eliminate the defects in the processes of central line placement and maintenance. New processes were implemented within 90 days. Within a year CLABs decreased from 49 to 6 (10.5 to 1.2 infections/1,000 line-days), and mortalities from 19 to 1 (51% to 16%), despite an increase in the use of central lines and number of line-days. These results were sustained during a 34-month period. CLABs are not an inevitable product of complex ICU care but the result of highly variable and therefore unreliable care delivery that predisposes to infection.
2018-01-01
Direction of arrival (DOA) estimation is the basis for underwater target localization and tracking using towed line array sonar devices. A method of DOA estimation for underwater wideband weak targets based on coherent signal subspace (CSS) processing and compressed sensing (CS) theory is proposed. Under the CSS processing framework, wideband frequency focusing is accompanied by a two-sided correlation transformation, allowing the DOA of underwater wideband targets to be estimated based on the spatial sparsity of the targets and the compressed sensing reconstruction algorithm. Through analysis and processing of simulation data and marine trial data, it is shown that this method can accomplish the DOA estimation of underwater wideband weak targets. Results also show that this method can considerably improve the spatial spectrum of weak target signals, enhancing the ability to detect them. It can solve the problems of low directional resolution and unreliable weak-target detection in traditional beamforming technology. Compared with the conventional minimum variance distortionless response beamformers (MVDR), this method has many advantages, such as higher directional resolution, wider detection range, fewer required snapshots and more accurate detection for weak targets. PMID:29562642
[Osseointegration as a method of direct stabilization of amputation prostheses to the bone].
Rochmiński, Robert; Sibński, Marcin; Synder, Marek
2011-01-01
This article summarizes the important advantages and disadvantages of, and the process of treating, patients after lower limb amputation at the level of the femur with an osseointegrated prosthesis. In this treatment, a bone-integrating implant is placed in the femur, which allows a structural and functional connection between living tissue and the prosthesis. This solution allows the patient easy use and direct steering of the prosthesis, transfer of body weight to the floor, and sensation at the moment of contact between the prosthesis and the ground. An osseointegrated prosthesis in the femur avoids the traditional solutions and their socket-related problems, such as mobility difficulties, skin sores, rash, pain during weight bearing, temporary changes of stump volume, difficulty donning the prosthesis, and unreliability of the prosthesis being securely suspended. Osseointegration is possible even in cases when the quality of the skin and a short stump preclude the use of a traditional prosthetic socket. It is used after lower and upper limb amputations. This kind of prosthetic solution has some disadvantages and limitations: it is expensive and demanding, and can be used only in cooperative patients who take an active part in the process of implantation, rehabilitation and future use of the prosthesis.
Forensic Analysis of Cites-Protected Dalbergia Timber from the Americas
Edgard O. Espinoza; Michael C. Wiemann; Josefina Barajas-Morales; Gabriela D. Chavarria; Pamela J. McClure
2015-01-01
Species identification of logs, planks, and veneers is difficult because they lack the traditional descriptors such as leaves and flowers. An additional challenge is that many transnational shipments have unreliable geographic provenance. Therefore, frequently the lowest taxonomic determination is genus, which allows unscrupulous importers to evade the endangered...
Unreliable Retrial Queues in a Random Environment
2007-09-01
equivalent to the stochasticity of the matrix Ĝ. It is generally known from Perron–Frobenius theory that a given square matrix M is stochastic if and only if its maximum positive eigenvalue (i.e., its Perron eigenvalue) sp(M) is equal to unity. A simple analytical condition that guarantees the
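The spectral test quoted above can be sketched numerically. A hedged caveat: for a general nonnegative matrix, sp(M) = 1 is necessary but not by itself sufficient for stochasticity; the "if and only if" in the text relies on the structure of the matrices Ĝ arising in the retrial-queue model.

```python
import numpy as np

def is_stochastic_via_perron(M, tol=1e-9):
    """Test the spectral-radius condition: the Perron eigenvalue (the
    largest-magnitude eigenvalue of a nonnegative matrix) equals unity."""
    sp = max(abs(np.linalg.eigvals(M)))
    return abs(sp - 1.0) < tol

P = np.array([[0.9, 0.1],
              [0.4, 0.6]])     # a genuine transition matrix: sp(P) = 1
Q = np.array([[0.9, 0.3],      # rows do not sum to 1: sp(Q) = 1.2
              [0.4, 0.8]])
print(is_stochastic_via_perron(P))  # True
print(is_stochastic_via_perron(Q))  # False
```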
A Critique of Divorce Statistics and Their Interpretation.
ERIC Educational Resources Information Center
Crosby, John F.
1980-01-01
Increasingly, appeals to the divorce statistic are employed to substantiate claims that the family is in a state of breakdown and marriage is passe. This article contains a consideration of reasons why the divorce statistics are invalid and/or unreliable as indicators of the present state of marriage and family. (Author)
Recent Research on Children's Testimony about Experienced and Witnessed Events
ERIC Educational Resources Information Center
Pipe, M.E.; Lamb, M.E.; Orbach, Y.; Esplin, P.W.
2004-01-01
Research on memory development has increasingly moved out of the laboratory and into the real world. Whereas early researchers asked whether confusion and susceptibility to suggestion made children unreliable witnesses, contemporary researchers are addressing a much broader range of questions about children's memory, focusing not only…
ERIC Educational Resources Information Center
Graney, Christopher M.
2010-01-01
Is the phenomenon of magnification by a converging lens inconsistent and therefore unreliable? Can a lens magnify one part of an object but not another? Physics teachers and even students familiar with basic optics would answer "no," yet many answer "yes." Numerous telescope users believe that magnification is not a reliable phenomenon in that it…
Perceived Credibility and Eyewitness Testimony of Children with Intellectual Disabilities
ERIC Educational Resources Information Center
Henry, L.; Ridley, A.; Perry, J.; Crane, L.
2011-01-01
Background: Although children with intellectual disabilities (ID) often provide accurate witness testimony, jurors tend to perceive their witness statements to be inherently unreliable. Method: The current study explored the free recall transcripts of child witnesses with ID who had watched a video clip, relative to those of typically developing…
ERIC Educational Resources Information Center
Anthony, Michael A.; Caleb, Derry; Mitchell, Stanley G.
2012-01-01
When standards are absent, people soon notice. They care when products turn out to be of poor quality, are unreliable, or dangerous because of counterfeiting. By positioning their products in relation to a common standard, firms grow the total size of the market, and can focus their innovation efforts in areas where they have a comparative…
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Design. 27.601 Section 27.601 Aeronautics... STANDARDS: NORMAL CATEGORY ROTORCRAFT Design and Construction General § 27.601 Design. (a) The rotorcraft may have no design features or details that experience has shown to be hazardous or unreliable. (b...
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Design. 27.601 Section 27.601 Aeronautics... STANDARDS: NORMAL CATEGORY ROTORCRAFT Design and Construction General § 27.601 Design. (a) The rotorcraft may have no design features or details that experience has shown to be hazardous or unreliable. (b...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Design. 29.601 Section 29.601 Aeronautics... STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Design and Construction General § 29.601 Design. (a) The rotorcraft may have no design features or details that experience has shown to be hazardous or unreliable. (b...
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Design. 29.601 Section 29.601 Aeronautics... STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Design and Construction General § 29.601 Design. (a) The rotorcraft may have no design features or details that experience has shown to be hazardous or unreliable. (b...
The Role of Science in Behavioral Disorders.
ERIC Educational Resources Information Center
Kauffman, James M.
1999-01-01
A scientific, rule-governed approach to solving problems suggests the following assumptions: we need different rules for different purposes; rules are grounded in values; the origins and applications of rules are often misunderstood; personal experience and idea popularity are unreliable; and all truths are tentative. Each assumption is related to…
Inferential Procedures for Correlation Coefficients Corrected for Attenuation.
ERIC Educational Resources Information Center
Hakstian, A. Ralph; And Others
1988-01-01
A model and computation procedure based on classical test score theory are presented for determination of a correlation coefficient corrected for attenuation due to unreliability. Delta and Monte Carlo method applications are discussed. A power analysis revealed no serious loss in efficiency resulting from correction for attenuation. (TJH)
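The classical correction the model builds on is Spearman's disattenuation formula, r_true = r_xy / sqrt(r_xx · r_yy): the observed correlation is divided by the geometric mean of the two measures' reliabilities. A minimal sketch with illustrative values:

```python
import math

def correct_for_attenuation(r_xy, rel_x, rel_y):
    """Classical disattenuation: divide the observed correlation by the
    geometric mean of the two reliabilities."""
    return r_xy / math.sqrt(rel_x * rel_y)

# Observed r = .42 between measures with reliabilities .80 and .70
r_true = correct_for_attenuation(0.42, 0.80, 0.70)
print(round(r_true, 3))  # 0.561
```

Note that the corrected coefficient inherits the sampling error of all three inputs, which is why inferential procedures like those in the article are needed.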
Library Buildings 2009: The Constant Library
ERIC Educational Resources Information Center
Fox, Bette-Lee
2009-01-01
Can it be only two years, as Alan Jay Lerner once wrote, "since the whole [economic] rigmarole began"? Yet libraries have weathered to varying degrees the unreliability of funding, especially with regard to programming, materials, and hours. Money earmarked years ago is seeing construction through to conclusion; state support has helped out in…
Future Development of Instructional Television.
ERIC Educational Resources Information Center
Barnett, H. J.; Denzau, A. T.
Instructional television (ITV) has been little used in the nation's schools because ITV hardware and software have been unreliable and expensive and teachers have yet to learn to use ITV. The perfection of inexpensive videotape recorders/players (VTR) and inexpensive tapes and cameras could remedy the problem. A package consisting of 10 mobile…
COMPARISON OF GESTATIONAL AGE AT DELIVERY BASED ON LAST MENSTRUAL PERIOD AND EARLY ULTRASOUND
Reported date of last menstrual period (LMP) is commonly used to estimate gestational age but may be unreliable if recall is inaccurate or time between menstruation and ovulation differs from the presumed 15-day interval. Early ultrasound is generally a more accurate method than ...
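The LMP-based estimate itself is simple date arithmetic counted from the first day of the last menstrual period, which is why errors in recalled dates propagate directly into gestational age. A sketch with illustrative dates (not from the study):

```python
from datetime import date

def gestational_age_lmp(lmp, delivery):
    """Gestational age by the LMP convention: completed weeks and days
    counted from the first day of the last menstrual period."""
    days = (delivery - lmp).days
    return days // 7, days % 7

weeks, days = gestational_age_lmp(date(2023, 1, 10), date(2023, 10, 12))
print(f"{weeks}w{days}d")  # 39w2d
```

A recall error of even one week in the LMP date shifts the estimate by a full week, whereas early-ultrasound dating is anchored to fetal biometry.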
Enhancing the Internet of Things Architecture with Flow Semantics
ERIC Educational Resources Information Center
DeSerranno, Allen Ronald
2017-01-01
Internet of Things ("IoT") systems are complex, asynchronous solutions often comprised of various software and hardware components developed in isolation of each other. These components function with different degrees of reliability and performance over an inherently unreliable network, the Internet. Many IoT systems are developed within…
A centrifuge simulated push-pull manoeuvre with subsequent reduced +Gz tolerance.
Xu, Yan; Li, Bao-Hui; Zhang, Li-Hui; Jin, Zhao; Wei, Xiao-Yang; Wang, Hong; Wu, San-Yuan; Wang, Hai-Xia; Wang, Quan; Yan, Gui-Ding; Deng, Lue; Geng, Xi-Chen
2012-07-01
The push-pull effect (PPE) has been recognized as a deleterious contributor to fatal flight accidents. The purpose of the study was to establish a push-pull manoeuvre (PPM) simulation with a tri-axis centrifuge, to study the effect of this PPM on +Gz tolerance, and to make this simulation suitable for pilot centrifuge training. The PPM was realized through pre-programmed acceleration profiles consisting of -1 Gz for 5 s followed by a +Gz plateau for 10 s. Relaxed +Gz tolerance recordings were obtained from 20 healthy male fighter aircraft pilots and 6 healthy male volunteers through exposure to pre-programmed profiles with and without previous -1 Gz exposure. A statistically significant decrease in +Gz tolerance was seen in all subjects after the -1 Gz for 5 s exposure: 0.87 ± 0.13 G in the volunteer group and 0.95 ± 0.25 G in the pilot group. The ear opacity pulse as a +Gz tolerance endpoint criterion was sometimes found to be unreliable during the PPM experiments. The simulated PPM in this study elicited a PPE, which was evident from the significant reduction in +Gz tolerance. The PPM profile appears suitable for inclusion in centrifuge training.
Historical citizen science to understand and predict climate-driven trout decline
Ninyerola, Miquel; Hermoso, Virgilio; Filipe, Ana Filipa; Pla, Magda; Villero, Daniel; Brotons, Lluís; Delibes, Miguel
2017-01-01
Historical species records offer an excellent opportunity to test the predictive ability of range forecasts under climate change, but researchers often consider historical records scarce and unreliable, apart from the datasets collected by renowned naturalists. Here, we demonstrate the relevance of biodiversity records developed through citizen-science initiatives generated outside the natural sciences academia. We used a Spanish geographical dictionary from the mid-nineteenth century to compile over 10 000 freshwater fish records, including almost 4 000 brown trout (Salmo trutta) citations, and constructed a historical presence–absence dataset covering over 2 000 10 × 10 km cells, which is comparable to present-day data. There has been a clear reduction in trout range over the past 150 years, coinciding with generalized warming. We show that current trout distribution can be accurately predicted based on historical records and past and present values of three air temperature variables. The models indicate a consistent decline in average suitability of around 25% between the 1850s and the 2000s, which is expected to surpass 40% by the 2050s. We stress the largely unexplored potential of historical species records from non-academic sources to open new pathways for long-term global change science. PMID:28077766
Electrochemical enzymatic biosensors using carbon nanofiber nanoelectrode arrays
NASA Astrophysics Data System (ADS)
Li, Jun; Li, Yi-fen; Swisher, Luxi Z.; Syed, Lateef U.; Prior, Allan M.; Nguyen, Thu A.; Hua, Duy H.
2012-10-01
The reduction of electrode size down to nanometers could dramatically enhance detection sensitivity and temporal resolution. Nanoelectrode arrays (NEAs) are of particular interest for ultrasensitive biosensors. Here we report the study of two types of biosensors for measuring enzyme activities using NEAs fabricated with vertically aligned carbon nanofibers (VACNFs). VACNFs of ~100 nm in average diameter and 3-5 μm in length were grown on conductive substrates as uniform vertical arrays which were then encapsulated in a SiO2 matrix leaving only the tips exposed. We demonstrate that such VACNF NEAs can be used in profiling enzyme activities through monitoring the change in electrochemical signals induced by enzymatic reactions to the peptides attached to the VACNF tip. The cleavage of the tetrapeptide with a ferrocene tag by a cancer-related protease (legumain) was monitored with AC voltammetry. Real-time electrochemical impedance spectroscopy (REIS) was used for fast label-free detection of two reversible processes, i.e. phosphorylation by c-Src tyrosine kinase and dephosphorylation by protein tyrosine phosphatase 1B (PTP1B). The REIS response to phosphorylation was slow and unreliable, but that of dephosphorylation showed large and fast exponential decay due to the much higher activity of phosphatase PTP1B. The kinetic data were analyzed with a heterogeneous Michaelis-Menten model to derive the "specificity constant" kcat/Km, which is 8.2 × 10^3 M^-1 s^-1 for legumain and (2.1 ± 0.1) × 10^7 M^-1 s^-1 for phosphatase (PTP1B), consistent with literature values. It is promising to develop VACNF NEA based electrochemical enzymatic biosensors as portable multiplex electronic techniques for rapid cancer diagnosis and treatment monitoring.
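The quoted specificity constant can be extracted from an exponential decay like the dephosphorylation REIS signal under a pseudo-first-order assumption ([S] << Km, so k_obs = (kcat/Km)·[E]). A sketch with synthetic data; the 1 nM enzyme concentration is an assumed condition, not the paper's:

```python
import numpy as np

# Under [S] << Km the surface reaction is pseudo-first-order:
#   signal(t) = S0 * exp(-k_obs * t),  with  k_obs = (kcat/Km) * [E],
# so kcat/Km falls out of a log-linear fit to the decay.
kcat_over_km_true = 2.1e7        # M^-1 s^-1, the order of the PTP1B value quoted
enzyme_conc = 1e-9               # 1 nM enzyme, an assumed condition
k_obs = kcat_over_km_true * enzyme_conc      # 0.021 s^-1

t = np.linspace(0, 200, 50)                  # seconds (synthetic time base)
signal = np.exp(-k_obs * t)                  # idealized, noiseless decay

slope, _ = np.polyfit(t, np.log(signal), 1)  # log-linear fit
print(-slope / enzyme_conc)                  # recovers ~2.1e7 M^-1 s^-1
```

With real REIS traces one would add a baseline offset and fit nonlinearly, but the slope-to-kcat/Km relation is the same.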
Regnerus, Mark
2017-09-01
The study of stigma's influence on health has surged in recent years. Hatzenbuehler et al.'s (2014) study of structural stigma's effect on mortality revealed an average of 12 years' shorter life expectancy for sexual minorities who resided in communities thought to exhibit high levels of anti-gay prejudice, using data from the 1988-2002 administrations of the US General Social Survey linked to mortality outcome data in the 2008 National Death Index. In the original study, the key predictor variable (structural stigma) led to results suggesting the profound negative influence of structural stigma on the mortality of sexual minorities. Attempts to replicate the study, in order to explore alternative hypotheses, repeatedly failed to generate the original study's key finding on structural stigma. Efforts to discern the source of the disparity in results revealed complications in the multiple imputation process for missing values of the components of structural stigma. This prompted efforts at replication using 10 different imputation approaches. Efforts to replicate Hatzenbuehler et al.'s (2014) key finding on structural stigma's notable influence on the premature mortality of sexual minorities, including a more refined imputation strategy than described in the original study, failed. No data imputation approach yielded parameters that supported the original study's conclusions. Alternative hypotheses, which originally motivated the present study, revealed little new information. Ten different approaches to multiple imputation of missing data yielded none in which the effect of structural stigma on the mortality of sexual minorities was statistically significant. Minimally, the original study's structural stigma variable (and hence its key result) is so sensitive to subjective measurement decisions as to be rendered unreliable. Copyright © 2016 The Author. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Gillespie, S. A.; Parikh, A.; Barton, C. J.; Faestermann, T.; José, J.; Hertenberger, R.; Wirth, H.-F.; de Séréville, N.; Riley, J. E.; Williams, M.
2017-08-01
Sulphur isotopic ratio measurements may help to establish the astrophysical sites in which certain presolar grains were formed. Nova model predictions of the 34S/32S ratio are, however, unreliable due to the lack of an experimental 34S(p,γ)35Cl reaction rate. To this end, we have measured the 34S(3He,d)35Cl reaction at 20 MeV using a high resolution quadrupole-dipole-dipole-dipole magnetic spectrograph. Twenty-two levels over 6.2 MeV
Chen, Mengni
2018-01-01
Over the past few decades, the level of divorce, measured by the crude divorce rate (CDR), has increased dramatically in both the East and the West, but has recently appeared to fall or level off in some countries. To investigate whether the recent decline or stabilisation of the CDRs reflects real trends in divorce risk, a decomposition analysis was conducted on the changes in the CDRs over the past 20 years for two Western and three East Asian countries, namely the UK, Australia, Taiwan, South Korea, and Singapore. The following is observed: the decline in the CDRs of the UK and Australia in the 1990s, and of Taiwan and Korea in the 2000s, was mainly due to shrinkage in the proportion of the married population rather than any reduction in divorce risk; only Australia experienced a genuine reduction in divorce risk between 2001 and 2011; and the continuous increase in Singapore's divorce level between 1990 and 2010 may be an unintentional effect of the government's marriage promotion policies. The shift in the population age structure and, more importantly, the drastic decline in marriage have seriously distorted the CDRs, making them unreliable indicators for monitoring divorce trends. PMID:29930691
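The decomposition rests on writing CDR = (divorces/married) × (married/population), so a change in the CDR splits exactly into a divorce-risk term and a married-share term. A sketch with illustrative numbers (not the paper's data), using the standard symmetric averaging decomposition:

```python
def decompose_cdr_change(risk0, share0, risk1, share1):
    """Split the change in the crude divorce rate,
       CDR = (divorces / married) * (married / population),
    into a divorce-risk component and a married-share component
    via the symmetric (averaging) decomposition."""
    risk_component = (risk1 - risk0) * (share0 + share1) / 2
    share_component = (share1 - share0) * (risk0 + risk1) / 2
    return risk_component, share_component

# Illustrative: risk per married person rises slightly while the married
# share shrinks, leaving the CDR nearly flat -- the distortion the paper flags.
rc, sc = decompose_cdr_change(risk0=0.010, share0=0.55,
                              risk1=0.011, share1=0.48)
print(rc + sc, 0.011 * 0.48 - 0.010 * 0.55)   # the two must agree exactly
```

Because the two components sum exactly to the observed CDR change, a flat CDR can mask offsetting movements in risk and married share.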
Seashols-Williams, Sarah; Green, Raquel; Wohlfahrt, Denise; Brand, Angela; Tan-Torres, Antonio Limjuco; Nogales, Francy; Brooks, J Paul; Singh, Baneshwar
2018-05-17
Sequencing and classification of microbial taxa within forensically relevant biological fluids has the potential for applications in the forensic science and biomedical fields. The quantity of bacterial DNA from human samples is currently estimated based on quantity of total DNA isolated. This method can miscalculate bacterial DNA quantity due to the mixed nature of the sample, and consequently library preparation is often unreliable. We developed an assay that can accurately and specifically quantify bacterial DNA within a mixed sample for reliable 16S ribosomal DNA (16S rDNA) library preparation and high throughput sequencing (HTS). A qPCR method was optimized using universal 16S rDNA primers, and a commercially available bacterial community DNA standard was used to develop a precise standard curve. Following qPCR optimization, 16S rDNA libraries from saliva, vaginal and menstrual secretions, urine, and fecal matter were amplified and evaluated at various DNA concentrations; successful HTS data were generated with as low as 20 pg of bacterial DNA. Changes in bacterial DNA quantity did not impact observed relative abundances of major bacterial taxa, but relative abundance changes of minor taxa were observed. Accurate quantification of microbial DNA resulted in consistent, successful library preparations for HTS analysis. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
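The standard-curve step of the qPCR assay can be sketched as a log-linear fit of Ct against log10 input quantity, with amplification efficiency read off the slope. Values below are idealized illustrations, not the assay's data:

```python
import numpy as np

def fit_standard_curve(log10_quantity, ct):
    """Linear fit Ct = slope*log10(quantity) + intercept; amplification
    efficiency E = 10^(-1/slope) - 1 (E = 1, i.e. 100%, at slope ~ -3.32)."""
    slope, intercept = np.polyfit(log10_quantity, ct, 1)
    efficiency = 10 ** (-1 / slope) - 1
    return slope, intercept, efficiency

def quantify(ct, slope, intercept):
    """Invert the curve to estimate starting quantity from a sample Ct."""
    return 10 ** ((ct - intercept) / slope)

# Idealized 10-fold dilution series of a bacterial community DNA standard
logq = np.array([1.0, 2.0, 3.0, 4.0, 5.0])       # log10 pg of bacterial DNA
ct = np.array([30.4, 27.1, 23.8, 20.5, 17.2])    # perfectly linear, slope -3.3
slope, intercept, eff = fit_standard_curve(logq, ct)
print(round(slope, 2), round(eff, 3))
print(quantify(25.0, slope, intercept))          # pg in an unknown sample
```

Quantifying the bacterial fraction this way, rather than from total DNA, is what lets library input be normalized for mixed human/bacterial samples.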
The time light signals of New Zealand: yet another way of communicating time in the pre-wireless era
NASA Astrophysics Data System (ADS)
Kinns, Roger
2017-08-01
The signalling of exact time using an array of lights appears to have been unique to New Zealand. It was a simple and effective solution for calibration of marine chronometers when transmission of time signals by wireless was in its infancy. Three lights, coloured green, red and white, were arranged in a vertical array. They were switched on in a defined sequence during the evening and then extinguished together to signal exact time. Time lights were first operated at the Dominion Observatory in Wellington during February 1912 and on the Ferry Building in Auckland during October 1915. The Wellington lights were immediately adjacent to the observatory buildings, but those in Auckland were operated using telegraph signals from Wellington. The timings varied over the years, but the same physical arrangement was retained at each location. The time light service was withdrawn during 1937, when wireless signals had become almost universally available for civil and navigation purposes.
Ar-Ar Impact Heating Ages of Eucrites and Timing of the LHB
NASA Technical Reports Server (NTRS)
Bogard, Donald; Garrison, Daniel
2009-01-01
Eucrites and howardites, more than most meteorite types, show extensive impact resetting of their Ar-39-Ar-40 (K-Ar) ages approximately 3.4-4.1 Ga ago, and many specimens show some disturbance of other radiometric chronometers as well. Bogard (1995) argued that this age resetting occurred on Vesta and was produced by the same general population of objects that produced many of the lunar impact basins. The exact nature of the lunar late heavy bombardment (LHB or 'cataclysm') remains controversial, but its timing is similar to the reset ages of eucrites. Neither the beginning nor the ending time of the lunar LHB is well constrained. Comparison of Ar-Ar ages of brecciated eucrites with data for the lunar LHB can resolve both the origin of these impactors and the time period over which they were delivered to the inner solar system. This abstract reports new Ar-Ar age data for eucrites, obtained since the authors' 1995 and 2003 papers.
Coupled 142Nd-143Nd evidence for a protracted magma ocean in Mars.
Debaille, V; Brandon, A D; Yin, Q Z; Jacobsen, B
2007-11-22
Resolving early silicate differentiation timescales is crucial for understanding the chemical evolution and thermal histories of terrestrial planets. Planetary-scale magma oceans are thought to have formed during early stages of differentiation, but the longevity of such magma oceans is poorly constrained. In Mars, the absence of vigorous convection and plate tectonics has limited the scale of compositional mixing within its interior, thus preserving the early stages of planetary differentiation. The SNC (Shergotty-Nakhla-Chassigny) meteorites from Mars retain 'memory' of these events. Here we apply the short-lived 146Sm-142Nd and the long-lived 147Sm-143Nd chronometers to a suite of shergottites to unravel the history of early silicate differentiation in Mars. Our data are best explained by progressive crystallization of a magma ocean with a duration of approximately 100 million years after core formation. This prolonged solidification requires the existence of a primitive thick atmosphere on Mars that reduces the cooling rate of the interior.
Electromagnetic Probes: A Chronometer of Heavy Ion Collision
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sinha, Bikash
I have known Predhiman for quite some time and I consider his friendship a great privilege. He, along with some of his colleagues, made the almost unique transition, from time to time, from the Quantum Electrodynamics of his (almost classical) electromagnetic plasma to the Quantum Chromodynamics of quarks and gluons. Some of the papers are unique in the sense that they surface to the centre stage of the field of quarks and gluons, giving us new insight; the particular paper of Bannur and Kaw discussing the stability of quark gluon plasma is a particularly interesting one. I wish Predhiman the very best on this occasion and sincerely hope for a long, vital and fruitful life that lies ahead. Interestingly enough, this transition from QED (electromagnetic plasma) to QCD plasma (Quark Gluon Plasma) was motivated by consuming a very special kind of Indian soft nuts on Sunday afternoons; the consumers consisted of two persons, P. K. Kaw and Jitendra Parikh - some nuts!
Constraining the evolution of the Hubble Parameter using cosmic chronometer
NASA Astrophysics Data System (ADS)
Scarlata, Claudia; Dickinson, Hugh
2018-01-01
The Lambda-CDM model of Big Bang cosmology relies heavily on the assumption that two components - dark energy and dark matter - encompass 95% of the energy density of the Universe. Despite the dominant influence of these components, their nature is still entirely unknown. We present the initial results from a project that aims to provide new insights regarding the dark energy component. We do this by deriving independent constraints on the evolution of the Hubble parameter, H(z), using the “cosmic chronometer” method. By analyzing the HST NIR spectra from a large archival sample of passively evolving galaxies in distinct redshift bins between z = 1.3 and 2, we measure the typical stellar population ages (A) for the galaxies in each bin. The differential evolution of stellar population age with redshift (dA/dz) can be used to infer the corresponding evolution of H(z), which will provide important constraints on the nature of dark energy and its equation of state.
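The cosmic-chronometer method rests on the relation H(z) = -1/(1+z) · dz/dt, so differential ages of passively evolving galaxies across narrow redshift bins yield H directly. A numerical consistency check under an assumed flat ΛCDM background (H0 = 70 km/s/Mpc and Ωm = 0.3 are assumptions, not the program's results):

```python
import math

H0 = 70.0                              # km/s/Mpc, assumed fiducial value
Om = 0.3                               # assumed matter density
KM_S_MPC_TO_INV_GYR = 1.0 / 977.8      # 1 km/s/Mpc ~ 1/977.8 Gyr^-1

def age_of_universe(z):
    """Cosmic age at redshift z in Gyr for flat LCDM (midpoint integration
    of dt = dz' / ((1+z') E(z') H0) from z out to z' = 1000)."""
    H0_gyr = H0 * KM_S_MPC_TO_INV_GYR
    n = 100000
    total = 0.0
    for i in range(n):
        zp = z + (i + 0.5) * (1000.0 - z) / n
        E = math.sqrt(Om * (1 + zp) ** 3 + 1 - Om)
        total += (1000.0 - z) / n / ((1 + zp) * E)
    return total / H0_gyr

# "Chronometer" measurement: differential age across a bin centred at z = 1.5
z1, z2 = 1.45, 1.55
dt = age_of_universe(z2) - age_of_universe(z1)   # negative: older at lower z
Hz = -1.0 / (1 + 1.5) * (z2 - z1) / dt           # Gyr^-1
print(Hz / KM_S_MPC_TO_INV_GYR)                  # km/s/Mpc, ~162.5 = H(1.5)
```

The recovered value matches the input model's H(1.5), illustrating why measured stellar-age differentials constrain H(z) without assuming a cosmology.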
Cyclic fatigue of ProFile rotary instruments after prolonged clinical use.
Gambarini, G
2001-07-01
The purpose of the present study was to evaluate resistance to cyclic fatigue of new and used ProFile Ni-Ti rotary instruments. Used instruments were operated in 10 clinical cases using passive instrumentation and a crown-down preparation technique. Cyclic fatigue testing of new and used engine-driven instruments was then performed with a specific device which allowed the instruments to rotate freely inside a stainless steel artificial canal, whilst maintaining conditions close to the clinical situation. Instruments were rotated until fracture occurred and time to fracture was visually recorded with a chronometer. A significant reduction of rotation time to breakage (life span) was noted between new and used instruments. In all sizes new instruments were significantly more resistant than used ones (two-sample t-test, P < 0.01). No instrument underwent intracanal failure during clinical use. Prolonged clinical use of Ni-Ti engine-driven instruments significantly reduced their cyclic fatigue resistance. Nevertheless, each rotary instrument was successfully operated in up to 10 clinical cases without any intracanal failure.
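Time to fracture, as read from the chronometer, is conventionally converted to the number of cycles to failure (NCF) using the rotation speed. A sketch; the 350 rpm figure is an assumed operating speed for illustration, not taken from the study:

```python
def cycles_to_failure(time_to_fracture_s, rpm):
    """Convert time to fracture (seconds) into number of cycles to
    failure, NCF = (rpm / 60) * t."""
    return rpm * time_to_fracture_s / 60.0

# e.g. an instrument run at an assumed 350 rpm breaking after 90 s
print(cycles_to_failure(90, 350))   # 525.0 cycles
```

Reporting NCF rather than raw time makes fatigue resistance comparable across instruments tested at different speeds.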
Combined oxygen-isotope and U-Pb zoning studies of titanite: New criteria for age preservation
Bonamici, Chloe E.; Fanning, C. Mark; Kozdon, Reinhard; ...
2015-02-11
Titanite is an important U-Pb chronometer for dating geologic events, but its high-temperature applicability depends upon its retention of radiogenic lead (Pb). Experimental data predict similar rates of diffusion for Pb and oxygen (O) in titanite at granulite-facies metamorphic conditions (T = 650-800°C). This study therefore investigates the utility of O-isotope zoning as an indicator for U-Pb zoning in natural titanite samples from the Carthage-Colton Mylonite Zone of the Adirondack Mountains, New York. Based on previous field, textural, and microanalytical work, there are four generations (types) of titanite in the study area, at least two of which preserve diffusion-related δ18O zoning. U-Th-Pb was analyzed by SIMS along traverses across three grains of type-2 titanite, which show well-developed diffusional δ18O zoning, and one representative grain from each of the other titanite generations.
Brane with variable tension as a possible solution to the problem of the late cosmic acceleration
NASA Astrophysics Data System (ADS)
García-Aspeitia, Miguel A.; Hernandez-Almada, A.; Magaña, Juan; Amante, Mario H.; Motta, V.; Martínez-Robles, C.
2018-05-01
Braneworld models have been proposed as a possible solution to the problem of the accelerated expansion of the Universe. The idea is to dispense with dark energy (DE) and drive the late-time cosmic acceleration with a five-dimensional geometry. We investigate a brane model with a variable brane tension that is a function of redshift, called a chrono-brane. We propose the power-law function λ = (1 + z)^n, inspired by tracker-scalar-field potentials. To constrain the exponent n we use the latest observational Hubble data from cosmic chronometers, Type Ia supernovae from the full joint-light-curve-analysis (JLA) sample, baryon acoustic oscillations, and the posterior distance from the cosmic microwave background of Planck 2015 measurements. A joint analysis of these data estimates n ≃ 6.19 ± 0.12, which generates a DE-like term (cosmological-constant-like at late times) in the Friedmann equation arising from the extra dimensions. This model is consistent with these data and can drive the Universe to an accelerated phase at late times.
Half-life determination for {sup 108}Ag and {sup 110}Ag
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zahn, Guilherme S.; Genezini, Frederico A.
2014-11-11
In this work, the half-lives of the short-lived silver radionuclides {sup 108}Ag and {sup 110}Ag were measured by following the activity of samples after they were irradiated in the IEA-R1 reactor. The results were fitted using a non-paralyzable dead-time correction to the regular exponential decay, and the individual half-life values obtained were then analyzed using both the Normalized Residuals and the Rajeval techniques, in order to reach the most exact and precise final values. To check the validity of the dead-time correction, a second correction method was also employed, by means of counting a long-lived {sup 60}Co radioactive source together with the samples as a live-time chronometer. The final half-life values obtained using both dead-time correction methods were in good agreement, showing that the correction was properly assessed. The results obtained are partially compatible with the literature values, but with a lower uncertainty, and allow a discussion of the values in the last ENSDF compilations.
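The non-paralyzable correction referred to relates the measured rate m and true rate n through m = n/(1 + nτ), hence n = m/(1 - mτ). A sketch with illustrative numbers (τ and the count rate are not the paper's values):

```python
def dead_time_correct(measured_rate, tau):
    """Non-paralyzable dead-time model: m = n / (1 + n*tau), inverted to
    recover the true rate n = m / (1 - m*tau)."""
    return measured_rate / (1.0 - measured_rate * tau)

# Illustrative: 5 microsecond dead time, 10,000 counts/s measured
tau = 5e-6
m = 1.0e4
n = dead_time_correct(m, tau)
print(round(n, 1))   # true rate ~10526.3 counts/s
```

Applied point by point along a decay curve, this correction removes the rate-dependent bias that would otherwise distort the fitted half-life, which is why the paper cross-checks it against the live-time-chronometer method.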
Reducing Uncertainty in the American Community Survey through Data-Driven Regionalization
Spielman, Seth E.; Folch, David C.
2015-01-01
The American Community Survey (ACS) is the largest survey of US households and is the principal source for neighborhood scale information about the US population and economy. The ACS is used to allocate billions in federal spending and is a critical input to social scientific research in the US. However, estimates from the ACS can be highly unreliable. For example, in over 72% of census tracts, the estimated number of children under 5 in poverty has a margin of error greater than the estimate. Uncertainty of this magnitude complicates the use of social data in policy making, research, and governance. This article presents a heuristic spatial optimization algorithm that is capable of reducing the margins of error in survey data via the creation of new composite geographies, a process called regionalization. Regionalization is a complex combinatorial problem. Here rather than focusing on the technical aspects of regionalization we demonstrate how to use a purpose built open source regionalization algorithm to process survey data in order to reduce the margins of error to a user-specified threshold. PMID:25723176
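The statistical payoff of regionalization follows from the Census Bureau's published rule for aggregating ACS estimates: estimates add, while margins of error combine in quadrature, so relative uncertainty shrinks as tracts merge. A sketch with illustrative tract values (not actual ACS figures):

```python
import math

def combined_estimate(estimates, moes):
    """Census Bureau approximation for a sum of ACS estimates: add the
    estimates; take the root-sum-of-squares of the margins of error."""
    total = sum(estimates)
    moe = math.sqrt(sum(m * m for m in moes))
    return total, moe

# Merging three tracts whose child-poverty counts are individually unreliable
# (each tract's MOE exceeds its estimate, as in 72% of tracts per the text)
est, moe = combined_estimate([40, 25, 35], [45, 30, 38])
print(est, round(moe, 1))
print("MOE exceeds estimate:", moe > est)   # False after merging
```

A regionalization algorithm searches for merges like this subject to a user-specified reliability threshold while keeping the regions spatially contiguous.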
The Magnetic Origins of Solar Activity
NASA Technical Reports Server (NTRS)
Antiochos, S. K.
2012-01-01
The defining physical property of the Sun's corona is that the magnetic field dominates the plasma. This property is the genesis for all solar activity ranging from quasi-steady coronal loops to the giant magnetic explosions observed as coronal mass ejections/eruptive flares. The coronal magnetic field is also the fundamental driver of all space weather; consequently, understanding the structure and dynamics of the field, especially its free energy, has long been a central objective in Heliophysics. The main obstacle to achieving this understanding has been the lack of accurate direct measurements of the coronal field. Most attempts to determine the magnetic free energy have relied on extrapolation of photospheric measurements, a notoriously unreliable procedure. In this presentation I will discuss what measurements of the coronal field would be most effective for understanding solar activity. Not surprisingly, the key process for driving solar activity is magnetic reconnection. I will discuss, therefore, how next-generation measurements of the coronal field will allow us to understand not only the origins of space weather, but also one of the most important fundamental processes in cosmic and laboratory plasmas.
Online anomaly detection in wireless body area networks for reliable healthcare monitoring.
Salem, Osman; Liu, Yaning; Mehaoua, Ahmed; Boutaba, Raouf
2014-09-01
In this paper, we propose a lightweight approach for online detection of faulty measurements by analyzing the data collected from medical wireless body area networks. The proposed framework performs sequential data analysis using a smart phone as a base station, and takes into account the constrained resources of the smart phone, such as processing power and storage capacity. The main objective is to raise alarms only when patients enter an emergency situation, and to discard false alarms triggered by faulty measurements or ill-behaved sensors. The proposed approach is based on the Haar wavelet decomposition, nonseasonal Holt-Winters forecasting, and the Hampel filter, combining spatial and temporal analysis. Our objective is to reduce false alarms resulting from unreliable measurements and to reduce unnecessary healthcare interventions. We apply the proposed approach to a real physiological dataset. Our experimental results demonstrate the effectiveness of our approach in achieving good detection accuracy with a low false alarm rate. The simplicity and processing speed of the proposed framework make it useful and efficient for real-time diagnosis.
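One of the components named above, the Hampel filter, is a robust outlier identifier that is easy to sketch. The heart-rate values, window size, and threshold below are illustrative choices, not the authors' parameters:

```python
import numpy as np

def hampel(series, window=5, n_sigmas=3.0):
    """Hampel identifier: flag points deviating from the rolling median
    by more than n_sigmas robust standard deviations."""
    x = np.asarray(series, dtype=float)
    flags = np.zeros(len(x), dtype=bool)
    k = 1.4826  # scales the MAD to a Gaussian standard deviation
    for i in range(len(x)):
        lo, hi = max(0, i - window), min(len(x), i + window + 1)
        med = np.median(x[lo:hi])
        mad = k * np.median(np.abs(x[lo:hi] - med))
        if mad > 0 and abs(x[i] - med) > n_sigmas * mad:
            flags[i] = True
    return flags

# Synthetic heart-rate stream with two faulty sensor readings injected.
hr = [72, 74, 73, 75, 74, 180, 73, 72, 74, 30, 73, 75]
faulty = hampel(hr)
```

Because the median and MAD are insensitive to the outliers themselves, the two implausible readings are flagged while the normal beat-to-beat variation is left alone, which is exactly the behavior needed to discard faulty measurements without raising alarms.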
NASA Astrophysics Data System (ADS)
Chen, Yaning; Li, Weihong; Fang, Gonghuan; Li, Zhi
2017-02-01
Meltwater from glacierized catchments is one of the most important water supplies in central Asia. Therefore, the effects of climate change on glaciers and snow cover will have increasingly significant consequences for runoff. Hydrological modeling has become an indispensable research approach to water resources management in large glacierized river basins, but the modeling of glacial discharge has received comparatively little attention. This paper reviews the status of hydrological modeling in glacierized catchments of central Asia, discussing the limitations of the available models and extrapolating these to future challenges and directions. After reviewing recent efforts, we conclude that the main sources of uncertainty in assessing the regional hydrological impacts of climate change are the unreliable and incomplete data sets and the lack of understanding of the hydrological regimes of glacierized catchments of central Asia. Runoff trends indicate a complex response to changes in climate. For future variation of water resources, it is essential to quantify the responses of hydrologic processes to both climate change and shrinking glaciers in glacierized catchments, and scientific focus should be on reducing uncertainties linked to these processes.
Effective Temperatures for Young Stars in Binaries
NASA Astrophysics Data System (ADS)
Muzzio, Ryan; Avilez, Ian; Prato, Lisa A.; Biddle, Lauren I.; Allen, Thomas; Wright-Garba, Nuria Meilani Laure; Wittal, Matthew
2017-01-01
We have observed about 100 multi-star systems within the star-forming regions Taurus and Ophiuchus to investigate the individual stellar and circumstellar properties of both components in young T Tauri binaries. Near-infrared spectra were collected using the Keck II telescope’s NIRSPEC spectrograph and imaging data were taken with Keck II’s NIRC2 camera, both behind adaptive optics. Some properties are straightforward to measure; however, determining effective temperature is challenging, as the standard method of estimating spectral type and relating spectral type to effective temperature can be subjective and unreliable. We explicitly looked for a relationship between effective temperatures empirically determined in Mann et al. (2015) and equivalent width ratios of H-band Fe and OH lines for main sequence spectral type templates common to both our infrared observations and to the sample of Mann et al. We find a fit for a wide range of temperatures and are currently testing the validity of using this method as a way to determine effective temperature robustly. Support for this research was provided by an REU supplement to NSF award AST-1313399.
Wade, Kimberley A; Nash, Robert A; Lindsay, D Stephen
2018-05-01
Wixted, Mickes, and Fisher (this issue) take issue with the common trope that eyewitness memory is inherently unreliable. They draw on a large body of mock-crime research and a small number of field studies, which indicate that high-confidence eyewitness reports are usually accurate, at least when memory is uncontaminated and suitable interviewing procedures are used. We agree with the thrust of Wixted et al.'s argument and welcome their invitation to confront the mass underselling of eyewitnesses' potential reliability. Nevertheless, we argue that there is a comparable risk of overselling eyewitnesses' reliability. Wixted et al.'s reasoning implies that near-pristine conditions or uncontaminated memories are normative, but there are at least two good reasons to doubt this. First, psychological science does not yet offer a good understanding of how often and when eyewitness interviews might deviate from best practice in ways that compromise the accuracy of witnesses' reports. Second, witnesses may frequently be exposed to preinterview influences that could corrupt reports obtained in best-practice interviews.
Out of the corner of my eye: Foveal semantic load modulates parafoveal processing in reading
Payne, Brennan R.; Stites, Mallory C.; Federmeier, Kara D.
2016-01-01
In two experiments, we examined the impact of foveal semantic expectancy and congruity on parafoveal word processing during reading. Experiment 1 utilized an eye-tracking gaze-contingent display change paradigm, and Experiment 2 measured event-related brain potentials (ERPs) in a modified flanker RSVP paradigm. Eye-tracking and ERP data converged to reveal graded effects of foveal load on parafoveal processing. In Experiment 1, when word n was highly expected, and thus foveal load was low, there was a large parafoveal preview benefit to word n + 1. When word n was unexpected but still plausible, preview benefits to n + 1 were reduced in magnitude, and when word n was semantically incongruent, the preview benefit to n + 1 was unreliable in early-pass measures. In Experiment 2, ERPs indicated that when word n was expected, and thus foveal load was low, readers successfully discriminated between valid and orthographically invalid previews during parafoveal perception. However, when word n was unexpected, parafoveal processing of n + 1 was reduced, and it was eliminated when word n was semantically incongruent. Taken together, these findings suggest that sentential context modulates the allocation of attention in the parafovea, such that covert allocation of attention to parafoveal processing is disrupted when foveal words are inconsistent with expectations based on various contextual constraints. PMID:27428778
USDA-ARS?s Scientific Manuscript database
catfish propagation for decades has been dependent on random mating of male and female channel catfish in ponds. It is simple and has been fairly successful in fulfilling the needs of the US farm-raised catfish industry. However, natural pond spawning is unreliable, unpredictable, and incurs 30 t...
USDA-ARS?s Scientific Manuscript database
Volatile fatty acid concentrations ([VFA], mM) have long been used to assess impact of dietary treatments on ruminal fermentation in vivo. However, discrepancies in statistical results between VFA and VFA pool size (VFAmol), possibly related to ruminal digesta liquid amount (LIQ, kg), suggest issues...
How reliable are amphibian population metrics? A response to Kroll et al.
Hartwell H. Welsh; Karen L. Pope; Clara A. Wheeler
2009-01-01
Kroll et al. [Kroll, A.J., Runge, J.P., MacCracken, J.G., 2009. Unreliable amphibian population metrics may obfuscate more than they reveal. Biological Conservation] criticized our recent advocacy for combining readily attainable metrics of population status to gain insight about relationships between terrestrial plethodontid salamanders and forest succession [Welsh,...
Why Are Experts Correlated? Decomposing Correlations between Judges
ERIC Educational Resources Information Center
Broomell, Stephen B.; Budescu, David V.
2009-01-01
We derive an analytic model of the inter-judge correlation as a function of five underlying parameters. Inter-cue correlation and the number of cues capture our assumptions about the environment, while differentiations between cues, the weights attached to the cues, and (un)reliability describe assumptions about the judges. We study the relative…
Speaker Reliability Guides Children's Inductive Inferences about Novel Properties
ERIC Educational Resources Information Center
Kim, Sunae; Kalish, Charles W.; Harris, Paul L.
2012-01-01
Prior work shows that children can make inductive inferences about objects based on their labels rather than their appearance (Gelman, 2003). A separate line of research shows that children's trust in a speaker's label is selective. Children accept labels from a reliable speaker over an unreliable speaker (e.g., Koenig & Harris, 2005). In the…
Assessment and Placement: Supporting Student Success in College Gateway Courses
ERIC Educational Resources Information Center
Vandal, Bruce
2014-01-01
Evidence is mounting that the vast majority of students who are currently placed into prerequisite remedial education could be successful in gateway college-level courses if they receive additional academic support as a corequisite. Recent research on college placement exams reveals that the exams are unreliable at predicting college success, and…
Manipulating Public Opinion about Trying Juveniles as Adults: An Experimental Study
ERIC Educational Resources Information Center
Steinberg, Laurence; Piquero, Alex R.
2010-01-01
Public attitudes about juvenile crime play a significant role in fashioning juvenile justice policy; variations in the wording of public opinion surveys can produce very different responses and can result in inaccurate and unreliable assessments of public sentiment. Surveys that ask about policy alternatives in vague terms are especially…
ERIC Educational Resources Information Center
Malcarney, Mary-Beth; Horton, Katherine; Seiler, Naomi
2016-01-01
Background: School nurses can provide direct services for children with asthma, educate, and reinforce treatment recommendations to children and their families, and coordinate the school-wide response to students' asthma emergencies. Unfortunately, school-based health services today depend on an unreliable patchwork of funding. Limited state and…
Autocheck: Addressing the Problem of Rural Transportation.
ERIC Educational Resources Information Center
Payne, Guy A.
This paper describes a project implemented by a social worker from the Glynn County School District in rural Georgia to address transportation problems experienced by students and their families. The project aims to assist families who are unable to keep appointments or attend other important events due to unreliable transportation. A county needs…
A New Framework of Happiness Survey and Evaluation of National Wellbeing
ERIC Educational Resources Information Center
Zhou, Haiou
2012-01-01
Happiness surveys based on self-reporting may generate unreliable data due to respondents' imperfect retrospection, vulnerability to context and arbitrariness in measuring happiness. To overcome these problems, this paper proposes to combine a happiness evaluation method developed by Ng (Soc Indic Res, 38:1-29, 1996) with the day reconstruction…
Taking Teacher Quality Seriously: A Collaborative Approach to Teacher Evaluation
ERIC Educational Resources Information Center
Karp, Stan
2012-01-01
If narrow, test-based evaluation of teachers is unfair, unreliable, and has negative effects on kids, classrooms, and curricula, what's a better approach? By demonizing teachers and unions, and sharply polarizing the education debate, the corporate reform movement has actually undermined serious efforts to improve teacher quality and evaluation.…
ERIC Educational Resources Information Center
Scofield, Jason; Gilpin, Ansley Tullos; Pierucci, Jillian; Morgan, Reed
2013-01-01
Studies show that children trust previously reliable sources over previously unreliable ones (e.g., Koenig, Clement, & Harris, 2004). However, it is unclear from these studies whether children rely on accuracy or conventionality to determine the reliability and, ultimately, the trustworthiness of a particular source. In the current study, 3- and…
Surge in Journal Retractions May Mask Decline in Actual Problems
ERIC Educational Resources Information Center
Basken, Paul
2012-01-01
Scientific journals have been retracting unreliable articles at rapidly escalating rates in the past few years, raising concern about whether research faces a burgeoning ethical crisis. Various causes have been suspected, with the common theme being that journals are seeing more cases of plagiarism and fudging of data as researchers and editors…
The Unreliability of References
ERIC Educational Resources Information Center
Barden, Dennis M.
2008-01-01
When search consultants, like the author, are invited to propose their services in support of a college or university seeking new leadership, they are generally asked a fairly standard set of questions. But there is one question that they find among the most difficult to answer: How do they check a candidate's references to ensure that they know…
Identifying Personality Disorders that are Security Risks: Field Test Results
2011-09-01
clinical personality disorders, namely psychopathy, malignant narcissism, and borderline personality organization, can increase the likelihood of... ratings indicated that three personality disorders, psychopathy, malignant narcissism, and borderline personality organization, were associated with... certain clinical personality disorders and unreliable and unsafe behavior in the workplace, disorders such as psychopathy and malignant narcissism
Forest Ecosystem Services As Production Inputs
Subhrendu Pattanayak; David T. Butry
2003-01-01
Are we cutting down tropical forests too rapidly and too extensively? If so, why? Answers to both questions are obscured in some ways by insufficient and unreliable data on the economic worth of forest ecosystem services. It is clear, however, that rapid, excessive cutting of forests can irreversibly and substantively impair ecosystem functions, thereby endangering the...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-13
... Station Unit 7. The scrubber adds moisture to the exhaust gas, which condenses as the gas stream cools. According to Indiana Department of Environmental Management (IDEM), the condensation causes unreliable... impairment caused by particulate and light impairment caused by moisture. The scrubber also removes some PM...
The Use of Structure Coefficients to Address Multicollinearity in Sport and Exercise Science
ERIC Educational Resources Information Center
Yeatts, Paul E.; Barton, Mitch; Henson, Robin K.; Martin, Scott B.
2017-01-01
A common practice in general linear model (GLM) analyses is to interpret regression coefficients (e.g., standardized β weights) as indicators of variable importance. However, focusing solely on standardized beta weights may provide limited or erroneous information. For example, β weights become increasingly unreliable when predictor variables are…
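The instability this snippet describes, and the structure-coefficient remedy the title points to, can be demonstrated on synthetic data (not from the article). With two nearly collinear predictors, the standardized beta weights split credit between them unstably, while each predictor's structure coefficient, its correlation with the predicted scores, stays near 1 for both:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Two highly collinear predictors plus noise (synthetic, for illustration).
x1 = rng.normal(size=n)
x2 = x1 + 0.1 * rng.normal(size=n)   # correlation with x1 around 0.995
y = x1 + x2 + rng.normal(size=n)

# Standardize everything so the regression weights are beta weights.
X = np.column_stack([x1, x2])
Xz = (X - X.mean(0)) / X.std(0)
yz = (y - y.mean()) / y.std()

beta, *_ = np.linalg.lstsq(Xz, yz, rcond=None)   # standardized beta weights
yhat = Xz @ beta

# Structure coefficient: correlation of each predictor with the predicted scores.
rs = [np.corrcoef(Xz[:, j], yhat)[0, 1] for j in range(2)]
```

The beta weights depend on which of the two near-duplicate predictors happens to soak up the shared variance in this sample, whereas both structure coefficients correctly report that each predictor is strongly related to the model's predictions.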
A method for polycrystalline silicon delineation applicable to a double-diffused MOS transistor
NASA Technical Reports Server (NTRS)
Halsor, J. L.; Lin, H. C.
1974-01-01
Method is simple and eliminates requirement for unreliable special etchants. Structure is graded in resistivity to prevent punch-through and has very narrow channel length to increase frequency response. Contacts are on top to permit planar integrated circuit structure. Polycrystalline shield will prevent creation of inversion layer in isolated region.
An identifiable model for informative censoring
Link, W.A.; Wegman, E.J.; Gantz, D.T.; Miller, J.J.
1988-01-01
The usual model for censored survival analysis requires the assumption that censoring of observations arises only due to causes unrelated to the lifetime under consideration. It is easy to envision situations in which this assumption is unwarranted, and in which use of the Kaplan-Meier estimator and associated techniques will lead to unreliable analyses.
ERIC Educational Resources Information Center
Barden, Dennis M.
2008-01-01
There are two kinds of references in administrative hires. The most customary is the "on list" references which a candidate asks one to provide. The second kind of reference is the "off list" variety, of which there are two types. Typical is the call one receives from an acquaintance at the hiring institution asking for the "dirt" on one's…
Speaker Reliability in Preschoolers' Inferences about the Meanings of Novel Words
ERIC Educational Resources Information Center
Sobel, David M.; Sedivy, Julie; Buchanan, David W.; Hennessy, Rachel
2012-01-01
Preschoolers participated in a modified version of the disambiguation task, designed to test whether the pragmatic environment generated by a reliable or unreliable speaker affected how children interpreted novel labels. Two objects were visible to children, while a third was only visible to the speaker (a fact known by the child). Manipulating…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-09
... to produce a descriptive database of existing ferry operations. Recently enacted MAP-21 legislation... Administration (FHWA) Office of Intermodal and Statewide Planning conducted a survey of approximately 250 ferry... designed to target ridership and terminal information that typically produce unreliable and/or incomplete...
Reported last menstrual period (LMP) is commonly used to estimate gestational age (GA) but may be unreliable. Ultrasound in the first trimester is generally considered a highly accurate method of pregnancy dating. The authors compared first trimester report of LMP and first trime...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-17
... likely to worsen, making travel times unreliable. In addition, space constraints limit the potential to... activity is expected to generate increased travel demand. By 2040, statewide population is expected to grow... continuing transportation challenges as evidenced by the following: Constrained Travel Options--While the...
Ghosts in the Machine: Incarcerated Students and the Digital University
ERIC Educational Resources Information Center
Hopkins, Susan
2015-01-01
Providing higher education to offenders in custody has become an increasingly complex business in the age of digital learning. Most Australian prisoners still have no direct access to the internet and relatively unreliable access to information technology. As incarceration is now a business, prisons, like universities, are increasingly subject to…
Preschoolers' Understanding of Subtraction-Related Principles
ERIC Educational Resources Information Center
Baroody, Arthur J.; Lai, Meng-lung; Li, Xia; Baroody, Alison E.
2009-01-01
Little research has focused on an informal understanding of subtractive negation (e.g., 3 - 3 = 0) and subtractive identity (e.g., 3 - 0 = 3). Previous research indicates that preschoolers may have a fragile (i.e., unreliable or localized) understanding of the addition-subtraction inverse principle (e.g., 2 + 1 - 1 = 2). Recognition of a small…
A public health decision support system model using reasoning methods.
Mera, Maritza; González, Carolina; Blobel, Bernd
2015-01-01
Public health programs must be based on the real health needs of the population. However, the design of efficient and effective public health programs is subject to availability of information that can allow users to identify, at the right time, the health issues that require special attention. The objective of this paper is to propose a case-based reasoning model for the support of decision-making in public health. The model integrates a decision-making process and case-based reasoning, reusing past experiences for promptly identifying new population health priorities. A prototype implementation of the model was performed, deploying the case-based reasoning framework jColibri. The proposed model contributes to solving problems found today when designing public health programs in Colombia. Current programs are developed under uncertain environments, as the underlying analyses are carried out on the basis of outdated and unreliable data.
Low-noise encoding of active touch by layer 4 in the somatosensory cortex.
Hires, Samuel Andrew; Gutnisky, Diego A; Yu, Jianing; O'Connor, Daniel H; Svoboda, Karel
2015-08-06
Cortical spike trains often appear noisy, with the timing and number of spikes varying across repetitions of stimuli. Spiking variability can arise from internal (behavioral state, unreliable neurons, or chaotic dynamics in neural circuits) and external (uncontrolled behavior or sensory stimuli) sources. The amount of irreducible internal noise in spike trains, an important constraint on models of cortical networks, has been difficult to estimate, since behavior and brain state must be precisely controlled or tracked. We recorded from excitatory barrel cortex neurons in layer 4 during active behavior, where mice control tactile input through learned whisker movements. Touch was the dominant sensorimotor feature, with >70% spikes occurring in millisecond timescale epochs after touch onset. The variance of touch responses was smaller than expected from Poisson processes, often reaching the theoretical minimum. Layer 4 spike trains thus reflect the millisecond-timescale structure of tactile input with little noise.
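The sub-Poisson variability reported above is conventionally quantified with the Fano factor, the variance-to-mean ratio of spike counts across repeated trials. A minimal illustration with made-up counts (the specific numbers are hypothetical, chosen only to contrast a near-deterministic response with a rate-matched Poisson control):

```python
import numpy as np

def fano_factor(counts):
    """Variance-to-mean ratio of spike counts across repeated trials.
    Poisson spiking gives 1; sub-Poisson (low-noise) coding gives < 1."""
    counts = np.asarray(counts, dtype=float)
    return counts.var(ddof=1) / counts.mean()

# Hypothetical per-trial spike counts in a fixed post-touch window:
reliable = [1, 1, 1, 2, 1, 1, 1, 1, 2, 1]      # near-deterministic response
rng = np.random.default_rng(1)
poisson = rng.poisson(lam=1.2, size=1000)      # rate-matched Poisson control

f_rel = fano_factor(reliable)                  # well below 1
f_poi = fano_factor(poisson)                   # close to 1
```

A Fano factor far below 1, as in the reliable example, is the signature of spike trains whose variance approaches the theoretical minimum set by the discreteness of spike counts.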
Highly repeatable nanoscale phase coexistence in vanadium dioxide films
NASA Astrophysics Data System (ADS)
Huffman, T. J.; Lahneman, D. J.; Wang, S. L.; Slusar, T.; Kim, Bong-Jun; Kim, Hyun-Tak; Qazilbash, M. M.
2018-02-01
It is generally believed that in first-order phase transitions in materials with imperfections, the formation of phase domains must be affected to some extent by stochastic (probabilistic) processes. The stochasticity would lead to unreliable performance in nanoscale devices that have the potential to exploit the transformation of physical properties in a phase transition. Here we show that stochasticity at nanometer length scales is completely suppressed in the thermally driven metal-insulator transition (MIT) in sputtered vanadium dioxide (V O2 ) films. The nucleation and growth of domain patterns of metallic and insulating phases occur in a strikingly reproducible way. The completely deterministic nature of domain formation and growth in films with imperfections is a fundamental and unexpected finding about the kinetics of this material. Moreover, it opens the door for realizing reliable nanoscale devices based on the MIT in V O2 and similar phase-change materials.
Interdependent networks: the fragility of control
Morris, Richard G.; Barthelemy, Marc
2013-01-01
Recent work in the area of interdependent networks has focused on interactions between two systems of the same type. However, an important and ubiquitous class of systems are those involving monitoring and control, an example of interdependence between processes that are very different. In this Article, we introduce a framework for modelling ‘distributed supervisory control' in the guise of an electrical network supervised by a distributed system of control devices. The system is characterised by degrees of freedom salient to real-world systems, namely the number of control devices, their inherent reliability, and the topology of the control network. Surprisingly, the behavior of the system depends crucially on the reliability of control devices. When devices are completely reliable, cascade sizes are percolation controlled; the number of devices being the relevant parameter. For unreliable devices, the topology of the control network is important and can dramatically reduce the resilience of the system. PMID:24067404
Automation of Hessian-Based Tubularity Measure Response Function in 3D Biomedical Images.
Dzyubak, Oleksandr P; Ritman, Erik L
2011-01-01
The blood vessels and nerve trees consist of tubular objects interconnected into a complex tree- or web-like structure that spans a large range of structural scales, from 5 μm diameter capillaries to the 3 cm aorta. This large scale range presents two major problems; one is just making the measurements, and the other is the exponential increase of component numbers with decreasing scale. With the remarkable increase in the volume imaged by, and resolution of, modern-day 3D imagers, it is almost impossible to manually track the complex multiscale parameters from those large image data sets. In addition, manual tracking is quite subjective and unreliable. We propose a solution for automation of an adaptive nonsupervised system for tracking tubular objects based on a multiscale framework and use of a Hessian-based object shape detector incorporating National Library of Medicine Insight Segmentation and Registration Toolkit (ITK) image processing libraries.
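The core of a Hessian-based shape detector is the eigenvalue signature of the local Hessian: for a bright tube, one eigenvalue (along the axis) is near zero while the other two (across the section) are strongly negative. The sketch below demonstrates this on a synthetic tube with a simplified Frangi-style criterion; it stands in for, and is much cruder than, the authors' actual ITK pipeline:

```python
import numpy as np

# Synthetic volume: a bright tube of Gaussian cross-section along the z axis.
z, y, x = np.mgrid[0:21, 0:21, 0:21]
vol = np.exp(-((y - 10) ** 2 + (x - 10) ** 2) / (2 * 2.0 ** 2))

def hessian_eigs(vol, point):
    """Eigenvalues of the local Hessian, sorted by absolute value.
    For a bright tube: |l1| ~ 0 (axis direction), l2 and l3 strongly negative."""
    grads = np.gradient(vol)
    H = np.empty((3, 3))
    for i in range(3):
        second = np.gradient(grads[i])
        for j in range(3):
            H[i, j] = second[j][point]
    eig = np.linalg.eigvalsh(H)
    return eig[np.argsort(np.abs(eig))]

l1, l2, l3 = hessian_eigs(vol, (10, 10, 10))
# Simplified tubularity test in the spirit of Hessian-based (Frangi-type) filters:
tubularity = (abs(l2) > 10 * abs(l1)) and (l2 < 0) and (l3 < 0)
```

On the tube axis the along-axis eigenvalue vanishes while the two cross-sectional eigenvalues are equal and negative, so the point is classified as tubular; evaluating the same signature at every voxel and across multiple smoothing scales is what makes the approach multiscale.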
Plan-graph Based Heuristics for Conformant Probabilistic Planning
NASA Technical Reports Server (NTRS)
Ramakrishnan, Salesh; Pollack, Martha E.; Smith, David E.
2004-01-01
In this paper, we introduce plan-graph based heuristics to solve a variation of the conformant probabilistic planning (CPP) problem. In many real-world problems, it is the case that the sensors are unreliable or take too many resources to provide knowledge about the environment. These domains are better modeled as conformant planning problems. POMDP based techniques are currently the most successful approach for solving CPP but have the limitation of state-space explosion. Recent advances in deterministic and conformant planning have shown that plan-graphs can be used to enhance the performance significantly. We show that this enhancement can also be translated to CPP. We describe our process for developing the plan-graph heuristics and estimating the probability of a partial plan. We compare the performance of our planner PVHPOP when used with different heuristics. We also perform a comparison with a POMDP solver to show over an order of magnitude improvement in performance.
The emergence of asymmetric normal fault systems under symmetric boundary conditions
NASA Astrophysics Data System (ADS)
Schöpfer, Martin P. J.; Childs, Conrad; Manzocchi, Tom; Walsh, John J.; Nicol, Andrew; Grasemann, Bernhard
2017-11-01
Many normal fault systems and, on a smaller scale, fracture boudinage often exhibit asymmetry with one fault dip direction dominating. It is a common belief that the formation of domino and shear band boudinage with a monoclinic symmetry requires a component of layer parallel shearing. Moreover, domains of parallel faults are frequently used to infer the presence of a décollement. Using Distinct Element Method (DEM) modelling, we show that asymmetric fault systems can emerge under symmetric boundary conditions. A statistical analysis of DEM models suggests that the fault dip directions and system polarities can be explained using a random process if the strength contrast between the brittle layer and the surrounding material is high. The models indicate that domino and shear band boudinage are unreliable shear-sense indicators. Moreover, the presence of a décollement should not be inferred on the basis of a domain of parallel faults alone.
Learning under uncertainty in smart home environments.
Zhang, Shuai; McClean, Sally; Scotney, Bryan; Nugent, Chris
2008-01-01
Technologies and services for the home environment can provide levels of independence for elderly people to support 'ageing in place'. Learning inhabitants' patterns of carrying out daily activities is a crucial component of these technological solutions with sensor technologies being at the core of such smart environments. Nevertheless, identifying high-level activities from low-level sensor events can be a challenge, as information may be unreliable resulting in incomplete data. Our work addresses the issues of learning in the presence of incomplete data along with the identification and the prediction of inhabitants and their activities under such uncertainty. We show via the evaluation results that our approach also offers the ability to assess the impact of various sensors in the activity recognition process. The benefit of this work is that future predictions can be utilised in a proposed intervention mechanism in a real smart home environment.
Halwas, Nikolaus; Griebel, Lena; Huebner, Jutta
2017-11-01
The aim of our study was to investigate Internet and eHealth usage, with respect to eHealth literacy, by cancer patients and their relatives. Using a standardized questionnaire, we surveyed patients who attended lectures on complementary medicine in 2016. We received 142 questionnaires. The frequency of general Internet usage was directly associated with younger age and better Internet connection. Younger participants were not only more confident in classifying health-related Internet information as reliable or unreliable, but also more confident and capable of gaining medical knowledge through eHealth services. A regular use of eHealth services facilitated the decision-making process. Reading ability was associated with a better understanding of eHealth offers. In a modern health care system, emphasis should be placed on skills contributing to eHealth literacy among patients to improve their ability to profit from eHealth offers and improve health care.
Lee, Seung Soo; Kim, In Ho
2013-12-01
Primary gastric lymphoma is a rare gastric malignancy. Its diagnostic process is complex. A clinician may find an initial diagnosis of primary gastric lymphoma unreliable, especially when it indicates the rarest subtype of gastric lymphoma while the initial endoscopic presentation fails to raise the slightest suspicion of primary gastric lymphoma. A 53-year-old Korean man was diagnosed, by endoscopic examination, with a round submucosal tumor of the stomach. Deep endoscopic biopsy, however, confirmed CD5-positive gastric lymphoma. Surgical treatment was performed for diagnosis and treatment. Postoperative histological examination confirmed gastric schwannoma. Gastric schwannoma is a spindle cell tumor, characterized by a peripheral cuff-like lymphocytic infiltration. The deep endoscopic biopsy may have been misdirected to the peripheral lymphoid cuff, failing to acquire spindle cells. The literature has been reviewed, and options for diagnostic accuracy have been suggested.
Planas, M; Rodríguez, T; Lecha, M
2004-01-01
Decisions have to be made about which data on patient characteristics, processes, and outcomes need to be collected, and standard definitions of these data items need to be developed, to identify data quality concerns as promptly as possible and to establish ways to improve data quality. The usefulness of any clinical database depends strongly on the quality of the collected data. If the data quality is poor, the results of studies using the database might be biased and unreliable. Furthermore, if the quality of the database has not been verified, the results might be given little credence, especially if they are unwelcome or unexpected. To assure the quality of a clinical database, a clear definition of the uses to which the database will be put is essential; the database should be developed to be comprehensive in its usefulness but limited in its size.
Bidirectional Active Learning: A Two-Way Exploration Into Unlabeled and Labeled Data Set.
Zhang, Xiao-Yu; Wang, Shupeng; Yun, Xiaochun
2015-12-01
In practical machine learning applications, human instruction is indispensable for model construction. To utilize the precious labeling effort effectively, active learning queries the user with selective sampling in an interactive way. Traditional active learning techniques merely focus on the unlabeled data set under a unidirectional exploration framework and suffer from model deterioration in the presence of noise. To address this problem, this paper proposes a novel bidirectional active learning algorithm that explores both the unlabeled and the labeled data sets simultaneously in a two-way process. For the acquisition of new knowledge, forward learning queries the most informative instances from the unlabeled data set. For the introspection of learned knowledge, backward learning detects the most suspiciously unreliable instances within the labeled data set. Under the two-way exploration framework, the generalization ability of the learning model can be greatly improved, as demonstrated by encouraging experimental results.
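One round of the forward/backward exploration described above can be sketched with scikit-learn. The selection criteria used here (predictive entropy for forward queries, low model confidence in the assigned label for backward checks) and the synthetic data are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=10, random_state=0)
labeled = rng.choice(300, size=40, replace=False)
unlabeled = np.setdiff1d(np.arange(300), labeled)

clf = LogisticRegression(max_iter=1000).fit(X[labeled], y[labeled])
proba = clf.predict_proba(X)

# Forward learning: query the most informative (highest-entropy) unlabeled points.
entropy = -(proba * np.log(proba + 1e-12)).sum(axis=1)
forward_queries = unlabeled[np.argsort(entropy[unlabeled])[-5:]]

# Backward learning: flag labeled points whose assigned label the current
# model finds least plausible -- candidates for noisy annotations.
label_conf = proba[labeled, y[labeled]]
backward_suspects = labeled[np.argsort(label_conf)[:5]]
```

Forward queries would go to the annotator for labeling; backward suspects would be re-examined or down-weighted before retraining.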
Strategies for the coupling of global and local crystal growth models
NASA Astrophysics Data System (ADS)
Derby, Jeffrey J.; Lun, Lisa; Yeckel, Andrew
2007-05-01
The modular coupling of existing numerical codes to model crystal growth processes will provide for maximum effectiveness, capability, and flexibility. However, significant challenges are posed to make these coupled models mathematically self-consistent and algorithmically robust. This paper presents sample results from a coupling of the CrysVUn code, used here to compute furnace-scale heat transfer, and Cats2D, used to calculate melt fluid dynamics and phase-change phenomena, to form a global model for a Bridgman crystal growth system. However, the strategy used to implement the CrysVUn-Cats2D coupling is unreliable and inefficient. The implementation of under-relaxation within a block Gauss-Seidel iteration is shown to be ineffective for improving the coupling performance in a model one-dimensional problem representative of a melt crystal growth model. Ideas to overcome current convergence limitations using approximations to a full Newton iteration method are discussed.
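The under-relaxed block Gauss-Seidel coupling discussed above can be illustrated on a toy problem. The two scalar fixed-point maps below stand in for the furnace-scale and melt-scale solvers and are assumptions for illustration, not the CrysVUn-Cats2D system itself.

```python
def solver_A(y):
    # stand-in for the furnace-scale heat-transfer code
    return 0.8 * y + 1.0

def solver_B(x):
    # stand-in for the melt flow / phase-change code
    return 0.9 * x - 0.5

def coupled_gauss_seidel(omega, tol=1e-10, max_iter=10_000):
    """Alternate the two solvers, under-relaxing each update by omega."""
    x = y = 0.0
    for k in range(1, max_iter + 1):
        x = (1 - omega) * x + omega * solver_A(y)   # under-relaxed update
        y = (1 - omega) * y + omega * solver_B(x)
        if abs(x - solver_A(y)) < tol and abs(y - solver_B(x)) < tol:
            return x, y, k
    return x, y, max_iter

x, y, iters = coupled_gauss_seidel(omega=0.5)
```

For this contractive model problem the iteration converges for moderate omega; the abstract's point is that such under-relaxation can still fail to rescue convergence in realistic coupled crystal growth models.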
Laurie, Matthew T; Bertout, Jessica A; Taylor, Sean D; Burton, Joshua N; Shendure, Jay A; Bielas, Jason H
2013-08-01
Due to the high cost of failed runs and suboptimal data yields, quantification and determination of fragment size range are crucial steps in the library preparation process for massively parallel sequencing (or next-generation sequencing). Current library quality control methods commonly involve quantification using real-time quantitative PCR and size determination using gel or capillary electrophoresis. These methods are laborious and subject to a number of significant limitations that can make library calibration unreliable. Herein, we propose and test an alternative method for quality control of sequencing libraries using droplet digital PCR (ddPCR). By exploiting a correlation we have discovered between droplet fluorescence and amplicon size, we achieve the joint quantification and size determination of target DNA with a single ddPCR assay. We demonstrate the accuracy and precision of applying this method to the preparation of sequencing libraries.
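The idea of jointly calibrating size against droplet fluorescence can be sketched as a simple linear calibration: fit a model on standards of known size, then invert it for an unknown library. All numbers below are invented for illustration; the actual fluorescence-size relationship must be established empirically as in the study.

```python
import numpy as np

# Hypothetical calibration standards: mean droplet fluorescence vs fragment size (bp)
fluorescence = np.array([5000.0, 6200.0, 7400.0, 8600.0])
size_bp = np.array([200.0, 300.0, 400.0, 500.0])

# Fit a straight-line calibration curve to the standards
slope, intercept = np.polyfit(fluorescence, size_bp, 1)

def estimate_size(f):
    """Estimate fragment size (bp) from mean droplet fluorescence."""
    return slope * f + intercept

unknown = estimate_size(6800.0)   # size estimate for an unknown library
```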
Design of interstellar digital communication links: Some insights from communication engineering
NASA Astrophysics Data System (ADS)
Messerschmitt, David G.; Morrison, Ian S.
2012-09-01
The design of an end-to-end digital interstellar communication system at radio frequencies is discussed, drawing on the disciplines of digital communication engineering and computer network engineering in terrestrial and near-space applications. One goal is a roadmap to the design of such systems, aimed at future designers of either receivers (SETI) or transmitters (METI). In particular we emphasize the implications arising from the impossibility of coordination between transmitter and receiver prior to a receiver's search for a signal. A system architecture based on layering, as commonly used in network and software design, assists in organizing and categorizing the various design issues and identifying dependencies. Implications of impairments introduced in the interstellar medium, such as dispersion, scattering, Doppler, noise, and signal attenuation are discussed. Less fundamental (but nevertheless influential) design issues are the motivations of the transmitter designers and associated resource requirements at both transmitter and receiver. Unreliability is inevitably imposed by non-idealities in the physical communication channel, and this unreliability will have substantial implications for those seeking to convey interstellar messages.
The N-policy for an unreliable server with delaying repair and two phases of service
NASA Astrophysics Data System (ADS)
Choudhury, Gautam; Ke, Jau-Chuan; Tadj, Lotfi
2009-09-01
This paper deals with an M[X]/G/1 queue with an additional second phase of optional service and an unreliable server, whose failure behaviour consists of a breakdown period and a delay period, under an N-policy. While the server is working on either phase of service, it may break down at any instant, and the service channel then fails for a short interval of time; the concept of a delay time before repair is also introduced. If no customer arrives during the breakdown period, the server remains idle in the system until the queue size builds up to a threshold value N. As soon as the queue size reaches at least N, the server immediately begins the first phase of regular service for all the waiting customers; after its completion, only some of them receive the second, optional phase of service. We derive the queue size distribution at a random epoch and at a departure epoch, as well as various system performance measures. Finally, we derive a simple procedure to obtain the optimal stationary policy under a suitable linear cost structure.
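A stripped-down discrete-event sketch of the N-policy itself: the server stays idle until N customers have accumulated, then serves exhaustively. Breakdowns, the delay period, batch arrivals, and the optional second phase are omitted for brevity, and the exponential rates are illustrative assumptions rather than the paper's general-service model.

```python
import heapq
import random

def simulate_n_policy(N, lam=0.5, mu=1.0, horizon=10_000, seed=1):
    """Count departures in an M/M/1-style queue under an N-policy."""
    rng = random.Random(seed)
    queue, busy, served = 0, False, 0
    events = [(rng.expovariate(lam), "arrival")]
    while events:
        t, kind = heapq.heappop(events)
        if t > horizon:
            break
        if kind == "arrival":
            queue += 1
            heapq.heappush(events, (t + rng.expovariate(lam), "arrival"))
            if not busy and queue >= N:        # N-policy: activate the server
                busy = True
                heapq.heappush(events, (t + rng.expovariate(mu), "departure"))
        else:                                  # departure
            queue -= 1
            served += 1
            if queue > 0:                      # exhaustive service
                heapq.heappush(events, (t + rng.expovariate(mu), "departure"))
            else:
                busy = False                   # idle until N accumulate again
    return served

served = simulate_n_policy(N=3)
```

Such a simulator is a convenient cross-check for the analytical queue size distributions the paper derives.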
Augmented reality-based electrode guidance system for reliable electroencephalography.
Song, Chanho; Jeon, Sangseo; Lee, Seongpung; Ha, Ho-Gun; Kim, Jonghyun; Hong, Jaesung
2018-05-24
In longitudinal electroencephalography (EEG) studies, repeatable electrode positioning is essential for reliable EEG assessment. Conventional methods use anatomical landmarks as fiducial locations for electrode placement. Since the landmarks are identified manually, the EEG assessment is inevitably unreliable because of individual variation among subjects and examiners. To overcome this unreliability, an augmented reality (AR) visualization-based electrode guidance system is proposed to replace manual electrode positioning. After the facial surface of a subject is scanned and registered by an RGB-D camera, the initial electrode positions are overlaid in AR as reference positions on the current electrode positions in real time, guiding the placement of subsequent electrodes with high repeatability. Experimental results with a phantom show that the repeatability of electrode positioning was improved compared with the conventional 10-20 positioning system. The proposed AR guidance system improves electrode positioning performance with a cost-effective setup that uses only an RGB-D camera, and can serve as an alternative to the international 10-20 system.
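The rigid alignment underlying this kind of surface registration can be sketched with the Kabsch algorithm (SVD-based least-squares rotation between point sets). The synthetic point sets below are assumptions for illustration; the paper's actual registration pipeline is not specified here.

```python
import numpy as np

def kabsch(P, Q):
    """Rotation R and translation t minimizing ||R p_i + t - q_i|| over corresponding points."""
    p0, q0 = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p0).T @ (Q - q0)                  # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q0 - R @ p0
    return R, t

rng = np.random.default_rng(7)
P = rng.normal(size=(50, 3))                   # reference "facial surface" points
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
Q = P @ R_true.T + np.array([0.1, -0.2, 0.05])  # current scan (moved subject)

R, t = kabsch(P, Q)
err = np.abs(P @ R.T + t - Q).max()            # residual after alignment
```

Once the transform is known, reference electrode positions can be mapped into the current camera view for overlay.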
Anomaly detection for machine learning redshifts applied to SDSS galaxies
NASA Astrophysics Data System (ADS)
Hoyle, Ben; Rau, Markus Michael; Paech, Kerstin; Bonnett, Christopher; Seitz, Stella; Weller, Jochen
2015-10-01
We present an analysis of anomaly detection for machine learning redshift estimation. Anomaly detection allows the removal of poor training examples, which can adversely influence redshift estimates. Anomalous training examples may be photometric galaxies with incorrect spectroscopic redshifts, or galaxies with one or more poorly measured photometric quantities. We select 2.5 million `clean' SDSS DR12 galaxies with reliable spectroscopic redshifts, and 6730 `anomalous' galaxies with spectroscopic redshift measurements which are flagged as unreliable. We contaminate the clean base galaxy sample with galaxies with unreliable redshifts and attempt to recover the contaminating galaxies using the Elliptical Envelope technique. We then train four machine learning architectures for redshift analysis on both the contaminated sample and on the preprocessed `anomaly-removed' sample and measure redshift statistics on a clean validation sample generated without any preprocessing. We find an improvement on all measured statistics of up to 80 per cent when training on the anomaly-removed sample as compared with training on the contaminated sample for each of the machine learning routines explored. We further describe a method to estimate the contamination fraction of a base data sample.
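The preprocessing step described above can be sketched with scikit-learn's EllipticEnvelope: flag outlying points in a feature sample and keep only the retained points for training. The synthetic "photometry" below is an illustrative stand-in for the SDSS features, not the paper's actual data.

```python
import numpy as np
from sklearn.covariance import EllipticEnvelope

rng = np.random.default_rng(42)
clean = rng.normal(0.0, 1.0, size=(500, 4))       # well-measured galaxies
outliers = rng.uniform(-8.0, 8.0, size=(25, 4))   # unreliable contaminants
X = np.vstack([clean, outliers])

detector = EllipticEnvelope(contamination=0.05, random_state=0)
flags = detector.fit_predict(X)                   # +1 inlier, -1 outlier

X_kept = X[flags == 1]                            # "anomaly-removed" training sample
```

Any redshift regressor would then be trained on `X_kept` rather than the contaminated `X`.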
Water system unreliability and diarrhea incidence among children in Guatemala.
Trudeau, Jennifer; Aksan, Anna-Maria; Vásquez, William F
2018-03-01
This article examines the effect of water system unreliability on diarrhea incidence among children aged 0-5 in Guatemala. We use secondary data from a nationally representative sample of 7579 children to estimate the effects of uninterrupted and interrupted water services on diarrhea incidence. The national scope of this study imposes some methodological challenges due to unobserved geographical heterogeneity. To address this issue, we estimate mixed-effects logit models that control for unobserved heterogeneity by estimating random effects of selected covariates that can vary across geographical areas (i.e. water system reliability). Compared to children without access to piped water, children with uninterrupted water services have a lower probability of diarrhea incidence by approximately 33 percentage points. Conversely, there is no differential effect between children without access and those with at least one day of service interruptions in the previous month. Results also confirm negative effects of age, female gender, Spanish language, and garbage disposal on diarrhea incidence. Public health benefits of piped water are realized through uninterrupted provision of service, not merely access. Policy implications are discussed.
Metabolic incentives for dishonest signals of strength in the fiddler crab Uca vomeris.
Bywater, Candice L; White, Craig R; Wilson, Robbie S
2014-08-15
To reduce the potential costs of combat, animals may rely upon signals to resolve territorial disputes. Signals also provide a means for individuals to appear better than they actually are, deceiving opponents and gaining access to resources that would otherwise be unattainable. However, other than resource gains, incentives for dishonest signalling remain unexplored. In this study, we tested the idea that unreliable signallers pay lower metabolic costs for their signals, and that energetic savings could represent an incentive for cheating. We focused on two-toned fiddler crabs (Uca vomeris), a species that frequently uses its enlarged claws as signals of dominance to opponents. Previously, we found that regenerated U. vomeris claws are often large but weak (i.e. unreliable). Here, we found that the original claws of male U. vomeris consumed 43% more oxygen than weaker, regenerated claws, suggesting that muscle quantity drives variation in metabolic costs. Therefore, it seems that metabolic savings could provide a powerful incentive for dishonesty within fiddler crabs. © 2014. Published by The Company of Biologists Ltd.
The social and medical construction of lactation pathology.
Wolf, J H
2000-01-01
Beginning in the 1880s, many mothers reported breastfeeding difficulties. Doctors blamed the stress of urban life. The "bad" human milk invariably produced by the mammary glands of urban women, some physicians charged, harmed babies as surely as the dirty and adulterated cow's milk common to the late nineteenth-century city. Mothers and pediatricians proved unusually susceptible to believing this allegation. Mothers, just learning about the germ theory of disease and anxious about protecting their babies from unseen microbes, found themselves gratefully relying on "scientific" food rather than on their own, apparently faulty, bodies. And pediatricians no longer had to defend their new specialty. Now they could point to the need for improved artificial food-given women's growing inability to lactate-as one justification for their specialty's existence. Under the influence of these mothers and doctors, the notion that human lactation is an unreliable body function became a cultural truth that has persisted unabated to the present day.
Bayes plus Brass: Estimating Total Fertility for Many Small Areas from Sparse Census Data
Schmertmann, Carl P.; Cavenaghi, Suzana M.; Assunção, Renato M.; Potter, Joseph E.
2013-01-01
Small-area fertility estimates are valuable for analysing demographic change, and important for local planning and population projection. In countries lacking complete vital registration, however, small-area estimates are possible only from sparse survey or census data that are potentially unreliable. Such estimation requires new methods for old problems: procedures must be automated if thousands of estimates are required, they must deal with extreme sampling variability in many areas, and they should also incorporate corrections for possible data errors. We present a two-step algorithm for estimating total fertility in such circumstances, and we illustrate by applying the method to 2000 Brazilian Census data for over five thousand municipalities. Our proposed algorithm first smoothes local age-specific rates using Empirical Bayes methods, and then applies a new variant of Brass’s P/F parity correction procedure that is robust under conditions of rapid fertility decline. PMID:24143946
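The first step of the algorithm, Empirical Bayes smoothing of noisy small-area rates toward the pooled rate, can be sketched with a gamma-Poisson model. The model choice and the data below are simplifying assumptions for illustration; the paper smooths age-specific fertility rates rather than a single rate per area.

```python
import numpy as np

births = np.array([2, 0, 30, 4, 1])                   # events observed in each area
exposure = np.array([40.0, 10.0, 300.0, 80.0, 25.0])  # woman-years at risk

crude = births / exposure
pooled = births.sum() / exposure.sum()

# Method-of-moments prior: area rates ~ Gamma(a, b) with mean a/b = pooled.
# Subtract the expected Poisson sampling noise from the observed variance.
var_between = max(crude.var() - pooled / exposure.mean(), 1e-12)
b = pooled / var_between
a = pooled * b

# Posterior mean: a precision-weighted compromise between crude and pooled rates.
# Areas with little exposure are shrunk strongly toward the pooled rate.
eb_rates = (births + a) / (exposure + b)
```

The second step of the paper's algorithm (the robust Brass P/F parity correction) would then adjust these smoothed rates for reporting errors.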
On Klatzky and Creswell (2014): saving social priming effects but losing science as we know it?
Schwartz, Barry
2015-05-01
Klatzky and Creswell (2014) offer an interpretation of the unreliability of social priming effects by analogizing them to what is known about the complexity of cross-modal transfer effects in perception. The complexity of these transfer effects arises because they are both multiply determined and stochastic. In this commentary, I argue that Klatzky and Creswell's thoughtful contribution raises the possibility that there might be deep and substantive limits to both the replicability and the generalizability of many of the phenomena that most interest psychologists, including social priming effects. Psychological phenomena largely governed by what Fodor (1983) called the "central system" may resist both replication and generalization by their very nature and not because of weak and underpowered experimental methods. With such phenomena, science might give us very good tools for explanation, but not for prediction (replication). © The Author(s) 2015.
The design of multi temperature and humidity monitoring system for incubator
NASA Astrophysics Data System (ADS)
Yu, Junyu; Xu, Peng; Peng, Zitao; Qiang, Haonan; Shen, Xiaoyan
2017-01-01
Currently, an incubator has only a single temperature and humidity monitor, which may yield inaccurate or unreliable data and even endanger the life of the baby. To solve this problem, we designed a multi-point temperature and humidity monitoring system for incubators. The system uses an STC12C5A60S2 microcontroller as the sender's core chip, connected to four AM2321 temperature and humidity sensors. We selected an STM32F103ZET6 core development board as the receiving end, cooperating with a Zigbee wireless transmitting and receiving module to realize data acquisition and transmission. The design supports remote real-time observation of the data on a computer by communicating with a PC via Ethernet. Prototype tests show that the system can effectively collect and display the temperature and humidity information of multiple incubators at the same time, with four monitoring points in each incubator.