Sample records for "nanocolloid yields reliable"

  1. Nanocolloids in Natural Water: Isolation, Characterization, and Toxicity.

    PubMed

    Ouyang, Shaohu; Hu, Xiangang; Zhou, Qixing; Li, Xiaokang; Miao, Xinyu; Zhou, Ruiren

    2018-04-17

    Nanocolloids are widespread in natural water systems, but their characterization and ecological risks are largely unknown. Herein, tangential flow ultrafiltration (TFU) was used to separate and concentrate nanocolloids from surface waters. Unexpectedly, nanocolloids were present in high concentrations ranging from 3.7 to 7.2 mg/L in the surface waters of the Harihe River in Tianjin City, China. Most of the nanocolloids were 10-40 nm in size, contained various trace metals and polycyclic aromatic hydrocarbons, and exhibited fluorescence properties. Envelopment effects and aggregation of Chlorella vulgaris in the presence of nanocolloids were observed. Nanocolloids entered cells, and nanocolloid-exposed cells exhibited stronger plasmolysis, chloroplast damage and more starch grains than the control cells. Moreover, nanocolloids inhibited cell growth, promoted reactive oxygen species (ROS) generation, reduced the chlorophyll a content and increased cell permeability. The genotoxicity of nanocolloids was also observed. The metabolomics analysis revealed a significant (p < 0.05) downregulation of amino acids and upregulation of fatty acids contributing to ROS increase, chlorophyll a decrease and plasmolysis. The present work reveals that nanocolloids, which are different from specific, engineered nanoparticles (e.g., Ag nanoparticles), are present at high concentrations, exhibit obvious toxicity in the environment, and deserve more attention in the future.

  2. Synthesis of copper nanocolloids using a continuous flow based microreactor

    NASA Astrophysics Data System (ADS)

    Xu, Lei; Peng, Jinhui; Srinivasakannan, C.; Chen, Guo; Shen, Amy Q.

    2015-11-01

    Copper (Cu) nanocolloids were prepared by sodium borohydride (NaBH4) reduction of metal salt solutions in a T-shaped microreactor at room temperature. The influence of the NaBH4 molar concentration on the diameter, morphology, size distribution, and elemental composition of the copper particles was investigated by transmission electron microscopy (TEM) and X-ray diffraction (XRD). Ultraviolet-visible (UV-vis) spectroscopy was used to verify the chemical compounds of the nanocolloids and estimate the average size of the copper nanocolloids. The synthesized copper nanocolloids were uniform in size and non-oxidized. A decrease in the mean diameter of the copper nanocolloids was observed with increasing NaBH4 molar concentration. The maximum mean diameter (4.25 nm) occurred at the CuSO4/NaBH4 molar concentration ratio of 1:2.

  3. Effects of sulfide concentration and dissolved organic matter characteristics on the structure of nanocolloidal metacinnabar

    USGS Publications Warehouse

    Poulin, Brett; Gerbig, Chase A.; Kim, Christopher S.; Stegemeier, John P.; Ryan, Joseph N.; Aiken, George R.

    2017-01-01

    Understanding the speciation of divalent mercury (Hg(II)) in aquatic systems containing dissolved organic matter (DOM) and sulfide is necessary to predict the conversion of Hg(II) to bioavailable methylmercury. We used X-ray absorption spectroscopy to characterize the structural order of mercury in Hg(II)–DOM–sulfide systems for a range of sulfide concentration (1–100 μM), DOM aromaticity (specific ultraviolet absorbance (SUVA254)), and Hg(II)–DOM and Hg(II)–DOM–sulfide equilibration times (4–142 h). In all systems, Hg(II) was present as structurally disordered nanocolloidal metacinnabar (β-HgS). β-HgS nanocolloids were significantly smaller or less ordered at lower sulfide concentration, as indicated by under-coordination of Hg(II) in β-HgS. The size or structural order of β-HgS nanocolloids increased with increasing sulfide abundance and decreased with increasing SUVA254 of the DOM. The Hg(II)–DOM or Hg(II)–DOM–sulfide equilibration times did not significantly influence the extent of structural order in nanocolloidal β-HgS. Geochemical factors that control the structural order of nanocolloidal β-HgS, which are expected to influence nanocolloid surface reactivity and solubility, should be considered in the context of mercury bioavailability.

  4. Antifungal activity of multifunctional Fe3O4-Ag nanocolloids

    NASA Astrophysics Data System (ADS)

    Chudasama, Bhupendra; Vala, Anjana K.; Andhariya, Nidhi; Upadhyay, R. V.; Mehta, R. V.

    2011-05-01

    In recent years, a rapid increase has been observed in the population of microbes that are resistant to conventionally used antibiotics. Antifungal drug therapy is no exception, and resistance to many of the antifungal agents in use has now emerged. Therefore, there is an inevitable and urgent medical need for antibiotics with novel antimicrobial mechanisms. Aspergillus glaucus is a potential cause of fatal brain infections and hypersensitivity pneumonitis in immunocompromised patients and can lead to death despite aggressive multidrug antifungal therapy. In the present article, we describe the antifungal activity of multifunctional core-shell Fe3O4-Ag nanocolloids against A. glaucus isolates. Controlled experiments were also carried out with Ag nanocolloids in order to understand the role of the core (Fe3O4) in the antifungal action. The minimum inhibitory concentration (MIC) of the nanocolloids was determined by the micro-dilution method. The MIC against A. glaucus is 2000 μg/mL. The result is quite promising and warrants further investigation toward developing a treatment for this life-threatening fungus in immunocompromised patients.

  5. Kinetic Monte Carlo simulation of nanoparticle film formation via nanocolloid drying

    NASA Astrophysics Data System (ADS)

    Kameya, Yuki

    2017-06-01

    A kinetic Monte Carlo simulation of nanoparticle film formation via nanocolloid drying is presented. The proposed two-dimensional model addresses the dynamics of nanoparticles in the vertical plane of a drying nanocolloid film. The gas-liquid interface movement due to solvent evaporation was controlled by a time-dependent chemical potential, and the resultant particle dynamics including Brownian diffusion and aggregate growth were calculated. Simulations were performed at various Peclet numbers defined based on the rate ratio of solvent evaporation and nanoparticle diffusion. At high Peclet numbers, nanoparticles accumulated at the top layer of the liquid film and eventually formed a skin layer, causing the formation of a particulate film with a densely packed structure. At low Peclet numbers, enhanced particle diffusion led to significant particle aggregation in the bulk colloid, and the resulting film structure became highly porous. The simulated results showed some typical characteristics of a drying nanocolloid that had been reported experimentally. Finally, the potential of the model as well as the remaining challenges are discussed.
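
    The Peclet-number criterion described above is easy to sketch numerically. A minimal illustration, assuming a Stokes-Einstein diffusivity for the nanoparticles; the function names and the film-thickness/evaporation-rate values are illustrative assumptions, not taken from the paper:

```python
import math

def stokes_einstein_diffusivity(radius_m, temperature_k=298.0, viscosity_pa_s=1.0e-3):
    """Brownian diffusion coefficient D = k_B T / (6 pi eta r)."""
    k_b = 1.380649e-23  # Boltzmann constant, J/K
    return k_b * temperature_k / (6.0 * math.pi * viscosity_pa_s * radius_m)

def peclet_number(film_thickness_m, evaporation_rate_m_s, diffusivity_m2_s):
    """Pe = H E / D: ratio of the diffusion time H^2/D to the drying time H/E."""
    return film_thickness_m * evaporation_rate_m_s / diffusivity_m2_s

# 10 nm particle (5 nm radius) in a 1 um film receding at 100 nm/s.
d = stokes_einstein_diffusivity(5.0e-9)
pe = peclet_number(1.0e-6, 1.0e-7, d)
# Pe >> 1: particles accumulate at the interface (skin layer, dense film);
# Pe << 1: diffusion keeps the film mixed (bulk aggregation, porous film).
```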

  6. Zinc oxide nanocolloids prepared by picosecond pulsed laser ablation in water at different temperatures

    NASA Astrophysics Data System (ADS)

    D'Urso, Luisa; Spadaro, Salvatore; Bonsignore, Martina; Santangelo, Saveria; Compagnini, Giuseppe; Neri, Fortunato; Fazio, Enza

    2018-01-01

    Zinc oxide, with its wide direct band gap and high exciton binding energy, is one of the most promising materials for ultraviolet (UV) light-emitting devices. It further exhibits good performance in the degradation of non-biodegradable pollutants under UV irradiation. In this work, zinc oxide (ZnO) and zinc oxide/gold (ZnO/Au) nanocolloids are prepared by picosecond pulsed laser ablation (ps-PLA) of Zn and Au metallic targets in water at room temperature (RT) and 80°C. ZnO and Au nanoparticles (NPs) with sizes in the 10-50 nm range are obtained at RT, while ZnO nanorods (NRs) are formed when the water is maintained at 80°C during the ps-PLA process. Au NPs, added to the ZnO colloids after the ablation process, decorate the ZnO NRs. The crystalline phase of all ZnO nanocolloids is wurtzite. Methylene blue dye is used to investigate the photo-catalytic activity of all the synthesised nanocolloids under UV light irradiation.

  7. Sentinel Lymphadenectomy in Vulvar Cancer Using Near-Infrared Fluorescence From Indocyanine Green Compared With Technetium 99m Nanocolloid.

    PubMed

    Soergel, Philipp; Hertel, Hermann; Nacke, Anna Kaarina; Klapdor, Rüdiger; Derlin, Thorsten; Hillemanns, Peter

    2017-05-01

    Nowadays, sentinel lymph node diagnosis is performed using technetium 99m (Tc) nanocolloid as a radioactive marker, sometimes together with patent blue. In recent years, indocyanine green has been evaluated for sentinel node detection in different tumor entities. Indocyanine green is a fluorescent molecule that emits a light signal in the near-infrared band after excitation. Our study aimed to evaluate indocyanine green against the criterion standard, Tc-nanocolloid. We included patients with primary, unifocal vulvar cancer of less than 4 cm with clinically node-negative groins in this prospective trial. Sentinel node detection was carried out using Tc-nanocolloid, indocyanine green, and patent blue. We examined each groin for light signals in the near-infrared band, for radioactivity, and for blue staining. A sentinel lymph node was defined as a Tc-nanocolloid-positive lymph node. All sentinel lymph nodes and all additional blue or fluorescent lymph nodes were excised, tested, and sent for histologic examination. In all, 27 patients were included, in whom we found 91 sentinel lymph nodes in 52 groins. All these lymph nodes were positive for indocyanine green, giving a sensitivity of 100% (95% confidence interval [CI], 96.0%-100%) compared with Tc-nanocolloid. Eight additional lymph nodes showed indocyanine green fluorescence but no Tc positivity, so that the positive predictive value was 91.9% (95% CI, 84.6%-96.5%). In 1 patient, a false-negative sentinel node missed by all 3 modalities was found. Our results show that indocyanine green is a promising approach for inguinal sentinel node identification in vulvar cancer, with a sensitivity similar to that of radioactive Tc-nanocolloid, and is worth evaluating in further studies.
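
    The reported sensitivity and positive predictive value follow directly from the node counts in the abstract. A minimal sketch of that arithmetic (the counts are from the text; the function names are ours):

```python
def sensitivity(true_pos, false_neg):
    """Fraction of reference-standard-positive nodes detected by the index test."""
    return true_pos / (true_pos + false_neg)

def positive_predictive_value(true_pos, false_pos):
    """Fraction of index-test-positive nodes confirmed by the reference standard."""
    return true_pos / (true_pos + false_pos)

# All 91 Tc-nanocolloid-positive sentinel nodes were also indocyanine green
# positive; 8 additional fluorescent nodes were Tc-negative.
sens = sensitivity(91, 0)               # 1.0, i.e. 100%
ppv = positive_predictive_value(91, 8)  # 91/99 ~= 0.919, i.e. 91.9%
```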

  8. Abundance, size distributions and trace-element binding of organic and iron-rich nanocolloids in Alaskan rivers, as revealed by field-flow fractionation and ICP-MS

    NASA Astrophysics Data System (ADS)

    Stolpe, Björn; Guo, Laodong; Shiller, Alan M.; Aiken, George R.

    2013-03-01

    Water samples were collected from six small rivers in the Yukon River basin in central Alaska to examine the role of colloids and organic matter in the transport of trace elements in Northern high-latitude watersheds influenced by permafrost. Concentrations of dissolved organic carbon (DOC), selected elements (Al, Si, Ca, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Rb, Sr, Ba, Pb, U), and UV-absorbance spectra were measured in 0.45 μm filtered samples. 'Nanocolloidal size distributions' (0.5-40 nm, hydrodynamic diameter) of humic-type and chromophoric dissolved organic matter (CDOM), Cr, Mn, Fe, Co, Ni, Cu, Zn, and Pb were determined by on-line coupling of flow field-flow fractionation (FFF) to detectors including UV-absorbance, fluorescence, and ICP-MS. Total dissolved and nanocolloidal concentrations of the elements varied considerably between the rivers and between spring flood and late summer base flow. Data on specific UV-absorbance (SUVA), spectral slopes, and the nanocolloidal fraction of the UV-absorbance indicated a decrease in aromaticity and size of CDOM from spring flood to late summer. The nanocolloidal size distributions indicated the presence of different 'components' of nanocolloids. 'Fulvic-rich nanocolloids' had a hydrodynamic diameter of 0.5-3 nm throughout the sampling season; 'organic/iron-rich nanocolloids' occurred in the <8 nm size range during the spring flood; whereas 'iron-rich nanocolloids' formed a discrete 4-40 nm component during summer base flow. Mn, Co, Ni, Cu and Zn were distributed between the nanocolloid components depending on the stability constant of the metal (+II)-organic complexes, while the stronger association of Cr with the iron-rich nanocolloids was attributed to the higher oxidation states of Cr (+III or +IV). Changes in total dissolved element concentrations, size and composition of CDOM, and occurrence and size of organic/iron and iron-rich nanocolloids were related to variations in their sources from either the upper organic

  9. Fabricating TiO2 nanocolloids by electric spark discharge method at normal temperature and pressure.

    PubMed

    Tseng, Kuo-Hsiung; Chang, Chaur-Yang; Chung, Meng-Yun; Cheng, Ting-Shou

    2017-11-17

    In this study, TiO2 nanocolloids were successfully fabricated in deionized water without using suspending agents, using the electric spark discharge method at room temperature and under normal atmospheric pressure. This method was exceptional because it did not create nanoparticle dispersion and the produced colloids contained no derivatives. The proposed method requires only traditional electrical discharge machines (EDMs), self-made magnetic stirrers, and Ti wires (purity, 99.99%). The EDM pulse-on time (Ton) and pulse-off time (Toff) were set at 50 and 100 μs, 100 and 100 μs, 150 and 100 μs, and 200 and 100 μs, respectively, to produce four types of TiO2 nanocolloids. Zetasizer analysis of the nanocolloids showed that a decrease in Ton increased the suspension stability, but there were no significant correlations between Ton and particle size. Colloids produced from the four production configurations showed a minimum particle size between 29.39 and 52.85 nm and a zeta-potential between -51.2 and -46.8 mV, confirming that the method introduced in this study can be used to produce TiO2 nanocolloids with excellent suspension stability. Scanning electron microscopy with energy dispersive spectroscopy also indicated that the TiO2 colloids did not contain elements other than Ti and oxygen.

  10. Strategy for continuous improvement in IC manufacturability, yield, and reliability

    NASA Astrophysics Data System (ADS)

    Dreier, Dean J.; Berry, Mark; Schani, Phil; Phillips, Michael; Steinberg, Joe; DePinto, Gary

    1993-01-01

    Continual improvement in yield, reliability, and manufacturability is the measure of a fab and ultimately results in Total Customer Satisfaction. A new organizational and technical methodology for continuous defect reduction has been established in a formal feedback loop, which relies on yield and reliability data, failed-bit-map analysis, analytical tools, inline monitoring, cross-functional teams and a defect engineering group. The strategy requires the fastest possible detection, identification and implementation of corrective actions. Feedback cycle time is minimized at all points to improve yield and reliability and reduce costs, essential for competitiveness in the memory business. The payoff was a 9.4X reduction in defectivity and a 6.2X improvement in reliability of 256 K fast SRAMs over 20 months.

  11. Magnetization of Paraffin-Based Magnetic Nanocolloids

    NASA Astrophysics Data System (ADS)

    Dikanskii, Yu. I.; Ispiryan, A. G.; Kunikin, S. A.; Radionov, A. V.

    2018-01-01

    Using paraffin-based magnetic nanocolloids as an example, the reasons for maxima in the temperature dependence of the magnetic susceptibility of magnetic colloids have been discussed. The behavior of these dependences over a wide temperature interval has been analyzed for colloids in the solid and liquid states. It has been concluded that the maximum observed at the melting point of paraffin can be attributed to the freezing of the Brownian degrees of freedom of coarse magnetite particles, whose magnetic moment is intimately coupled to the solid matrix. The second main maximum, which arises in the solid state, is explained by the superparamagnetic-to-magnetically-hard transition of the finest particles at lower temperatures. It has been noted that the flatness of this maximum results from the polydispersity of the magnetic nanoparticle ensemble.

  12. Ultralow-intensity magneto-optical and mechanical effects in metal nanocolloids.

    PubMed

    Moocarme, M; Domínguez-Juárez, J L; Vuong, L T

    2014-03-12

    Magneto-plasmonics is a designation generally associated with ferromagnetic-plasmonic materials because such optical responses from nonmagnetic materials alone are considered weak. Here, we show that there exists a switching transition between linear and nonlinear magneto-optical behaviors in noble-metal nanocolloids that is observable at ultralow illumination intensities and direct current magnetic fields. The response is attributed to polarization-dependent nonzero-time-averaged plasmonic loops, vortex power flows, and nanoparticle magnetization. This work identifies significant mechanical effects that subsequently exist via magnetic-dipole interactions.

  13. Structural and dielectric behaviors of Bi4Ti3O12 - lyotropic liquid crystalline nanocolloids

    NASA Astrophysics Data System (ADS)

    Shukla, Ravi K.; Raina, K. K.

    2018-03-01

    We investigated the structural and dielectric dynamics of nanocolloids comprising lyotropic liquid crystals and bismuth titanate (Bi4Ti3O12) spherical nanoparticles (≈16-18 nm) at concentrations of 0.05 and 0.1 wt%. The lyotropic liquid crystalline mixture was prepared from a binary mixture of cetylpyridinium chloride and ethylene glycol in a 5:95 wt% ratio. The binary lyotropic mixture exhibited a hexagonal lyotropic phase. Structural and textural characterizations of the nanocolloids indicate that the nanoparticles were homogeneously dispersed in the liquid crystalline matrix and did not perturb the hexagonal ordering of the lyotropic phase. The dielectric constant and dielectric strength increased with the Bi4Ti3O12 nanoparticle concentration in the lyotropic matrix. A significant increase of one order of magnitude was observed in the ac conductivity of the colloidal systems compared to the non-doped lyotropic liquid crystal. Relaxation parameters of the non-doped lyotropic liquid crystal and the colloidal systems were computed and correlated with other parameters.

  14. Exploring how organic matter controls structural transformations in natural aquatic nanocolloidal dispersions.

    PubMed

    King, Stephen M; Jarvie, Helen P

    2012-07-03

    The response of the dispersion nanostructure of surface river bed sediment to the controlled removal and readdition of natural organic matter (NOM), in the absence and presence of background electrolyte, was examined using the technique of small-angle neutron scattering (SANS). Partial NOM removal induced aggregation of the mineral particles, but more extensive NOM removal restored colloidal stability. When peat humic acid (PHA) was added to a NOM-deficient sediment, concentration-related structural transformations were observed: at 255 mg/L PHA, aggregation of the nanocolloid was actually enhanced, but at 380 mg/L PHA, disaggregation and colloidal stability were promoted. The addition of 2 mM CaCl2 induced mild aggregation in the native sediment but not in sediments with added PHA, suggesting that the native NOM and the PHA respond differently to changes in ionic strength. A first attempt at using SANS to directly characterize the thickness and coverage of an adsorbed PHA layer in a natural nanocolloid is also presented. The results are discussed in the context of a hierarchical aquatic colloidal nanostructure, and the implications for contemporary studies of the role of dissolved organic carbon (DOC) in sustaining the transport of colloidal iron in upland catchments.

  15. Reliability of reservoir firm yield determined from the historical drought of record

    USGS Publications Warehouse

    Archfield, S.A.; Vogel, R.M.

    2005-01-01

    The firm yield of a reservoir is typically defined as the maximum yield that could have been delivered without failure during the historical drought of record. In the future, reservoirs will experience droughts that are either more or less severe than the historical drought of record. The question addressed here is what the reliability of such systems will be when operated at the firm yield. To address this question, we examine the reliability of 25 hypothetical reservoirs sited across five locations in the central and western United States. These locations provided a continuous 756-month streamflow record spanning the same time interval. The firm yield of each reservoir was estimated from the historical drought of record at each location. To determine the steady-state monthly reliability of each firm-yield estimate, 12,000-month synthetic records were generated using the moving-blocks bootstrap method. Bootstrapping was repeated 100 times for each reservoir to obtain an average steady-state monthly reliability R, the number of months the reservoir did not fail divided by the total months. Values of R were greater than 0.99 for 60 percent of the study reservoirs; the other 40 percent ranged from 0.95 to 0.98. Estimates of R were highly correlated with both the level of development (ratio of firm yield to average streamflow) and average lag-1 monthly autocorrelation. Together these two predictors explained 92 percent of the variability in R, with the level of development alone explaining 85 percent of the variability. Copyright ASCE 2005.
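
    The resampling scheme described above can be sketched as follows. A minimal illustration of a moving-blocks bootstrap feeding a simple mass-balance reservoir model; the block length, fill-and-spill storage rule, and all numerical values are illustrative assumptions, not the study's actual model:

```python
import random

def moving_blocks_bootstrap(record, out_length, block_length):
    """Build a synthetic record by concatenating randomly chosen
    contiguous blocks of the historical record (moving-blocks bootstrap)."""
    synthetic = []
    while len(synthetic) < out_length:
        start = random.randrange(len(record) - block_length + 1)
        synthetic.extend(record[start:start + block_length])
    return synthetic[:out_length]

def monthly_reliability(inflows, monthly_yield, capacity):
    """R = months without failure / total months, for a simple
    fill-and-spill reservoir operated at a fixed monthly yield."""
    storage = capacity
    failures = 0
    for inflow in inflows:
        storage = min(capacity, storage + inflow) - monthly_yield
        if storage < 0.0:  # demand not fully met this month
            failures += 1
            storage = 0.0
    return 1.0 - failures / len(inflows)

random.seed(1)
# Stand-in for a 756-month historical streamflow record.
historical = [random.uniform(50.0, 150.0) for _ in range(756)]
synthetic = moving_blocks_bootstrap(historical, out_length=12_000, block_length=24)
r = monthly_reliability(synthetic, monthly_yield=80.0, capacity=600.0)
```

    In the study this estimate is repeated (100 bootstrap replicates per reservoir) and averaged to obtain the steady-state monthly reliability.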

  16. Rational Self-Assembly of Nano-Colloids using DNA Interaction

    NASA Astrophysics Data System (ADS)

    Ung, Marie T.; Scarlett, Raynaldo; Sinno, Talid R.; Crocker, John C.

    2010-03-01

    DNA is an attractive tool to direct the rational self-assembly of nano-colloids since its interaction is specific and reversible. This tunable attractive interaction should lead to a diverse and rich phase diagram of higher-order structures which would not otherwise be entropically favored [Tkachenko AV, "Morphological Diversity of DNA-Colloidal Self-Assembly," Phys. Rev. Lett. 89 (2002)]. We compare our latest experimental observations to a simulation framework that precisely replicates the experimental phase behavior and the crystal growth kinetics [Kim AJ, Scarlett R, Biancaniello PL, Sinno T, Crocker JC, "Probing interfacial equilibration in microsphere crystals formed by DNA-directed assembly," Nature Materials 8, 52-55 (2009)]. We will discuss the crystallography of novel structures and address how particle size and heterogeneity affect nucleation and growth rates.

  17. Nanocolloidal gold-based immuno-dip strip assay for rapid detection of Sudan red I in food samples.

    PubMed

    Wang, Jia; Wang, Zhanhui; Liu, Jing; Li, Hao; Li, Qing X; Li, Ji; Xu, Ting

    2013-02-15

    A semiquantitative dip strip assay was developed using nanocolloidal gold-labelled monoclonal antibody (Mab) 8A10 for the rapid detection of Sudan red I in food samples. A protein-Sudan red I conjugate was coated on a nitrocellulose membrane strip in a defined test line. As the complex of nanocolloidal gold-labelled Mab and Sudan red I flowed along the strip, the intensity of the red colour formed in the test line reflected the Sudan red I concentration. The test required 10 min and had a visual limit of detection of 10 ng/g Sudan red I in tomato sauce and chilli powder samples. The results of the strip assay agreed well with those of a high performance liquid chromatography method for both spiked and real commercial samples. The strip was stable for at least 2 months at 4°C. The strip assay offers potential as a rapid and simple method for screening of Sudan red I in food samples. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Electroacoustic theory for concentrated colloids with overlapped DLs at arbitrary κα. I. Application to nanocolloids and nonaqueous colloids.

    PubMed

    Shilov, V N; Borkovskaja, Y B; Dukhin, A S

    2004-09-15

    Existing theories of electroacoustic phenomena in concentrated colloids neglect the possibility of double layer overlap and are valid mostly for the "thin double layer," when the double layer thickness is much less than the particle size. In this paper we present a new electroacoustic theory which removes this restriction. This makes the new theory applicable to characterizing a variety of aqueous nanocolloids and of nonaqueous dispersions. There are two versions of the theory leading to analytical solutions. The first version corresponds to strongly overlapped diffuse layers (the so-called quasi-homogeneous model). It yields a simple analytical formula for the colloid vibration current (CVI), which is valid for arbitrary ultrasound frequency but for a restricted κα range. This version of the theory, like the Smoluchowski theory for microelectrophoresis, is independent of particle shape and polydispersity. This makes it very attractive for practical use, with the hope that it might be as useful as the classical Smoluchowski theory. In order to determine the κα range of validity of the quasi-homogeneous model, we develop a second version that limits the ultrasound frequency but applies no restriction on κα. The ultrasound frequency should substantially exceed the Maxwell-Wagner relaxation frequency. This limitation makes the active conductivity-related current negligible compared to the passive dielectric displacement current. It is possible to derive an expression for the CVI in the concentrated dispersion as formulae involving definite integrals whose integrands depend on the equilibrium potential distribution. This second version allowed us to estimate the range of applicability of the first, quasi-homogeneous version. It turns out that the quasi-homogeneous model works for κα values up to almost 1. For instance, at a volume fraction of 30%, the highest κα limit of the quasi-homogeneous model is 0.65. Therefore, this version of the

  19. A first-in-man study of 68Ga-nanocolloid PET-CT sentinel lymph node imaging in prostate cancer demonstrates aberrant lymphatic drainage pathways.

    PubMed

    Doughton, Jacki A; Hofman, Michael S; Eu, Peter; Hicks, Rodney J; Williams, Scott G

    2018-05-04

    Purpose: To assess feasibility, safety and utility of a novel 68Ga-nanocolloid radiotracer with PET-CT lymphoscintigraphy for identification of sentinel lymph nodes (SLN). Methods: Pilot study of patients from a tertiary cancer hospital who required insertion of gold fiducials for prostate cancer radiation therapy. Participation did not affect cancer management. Ultrasound-guided transperineal intra-prostatic injection of PET tracer (iron oxide nanocolloid labelled with gallium-68) after placement of fiducials. PET-CT lymphoscintigraphy imaging at approximately 45 and 100 minutes after injection of tracer. The study was monitored using a Bayesian trial design with the assumption that at least one sentinel lymph node (SLN) could be identified in at least two-thirds of cases with >80% confidence. Results: SLN identification was successful in all 5 participants, allowing completion of the pilot study as per protocol. No adverse effects were observed. Unexpected potential pathways for transit of malignant cells as well as expected regional drainage pathways were discovered. Rapid tracer drainage to pelvic bone, perivesical, mesorectal, inguinal and Virchow's nodes was identified. Conclusion: SLN identification using 68Ga-nanocolloid PET-CT can be successfully performed. Non-traditional pathways of disease spread were identified, including drainage to pelvic bone as well as perivesical, mesorectal, inguinal and Virchow's nodes. The prevalence of both aberrant and non-lymphatic pathways of spread should be further investigated with this technique. Copyright © 2018 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  20. Adsorption of ions onto nanosolids dispersed in liquid crystals: Towards understanding the ion trapping effect in nanocolloids

    NASA Astrophysics Data System (ADS)

    Garbovskiy, Yuriy

    2016-05-01

    The ion capturing effect in liquid crystal nanocolloids was quantified by means of the ion trapping coefficient. The dependence of the ion trapping coefficient on the concentration of nano-dopants and their ionic purity was calculated for a variety of nanosolids dispersed in liquid crystals: carbon nanotubes, graphene nano-flakes, diamond nanoparticles, anatase nanoparticles, and ferroelectric nanoparticles. The proposed method perfectly fits existing experimental data and can be useful in the design of highly efficient ion capturing nanomaterials.

  1. Phase separations in mixtures of a liquid crystal and a nanocolloidal particle.

    PubMed

    Matsuyama, Akihiko

    2009-11-28

    We present a mean field theory to describe phase separations in mixtures of a liquid crystal and a nanocolloidal particle. By taking into account a nematic, a smectic A ordering of the liquid crystal, and a crystalline ordering of the nanoparticle, we calculate the phase diagrams on the temperature-concentration plane. We predict various phase separations, such as a smectic A-crystal phase separation and a smectic A-isotropic-crystal triple point, etc., depending on the interactions between the liquid crystal and the colloidal surface. Inside binodal curves, we find new unstable and metastable regions, which are important in the phase ordering dynamics. We also find a crystalline ordering of the nanoparticles dispersed in a smectic A phase and a nematic phase. The cooperative phenomena between liquid-crystalline ordering and crystalline ordering induce a variety of phase diagrams.

  2. Preparation of Nickel Cobalt Sulfide Hollow Nanocolloids with Enhanced Electrochemical Property for Supercapacitors Application

    PubMed Central

    Chen, Zhenhua; Wan, Zhanghui; Yang, Tiezhu; Zhao, Mengen; Lv, Xinyan; Wang, Hao; Ren, Xiuli; Mei, Xifan

    2016-01-01

    Nanostructured functional materials with hollow interiors are considered to be good candidates for a variety of advanced applications. However, synthesis of uniform hollow nanocolloids with porous texture via wet chemistry method is still challenging. In this work, nickel cobalt precursors (NCP) in sub-micron sized spheres have been synthesized by a facile solvothermal method. The subsequent sulfurization process in hydrothermal system has changed the NCP to nickel cobalt sulfide (NCS) with porous texture. Importantly, the hollow interiors can be tuned through the sulfurization process by employing different dosage of sulfur source. The derived NCS products have been fabricated into supercapacitor electrodes and their electrochemical performances are measured and compared, where promising results were found for the next-generation high-performance electrochemical capacitors. PMID:27114165

  3. Particle and surfactant interactions effected polar and dispersive components of interfacial energy in nanocolloids

    NASA Astrophysics Data System (ADS)

    Harikrishnan, A. R.; Das, Sarit K.; Agnihotri, Prabhat K.; Dhar, Purbarun

    2017-08-01

    We segregate and report experimentally for the first time the polar and dispersive interfacial energy components of complex nanocolloidal dispersions. In the present study, we introduce a novel inverse protocol for the classical Owens Wendt method to determine the constitutive polar and dispersive elements of surface tension in such multicomponent fluidic systems. The effect of nanoparticles alone and aqueous surfactants alone are studied independently to understand the role of the concentration of the dispersed phase in modulating the constitutive elements of surface energy in fluids. Surfactants are capable of altering the polar component, and the combined particle and surfactant nanodispersions are shown to be effective in modulating the polar and dispersive components of surface tension depending on the relative particle and surfactant concentrations as well as the morphological and electrostatic nature of the dispersed phases. We observe that the combined surfactant and particle colloid exhibits a similar behavior to that of the particle only case; however, the amount of modulation of the polar and dispersive constituents is found to be different from the particle alone case which brings to the forefront the mechanisms through which surfactants modulate interfacial energies in complex fluids. Accordingly, we are able to show that the observations can be merged into a form of quasi-universal trend in the trends of polar and dispersive components in spite of the non-universal character in the wetting behavior of the fluids. We analyze the different factors affecting the polar and dispersive interactions in such complex colloids, and the physics behind such complex interactions has been explained by appealing to the classical dispersion theories by London, Debye, and Keesom as well as by Derjaguin-Landau-Verwey-Overbeek theory. 
The findings shed light on the nature of wetting behavior of such complex fluids and help in predicting the wettability and the degree of
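The classical Owens-Wendt method that the paper's inverse protocol builds on solves the relation gamma_L(1+cos theta) = 2(sqrt(gamma_S^d gamma_L^d) + sqrt(gamma_S^p gamma_L^p)) for a solid's dispersive and polar components, given contact angles of probe liquids with known components. A minimal sketch of the classical (forward) direction, assuming illustrative literature-style probe-liquid values and contact angles, not data from the paper:

```python
import numpy as np

# Probe liquids: total, dispersive (d), polar (p) surface tension in mN/m,
# plus a measured contact angle. Values here are illustrative assumptions.
liquids = {
    "water":         {"total": 72.8, "d": 21.8, "p": 51.0, "theta_deg": 78.0},
    "diiodomethane": {"total": 50.8, "d": 50.8, "p": 0.0,  "theta_deg": 42.0},
}

def owens_wendt(liquids):
    """Solve gamma_L(1+cos t) = 2(sqrt(gs_d*gl_d) + sqrt(gs_p*gl_p)) for the
    solid's components: it is linear in x = sqrt(gs_d), y = sqrt(gs_p)."""
    A, b = [], []
    for liq in liquids.values():
        theta = np.radians(liq["theta_deg"])
        A.append([np.sqrt(liq["d"]), np.sqrt(liq["p"])])
        b.append(liq["total"] * (1.0 + np.cos(theta)) / 2.0)
    (x, y), *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return x**2, y**2  # gamma_s^d, gamma_s^p

gd, gp = owens_wendt(liquids)
print(f"dispersive = {gd:.1f} mN/m, polar = {gp:.1f} mN/m")
```

The "inverse protocol" of the abstract turns this around, treating the liquid's components as the unknowns against reference surfaces.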

  4. Physics behind the oscillation of pressure tensor autocorrelation function for nanocolloidal dispersions.

    PubMed

    Wang, Tao; Wang, Xinwei; Luo, Zhongyang; Cen, Kefa

    2008-08-01

    In this work, extensive equilibrium molecular dynamics simulations are conducted to explore the physics behind the oscillation of the pressure tensor autocorrelation function (PTACF) for nanocolloidal dispersions, which leads to strong instability in viscosity calculation. By reducing the particle size and density, we find that the intensity of the oscillation decreases while its frequency becomes higher. Careful analysis of the relationship between the oscillation and the nanoparticle characteristics reveals that stress wave scattering/reflection at the particle-liquid interface plays a critical role in PTACF oscillation, while the Brownian motion/vibration of solid particles has little effect. Our modeling proves that it is practical to eliminate the PTACF oscillation by suppressing the acoustic mismatch at the solid-liquid interface through the design of special nanoparticle materials. It is also found that, when the particle size is comparable to the wavelength of the stress wave, diffraction of the stress wave occurs at the interface. This effect substantially reduces the PTACF oscillation and improves the stability of the viscosity calculation.
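The viscosity calculation that the PTACF feeds into is the Green-Kubo route: the shear viscosity is V/(kB*T) times the time integral of the PTACF, which is why an oscillatory, slowly decaying tail destabilizes the result. A minimal sketch of that estimator (not the authors' code; the constant test series, volume, and temperature are illustrative assumptions):

```python
import numpy as np

def green_kubo_viscosity(pxy, dt, volume, kB_T):
    """Shear viscosity via the Green-Kubo relation:
    eta = V/(kB*T) * integral_0^inf <Pxy(0) Pxy(t)> dt,
    where pxy is a time series of an off-diagonal pressure-tensor component."""
    n = len(pxy)
    # unbiased autocorrelation: divide each lag by its overlap count
    acf = np.correlate(pxy, pxy, mode="full")[n - 1:] / np.arange(n, 0, -1)
    # trapezoidal integration of the ACF; in practice the integral is
    # truncated before the noisy/oscillatory tail the abstract analyzes
    integral = dt * (acf.sum() - 0.5 * (acf[0] + acf[-1]))
    return volume / kB_T * integral

# sanity check on a constant series c=2: ACF is c^2 at every lag, so the
# trapezoidal integral over n=100 points with dt=0.1 is 4 * 9.9 = 39.6
eta = green_kubo_viscosity(np.full(100, 2.0), dt=0.1, volume=1.0, kB_T=1.0)
print(eta)
```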

  5. Molybdenum oxide nanocolloids prepared by an external field-assisted laser ablation in water

    NASA Astrophysics Data System (ADS)

    Spadaro, Salvatore; Bonsignore, Martina; Fazio, Enza; Cimino, Francesco; Speciale, Antonio; Trombetta, Domenico; Barreca, Francesco; Saija, Antonina; Neri, Fortunato

    2018-01-01

    The synthesis of extremely stable molybdenum oxide nanocolloids by pulsed laser ablation was studied. This green technique ensures the formation of contaminant-free nanostructures and the absence of by-products. A focused picosecond pulsed laser beam was used to ablate a solid molybdenum target immersed in deionized water. Nearly spherical molybdenum oxide nanoparticles with dimensions of 20-100 nm were synthesized when the ablation process was carried out in water at room temperature and at 80°C. The application of an external electric field during the ablation process induces a reorganization of the nanostructures, as indicated by scanning transmission electron microscopy image analysis. The ablation products were also characterized by several spectroscopic techniques: conventional UV-vis optical absorption, atomic absorption, dynamic light scattering, micro-Raman and X-ray photoelectron spectroscopies. Finally, NIH/3T3 mouse fibroblasts were used to evaluate cell viability by the sulforhodamine B assay.

  6. Fluorescent nanocolloids for differential labeling of the endocytic pathway and drug delivery applications

    NASA Astrophysics Data System (ADS)

    Delehanty, James B.; Spillmann, Christopher M.; Naciri, Jawad; Algar, W. Russ; Ratna, Banahalli R.; Medintz, Igor L.

    2013-02-01

    The demonstration of fine control over nanomaterials within biological systems, particularly in live cells, is integral for the successful implementation of nanoparticles (NPs) in biomedical applications. Here, we show the ability to differentially label the endocytic pathway of mammalian cells in a spatiotemporal manner utilizing fluorescent nanocolloids (NCs) doped with a perylene-based dye. EDC-based conjugation of green- and red-emitting NCs to the iron transport protein transferrin resulted in stable bioconjugates that were efficiently endocytosed by HEK 293T/17 cells. The staggered delivery of the bioconjugates allowed for the time-resolved, differential labeling of distinct vesicular compartments along the endocytic pathway in a nontoxic manner. We further demonstrated the ability of the NCs to be impregnated with the anticancer therapeutic, doxorubicin. Delivery of the drug-doped nanoconjugates resulted in the intracellular release and nuclear accumulation of doxorubicin in a time- and dose-dependent manner. We discuss our results in the context of the utility of such materials for NP-mediated drug delivery applications.

  7. Lippia javanica: a cheap natural source for the synthesis of antibacterial silver nanocolloid

    NASA Astrophysics Data System (ADS)

    Kumar, Santosh; Singh, Mukesh; Halder, Dipankar; Mitra, Atanu

    2016-10-01

    Aqueous silver nanocolloid was synthesized in a single step by a biogenic approach using aqueous leaf extract of the Lippia javanica plant, which acts as both reducing and capping agent. The as-synthesized silver nanoparticles were characterized by UV-visible absorption spectroscopy, high-resolution transmission electron microscopy and Fourier transform infrared spectroscopy (FTIR). The UV-vis absorption spectra of the colloidal silver nanoparticles showed a characteristic surface plasmon resonance peak centered at a wavelength of 415 nm. The kinetic study showed that the reduction process was complete within 2 h. TEM analysis showed that most of the particles were spherical in shape, with an average diameter of about 17.5 nm. The FTIR study confirmed the presence of organic functional groups in the leaf extract and their participation in the reduction and stabilization processes. In addition, the as-synthesized silver nanoparticles showed antimicrobial activity against clinically isolated pathogenic strains of E. coli and B. subtilis.

  8. Improving yield and reliability of FIB modifications using electrical testing

    NASA Astrophysics Data System (ADS)

    Desplats, Romain; Benbrik, Jamel; Benteo, Bruno; Perdu, Philippe

    1998-08-01

    Focused Ion Beam (FIB) technology has two main areas of application for ICs: modification and preparation for technological analysis. Modification is the more heavily used of the two. It involves physically modifying a circuit by cutting lines and creating new ones in order to change the electrical function of the circuit. IC planar technologies have an increasing number of metal interconnections, making FIB modifications more complex and decreasing their chances of success. The yield of FIB operations on ICs shows a downward trend, which forces a greater number of circuits to be modified in order to successfully correct a small number of them. This extends the duration of the work, which is not compatible with production-line turnaround times. To respond to this problem, two solutions can be defined: either reducing the duration of each FIB operation or increasing the success rate of FIB modifications. Since reducing the time depends mainly on FIB operator experience, ensuring a higher success rate is the more crucial aspect, as both experienced and novice operators could benefit from this improvement. In order to ensure successful modifications, it is necessary to control each step of a FIB operation. To do this, we have developed a new method using in situ electrical testing which has a direct impact on the yield of FIB modifications. We present this innovative development through a real case study of a CMOS ASIC for high-speed communications. Monitoring the electrical behavior at each step of a FIB operation makes it possible to reduce the number of circuits to be modified and consequently reduces system costs thanks to better yield control. Knowing the internal electrical behavior also gives us indications about the impact of FIB modification on circuit reliability. Finally, this approach can be applied to failure analysis and FIB operations on flip-chip circuits.

  9. Dissipative particle dynamics: Effects of thermostating schemes on nano-colloid electrophoresis

    NASA Astrophysics Data System (ADS)

    Hassanzadeh Afrouzi, Hamid; Moshfegh, Abouzar; Farhadi, Mousa; Sedighi, Kurosh

    2018-05-01

    A novel fully explicit approach using the dissipative particle dynamics (DPD) method is introduced in the present study to model the electrophoretic transport of nano-colloids in an electrolyte solution. A Slater-type charge smearing function included in the 3D Ewald summation method is employed to treat electrostatic interactions. The performance of various thermostats is challenged to control the system temperature and to study the dynamic response of colloidal electrophoretic mobility under practical ranges of external electric field (0.072 < E < 0.361 V/nm), covering the linear to non-linear response regime, and ionic salt concentration (0.049 < SC < 0.69 M), covering weak to strong Debye screening of the colloid. System temperature and electrophoretic mobility show direct and inverse relationships, respectively, with the electric field and the colloidal repulsion, and direct and inverse trends, respectively, with salt concentration under the various thermostats. Nosé-Hoover-Lowe-Andersen and Lowe-Andersen thermostats are found to function more effectively under high electric fields (E > 0.145 V/nm) while thermal equilibrium is maintained. Reasonable agreement is achieved by benchmarking the system radial distribution function against available EW3D models, and by comparing reduced mobility against the conventional Smoluchowski and Hückel theories and a numerical solution of the Poisson-Boltzmann equation.

  10. Enhanced gel formation in binary mixtures of nanocolloids with short-range attraction

    NASA Astrophysics Data System (ADS)

    Harden, James L.; Guo, Hongyu; Bertrand, Martine; Shendruk, Tyler N.; Ramakrishnan, Subramanian; Leheny, Robert L.

    2018-01-01

    Colloidal suspensions transform between fluid and disordered solid states as parameters such as the colloid volume fraction and the strength and nature of the colloidal interactions are varied. Seemingly subtle changes in the characteristics of the colloids can markedly alter the mechanical rigidity and flow behavior of these soft composite materials. This sensitivity creates both a scientific challenge and an opportunity for designing suspensions for specific applications. In this paper, we report a novel mechanism of gel formation in mixtures of weakly attractive nanocolloids with modest size ratio. Employing a combination of x-ray photon correlation spectroscopy, rheometry, and molecular dynamics simulations, we find that gels are stable at remarkably weaker attraction in mixtures with size ratio near two than in the corresponding monodisperse suspensions. In contrast with depletion-driven gelation at larger size ratio, gel formation in the mixtures is triggered by microphase demixing of the species into dense regions of immobile smaller colloids surrounded by clusters of mobile larger colloids that is not predicted by mean-field thermodynamic considerations. These results point to a new route for tailoring nanostructured colloidal solids through judicious combination of interparticle interaction and size distribution.

  11. InfoDROUGHT: Technical reliability assessment using crop yield data at the Spanish-national level

    NASA Astrophysics Data System (ADS)

    Contreras, Sergio; Garcia-León, David; Hunink, Johannes E.

    2017-04-01

    Drought monitoring (DM) is a key component of risk-centered drought preparedness plans and drought policies. InfoDROUGHT (www.infosequia.es) is a site- and user-tailored, fully integrated DM system which combines functionalities for: a) operational satellite-based weekly 1-km tracking of the severity and spatial extent of drought impacts, and b) interactive, fast querying and delivery of drought information through a web-mapping service. InfoDROUGHT has a flexible and modular structure. The calibration (threshold definition) and validation of the system are performed by combining expert knowledge with auxiliary impact assessments and datasets. Different technical solutions (basic or advanced versions) or deployment options (open-standard or restricted-authenticated) can be purchased by end-users and customers according to their needs. In this analysis, the technical reliability of InfoDROUGHT and its performance in detecting drought impacts on agriculture were evaluated for the 2003-2014 period by exploring and quantifying the relationships between the drought severity indices reported by InfoDROUGHT and the annual yield anomalies observed for different rainfed crops (maize, wheat, barley) in Spain. We hypothesize a positive relationship between the crop anomalies and the drought severity level detected by InfoDROUGHT. Annual yield anomalies were computed at the province administrative level as the difference between the annual yield reported by the Spanish Annual Survey of Crop Acreages and Yields (ESYRCE database) and the mean annual yield estimated during the study period. Yield anomalies were finally compared against greenness-based and thermal-based drought indices (VCI and TCI, respectively) to check the coherence of the outputs with the stated hypothesis. InfoDROUGHT has been partly funded by the Spanish Ministry of Economy and Competitiveness through a Torres-Quevedo grant, and by the H2020-EU project "Bridging the Gap for Innovations in
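The greenness-based and thermal-based indices mentioned (VCI and TCI) follow Kogan's standard definitions, scaling the current NDVI or brightness temperature against its historical minimum-maximum envelope for the same pixel and week. A sketch under those standard definitions (the demo values are invented):

```python
def vci(ndvi, ndvi_min, ndvi_max):
    """Vegetation Condition Index: 0 = historical worst greenness, 100 = best."""
    return 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)

def tci(bt, bt_min, bt_max):
    """Temperature Condition Index: hotter than usual -> lower (drought) values,
    hence the reversed numerator relative to VCI."""
    return 100.0 * (bt_max - bt) / (bt_max - bt_min)

# demo: a pixel at the lower quarter of its historical range on both indices
v_demo = vci(0.35, 0.20, 0.80)       # (0.35-0.20)/0.60 -> 25.0
t_demo = tci(305.0, 290.0, 310.0)    # (310-305)/20     -> 25.0
print(v_demo, t_demo)
```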

  12. Reliable yields of public water-supply wells in the fractured-rock aquifers of central Maryland, USA

    NASA Astrophysics Data System (ADS)

    Hammond, Patrick A.

    2018-02-01

    Most studies of fractured-rock aquifers are about analytical models used for evaluating aquifer tests or numerical methods for describing groundwater flow, but there have been few investigations on how to estimate the reliable long-term drought yields of individual hard-rock wells. During the drought period of 1998 to 2002, many municipal water suppliers in the Piedmont/Blue Ridge areas of central Maryland (USA) had to institute water restrictions due to declining well yields. Previous estimates of the yields of those wells were commonly based on extrapolating drawdowns, measured during short-term single-well hydraulic pumping tests, to the first primary water-bearing fracture in a well. The extrapolations were often made from pseudo-equilibrium phases, frequently resulting in substantially over-estimated well yields. The methods developed in the present study to predict yields consist of extrapolating drawdown data from infinite acting radial flow periods or by fitting type curves of other conceptual models to the data, using diagnostic plots, inverse analysis and derivative analysis. Available drawdowns were determined by the positions of transition zones in crystalline rocks or thin-bedded consolidated sandstone/limestone layers (reservoir rocks). Aquifer dewatering effects were detected by type-curve matching of step-test data or by breaks in the drawdown curves constructed from hydraulic tests. Operational data were then used to confirm the predicted yields and compared to regional groundwater levels to determine seasonal variations in well yields. Such well yield estimates are needed by hydrogeologists and water engineers for the engineering design of water systems, but should be verified by the collection of long-term monitoring data.
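The extrapolation from infinite-acting radial flow periods that the study describes amounts, in its simplest form, to a Cooper-Jacob straight-line fit of drawdown against the logarithm of time, projected out to a long-term (drought-duration) target and compared against the available drawdown. An illustrative sketch with synthetic data, not the study's measurements:

```python
import numpy as np

def extrapolate_drawdown(times_min, drawdowns_m, t_target_min):
    """Fit the infinite-acting radial flow (Cooper-Jacob) straight line
    s = a + b*log10(t) to observed drawdowns and extrapolate to a
    long-term target time."""
    b, a = np.polyfit(np.log10(times_min), drawdowns_m, 1)
    return a + b * np.log10(t_target_min)

# synthetic observations lying exactly on s = 1.0 + 2.0*log10(t)
t_obs = np.array([10.0, 100.0, 1000.0])         # minutes
s_obs = 1.0 + 2.0 * np.log10(t_obs)             # metres
s_pred = extrapolate_drawdown(t_obs, s_obs, 1e5)  # 1 + 2*5 = 11.0 m
print(s_pred)
```

The well's reliable rate would then be scaled back until the predicted long-term drawdown stays above the limiting transition zone, per the abstract's "available drawdown" concept.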

  13. X-ray Study of the Electric Double Layer at the n-Hexane/Nanocolloidal Silica Interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tikhonov, A.

    The spatial structure of the transition region between an insulator and an electrolyte solution was studied with x-ray scattering. The electron-density profile across the n-hexane/silica sol interface (solutions with 5, 7, and 12 nm colloidal particles) agrees with the theory of the electrical double layer and shows separation of positive and negative charges. The interface consists of three layers, i.e., a compact layer of Na+, a loose monolayer of nanocolloidal particles as part of a thick diffuse layer, and a low-density layer sandwiched between them. Its structure is described by a model in which the potential gradient at the interface reflects the difference in the potentials of 'image forces' between the cationic Na+ and anionic nanoparticles and the specific adsorption of surface charge. The density of water in the large electric field (~10^9-10^10 V/m) of the transition region and the layering of silica in the diffuse layer are discussed.
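The thickness of the diffuse layer in double-layer models like the one above is conventionally characterized by the Debye screening length. A sketch using the standard formula for a 1:1 aqueous electrolyte (the ionic strength below is illustrative, not taken from the record):

```python
from math import sqrt

def debye_length_nm(ionic_strength_M, temp_K=298.15, eps_r=78.5):
    """Debye screening length for a 1:1 electrolyte:
    lambda_D = sqrt(eps_r*eps0*kB*T / (2*NA*e^2*I)), I in ions/m^3."""
    e = 1.602176634e-19      # elementary charge, C
    kB = 1.380649e-23        # Boltzmann constant, J/K
    NA = 6.02214076e23       # Avogadro's number, 1/mol
    eps0 = 8.8541878128e-12  # vacuum permittivity, F/m
    n = ionic_strength_M * 1000.0 * NA  # number density, ions/m^3
    lam = sqrt(eps_r * eps0 * kB * temp_K / (2.0 * e**2 * n))
    return lam * 1e9  # nm

lam_nm = debye_length_nm(0.1)  # ~1 nm for 0.1 M, much thinner for brines
print(lam_nm)
```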

  14. Investigating Non-Equilibrium Fluctuations of Nanocolloids in a Magnetic Field Using Direct Imaging Methods

    NASA Astrophysics Data System (ADS)

    Rice, Ashley; Oprisan, Ana; Oprisan, Sorinel; Rice-Oprisan College of Charleston Team

    Nanoparticles of iron oxide have a high surface area and can be controlled by an external magnetic field. Since they have a fast response to the applied magnetic field, these systems have been used for numerous in vivo applications, such as MRI contrast enhancement, tissue repair, immunoassay, detoxification of biological fluids, hyperthermia, drug delivery, and cell separation. We performed three direct imaging experiments in order to investigate the concentration-driven fluctuations using magnetic nanoparticles in the absence and in the presence of magnetic field. Our direct imaging experimental setup involved a glass cell filled with magnetic nanocolloidal suspension and water with the concentration gradient oriented against the gravitational field and a superluminescent diode (SLD) as the light source. Nonequilibrium concentration-driven fluctuations were recorded using a direct imaging technique. We used a dynamic structure factor algorithm for image processing in order to compute the structure factor and to find the power law exponents. We saw evidence of large concentration fluctuations and permanent magnetism. Further research will use the correlation time to approximate the diffusion coefficient for the free diffusion experiment. Funded by College of Charleston Department of Undergraduate Research and Creative Activities SURF grant.

  15. Design of high-reliability low-cost amorphous silicon modules for high energy yield

    NASA Astrophysics Data System (ADS)

    Jansen, Kai W.; Varvar, Anthony; Twesme, Edward; Berens, Troy; Dhere, Neelkanth G.

    2008-08-01

    For PV modules to fulfill their intended purpose, they must generate sufficient economic return over their lifetime to justify their initial cost. Not only must modules be manufactured at a low cost/Wp with a high energy yield (kWh/kWp), they must also be designed to withstand the significant environmental stresses experienced throughout their 25+ year lifetime. Based on field experience, the most common factors affecting the lifetime energy yield of glass-based amorphous silicon (a-Si) modules have been identified; these include: 1) light-induced degradation; 2) moisture ingress and thin film corrosion; 3) transparent conductive oxide (TCO) delamination; and 4) glass breakage. The current approaches to mitigating the effect of these degradation mechanisms are discussed and the accelerated tests designed to simulate some of the field failures are described. In some cases, novel accelerated tests have been created to facilitate the development of improved manufacturing processes, including a unique test to screen for TCO delamination. Modules using the most reliable designs are tested in high voltage arrays at customer and internal test sites, as well as at independent laboratories. Data from tests at the Florida Solar Energy Center has shown that a-Si tandem modules can demonstrate an energy yield exceeding 1200 kWh/kWp/yr in a subtropical climate. In the same study, the test arrays demonstrated low long-term power loss over two years of data collection, after initial stabilization. The absolute power produced by the test arrays varied seasonally by approximately +/-7%, as expected.

  16. Nano-colloid electrophoretic transport: Fully explicit modelling via dissipative particle dynamics

    NASA Astrophysics Data System (ADS)

    Hassanzadeh Afrouzi, Hamid; Farhadi, Mousa; Sedighi, Kurosh; Moshfegh, Abouzar

    2018-02-01

    In the present study, a novel fully explicit approach using the dissipative particle dynamics (DPD) method is introduced for modelling the electrophoretic transport of nano-colloids in an electrolyte solution. A Slater-type charge smearing function included in the 3D Ewald summation method is employed to treat electrostatic interactions. Moreover, the capability of different thermostats to control the system temperature is challenged, and the dynamic response of colloidal electrophoretic mobility is studied under practical ranges of external electric field for nanoscale applications (0.072 < E < 0.361 V/nm), covering the non-linear response regime, and ionic salt concentration (0.049 < SC < 0.69 M), covering weak to strong Debye screening of the colloid. The effects of different colloidal repulsions are then studied on temperature, reduced mobility and zeta potential, the latter computed from the charge distribution within the spherical colloidal EDL. System temperature and electrophoretic mobility show direct and inverse relationships, respectively, with the electric field and the colloidal repulsion. The decline in mobility with colloidal repulsion reaches a plateau, at a relatively constant value for each electrolyte salinity, for Aii > 600 in DPD units regardless of electric field intensity. Nosé-Hoover-Lowe-Andersen and Lowe-Andersen thermostats are found to function more effectively under high electric fields (E > 0.145 V/nm) while thermal equilibrium is maintained. Reasonable agreement is achieved by benchmarking the radial distribution function against available electrolyte structure models, and by comparing reduced mobility against the conventional Smoluchowski and Hückel theories and a numerical solution of the Poisson-Boltzmann equation.
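The two limiting theories used here for benchmarking mobility have simple closed forms: Smoluchowski's mu = eps*zeta/eta for thin double layers (kappa*a >> 1) and Hückel's mu = 2*eps*zeta/(3*eta) for thick ones (kappa*a << 1). A small sketch in SI units (the zeta potential and water properties below are illustrative assumptions, not values from the study):

```python
def smoluchowski_mobility(zeta, eps, eta):
    """Electrophoretic mobility, thin-EDL (kappa*a >> 1) limit."""
    return eps * zeta / eta

def hueckel_mobility(zeta, eps, eta):
    """Electrophoretic mobility, thick-EDL (kappa*a << 1) limit:
    exactly 2/3 of the Smoluchowski value for the same zeta."""
    return 2.0 * eps * zeta / (3.0 * eta)

eps = 78.5 * 8.854e-12   # permittivity of water, F/m
eta = 8.9e-4             # viscosity of water, Pa*s
zeta = -0.025            # zeta potential, V (illustrative)

mu_s = smoluchowski_mobility(zeta, eps, eta)
mu_h = hueckel_mobility(zeta, eps, eta)
print(mu_s, mu_h)  # both negative for negative zeta; Hueckel = 2/3 Smoluchowski
```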

  17. Top-down and Bottom-up Approaches in Production of Aqueous Nanocolloids of Low Soluble Drug Paclitaxel

    PubMed Central

    Pattekari, P.; Zheng, Z.; Zhang, X.; Levchenko, T.; Torchilin, V.

    2015-01-01

    Nano-encapsulation of a poorly soluble anticancer drug was developed with sonication-assisted layer-by-layer polyelectrolyte coating (SLbL). We changed the strategy of LbL encapsulation from making microcapsules with many layers in the walls, for encasing highly soluble materials, to using a very thin polycation/polyanion coating on low-solubility nanoparticles to provide good colloidal stability. SLbL encapsulation of paclitaxel resulted in stable 100-200 nm diameter colloids with a high electrical surface ζ-potential (-45 mV) and a drug content in the nanoparticles of 90 wt %. In the top-down approach, nanocolloids were prepared by rupturing paclitaxel powder using ultrasonication with simultaneous sequential adsorption of oppositely charged biocompatible polyelectrolytes. In the bottom-up approach, paclitaxel was dissolved in an organic solvent (ethanol or acetone), and drug nucleation was initiated by gradually worsening the solvent quality through the addition of aqueous polyelectrolyte, assisted by ultrasonication. Paclitaxel release rates from such nanocapsules were controlled by assembling multilayer shells with variable thicknesses and are in the range of 10-20 hours. PMID:21442095

  18. Using operational data to estimate the reliable yields of water-supply wells

    NASA Astrophysics Data System (ADS)

    Misstear, Bruce D. R.; Beeson, Sarah

    The reliable yield of a water-supply well depends on many different factors, including the properties of the well and the aquifer; the capacities of the pumps, raw-water mains, and treatment works; the interference effects from other wells; and the constraints imposed by abstraction licences, water quality, and environmental issues. A relatively simple methodology for estimating reliable yields has been developed that takes into account all of these factors. The methodology is based mainly on an analysis of water-level and source-output data, where such data are available. Good operational data are especially important when dealing with wells in shallow, unconfined, fissure-flow aquifers, where actual well performance may vary considerably from that predicted using a more analytical approach. Key issues in the yield-assessment process are the identification of a deepest advisable pumping water level, and the collection of the appropriate well, aquifer, and operational data. Although developed for water-supply operators in the United Kingdom, this approach to estimating the reliable yields of water-supply wells using operational data should be applicable to a wide range of hydrogeological conditions elsewhere.

  19. Formation of nanocolloidal metacinnabar in mercury-DOM-sulfide systems

    USGS Publications Warehouse

    Gerbig, Chase A.; Kim, Christopher S.; Stegemeier, John P.; Ryan, Joseph N.; Aiken, George R.

    2011-01-01

    Direct determination of mercury (Hg) speciation in sulfide-containing environments is confounded by low mercury concentrations and poor analytical sensitivity. Here we report the results of experiments designed to assess mercury speciation at environmentally relevant ratios of mercury to dissolved organic matter (DOM) (i.e., <4 nmol Hg (mg DOM)−1) by combining solid phase extraction using C18 resin with extended X-ray absorption fine structure (EXAFS) spectroscopy. Aqueous Hg(II) and a DOM isolate were equilibrated in the presence and absence of 100 μM total sulfide. In the absence of sulfide, mercury adsorption to the resin increased as the Hg:DOM ratio decreased and as the strength of Hg-DOM binding increased. EXAFS analysis indicated that in the absence of sulfide, mercury bonds with an average of 2.4 ± 0.2 sulfur atoms with a bond length typical of mercury-organic thiol ligands (2.35 Å). In the presence of sulfide, mercury showed greater affinity for the C18 resin, and its chromatographic behavior was independent of Hg:DOM ratio. EXAFS analysis showed mercury–sulfur bonds with a longer interatomic distance (2.51–2.53 Å) similar to the mercury–sulfur bond distance in metacinnabar (2.53 Å) regardless of the Hg:DOM ratio. For all samples containing sulfide, the sulfur coordination number was below the ideal four-coordinate structure of metacinnabar. At a low Hg:DOM ratio where strong binding DOM sites may control mercury speciation (1.9 nmol mg–1) mercury was coordinated by 2.3 ± 0.2 sulfur atoms, and the coordination number rose with increasing Hg:DOM ratio. The less-than-ideal coordination numbers indicate metacinnabar-like species on the nanometer scale, and the positive correlation between Hg:DOM ratio and sulfur coordination number suggests progressively increasing particle size or crystalline order with increasing abundance of mercury with respect to DOM. In DOM-containing sulfidic systems nanocolloidal metacinnabar-like species may form

  20. The effect of the labile organic fraction in food waste and the substrate/inoculum ratio on anaerobic digestion for a reliable methane yield.

    PubMed

    Kawai, Minako; Nagao, Norio; Tajima, Nobuaki; Niwa, Chiaki; Matsuyama, Tatsushi; Toda, Tatsuki

    2014-04-01

    The influence of the labile organic fraction (LOF) on anaerobic digestion of food waste was investigated at different S/I ratios of 0.33, 0.5, 1.0, 2.0 and 4.0 g-VS_substrate/g-VS_inoculum. Two types of substrate were used: standard food waste (Substrate 1) and standard food waste with the supernatant (containing the LOF) removed (Substrate 2). The highest methane yield, 435 ml-CH4 g-VS(-1) with Substrate 1, was observed at the lowest S/I ratio, while the methane yields at the other S/I ratios were 38-73% lower than the highest yield due to acidification. The methane yields with Substrate 2 were relatively stable under all S/I conditions, although the maximum methane yield was lower than with Substrate 1. These results showed that the LOF in food waste causes acidification but also contributes to high methane yields, suggesting that a low S/I ratio (<0.33) is required to obtain a reliable methane yield from food waste compared with other organic substrates. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Interrelation Between Safety Factors and Reliability

    NASA Technical Reports Server (NTRS)

    Elishakoff, Isaac; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    An evaluation was performed to establish the relationships between safety factors and reliability. The results obtained show that the use of safety factors is not contradictory to the employment of probabilistic methods. In many cases the safety factors can be directly expressed by the required reliability levels. However, there is a major difference that must be emphasized: whereas safety factors are allocated in an ad hoc manner, the probabilistic approach offers a unified mathematical framework. The establishment of the interrelation between the concepts opens an avenue to specify safety factors based on reliability. In cases where there are several modes of failure, the allocation of safety factors should be based on having the same reliability associated with each failure mode. This immediately suggests that probabilistic methods can eliminate existing over-design or under-design. The report includes three parts: Part 1-Random Actual Stress and Deterministic Yield Stress; Part 2-Deterministic Actual Stress and Random Yield Stress; Part 3-Both Actual Stress and Yield Stress Are Random.
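For the case where both actual stress and yield stress are random (Part 3), the interrelation can be made concrete under a normal-distribution assumption: reliability is Phi(beta) with beta = (mu_Y - mu_S)/sqrt(sd_Y^2 + sd_S^2), while the central safety factor is mu_Y/mu_S. A sketch under that assumption (the numerical values are illustrative, not from the report):

```python
from math import sqrt, erf

def reliability(mu_yield, sd_yield, mu_stress, sd_stress):
    """Reliability P(Y > S) for independent normal yield stress Y and
    actual stress S: R = Phi(beta), beta = (muY - muS)/sqrt(sdY^2 + sdS^2)."""
    beta = (mu_yield - mu_stress) / sqrt(sd_yield**2 + sd_stress**2)
    return 0.5 * (1.0 + erf(beta / sqrt(2.0)))  # standard normal CDF

# central safety factor n = mu_yield/mu_stress = 300/200 = 1.5 (MPa)
R = reliability(mu_yield=300.0, sd_yield=30.0, mu_stress=200.0, sd_stress=20.0)
print(R)  # beta = 100/sqrt(1300) ~ 2.77, so R is roughly 0.997
```

The same safety factor of 1.5 yields a different reliability if the scatter (sd values) changes, which is the report's point that ad hoc factors hide the actual risk level.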

  2. Normal yield tables for red alder.

    Treesearch

    Norman P. Worthington; Floyd A. Johnson; George R. Staebler; William J. Lloyd

    1960-01-01

    Increasing interest in the management of red alder (Alnus rubra) has created a need for reliable yield information. Existing yield tables for red alder have been very useful as interim sources of information, but they are generally inadequate for current and prospective management needs. The advisory committee for the Station's Olympia...

  3. Effects of Platinum Nanocolloid in Combination with Gamma Irradiation on Normal Human Esophageal Epithelial Cells.

    PubMed

    Li, Qiang; Tanaka, Yoshiharu; Saitoh, Yasukazu; Miwa, Nobuhiko

    2016-05-01

    Our previous study demonstrated that platinum nanocolloid (Pt-nc) combined with lower-dose gamma irradiation at 3, 5, and 7 Gy significantly decreased proliferation and accelerated apoptosis of the human esophageal squamous cell carcinoma-derived cell line KYSE-70. The aim of the present study was to determine, under the same conditions as our previous study in which gamma rays combined with Pt-nc were carcinostatic to KYSE-70 cells, whether a radioprotective or a radiation-sensitizing effect could be induced in normal human esophageal epithelial cells (HEEpiC). HEEpiC were treated with various Pt-nc concentrations and then irradiated with various gamma-ray doses. The proliferative status of HEEpiC was evaluated using trypan blue dye-exclusion and WST-8 assays. Cellular and nuclear morphological features were determined using crystal violet and Hoechst 33342 staining, respectively. The intracellular level of reactive oxygen species (ROS) in HEEpiC was evaluated with a nitro blue tetrazolium (NBT) assay. Apoptotic status was assessed by Western blotting for caspase-3, Bax, and Bcl-2. Either Pt-nc or gamma irradiation alone could inhibit the growth of HEEpiC; however, their combined use exerted a significantly greater proliferation-inhibitory effect, in a Pt-nc dose-dependent manner, than gamma irradiation alone. Pt-nc resulted in radiation sensitization rather than radiation protection of HEEpiC in vitro, similar to KYSE-70 cells, whether Pt-nc was administered alone or combined with gamma irradiation. Thus, Pt-nc has an inhibitory effect on cell proliferation, a facilitative effect on apoptosis, and a certain degree of toxicity against HEEpiC.

  4. Grapevine canopy reflectance and yield

    NASA Technical Reports Server (NTRS)

    Minden, K. A.; Philipson, W. R.

    1982-01-01

    Field spectroradiometric and airborne multispectral scanner data were applied in a study of Concord grapevines. Spectroradiometric measurements of 18 experimental vines were collected on three dates during one growing season. Spectral reflectance, determined at 30 intervals from 0.4 to 1.1 microns, was correlated with vine yield, pruning weight, clusters/vine, and nitrogen input. One date of airborne multispectral scanner data (11 channels) was collected over commercial vineyards, and the average radiance values for eight vineyard sections were correlated with the corresponding average yields. Although some correlations were significant, they were inadequate for developing a reliable yield prediction model.

  5. Carcinostatic effects of platinum nanocolloid combined with gamma irradiation on human esophageal squamous cell carcinoma.

    PubMed

    Li, Qiang; Tanaka, Yoshiharu; Saitoh, Yasukazu; Tanaka, Hiroshi; Miwa, Nobuhiko

    2015-04-15

    To explore the carcinostatic effects of platinum nanocolloid (Pt-nc) combined with gamma rays on human esophageal squamous cell carcinoma (ESCC), ESCC-derived KYSE-70 cells were treated with various concentrations of Pt-nc and/or gamma irradiation, and subsequently cultured in phenol red-free DMEM with 10% FBS for 48 h. The proliferative status of the KYSE-70 cells was evaluated using trypan blue dye-exclusion and WST-8 assays. Cellular and nuclear morphological aspects were evaluated using crystal violet and Hoechst 33342 staining, respectively. Radiosensitivity was quantified by a cell viability assay, and the activated form of caspase-3, a characteristic apoptosis-related protein, was detected by Western blotting. Although single treatment with either Pt-nc or gamma irradiation slightly inhibited the growth of the KYSE-70 cells, their combination exerted remarkable carcinostatic effects that depended on both the Pt-nc concentration and the gamma-ray dose, compared with the effect of each treatment alone (p<0.05). Fluorescence micrographic observation showed that KYSE-70 cells treated with Pt-nc and subsequently irradiated with gamma rays underwent distinct apoptotic morphological changes. The carcinostatic effect of 7-Gy gamma irradiation without Pt-nc was approximately equal to that of 3-Gy irradiation combined with 100 ppm Pt-nc or of 5-Gy irradiation combined with 50 ppm Pt-nc. Pt-nc in combination with gamma rays may exert a cooperative effect through platinum- or gamma ray-induced apoptosis, resulting in the inhibition of cancer cell growth while concurrently enabling a lower radiation dose. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Synthesis of stable ZnO nanocolloids with enhanced optical limiting properties via simple solution method

    NASA Astrophysics Data System (ADS)

    Ramya, M.; Nideep, T. K.; Vijesh, K. R.; Nampoori, V. P. N.; Kailasnath, M.

    2018-07-01

    In the present work, we report the synthesis of stable ZnO nanocolloids through a simple solution method; the colloids exhibit an enhanced optical limiting threshold. The influence of reaction temperature on the crystal structure and on the linear and nonlinear optical properties of the prepared ZnO nanoparticles was investigated. XRD and Raman analyses reveal that the prepared ZnO nanoparticles retain the hexagonal wurtzite crystal structure. HRTEM analysis confirms the effects of reaction temperature and solvent on the crystallinity and nanostructure of the ZnO nanoparticles. Crystallinity and average diameter were found to increase with reaction temperature, with ethylene glycol acting as both solvent and growth inhibitor. EDS spectra show the formation of pure ZnO nanoparticles. The direct energy band gap of the nanoparticles increases with decreasing particle size due to the quantum confinement effect. The third-order nonlinear optical properties of the ZnO nanoparticles were investigated by the Z-scan technique using a frequency-doubled Nd:YAG nanosecond laser at 532 nm. The Z-scan results reveal that the prepared ZnO nanoparticles exhibit self-defocusing nonlinearity. The two-photon absorption coefficient and third-order nonlinear optical susceptibility increase with increasing particle size. The third-order susceptibility of the ZnO nanoparticles is found to be on the order of 10⁻¹⁰ esu, at least three orders of magnitude greater than that of bulk ZnO. The optical limiting threshold of the nanoparticles varies in the range of 54 to 17 MW/cm². The results suggest that ZnO nanoparticles are promising candidates for future photonic devices.

  7. Evaluation of the CEAS model for barley yields in North Dakota and Minnesota

    NASA Technical Reports Server (NTRS)

    Barnett, T. L. (Principal Investigator)

    1981-01-01

    The CEAS yield model is based upon multiple regression analysis at the CRD and state levels. For the historical time series, yield is regressed on a set of variables derived from monthly mean temperature and monthly precipitation. Technological trend is represented by piecewise linear and/or quadratic functions of year. Indicators of yield reliability obtained from a ten-year bootstrap test (1970-79) demonstrated that biases are small and that performance, as indicated by the root mean square errors, is acceptable for the intended application; however, model response for individual years, particularly unusual years, is not very reliable and shows some large errors. The model is objective, adequate, timely, simple, and not costly. It considers scientific knowledge on a broad scale but not in detail, and does not provide a good current measure of modeled yield reliability.
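
    The bootstrap test described above can be sketched numerically: refit the regression on all years before a target year, predict that year, and accumulate bias and RMSE over a ten-year window. The data, coefficients, and variable choices below are invented for illustration and are not the CEAS model's actual inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical CRD-level series: 30 years of yield driven by growing-season
# weather plus a piecewise-linear technology trend (flat until 1960).
years = np.arange(1950, 1980)
temp = rng.normal(18, 2, years.size)        # monthly-mean temperature proxy
precip = rng.normal(300, 50, years.size)    # precipitation proxy
trend = np.clip(years - 1960, 0, None).astype(float)
yield_obs = (20 + 0.4 * trend + 0.5 * temp + 0.01 * precip
             + rng.normal(0, 1, years.size))

X = np.column_stack([np.ones(years.size), trend, temp, precip])

def fit_predict(X_fit, y_fit, x_new):
    """Ordinary least squares fit, then a single prediction."""
    beta, *_ = np.linalg.lstsq(X_fit, y_fit, rcond=None)
    return x_new @ beta

# Ten-year bootstrap test in the CEAS sense: for each of the last ten years,
# refit on all earlier years and predict the held-out year.
errors = np.array([
    fit_predict(X[:i], yield_obs[:i], X[i]) - yield_obs[i]
    for i in range(years.size - 10, years.size)
])

bias = errors.mean()
rmse = np.sqrt((errors ** 2).mean())
print(f"bias = {bias:.3f}, RMSE = {rmse:.3f}")
```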

  8. LACIE: Wheat yield models for the USSR

    NASA Technical Reports Server (NTRS)

    Sakamoto, C. M.; Leduc, S. K.

    1977-01-01

    A quantitative model relating weather conditions to wheat yield in the U.S.S.R. was studied to provide early, reliable forecasts of the size of the U.S.S.R. wheat harvest. Separate models are developed for spring wheat and for winter wheat. Differences in yield potential and in responses to stress conditions and cultural improvements necessitate a model for each class.

  9. Electron beam induced water-soluble silk fibroin nanoparticles as a natural antioxidant and reducing agent for a green synthesis of gold nanocolloid

    NASA Astrophysics Data System (ADS)

    Wongkrongsak, Soraya; Tangthong, Theeranan; Pasanphan, Wanvimol

    2016-01-01

    This research proposes novel water-soluble silk fibroin nanoparticles (WSSF-NPs) created by electron beam irradiation. We demonstrate the effects of electron beam irradiation doses ranging from 1 to 30 kGy on the molecular weight (MW), nanostructure formation, antioxidant activity, and reducing power of the WSSF-NPs. Electron beam-induced degradation of silk fibroin (SF) reduced the MW from 250 to 37 kDa. The characteristic chemical functional groups of SF remained after exposure to the electron beam. The WSSF-NPs with an MW of 37 kDa exhibited spherical morphology with a nanoscale size of 40 nm. Antioxidant activity and reducing power were investigated using the 2,2-diphenyl-1-picrylhydrazyl free radical (DPPH•) scavenging activity and ferric reducing antioxidant power (FRAP) assays, respectively. The WSSF-NPs showed greater antioxidant activity and reducing power than non-irradiated SF. Owing to these enhanced antioxidant and reducing efficiencies, the WSSF-NPs were able to produce a gold nanocolloid. WSSF-NPs produced by electron beam irradiation would be of great merit for use as a natural antioxidant additive and a green reducing agent in biomedical, cosmetic, and food applications.

  10. Improving precision of forage yield trials: A case study

    USDA-ARS?s Scientific Manuscript database

    Field-based agronomic and genetic research relies heavily on the data generated from field evaluations. Therefore, it is imperative to optimize the precision of yield estimates in cultivar evaluation trials to make reliable selections. Experimental error in yield trials is sensitive to several facto...

  11. What's holding us back? Raising the alfalfa yield bar

    USDA-ARS?s Scientific Manuscript database

    Measuring yield of commodity crops is easy – weight and moisture content are determined on delivery. Consequently, reports of production or yield for grain crops can be made reliably to the agencies that track crop production, such as the USDA-National Agricultural Statistics Service (NASS). The s...

  12. Reliability and validity of the McDonald Play Inventory.

    PubMed

    McDonald, Ann E; Vigen, Cheryl

    2012-01-01

    This study examined the ability of a two-part self-report instrument, the McDonald Play Inventory, to reliably and validly measure the play activities and play styles of 7- to 11-yr-old children and to discriminate between the play of neurotypical children and children with known learning and developmental disabilities. A total of 124 children ages 7-11 recruited from a sample of convenience and a subsample of 17 parents participated in this study. Reliability estimates yielded moderate correlations for internal consistency, total test intercorrelations, and test-retest reliability. Validity estimates were established for content and construct validity. The results suggest that a self-report instrument yields reliable and valid measures of a child's perceived play performance and discriminates between the play of children with and without disabilities. Copyright © 2012 by the American Occupational Therapy Association, Inc.

  13. Brownian motion of a nano-colloidal particle: the role of the solvent.

    PubMed

    Torres-Carbajal, Alexis; Herrera-Velarde, Salvador; Castañeda-Priego, Ramón

    2015-07-15

    Brownian motion is a feature of colloidal particles immersed in a liquid-like environment. Usually, it can be described by means of the generalised Langevin equation (GLE) within the framework of the Mori theory. In principle, all quantities that appear in the GLE can be calculated from the molecular information of the whole system, i.e., colloids and solvent molecules. In this work, by means of extensive Molecular Dynamics simulations, we study the effects of the microscopic details and the thermodynamic state of the solvent on the movement of a single nano-colloid. In particular, we consider a two-dimensional model system in which the mass and size of the colloid are two and one orders of magnitude, respectively, larger than those associated with the solvent molecules. The latter interact via a Lennard-Jones-type potential to tune the nature of the solvent, i.e., it can be either repulsive or attractive. We choose the linear momentum of the Brownian particle as the observable of interest in order to fully describe the Brownian motion within the Mori framework. We particularly focus on the colloid diffusion at different solvent densities and two temperature regimes: high and low (near the critical point) temperatures. To reach our goal, we have rewritten the GLE as a Volterra integral equation of the second kind in order to compute the memory kernel in real space. With this kernel, we evaluate the momentum-fluctuating force correlation function, which is of particular relevance since it allows us to establish when the stationarity condition has been reached. Our findings show that even at high temperatures, the details of the attractive interaction potential among solvent molecules induce important changes in the colloid dynamics. Additionally, near the critical point, the dynamical scenario becomes more complex; all the correlation functions decay slowly in an extended time window; however, the memory kernel seems to be only a function of the solvent density.
Thus, the
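
    The kernel-extraction step described in this record can be illustrated with a toy calculation: discretizing dC/dt = -∫₀ᵗ K(s) C(t-s) ds with trapezoidal quadrature turns the Volterra equation of the second kind into a recursion that can be solved for K step by step. The exponential kernel and step sizes below are arbitrary choices for illustration, not the paper's simulation data.

```python
import numpy as np

# Toy kernel extraction: generate C(t) from a known kernel with a given
# quadrature, then invert the same recursion to recover the kernel.
h, n_steps = 0.01, 400
t = np.arange(n_steps) * h
K_true = 25.0 * np.exp(-t / 0.2)          # assumed memory kernel

def trap_weights(n, h):
    """Trapezoidal quadrature weights for nodes 0..n."""
    w = np.full(n + 1, h)
    w[0] = w[-1] = h / 2
    return w

# Forward-generate the autocorrelation C(t) from the known kernel.
C = np.empty(n_steps + 1)
C[0] = 1.0
for n in range(n_steps):
    w = trap_weights(n, h)
    conv = np.dot(w * K_true[:n + 1], C[:n + 1][::-1])   # sum_j w_j K_j C_{n-j}
    C[n + 1] = C[n] - h * conv

# Invert the recursion to recover the kernel from C alone.
Cdot = np.diff(C) / h
K_rec = np.empty(n_steps)
for n in range(n_steps):
    w = trap_weights(n, h)
    partial = np.dot(w[:n] * K_rec[:n], C[1:n + 1][::-1])  # known terms j < n
    K_rec[n] = (-Cdot[n] - partial) / (w[n] * C[0])

print("max kernel recovery error:", float(np.max(np.abs(K_rec - K_true))))
```

In a real analysis C(t) would come from the simulation rather than from a chosen kernel; the inversion step is the same.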

  14. Automatic yield-line analysis of slabs using discontinuity layout optimization

    PubMed Central

    Gilbert, Matthew; He, Linwei; Smith, Colin C.; Le, Canh V.

    2014-01-01

    The yield-line method of analysis is a long established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes inter-connected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably automatically identifying yield-line patterns. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying these is presented. PMID:25104905
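
    As background to what DLO automates, the classical hand calculation for a single assumed yield-line pattern is a work balance: external work done by the load through a virtual deflection equals the plastic dissipation along the yield lines. A minimal sketch for the textbook diagonal pattern on a simply supported square slab (all numbers illustrative):

```python
# Work-balance check for the classical diagonal yield-line pattern on a
# simply supported square slab (side L, isotropic plastic moment m per unit
# length). Numbers are illustrative; this is the hand calculation, not DLO.
L, m = 4.0, 10.0
delta = 1.0                          # unit virtual deflection at slab centre

# External work: uniform load p times the swept volume (pyramid of height delta).
swept_volume = L * L * delta / 3.0

# Internal dissipation by the projection method: each rigid region rotates
# about its supported edge by theta = 2*delta/L, over projected length L.
theta = 2.0 * delta / L
dissipation = 4 * m * theta * L

p_u = dissipation / swept_volume     # collapse load per unit area
print("p_u =", p_u, "   closed form 24*m/L^2 =", 24 * m / L**2)
```

DLO generalizes this by searching over many candidate discontinuities at once, with the critical combination found by linear programming.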

  15. Assessment of the Maximal Split-Half Coefficient to Estimate Reliability

    ERIC Educational Resources Information Center

    Thompson, Barry L.; Green, Samuel B.; Yang, Yanyun

    2010-01-01

    The maximal split-half coefficient is computed by calculating all possible split-half reliability estimates for a scale and then choosing the maximal value as the reliability estimate. Osburn compared the maximal split-half coefficient with 10 other internal consistency estimates of reliability and concluded that it yielded the most consistently…
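
    The coefficient's construction can be sketched directly: enumerate every split of the items into two equal halves, compute the Spearman-Brown corrected split-half reliability for each, and take the maximum. The simulated 6-item scale below is a hypothetical illustration.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 6-item scale, 200 respondents: one common factor plus noise.
n_items = 6
factor = rng.normal(size=(200, 1))
items = factor + rng.normal(size=(200, n_items))

def split_half(items, half_a):
    """Spearman-Brown corrected split-half reliability for one split."""
    half_b = [i for i in range(items.shape[1]) if i not in half_a]
    a = items[:, list(half_a)].sum(axis=1)
    b = items[:, half_b].sum(axis=1)
    r = np.corrcoef(a, b)[0, 1]
    return 2 * r / (1 + r)

# Every split into two equal halves; requiring item 0 in the first half
# avoids counting each split twice.
halves = [h for h in itertools.combinations(range(n_items), n_items // 2)
          if 0 in h]
estimates = [split_half(items, h) for h in halves]
print(f"maximal split-half coefficient: {max(estimates):.3f}")
```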

  16. Weather-based forecasts of California crop yields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lobell, D B; Cahill, K N; Field, C B

    2005-09-26

    Crop yield forecasts provide useful information to a range of users. Yields for several crops in California are currently forecast based on field surveys and farmer interviews, while for many crops official forecasts do not exist. As broad-scale crop yields are largely dependent on weather, measurements from existing meteorological stations have the potential to provide a reliable, timely, and cost-effective means to anticipate crop yields. We developed weather-based models of state-wide yields for 12 major California crops (wine grapes, lettuce, almonds, strawberries, table grapes, hay, oranges, cotton, tomatoes, walnuts, avocados, and pistachios), and tested their accuracy using cross-validation over the 1980-2003 period. Many crops were forecast with high accuracy, as judged by the percent of yield variation explained by the forecast, the number of yields with correctly predicted direction of yield change, or the number of yields with correctly predicted extreme yields. The most successfully modeled crop was almonds, with 81% of yield variance captured by the forecast. Predictions for most crops relied on weather measurements well before harvest time, allowing for lead times that were longer than existing procedures in many cases.
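
    The cross-validated evaluation described above can be sketched as follows, with invented data standing in for the California weather and yield series: fit a linear yield model with each year held out in turn, then score the out-of-sample predictions by the percent of yield variance explained.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented stand-in for a state-wide series: 24 years of yield driven by two
# pre-harvest weather covariates (e.g., spring temperature and rainfall).
n = 24
weather = rng.normal(size=(n, 2))
yield_obs = 3.0 + weather @ np.array([0.5, -0.3]) + rng.normal(scale=0.2, size=n)

X = np.column_stack([np.ones(n), weather])

# Leave-one-year-out cross-validation, in the spirit of the 1980-2003 test.
preds = np.empty(n)
for i in range(n):
    keep = np.arange(n) != i
    beta, *_ = np.linalg.lstsq(X[keep], yield_obs[keep], rcond=None)
    preds[i] = X[i] @ beta

ss_res = np.sum((yield_obs - preds) ** 2)
ss_tot = np.sum((yield_obs - yield_obs.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"percent of yield variance explained: {100 * r2:.1f}%")
```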

  17. Lactation persistency as a component trait of the selection index and increase in reliability by using single nucleotide polymorphism in net merit defined as the first five lactation milk yields and herd life.

    PubMed

    Togashi, K; Hagiya, K; Osawa, T; Nakanishi, T; Yamazaki, T; Nagamine, Y; Lin, C Y; Matsumoto, S; Aihara, M; Hayasaka, K

    2012-08-01

    We first sought to clarify the effects of discounted rate, survival rate, and lactation persistency as a component trait of the selection index on net merit, defined as the first five lactation milks and herd life (HL) weighted by 1 and 0.389 (currently used in Japan), respectively, in units of genetic standard deviation. Survival rate increased the relative economic importance of later lactation traits and the first five lactation milk yields during the first 120 months from the start of the breeding scheme. In contrast, reliabilities of the estimated breeding value (EBV) in later lactation traits are lower than those of earlier lactation traits. We then sought to clarify the effects of applying single nucleotide polymorphism (SNP) on net merit to improve the reliability of EBV of later lactation traits to maximize their increased economic importance due to increase in survival rate. Net merit, selection accuracy, and HL increased by adding lactation persistency to the selection index whose component traits were only milk yields. Lactation persistency of the second and (especially) third parities contributed to increasing HL while maintaining the first five lactation milk yields compared with the selection index whose only component traits were milk yields. A selection index comprising the first three lactation milk yields and persistency accounted for 99.4% of net merit derived from a selection index whose components were identical to those for net merit. We consider that the selection index comprising the first three lactation milk yields and persistency is a practical method for increasing lifetime milk yield in the absence of data regarding HL. Applying SNP to the second- and third-lactation traits and HL increased net merit and HL by maximizing the increased economic importance of later lactation traits, reducing the effect of first-lactation milk yield on HL (genetic correlation (rG) = -0.006), and by augmenting the effects of the second- and third

  18. Electronics reliability and measurement technology

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S. (Editor)

    1987-01-01

    A summary is presented of the Electronics Reliability and Measurement Technology Workshop. The meeting examined the U.S. electronics industry with particular focus on reliability and state-of-the-art technology. A general consensus of the approximately 75 attendees was that "the U.S. electronics industries are facing a crisis that may threaten their existence". The workshop had specific objectives to discuss mechanisms to improve areas such as reliability, yield, and performance while reducing failure rates, delivery times, and cost. The findings of the workshop addressed various aspects of the industry from wafers to parts to assemblies. Key problem areas that were singled out for attention are identified, and action items necessary to accomplish their resolution are recommended.

  19. A new electronic meter for measuring herbage yield

    Treesearch

    Donald L. Neal; Lee R. Neal

    1965-01-01

    A new electronic instrument to measure herbage yield and utilization, called the Heterodyne Vegetation Meter, was built and tested. The instrument proved to be reliable and rapid. Further testing will be conducted.

  20. Reliability analysis of a sensitive and independent stabilometry parameter set

    PubMed Central

    Nagymáté, Gergely; Orlovits, Zsanett; Kiss, Rita M

    2018-01-01

    Recent studies have suggested reduced independent and sensitive parameter sets for stabilometry measurements based on correlation and variance analyses. However, the reliability of these recommended parameter sets has not been studied in the literature or not in every stance type used in stabilometry assessments, for example, single leg stances. The goal of this study is to evaluate the test-retest reliability of different time-based and frequency-based parameters that are calculated from the center of pressure (CoP) during bipedal and single leg stance for 30- and 60-second measurement intervals. Thirty healthy subjects performed repeated standing trials in a bipedal stance with eyes open and eyes closed conditions and in a single leg stance with eyes open for 60 seconds. A force distribution measuring plate was used to record the CoP. The reliability of the CoP parameters was characterized by using the intraclass correlation coefficient (ICC), standard error of measurement (SEM), minimal detectable change (MDC), coefficient of variation (CV) and CV compliance rate (CVCR). Based on the ICC, SEM and MDC results, many parameters yielded fair to good reliability values, while the CoP path length yielded the highest reliability (smallest ICC > 0.67 (0.54–0.79), largest SEM% = 19.2%). Usually, frequency type parameters and extreme value parameters yielded poor reliability values. There were differences in the reliability of the maximum CoP velocity (better with 30 seconds) and mean power frequency (better with 60 seconds) parameters between the different sampling intervals. PMID:29664938
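
    The reliability statistics used in this study can be reproduced on synthetic data. The sketch below computes ICC(2,1) from a two-way random-effects ANOVA decomposition, then derives SEM and MDC95 from it; the subject numbers and measurement noise are invented, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented test-retest data: CoP path length (mm), 30 subjects, 2 trials each.
n, k = 30, 2
true_score = rng.normal(600, 80, size=(n, 1))
data = true_score + rng.normal(0, 30, size=(n, k))

grand = data.mean()
subj_means = data.mean(axis=1, keepdims=True)
trial_means = data.mean(axis=0, keepdims=True)

# Two-way random-effects ANOVA mean squares.
msr = k * np.sum((subj_means - grand) ** 2) / (n - 1)              # subjects
msc = n * np.sum((trial_means - grand) ** 2) / (k - 1)             # trials
mse = np.sum((data - subj_means - trial_means + grand) ** 2) / ((n - 1) * (k - 1))

icc21 = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)  # ICC(2,1)
sem = data.std(ddof=1) * np.sqrt(1 - icc21)     # standard error of measurement
mdc95 = 1.96 * np.sqrt(2) * sem                 # minimal detectable change
print(f"ICC(2,1) = {icc21:.2f}, SEM = {sem:.1f} mm, MDC95 = {mdc95:.1f} mm")
```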

  2. Field design factors affecting the precision of ryegrass forage yield estimation

    USDA-ARS?s Scientific Manuscript database

    Field-based agronomic and genetic research relies heavily on the data generated from field evaluations. Therefore, it is imperative to optimize the precision and accuracy of yield estimates in cultivar evaluation trials to make reliable selections. Experimental error in yield trials is sensitive to ...

  3. [Sentinel node detection in early stage of cervical carcinoma using 99mTc-nanocolloid and blue dye].

    PubMed

    Sevcík, L; Klát, J; Gráf, P; Koliba, P; Curík, R; Kraft, O

    2007-04-01

    The aim of the study was to analyse the feasibility of intraoperative sentinel lymph node (SLN) detection using a gamma-detection probe and blue dye in patients undergoing radical hysterectomy for treatment of early-stage cervical cancer. Prospective observational case study. In the period from May 2004 to February 2006, 77 patients with early-stage cervical cancer who underwent radical surgery were included in the study. Patients were divided into three groups according to tumour volume: the first group comprised patients with FIGO IA2 and FIGO IB1 tumours less than 2 cm in diameter; the second group, FIGO IB1 tumours more than 2 cm in diameter; and the third group, stage IB2. SLN were detected by blue dye and 99mTc. Preoperative lymphoscintigraphy was performed after injection of 99mTc colloid; intraoperative detection was performed by visual observation and with a hand-held gamma-detection probe. SLN were analysed histologically and immunohistochemically. A total of 2764 lymph nodes (an average of 36 per patient) and 202 SLN (an average of 2.6 per patient) were identified. The SLN detection rate was 94.8% per patient and 85.1% per side of dissection, and depended on tumour volume. SLN were identified in the obturator area in 48% of cases, the external iliac area in 15%, the common iliac and internal iliac areas in 9% each, the interiliac region in 8%, the presacral region in 6%, and the parametrial area in 5%. Metastatic disease was detected in 31 patients (40.2%), with metastatic involvement of SLN only in 12 patients (15.6%). The false-negative rate was 2.6%; sensitivity and negative predictive value calculated per patient were 92.3% and 95.7%. Intraoperative lymphatic mapping using a combination of technetium-99m-labelled nanocolloid and blue dye is a feasible, safe, and accurate technique to identify SLN in early-stage cervical cancer.

  4. Improving the reliability of female fertility breeding values using type and milk yield traits that predict energy status in Australian Holstein cattle.

    PubMed

    González-Recio, O; Haile-Mariam, M; Pryce, J E

    2016-01-01

    The objectives of this study were (1) to propose changing the selection criterion trait for evaluating fertility in Australia from calving interval to conception rate at d 42 after the beginning of the mating season and (2) to use type traits as early fertility predictors, to increase the reliability of estimated breeding values for fertility. The breeding goal in Australia is conception within 6 wk of the start of the mating season. Currently, the Australian model to predict fertility breeding values (expressed as a linear transformation of calving interval) is a multitrait model that includes calving interval (CVI), lactation length (LL), calving to first service (CFS), first nonreturn rate (FNRR), and conception rate. However, CVI has a lower genetic correlation with the breeding goal (conception within 6 wk of the start of the mating season) than conception rate. Milk yield, type, and fertility data from 164,318 cows sired by 4,766 bulls were used. Principal component analysis and genetic correlation estimates between type and fertility traits were used to select type traits that could subsequently be used in a multitrait analysis. Angularity, foot angle, and pin set were chosen as type traits to include in an index with the traits that are included in the multitrait fertility model: CVI, LL, CFS, FNRR, and conception rate at d 42 (CR42). An index with these 8 traits is expected to achieve an average bull first-proof reliability of 0.60 on the breeding objective (conception within 6 wk of the start of the mating season) compared with reliabilities of 0.39 and 0.45 for CR42 only or the current 5-trait Australian model. Subsequently, we used the first eigenvector of a principal component analysis with udder texture, bone quality, angularity, and body condition score to calculate an energy status indicator trait. The inclusion of the energy status indicator trait composite in a multitrait index with CVI, LL, CFS, FNRR, and CR42 achieved a 12-point increase in

  5. Highly reliable oxide VCSELs for datacom applications

    NASA Astrophysics Data System (ADS)

    Aeby, Ian; Collins, Doug; Gibson, Brian; Helms, Christopher J.; Hou, Hong Q.; Lou, Wenlin; Bossert, David J.; Wang, Charlie X.

    2003-06-01

    In this paper we describe the processes and procedures that have been developed to ensure high reliability for Emcore's 850 nm oxide-confined GaAs VCSELs. Evidence from ongoing accelerated life testing and other reliability studies confirming that this process yields reliable products will be discussed. We will present the data and analysis techniques used to determine the activation energy and acceleration factors for the dominant wear-out failure mechanisms for our devices, as well as our estimated MTTF of greater than 2 million use hours. We conclude with a summary of internal verification and field return rate validation data.
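
    The Arrhenius extrapolation that typically underlies such accelerated life testing can be sketched as follows; the activation energy, temperatures, and stress-test lifetime below are illustrative assumptions, not Emcore's reported values.

```python
import math

# Arrhenius extrapolation of the kind commonly used in accelerated life
# testing: a wear-out mechanism with activation energy Ea runs faster at an
# elevated stress temperature by the acceleration factor AF.
k_B = 8.617e-5                       # Boltzmann constant, eV/K
Ea = 0.7                             # assumed activation energy, eV
T_use, T_stress = 328.15, 398.15     # 55 C use vs 125 C stress, in kelvin

# Acceleration factor between stress and use conditions.
AF = math.exp(Ea / k_B * (1 / T_use - 1 / T_stress))

mttf_stress_h = 5_000                # hypothetical median life under stress
mttf_use_h = mttf_stress_h * AF      # extrapolated life at use conditions
print(f"acceleration factor = {AF:.1f}, extrapolated MTTF = {mttf_use_h:.3g} h")
```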

  6. Yield: it's now an entitlement

    NASA Astrophysics Data System (ADS)

    George, Bill

    1994-09-01

    Only a few years ago, the primary method of cost reduction and productivity improvement in the semiconductor industry was increasing manufacturing yields throughout the process. Many of the remarkable reliability improvements realized over the past decade have come about as a result of actions that were originally taken primarily to improve device yields. Obviously, the practice of productivity improvement through yield enhancement is limited to the attainment of 100% yield, at which point some other mechanism must be employed. Traditionally, new products have been introduced to manufacturing at a point of relative immaturity, and semiconductor producers have relied on the traditional `learning curve' method of yield improvement to attain profitable levels of manufacturing yield. Recently, results of a survey of several fabs by a group of University of California at Berkeley researchers in the Competitive Semiconductor Manufacturing Program indicate that most factories learn at about the same rate after startup, in terms of both line yield and defectivity. If this is indeed generally true, then the most competitive factory is the one that starts with the highest yield, and it is difficult to displace a leader once that lead has been established. The two observations made above carry enormous implications for the semiconductor development or manufacturing professional. First, one must achieve very high yields in order to even play the game. Second, the achievement of competitive yields over the life of a factory is determined even before the factory is opened, in the planning and development phase. Third, and perhaps most uncomfortable for those of us who have relied on yield improvement as a cost driver, the winners of the nineties will find new levers to drive costs down, having already gotten the benefit of very high yield.
This paper looks at the question of how the winners will achieve the critical measures of success, high initial yield and utilization

  7. Evaluation of the Williams-type model for barley yields in North Dakota and Minnesota

    NASA Technical Reports Server (NTRS)

    Barnett, T. L. (Principal Investigator)

    1981-01-01

    The Williams-type yield model is based on multiple regression analysis of historical time series data at the CRD level pooled to the regional level (groups of similar CRDs). Basic variables considered in the analysis include USDA yield, monthly mean temperature, monthly precipitation, soil texture and topographic information, and variables derived from these. Technological trend is represented by piecewise linear and/or quadratic functions of year. Indicators of yield reliability obtained from a ten-year bootstrap test (1970-1979) demonstrate that biases are small and that performance based on root mean square error appears to be acceptable for the intended AgRISTARS large-area applications. The model is objective, adequate, timely, simple, and not costly. It considers scientific knowledge on a broad scale but not in detail, and does not provide a good current measure of modeled yield reliability.

  8. Covariance Matrix Evaluations for Independent Mass Fission Yields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terranova, N., E-mail: nicholas.terranova@unibo.it; Serot, O.; Archier, P.

    2015-01-15

    Recent needs for more accurate fission product yields include covariance information to allow improved uncertainty estimations of the parameters used by design codes. The aim of this work is to investigate the possibility of generating more reliable and complete uncertainty information on independent mass fission yields. Mass yield covariances are estimated through a convolution between the multi-Gaussian empirical model based on Brosa's fission modes, which describes the pre-neutron mass yields, and the average prompt neutron multiplicity curve. The covariance generation task has been approached using the Bayesian generalized least squares method through the CONRAD code. Preliminary results on the mass yields variance-covariance matrix will be presented and discussed on physical grounds for the ²³⁵U(nth, f) and ²³⁹Pu(nth, f) reactions.
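
    The propagation idea can be caricatured in a few lines: model the mass-yield curve as a sum of Gaussian modes, sample the (assumed) uncertain mode parameters, and form the empirical variance-covariance matrix of the resulting yields. This toy sketch deliberately omits the prompt-neutron convolution and the Bayesian GLS step performed in CONRAD.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy multi-Gaussian mass-yield curve in the spirit of Brosa's fission modes:
# one asymmetric mode plus its complementary-fragment mirror. All parameters
# are illustrative, not an actual evaluation.
A = np.arange(70, 171)                       # fragment mass numbers

def mass_yield(mu, sigma):
    y = np.exp(-0.5 * ((A - mu) / sigma) ** 2)
    y = y + np.exp(-0.5 * ((A - (236 - mu)) / sigma) ** 2)  # mirror peak
    return 200.0 * y / y.sum()               # normalise to 200% per fission

# Monte Carlo propagation of assumed mode-parameter uncertainties into a
# variance-covariance matrix for the mass yields.
samples = np.array([
    mass_yield(rng.normal(96.0, 0.5), rng.normal(6.0, 0.2))
    for _ in range(2000)
])
cov = np.cov(samples, rowvar=False)
std_dev = np.sqrt(np.diag(cov))
print("largest mass-yield standard deviation:", float(std_dev.max()))
```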

  9. Reliable and valid tools for measuring surgeons' teaching performance: residents' vs. self evaluation.

    PubMed

    Boerebach, Benjamin C M; Arah, Onyebuchi A; Busch, Olivier R C; Lombarts, Kiki M J M H

    2012-01-01

    In surgical education, there is a need for educational performance evaluation tools that yield reliable and valid data. This paper describes the development and validation of robust evaluation tools that provide surgeons with insight into their clinical teaching performance. We investigated (1) the reliability and validity of 2 tools for evaluating the teaching performance of attending surgeons in residency training programs, and (2) whether surgeons' self-evaluations correlated with the residents' evaluations of those surgeons. We surveyed 343 surgeons and 320 residents as part of a multicenter prospective cohort study of faculty teaching performance in residency training programs. The reliability and validity of the SETQ (System for Evaluation of Teaching Qualities) tools were studied using standard psychometric techniques. We then estimated the correlations between residents' and surgeons' evaluations. The response rate was 87% among surgeons and 84% among residents, yielding 2625 residents' evaluations and 302 self-evaluations. The SETQ tools yielded reliable and valid data on 5 domains of surgical teaching performance, namely, learning climate, professional attitude towards residents, communication of goals, evaluation of residents, and feedback. The correlations between surgeons' self-evaluations and residents' evaluations were low, with coefficients ranging from 0.03 for evaluation of residents to 0.18 for communication of goals. The SETQ tools for the evaluation of surgeons' teaching performance appear to yield reliable and valid data. The lack of strong correlations between surgeons' self-evaluations and residents' evaluations suggests the need for using external feedback sources in informed self-evaluation of surgeons. Copyright © 2012 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  10. Advanced reliability methods for structural evaluation

    NASA Technical Reports Server (NTRS)

    Wirsching, P. H.; Wu, Y.-T.

    1985-01-01

    Fast probability integration (FPI) methods, which can yield approximate solutions to such general structural reliability problems as the computation of the probabilities of complicated functions of random variables, are known to require one-tenth the computer time of Monte Carlo methods for a probability level of 0.001; lower probabilities yield even more dramatic differences. A strategy is presented in which a computer routine is run k times with selected perturbed values of the variables to obtain k solutions for a response variable Y. An approximating polynomial is fit to the k 'data' sets, and FPI methods are employed for this explicit form.
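    The perturb-fit-integrate strategy described above can be sketched in a few lines. Everything here is illustrative: the "expensive" routine is a toy cubic, the design points are chosen arbitrarily, and plain Monte Carlo on the fitted polynomial stands in for the FPI step proper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the expensive structural-response routine (toy cubic).
def expensive_model(x):
    return x**3 - 2.0 * x + 1.0

# Step 1: run the routine k times at selected perturbed input values.
k = 9
x_design = np.linspace(-2.0, 2.0, k)
y_design = np.array([expensive_model(x) for x in x_design])

# Step 2: fit an approximating polynomial to the k solutions.
surrogate = np.poly1d(np.polyfit(x_design, y_design, deg=3))

# Step 3: probability integration on the cheap explicit form
# (plain Monte Carlo here, where the abstract would apply FPI).
x_samples = rng.normal(0.0, 1.0, 100_000)
p_fail = float(np.mean(surrogate(x_samples) < 0.0))
print(f"estimated P(Y < 0) = {p_fail:.4f}")
```

    The payoff is that step 3 evaluates only the explicit surrogate, never the expensive routine, which is what makes low-probability estimates tractable.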

  11. The process group approach to reliable distributed computing

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, covering the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  12. Diverse Data Sets Can Yield Reliable Information through Mechanistic Modeling: Salicylic Acid Clearance.

    PubMed

    Raymond, G M; Bassingthwaighte, J B

    This is a practical example of a powerful research strategy: putting together data from studies covering a diversity of conditions can yield a scientifically sound grasp of a phenomenon when the individual observations fail to provide definitive understanding. The rationale is that defining a realistic, quantitative, explanatory hypothesis for the whole set of studies brings about a "consilience" of the often competing hypotheses considered for individual data sets. An internally consistent conjecture linking multiple data sets simultaneously provides stronger evidence on the characteristics of a system than does analysis of individual data sets limited to narrow ranges of conditions. Our example examines three very different data sets on the clearance of salicylic acid from humans: a high-concentration set from aspirin overdoses; a medium-concentration set from a research study on the influences of the route of administration and of sex on the clearance kinetics; and a set on low-dose aspirin for cardiovascular health. Three models were tested: (1) a first-order reaction, (2) a Michaelis-Menten (M-M) approach, and (3) an enzyme kinetic model with forward and backward reactions. The reaction rates found from model 1 were distinctly different for the three data sets, having no commonality. The M-M model 2 fitted each of the three data sets but gave a reliable estimate of the Michaelis constant only for the medium-level data (K m = 24±5.4 mg/L); analyzing the three data sets together with model 2 gave K m = 18±2.6 mg/L. (Estimating parameters using larger numbers of data points in an optimization increases the degrees of freedom, constraining the range of the estimates.) Using the enzyme kinetic model (3) increased the number of free parameters but nevertheless improved the goodness of fit to the combined data sets, giving tighter constraints and a lower estimated K m = 14.6±2.9 mg/L, demonstrating that fitting diverse data sets with a single model
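    A hedged sketch of the pooled M-M fit (model 2): synthetic concentration-rate data stand in for the three clinical data sets, with the underlying Km chosen near the reported pooled estimate; nothing here reproduces the actual study data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Michaelis-Menten elimination rate: v = Vmax * C / (Km + C).
def mm_rate(conc, vmax, km):
    return vmax * conc / (km + conc)

# Synthetic pooled observations spanning low, medium and high
# concentrations (mg/L); Vmax = 40 and Km = 18 are illustrative,
# with Km set near the reported pooled estimate.
conc = np.array([2.0, 5.0, 10.0, 25.0, 50.0, 120.0, 300.0, 600.0])
rng = np.random.default_rng(1)
rate = mm_rate(conc, 40.0, 18.0) * (1.0 + 0.03 * rng.standard_normal(conc.size))

# Fit the pooled data; a narrow concentration range alone would
# leave Km poorly determined.
popt, _ = curve_fit(mm_rate, conc, rate, p0=(30.0, 10.0))
vmax_hat, km_hat = popt
print(f"Vmax = {vmax_hat:.1f}, Km = {km_hat:.1f} mg/L")
```

    Because the pooled set spans both the near-linear low-concentration regime and the saturated high-concentration regime, both parameters are identifiable, which is the abstract's central point.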

  13. Quantifying yield gaps in wheat production in Russia

    NASA Astrophysics Data System (ADS)

    Schierhorn, Florian; Faramarzi, Monireh; Prishchepov, Alexander V.; Koch, Friedrich J.; Müller, Daniel

    2014-08-01

    Crop yields must increase substantially to meet the increasing demands for agricultural products. Crop yield increases are particularly important for Russia because low crop yields prevail across Russia’s widespread and fertile land resources. However, reliable data are lacking regarding the spatial distribution of potential yields in Russia, which can be used to determine yield gaps. We used a crop growth model to determine the yield potentials and yield gaps of winter and spring wheat at the provincial level across European Russia. We modeled the annual yield potentials from 1995 to 2006 with optimal nitrogen supplies for both rainfed and irrigated conditions. Overall, the results suggest yield gaps of 1.51-2.10 t ha-1, or 44-52% of the yield potential under rainfed conditions. Under irrigated conditions, yield gaps of 3.14-3.30 t ha-1, or 62-63% of the yield potential, were observed. However, recurring droughts cause large fluctuations in yield potentials under rainfed conditions, even when the nitrogen supply is optimal, particularly in the highly fertile black soil areas of southern European Russia. The highest yield gaps (up to 4 t ha-1) under irrigated conditions were detected in the steppe areas in southeastern European Russia along the border of Kazakhstan. Improving the nutrient and water supply and using crop breeds that are adapted to the frequent drought conditions are important for reducing yield gaps in European Russia. Our regional assessment helps inform policy and agricultural investors and prioritize research that aims to increase crop production in this important region for global agricultural markets.
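    The yield-gap arithmetic used throughout the abstract is simple subtraction and division; a minimal sketch with made-up numbers that fall inside the reported rainfed ranges:

```python
# Yield-gap bookkeeping as used in the abstract:
# gap = potential - actual; gap share = gap / potential.
def yield_gap(potential_t_ha, actual_t_ha):
    gap = potential_t_ha - actual_t_ha
    return gap, gap / potential_t_ha

# Hypothetical rainfed example (values are illustrative only).
gap, share = yield_gap(3.4, 1.7)
print(f"gap = {gap:.2f} t/ha ({share:.0%} of potential)")
```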

  14. Evaluation of the CEAS trend and monthly weather data models for soybean yields in Iowa, Illinois, and Indiana

    NASA Technical Reports Server (NTRS)

    French, V. (Principal Investigator)

    1982-01-01

    The CEAS models evaluated use historic trend and meteorological and agroclimatic variables to forecast soybean yields in Iowa, Illinois, and Indiana. Indicators of yield reliability and current measures of modeled yield reliability were obtained from bootstrap tests on the end of season models. Indicators of yield reliability show that the state models are consistently better than the crop reporting district (CRD) models. One CRD model is especially poor. At the state level, the bias of each model is less than one half quintal/hectare. The standard deviation is between one and two quintals/hectare. The models are adequate in terms of coverage and are to a certain extent consistent with scientific knowledge. Timely yield estimates can be made during the growing season using truncated models. The models are easy to understand and use and are not costly to operate. Other than the specification of values used to determine evapotranspiration, the models are objective. Because the method of variable selection used in the model development is adequately documented, no evaluation can be made of the objectivity and cost of redevelopment of the model.

  15. Further Examination of the Reliability of the Modified Rathus Assertiveness Schedule.

    ERIC Educational Resources Information Center

    Del Greco, Linda; And Others

    1986-01-01

    Examined the reliability of the 30-item Modified Rathus Assertiveness Schedule (MRAS) using the test-retest method over a three-week period. The MRAS yielded correlations of .74 using the Pearson product-moment and Spearman-Brown correlation coefficients. Correlations for males were .77 and .72; for females, correlations for both tests were .72.…
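    A minimal sketch of the two statistics involved, using hypothetical score pairs (the formulas, not the MRAS data, are the point here):

```python
import numpy as np

# Hypothetical test and three-week retest totals for eight respondents.
test = np.array([12, 18, 25, 30, 22, 15, 28, 20])
retest = np.array([14, 17, 27, 29, 21, 16, 26, 22])

# Test-retest reliability as the Pearson product-moment correlation.
r = np.corrcoef(test, retest)[0, 1]

# Spearman-Brown prophecy formula: projected reliability if the
# scale length were doubled.
r_sb = 2 * r / (1 + r)
print(f"Pearson r = {r:.2f}, Spearman-Brown = {r_sb:.2f}")
```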

  16. Claims about the Reliability of Student Evaluations of Instruction: The Ecological Fallacy Rides Again

    ERIC Educational Resources Information Center

    Morley, Donald D.

    2012-01-01

    The vast majority of the research on student evaluation of instruction has assessed the reliability of groups of courses and yielded either a single reliability coefficient for the entire group, or grouped reliability coefficients for each student evaluation of teaching (SET) item. This manuscript argues that these practices constitute a form of…

  17. Yield gap mapping as a support tool for risk management in agriculture

    NASA Astrophysics Data System (ADS)

    Lahlou, Ouiam; Imani, Yasmina; Slimani, Imane; Van Wart, Justin; Yang, Haishun

    2016-04-01

    The increasing frequency and magnitude of droughts in Morocco and the mounting losses from extended droughts in the agricultural sector have emphasized the need to develop reliable and timely tools to manage drought and to mitigate the resulting catastrophic damage. In 2011, Morocco launched a multi-risk cereals insurance, with drought as the most threatening and most frequent hazard in the country. However, in order to assess the gap and to implement suitable compensation, it is essential to quantify the potential yield in each area. In collaboration with the University of Nebraska-Lincoln, a study is being carried out in Morocco to determine the yield potentials and yield gaps in the different agro-climatic zones of the country. It fits into the larger project Global Yield Gap and Water Productivity Atlas: http://www.yieldgap.org/. The yield gap (Yg) is the difference between crop yield potential (Yp), or water-limited yield potential (Yw), and the actual yields reached by farmers. World Food Studies (WOFOST), a mechanistic crop simulation model, has been used for this purpose. Prior to the simulations, reliable information about actual yields, weather, crop management and soils was collected in 7 Moroccan buffer zones, each defined as a circle of 100 km around a weather station, homogeneously spread across the country where cereals are widely grown. The model calibration was carried out using WOFOST default variety data. The map-based results represent a robust tool, not only for organizing drought insurance but for agricultural management and agricultural risk management in general. Moreover, accurate and geospatially granular estimates of Yg and Yw will make it possible to focus on the regions with the largest unexploited yield gaps and the greatest potential to close them, and consequently to improve food security in the country.

  18. Evaluation of Thompson-type trend and monthly weather data models for corn yields in Iowa, Illinois, and Indiana

    NASA Technical Reports Server (NTRS)

    French, V. (Principal Investigator)

    1982-01-01

    An evaluation was made of Thompson-Type models which use trend terms (as a surrogate for technology), meteorological variables based on monthly average temperature, and total precipitation to forecast and estimate corn yields in Iowa, Illinois, and Indiana. Pooled and unpooled Thompson-type models were compared. Neither was found to be consistently superior to the other. Yield reliability indicators show that the models are of limited use for large area yield estimation. The models are objective and consistent with scientific knowledge. Timely yield forecasts and estimates can be made during the growing season by using normals or long range weather forecasts. The models are not costly to operate and are easy to use and understand. The model standard errors of prediction do not provide a useful current measure of modeled yield reliability.

  19. Interval Estimation of Revision Effect on Scale Reliability via Covariance Structure Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko

    2009-01-01

    A didactic discussion of a procedure for interval estimation of change in scale reliability due to revision is provided, which is developed within the framework of covariance structure modeling. The method yields ranges of plausible values for the population gain or loss in reliability of unidimensional composites, which results from deletion or…

  20. Compression of freestanding gold nanostructures: from stochastic yield to predictable flow

    NASA Astrophysics Data System (ADS)

    Mook, W. M.; Niederberger, C.; Bechelany, M.; Philippe, L.; Michler, J.

    2010-02-01

    Characterizing the mechanical response of isolated nanostructures is vitally important to fields such as microelectromechanical systems (MEMS) where the behaviour of nanoscale contacts can in large part determine system reliability and lifetime. To address this challenge directly, single crystal gold nanodots are compressed inside a high resolution scanning electron microscope (SEM) using a nanoindenter equipped with a flat punch tip. These structures load elastically, and then yield in a stochastic manner, at loads ranging from 16 to 110 µN, which is up to five times higher than the load necessary for flow after yield. Yielding is immediately followed by displacement bursts equivalent to 1-50% of the initial height, depending on the yield point. During the largest displacement bursts, strain energy within the structure is released while new surface area is created in the form of localized slip bands, which are evident in both the SEM movies and still-images. A first order estimate of the apparent energy release rate, in terms of fracture mechanics concepts, for bursts representing 5-50% of the structure's initial height is on the order of 10-100 J m-2, which is approximately two orders of magnitude lower than bulk values. Once this initial strain burst during yielding has occurred, the structures flow in a ductile way. The implications of this behaviour, which is analogous to a brittle to ductile transition, are discussed with respect to mechanical reliability at the micro- and nanoscales.

  1. Increased Reliability for Single-Case Research Results: Is the Bootstrap the Answer?

    ERIC Educational Resources Information Center

    Parker, Richard I.

    2006-01-01

    There is need for objective and reliable single-case research (SCR) results in the movement toward evidence-based interventions (EBI), for inclusion in meta-analyses, and for funding accountability in clinical contexts. Yet SCR deals with data that often do not conform to parametric data assumptions and that yield results of low reliability. A…

  2. Satellite-based assessment of grassland yields

    NASA Astrophysics Data System (ADS)

    Grant, K.; Siegmund, R.; Wagner, M.; Hartmann, S.

    2015-04-01

    Cutting date and frequency are important parameters determining grassland yields, in addition to the effects of weather, soil conditions, plant composition and fertilisation. Because accurate and area-wide data on grassland yields are currently not available, cutting frequency can be used to estimate yields. In this project, a method to detect cutting dates via surface changes in radar images is developed. The combination of this method with a grassland yield model will result in more reliable, region-wide numbers for grassland yields. For the test phase of the monitoring project, a study area southeast of Munich, Germany, was chosen due to its high density of managed grassland. To determine grassland cutting, robust amplitude change detection techniques are used that evaluate radar amplitude or backscatter statistics before and after the cutting event. CosmoSkyMed and Sentinel-1A data were analysed. All detected cuts were verified against in-situ measurements recorded in a GIS database. Although the SAR systems had various acquisition geometries, the number of detected grassland cuts was quite similar: of 154 tested grassland plots, covering in total 436 ha, 116 and 111 cuts were detected using CosmoSkyMed and Sentinel-1A radar data, respectively. Further improvement of the radar data processing, as well as additional analyses with larger sample numbers and wider land surface coverage, will follow to optimise the method and to validate and generalise the results of this feasibility study. The automation of this method will then allow an area-wide and cost-efficient cutting date detection service, improving grassland yield models.

  3. Global Crop Yields, Climatic Trends and Technology Enhancement

    NASA Astrophysics Data System (ADS)

    Najafi, E.; Devineni, N.; Khanbilvardi, R.; Kogan, F.

    2016-12-01

    During the last decades global agricultural production has soared, and technology enhancement is still making a positive contribution to yield growth. However, continuing population growth, water crises, deforestation and climate change threaten global food security. Attempts to predict future food availability around the world can be partly informed by the impact of changes to date. A new multilevel model for yield prediction at the country scale using climate covariates and a technology trend is presented in this paper. The structural relationships between average yield and climate attributes, as well as trends, are estimated simultaneously. All countries are modeled in a single multilevel model with partial pooling and/or clustering to automatically group countries and reduce estimation uncertainties. El Niño Southern Oscillation (ENSO), the Palmer Drought Severity Index (PDSI), geopotential height (GPH), historical CO2 levels and a time trend, as a relatively reliable approximation of technology, are used as predictors to estimate annual agricultural crop yields for each country from 1961 to 2007. Results show that these indicators can explain the variability in historical crop yields for most countries and that the model performs well under out-of-sample verification.

  4. Movement-related beta oscillations show high intra-individual reliability.

    PubMed

    Espenhahn, Svenja; de Berker, Archy O; van Wijk, Bernadette C M; Rossiter, Holly E; Ward, Nick S

    2017-02-15

    Oscillatory activity in the beta frequency range (15-30Hz) recorded from human sensorimotor cortex is of increasing interest as a putative biomarker of motor system function and dysfunction. Despite its increasing use in basic and clinical research, surprisingly little is known about the test-retest reliability of spectral power and peak frequency measures of beta oscillatory signals from sensorimotor cortex. Establishing that these beta measures are stable over time in healthy populations is a necessary precursor to their use in the clinic. Here, we used scalp electroencephalography (EEG) to evaluate intra-individual reliability of beta-band oscillations over six sessions, focusing on changes in beta activity during movement (Movement-Related Beta Desynchronization, MRBD) and after movement termination (Post-Movement Beta Rebound, PMBR). Subjects performed visually-cued unimanual wrist flexion and extension. We assessed Intraclass Correlation Coefficients (ICC) and between-session correlations for spectral power and peak frequency measures of movement-related and resting beta activity. Movement-related and resting beta power from both sensorimotor cortices was highly reliable across sessions. Resting beta power yielded highest reliability (average ICC=0.903), followed by MRBD (average ICC=0.886) and PMBR (average ICC=0.663). Notably, peak frequency measures yielded lower ICC values compared to the assessment of spectral power, particularly for movement-related beta activity (ICC=0.386-0.402). Our data highlight that power measures of movement-related beta oscillations are highly reliable, while corresponding peak frequency measures show greater intra-individual variability across sessions. Importantly, our finding that beta power estimates show high intra-individual reliability over time serves to validate the notion that these measures reflect meaningful individual differences that can be utilised in basic research and clinical studies.

  5. Measurement and Reliability of Response Inhibition

    PubMed Central

    Congdon, Eliza; Mumford, Jeanette A.; Cohen, Jessica R.; Galvan, Adriana; Canli, Turhan; Poldrack, Russell A.

    2012-01-01

    Response inhibition plays a critical role in adaptive functioning and can be assessed with the Stop-signal task, which requires participants to suppress prepotent motor responses. Evidence suggests that this ability to inhibit a prepotent motor response (reflected as Stop-signal reaction time (SSRT)) is a quantitative and heritable measure of interindividual variation in brain function. Although attention has been given to the optimal method of SSRT estimation, and initial evidence exists in support of its reliability, there is still variability in how Stop-signal task data are treated across samples. In order to examine this issue, we pooled data across three separate studies and examined the influence of multiple SSRT calculation methods and outlier calling on reliability (using Intra-class correlation). Our results suggest that an approach which uses the average of all available sessions, all trials of each session, and excludes outliers based on predetermined lenient criteria yields reliable SSRT estimates, while not excluding too many participants. Our findings further support the reliability of SSRT, which is commonly used as an index of inhibitory control, and provide support for its continued use as a neurocognitive phenotype. PMID:22363308
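    A minimal sketch of one common ICC form (ICC(1), one-way random effects) applied to hypothetical SSRT estimates; the study's exact ICC variant and its data are not reproduced here.

```python
import numpy as np

def icc_oneway(scores):
    """ICC(1): one-way random-effects intraclass correlation.

    scores: array of shape (n_subjects, k_sessions).
    """
    n, k = scores.shape
    grand_mean = scores.mean()
    subject_means = scores.mean(axis=1)
    # Between-subject and within-subject mean squares from one-way ANOVA.
    ms_between = k * ((subject_means - grand_mean) ** 2).sum() / (n - 1)
    ms_within = ((scores - subject_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical SSRT estimates (ms) for five subjects over three sessions.
ssrt = np.array([
    [210.0, 205.0, 215.0],
    [250.0, 255.0, 248.0],
    [190.0, 198.0, 192.0],
    [230.0, 226.0, 233.0],
    [270.0, 262.0, 268.0],
])
icc = icc_oneway(ssrt)
print(f"ICC(1) = {icc:.3f}")
```

    Here between-subject variance dominates session-to-session noise, so the ICC is high, mirroring the reliable-trait interpretation the abstract argues for.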

  6. An Acoustic Charge Transport Imager for High Definition Television Applications: Reliability Modeling and Parametric Yield Prediction of GaAs Multiple Quantum Well Avalanche Photodiodes. Degree awarded Oct. 1997

    NASA Technical Reports Server (NTRS)

    Hunt, W. D.; Brennan, K. F.; Summers, C. J.; Yun, Ilgu

    1994-01-01

    Reliability modeling and parametric yield prediction of GaAs/AlGaAs multiple quantum well (MQW) avalanche photodiodes (APDs), which are of interest as an ultra-low noise image capture mechanism for high definition systems, have been investigated. First, the effect of various doping methods on the reliability of GaAs/AlGaAs multiple quantum well (MQW) avalanche photodiode (APD) structures fabricated by molecular beam epitaxy is investigated. Reliability is examined by accelerated life tests by monitoring dark current and breakdown voltage. Median device lifetime and the activation energy of the degradation mechanism are computed for undoped, doped-barrier, and doped-well APD structures. Lifetimes for each device structure are examined via a statistically designed experiment. Analysis of variance shows that dark-current is affected primarily by device diameter, temperature and stressing time, and breakdown voltage depends on the diameter, stressing time and APD type. It is concluded that the undoped APD has the highest reliability, followed by the doped well and doped barrier devices, respectively. To determine the source of the degradation mechanism for each device structure, failure analysis using the electron-beam induced current method is performed. This analysis reveals some degree of device degradation caused by ionic impurities in the passivation layer, and energy-dispersive spectrometry subsequently verified the presence of ionic sodium as the primary contaminant. However, since all device structures are similarly passivated, sodium contamination alone does not account for the observed variation between the differently doped APDs. This effect is explained by the dopant migration during stressing, which is verified by free carrier concentration measurements using the capacitance-voltage technique.

  7. The Z → cc̄ → γγ*, Z → bb̄ → γγ* triangle diagrams and the Z → γψ, Z → γΥ decays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Achasov, N. N., E-mail: achasov@math.nsc.ru

    2011-03-15

    The approach to the study of the Z → γψ and Z → γΥ decays is presented in detail, based on the sum rules for the Z → cc̄ → γγ* and Z → bb̄ → γγ* amplitudes and their derivatives. The branching ratios of the Z → γψ and Z → γΥ decays are calculated for different hypotheses on the saturation of the sum rules. The lower bounds Σ_ψ BR(Z → γψ) = 1.95 × 10⁻⁷ and Σ_Υ BR(Z → γΥ) = 7.23 × 10⁻⁷ are found. Deviations from the lower bounds are discussed, including the possibility that BR(Z → γJ/ψ(1S)) ≈ BR(Z → γΥ(1S)) ≈ 10⁻⁶, which could probably be measured at the LHC. The angular distributions in the Z → γψ and Z → γΥ decays are also calculated.

  8. Yield of illicit indoor cannabis cultivation in the Netherlands.

    PubMed

    Toonen, Marcel; Ribot, Simon; Thissen, Jac

    2006-09-01

    To obtain a reliable estimate of the yield of illicit indoor cannabis cultivation in the Netherlands, cannabis plants confiscated by the police were used to determine the yield of dried female flower buds. The developmental stage of the flower buds of the seized plants was described on a scale from 1 to 10, where a value of 10 indicates a fully developed flower bud ready for harvesting. Using eight additional characteristics describing the grow room and cultivation parameters, regression analysis with subset selection was carried out to develop two models for the yield of indoor cannabis cultivation. The median Dutch illicit grow room contains 259 cannabis plants, has a plant density of 15 plants/m(2), and 510 W of growth lamps per m(2). For the median Dutch grow room, the predicted yield of female flower buds at the harvestable developmental stage (stage 10) was 33.7 g/plant or 505 g/m(2).
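    The two reported figures for the median grow room are mutually consistent; a one-line check (numbers taken from the abstract):

```python
# Per-plant yield times plant density should reproduce the areal yield.
yield_per_plant_g = 33.7   # g/plant (reported)
plant_density_per_m2 = 15  # plants/m2 (reported)
yield_per_m2 = yield_per_plant_g * plant_density_per_m2
print(f"{yield_per_m2:.1f} g/m^2")  # ≈ 505 g/m2, matching the reported value
```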

  9. Governing Influence of Thermodynamic and Chemical Equilibria on the Interfacial Properties in Complex Fluids.

    PubMed

    Harikrishnan, A R; Dhar, Purbarun; Gedupudi, Sateesh; Das, Sarit K

    2018-04-12

    We propose a comprehensive analysis and a quasi-analytical mathematical formalism to predict the surface tension and contact angles of complex surfactant-infused nanocolloids. The model rests on the foundations of the interaction potentials for the interfacial adsorption-desorption dynamics in complex multicomponent colloids. Surfactant-infused nanoparticle-laden interface problems are difficult to deal with because of the many-body interactions and interfaces involved at the meso-nanoscales. The model is based on the governing role of thermodynamic and chemical equilibrium parameters in modulating the interfacial energies. The influence of parameters such as the presence of surfactants, nanoparticles, and surfactant-capped nanoparticles on interfacial dynamics is revealed by the analysis. Solely based on the knowledge of interfacial properties of independent surfactant solutions and nanocolloids, the same can be deduced for complex surfactant-based nanocolloids through the proposed approach. The model accurately predicts the equilibrium surface tension and contact angle of complex nanocolloids available in the existing literature and present experimental findings.

  10. Comparison of CEAS and Williams-type models for spring wheat yields in North Dakota and Minnesota

    NASA Technical Reports Server (NTRS)

    Barnett, T. L. (Principal Investigator)

    1982-01-01

    The CEAS and Williams-type yield models are both based on multiple regression analysis of historical time series data at CRD level. The CEAS model develops a separate relation for each CRD; the Williams-type model pools CRD data to regional level (groups of similar CRDs). Basic variables considered in the analyses are USDA yield, monthly mean temperature, monthly precipitation, and variables derived from these. The Williams-type model also used soil texture and topographic information. Technological trend is represented in both by piecewise linear functions of year. Indicators of yield reliability obtained from a ten-year bootstrap test of each model (1970-1979) demonstrate that the models are very similar in performance in all respects. Both models are about equally objective, adequate, timely, simple, and inexpensive. Both consider scientific knowledge on a broad scale but not in detail. Neither provides a good current measure of modeled yield reliability. The CEAS model is considered very slightly preferable for AgRISTARS applications.

  11. Quantifying the potential for reservoirs to secure future surface water yields in the world’s largest river basins

    NASA Astrophysics Data System (ADS)

    Liu, Lu; Parkinson, Simon; Gidden, Matthew; Byers, Edward; Satoh, Yusuke; Riahi, Keywan; Forman, Barton

    2018-04-01

    Surface water reservoirs provide us with reliable water supply, hydropower generation, flood control and recreation services. Yet reservoirs also cause flow fragmentation in rivers and lead to flooding of upstream areas, thereby displacing existing land-use activities and ecosystems. Anticipated population growth and development coupled with climate change in many regions of the globe suggests a critical need to assess the potential for future reservoir capacity to help balance rising water demands with long-term water availability. Here, we assess the potential of large-scale reservoirs to provide reliable surface water yields while also considering environmental flows within 235 of the world’s largest river basins. Maps of existing cropland and habitat conservation zones are integrated with spatially-explicit population and urbanization projections from the Shared Socioeconomic Pathways to identify regions unsuitable for increasing water supply by exploiting new reservoir storage. Results show that even when maximizing the global reservoir storage to its potential limit (∼4.3–4.8 times the current capacity), firm yields would only increase by about 50% over current levels. However, there exist large disparities across different basins. The majority of river basins in North America are found to gain relatively little firm yield by increasing storage capacity, whereas basins in Southeast Asia display greater potential for expansion as well as proportional gains in firm yield under multiple uncertainties. Parts of Europe, the United States and South America show relatively low reliability of maintaining current firm yields under future climate change, whereas most of Asia and higher latitude regions display comparatively high reliability. Findings from this study highlight the importance of incorporating different factors, including human development, land-use activities, and climate change, over a time span of multiple decades and across a range of different

  12. Assessing disease stress and modeling yield losses in alfalfa

    NASA Astrophysics Data System (ADS)

    Guan, Jie

    Alfalfa is the most important forage crop in the U.S. and worldwide. Fungal foliar diseases are believed to cause significant yield losses in alfalfa, yet little quantitative information exists regarding the amount of crop loss. Different fungicides and application frequencies were used as tools to generate a range of foliar disease intensities in Ames and Nashua, IA. Visual disease assessments (disease incidence, disease severity, and percentage defoliation) were obtained weekly for each alfalfa growth cycle (two to three growing cycles per season). Remote sensing assessments were performed using a hand-held, multispectral radiometer to measure the amount and quality of sunlight reflected from alfalfa canopies. Factors such as incident radiation, sun angle, sensor height, and leaf wetness were all found to significantly affect the percentage of sunlight reflected from alfalfa canopies. The precision of the visual and remote sensing assessment methods was quantified, where precision was defined as the intra-rater repeatability and inter-rater reliability of an assessment method. F-tests, slopes, intercepts, and coefficients of determination (R2) were used to compare assessment methods for precision. Results showed that among the three visual disease assessment methods (disease incidence, disease severity, and percentage defoliation), percentage defoliation had the highest intra-rater repeatability and inter-rater reliability. The remote sensing assessment method had better precision than the percentage defoliation assessment method, based upon higher intra-rater repeatability and inter-rater reliability. Significant linear relationships between canopy reflectance (810 nm), percentage defoliation, and yield were detected using linear regression, and percentage reflectance (810 nm) assessments were found to have a stronger relationship with yield than percentage defoliation assessments. There were also significant linear relationships between percentage defoliation, dry

  13. The optimal duration of frequency-volume charts related to compliance and reliability.

    PubMed

    van Haarst, Ernst P; Bosch, J L H Ruud

    2014-03-01

    To assess frequency-volume charts (FVCs) for the yield of additional recorded days and the ideal duration of recording related to compliance and reliability. Of 500 consecutive urologic outpatients willing to complete a 7-day FVC, 378 FVCs were evaluable. During seven consecutive days, every voiding time and volume were recorded. Missed entries were indicated with a coded letter, thereby assessing the true frequency and compliance. Reliability is the agreement of the day-to-day FVC parameters with the 7-day FVC pattern. Single-day reliability was assessed and used in the Spearman-Brown formula. FVCs of 228 males and 150 females were evaluated. Mean age was 55.2 years (standard deviation [SD]: 16.2 years), and mean 24-hr urine production was 1,856 ml (SD: 828 ml). The percentage of patients with complete FVCs decreased from 78% on day 2 to 58% on day 7, and dropped below 70% after 4 days. Single-day reliability was r = 0.63 for nocturnal urine production, r = 0.72 for 24-hr urine production, and r = 0.80 for mean voided volume. At 5 days, reliability of 90% was achieved for all parameters. With each additional day, FVCs showed a decrease in compliance and an increase in reliability. At day 3, reliability of 80% was achieved for all FVC parameters, but compliance dropped to 73%. Beyond 5 days, the yield of additional recorded days was limited. We advocate an FVC duration of 3 days, but the duration may be shortened or extended depending on the goal of the FVC. © 2013 Wiley Periodicals, Inc.
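
    The Spearman-Brown step used above extrapolates single-day reliability to a multi-day chart. A minimal sketch applying the formula to the single-day r values reported in the abstract (the function name is ours):

    ```python
    def spearman_brown(r_single, n_days):
        """Spearman-Brown prophecy formula: predicted reliability of an
        average over n_days, given single-day reliability r_single."""
        return n_days * r_single / (1 + (n_days - 1) * r_single)

    # Single-day reliabilities reported in the abstract, extended to 5 days
    for name, r1 in [("nocturnal urine production", 0.63),
                     ("24-hr urine production", 0.72),
                     ("mean voided volume", 0.80)]:
        print(f"{name}: 5-day reliability = {spearman_brown(r1, 5):.2f}")
    ```

    With these inputs the 5-day predictions all reach roughly 0.89 or higher, consistent with the abstract's statement that 90% reliability is achieved at 5 days.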

  14. Predicting red meat yields in carcasses from beef-type and calf-fed Holstein steers using the United States Department of Agriculture calculated yield grade.

    PubMed

    Lawrence, T E; Elam, N A; Miller, M F; Brooks, J C; Hilton, G G; VanOverbeke, D L; McKeith, F K; Killefer, J; Montgomery, T H; Allen, D M; Griffin, D B; Delmore, R J; Nichols, W T; Streeter, M N; Yates, D A; Hutcheson, J P

    2010-06-01

    Analyses were conducted to evaluate the ability of the USDA yield grade equation to detect differences in subprimal yield of beef-type steers and calf-fed Holstein steers that had been fed zilpaterol hydrochloride (ZH; Intervet Inc., Millsboro, DE) as well as those that had not been fed ZH. Beef-type steer (n = 801) and calf-fed Holstein steer (n = 235) carcasses were fabricated into subprimal cuts and trim. Simple correlations between calculated yield grades and total red meat yields ranged from -0.56 to -0.62 for beef-type steers. Reliable correlations from calf-fed Holstein steers were unobtainable; the probability of a type I error met or exceeded 0.39. Linear models were developed for the beef-type steers to predict total red meat yield based on calculated USDA yield grade within each ZH duration. At an average calculated USDA yield grade of 2.9, beef-type steer carcasses that had not been fed ZH had an estimated 69.4% red meat yield, whereas those fed ZH had an estimated 70.7% red meat yield. These results indicate that feeding ZH increased red meat yield by 1.3% at a constant calculated yield grade. However, these data also suggest that the calculated USDA yield grade score is a poor and variable estimator (adjusted R(2) of 0.31 to 0.38) of total red meat yield of beef-type steer carcasses, regardless of ZH feeding. Moreover, no relationship existed (adjusted R(2) of 0.00 to 0.01) for calf-fed Holstein steer carcasses, suggesting the USDA yield grade is not a valid estimate of calf-fed Holstein red meat yield.

  15. Scale for positive aspects of caregiving experience: development, reliability, and factor structure.

    PubMed

    Kate, N; Grover, S; Kulhara, P; Nehra, R

    2012-06-01

    OBJECTIVE. To develop an instrument (Scale for Positive Aspects of Caregiving Experience [SPACE]) that evaluates positive caregiving experience and assess its psychometric properties. METHODS. Available scales which assess some aspects of positive caregiving experience were reviewed and a 50-item questionnaire with a 5-point rating was constructed. In all, 203 primary caregivers of patients with severe mental disorders were asked to complete the questionnaire. Internal consistency, test-retest reliability, cross-language reliability, split-half reliability, and face validity were evaluated. Principal component factor analysis was run to assess the factorial validity of the scale. RESULTS. The scale developed as part of the study was found to have good internal consistency, test-retest reliability, cross-language reliability, split-half reliability, and face validity. Principal component factor analysis yielded a 4-factor structure, which also had good test-retest reliability and cross-language reliability. There was a strong correlation between the 4 factors obtained. CONCLUSION. The SPACE developed as part of this study has good psychometric properties.
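
    Internal consistency of the kind evaluated for the SPACE is typically quantified with Cronbach's alpha. A minimal, self-contained sketch on invented toy scores (not the study's data):

    ```python
    def cronbach_alpha(items):
        """Cronbach's alpha. items: k lists (one per scale item),
        each holding the same n respondents' scores."""
        k = len(items)
        n = len(items[0])

        def var(xs):
            # Sample variance
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

        # Per-respondent total scores across all items
        totals = [sum(item[j] for item in items) for j in range(n)]
        return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

    toy_items = [[1, 2, 3, 4], [2, 3, 4, 5], [1, 3, 3, 5]]  # invented scores
    print(round(cronbach_alpha(toy_items), 2))
    ```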

  16. Reliability reporting across studies using the Buss Durkee Hostility Inventory.

    PubMed

    Vassar, Matt; Hale, William

    2009-01-01

    Empirical research on anger and hostility has pervaded the academic literature for more than 50 years. Accurate measurement of anger/hostility and subsequent interpretation of results requires that the instruments yield strong psychometric properties. For consistent measurement, reliability estimates must be calculated with each administration, because changes in sample characteristics may alter the scale's ability to generate reliable scores. Therefore, the present study was designed to address reliability reporting practices for a widely used anger assessment, the Buss Durkee Hostility Inventory (BDHI). Of the 250 published articles reviewed, 11.2% calculated and presented reliability estimates for the data at hand, 6.8% cited estimates from a previous study, and 77.1% made no mention of score reliability. Mean alpha estimates of scores for BDHI subscales generally fell below acceptable standards. Additionally, no detectable pattern was found between reporting practices and publication year or journal prestige. Areas for future research are also discussed.

  17. Hybrid radioguided occult lesion localization (hybrid ROLL) of (18)F-FDG-avid lesions using the hybrid tracer indocyanine green-(99m)Tc-nanocolloid.

    PubMed

    KleinJan, G H; Brouwer, O R; Mathéron, H M; Rietbergen, D D D; Valdés Olmos, R A; Wouters, M W; van den Berg, N S; van Leeuwen, F W B

    2016-01-01

    To assess if combined fluorescence- and radio-guided occult lesion localization (hybrid ROLL) is feasible in patients scheduled for surgical resection of non-palpable (18)F-FDG-avid lesions on PET/CT. Four patients with (18)F-FDG-avid lesions on follow-up PET/CT that were not palpable during physical examination but were suspected to harbor metastasis were enrolled. Guided by ultrasound, the hybrid tracer indocyanine green (ICG)-(99m)Tc-nanocolloid was injected centrally in the target lesion. SPECT/CT imaging was used to confirm tracer deposition. Intraoperatively, lesions were localized using a hand-held gamma ray detection probe, a portable gamma camera, and a fluorescence camera. After excision, the gamma camera was used to check the wound bed for residual activity. A total of six (18)F-FDG-avid lymph nodes were identified and scheduled for hybrid ROLL. Comparison of the PET/CT images with the acquired SPECT/CT after hybrid tracer injection confirmed accurate tracer deposition. No side effects were observed. Combined radio- and fluorescence-guidance enabled localization and excision of the target lesion in all patients. Five of the six excised lesions proved tumor-positive at histopathology. The hybrid ROLL approach appears to be feasible and can facilitate the intraoperative localization and excision of non-palpable lesions suspected to harbor tumor metastases. In addition to the initial radioguided detection, the fluorescence component of the hybrid tracer enables high-resolution intraoperative visualization of the target lesion. The procedure needs further evaluation in a larger cohort and wider range of malignancies to substantiate these preliminary findings. Copyright © 2016 Elsevier España, S.L.U. y SEMNIM. All rights reserved.

  18. The (un)reliability of item-level semantic priming effects.

    PubMed

    Heyman, Tom; Bruninx, Anke; Hutchison, Keith A; Storms, Gert

    2018-04-05

    Many researchers have tried to predict semantic priming effects using a myriad of variables (e.g., prime-target associative strength or co-occurrence frequency). The idea is that relatedness varies across prime-target pairs, which should be reflected in the size of the priming effect (e.g., cat should prime dog more than animal does). However, it is only insightful to predict item-level priming effects if they can be measured reliably. Thus, in the present study we examined the split-half and test-retest reliabilities of item-level priming effects under conditions that should discourage the use of strategies. The resulting priming effects proved extremely unreliable, and reanalyses of three published priming datasets revealed similar cases of low reliability. These results imply that previous attempts to predict semantic priming were unlikely to be successful. However, one study with an unusually large sample size yielded more favorable reliability estimates, suggesting that big data, in terms of items and participants, should be the future for semantic priming research.
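
    The split-half procedure described above can be sketched as follows: per-item priming effects are averaged within two random halves of the participants, the item means are correlated across halves, and the correlation is corrected to full length with the Spearman-Brown formula. All data and names here are illustrative, not the study's:

    ```python
    import random

    def pearson(x, y):
        """Pearson correlation of two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)

    def split_half_reliability(effects, rng):
        """effects: one list per item of per-participant priming effects
        (unrelated RT minus related RT). Participants are split at random
        into halves; item means are correlated across halves and corrected
        to full length with the Spearman-Brown formula."""
        n = len(effects[0])
        idx = list(range(n))
        rng.shuffle(idx)
        half1, half2 = idx[:n // 2], idx[n // 2:]
        m1 = [sum(e[i] for i in half1) / len(half1) for e in effects]
        m2 = [sum(e[i] for i in half2) / len(half2) for e in effects]
        r = pearson(m1, m2)
        return 2 * r / (1 + r)  # Spearman-Brown correction
    ```

    With strong item-level signal relative to participant noise the corrected correlation approaches 1; the study's point is that with realistic priming data it instead falls near zero.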

  19. Reliability-based structural optimization: A proposed analytical-experimental study

    NASA Technical Reports Server (NTRS)

    Stroud, W. Jefferson; Nikolaidis, Efstratios

    1993-01-01

    An analytical and experimental study for assessing the potential of reliability-based structural optimization is proposed and described. In the study, competing designs obtained by deterministic and reliability-based optimization are compared. The experimental portion of the study is practical because the structure selected is a modular, actively and passively controlled truss that consists of many identical members, and because the competing designs are compared in terms of their dynamic performance and are not destroyed if failure occurs. The analytical portion of this study is illustrated on a 10-bar truss example. In the illustrative example, it is shown that reliability-based optimization can yield a design that is superior to an alternative design obtained by deterministic optimization. These analytical results provide motivation for the proposed study, which is underway.

  20. Statistical Evaluations of Variations in Dairy Cows’ Milk Yields as a Precursor of Earthquakes

    PubMed Central

    Yamauchi, Hiroyuki; Hayakawa, Masashi; Asano, Tomokazu; Ohtani, Nobuyo; Ohta, Mitsuaki

    2017-01-01

    Simple Summary There are many reports of abnormal changes occurring in various natural systems prior to earthquakes. Unusual animal behavior is one of these abnormalities; however, there are few objective indicators, and to date their reliability has remained uncertain. We found that milk yields of dairy cows decreased prior to an earthquake in our previous case study. In this study, we examined the reliability of decreases in milk yields as a precursor for earthquakes using long-term observation data. In the results, milk yields decreased approximately three weeks before earthquakes. We conclude that dairy cow milk yields have applicability as an objectively observable unusual animal behavior prior to earthquakes, and that dairy cows respond to some physical or chemical precursors of earthquakes. Abstract Previous studies have provided quantitative data regarding unusual animal behavior prior to earthquakes; however, few studies include long-term, observational data. Our previous study revealed that the milk yields of dairy cows decreased prior to an extremely large earthquake. To clarify whether milk yields decrease prior to earthquakes, we examined the relationship between earthquakes of various magnitudes and daily milk yields. The observation period was one year. In the results, cross-correlation analyses revealed a significant negative correlation between earthquake occurrence and milk yields approximately three weeks beforehand. Approximately a week and a half beforehand, a positive correlation was revealed, and the correlation gradually receded to zero as the day of the earthquake approached. Future studies that use data from a longer observation period are needed because this study only considered ten earthquakes and therefore does not have strong statistical power. Additionally, we compared the milk yields with the subionospheric very low frequency/low frequency (VLF/LF) propagation data indicating ionospheric perturbations. The results showed

  1. Plausible rice yield losses under future climate warming.

    PubMed

    Zhao, Chuang; Piao, Shilong; Wang, Xuhui; Huang, Yao; Ciais, Philippe; Elliott, Joshua; Huang, Mengtian; Janssens, Ivan A; Li, Tao; Lian, Xu; Liu, Yongwen; Müller, Christoph; Peng, Shushi; Wang, Tao; Zeng, Zhenzhong; Peñuelas, Josep

    2016-12-19

    Rice is the staple food for more than 50% of the world's population [1-3]. Reliable prediction of changes in rice yield is thus central for maintaining global food security. This is an extraordinary challenge. Here, we compare the sensitivity of rice yield to temperature increase derived from field warming experiments and three modelling approaches: statistical models, local crop models and global gridded crop models. Field warming experiments produce a substantial rice yield loss under warming, with an average temperature sensitivity of -5.2 ± 1.4% K⁻¹. Local crop models give a similar sensitivity (-6.3 ± 0.4% K⁻¹), but statistical and global gridded crop models both suggest less negative impacts of warming on yields (-0.8 ± 0.3% and -2.4 ± 3.7% K⁻¹, respectively). Using data from field warming experiments, we further propose a conditional probability approach to constrain the large range of global gridded crop model results for the future yield changes in response to warming by the end of the century (from -1.3% to -9.3% K⁻¹). The constraint implies a more negative response to warming (-8.3 ± 1.4% K⁻¹) and reduces the spread of the model ensemble by 33%. This yield reduction exceeds that estimated by the International Food Policy Research Institute assessment (-4.2 to -6.4% K⁻¹) (ref. 4). Our study suggests that without CO₂ fertilization, effective adaptation and genetic improvement, severe rice yield losses are plausible under intensive climate warming scenarios.

  2. Splitting parameter yield (SPY): A program for semiautomatic analysis of shear-wave splitting

    NASA Astrophysics Data System (ADS)

    Zaccarelli, Lucia; Bianco, Francesca; Zaccarelli, Riccardo

    2012-03-01

    SPY is a Matlab algorithm that analyzes seismic waveforms in a semiautomatic way, providing estimates of the two observables of the anisotropy: the shear-wave splitting parameters. We chose to exploit those computational processes that require less intervention by the user, gaining objectivity and reliability as a result. The algorithm joins the covariance matrix and the cross-correlation techniques, and all the computation steps are interspersed by several automatic checks intended to verify the reliability of the yields. The resulting semiautomation generates two new advantages in the field of anisotropy studies: handling a huge amount of data at the same time, and comparing different yields. From this perspective, SPY has been developed in the Matlab environment, which is widespread, versatile, and user-friendly. Our intention is to provide the scientific community with a new monitoring tool for tracking the temporal variations of the crustal stress field.
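
    The delay-time splitting parameter that SPY estimates via cross-correlation can be illustrated in miniature: the lag maximizing the cross-correlation between the fast and slow shear-wave components gives the time delay in samples. This is a bare-bones Python sketch with names of our choosing, not the SPY implementation, which operates on windowed, filtered seismograms with a normalized correlation:

    ```python
    def estimate_delay(fast, slow, max_lag):
        """Return the lag d (in samples) maximizing the cross-correlation
        sum(fast[i] * slow[i + d]); d > 0 means `slow` lags `fast`."""
        best_d, best = 0, float("-inf")
        for d in range(-max_lag, max_lag + 1):
            s = sum(fast[i] * slow[i + d]
                    for i in range(len(fast)) if 0 <= i + d < len(slow))
            if s > best:
                best, best_d = s, d
        return best_d

    # Triangular test pulse and a copy of it delayed by 5 samples
    pulse = [max(0.0, 10 - abs(i - 30)) for i in range(100)]
    delayed = [0.0] * 5 + pulse[:-5]
    print(estimate_delay(pulse, delayed, 10))  # prints 5
    ```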

  3. Incorporating uncertainty into the ranking of SPARROW model nutrient yields from Mississippi/Atchafalaya River basin watersheds

    USGS Publications Warehouse

    Robertson, Dale M.; Schwarz, Gregory E.; Saad, David A.; Alexander, Richard B.

    2009-01-01

    Excessive loads of nutrients transported by tributary rivers have been linked to hypoxia in the Gulf of Mexico. Management efforts to reduce the hypoxic zone in the Gulf of Mexico and improve the water quality of rivers and streams could benefit from targeting nutrient reductions toward watersheds with the highest nutrient yields delivered to sensitive downstream waters. One challenge is that most conventional watershed modeling approaches (e.g., mechanistic models) used in these management decisions do not consider uncertainties in the predictions of nutrient yields and their downstream delivery. The increasing use of parameter estimation procedures to statistically estimate model coefficients, however, allows uncertainties in these predictions to be reliably estimated. Here, we use a robust bootstrapping procedure applied to the results of a previous application of the hybrid statistical/mechanistic watershed model SPARROW (Spatially Referenced Regression On Watershed attributes) to develop a statistically reliable method for identifying “high priority” areas for management, based on a probabilistic ranking of delivered nutrient yields from watersheds throughout a basin. The method is designed to be used by managers to prioritize watersheds where additional stream monitoring and evaluations of nutrient-reduction strategies could be undertaken. Our ranking procedure incorporates information on the confidence intervals of model predictions and the corresponding watershed rankings of the delivered nutrient yields. From this quantified uncertainty, we estimate the probability that individual watersheds are among a collection of watersheds that have the highest delivered nutrient yields. We illustrate the application of the procedure to 818 eight-digit Hydrologic Unit Code watersheds in the Mississippi/Atchafalaya River basin by identifying 150 watersheds having the highest delivered nutrient yields to the Gulf of Mexico. Highest delivered yields were from
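
    The probabilistic ranking can be sketched in miniature: resample each watershed's delivered yield, rank the draws, and count how often a watershed lands in the top k. This toy version draws parametrically from a normal distribution as a stand-in for the paper's SPARROW bootstrap replicates; all names and numbers are illustrative:

    ```python
    import random

    def prob_in_top_k(yield_means, yield_ses, k, n_boot=2000, seed=42):
        """Estimate, for each watershed, the probability that its delivered
        yield ranks among the top k, by repeatedly resampling every yield
        from a normal distribution (mean, standard error)."""
        rng = random.Random(seed)
        n = len(yield_means)
        counts = [0] * n
        for _ in range(n_boot):
            draws = [rng.gauss(m, se) for m, se in zip(yield_means, yield_ses)]
            # Indices of the k largest draws in this replicate
            for i in sorted(range(n), key=lambda i: draws[i], reverse=True)[:k]:
                counts[i] += 1
        return [c / n_boot for c in counts]

    # Four hypothetical watersheds; the first two clearly dominate
    print(prob_in_top_k([10.0, 9.0, 1.0, 0.5], [1.0, 1.0, 1.0, 1.0], k=2))
    ```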

  4. Reliable Digit Span: A Systematic Review and Cross-Validation Study

    ERIC Educational Resources Information Center

    Schroeder, Ryan W.; Twumasi-Ankrah, Philip; Baade, Lyle E.; Marshall, Paul S.

    2012-01-01

    Reliable Digit Span (RDS) is a heavily researched symptom validity test with a recent literature review yielding more than 20 studies ranging in dates from 1994 to 2011. Unfortunately, limitations within some of the research minimize clinical generalizability. This systematic review and cross-validation study was conducted to address these…

  5. Clinical trial of combined radio- and fluorescence-guided sentinel lymph node biopsy in breast cancer

    PubMed Central

    Schaafsma, Boudewijn E.; Verbeek, Floris P.R.; Rietbergen, Daphne D.D.; van der Hiel, Bernies; van der Vorst, Joost R.; Liefers, Gerrit-Jan; Frangioni, John V.; van de Velde, Cornelis J.H.; van Leeuwen, Fijs W.B.; Vahrmeijer, Alexander L.

    2013-01-01

    Background Combining radioactive colloids and a near-infrared (NIR) fluorophore permits preoperative planning and intraoperative localization of deeply located sentinel lymph nodes (SLNs) with direct optical guidance by a single lymphatic tracer. The aim of this clinical trial was to evaluate and optimize a hybrid NIR fluorescence and radioactive tracer for SLN detection in breast cancer patients. Method Patients with breast cancer undergoing SLN biopsy were enrolled. The day before surgery, indocyanine green (ICG)-99mTc-Nanocolloid was injected periareolarly and a lymphoscintigram was acquired. Directly before surgery, blue dye was injected. Intraoperative SLN localization was performed by a gamma probe and the Mini-FLARE™ NIR fluorescence imaging system. Patients were divided into two dose groups, with one group receiving twice the particle density of ICG and nanocolloid, but the same dose of radioactive 99mTc. Results Thirty-two patients were enrolled in the trial. At least one SLN was identified pre- and intraoperatively. All 48 axillary SLNs could be detected by gamma tracing and NIR fluorescence imaging, but only 42 of them stained blue. NIR fluorescence permitted detection of lymphatic vessels draining to the SLN up to 29 hours after injection. Increasing the particle density two-fold did not yield a difference in fluorescence intensity, median 255 (range 98 – 542) vs. median 284 (90 – 921; P = 0.590), or signal-to-background ratio, median 5.4 (range 3.0 – 15.4) vs. median 4.9 (3.5 – 16.3; P = 1.000), of the SLN. Conclusion The hybrid NIR fluorescence and radioactive tracer ICG-99mTc-Nanocolloid permitted accurate pre- and intraoperative detection of the SLNs in patients with breast cancer. PMID:23696463

  6. Reliable bonding using indium-based solders

    NASA Astrophysics Data System (ADS)

    Cheong, Jongpil; Goyal, Abhijat; Tadigadapa, Srinivas; Rahn, Christopher

    2004-01-01

    Low temperature bonding techniques with high bond strengths and reliability are required for the fabrication and packaging of MEMS devices. Indium and indium-tin based bonding processes are explored for the fabrication of a flextensional MEMS actuator, which requires the integration of lead zirconate titanate (PZT) substrate with a silicon micromachined structure at low temperatures. The developed technique can be used either for wafer or chip level bonding. The lithographic steps used for the patterning and delineation of the seed layer limit the resolution of this technique. Using this technique, reliable bonds were achieved at a temperature of 200°C. The bonds yielded an average tensile strength of 5.41 MPa and 7.38 MPa for samples using indium and indium-tin alloy solders as the intermediate bonding layers, respectively. The bonds (with line width of 100 microns) showed hermetic sealing capability of better than 10⁻¹¹ mbar·l/s when tested using a commercial helium leak tester.

  8. Magnetoviscoelastic characteristics of superparamagnetic oxides (Fe, Ni) based ferrofluids

    NASA Astrophysics Data System (ADS)

    Katiyar, Ajay; Dhar, Purbarun; Nandi, Tandra; Das, Sarit K.

    2017-08-01

    Ferrofluids have been popular among the academic and scientific communities owing to their intelligent physical characteristics under external stimuli and are in fact among the first nanotechnology products to be employed in real world applications. However, studies on the magnetoviscoelastic behavior of concentrated ferrofluids, especially of superparamagnetic oxides of iron and nickel, are rare. The present article comprises the formulation of magneto-colloids in oil utilizing three metal oxide nanoparticles: iron (II, III) oxide (Fe3O4), iron (III) oxide (Fe2O3), and nickel oxide (NiO). Iron (II, III) oxide based colloids demonstrate higher magnetoviscous characteristics than the other oxide-based colloids under external magnetic fields. The maximum magnitudes of yield stress and viscosity are found to be 3.0 kPa and 2.9 kPa·s, respectively, for iron (II, III) oxide based colloids at 2.6 vol% particle concentration and 1.2 T magnetic field. Experimental investigations reveal that the formulated magneto-nanocolloids are stable, even in high magnetic fields, and nearly reversible under rising and falling magnetic fields of the same magnitude. Observations also reveal that the elastic behavior dominates over the viscous behavior, with enhanced relaxation and creep characteristics under the magnetic field. The effect of temperature on the viscosity and yield stress of magneto-nanocolloids under magnetic fields has also been discussed. The present findings thus have potential applications in various fields such as automotive electromagnetic clutches and brakes, damping, sealing, optics, and nanofinishing.

  9. Modelling crop yield in Iberia under drought conditions

    NASA Astrophysics Data System (ADS)

    Ribeiro, Andreia; Páscoa, Patrícia; Russo, Ana; Gouveia, Célia

    2017-04-01

    The improved assessment of cereal yield and crop loss under drought conditions is essential to meet increasing economic demands. The growing frequency and severity of extreme drought conditions in the Iberian Peninsula (IP) has likely been responsible for negative impacts on agriculture, namely crop yield losses. Therefore, continuous monitoring of vegetation activity and a reliable estimation of drought impacts are crucial contributions to agricultural drought management and the development of suitable information tools. This work aims to assess the influence of drought conditions on agricultural yields over the IP, considering cereal yields from mainly rainfed agriculture for the provinces with higher productivity. The main target is to develop a strategy to model drought risk on agriculture for wheat yield at the province level. To achieve this goal, a combined assessment was made using a drought indicator (Standardized Precipitation Evapotranspiration Index, SPEI) to evaluate drought conditions together with a widely used vegetation index (Normalized Difference Vegetation Index, NDVI) to monitor vegetation activity. A correlation analysis between detrended wheat yield and SPEI was performed to assess the vegetation response at each time scale of drought occurrence and to identify the moment of the vegetative cycle when crop yields are most vulnerable to drought conditions. The time scales and months of SPEI, together with the months of NDVI, best related with wheat yield were chosen to perform a multivariate regression analysis to simulate crop yield. Model results are satisfactory and highlight the usefulness of such analysis in the framework of developing a drought risk model for crop yields. From an operational point of view, the results aim to contribute to an improved understanding of crop yield management under dry conditions, particularly adding substantial information on the advantages of combining

  10. Hyperspectral sensing to detect the impact of herbicide drift on cotton growth and yield

    NASA Astrophysics Data System (ADS)

    Suarez, L. A.; Apan, A.; Werth, J.

    2016-10-01

    Yield loss in crops is often associated with plant disease or external factors such as environment, water supply and nutrient availability. Improper agricultural practices can also introduce risks into the equation. Herbicide drift can be a combination of improper practices and environmental conditions which can create a potential yield loss. As traditional assessment of plant damage is often imprecise and time consuming, the ability of remote and proximal sensing techniques to monitor various bio-chemical alterations in the plant may offer a faster, non-destructive and reliable approach to predict yield loss caused by herbicide drift. This paper examines the prediction capabilities of partial least squares regression (PLS-R) models for estimating yield. Models were constructed with hyperspectral data of a cotton crop sprayed with three simulated doses of the phenoxy herbicide 2,4-D at three different growth stages. Fibre quality, photosynthesis, conductance, and two main hormones, indole acetic acid (IAA) and abscisic acid (ABA) were also analysed. Except for fibre quality and ABA, Spearman correlations have shown that these variables were highly affected by the chemical. Four PLS-R models for predicting yield were developed according to four timings of data collection: 2, 7, 14 and 28 days after the exposure (DAE). As indicated by the model performance, the analysis revealed that 7 DAE was the best time for data collection purposes (RMSEP = 2.6 and R2 = 0.88), followed by 28 DAE (RMSEP = 3.2 and R2 = 0.84). In summary, the results of this study show that it is possible to accurately predict yield after a simulated herbicide drift of 2,4-D on a cotton crop, through the analysis of hyperspectral data, thereby providing a reliable, effective and non-destructive alternative based on the internal response of the cotton leaves.
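
    The two goodness-of-fit measures quoted above (RMSEP and R2) are straightforward to compute from observed and predicted yields. A minimal sketch on invented numbers:

    ```python
    def rmsep(observed, predicted):
        """Root mean square error of prediction."""
        n = len(observed)
        return (sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n) ** 0.5

    def r_squared(observed, predicted):
        """Coefficient of determination of predictions against observations."""
        mean_obs = sum(observed) / len(observed)
        ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
        ss_tot = sum((o - mean_obs) ** 2 for o in observed)
        return 1 - ss_res / ss_tot

    obs = [1.0, 2.0, 3.0, 4.0]      # illustrative yields
    pred = [1.1, 1.9, 3.2, 3.8]     # illustrative model predictions
    print(round(rmsep(obs, pred), 3), round(r_squared(obs, pred), 2))
    ```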

  11. Prediction of kharif rice yield at Kharagpur using disaggregated extended range rainfall forecasts

    NASA Astrophysics Data System (ADS)

    Dhekale, B. S.; Nageswararao, M. M.; Nair, Archana; Mohanty, U. C.; Swain, D. K.; Singh, K. K.; Arunbabu, T.

    2017-08-01

    The Extended Range Forecast System (ERFS) has been generating monthly and seasonal forecasts in real time throughout the year over India since 2009. India is one of the major rice producers and consumers in South Asia; more than 50% of the Indian population depends on rice as a staple food. Rice is mainly grown in the kharif season, which contributes 84% of the country's total annual rice production. Rice cultivation in India is largely rainfed, so the reliability of rainfall forecasts plays a crucial role in planning the kharif rice crop. In the present study, an attempt has been made to test the reliability of seasonal and sub-seasonal ERFS summer monsoon rainfall forecasts for kharif rice yield prediction at Kharagpur, West Bengal using the CERES-Rice (DSSAT v4.5) model. The ERFS forecasts are produced as monthly and seasonal mean values and are converted into daily sequences with stochastic weather generators for use with crop growth models. The daily sequences are generated from ERFS seasonal (June-September) and sub-seasonal (July-September, August-September, and September) summer monsoon rainfall forecasts and used as input to the CERES-Rice crop simulation model for yield prediction in hindcast (1985-2008) and real-time (2009-2015) modes. The yield simulated using India Meteorological Department (IMD) observed daily rainfall data is taken as the baseline for evaluating the performance of yields predicted using the ERFS forecasts. The findings revealed that stochastic disaggregation can be used to convert the monthly/seasonal ERFS forecasts into daily sequences. 
The year-to-year variability in rice yield at Kharagpur is efficiently predicted by using the ERFS forecast products in hindcast as well as real time, and significant enhancement in prediction skill is noticed with advancement of the season due to the incorporation of observed weather data, which reduces uncertainty of
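
    The stochastic disaggregation step described above turns a monthly or seasonal rainfall total into a plausible daily sequence for the crop model. A toy illustration of the idea (this is not the ERFS/DSSAT weather generator; the Bernoulli wet-day occurrence and exponential amount model are illustrative assumptions):

```python
import numpy as np

def disaggregate_month(total_mm, n_days=30, wet_prob=0.4, seed=0):
    """Split a monthly rainfall total into a random daily sequence.
    Wet days are drawn as Bernoulli(wet_prob); exponential amounts on
    wet days are rescaled so the sequence sums exactly to total_mm."""
    rng = np.random.default_rng(seed)
    wet = rng.random(n_days) < wet_prob
    if not wet.any():                    # guarantee at least one wet day
        wet[rng.integers(n_days)] = True
    amounts = np.zeros(n_days)
    raw = rng.exponential(1.0, size=int(wet.sum()))
    amounts[wet] = raw * (total_mm / raw.sum())
    return amounts

daily = disaggregate_month(220.0)        # e.g. a 220 mm monthly forecast
print(round(daily.sum(), 6))             # → 220.0 (total is preserved)
```

    A real weather generator would condition wet-day probability and amount distributions on station climatology; the rescaling step is what makes the daily sequence consistent with the forecast total.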

  12. Spatial cue reliability drives frequency tuning in the barn Owl's midbrain

    PubMed Central

    Cazettes, Fanny; Fischer, Brian J; Pena, Jose L

    2014-01-01

    The robust representation of the environment from unreliable sensory cues is vital for the efficient function of the brain. However, how neural processing captures the most reliable cues is unknown. The interaural time difference (ITD) is the primary cue to localize sound in horizontal space. ITD is encoded in the firing rate of neurons that detect interaural phase difference (IPD). Due to the filtering effect of the head, the IPD for a given location varies depending on the environmental context. We found that, in barn owls, at each location there is a frequency range where the head filtering yields the most reliable IPDs across contexts. Remarkably, the frequency tuning of space-specific neurons in the owl's midbrain varies with their preferred sound location, matching the range that carries the most reliable IPD. Thus, frequency tuning in the owl's space-specific neurons reflects a higher-order feature of the code that captures cue reliability. DOI: http://dx.doi.org/10.7554/eLife.04854.001 PMID:25531067

  13. Statistical methodology: II. Reliability and validity assessment in study design, Part B.

    PubMed

    Karras, D J

    1997-02-01

    Validity measures the correspondence between a test and other purported measures of the same or similar qualities. When a reference standard exists, a criterion-based validity coefficient can be calculated. If no such standard is available, the concepts of content and construct validity may be used, but quantitative analysis may not be possible. The Pearson and Spearman tests of correlation are often used to assess the correspondence between tests, but do not account for measurement biases and may yield misleading results. Techniques that measure intertest differences may be more meaningful in validity assessment, and the kappa statistic is useful for analyzing categorical variables. Questionnaires can often be designed to allow quantitative assessment of reliability and validity, although this may be difficult. Inclusion of homogeneous questions is necessary to assess reliability. Analysis is enhanced by using Likert scales or similar techniques that yield ordinal data. Validity assessment of questionnaires requires careful definition of the scope of the test and comparison with previously validated tools.
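
    The kappa statistic mentioned above corrects raw agreement between two categorical raters for the agreement expected by chance. A minimal sketch with hypothetical binary ratings:

```python
import numpy as np

def cohens_kappa(a, b):
    """Chance-corrected agreement between two categorical raters."""
    a, b = np.asarray(a), np.asarray(b)
    cats = np.union1d(a, b)
    po = np.mean(a == b)                                  # observed agreement
    # expected agreement under independent marginal distributions
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)
    return float((po - pe) / (1 - pe))

# hypothetical presence/absence ratings from two raters
rater1 = [1, 1, 0, 1, 0, 0, 1, 0]
rater2 = [1, 1, 0, 0, 0, 1, 1, 0]
print(round(cohens_kappa(rater1, rater2), 3))  # → 0.5
```

    Here raw agreement is 75%, but with balanced marginals half of that is expected by chance, so kappa is only 0.5, which is exactly the bias kappa is designed to expose.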

  14. Brazilian Soybean Yields and Yield Gaps Vary with Farm Size

    NASA Astrophysics Data System (ADS)

    Jeffries, G. R.; Cohn, A.; Griffin, T. S.; Bragança, A.

    2017-12-01

    Understanding the farm size-specific characteristics of crop yields and yield gaps may help to improve yields by enabling better targeting of technical assistance and agricultural development programs. Linking remote sensing-based yield estimates with property boundaries provides a novel view of the relationship between farm size and yield structure (yield magnitude, gaps, and stability over time). A growing literature documents variations in yield gaps, but largely ignores the role of farm size as a factor shaping yield structure. Research on the inverse farm size-productivity relationship (IR) theory - that small farms are more productive than large ones all else equal - has documented that yield magnitude may vary by farm size, but has not considered other yield structure characteristics. We examined farm size - yield structure relationships for soybeans in Brazil for years 2001-2015. Using out-of-sample soybean yield predictions from a statistical model, we documented 1) gaps between the 95th percentile of attained yields and mean yields within counties and individual fields, and 2) yield stability defined as the standard deviation of time-detrended yields at given locations. We found a direct relationship between soy yields and farm size at the national level, while the strength and the sign of the relationship varied by region. Soybean yield gaps were found to be inversely related to farm size metrics, even when yields were only compared to farms of similar size. The relationship between farm size and yield stability was nonlinear, with mid-sized farms having the most stable yields. The work suggests that farm size is an important factor in understanding yield structure and that opportunities for improving soy yields in Brazil are greatest among smaller farms.
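
    The two yield-structure metrics defined above, the gap between 95th-percentile attained yields and mean yields, and stability as the standard deviation of time-detrended yields, can be sketched as follows (the county yield series is hypothetical, not the study's remote sensing data):

```python
import numpy as np

def yield_gap(yields):
    """Gap between the 95th-percentile attained yield and the mean yield."""
    y = np.asarray(yields, dtype=float)
    return float(np.percentile(y, 95) - y.mean())

def yield_stability(years, yields):
    """Std. dev. of linearly detrended yields (lower = more stable)."""
    t = np.asarray(years, dtype=float)
    y = np.asarray(yields, dtype=float)
    slope, intercept = np.polyfit(t, y, 1)   # remove the technology trend
    resid = y - (slope * t + intercept)
    return float(resid.std())

# hypothetical county soybean yields (t/ha), 2001-2015
years = np.arange(2001, 2016)
yields = 2.0 + 0.05 * (years - 2001) + np.array(
    [0.1, -0.2, 0.0, 0.3, -0.1, 0.2, -0.3, 0.0,
     0.1, -0.1, 0.2, 0.0, -0.2, 0.1, -0.1])
print(round(yield_gap(yields), 3))
print(round(yield_stability(years, yields), 3))
```

    Detrending before taking the standard deviation matters: otherwise a steady technology-driven yield increase would be misread as instability.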

  15. A study on the value of computer-assisted assessment for SPECT/CT-scans in sentinel lymph node diagnostics of penile cancer as well as clinical reliability and morbidity of this procedure.

    PubMed

    Lützen, Ulf; Naumann, Carsten Maik; Marx, Marlies; Zhao, Yi; Jüptner, Michael; Baumann, René; Papp, László; Zsótér, Norbert; Aksenov, Alexey; Jünemann, Klaus-Peter; Zuhayra, Maaz

    2016-09-07

    Because of the increasing importance of computer-assisted post-processing of image data in modern medical diagnostics, we studied the value of an algorithm for the assessment of single photon emission computed tomography/computed tomography (SPECT/CT) data, used here for the first time for lymph node staging in penile cancer with non-palpable inguinal lymph nodes. In the guidelines of the relevant international expert societies, sentinel lymph node biopsy (SLNB) is recommended as the diagnostic method of choice. The aim of this study is to evaluate the value of the afore-mentioned algorithm and, in the clinical context, the reliability and associated morbidity of this procedure. Between 2008 and 2015, 25 patients with invasive penile cancer and inconspicuous inguinal lymph node status underwent SLNB after application of the radiotracer Tc-99m-labelled nanocolloid. We recorded the reliability and the complication rate of the procedure in a prospective approach. In addition, we evaluated the results of an algorithm for SPECT/CT data assessment in these patients. SLNB was carried out in 44 groins of 25 patients. In three patients, inguinal lymph node metastases were detected via SLNB. In one patient, bilateral lymph node recurrence of the groins occurred after negative SLNB. There was a false-negative rate of 4% in relation to the number of patients (1/25) and 4.5% in relation to the number of groins (2/44). Morbidity was 4% in relation to the number of patients (1/25) and 2.3% in relation to the number of groins (1/44). Computer-assisted assessment of SPECT/CT data for sentinel lymph node (SLN) diagnostics demonstrated a high sensitivity of 88.8% and specificity of 86.7%. SLNB is a very reliable method associated with low morbidity. Computer-assisted assessment of SPECT/CT data for SLN diagnostics shows high sensitivity and specificity. While it cannot replace assessment by medical experts, it can still provide substantial

  16. Genetic correlations between the cumulative pseudo-survival rate, milk yield, and somatic cell score during lactation in Holstein cattle in Japan using a random regression model.

    PubMed

    Sasaki, O; Aihara, M; Nishiura, A; Takeda, H

    2017-09-01

    Trends in genetic correlations between longevity, milk yield, and somatic cell score (SCS) during lactation in cows are difficult to trace. In this study, changes in the genetic correlations between milk yield, SCS, and cumulative pseudo-survival rate (PSR) during lactation were examined, and the effect of milk yield and SCS information on the reliability of the estimated breeding value (EBV) of PSR was determined. Test day milk yield, SCS, and PSR records were obtained for Holstein cows in Japan from 2004 to 2013. A random subset of the data was used for the analysis (825 herds, 205,383 cows). This data set was randomly divided into 5 subsets (162-168 herds, 83,389-95,854 cows), and genetic parameters were estimated in each subset independently. Data were analyzed using multiple-trait random regression animal models including either the residual effect for the whole lactation period (H0), the residual effects for 5 lactation stages (H5), or both of these residual effects (HD). Milk yield heritability increased until 310 to 351 d in milk (DIM), and SCS heritability increased until 330 to 344 DIM. Heritability estimates for PSR increased with DIM from 0.00 to 0.05. The genetic correlation between milk yield and SCS grew more strongly negative, falling below -0.60 at 455 DIM. The genetic correlation between milk yield and PSR increased until 342 to 355 DIM (0.53-0.57). The genetic correlation between SCS and PSR was -0.82 to -0.83 at around 180 DIM and weakened to -0.65 to -0.71 at 455 DIM. The reliability of the EBV of PSR for sires with 30 or more recorded daughters was 0.17 to 0.45 when the effects of correlated traits were ignored. The maximum reliability of EBV was observed at 257 (H0) or 322 (HD) DIM. When the correlations of PSR with milk yield and SCS were considered, the reliabilities of PSR estimates increased to 0.31-0.76. The genetic parameter estimates of H5 were the same as those for HD. 
The rank correlation coefficients of the EBV of PSR between H0 and H5 or HD were

  17. Development of a European Ensemble System for Seasonal Prediction: Application to crop yield

    NASA Astrophysics Data System (ADS)

    Terres, J. M.; Cantelaube, P.

    2003-04-01

    Western European agriculture is highly intensive, and the weather is the main source of uncertainty for crop yield assessment and for crop management. In the current system, at the time when a crop yield forecast is issued, the weather conditions leading up to harvest time are unknown and are therefore a major source of uncertainty. The use of seasonal weather forecasts would bring additional information for the remaining crop season and has valuable benefits for improving the management of agricultural markets and environmentally sustainable farm practices. An innovative method for supplying seasonal forecast information to crop simulation models has been developed in the framework of the EU-funded research project DEMETER. It consists of running a crop model on each individual member of the seasonal hindcasts to derive a probability distribution of crop yield. Preliminary results for the cumulative probability function of wheat yield provide information on both the yield anomaly and the reliability of the forecast. Based on the spread of the probability distribution, the end-user can directly quantify the benefits and risks of taking weather-sensitive decisions.
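
    The ensemble approach described above, running a crop model once per ensemble member and treating the resulting yields as a probability distribution, can be illustrated with a toy crop model (the rainfall-response function and ensemble values below are invented for illustration and are not the DEMETER system):

```python
import numpy as np

def yield_distribution(members, crop_model):
    """Run a crop model on each ensemble member; summarize the spread."""
    yields = np.array([crop_model(m) for m in members])
    return yields.mean(), yields.std(), yields

# toy stand-in for a crop model: yield responds to seasonal rainfall (mm),
# with a quadratic penalty away from an assumed 400 mm optimum
def toy_crop_model(rainfall_mm):
    return 2.0 + 0.01 * rainfall_mm - 0.00001 * (rainfall_mm - 400) ** 2

# hypothetical 9-member seasonal rainfall forecast (mm)
ensemble = np.array([320, 350, 380, 400, 410, 430, 450, 470, 500])
mean_y, spread, ys = yield_distribution(ensemble, toy_crop_model)
print(round(mean_y, 3), round(spread, 3))
print(np.mean(ys >= 6.0))   # empirical P(yield >= 6.0 t/ha)
```

    The spread of the member yields is what lets the end-user quantify forecast reliability: a tight distribution supports a confident decision, a wide one signals risk.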

  18. fMRI reliability: influences of task and experimental design.

    PubMed

    Bennett, Craig M; Miller, Michael B

    2013-12-01

    As scientists, it is imperative that we understand not only the power of our research tools to yield results, but also their ability to obtain similar results over time. This study is an investigation into how common decisions made during the design and analysis of a functional magnetic resonance imaging (fMRI) study can influence the reliability of the statistical results. To that end, we gathered back-to-back test-retest fMRI data during an experiment involving multiple cognitive tasks (episodic recognition and two-back working memory) and multiple fMRI experimental designs (block, event-related genetic sequence, and event-related m-sequence). Using these data, we were able to investigate the relative influences of task, design, statistical contrast (task vs. rest, target vs. nontarget), and statistical thresholding (unthresholded, thresholded) on fMRI reliability, as measured by the intraclass correlation (ICC) coefficient. We also utilized data from a second study to investigate test-retest reliability after an extended, six-month interval. We found that all of the factors above were statistically significant, but that they had varying levels of influence on the observed ICC values. We also found that these factors could interact, increasing or decreasing the relative reliability of certain Task × Design combinations. The results suggest that fMRI reliability is a complex construct whose value may be increased or decreased by specific combinations of factors.
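
    The intraclass correlation coefficient used above as the reliability measure compares between-subject to within-subject variance across sessions. A minimal one-way ICC(1,1) sketch with hypothetical test-retest values (not the study's data):

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for a subjects x sessions matrix."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ms_between = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_within = np.sum((x - x.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
    return float((ms_between - ms_within) / (ms_between + (k - 1) * ms_within))

# hypothetical test-retest statistics for 5 subjects, 2 sessions
scans = np.array([[2.1, 2.0],
                  [3.4, 3.6],
                  [1.2, 1.1],
                  [4.0, 4.3],
                  [2.8, 2.7]])
print(round(icc_oneway(scans), 3))  # → 0.989
```

    Values near 1 mean that differences between subjects dwarf session-to-session noise; fMRI studies typically report variants such as ICC(2,1) or ICC(3,1), which additionally model a session effect.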

  19. The Reliability of a Novel Mobile 3-dimensional Wound Measurement Device.

    PubMed

    Anghel, Ersilia L; Kumar, Anagha; Bigham, Thomas E; Maselli, Kathryn M; Steinberg, John S; Evans, Karen K; Kim, Paul J; Attinger, Christopher E

    2016-11-01

    Objective assessment of wound dimensions is essential for tracking progression and determining treatment effectiveness. A reliability study was designed to establish intrarater and interrater reliability of a novel mobile 3-dimensional wound measurement (3DWM) device. Forty-five wounds were assessed by 2 raters using a 3DWM device to obtain length, width, area, depth, and volume measurements. Wounds were also measured manually, using a disposable ruler and digital planimetry. The intraclass correlation coefficient (ICC) was used to establish intrarater and interrater reliability. High levels of intrarater and interrater agreement were observed for area, length, and width; ICC = 0.998, 0.977, 0.955 and 0.999, 0.997, 0.995, respectively. Moderate levels of intrarater (ICC = 0.888) and interrater (ICC = 0.696) agreement were observed for volume. Lastly, depth yielded an intrarater ICC of 0.360 and an interrater ICC of 0.649. Measures from the 3DWM device were highly correlated with those obtained from scaled photography for length, width, and area (ρ = 0.997, 0.988, 0.997, P < 0.001). The 3DWM device yielded correlations of ρ = 0.990, 0.987, 0.996 with P < 0.001 for length, width, and area when compared to manual measurements. The 3DWM device was found to be highly reliable for measuring wound areas for a range of wound sizes and types as compared to manual measurement and digital planimetry. The depth and therefore volume measurement using the 3DWM device was found to have a lower ICC, but volume ICC alone was moderate. Overall, this device offers a mobile option for objective wound measurement in the clinical setting.

  20. Accuracy and reliability of the Pfeffer Questionnaire for the Brazilian elderly population

    PubMed Central

    Dutra, Marina Carneiro; Ribeiro, Raynan dos Santos; Pinheiro, Sarah Brandão; de Melo, Gislane Ferreira; Carvalho, Gustavo de Azevedo

    2015-01-01

    The aging population calls for instruments to assess functional and cognitive impairment in the elderly, aiming to prevent conditions that affect functional abilities. Objective To verify the accuracy and reliability of the Pfeffer Functional Activities Questionnaire (FAQ) for the Brazilian elderly population and to evaluate the reliability and reproducibility of the translated version of the Pfeffer Questionnaire. Methods The Brazilian version of the FAQ was applied to 110 elderly people divided into two groups. Both groups were assessed by two blinded investigators at baseline and again after 15 days. In order to verify the accuracy and reliability of the instrument, sensitivity and specificity measurements for the presence or absence of functional and cognitive decline were calculated for various cut-off points, along with the ROC curve. Intra- and inter-examiner reliability were assessed using the intraclass correlation coefficient (ICC) and Bland-Altman plots. Results For the occurrence of cognitive decline, the ROC curve yielded an area under the curve of 0.909 (95%CI 0.845 to 0.972), sensitivity of 75.68% (95%CI 93.52% to 100%) and specificity of 97.26%. For the occurrence of functional decline, the ROC curve yielded an area under the curve of 0.851 (95%CI 64.52% to 87.33%) and specificity of 80.36% (95%CI 69.95% to 90.76%). The ICC was excellent, with all values exceeding 0.75. On the Bland-Altman plots, intra-examiner agreement was good, with p > 0.05 and mean differences consistently close to 0. A systematic difference was found for inter-examiner agreement. Conclusion The Pfeffer Questionnaire is applicable to the Brazilian elderly population and showed reliability and reproducibility compared to the original test. PMID:29213959
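
    The ROC analysis above rests on sensitivity and specificity at each cut-off point and the area under the curve (AUC). A minimal sketch with hypothetical questionnaire scores and decline labels (not the study's data); the AUC is computed via its equivalence with the Mann-Whitney U statistic:

```python
import numpy as np

def sens_spec(scores, labels, cutoff):
    """Sensitivity and specificity of `score >= cutoff` for binary labels."""
    s = np.asarray(scores)
    y = np.asarray(labels).astype(bool)
    pred = s >= cutoff
    sens = np.mean(pred[y])       # true-positive rate among decliners
    spec = np.mean(~pred[~y])     # true-negative rate among the rest
    return float(sens), float(spec)

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney U statistic."""
    s = np.asarray(scores, dtype=float)
    y = np.asarray(labels).astype(bool)
    pos, neg = s[y], s[~y]
    # fraction of (positive, negative) pairs ranked correctly (ties = 0.5)
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return float((greater + 0.5 * ties) / (len(pos) * len(neg)))

# hypothetical questionnaire scores; label 1 = decline present
scores = [9, 7, 8, 3, 2, 5, 1, 6]
labels = [1, 1, 1, 0, 0, 1, 0, 0]
print(auc(scores, labels))              # → 0.9375
print(sens_spec(scores, labels, 5))     # → (1.0, 0.75)
```

    Sweeping the cutoff over all observed scores traces out the full ROC curve; the chosen operating point trades sensitivity against specificity, as in the cut-off analysis the abstract describes.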

  1. Identification of saline soils with multi-year remote sensing of crop yields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lobell, D; Ortiz-Monasterio, I; Gurrola, F C

    2006-10-17

    Soil salinity is an important constraint to agricultural sustainability, but accurate information on its variation across agricultural regions or its impact on regional crop productivity remains sparse. We evaluated the relationships between remotely sensed wheat yields and salinity in an irrigation district in the Colorado River Delta Region. The goals of this study were to (1) document the relative importance of salinity as a constraint to regional wheat production and (2) develop techniques to accurately identify saline fields. Estimates of wheat yield from six years of Landsat data agreed well with ground-based records on individual fields (R² = 0.65). Salinity measurements on 122 randomly selected fields revealed that average 0-60 cm salinity levels > 4 dS m⁻¹ reduced wheat yields, but the relative scarcity of such fields resulted in less than 1% regional yield loss attributable to salinity. Moreover, low yield was not a reliable indicator of high salinity, because many other factors contributed to yield variability in individual years. However, temporal analysis of yield images showed that a significant fraction of fields exhibited consistently low yields over the six-year period. A subsequent survey of 60 additional fields, half of which were consistently low yielding, revealed that this targeted subset had significantly higher salinity at 30-60 cm depth than the control group (p = 0.02). These results suggest that high subsurface salinity is associated with consistently low yields in this region, and that multi-year yield maps derived from remote sensing therefore provide an opportunity to map salinity across agricultural regions.
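
    The key move in this study is temporal, not single-year, screening: a field is flagged only if it falls in the low-yield tail in every year. A minimal sketch of that multi-year intersection (the yield maps and the bottom-quartile threshold are illustrative assumptions, not the study's method):

```python
import numpy as np

def consistently_low(yield_maps, quantile=0.25):
    """Flag fields whose yield falls in the bottom quantile in every year.
    yield_maps: (years, fields) array of per-field yield estimates."""
    y = np.asarray(yield_maps, dtype=float)
    thresholds = np.quantile(y, quantile, axis=1)[:, None]
    low = y <= thresholds        # per-year low-yield mask
    return low.all(axis=0)       # low in *every* year

# hypothetical 3 years x 6 fields of remotely sensed wheat yields (t/ha)
maps = np.array([[5.1, 4.8, 2.9, 5.5, 3.0, 5.2],
                 [5.3, 4.6, 3.1, 5.2, 3.2, 5.0],
                 [4.9, 4.9, 2.8, 5.6, 3.3, 5.4]])
print(consistently_low(maps))
```

    Fields that are low only in one year (weather, management) drop out of the intersection, leaving candidates whose persistent deficit is more plausibly explained by a fixed soil property such as salinity.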

  2. Reliability-Based Design Optimization of a Composite Airframe Component

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.

    2009-01-01

    A stochastic design optimization methodology (SDO) has been developed to design components of an airframe structure that can be made of metallic and composite materials. The design is obtained as a function of the risk level, or reliability, p. The design method treats uncertainties in load, strength, and material properties as distribution functions, which are defined with mean values and standard deviations. A design constraint or a failure mode is specified as a function of reliability p. The solution to the stochastic optimization yields the weight of a structure as a function of reliability p. Optimum weight versus reliability p traces out an inverted-S-shaped graph. The center of the inverted-S graph corresponds to a 50 percent (p = 0.5) probability of success. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure, which corresponds to unity for reliability p (p = 1). Weight can be reduced to a small value for the most failure-prone design, with a reliability that approaches zero (p = 0). Reliability can be set differently for different components of an airframe structure. For example, the landing gear can be designed for a very high reliability, whereas a somewhat lower reliability can be accepted for a raked wingtip. The SDO capability is obtained by combining three codes: (1) the MSC/Nastran code was the deterministic analysis tool, (2) the fast probabilistic integrator (FPI) module of the NESSUS software was the probabilistic calculator, and (3) NASA Glenn Research Center's optimization testbed CometBoards became the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated considering an academic example and a real-life raked wingtip structure of the Boeing 767-400 extended range airliner made of metallic and composite materials.

  3. The revised Generalized Expectancy for Success Scale: a validity and reliability study.

    PubMed

    Hale, W D; Fiedler, L R; Cochran, C D

    1992-07-01

    The Generalized Expectancy for Success Scale (GESS; Fibel & Hale, 1978) was revised and assessed for reliability and validity. The revised version was administered to 199 college students along with other conceptually related measures, including the Rosenberg Self-Esteem Scale, the Life Orientation Test, and Rotter's Internal-External Locus of Control Scale. One subsample of students also completed the Eysenck Personality Inventory, while another subsample performed a criterion-related task that involved risk taking. Item analysis yielded 25 items with correlations of .45 or higher with the total score. The results indicated high internal consistency and test-retest reliability.
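
    The item analysis above keeps items whose item-total correlation is at least .45, and internal consistency is conventionally summarized with Cronbach's alpha. A minimal sketch of both computations on hypothetical Likert responses (not the GESS data):

```python
import numpy as np

def cronbach_alpha(items):
    """Internal consistency for a (respondents, items) matrix."""
    x = np.asarray(items, dtype=float)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = x.sum(axis=1).var(ddof=1)      # variance of total scores
    return float((k / (k - 1)) * (1 - item_var / total_var))

def item_total_corr(items):
    """Correlation of each item with the total score."""
    x = np.asarray(items, dtype=float)
    total = x.sum(axis=1)
    return np.array([np.corrcoef(x[:, j], total)[0, 1]
                     for j in range(x.shape[1])])

# hypothetical 6 respondents x 4 Likert items
data = np.array([[4, 5, 4, 5],
                 [2, 2, 3, 2],
                 [5, 4, 5, 5],
                 [3, 3, 2, 3],
                 [1, 2, 1, 1],
                 [4, 4, 4, 5]])
print(round(cronbach_alpha(data), 3))   # → 0.964
print(np.round(item_total_corr(data), 2))
```

    In practice the corrected item-total correlation (item vs. total of the *other* items) is preferred, since the raw version is inflated by the item's own contribution to the total.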

  4. Spectrally-Based Assessment of Crop Seasonal Performance and Yield

    NASA Astrophysics Data System (ADS)

    Kancheva, Rumiana; Borisova, Denitsa; Georgiev, Georgy

    The rapid advances of space technologies concern almost all scientific areas, from aeronautics to medicine, and a wide range of application fields, from communications to crop yield prediction. Agricultural monitoring is among the priorities of remote sensing observations for obtaining timely information on crop development. Monitoring agricultural fields during the growing season plays an important role in crop health assessment and stress detection, provided that reliable data are obtained. The use of hyperspectral data in precision farming is spreading successfully, in association with plant growth and phenology monitoring, physiological state assessment, and yield prediction. In this paper, we investigated various spectral-biophysical relationships derived from in-situ reflectance measurements. The performance of spectral data for the assessment of agricultural crop condition and yield prediction was examined. The approach comprised the development of regression models between plant spectral and state-indicative variables such as biomass, vegetation cover fraction, and leaf area index, and the development of yield forecasting models from single-date (growth stage) and multitemporal (seasonal) reflectance data. Verification of the spectral predictions was performed through comparison with estimations from biophysical relationships between crop growth variables. The study was carried out for spring barley and winter wheat. Visible and near-infrared reflectance data were acquired throughout the whole growing season, accompanied by detailed datasets on plant phenology and canopy structural and biochemical attributes. Empirical relationships were derived relating crop agronomic variables and yield to various spectral predictors. The study findings were tested using airborne remote sensing inputs. A good correspondence was found between predicted and actual (ground-truth) estimates.

  5. Multispectral Fluorescence Imaging During Robot-assisted Laparoscopic Sentinel Node Biopsy: A First Step Towards a Fluorescence-based Anatomic Roadmap.

    PubMed

    van den Berg, Nynke S; Buckle, Tessa; KleinJan, Gijs H; van der Poel, Henk G; van Leeuwen, Fijs W B

    2017-07-01

    During (robot-assisted) sentinel node (SN) biopsy procedures, intraoperative fluorescence imaging can be used to enhance radioguided SN excision. For this, combined pre- and intraoperative SN identification was realized using the hybrid SN tracer indocyanine green-99mTc-nanocolloid (ICG-99mTc-nanocolloid). Combining this dedicated SN tracer with a lymphangiographic tracer such as fluorescein may further enhance the accuracy of SN biopsy. Clinical evaluation of a multispectral fluorescence-guided surgery approach using the dedicated SN tracer ICG-99mTc-nanocolloid, the lymphangiographic tracer fluorescein, and a commercially available fluorescence laparoscope. Pilot study in ten patients with prostate cancer. Following ICG-99mTc-nanocolloid administration and preoperative lymphoscintigraphy and single-photon emission computed tomography imaging, the number and location of SNs were determined. Fluorescein was injected intraprostatically immediately after the patient was anesthetized. A multispectral fluorescence laparoscope was used intraoperatively to identify both fluorescent signatures. Multispectral fluorescence imaging during robot-assisted radical prostatectomy with extended pelvic lymph node dissection and SN biopsy. (1) Number and location of preoperatively identified SNs. (2) Number and location of SNs intraoperatively identified via ICG-99mTc-nanocolloid imaging. (3) Rate of intraoperative lymphatic duct identification via fluorescein imaging. (4) Tumor status of excised (sentinel) lymph node(s). (5) Postoperative complications and follow-up. Near-infrared fluorescence imaging of ICG-99mTc-nanocolloid visualized 85.3% of the SNs. In 8/10 patients, fluorescein imaging allowed bright and accurate identification of lymphatic ducts, although higher background staining and tracer washout were observed. The main limitation is the small patient population. 
Our findings indicate that a lymphangiographic tracer can provide additional information during SN biopsy based on ICG-99m

  6. 99mTc-Nanocolloid SPECT/MRI Fusion for the Selective Assessment of Nonenlarged Sentinel Lymph Nodes in Patients with Early-Stage Cervical Cancer.

    PubMed

    Hoogendam, Jacob P; Zweemer, Ronald P; Hobbelink, Monique G G; van den Bosch, Maurice A A J; Verheijen, René H M; Veldhuis, Wouter B

    2016-04-01

    We aimed to explore the accuracy of (99m)Tc SPECT/MRI fusion for the selective assessment of nonenlarged sentinel lymph nodes (SLNs) for diagnosing metastases in early-stage cervical cancer patients. We consecutively included stage IA1-IIB1 cervical cancer patients who presented to our tertiary referral center between March 2011 and February 2015. Patients with enlarged lymph nodes (short axis ≥ 10 mm) on MRI were excluded. Patients underwent an SLN procedure with preoperative (99m)Tc-nanocolloid SPECT/CT-based SLN mapping. When fused datasets of the SPECT and MR images were created, SLNs could be identified on the MR image with accurate correlation to the histologic result of each individual SLN. An experienced radiologist, masked to histology, retrospectively reviewed all fused SPECT/MR images and scored morphologic SLN parameters on a standardized case report form. Logistic regression and receiver-operating curves were used to model the parameters against the SLN status. In 75 cases, 136 SLNs were eligible for analysis, of which 13 (9.6%) contained metastases (8 cases). Three parameters (short-axis diameter, long-axis diameter, and absence of sharp demarcation) significantly predicted metastatic invasion of nonenlarged SLNs, with quality-adjusted odds ratios of 1.42 (95% confidence interval [CI], 1.01-1.99), 1.28 (95% CI, 1.03-1.57), and 7.55 (95% CI, 1.09-52.28), respectively. The area under the curve of the receiver-operating curve combining these parameters was 0.749 (95% CI, 0.569-0.930). Heterogeneous gadolinium enhancement, cortical thickness, round shape, and SLN size compared with the nearest non-SLN showed no association with metastases (P = 0.055-0.795). In cervical cancer patients without enlarged lymph nodes, selective evaluation of only the SLNs (for size and absence of sharp demarcation) can be used to noninvasively assess the presence of metastases. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  7. How Reliable Are Students' Evaluations of Teaching Quality? A Variance Components Approach

    ERIC Educational Resources Information Center

    Feistauer, Daniela; Richter, Tobias

    2017-01-01

    The inter-rater reliability of university students' evaluations of teaching quality was examined with cross-classified multilevel models. Students (N = 480) evaluated lectures and seminars over three years with a standardised evaluation questionnaire, yielding 4224 data points. The total variance of these student evaluations was separated into the…

  8. Yield gaps and yield relationships in US soybean production systems

    USDA-ARS?s Scientific Manuscript database

    The magnitude of yield gaps (YG) (potential yield – farmer yield) provides some indication of the prospects for increasing crop yield to meet the food demands of future populations. Quantile regression analysis was applied to county soybean [Glycine max (L.) Merrill] yields (1971 – 2011) from Kentuc...

  9. Ultraflexible nanoelectronic probes form reliable, glial scar–free neural integration

    PubMed Central

    Luan, Lan; Wei, Xiaoling; Zhao, Zhengtuo; Siegel, Jennifer J.; Potnis, Ojas; Tuppen, Catherine A; Lin, Shengqing; Kazmi, Shams; Fowler, Robert A.; Holloway, Stewart; Dunn, Andrew K.; Chitwood, Raymond A.; Xie, Chong

    2017-01-01

    Implanted brain electrodes constitute the only means to electrically interface with individual neurons in vivo, but their recording efficacy and biocompatibility pose limitations on scientific and clinical applications. We showed that nanoelectronic thread (NET) electrodes with subcellular dimensions, ultraflexibility, and cellular surgical footprints form reliable, glial scar–free neural integration. We demonstrated that NET electrodes reliably detected and tracked individual units for months; their impedance, noise level, single-unit recording yield, and signal amplitude remained stable during long-term implantation. In vivo two-photon imaging and postmortem histological analysis revealed seamless, subcellular integration of NET probes with the local cellular and vasculature networks, featuring fully recovered capillaries with an intact blood-brain barrier and a complete absence of chronic neuronal degradation and glial scarring. PMID:28246640

  10. Reliability and Validity of Wisconsin Upper Respiratory Symptom Survey, Korean Version

    PubMed Central

    Yang, Su-Young; Kang, Weechang; Yeo, Yoon; Park, Yang-Chun

    2011-01-01

    Background The Wisconsin Upper Respiratory Symptom Survey (WURSS) is a self-administered questionnaire developed in the United States to evaluate the severity of the common cold and its reliability has been validated. We developed a Korean language version of this questionnaire by using a sequential forward and backward translation approach. The purpose of this study was to validate the Korean version of the Wisconsin Upper Respiratory Symptom Survey (WURSS-K) in Korean patients with common cold. Methods This multicenter prospective study enrolled 107 participants who were diagnosed with common cold and consented to participate in the study. The WURSS-K includes 1 global illness severity item, 32 symptom-based items, 10 functional quality-of-life (QOL) items, and 1 item assessing global change. The SF-8 was used as an external comparator. Results The participants were 54 women and 53 men aged 18 to 42 years. The WURSS-K showed good reliability in 10 domains, with Cronbach’s alphas ranging from 0.67 to 0.96 (mean: 0.84). Comparison of the reliability coefficients of the WURSS-K and WURSS yielded a Pearson correlation coefficient of 0.71 (P = 0.02). Validity of the WURSS-K was evaluated by comparing it with the SF-8, which yielded a Pearson correlation coefficient of −0.267 (P < 0.001). The Guyatt’s responsiveness index of the WURSS-K ranged from 0.13 to 0.46, and the correlation coefficient with the WURSS was 0.534 (P < 0.001), indicating that there was close correlation between the WURSS-K and WURSS. Conclusions The WURSS-K is a reliable, valid, and responsive disease-specific questionnaire for assessing symptoms and QOL in Korean patients with common cold. PMID:21691034

  11. Transient-evoked and distortion product otoacoustic emissions: A short-term test-retest reliability study.

    PubMed

    Keppler, Hannah; Dhooge, Ingeborg; Maes, Leen; D'haenens, Wendy; Bockstael, Annelies; Philips, Birgit; Swinnen, Freya; Vinck, Bart

    2010-02-01

    Knowledge regarding the variability of transient-evoked otoacoustic emissions (TEOAEs) and distortion product otoacoustic emissions (DPOAEs) is essential in clinical settings and improves their utility in monitoring hearing status over time. In the current study, TEOAEs and DPOAEs were measured with commercially available OAE equipment in 56 normally-hearing ears during three sessions. Reliability was analysed for the retest measurement without probe refitting, the immediate retest measurement with probe refitting, and retest measurements after one hour and one week. Reliability was highest for the retest measurement without probe refitting and decreased with increasing time interval between measurements. For TEOAEs, the lowest reliability was seen at the half-octave frequency bands of 1.0 and 1.4 kHz, whereas for DPOAEs the 8.0 kHz half-octave frequency band also had poor reliability. A higher primary tone level combination for DPOAEs yielded better reliability of DPOAE amplitudes. External environmental noise seemed to be the dominant noise source in normal-hearing subjects, decreasing the reliability of emission amplitudes especially in the low-frequency region.

  12. Examination of Anomalous World Experience: A Report on Reliability.

    PubMed

    Conerty, Joseph; Skodlar, Borut; Pienkos, Elizabeth; Zadravek, Tina; Byrom, Greg; Sass, Louis

    2017-01-01

    The EAWE (Examination of Anomalous World Experience) is a newly developed, semi-structured interview that aims to capture anomalies of subjectivity, common in schizophrenia spectrum disorders, that pertain to experiences of the lived world, including space, time, people, language, atmosphere, and certain existential attitudes. By contrast, previous empirical studies of subjective experience in schizophrenia have focused largely on disturbances in self-experience. The aim of this report was to assess the reliability of the EAWE, including internal consistency and interrater reliability. In the course of developing the EAWE, two distinct studies were conducted, one in the United States and the other in Slovenia. Thirteen patients diagnosed with schizophrenia spectrum or mood disorders were recruited for the US study. Fifteen such patients were recruited for the Slovenian study. Two live interviewers conducted the EAWE in the US. The Slovenian interviews were completed by one live interviewer with a second rater reviewing audio recordings of the interview. Internal consistency and interrater reliability were calculated independently for each study, utilizing Cronbach's α, Spearman's ρ, and Cohen's κ. Each study yielded high internal consistency (Cronbach's α >0.82) and high interrater reliability for total EAWE scores (ρ > 0.83; average κ values were at least 0.78 for each study, with EAWE domain-specific κ not lower than 0.73). The EAWE, containing world-oriented inquiries into anomalies in subjective experience, has adequate reliability for use in a clinical or research setting. © 2017 S. Karger AG, Basel.
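
    The interrater statistic cited in this record, Cohen's κ, corrects raw agreement for chance: κ = (p_o − p_e)/(1 − p_e), where p_o is observed agreement and p_e is agreement expected from each rater's marginal frequencies. A hedged sketch of the standard two-rater computation (not the EAWE authors' implementation; the example ratings are invented):

```python
# Cohen's kappa for two raters over categorical codes.
# kappa = 1 means perfect agreement; 0 means chance-level agreement.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    # observed agreement
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement from each rater's category frequencies
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_e = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / (n * n)
    return (p_o - p_e) / (1 - p_e)
```

    On this scale, the domain-specific κ values of 0.73 and above reported for the EAWE are conventionally interpreted as substantial agreement.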

  13. Optimizing rice yields while minimizing yield-scaled global warming potential.

    PubMed

    Pittelkow, Cameron M; Adviento-Borbe, Maria A; van Kessel, Chris; Hill, James E; Linquist, Bruce A

    2014-05-01

    To meet growing global food demand with limited land and reduced environmental impact, agricultural greenhouse gas (GHG) emissions are increasingly evaluated with respect to crop productivity, i.e., on a yield-scaled as opposed to area basis. Here, we compiled available field data on CH4 and N2O emissions from rice production systems to test the hypothesis that in response to fertilizer nitrogen (N) addition, yield-scaled global warming potential (GWP) will be minimized at N rates that maximize yields. Within each study, yield N surplus was calculated to estimate deficit or excess N application rates with respect to the optimal N rate (defined as the N rate at which maximum yield was achieved). Relationships between yield N surplus and GHG emissions were assessed using linear and nonlinear mixed-effects models. Results indicate that yields increased in response to increasing N surplus when moving from deficit to optimal N rates. At N rates contributing to a yield N surplus, N2O and yield-scaled N2O emissions increased exponentially. In contrast, CH4 emissions were not impacted by N inputs. Accordingly, yield-scaled CH4 emissions decreased with N addition. Overall, yield-scaled GWP was minimized at optimal N rates, decreasing by 21% compared to treatments without N addition. These results are unique compared to aerobic cropping systems in which N2O emissions are the primary contributor to GWP, meaning yield-scaled GWP may not necessarily decrease for aerobic crops when yields are optimized by N fertilizer addition. Balancing gains in agricultural productivity with climate change concerns, this work supports the concept that high rice yields can be achieved with minimal yield-scaled GWP through optimal N application rates. Moreover, additional improvements in N use efficiency may further reduce yield-scaled GWP, thereby strengthening the economic and environmental sustainability of rice systems. © 2013 John Wiley & Sons Ltd.
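
    Yield-scaling as used above simply divides area-based CO2-equivalent emissions by grain yield. A small illustrative sketch, assuming the commonly used 100-year GWP conversion factors (25 for CH4, 298 for N2O); both the factors and the sample numbers are assumptions for illustration, not values from the study:

```python
# Yield-scaled GWP = area-scaled GWP / grain yield.
# CH4 and N2O are first converted to CO2-equivalents using 100-yr GWP
# factors (IPCC AR4 values, assumed here for illustration).
GWP_CH4, GWP_N2O = 25.0, 298.0

def yield_scaled_gwp(ch4_kg_ha, n2o_kg_ha, grain_mg_ha):
    gwp = ch4_kg_ha * GWP_CH4 + n2o_kg_ha * GWP_N2O  # kg CO2-eq per ha
    return gwp / grain_mg_ha                          # kg CO2-eq per Mg grain
```

    The trade-off the abstract describes follows directly: raising yield lowers the denominator, while excess N raises the N2O term in the numerator, so the ratio is minimized near the yield-optimal N rate.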

  14. Statistical emulators of maize, rice, soybean and wheat yields from global gridded crop models

    DOE PAGES

    Blanc, Élodie

    2017-01-26

    This study provides statistical emulators of crop yields based on global gridded crop model simulations from the Inter-Sectoral Impact Model Intercomparison Project Fast Track project. The ensemble of simulations is used to build a panel of annual crop yields from five crop models and corresponding monthly summer weather variables for over a century at the grid cell level globally. This dataset is then used to estimate, for each crop and gridded crop model, the statistical relationship between yields, temperature, precipitation and carbon dioxide. This study considers a new functional form to better capture the non-linear response of yields to weather, especially for extreme temperature and precipitation events, and now accounts for the effect of soil type. In- and out-of-sample validations show that the statistical emulators are able to replicate the spatial patterns of yield levels and changes over time projected by crop models reasonably well, although the accuracy of the emulators varies by model and by region. This study therefore provides a reliable and accessible alternative to global gridded crop yield models. By emulating crop yields for several models using parsimonious equations, the tools provide a computationally efficient method to account for uncertainty in climate change impact assessments.

  15. Statistical emulators of maize, rice, soybean and wheat yields from global gridded crop models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanc, Élodie

    This study provides statistical emulators of crop yields based on global gridded crop model simulations from the Inter-Sectoral Impact Model Intercomparison Project Fast Track project. The ensemble of simulations is used to build a panel of annual crop yields from five crop models and corresponding monthly summer weather variables for over a century at the grid cell level globally. This dataset is then used to estimate, for each crop and gridded crop model, the statistical relationship between yields, temperature, precipitation and carbon dioxide. This study considers a new functional form to better capture the non-linear response of yields to weather, especially for extreme temperature and precipitation events, and now accounts for the effect of soil type. In- and out-of-sample validations show that the statistical emulators are able to replicate the spatial patterns of yield levels and changes over time projected by crop models reasonably well, although the accuracy of the emulators varies by model and by region. This study therefore provides a reliable and accessible alternative to global gridded crop yield models. By emulating crop yields for several models using parsimonious equations, the tools provide a computationally efficient method to account for uncertainty in climate change impact assessments.

  16. Sample-averaged biexciton quantum yield measured by solution-phase photon correlation.

    PubMed

    Beyler, Andrew P; Bischof, Thomas S; Cui, Jian; Coropceanu, Igor; Harris, Daniel K; Bawendi, Moungi G

    2014-12-10

    The brightness of nanoscale optical materials such as semiconductor nanocrystals is currently limited in high excitation flux applications by inefficient multiexciton fluorescence. We have devised a solution-phase photon correlation measurement that can conveniently and reliably measure the average biexciton-to-exciton quantum yield ratio of an entire sample without user selection bias. This technique can be used to investigate the multiexciton recombination dynamics of a broad scope of synthetically underdeveloped materials, including those with low exciton quantum yields and poor fluorescence stability. Here, we have applied this method to measure weak biexciton fluorescence in samples of visible-emitting InP/ZnS and InAs/ZnS core/shell nanocrystals, and to demonstrate that a rapid CdS shell growth procedure can markedly increase the biexciton fluorescence of CdSe nanocrystals.

  17. Sample-Averaged Biexciton Quantum Yield Measured by Solution-Phase Photon Correlation

    PubMed Central

    Beyler, Andrew P.; Bischof, Thomas S.; Cui, Jian; Coropceanu, Igor; Harris, Daniel K.; Bawendi, Moungi G.

    2015-01-01

    The brightness of nanoscale optical materials such as semiconductor nanocrystals is currently limited in high excitation flux applications by inefficient multiexciton fluorescence. We have devised a solution-phase photon correlation measurement that can conveniently and reliably measure the average biexciton-to-exciton quantum yield ratio of an entire sample without user selection bias. This technique can be used to investigate the multiexciton recombination dynamics of a broad scope of synthetically underdeveloped materials, including those with low exciton quantum yields and poor fluorescence stability. Here, we have applied this method to measure weak biexciton fluorescence in samples of visible-emitting InP/ZnS and InAs/ZnS core/shell nanocrystals, and to demonstrate that a rapid CdS shell growth procedure can markedly increase the biexciton fluorescence of CdSe nanocrystals. PMID:25409496

  18. The process group approach to reliable distributed computing

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1991-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, more fault-tolerant, and self-managing. Six years of research on ISIS are reviewed, describing the model, the types of applications to which ISIS was applied, and some of the reasoning that underlies a recent effort to redesign and reimplement ISIS as a much smaller, lightweight system.

  19. Comparison of Hyperthermal Ground Laboratory Atomic Oxygen Erosion Yields With Those in Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Banks, Bruce A.; Dill, Grace C.; Loftus, Ryan J.; deGroh, Kim K.; Miller, Sharon K.

    2013-01-01

    The atomic oxygen erosion yields of 26 materials (all polymers except for pyrolytic graphite) were measured in two directed hyperthermal radio frequency (RF) plasma ashers operating at 30 or 35 kHz with air. The hyperthermal asher results were compared with thermal energy asher results and low Earth orbital (LEO) results from the Materials International Space Station Experiment 2 and 7 (MISSE 2 and 7) flight experiments. The hyperthermal testing was conducted to a significant portion of the atomic oxygen fluence that similar polymers were exposed to during the MISSE 2 and 7 missions. Comparison of the hyperthermal asher predictions of LEO erosion yields with thermal energy asher erosion yields indicates that, except for the fluorocarbon polymers PTFE and FEP, hyperthermal energy ashers are a much more reliable predictor of LEO erosion yield than thermal energy asher testing, by a factor of four.

  20. The impact of Global Warming on global crop yields due to changes in pest pressure

    NASA Astrophysics Data System (ADS)

    Battisti, D. S.; Tewksbury, J. J.; Deutsch, C. A.

    2011-12-01

    A billion people currently lack reliable access to sufficient food, and almost half of the calories feeding these people come from just three crops: rice, maize, and wheat. Insect pests are among the largest factors affecting the yield of these three crops, but models assessing the effects of global warming on crops rarely consider changes in insect pest pressure on crop yields. We use well-established relationships between temperature and insect physiology to project climate-driven changes in pest pressure, defined as integrated population metabolism, for the three major crops. By the middle of this century, under most scenarios, insect pest pressure is projected to increase by more than 50% in temperate areas, while increases in tropical regions will be more modest. Yield relationships indicate that the largest increases in insect pest pressure are likely to occur in areas where yield is greatest, suggesting increased strain on global food markets.

  1. Analysis of the Reliability and Validity of a Mentor's Assessment for Principal Internships

    ERIC Educational Resources Information Center

    Koonce, Glenn L.; Kelly, Michael D.

    2014-01-01

    In this study, researchers analyzed the reliability and validity of the mentor's assessment for principal internships at a university in the Southeast region of the United States. The results of the study indicate how trustworthy and dependable the instrument is, as well as its effectiveness in the current principal preparation program.…

  2. Sediment yields of streams in the Umpqua River Basin, Oregon

    USGS Publications Warehouse

    Curtiss, D.A.

    1975-01-01

    This report summarizes sediment data collected at 11 sites in the Umpqua River basin from 1956 to 1973 and updates a report by C. A. Onions (1969) of estimated sediment yields in the basin from 1956-67.  Onions' report points out that the suspended-sediment data, collected during the 1956-67 period, were insufficient to compute reliable sediment yields.  Therefore, the U.S, Geological Survey, in cooperation with Douglas County, collected additional data from 1969 to 1973 to improve the water discharge-sediment discharge relationships at these sites.  These data are published in "Water resources data for Oregon, Part 2, Water quality records," 1970 through 1973 water years.  In addition to the 10 original sites, data were collected during this period from the Umpqua River near Elkton station, and a summary of the data for that station is included in table 1.

  3. Between-Person and Within-Person Subscore Reliability: Comparison of Unidimensional and Multidimensional IRT Models

    ERIC Educational Resources Information Center

    Bulut, Okan

    2013-01-01

    The importance of subscores in educational and psychological assessments is undeniable. Subscores yield diagnostic information that can be used for determining how each examinee's abilities/skills vary over different content domains. One of the most common criticisms about reporting and using subscores is insufficient reliability of subscores.…

  4. Reliability Generalization (RG) Analysis: The Test Is Not Reliable

    ERIC Educational Resources Information Center

    Warne, Russell

    2008-01-01

    Literature shows that most researchers are unaware of some of the characteristics of reliability. This paper clarifies some misconceptions by describing the procedures, benefits, and limitations of reliability generalization while using it to illustrate the nature of score reliability. Reliability generalization (RG) is a meta-analytic method…

  5. Reliability and diagnostic accuracy of history and physical examination for diagnosing glenoid labral tears.

    PubMed

    Walsworth, Matthew K; Doukas, William C; Murphy, Kevin P; Mielcarek, Billie J; Michener, Lori A

    2008-01-01

    Glenoid labral tears provide a diagnostic challenge. Combinations of items in the patient history and physical examination will provide stronger diagnostic accuracy to suggest the presence or absence of a glenoid labral tear than will individual items. Cohort study (diagnosis); Level of evidence, 1. History and examination findings in patients with shoulder pain (N = 55) were compared with arthroscopic findings to determine diagnostic accuracy and intertester reliability. The intertester reliability of the crank, anterior slide, and active compression tests was 0.20 to 0.24. A combined history of popping or catching and positive crank or anterior slide results yielded specificities of 0.91 and 1.00 and positive likelihood ratios of 3.0 and infinity, respectively. A positive anterior slide result combined with either a positive active compression or crank result yielded specificities of 0.91 and positive likelihood ratios of 2.75 and 3.75, respectively. Requiring only a single positive finding in the combination of popping or catching and the anterior slide or crank yielded sensitivities of 0.82 and 0.89 and negative likelihood ratios of 0.31 and 0.33, respectively. The diagnostic accuracy of individual tests in previous studies is quite variable, which may be explained in part by the modest reliability of these tests. The combination of popping or catching with a positive crank or anterior slide result, or a positive anterior slide result with a positive active compression or crank test result, suggests the presence of a labral tear. The combined absence of popping or catching and a negative anterior slide or crank result suggests the absence of a labral tear.
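
    The likelihood ratios quoted in this record follow mechanically from sensitivity and specificity: LR+ = sensitivity/(1 − specificity) and LR− = (1 − sensitivity)/specificity. A hedged sketch (the sensitivity value in the test example is assumed for illustration, not taken from the study):

```python
# Positive and negative likelihood ratios from test operating characteristics.
# LR+ = sens / (1 - spec); LR- = (1 - sens) / spec.
def likelihood_ratios(sensitivity, specificity):
    # a perfect specificity of 1.00 makes LR+ infinite, as the record reports
    lr_pos = float("inf") if specificity == 1.0 else sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg
```

    Note how a specificity of 1.00 yields an infinite LR+, matching the infinity reported above for one of the test combinations.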

  6. Maximum Entropy Discrimination Poisson Regression for Software Reliability Modeling.

    PubMed

    Chatzis, Sotirios P; Andreou, Andreas S

    2015-11-01

    Reliably predicting software defects is one of the most significant tasks in software engineering. Two of the major components of modern software reliability modeling approaches are: 1) extraction of salient features for software system representation, based on appropriately designed software metrics and 2) development of intricate regression models for count data, to allow effective software reliability data modeling and prediction. Surprisingly, research in the latter frontier of count data regression modeling has been rather limited. More specifically, a lack of simple and efficient algorithms for posterior computation has made the Bayesian approaches appear unattractive, and thus underdeveloped in the context of software reliability modeling. In this paper, we try to address these issues by introducing a novel Bayesian regression model for count data, based on the concept of max-margin data modeling, effected in the context of a fully Bayesian model treatment with simple and efficient posterior distribution updates. Our novel approach yields a more discriminative learning technique, making more effective use of our training data during model inference. In addition, it allows better handling of uncertainty in the modeled data, which can be a significant problem when the training data are limited. We derive elegant inference algorithms for our model under the mean-field paradigm and exhibit its effectiveness using the publicly available benchmark data sets.

  7. Declining water yield from forested mountain watersheds in response to climate change and forest mesophication

    Treesearch

    Peter V. Caldwell; Chelcy F. Miniat; Katherine J. Elliott; Wayne. T. Swank; Steven T. Brantley; Stephanie H. Laseter

    2016-01-01

    Climate change and forest disturbances are threatening the ability of forested mountain watersheds to provide the clean, reliable, and abundant fresh water necessary to support aquatic ecosystems and a growing human population. Here we used 76 years of water yield, climate, and field plot vegetation measurements in six unmanaged, reference watersheds in the southern...

  8. Specific yield: compilation of specific yields for various materials

    USGS Publications Warehouse

    Johnson, A.I.

    1967-01-01

    Specific yield is defined as the ratio of (1) the volume of water that a saturated rock or soil will yield by gravity to (2) the total volume of the rock or soil. Specific yield is usually expressed as a percentage. The value is not definitive, because the quantity of water that will drain by gravity depends on variables such as duration of drainage, temperature, mineral composition of the water, and various physical characteristics of the rock or soil under consideration. Values of specific yield nevertheless offer a convenient means by which hydrologists can estimate the water-yielding capacities of earth materials and, as such, are very useful in hydrologic studies. The present report consists mostly of direct or modified quotations from many selected reports that present and evaluate methods for determining specific yield, limitations of those methods, and results of the determinations made on a wide variety of rock and soil materials. Although no particular values are recommended in this report, a table summarizes values of specific yield, and their averages, determined for 10 rock textures. The following is an abstract of the table.
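
    The definition above is a direct ratio, so it can be transcribed as a one-line computation (the example volumes are illustrative only, not values from the report):

```python
# Specific yield per the record's definition: the volume of water a
# saturated rock or soil yields by gravity, divided by the total volume
# of the rock or soil, expressed as a percentage.
def specific_yield(drained_water_volume, total_volume):
    return 100.0 * drained_water_volume / total_volume
```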

  9. Yield and yield gaps in central U.S. corn production systems

    USDA-ARS?s Scientific Manuscript database

    The magnitude of yield gaps (YG) (potential yield – farmer yield) provides some indication of the prospects for increasing crop yield. Quantile regression analysis was applied to county maize (Zea mays L.) yields (1972 – 2011) from Kentucky, Iowa and Nebraska (irrigated) (total of 115 counties) to e...

  10. Transient generation of hydrogen peroxide is responsible for carcinostatic effects of hydrogen combined with platinum nanocolloid, together with increases in intracellular ROS, DNA cleavages, and proportion of G2/M-phase.

    PubMed

    Saitoh, Yasukazu; Ikeshima, Minoru; Kawasaki, Naho; Masumoto, Aoi; Miwa, Nobuhiko

    2016-01-01

    In our previous study, we demonstrated that combined treatment with hydrogen (H2) and platinum nanocolloid (Pt-nc) exerted markedly antiproliferative effects on cancer cells compared with each treatment alone. However, because the related mechanisms remain unclear, we investigated the carcinostatic mechanisms of the combined treatment with H2 + Pt-nc. Significant suppression of cell proliferation was confirmed at 52 h following combined treatment, and a similar effect was also observed for the 30- or 40-min transient treatment with H2 + Pt-nc. The transient treatments led to changes in cell size and morphology, loss of microvilli, and apoptosis-like cell death at 120 h after treatment. Moreover, transient combined treatment with H2 + Pt-nc induced cell-cycle arrest, as reflected by decreased proportions of G1-phase cells and accumulation of G2/M-phase cells. In contrast, intracellular peroxide levels were temporarily and significantly increased immediately after H2 + Pt-nc treatment but not after treatment with H2 or Pt-nc alone. Additionally, the carcinostatic effects induced by combined treatment were significantly diminished in the presence of catalase, and marked hydrogen peroxide (H2O2) generation was confirmed after mixing Pt-nc into cell culture media containing a high concentration of H2. These changes are in agreement with the finding that carcinostatic effects were induced after only 40 min of treatment with H2 + Pt-nc. Thus, transient and marked generation of H2O2 is responsible for the carcinostatic effects of combined treatment with H2 + Pt-nc.

  11. Sample-Averaged Biexciton Quantum Yield Measured by Solution-Phase Photon Correlation

    DOE PAGES

    Beyler, Andrew P.; Bischof, Thomas S.; Cui, Jian; ...

    2014-11-19

    The brightness of nanoscale optical materials such as semiconductor nanocrystals is currently limited in high excitation flux applications by inefficient multiexciton fluorescence. We have devised a solution-phase photon correlation measurement that can conveniently and reliably measure the average biexciton-to-exciton quantum yield ratio of an entire sample without user selection bias. This technique can be used to investigate the multiexciton recombination dynamics of a broad scope of synthetically underdeveloped materials, including those with low exciton quantum yields and poor fluorescence stability. Here, we have applied this method to measure weak biexciton fluorescence in samples of visible-emitting InP/ZnS and InAs/ZnS core/shell nanocrystals, and to demonstrate that a rapid CdS shell growth procedure can markedly increase the biexciton fluorescence of CdSe nanocrystals.

  12. Reliability of reflectance measures in passive filters

    NASA Astrophysics Data System (ADS)

    Saldiva de André, Carmen Diva; Afonso de André, Paulo; Rocha, Francisco Marcelo; Saldiva, Paulo Hilário Nascimento; Carvalho de Oliveira, Regiani; Singer, Julio M.

    2014-08-01

    Measurements of optical reflectance in passive filters impregnated with a reactive chemical solution may be transformed to ozone concentrations via a calibration curve and constitute a low-cost alternative for environmental monitoring, mainly to estimate human exposure. Given the possibility of errors caused by exposure bias, it is common to consider sets of m filters exposed during a certain period to estimate the latent reflectance on n different sample occasions at a certain location. Mixed models with sample occasions as random effects are useful to analyze data obtained under such setups. The intra-class correlation coefficient of the mean of the m measurements is an indicator of the reliability of the latent reflectance estimates. Our objective is to determine m in order to obtain a pre-specified reliability of the estimates, taking possible outliers into account. To illustrate the procedure, we consider an experiment conducted at the Laboratory of Experimental Air Pollution, University of São Paulo, Brazil (LPAE/FMUSP), where sets of m = 3 filters were exposed during 7 days on n = 9 different occasions at a certain location. The results show that the reliability of the latent reflectance estimates for each occasion obtained under homoskedasticity is km = 0.74. A residual analysis suggests that the within-occasion variance for two of the occasions should be different from the others. A refined model with two within-occasion variance components was considered, yielding km = 0.56 for these occasions and km = 0.87 for the remaining ones. To guarantee that all estimates have a reliability of at least 80%, we require measurements on m = 10 filters on each occasion.
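
    The choice of m above can be framed with the classical Spearman-Brown relation: the reliability of the mean of m parallel measurements, each with single-measurement reliability r, is mr/(1 + (m − 1)r). Solving for the smallest m that reaches a target reliability gives a quick sketch (an approximation; the paper's mixed-model treatment differs in detail):

```python
import math

def reliability_of_mean(r1, m):
    # Spearman-Brown: reliability of the mean of m parallel measurements
    return m * r1 / (1 + (m - 1) * r1)

def filters_needed(r1, target):
    # Smallest m with reliability_of_mean(r1, m) >= target, obtained by
    # solving target <= m*r1 / (1 + (m - 1)*r1) for m.
    return max(1, math.ceil(target * (1 - r1) / (r1 * (1 - target))))

# Back out the single-filter reliability implied by k_m = 0.56 for m = 3
# on the noisier occasions, i.e. solve 0.56 = 3r / (1 + 2r):
r1 = 0.56 / 1.88
```

    With this r1, filters_needed(r1, 0.80) returns 10, consistent with the record's conclusion that m = 10 filters per occasion are required for 80% reliability.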

  13. Satellite techniques yield insight into devastating rainfall from Hurricane Mitch

    NASA Astrophysics Data System (ADS)

    Ferraro, R.; Vicente, G.; Ba, M.; Gruber, A.; Scofield, R.; Li, Q.; Weldon, R.

    Hurricane Mitch may prove to be one of the most devastating tropical cyclones to affect the western hemisphere. Heavy rains over Central America from October 28, 1998, to November 1, 1998, caused widespread flooding and mud slides in Nicaragua and Honduras resulting in thousands of deaths and missing persons. News reports indicated entire towns being swept away, destruction of national economies and infrastructure, and widespread disease in the aftermath of the storm, which some estimates suggested dropped as much as 1300 mm of rain. However, in view of the widespread damage it is difficult to determine the actual amounts and distribution of rainfall. More accurate means of determining the rainfall associated with Mitch are vital for diagnosing and understanding the evolution of this disaster and for developing new mitigation strategies for future tropical cyclones. Satellite data may prove to be a reliable resource for accurate rainfall analysis and have yielded apparently reliable figures for Hurricane Mitch.

  14. Nanowire growth process modeling and reliability models for nanodevices

    NASA Astrophysics Data System (ADS)

    Fathi Aghdam, Faranak

    Nowadays, nanotechnology is becoming an inescapable part of everyday life. The major barrier to its rapid growth is our inability to produce nanoscale materials in a reliable and cost-effective way. In fact, the current yield of nano-devices is very low (around 10%), which makes fabrication of nano-devices very expensive and uncertain. To overcome this challenge, the first and most important step is to investigate how to control nano-structure synthesis variations. The main directions of reliability research in nanotechnology can be classified either from a material perspective or from a device perspective. The first direction focuses on restructuring materials and/or optimizing process conditions at the nano-level (nanomaterials). The other direction is linked to nano-devices and includes the creation of nano-electronic and electro-mechanical systems at nano-level architectures by taking into account the reliability of future products. In this dissertation, we have investigated two topics on both nano-materials and nano-devices. In the first research work, we have studied the optimization of one of the most important nanowire growth processes using statistical methods. Research on nanowire growth with patterned arrays of catalyst has shown that the wire-to-wire spacing is an important factor affecting the quality of resulting nanowires. To improve the process yield and the length uniformity of fabricated nanowires, it is important to reduce the resource competition between nanowires during the growth process. We have proposed a physical-statistical nanowire-interaction model considering the shadowing effect and shared substrate diffusion area to determine the optimal pitch that would ensure the minimum competition between nanowires. A sigmoid function is used in the model, and the least squares estimation method is used to estimate the model parameters. The estimated model is then used to determine the optimal spatial arrangement of catalyst arrays.

  15. Reliability model generator

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C. (Inventor); McMann, Catherine M. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.

  16. WaferOptics® mass volume production and reliability

    NASA Astrophysics Data System (ADS)

    Wolterink, E.; Demeyer, K.

    2010-05-01

    The Anteryon WaferOptics® technology platform combines imaging optics designs, materials, and metrologies with wafer-level Semicon & MEMS production methods. WaferOptics® first required completely new system engineering. This system closes the loop between application requirement specifications, Anteryon product specifications, Monte Carlo analysis, process windows, process controls, and supply reject criteria. Regarding the Anteryon product Integrated Lens Stack (ILS), new design rules, test methods, and control systems were assessed, implemented, validated, and released to customers for mass production. This includes novel reflowable materials, the mastering process, replication, bonding, dicing, assembly, metrology, reliability programs, and quality assurance systems. Many Design of Experiments runs were performed to assess correlations between optical performance parameters and machine settings of all process steps. Lens metrologies such as FFL, BFL, and MTF were adapted for wafer-level production, and wafer mapping was introduced for yield management. Test methods for screening and validating suitable optical materials were designed. Critical failure modes such as delamination and popcorning were assessed and modeled with FEM. Anteryon successfully managed to integrate the different technologies, moving from single prototypes to high-yield mass volume production. These parallel efforts resulted in a steep yield increase from 30% to over 90% in an 8-month period.

  17. Reliability of Autism-Tics, AD/HD, and other Comorbidities (A-TAC) inventory in a test-retest design.

    PubMed

    Larson, Tomas; Kerekes, Nóra; Selinus, Eva Norén; Lichtenstein, Paul; Gumpert, Clara Hellner; Anckarsäter, Henrik; Nilsson, Thomas; Lundström, Sebastian

    2014-02-01

    The Autism-Tics, AD/HD, and other Comorbidities (A-TAC) inventory is used in epidemiological research to assess neurodevelopmental problems and coexisting conditions. Although the A-TAC has been applied in various populations, data on retest reliability are limited. The objective of the present study was to present additional reliability data. The A-TAC was administered by lay assessors and was completed on two occasions by parents of 400 individual twins, with an average interval of 70 days between test sessions. Intra- and inter-rater reliability were analysed with intraclass correlations and Cohen's kappa. The A-TAC showed excellent test-retest intraclass correlations for both autism spectrum disorder and attention deficit hyperactivity disorder (each at .84). Most modules in the A-TAC had intra- and inter-rater reliability intraclass correlation coefficients of at least .60. Cohen's kappa indicated acceptable reliability. The current study provides statistical evidence that the A-TAC yields good test-retest reliability in a population-based cohort of children.
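    As a generic illustration of the chance-corrected agreement statistic mentioned above (this is not the A-TAC scoring procedure, and the rating data are hypothetical), Cohen's kappa for two raters can be computed as:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    # Proportion of subjects on which the two raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance from each rater's marginal frequencies
    categories = set(rater_a) | set(rater_b)
    expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (observed - expected) / (1 - expected)

# Hypothetical ratings for 8 subjects (1 = problem present, 0 = absent)
a = [0, 1, 0, 1, 1, 0, 0, 1]
b = [0, 1, 1, 1, 1, 0, 0, 0]
print(round(cohens_kappa(a, b), 2))  # → 0.5
```

    Here the raters agree on 6 of 8 subjects (75%), but since chance agreement is 50%, kappa credits only half of the possible above-chance agreement.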

  18. Why is it so difficult to determine the yield of indoor cannabis plantations? A case study from the Netherlands.

    PubMed

    Vanhove, Wouter; Maalsté, Nicole; Van Damme, Patrick

    2017-07-01

    Together, the Netherlands and Belgium are the largest indoor cannabis producing countries in Europe. In both countries, the legal prosecution of convicted illicit cannabis growers usually includes recovery of the profits gained. However, it is not easy to make a reliable estimate of those profits, due to the wide range of factors that determine indoor cannabis yields and eventual selling prices. In the Netherlands, since 2005, a reference model has been used that assumes a constant yield (g) per plant for a given indoor cannabis plant density. Later, in 2011, a new model was developed in Belgium for yield estimation of Belgian indoor cannabis plantations that assumes a constant yield per m² of growth surface, provided that a number of growth conditions are met. Indoor cannabis plantations in the Netherlands and Belgium share similar technical characteristics. As a result, for indoor cannabis plantations in both countries, the two yield estimation models should produce similar estimates. By means of a real-case study from the Netherlands, we show that the reliability of both models is hampered by a number of flaws and unmet preconditions. The Dutch model is based on a regression equation that makes use of ill-defined plant development stages, assumes linear plant growth, does not discriminate between plantation size categories and does not include other important yield-determining factors (such as fertilization). The Belgian model addresses some of these shortcomings, but its applicability is constrained by a number of preconditions, including a plantation size between 50 and 1000 plants; cultivation in individual pots with peat soil; 600 W (electrical power) assimilation lamps; a constant temperature between 20°C and 30°C; adequate fertilizer application; and plants unaffected by pests and diseases. Judiciary in both the Netherlands and Belgium require robust indoor cannabis yield models for adequate legal prosecution of

  19. Toward an Economic Definition of Sustainable Yield for Coastal Aquifers

    NASA Astrophysics Data System (ADS)

    Jenson, J. W.; Habana, N. C.; Lander, M.

    2016-12-01

    The concept of aquifer sustainable yield has long been criticized, debated, and even disparaged among groundwater hydrologists, but policy-makers and professional water resource managers inevitably ask them for unequivocal answers to such questions as "What is the absolute maximum volume of water that could be sustainably withdrawn from this aquifer?" We submit that it is therefore incumbent upon hydrologists to develop and offer valid practical definitions of sustainable yield that can be usefully applied to given conditions and types of aquifers. In coastal aquifers, water quality—in terms of salinity—is affected by changes in the natural water budget and the volume rate of artificial extraction. In principle, one can identify a family of assay curves for a given aquifer, showing the specific relationships between the quantity and quality of the water extracted under given conditions of recharge. The concept of the assay curve, borrowed from the literature of natural-resource extraction economics, has to our knowledge not yet found its way into the literature of applied hydrology. The relationships between recharge, extraction, and water quality that define the assay curve can be determined empirically from sufficient observations of groundwater response to recharge and extraction and can be estimated from models that have been reliably history-matched ("calibrated") to such data. We thus propose a working definition of sustainable yield for coastal aquifers in terms of the capacity that ultimately could be achieved by an ideal production system, given what is known or can be assumed about the natural limiting conditions. Accordingly, we also offer an approach for defining an ideal production system for a given aquifer, and demonstrate how observational data and/or modeling results can be used to develop assay curves of quality vs. quantity extracted, which can serve as reliable predictive tools for engineers, managers, regulators, and policy

  20. The shared and unique values of optical, fluorescence, thermal and microwave satellite data for estimating large-scale crop yields

    USDA-ARS?s Scientific Manuscript database

    Large-scale crop monitoring and yield estimation are important for both scientific research and practical applications. Satellite remote sensing provides an effective means for regional and global cropland monitoring, particularly in data-sparse regions that lack reliable ground observations and rep...

  1. The reliability and validity of the Caregiver Work Limitations Questionnaire.

    PubMed

    Lerner, Debra; Parsons, Susan K; Chang, Hong; Visco, Zachary L; Pawlecki, J Brent

    2015-01-01

    To test a new Caregiver Work Limitations Questionnaire (WLQ). Based on the original WLQ, this new survey instrument assesses the effect of caregiving for ill and/or disabled persons on the caregiver's work performance. A questionnaire was administered anonymously to employees of a large business services company. Scale reliability and validity were tested with psychometric methods. Of 4128 survey participants, 18.3% were current caregivers, 10.2% were past caregivers, and 71.5% were not caregivers. Current caregivers were limited in their ability to perform basic job tasks between a mean of 10.3% and 16.8% of the time. Confirmatory factor analysis yielded a scale structure similar to the WLQ's. Scale reliabilities (Cronbach's α) ranged from 0.91 to 0.95. The Caregiver WLQ is a new tool for understanding the workplace effect of caregiving.

  2. Measurement of fission yields and isomeric yield ratios at IGISOL

    NASA Astrophysics Data System (ADS)

    Pomp, Stephan; Mattera, Andrea; Rakopoulos, Vasileios; Al-Adili, Ali; Lantz, Mattias; Solders, Andreas; Jansson, Kaj; Prokofiev, Alexander V.; Eronen, Tommi; Gorelov, Dimitri; Jokinen, Ari; Kankainen, Anu; Moore, Iain D.; Penttilä, Heikki; Rinta-Antila, Sami

    2018-03-01

    Data on fission yields and isomeric yield ratios (IYR) are tools to study the fission process, in particular the generation of angular momentum. We use the IGISOL facility with the Penning trap JYFLTRAP in Jyväskylä, Finland, for such measurements on 232Th and natU targets. Previously published fission yield data from IGISOL concern the 232Th(p,f) and 238U(p,f) reactions at 25 and 50 MeV. Recently, a neutron source, using the Be(p,n) reaction, has been developed, installed and tested. We summarize the results for (p,f) focusing on the first measurement of IYR by direct ion counting. We also present first results for IYR and relative yields for Sn and Sb isotopes in the 128-133 mass range from natU(n,f) based on γ-spectrometry. We find a staggering behaviour in the cumulative yields for Sn and a shift in the independent fission yields for Sb as compared to current evaluations. Plans for the future experimental program on fission yields and IYR measurements are discussed.

  3. Slope Controls Grain Yield and Climatic Yield in Mountainous Yunnan province, China

    NASA Astrophysics Data System (ADS)

    Duan, X.; Rong, L.; Gu, Z.; Feng, D.

    2017-12-01

    Mountainous regions are increasingly vulnerable to food insecurity because of limited arable land, growing population pressure, and climate change. Development of sustainable mountain agriculture will require an increased understanding of the effects of environmental factors on grain and climatic yields. The objective of this study was to explore the relationships between actual grain yield, climatic yield, and environmental factors in a mountainous region in China. We collected data on the average grain yield per unit area in 119 counties in Yunnan province from 1985 to 2012, and chose 17 environmental factors for the same period. Our results showed that actual grain yield ranged from 1.43 to 6.92 t·ha-1, and the climatic yield ranged from -0.15 to -0.01 t·ha-1. Lower climatic yield but higher grain yield was generally found in central areas and at lower slopes and elevations in the western and southwestern counties of Yunnan province. Higher climatic yield but lower grain yield was found in northwestern parts of Yunnan province on steep slopes. Annual precipitation and temperature had a weak influence on the climatic yield. Slope explained 44.62% and 26.29% of the variation in grain yield and climatic yield, respectively. The effects of topography on grain and climatic yields were greater than those of climatic factors. Slope was the most important environmental variable for the variability in climatic and grain yields in mountainous Yunnan province due to the highly heterogeneous topographic conditions. Conversion of slopes to terraces in areas with higher climatic yields is an effective way to maintain grain production in response to climate variability. Additionally, soil amendments and soil and water conservation measures should be considered to maintain soil fertility and aid sustainable development in central areas, and in counties at lower slopes and elevations in western and southwestern Yunnan province.

  4. Development of a nanosatellite de-orbiting system by reliability based design optimization

    NASA Astrophysics Data System (ADS)

    Nikbay, Melike; Acar, Pınar; Aslan, Alim Rüstem

    2015-12-01

    This paper presents design approaches to develop a reliable and efficient de-orbiting system for the 3USAT nanosatellite to provide a beneficial orbital decay process at the end of a mission. A de-orbiting system is initially designed by employing the aerodynamic drag augmentation principle where the structural constraints of the overall satellite system and the aerodynamic forces are taken into account. Next, an alternative de-orbiting system is designed with new considerations and further optimized using deterministic and reliability based design techniques. For the multi-objective design, the objectives are chosen to maximize the aerodynamic drag force through the maximization of the Kapton surface area while minimizing the de-orbiting system mass. The constraints are related in a deterministic manner to the required deployment force, the height of the solar panel hole and the deployment angle. The length and the number of layers of the deployable Kapton structure are used as optimization variables. In the second stage of this study, uncertainties related to both manufacturing and operating conditions of the deployable structure in space environment are considered. These uncertainties are then incorporated into the design process by using different probabilistic approaches such as Monte Carlo Simulation, the First-Order Reliability Method and the Second-Order Reliability Method. The reliability based design optimization seeks optimal solutions using the former design objectives and constraints with the inclusion of a reliability index. Finally, the de-orbiting system design alternatives generated by different approaches are investigated and the reliability based optimum design is found to yield the best solution since it significantly improves both system reliability and performance requirements.

  5. Benchmarks and Reliable DFT Results for Spin Gaps of Small Ligand Fe(II) Complexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Suhwan; Kim, Min-Cheol; Sim, Eunji

    2017-05-01

    All-electron fixed-node diffusion Monte Carlo provides benchmark spin gaps for four Fe(II) octahedral complexes. Standard quantum chemical methods (semilocal DFT and CCSD(T)) fail badly for the energy difference between their high- and low-spin states. Density-corrected DFT is both significantly more accurate and reliable, and yields a consistent prediction for the Fe-Porphyrin complex.

  6. Simulated Impacts of Climate Change on Water Use and Yield of Irrigated Sugarcane in South Africa

    NASA Technical Reports Server (NTRS)

    Jones, M.R; Singels, A.; Ruane, A. C.

    2015-01-01

    Reliable predictions of climate change impacts on water use, irrigation requirements and yields of irrigated sugarcane in South Africa (a water-scarce country) are necessary to plan adaptation strategies. Although previous work has been done in this regard, methodologies and results vary considerably. The objectives were (1) to estimate likely impacts of climate change on sugarcane yields, water use and irrigation demand at three irrigated sugarcane production sites in South Africa (Malelane, Pongola and La Mercy) for current (1980-2010) and future (2070-2100) climate scenarios, using an approach based on the Agricultural Model Inter-comparison and Improvement Project (AgMIP) protocols; and (2) to assess the suitability of this methodology for investigating climate change impacts on sugarcane production. Future climate datasets were generated using the Delta downscaling method and three Global Circulation Models (GCMs), assuming an atmospheric CO2 concentration [CO2] of 734 ppm (A2 emissions scenario). Yield and water use were simulated using the DSSAT-Canegro v4.5 model. Irrigated cane yields are expected to increase at all three sites (between 11 and 14%), primarily due to increased interception of radiation as a result of accelerated canopy development. Evapotranspiration and irrigation requirements increased by 11% due to increased canopy cover and evaporative demand. Sucrose yields are expected to decline because of increased consumption of photo-assimilate for structural growth and maintenance respiration. Crop responses in canopy development and yield formation differed markedly between the crop cycles investigated. Possible agronomic implications of these results include reduced weed control costs due to shortened periods of partial canopy, a need for improved efficiency of irrigation to counter increased demands, and adjustments to ripening and harvest practices to counter decreased cane quality and optimize productivity. Although the Delta climate data

  7. Increasing Crop Yields in Water Stressed Countries by Combining Operations of Freshwater Reservoir and Wastewater Reclamation Plant

    NASA Astrophysics Data System (ADS)

    Bhushan, R.; Ng, T. L.

    2015-12-01

    Freshwater resources around the world are increasing in scarcity due to population growth, industrialization and climate change. This is a serious concern for water stressed countries, including those in Asia and North Africa where future food production is expected to be negatively affected by this. To address this problem, we investigate the potential of combining freshwater reservoir and wastewater reclamation operations. Reservoir water is the cheaper source of irrigation, but is often limited and climate sensitive. Treated wastewater is a more reliable alternative for irrigation, but often requires extensive further treatment which can be expensive. We propose combining the operations of a reservoir and a wastewater reclamation plant (WWRP) to augment the supply from the reservoir with reclaimed water for increasing crop yields in water stressed regions. The joint system of reservoir and WWRP is modeled as a multi-objective optimization problem with the double objective of maximizing the crop yield and minimizing total cost, subject to constraints on reservoir storage, spill and release, and capacity of the WWRP. We use the crop growth model Aquacrop, supported by The Food and Agriculture Organization of the United Nations (FAO), to model crop growth in response to water use. Aquacrop considers the effects of water deficit on crop growth stages, and from there estimates crop yield. We generate results comparing total crop yield under irrigation with water from just the reservoir (which is limited and often interrupted), and yield with water from the joint system (which has the potential of higher supply and greater reliability). We will present results for locations in India and Africa to evaluate the potential of the joint operations for improving food security in those areas for different budgets.

  8. 75 FR 71613 - Mandatory Reliability Standards for Interconnection Reliability Operating Limits

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-24

    ... Reliability Standards. The proposed Reliability Standards were designed to prevent instability, uncontrolled... Reliability Standards.\\2\\ The proposed Reliability Standards were designed to prevent instability... the SOLs, which if exceeded, could expose a widespread area of the bulk electric system to instability...

  9. Calculations of reliability predictions for the Apollo spacecraft

    NASA Technical Reports Server (NTRS)

    Amstadter, B. L.

    1966-01-01

    A new method of reliability prediction for complex systems is defined. Calculation of both upper and lower bounds are involved, and a procedure for combining the two to yield an approximately true prediction value is presented. Both mission success and crew safety predictions can be calculated, and success probabilities can be obtained for individual mission phases or subsystems. Primary consideration is given to evaluating cases involving zero or one failure per subsystem, and the results of these evaluations are then used for analyzing multiple failure cases. Extensive development is provided for the overall mission success and crew safety equations for both the upper and lower bounds.
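    The bounding idea described above can be sketched generically. The following is an illustrative series-system calculation with hypothetical subsystem reliabilities, not the paper's actual Apollo equations:

```python
import math

def series_reliability_bounds(subsystem_reliabilities):
    """Crude bounds on mission success probability for a series system.
    Lower bound: Boole's (union) bound on the any-failure probability.
    Upper bound: the system cannot beat its weakest subsystem."""
    lower = max(0.0, 1.0 - sum(1.0 - r for r in subsystem_reliabilities))
    upper = min(subsystem_reliabilities)
    return lower, upper

rs = [0.999, 0.995, 0.998]     # hypothetical subsystem reliabilities
lo, hi = series_reliability_bounds(rs)
exact = math.prod(rs)          # exact value if subsystem failures are independent
print(lo <= exact <= hi)       # the true value falls between the bounds
```

    A combined prediction can then be formed from the two bounds, which is the spirit of the approach the abstract describes.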

  10. Reliability Growth and Its Applications to Dormant Reliability

    DTIC Science & Technology

    1981-12-01

    ability to make projections about future reliability (Ref 9:41-42). Barlow and Scheuer Model. Richard E. Barlow and Ernest M. Scheuer, of the University...Reliability Growth Prediction Models," Operations Research, 18(1):52-65 (January/February 1970). 7. Bauer, John, William Hadley, and Robert Dietz... Texarkana, Texas, May 1973. (AD 768 119). 10. Bonis, Austin J. "Reliability Growth Curves for One Shot Devices," Proceedings 1977 Annual Reliability and

  11. Specific energy yield comparison between crystalline silicon and amorphous silicon based PV modules

    NASA Astrophysics Data System (ADS)

    Ferenczi, Toby; Stern, Omar; Hartung, Marianne; Mueggenburg, Eike; Lynass, Mark; Bernal, Eva; Mayer, Oliver; Zettl, Marcus

    2009-08-01

    As emerging thin-film PV technologies continue to penetrate the market and the number of utility-scale installations substantially increases, a detailed understanding of the performance of the various PV technologies becomes more important. An accurate database for each technology is essential for precise project planning, energy yield prediction and project financing. However, recent publications showed that it is very difficult to get accurate and reliable performance data for these technologies. This paper evaluates previously reported claims that amorphous silicon based PV modules have a higher annual energy yield relative to their rated performance than crystalline silicon modules. In order to acquire a detailed understanding of this effect, outdoor module tests were performed at GE Global Research Center in Munich. In this study we examine closely two of the five reported factors that contribute to the enhanced energy yield of amorphous silicon modules. We find evidence to support each of these factors and evaluate their relative significance. We discuss aspects for improvement in how PV modules are sold and identify areas for further study.

  12. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
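    A minimal, self-contained sketch of the importance-sampling idea, using a fixed sampling density centered on the failure boundary of a one-variable problem (an illustration of the principle, not the paper's adaptive scheme):

```python
import math
import random

def failure_prob_is(beta=3.0, n=200_000, seed=1):
    """Estimate p_f = P[X > beta] for X ~ N(0, 1) by sampling from
    N(beta, 1), a density shifted onto the failure boundary, and
    reweighting each failure sample by the likelihood ratio."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        x = rng.gauss(beta, 1.0)        # sample from the shifted density
        if x > beta:                    # indicator of the failure domain
            # likelihood ratio phi(x) / phi(x - beta)
            acc += math.exp(-0.5 * x * x + 0.5 * (x - beta) ** 2)
    return acc / n

p_f = failure_prob_is()
exact = 0.5 * math.erfc(3.0 / math.sqrt(2.0))   # Phi(-3), about 1.35e-3
print(abs(p_f - exact) / exact < 0.05)          # within a few percent
```

    Because nearly half the samples land in the failure domain, the estimator reaches percent-level accuracy with far fewer samples than crude Monte Carlo, which would see roughly one failure per 740 draws here.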

  13. Cross-cultural Adaptation, Reliability, and Validity of the Yoruba Version of the Roland-Morris Disability Questionnaire.

    PubMed

    Mbada, Chidozie Emmanuel; Idowu, Opeyemi Ayodiipo; Ogunjimi, Olawale Richard; Ayanniyi, Olusola; Orimolade, Elkanah Ayodele; Oladiran, Ajibola Babatunde; Johnson, Olubusola Esther; Akinsulore, Adesanmi; Oni, Temitope Olawale

    2017-04-01

    A translation, cross-cultural adaptation, and psychometric analysis. The aim of this study was to translate, cross-culturally adapt, and validate the Yoruba version of the RMDQ. The Roland-Morris Disability Questionnaire (RMDQ) is a valid outcome tool for low back pain (LBP) in clinical and research settings. There seems to be no valid and reliable version of the RMDQ in the Nigerian languages. Following the Guillemin criteria, the English version of the RMDQ was forward and back translated. Two Yoruba translated versions of the RMDQ were assessed for clarity, common language usage, and conceptual equivalence. Consequently, a harmonized Yoruba version was produced and pilot-tested among 20 patients with nonspecific long-term LBP (NSLBP) for cognitive debriefing. The final version of the Yoruba RMDQ was tested for its construct validity and test-retest reliability among 120 and 87 patients with NSLBP, respectively. A Pearson product moment correlation coefficient (r) of 0.82 was obtained for the reliability of the Yoruba version of the RMDQ. The test-retest reliability of the Yoruba RMDQ yielded a Cronbach's α of 0.932, while the intraclass correlation (ICC) ranged between 0.896 and 0.956. The analysis of the global scores of both the English and Yoruba versions of the RMDQ yielded an ICC value of 0.995 (95% confidence interval 0.996-0.997), with item-by-item kappa agreement ranging between 0.824 and 1.000. The external validity of the RMDQ against the Quadruple Visual Analogue Scale was r = -0.596 (P = 0.001). The Yoruba version of the RMDQ had no floor/ceiling effects, as no patient achieved either the maximum or the minimum possible score. The Yoruba version of the RMDQ has excellent reliability and validity and may be an appropriate outcome tool for clinical and research purposes among Yoruba-speaking patients with LBP. Level of evidence: 3.
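    For reference, the internal-consistency statistic reported above can be computed directly from raw item scores. The data below are hypothetical, not from the RMDQ study:

```python
def cronbach_alpha(items):
    """Cronbach's alpha from item scores.
    items[i][j] is item i's score for respondent j; population
    variances (divide by N) are used throughout."""
    k = len(items)
    n = len(items[0])

    def variance(xs):
        mean = sum(xs) / len(xs)
        return sum((x - mean) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(variance(item) for item in items)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return (k / (k - 1)) * (1.0 - item_var_sum / variance(totals))

# Three hypothetical yes/no items answered by five respondents
items = [
    [1, 1, 0, 0, 1],
    [1, 1, 0, 1, 1],
    [1, 0, 0, 0, 1],
]
print(round(cronbach_alpha(items), 2))  # → 0.79
```

    Alpha approaches 1 as item variances become small relative to the variance of the respondents' total scores, i.e. as the items rise and fall together.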

  14. Software Reliability 2002

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores R.

    2003-01-01

    In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is: "What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods?" Two approaches to software reliability engineering appear promising. The first study, begun in FY01, is based in hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated the mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data that may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedules. Assessing and estimating reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.

  15. Evaluating the capabilities of watershed-scale models in estimating sediment yield at field-scale.

    PubMed

    Sommerlot, Andrew R; Nejadhashemi, A Pouyan; Woznicki, Sean A; Giri, Subhasis; Prohaska, Michael D

    2013-09-30

    Many watershed model interfaces have been developed in recent years for predicting field-scale sediment loads. They share the goal of providing data for decisions aimed at improving watershed health and the effectiveness of water quality conservation efforts. The objectives of this study were to: 1) compare three watershed-scale models (the Soil and Water Assessment Tool (SWAT), Field_SWAT, and the High Impact Targeting (HIT) model) against a calibrated field-scale model (RUSLE2) in estimating sediment yield from 41 randomly selected agricultural fields within the River Raisin watershed; 2) evaluate the statistical significance of differences among models; 3) assess the watershed models' capabilities in identifying areas of concern at the field level; and 4) evaluate the reliability of the watershed-scale models for field-scale analysis. The SWAT model produced the estimates most similar to RUSLE2, providing the closest median and the lowest absolute error in sediment yield predictions, while the HIT model estimates were the worst. Concerning statistically significant differences between models, SWAT was the only model found to be not significantly different from the calibrated RUSLE2 at α = 0.05. Meanwhile, all models were incapable of identifying priority areas similar to the RUSLE2 model. Overall, SWAT provided the most correct estimates (51%) within the uncertainty bounds of RUSLE2 and is the most reliable among the studied models, while HIT is the least reliable. The results of this study suggest caution should be exercised when using watershed-scale models for field-level decision-making, where field-specific data are of paramount importance. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Atomic Oxygen Erosion Yield Prediction for Spacecraft Polymers in Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Banks, Bruce A.; Backus, Jane A.; Manno, Michael V.; Waters, Deborah L.; Cameron, Kevin C.; deGroh, Kim K.

    2009-01-01

    The ability to predict the atomic oxygen erosion yield of polymers based on their chemistry and physical properties has been only partially successful because of a lack of reliable low Earth orbit (LEO) erosion yield data. Unfortunately, many of the early experiments did not utilize dehydrated mass loss measurements for erosion yield determination, and the resulting mass loss due to atomic oxygen exposure may have been compromised because samples were often not in consistent states of dehydration during the pre-flight and post-flight mass measurements. This is a particular problem for short-duration mission exposures or low erosion yield materials. However, as a result of the retrieval of the Polymer Erosion and Contamination Experiment (PEACE) flown as part of the Materials International Space Station Experiment 2 (MISSE 2), the erosion yields of 38 polymers and pyrolytic graphite were accurately measured. The experiment was exposed to the LEO environment for 3.95 years from August 16, 2001 to July 30, 2005 and was successfully retrieved during a space walk on July 30, 2005 during Discovery's STS-114 Return to Flight mission. The 40 different materials tested (including Kapton H fluence witness samples) were selected specifically to represent a variety of polymers used in space as well as a wide variety of polymer chemical structures. The MISSE 2 PEACE Polymers experiment used carefully dehydrated mass measurements, as well as accurate density measurements, to obtain accurate erosion yield data for a high fluence (8.43 × 10^21 atoms/cm^2). The resulting data were used to develop an erosion yield predictive tool with a correlation coefficient of 0.895 and an uncertainty of ±6.3 × 10^-25 cm^3/atom. The predictive tool utilizes the chemical structures and physical properties of polymers to predict in-space atomic oxygen erosion yields. A predictive tool concept (September 2009 version) is presented which represents an improvement over an earlier (December 2008) version.

  17. Uncertainty quantification and reliability assessment in operational oil spill forecast modeling system.

    PubMed

    Hou, Xianlong; Hodges, Ben R; Feng, Dongyu; Liu, Qixiao

    2017-03-15

    As oil transport increases in the Texas bays, greater risks of ship collisions will become a challenge, yielding oil spill accidents as a consequence. To minimize the ecological damage and optimize rapid response, emergency managers need to be informed of how fast and where oil will spread as soon as possible after a spill. The state-of-the-art operational oil spill forecast modeling system improves oil spill response into a new stage. However, uncertainty in the predicted data inputs often compromises the reliability of the forecast result, leading to misdirection in contingency planning. Thus, understanding forecast uncertainty and reliability becomes significant. In this paper, Monte Carlo simulation is implemented to provide parameters to generate forecast probability maps. The oil spill forecast uncertainty is thus quantified by comparing the forecast probability map and the associated hindcast simulation. A HyosPy-based simple statistical model is developed to assess the reliability of an oil spill forecast in terms of belief degree. The technologies developed in this study create a prototype for uncertainty and reliability analysis in numerical oil spill forecast modeling systems, enabling emergency managers to improve the capability of real-time operational oil spill response and impact assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
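    The probability-map idea can be illustrated with a toy one-dimensional Monte Carlo ensemble. All parameters below are invented for illustration; the operational system advects oil on 2-D hydrodynamic fields:

```python
import random

def spill_probability_map(n_cells=10, n_runs=2000, drift=0.4,
                          sigma=0.2, steps=20, seed=7):
    """Fraction of ensemble runs in which the spill front reaches each
    downstream cell, given an uncertain per-step drift."""
    rng = random.Random(seed)
    reached = [0] * n_cells
    for _ in range(n_runs):
        extent = 0.0
        for _ in range(steps):
            extent += max(0.0, rng.gauss(drift, sigma))  # front only advances
        # Every cell up to the front's final position counts as reached
        for cell in range(min(int(extent) + 1, n_cells)):
            reached[cell] += 1
    return [count / n_runs for count in reached]

prob_map = spill_probability_map()
# Probabilities decay monotonically with distance from the spill site
print(all(p1 >= p2 for p1, p2 in zip(prob_map, prob_map[1:])))
```

    Each cell's value is the belief, over the ensemble, that oil arrives there, which is the kind of quantity the forecast probability maps in the abstract convey to responders.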

  18. The yield and post-yield behavior of high-density polyethylene

    NASA Technical Reports Server (NTRS)

    Semeliss, M. A.; Wong, R.; Tuttle, M. E.

    1990-01-01

    An experimental and analytical evaluation was made of the yield and post-yield behavior of high-density polyethylene, a semi-crystalline thermoplastic. Polyethylene was selected for study because it is very inexpensive and readily available in the form of thin-walled tubes. Thin-walled tubular specimens were subjected to axial loads and internal pressures, such that the specimens were subjected to a known biaxial loading. A constant octahedral shear stress rate was imposed during all tests. The measured yield and post-yield behavior was compared with predictions based on both isotropic and anisotropic models. Of particular interest was whether inelastic behavior was sensitive to the hydrostatic stress level. The major achievements and conclusions reached are discussed.

  19. High-yield exfoliation of tungsten disulphide nanosheets by rational mixing of low-boiling-point solvents

    NASA Astrophysics Data System (ADS)

    Sajedi-Moghaddam, Ali; Saievar-Iranizad, Esmaiel

    2018-01-01

    Developing high-throughput, reliable, and facile approaches for producing atomically thin sheets of transition metal dichalcogenides is of great importance to pave the way for their use in real applications. Here, we report a highly promising route for exfoliating two-dimensional tungsten disulphide sheets by using a binary combination of low-boiling-point solvents. Experimental results show a significant dependence of exfoliation yield on the type of solvents as well as the relative volume fraction of each solvent. The highest yield was found for an appropriate combination of isopropanol/water (20 vol% isopropanol and 80 vol% water), approximately 7 times higher than in pure isopropanol and 4 times higher than in pure water. The dramatic increase in exfoliation yield can be attributed to the close match between the surface tension of tungsten disulphide and that of the binary solvent system. Furthermore, solvent molecular size also has a profound impact on the exfoliation efficiency, due to steric repulsion.

  20. An adapted yield criterion for the evolution of subsequent yield surfaces

    NASA Astrophysics Data System (ADS)

    Küsters, N.; Brosius, A.

    2017-09-01

    In the numerical analysis of sheet metal forming processes, anisotropic material behaviour is often modelled with isotropic work hardening and an average Lankford coefficient. In contrast, experimental observations show an evolution of the Lankford coefficients, which can be associated with a yield surface change due to kinematic and distortional hardening. Commonly, extensive efforts are required to describe these phenomena. In this paper an isotropic material model based on the Yld2000-2d criterion is adapted with an evolving yield exponent in order to change the yield surface shape. The yield exponent is linked to the accumulated plastic strain. This change has the effect of rotating the yield surface normal. As the normal is directly related to the Lankford coefficient, the change can be used to model the evolution of the Lankford coefficient during yielding. The paper focuses on the numerical implementation of the adapted material model for the FE-code LS-Dyna, mpi-version R7.1.2-d. A recently introduced identification scheme [1] is used to obtain the parameters for the evolving yield surface and is briefly described for the proposed model. The suitability for numerical analysis is discussed for deep drawing processes in general. Efforts for material characterization and modelling are compared to those of other common yield surface descriptions. Besides experimental effort and achieved accuracy, the potential flexibility of the material model and the risk of ambiguity during identification are of major interest in this paper.
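    For reference, the plane-stress Yld2000-2d effective stress has the standard Barlat et al. form below; making the exponent a function of accumulated plastic strain is the adaptation described above, and the saturation law shown is an illustrative assumption, not the paper's calibrated form:

```latex
\phi = |X'_1 - X'_2|^{a} + |2X''_2 + X''_1|^{a} + |2X''_1 + X''_2|^{a}
     = 2\,\bar{\sigma}^{a},
\qquad
a = a(\bar{\varepsilon}_p), \quad \text{e.g.}\;
a(\bar{\varepsilon}_p) = a_0 + (a_\infty - a_0)\bigl(1 - e^{-c\,\bar{\varepsilon}_p}\bigr),
```

    where X'_i and X''_i are the principal values of two linear transformations of the stress deviator. Raising a flattens the yield surface toward a Tresca-like shape, which rotates the surface normal at a fixed stress state and hence changes the predicted Lankford coefficient.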

  1. 78 FR 41339 - Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-10

    ...] Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards AGENCY: Federal... Reliability Standards identified by the North American Electric Reliability Corporation (NERC), the Commission-certified Electric Reliability Organization. FOR FURTHER INFORMATION CONTACT: Kevin Ryan (Legal Information...

  2. Assimilating a synthetic Kalman filter leaf area index series into the WOFOST model to improve regional winter wheat yield estimation

    USDA-ARS?s Scientific Manuscript database

    The scale mismatch between remotely sensed observations and the state variables simulated by crop growth models decreases the reliability of crop yield estimates. To overcome this problem, we used two data assimilation phases: first, we generated a complete leaf area index (LAI) time series by combin...

  3. [Prediction of the side-cut product yield of atmospheric/vacuum distillation unit by NIR crude oil rapid assay].

    PubMed

    Wang, Yan-Bin; Hu, Yu-Zhong; Li, Wen-Le; Zhang, Wei-Song; Zhou, Feng; Luo, Zhi

    2014-10-01

    In the present paper, a method to predict the side-cut product yields of an atmospheric/vacuum distillation unit was developed based on the near-infrared (NIR) rapid crude assay technique, combined with H/CAMS software. Firstly, an NIR spectroscopy method for rapidly determining the true boiling point of crude oil was developed. With a commercially available crude oil spectroscopy database and experimental tests from Guangxi Petrochemical Company, a calibration model was established using a topological calibration method. The model can be employed to predict the true boiling point of crude oil. Secondly, the true boiling point from the NIR rapid assay was converted to the side-cut product yields of the atmospheric/vacuum distillation unit by the H/CAMS software. The predicted and actual yields of the distillation products naphtha, diesel, wax, and residual oil were compared over a 7-month period. The results showed that the NIR rapid crude assay can predict the side-cut product yields accurately. The NIR analytic method for predicting yield has the advantages of fast analysis, reliable results, and easy online operation, and it can provide elementary data for refinery planning optimization and crude oil blending.

  4. 76 FR 42534 - Mandatory Reliability Standards for Interconnection Reliability Operating Limits; System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-19

    ... Reliability Operating Limits; System Restoration Reliability Standards AGENCY: Federal Energy Regulatory... data necessary to analyze and monitor Interconnection Reliability Operating Limits (IROL) within its... Interconnection Reliability Operating Limits, Order No. 748, 134 FERC ] 61,213 (2011). \\2\\ The term ``Wide-Area...

  5. Using multivariate generalizability theory to assess the effect of content stratification on the reliability of a performance assessment.

    PubMed

    Keller, Lisa A; Clauser, Brian E; Swanson, David B

    2010-12-01

    In recent years, demand for performance assessments has continued to grow. However, performance assessments are notorious for lower reliability, and in particular, low reliability resulting from task specificity. Since reliability analyses typically treat the performance tasks as randomly sampled from an infinite universe of tasks, these estimates of reliability may not be accurate. For tests built according to a table of specifications, tasks are randomly sampled from different strata (content domains, skill areas, etc.). If these strata remain fixed in the test construction process, ignoring this stratification in the reliability analysis results in an underestimate of "parallel forms" reliability and an overestimate of the person-by-task component. This research explores the effect of appropriately representing, or misrepresenting, the stratification in the estimation of reliability and the standard error of measurement. Both multivariate and univariate generalizability studies are reported. Results indicate that proper specification of the analytic design is essential to obtaining the proper information both about the generalizability of the assessment and about the standard error of measurement. Further, illustrative D studies present the effect under a variety of situations and test designs. Additional benefits of multivariate generalizability theory in test design and evaluation are also discussed.

  6. Acid soil infertility effects on peanut yields and yield components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blamey, F.P.C.

    1983-01-01

    The interpretation of soil amelioration experiments with peanuts is made difficult by the unpredictability of the crop and by the many factors altered when ameliorating acid soils. The present study was conducted to investigate the effects of lime and gypsum applications on peanut kernel yield via the three first-order yield components: pods per ha, kernels per pod, and kernel mass. On an acid medium sandy loam soil (Typic Plinthustult), liming resulted in a highly significant kernel yield increase of 117%, whereas gypsum applications were of no significant benefit. As indicated by path coefficient analysis, an increase in the number of pods per ha was markedly more important in increasing yield than an increase in either the number of kernels per pod or kernel mass. Furthermore, exch. Al was found to be particularly detrimental to pod number. It was postulated that poor peanut yields resulting from acid soil infertility were mainly due to the depressive effect of exch. Al on pod number. Exch. Ca appeared to play a secondary role by ameliorating the adverse effects of exch. Al.

  7. Assessing the Measurement Properties of the Principal Instructional Management Rating Scale: A Meta-Analysis of Reliability Studies

    ERIC Educational Resources Information Center

    Hallinger, Phillip; Wang, Wen-Chung; Chen, Chia-Wen

    2013-01-01

    Background: In a recent article, Hallinger (2011b) reviewed 135 empirical studies that had employed the Principal Instructional Management Rating Scale (PIMRS) over the prior three decades. The author concluded that the PIMRS appeared to have attained a consistent record of yielding reliable and valid data on principal instructional leadership.…

  8. Reliability of fully automated versus visually controlled pre- and post-processing of resting-state EEG.

    PubMed

    Hatz, F; Hardmeier, M; Bousleiman, H; Rüegg, S; Schindler, C; Fuhr, P

    2015-02-01

    To compare the reliability of a newly developed Matlab® toolbox for the fully automated pre- and post-processing of resting-state EEG (automated analysis, AA) with the reliability of analysis involving visually controlled pre- and post-processing (VA). 34 healthy volunteers (age: median 38.2 (20-49), 82% female) underwent three consecutive 256-channel resting-state EEG recordings at one-year intervals. Results of frequency analysis by AA and VA were compared with Pearson correlation coefficients, and reliability over time was assessed with intraclass correlation coefficients (ICC). The mean correlation coefficient between AA and VA was 0.94±0.07; the mean ICC was 0.83±0.05 for AA and 0.84±0.07 for VA. AA and VA yield very similar results for spectral EEG analysis and are equally reliable. AA is less time-consuming, completely standardized, and independent of raters and their training. Automated processing of EEG facilitates workflow in quantitative EEG analysis. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
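    The agreement statistics used here are standard. As an illustration, ICC(3,k) (two-way mixed, consistency, average measures) can be computed from ANOVA mean squares; this is a generic sketch on synthetic data, not the authors' Matlab toolbox:

```python
import numpy as np

def icc_3k(data):
    """ICC(3,k): two-way mixed model, consistency, average measures.
    data: subjects x sessions matrix of repeated measurements."""
    n, k = data.shape
    grand = data.mean()
    ms_rows = k * ((data.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    resid = (data - data.mean(axis=1, keepdims=True)
                  - data.mean(axis=0, keepdims=True) + grand)
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / ms_rows

# Synthetic example: 30 subjects, 3 yearly sessions of a stable trait
rng = np.random.default_rng(1)
trait = rng.normal(10, 2, (30, 1))               # stable between-subject signal
sessions = trait + rng.normal(0, 0.8, (30, 3))   # session-to-session noise
print(round(icc_3k(sessions), 2))
```

    With between-subject variance well above the session noise, the ICC lands near 1; as noise grows relative to the trait, it falls toward 0.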

  9. Test-retest reliability and cross validation of the functioning everyday with a wheelchair instrument.

    PubMed

    Mills, Tamara L; Holm, Margo B; Schmeler, Mark

    2007-01-01

    The purpose of this study was to establish the test-retest reliability and content validity of an outcomes tool designed to measure the effectiveness of seating-mobility interventions on the functional performance of individuals who use wheelchairs or scooters as their primary seating-mobility device. The instrument, Functioning Everyday With a Wheelchair (FEW), is a questionnaire designed to measure perceived user function related to wheelchair/scooter use. Using consumer-generated items, FEW Beta Version 1.0 was developed and test-retest reliability was established. Cross-validation of FEW Beta Version 1.0 was then carried out with five samples of seating-mobility users to establish content validity. Based on the content validity study, FEW Version 2.0 was developed and administered to seating-mobility consumers to examine its test-retest reliability. FEW Beta Version 1.0 yielded an intraclass correlation coefficient (ICC) Model (3,k) of .92, p < .001, and the content validity results revealed that FEW Beta Version 1.0 captured 55% of seating-mobility goals reported by consumers across five samples. FEW Version 2.0 yielded ICC(3,k) = .86, p < .001, and captured 98.5% of consumers' seating-mobility goals. The cross-validation study identified new categories of seating-mobility goals for inclusion in FEW Version 2.0, and the content validity of FEW Version 2.0 was confirmed. FEW Beta Version 1.0 and FEW Version 2.0 were highly stable in their measurement of participants' seating-mobility goals over a 1-week interval.

  10. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.

    1984-01-01

    A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10-year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. Described are the numerous factors that potentially have a degrading effect on system reliability, and the ways in which these factors, peculiar to highly reliable fault-tolerant systems, are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  11. Yield Advances in Peanut

    USDA-ARS?s Scientific Manuscript database

    Average yields of peanut in the U.S. set an all time record of 4,695 kg ha-1 in 2012. This far exceeded the previous record yield of 3,837 kg ha-1 in 2008. Favorable weather conditions undoubtedly contributed to the record yields in 2012; however, these record yields would not have been achievable...

  12. Si-nanocrystal-based nanofluids for nanothermometry

    NASA Astrophysics Data System (ADS)

    Cardona-Castro, M. A.; Morales-Sánchez, A.; Licea-Jiménez, L.; Alvarez-Quintana, J.

    2016-06-01

    The measurement of local temperature in nanoscale volumes is becoming a technological frontier. Photoluminescent nanoparticles and nanocolloids are the natural choice for nanoscale temperature probes. However, the influence of a surrounding liquid on the cryogenic behavior of oxidized Si-nanocrystals (Si-NCs) has never been investigated. In this work, the photoluminescence (PL) of oxidized Si-NC/alcohol-based nanocolloids is measured as a function of temperature and of the molecule length of monohydric alcohols above their melting-freezing point. The results unveil a progressive blue shift of the emission peak that depends on the temperature as well as on the dielectric properties of the surrounding liquid. This effect is analyzed in terms of thermal changes of the Si-NC bandgap, quantum confinement, and the polarization effects of the embedding medium, revealing an important role of the dielectric constant of the surrounding liquid. These results are relevant because they offer a general insight into the fundamental behavior of photoluminescent nanocolloids under a cooling process and, moreover, enable PL tuning based on the dielectric properties of the surrounding liquid. Hence, the variables required to engineer the PL of nanofluids are properly identified for use as temperature sensors at the nanoscale.

  13. CuInGaSe{sub 2} nanoparticles by pulsed laser ablation in liquid medium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendivil, M.I.; García, L.V.; Krishnan, B.

    2015-12-15

    Highlights: • CIGS nanocolloids were synthesized using the PLAL technique. • Their morphology, structure, composition and optical properties were characterized. • Morphologies were dependent on ablation wavelength and liquid medium. • Optical absorption and bandgap of these nanocolloids were tunable. - Abstract: Pulsed laser ablation in liquid medium (PLALM) is a nanofabrication technique to produce complex nanostructures. CuInGaSe{sub 2} (CIGS) is an alloy with applications in the photovoltaic industry. In this work, we studied the effects of laser ablation wavelength, energy fluence and liquid medium on the properties of CIGS nanoparticles synthesized by PLALM. The nanoparticles obtained were analyzed by transmission electron microscopy (TEM), energy dispersive X-ray spectroscopy (EDX), selected area electron diffraction (SAED), X-ray photoelectron spectroscopy (XPS) and UV–vis absorption spectroscopy. XPS results confirmed the chemical states and composition of the ablated products. TEM analysis showed different morphologies for the nanomaterials obtained in different liquid media and at different ablation wavelengths. The optical properties of these CIGS nanocolloids were analyzed using UV–vis absorption spectroscopy. The results demonstrated the use of PLALM as a useful synthesis technique for nanoparticles of quaternary photovoltaic materials.

  14. Increasing influence of heat stress on French maize yields from the 1960s to the 2030s

    PubMed Central

    Hawkins, Ed; Fricker, Thomas E; Challinor, Andrew J; Ferro, Christopher A T; Kit Ho, Chun; Osborne, Tom M

    2013-01-01

    Improved crop yield forecasts could enable more effective adaptation to climate variability and change. Here, we explore how to combine historical observations of crop yields and weather with climate model simulations to produce crop yield projections for decision relevant timescales. Firstly, the effects on historical crop yields of improved technology, precipitation and daily maximum temperatures are modelled empirically, accounting for a nonlinear technology trend and interactions between temperature and precipitation, and applied specifically for a case study of maize in France. The relative importance of precipitation variability for maize yields in France has decreased significantly since the 1960s, likely due to increased irrigation. In addition, heat stress is found to be as important for yield as precipitation since around 2000. A significant reduction in maize yield is found for each day with a maximum temperature above 32 °C, in broad agreement with previous estimates. The recent increase in such hot days has likely contributed to the observed yield stagnation. Furthermore, a general method for producing near-term crop yield projections, based on climate model simulations, is developed and utilized. We use projections of future daily maximum temperatures to assess the likely change in yields due to variations in climate. Importantly, we calibrate the climate model projections using observed data to ensure both reliable temperature mean and daily variability characteristics, and demonstrate that these methods work using retrospective predictions. We conclude that, to offset the projected increased daily maximum temperatures over France, improved technology will need to increase base level yields by 12% to be confident about maintaining current levels of yield for the period 2016–2035; the current rate of yield technology increase is not sufficient to meet this target. PMID:23504849
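    The empirical model described above, a nonlinear technology trend plus weather covariates including days above 32 °C, can be sketched on synthetic data. All numbers here are hypothetical, and the paper's actual model also includes a temperature-precipitation interaction:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(51)                              # years 1960..2010
tech = 3 + 0.08 * t - 0.0004 * t**2            # nonlinear technology trend (t/ha)
precip = rng.normal(200, 40, t.size)           # growing-season precipitation (mm)
hot_days = rng.poisson(3 + 0.05 * t)           # days with Tmax > 32 C, rising trend

# Synthetic yields: in this toy world each hot day costs 0.05 t/ha
yields = (tech + 0.004 * (precip - 200) - 0.05 * hot_days
          + rng.normal(0, 0.1, t.size))

# Refit the same functional form by ordinary least squares
X = np.column_stack([np.ones_like(t, dtype=float), t, t**2,
                     precip - 200, hot_days])
beta, *_ = np.linalg.lstsq(X, yields, rcond=None)
print(f"estimated loss per day above 32 C: {beta[4]:.3f} t/ha")
```

    Because the hot-day count trends upward alongside technology, including the trend terms in the regression is what lets the heat-stress coefficient be separated from the technology signal, the same identification issue the paper faces with historical data.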

  15. Training less-experienced faculty improves reliability of skills assessment in cardiac surgery.

    PubMed

    Lou, Xiaoying; Lee, Richard; Feins, Richard H; Enter, Daniel; Hicks, George L; Verrier, Edward D; Fann, James I

    2014-12-01

    Previous work has demonstrated high inter-rater reliability in the objective assessment of simulated anastomoses among experienced educators. We evaluated the inter-rater reliability of less-experienced educators and the impact of focused training with a video-embedded coronary anastomosis assessment tool. Nine less-experienced cardiothoracic surgery faculty members from different institutions evaluated 2 videos of simulated coronary anastomoses (1 by a medical student and 1 by a resident) at the Thoracic Surgery Directors Association Boot Camp. They then underwent a 30-minute training session using an assessment tool with embedded videos to anchor rating scores for 10 components of coronary artery anastomosis. Afterward, they evaluated 2 videos of a different student and resident performing the task. Components were scored on a 1 to 5 Likert scale, yielding an average composite score. Inter-rater reliabilities of component and composite scores were assessed using intraclass correlation coefficients (ICCs) and overall pass/fail ratings with kappa. All components of the assessment tool exhibited improvement in reliability, with 4 (bite, needle holder use, needle angles, and hand mechanics) improving the most from poor (ICC range, 0.09-0.48) to strong (ICC range, 0.80-0.90) agreement. After training, inter-rater reliabilities for composite scores improved from moderate (ICC, 0.76) to strong (ICC, 0.90) agreement, and for overall pass/fail ratings, from poor (kappa = 0.20) to moderate (kappa = 0.78) agreement. Focused, video-based anchor training facilitates greater inter-rater reliability in the objective assessment of simulated coronary anastomoses. Among raters with less teaching experience, such training may be needed before objective evaluation of technical skills. Published by Elsevier Inc.

  16. A tonic heat test stimulus yields a larger and more reliable conditioned pain modulation effect compared to a phasic heat test stimulus

    PubMed Central

    Lie, Marie Udnesseter; Matre, Dagfinn; Hansson, Per; Stubhaug, Audun; Zwart, John-Anker; Nilsen, Kristian Bernhard

    2017-01-01

    Abstract Introduction: The interest in conditioned pain modulation (CPM) as a clinical tool for measuring endogenously induced analgesia is increasing. There is, however, large variation in the CPM methodology, hindering comparison of results across studies. Research comparing different CPM protocols is needed in order to obtain a standardized test paradigm. Objectives: The aim of the study was to assess whether a protocol with phasic heat stimuli as test-stimulus is preferable to a protocol with tonic heat stimulus as test-stimulus. Methods: In this experimental crossover study, we compared 2 CPM protocols with different test-stimulus; one with tonic test-stimulus (constant heat stimulus of 120-second duration) and one with phasic test-stimuli (3 heat stimulations of 5 seconds duration separated by 10 seconds). Conditioning stimulus was a 7°C water bath in parallel with the test-stimulus. Twenty-four healthy volunteers were assessed on 2 occasions with minimum 1 week apart. Differences in the magnitude and test–retest reliability of the CPM effect in the 2 protocols were investigated with repeated-measures analysis of variance and by relative and absolute reliability indices. Results: The protocol with tonic test-stimulus induced a significantly larger CPM effect compared to the protocol with phasic test-stimuli (P < 0.001). Fair and good relative reliability was found with the phasic and tonic test-stimuli, respectively. Absolute reliability indices showed large intraindividual variability from session to session in both protocols. Conclusion: The present study shows that a CPM protocol with a tonic test-stimulus is preferable to a protocol with phasic test-stimuli. However, we emphasize that one should be cautious to use the CPM effect as biomarker or in clinical decision making on an individual level due to large intraindividual variability. PMID:29392240

  17. Reliability training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  18. Reliability as Argument

    ERIC Educational Resources Information Center

    Parkes, Jay

    2007-01-01

    Reliability consists of both important social and scientific values and methods for evidencing those values, though in practice methods are often conflated with the values. With the two distinctly understood, a reliability argument can be made that articulates the particular reliability values most relevant to the particular measurement situation…

  19. Assimilation of Remotely Sensed Soil Moisture Profiles into a Crop Modeling Framework for Reliable Yield Estimations

    NASA Astrophysics Data System (ADS)

    Mishra, V.; Cruise, J.; Mecikalski, J. R.

    2017-12-01

    growing seasons from 2015-2017. Soil moisture profiles compared favorably to in situ data and simulated crop yields compared well with observed yields.

  20. Design for reliability: NASA reliability preferred practices for design and test

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.

    1994-01-01

    This tutorial summarizes reliability experience from both NASA and industry and reflects engineering practices that support current and future civil space programs. These practices were collected from various NASA field centers and were reviewed by a committee of senior technical representatives from the participating centers (members are listed at the end). The material for this tutorial was taken from the publication issued by the NASA Reliability and Maintainability Steering Committee (NASA Reliability Preferred Practices for Design and Test. NASA TM-4322, 1991). Reliability must be an integral part of the systems engineering process. Although both disciplines must be weighed equally with other technical and programmatic demands, the application of sound reliability principles will be the key to the effectiveness and affordability of America's space program. Our space programs have shown that reliability efforts must focus on the design characteristics that affect the frequency of failure. Herein, we emphasize that these identified design characteristics must be controlled by applying conservative engineering principles.

  1. Reliability of COPVs Accounting for Margin of Safety on Design Burst

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L.N.

    2012-01-01

    In this paper, the stress rupture reliability of Carbon/Epoxy Composite Overwrapped Pressure Vessels (COPVs) is examined utilizing the classic Phoenix model, accounting for the differences between the design and the actual burst pressure and for liner contribution effects. Stress rupture life primarily depends upon the fiber stress ratio, which is defined as the ratio of the stress in the fibers at the maximum expected operating pressure to the actual delivered fiber strength. The actual delivered fiber strength is calculated using the actual burst pressures of vessels established through burst tests. However, during the design phase the actual burst pressure is generally not known, so to estimate the reliability of the vessels, calculations are usually performed based upon the design burst pressure only. Since the design burst is lower than the actual burst, this process yields a much higher value for the stress ratio and consequently a conservative estimate of the reliability. Other complications arise from the fact that the actual burst pressure and the liner contributions have inherent variability and therefore must be treated as random variables in order to compute the stress rupture reliability. Furthermore, the model parameters, which have to be established from stress rupture tests of subscale vessels or coupons, have significant variability as well due to the limited available data and hence must be properly accounted for. In this work an assessment of the reliability of COPVs including both parameter uncertainties and the physical variability inherent in liner and overwrap material behavior is made, and estimates are provided in terms of the degree of uncertainty in the actual burst pressure and the liner load sharing.
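    The conservatism described above is easy to see numerically: for a fiber-dominated overwrap, the stress ratio scales roughly as operating pressure over burst pressure (a simplification; the paper additionally accounts for the liner's load share). All pressures below are hypothetical:

```python
def stress_ratio(meop, burst):
    """Fiber stress ratio ~ operating pressure / burst pressure, assuming
    fiber stress scales linearly with pressure (simplifying assumption)."""
    return meop / burst

MEOP = 4500.0           # max expected operating pressure, psi (hypothetical)
design_burst = 6750.0   # design burst = 1.5 x MEOP
actual_burst = 7400.0   # delivered burst established by vessel tests

sr_design = stress_ratio(MEOP, design_burst)   # 0.667
sr_actual = stress_ratio(MEOP, actual_burst)   # ~0.608
# The design-burst-based ratio is higher, so a stress-rupture reliability
# computed from it understates the true reliability (it is conservative).
print(round(sr_design, 3), round(sr_actual, 3))
```

    Because stress rupture life is a steep function of the stress ratio, even this few-percent difference in the ratio translates into a large difference in predicted reliability, which is why treating the actual burst pressure as a random variable matters.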

  2. A hierarchical spatial model for well yield in complex aquifers

    NASA Astrophysics Data System (ADS)

    Montgomery, J.; O'sullivan, F.

    2017-12-01

    Efficiently siting and managing groundwater wells requires reliable estimates of the amount of water that can be produced, or the well yield. This can be challenging to predict in highly complex, heterogeneous fractured aquifers due to the uncertainty around local hydraulic properties. Promising statistical approaches have been advanced in recent years. For instance, kriging and multivariate regression analysis have been applied to well test data with limited but encouraging levels of prediction accuracy. Additionally, some analytical solutions to diffusion in homogeneous porous media have been used to infer "effective" properties consistent with observed flow rates or drawdown. However, this is an under-specified inverse problem with substantial and irreducible uncertainty. We describe a flexible machine learning approach capable of combining diverse datasets with constraining physical and geostatistical models for improved well yield prediction accuracy and uncertainty quantification. Our approach can be implemented within a hierarchical Bayesian framework using Markov Chain Monte Carlo, which allows for additional sources of information to be incorporated in priors to further constrain and improve predictions and reduce the model order. We demonstrate the usefulness of this approach using data from over 7,000 wells in a fractured bedrock aquifer.
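    As an illustration of the hierarchical Bayesian idea, here is a minimal random-walk Metropolis sampler for the mean log-yield under a regional prior. The data are synthetic and the observation spread is treated as known; a real model would add spatial structure and further parameters:

```python
import numpy as np

rng = np.random.default_rng(3)
log_yield = rng.normal(1.0, 0.6, 50)   # hypothetical log well yields
SIGMA = 0.6                            # assumed known observation spread

def log_post(mu):
    """Unnormalized log-posterior: N(0.8, 1) regional prior on the mean
    log-yield times a normal likelihood with fixed spread SIGMA."""
    prior = -0.5 * (mu - 0.8) ** 2
    like = -0.5 * ((log_yield - mu) ** 2).sum() / SIGMA**2
    return prior + like

# Random-walk Metropolis over the single parameter mu
mu, chain = 0.0, []
for _ in range(5000):
    prop = mu + rng.normal(0, 0.1)
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop                      # accept the proposal
    chain.append(mu)
post = np.array(chain[1000:])          # discard burn-in
print(f"posterior mean log-yield: {post.mean():.2f} +/- {post.std():.2f}")
```

    The prior is where additional sources of information (regional geology, nearby well tests) enter to constrain predictions, exactly the mechanism the abstract describes.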

  3. The reliability of a quality appraisal tool for studies of diagnostic reliability (QAREL).

    PubMed

    Lucas, Nicholas; Macaskill, Petra; Irwig, Les; Moran, Robert; Rickards, Luke; Turner, Robin; Bogduk, Nikolai

    2013-09-09

    The aim of this project was to investigate the reliability of a new 11-item quality appraisal tool for studies of diagnostic reliability (QAREL). The tool was tested on studies reporting the reliability of any physical examination procedure. The reliability of physical examination is a challenging area to study given the complex testing procedures, the range of tests, and lack of procedural standardisation. Three reviewers used QAREL to independently rate 29 articles, comprising 30 studies, published during 2007. The articles were identified from a search of relevant databases using the following string: "Reproducibility of results (MeSH) OR reliability (t.w.) AND Physical examination (MeSH) OR physical examination (t.w.)." A total of 415 articles were retrieved and screened for inclusion. The reviewers undertook an independent trial assessment prior to data collection, followed by a general discussion about how to score each item. At no time did the reviewers discuss individual papers. Reliability was assessed for each item using multi-rater kappa (κ). Multi-rater reliability estimates ranged from κ = 0.27 to 0.92 across all items. Six items were recorded with good reliability (κ > 0.60), three with moderate reliability (κ = 0.41 - 0.60), and two with fair reliability (κ = 0.21 - 0.40). Raters found it difficult to agree about the spectrum of patients included in a study (Item 1) and the correct application and interpretation of the test (Item 10). In this study, we found that QAREL was a reliable assessment tool for studies of diagnostic reliability when raters agreed upon criteria for the interpretation of each item. Nine out of 11 items had good or moderate reliability, and two items achieved fair reliability. The heterogeneity in the tests included in this study may have resulted in an underestimation of the reliability of these two items. We discuss these and other factors that could affect our results and make recommendations for the use of QAREL.
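    The per-item agreement above uses a multi-rater kappa; Fleiss' kappa is the usual choice for three or more raters (an assumption here, since the abstract says only "multi-rater kappa"). A compact sketch on hypothetical ratings:

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' multi-rater kappa.  counts: items x categories matrix giving
    how many raters assigned each item to each category (row sums = n raters)."""
    counts = np.asarray(counts, float)
    n = counts.sum(axis=1)[0]                  # raters per item (constant)
    p_cat = counts.sum(axis=0) / counts.sum()  # category marginals
    p_item = ((counts ** 2).sum(axis=1) - n) / (n * (n - 1))
    p_bar, p_e = p_item.mean(), (p_cat ** 2).sum()
    return (p_bar - p_e) / (1 - p_e)

# 3 raters scoring 5 studies yes/no on one QAREL item (hypothetical ratings)
ratings = [[3, 0], [2, 1], [3, 0], [0, 3], [3, 0]]
print(round(fleiss_kappa(ratings), 2))  # -> 0.66, "good" on the scale above
```

    Note how the chance-agreement term p_e depends on the category marginals: when nearly all ratings fall in one category, chance agreement is high and kappa drops even for identical ratings, one reason item-level kappas vary so widely in studies like this one.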

  4. Validity and Reliability of Accelerometers in Patients With COPD: A SYSTEMATIC REVIEW.

    PubMed

    Gore, Shweta; Blackwood, Jennifer; Guyette, Mary; Alsalaheen, Bara

    2018-05-01

    Reduced physical activity is associated with poor prognosis in chronic obstructive pulmonary disease (COPD). Accelerometers have greatly improved quantification of physical activity by providing information on step counts, body positions, energy expenditure, and magnitude of force. The purpose of this systematic review was to compare the validity and reliability of accelerometers used in patients with COPD. An electronic database search of MEDLINE and CINAHL was performed. Study quality was assessed with the Strengthening the Reporting of Observational Studies in Epidemiology checklist, while methodological quality was assessed using the modified Quality Appraisal Tool for Reliability Studies. The search yielded 5392 studies; 25 met inclusion criteria. The SenseWear Pro armband showed high criterion validity under controlled conditions (r = 0.75-0.93) and high reliability (ICC = 0.84-0.86) for step counts. The DynaPort MiniMod demonstrated the highest concurrent validity for step count using both video and manual methods. Validity of the SenseWear Pro armband varied between studies, especially in free-living conditions, at slower walking speeds, and with the addition of weights during gait. A high degree of variability was found in the outcomes used and the statistical analyses performed between studies, indicating a need for further studies to measure the reliability and validity of accelerometers in COPD. The SenseWear Pro armband is the most commonly used accelerometer in COPD, but its measurement properties are limited by gait speed variability and assistive device use. The DynaPort MiniMod and Stepwatch accelerometers demonstrated high validity in patients with COPD but lack reliability data.
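    Note: no code example is warranted for this record; see below.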

  5. Towards cost-effective reliability through visualization of the reliability option space

    NASA Technical Reports Server (NTRS)

    Feather, Martin S.

    2004-01-01

    In planning a complex system's development there can be many options to improve its reliability. Typically their sum total cost exceeds the budget available, so it is necessary to select judiciously from among them. Reliability models can be employed to calculate the cost and reliability implications of a candidate selection.

  6. 76 FR 73608 - Reliability Technical Conference, North American Electric Reliability Corporation, Public Service...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-29

    ... or municipal authority play in forming your bulk power system reliability plans? b. Do you support..., North American Electric Reliability Corporation (NERC) Nick Akins, CEO of American Electric Power (AEP..., EL11-62-000] Reliability Technical Conference, North American Electric Reliability Corporation, Public...

  7. Soviet test yields

    NASA Astrophysics Data System (ADS)

    Vergino, Eileen S.

    Soviet seismologists have published descriptions of 96 nuclear explosions conducted from 1961 through 1972 at the Semipalatinsk test site in Kazakhstan, central Asia [Bocharov et al., 1989]. With the exception of releasing news about some of their peaceful nuclear explosions (PNEs), the Soviets have never before published such a body of information. To estimate the seismic yield of a nuclear explosion, it is necessary to obtain a calibrated magnitude-yield relationship based on events with known yields and a consistent set of seismic magnitudes. U.S. estimation of Soviet test yields has been done by applying relationships derived from U.S. experience at the Nevada Test Site (NTS) to the Soviet sites, with corrections for differences in attenuation and near-source coupling of seismic waves.
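    A magnitude-yield relationship of the generic form m_b = a + b·log10(Y) can be inverted to estimate yield from a measured body-wave magnitude. The sketch below uses illustrative constants and an illustrative site-bias term; the actual calibration values are not given in the record:

```python
# Generic magnitude-yield relation m_b = a + b*log10(Y), inverted for yield Y (kt).
# Constants a, b and the site bias are illustrative placeholders only.
def yield_from_mb(mb, a=4.45, b=0.75, site_bias=0.0):
    # site_bias (in magnitude units) corrects for attenuation and near-source
    # coupling differences between the calibration site (e.g. NTS) and the
    # site being studied
    return 10 ** ((mb - a - site_bias) / b)

print(round(yield_from_mb(6.0), 1))                   # no bias correction
print(round(yield_from_mb(6.0, site_bias=0.35), 1))   # with an assumed bias
```

    The example makes the qualitative point in the abstract concrete: a few tenths of a magnitude unit of site bias changes the inferred yield by a factor of two or more.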

  8. Assuring reliability program effectiveness.

    NASA Technical Reports Server (NTRS)

    Ball, L. W.

    1973-01-01

    An attempt is made to provide simple identification and description of techniques that have proved to be most useful either in developing a new product or in improving reliability of an established product. The first reliability task is obtaining and organizing parts failure rate data. Other tasks are parts screening, tabulation of general failure rates, preventive maintenance, prediction of new product reliability, and statistical demonstration of achieved reliability. Five principal tasks for improving reliability involve the physics of failure research, derating of internal stresses, control of external stresses, functional redundancy, and failure effects control. A final task is the training and motivation of reliability specialist engineers.

  9. Impact of heterozygosity and heterogeneity on cotton lint yield stability: II. Lint yield components

    USDA-ARS?s Scientific Manuscript database

    In order to determine which yield components may contribute to yield stability, an 18-environment field study was undertaken to observe the mean, standard deviation (SD), and coefficient of variation (CV) for cotton lint yield components in population types that differed for lint yield stability. Th...

  10. Reliability Analysis and Reliability-Based Design Optimization of Circular Composite Cylinders Under Axial Compression

    NASA Technical Reports Server (NTRS)

    Rais-Rohani, Masoud

    2001-01-01

    This report describes the preliminary results of an investigation on component reliability analysis and reliability-based design optimization of thin-walled circular composite cylinders with an average diameter and average length of 15 inches. Structural reliability is based on the axial buckling strength of the cylinder. Both Monte Carlo simulation and the First Order Reliability Method are considered for reliability analysis, with the latter incorporated into the reliability-based structural optimization problem. To improve the efficiency of reliability sensitivity analysis and the design optimization solution, the buckling strength of the cylinder is estimated using a second-order response surface model. The sensitivity of the reliability index with respect to the mean and standard deviation of each random variable is calculated and compared. The reliability index is found to be extremely sensitive to the applied load and the elastic modulus of the material in the fiber direction. The cylinder diameter was found to have the third highest impact on the reliability index. Also, the uncertainty in the applied load, captured by examining different values for its coefficient of variation, is found to have a large influence on cylinder reliability. The optimization problem for minimum weight is solved subject to a design constraint on element reliability index. The methodology, solution procedure, and optimization results are included in this report.
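    The two methods named above can be illustrated with a small Monte Carlo sketch: sample the applied load and the buckling strength, count limit-state violations (load ≥ strength), and map the failure probability to a reliability index via β = -Φ⁻¹(p_f). The distributions below are invented for illustration, not the report's values:

```python
import random
import statistics

# Monte Carlo estimate of failure probability for the limit state
# g = strength - load, with illustrative normal distributions.
random.seed(1)

def simulate(n=200_000):
    failures = 0
    for _ in range(n):
        strength = random.gauss(100.0, 8.0)   # buckling strength (arbitrary units)
        load = random.gauss(70.0, 10.0)       # applied axial load
        if load >= strength:
            failures += 1
    return failures / n

pf = simulate()
# First-order mapping from failure probability to reliability index:
# beta = -Phi^{-1}(pf); statistics.NormalDist provides the inverse normal CDF.
beta = -statistics.NormalDist().inv_cdf(pf)
print(f"pf={pf:.4f} beta={beta:.2f}")
```

    For these two normals the limit state is itself normal, so the analytic answer (β = 30/√164 ≈ 2.34) gives a check on the simulation; FORM generalizes the β computation to non-normal, nonlinear limit states.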

  11. Effects of fission yield data in the calculation of antineutrino spectra for U 235 ( n , fission ) at thermal and fast neutron energies

    DOE PAGES

    Sonzogni, A. A.; McCutchan, E. A.; Johnson, T. D.; ...

    2016-04-01

    Fission yields form an integral part of the prediction of antineutrino spectra generated by nuclear reactors, but little attention has been paid to the quality and reliability of the data used in current calculations. Following a critical review of the thermal and fast ENDF/B-VII.1 235U fission yields, deficiencies are identified and improved yields are obtained, based on corrections of erroneous yields, consistency between decay and fission yield data, and updated isomeric ratios. These corrected yields are used to calculate antineutrino spectra using the summation method. An anomalous value for the thermal fission yield of 86Ge generates an excess of antineutrinos at 5–7 MeV, a feature which is no longer present when the corrected yields are used. Thermal spectra calculated with two distinct fission yield libraries (corrected ENDF/B and JEFF) differ by up to 6% in the 0–7 MeV energy window, allowing for a basic estimate of the uncertainty involved in the fission yield component of summation calculations. Lastly, the fast neutron antineutrino spectrum is calculated, which at the moment can only be obtained with the summation method and may be relevant for short baseline reactor experiments using highly enriched uranium fuel.
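    The summation method referred to above builds the total spectrum as a fission-yield-weighted sum over contributing nuclides, S(E) = Σ_i Y_i s_i(E). A toy sketch with made-up yields and spectral shapes (not evaluated ENDF/JEFF data) shows how an erroneous yield for a single nuclide propagates into the summed spectrum:

```python
# Summation-method sketch: total antineutrino spectrum as a
# fission-yield-weighted sum of per-nuclide spectra.
energies = [1.0, 3.0, 5.0, 7.0]  # MeV grid

# cumulative fission yields (hypothetical placeholders)
yields = {"Y96": 0.048, "Rb92": 0.034, "Cs142": 0.029}

# per-nuclide antineutrino spectra on the same grid (hypothetical shapes)
spectra = {
    "Y96":   [0.50, 0.30, 0.15, 0.05],
    "Rb92":  [0.40, 0.30, 0.20, 0.10],
    "Cs142": [0.60, 0.30, 0.10, 0.00],
}

total = [sum(yields[n] * spectra[n][i] for n in yields)
         for i in range(len(energies))]
print([round(x, 4) for x in total])
```

    Scaling any one yield up or down rescales that nuclide's contribution linearly at every energy, which is why a single anomalous yield can produce a localized bump such as the 5-7 MeV excess described in the abstract.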

  12. Getting to Zero Yield: The Evolution of the U.S. Position on the CTBT

    NASA Astrophysics Data System (ADS)

    Zimmerman, Peter D.

    1998-03-01

    In 1994 the United States favored a Comprehensive Test Ban Treaty (CTBT) which permitted tiny "hydronuclear" experiments with a nuclear energy release of four pounds or less. Other nuclear powers supported yield limits as high as large fractions of a kiloton, while most non-nuclear nations participating in the discussions at the United Nations Conference on Disarmament wanted to prohibit all nuclear explosions -- some even favoring an end to computer simulations. On the other hand, China wished an exception to permit high yield "peaceful" nuclear explosions. For the United States to adopt a new position favoring a "true zero" several pieces had to fall into place: 1) The President had to be assured that the U.S. could preserve the safety and reliability of the enduring stockpile without yield testing; 2) the U.S. needed to be sure that the marginal utility of zero-yield experiments was at least as great for this country as for any other; 3) that tests with any nuclear yield might have more marginal utility for nuclear proliferators than for the United States, thus marginally eroding this country's position; 4) the United States required a treaty which would permit maintenance of the capacity to return to testing should a national emergency requiring a nuclear test arise; and 5) all of the five nuclear weapons states had to realize that only a true-zero CTBT would have the desired political effects. This paper will outline the physics near zero yield and show why President Clinton was persuaded by arguments from many viewpoints to endorse a true test ban in August, 1996 and to sign the CTBT in September, 1997.

  13. Software reliability studies

    NASA Technical Reports Server (NTRS)

    Wilson, Larry W.

    1989-01-01

    The long-term goal of this research is to identify or create a model for use in analyzing the reliability of flight control software. The immediate tasks addressed are the creation of data useful to the study of software reliability and the production of results pertinent to software reliability through the analysis of existing reliability models and data. The completed data-creation portion of this research consists of a Generic Checkout System (GCS) design document created in cooperation with NASA and Research Triangle Institute (RTI) experimenters. This will lead to design and code reviews, with the resulting product being one of the versions used in the Terminal Descent Experiment being conducted by the Systems Validation Methods Branch (SVMB) of NASA/Langley. An appended paper details an investigation of the Jelinski-Moranda and Geometric models for software reliability. The models were given data from a process that they had correctly simulated and asked to make predictions about the reliability of that process. It was found that either model will usually fail to make good predictions. These problems were attributed to randomness in the data, and replication of data was recommended.

  14. Comparative Reliability of Structured Versus Unstructured Interviews in the Admission Process of a Residency Program

    PubMed Central

    Blouin, Danielle; Day, Andrew G.; Pavlov, Andrey

    2011-01-01

    Background Although never directly compared, structured interviews are reported as being more reliable than unstructured interviews. This study compared the reliability of both types of interview when applied to a common pool of applicants for positions in an emergency medicine residency program. Methods In 2008, one structured interview was added to the two unstructured interviews traditionally used in our resident selection process. A formal job analysis using the critical incident technique guided the development of the structured interview tool. This tool consisted of 7 scenarios assessing 4 of the domains deemed essential for success as a resident in this program. The traditional interview tool assessed 5 general criteria. In addition to these criteria, the unstructured panel members were asked to rate each candidate on the same 4 essential domains rated by the structured panel members. All 3 panels interviewed all candidates. Main outcomes were the overall, interitem, and interrater reliabilities, the correlations between interview panels, and the dimensionality of each interview tool. Results Thirty candidates were interviewed. The overall reliability reached 0.43 for the structured interview, and 0.81 and 0.71 for the unstructured interviews. Analyses of the variance components showed a high interrater, low interitem reliability for the structured interview, and a high interrater, high interitem reliability for the unstructured interviews. The summary measures from the 2 unstructured interviews were significantly correlated, but neither was correlated with the structured interview. Only the structured interview was multidimensional. Conclusions A structured interview did not yield a higher overall reliability than both unstructured interviews. The lower reliability is explained by a lower interitem reliability, which in turn is due to the multidimensionality of the interview tool. Both unstructured panels consistently rated a single dimension, even when prompted to assess the 4 specific domains.

  15. Comparative reliability of structured versus unstructured interviews in the admission process of a residency program.

    PubMed

    Blouin, Danielle; Day, Andrew G; Pavlov, Andrey

    2011-12-01

    Although never directly compared, structured interviews are reported as being more reliable than unstructured interviews. This study compared the reliability of both types of interview when applied to a common pool of applicants for positions in an emergency medicine residency program. In 2008, one structured interview was added to the two unstructured interviews traditionally used in our resident selection process. A formal job analysis using the critical incident technique guided the development of the structured interview tool. This tool consisted of 7 scenarios assessing 4 of the domains deemed essential for success as a resident in this program. The traditional interview tool assessed 5 general criteria. In addition to these criteria, the unstructured panel members were asked to rate each candidate on the same 4 essential domains rated by the structured panel members. All 3 panels interviewed all candidates. Main outcomes were the overall, interitem, and interrater reliabilities, the correlations between interview panels, and the dimensionality of each interview tool. Thirty candidates were interviewed. The overall reliability reached 0.43 for the structured interview, and 0.81 and 0.71 for the unstructured interviews. Analyses of the variance components showed a high interrater, low interitem reliability for the structured interview, and a high interrater, high interitem reliability for the unstructured interviews. The summary measures from the 2 unstructured interviews were significantly correlated, but neither was correlated with the structured interview. Only the structured interview was multidimensional. A structured interview did not yield a higher overall reliability than both unstructured interviews. The lower reliability is explained by a lower interitem reliability, which in turn is due to the multidimensionality of the interview tool. Both unstructured panels consistently rated a single dimension, even when prompted to assess the 4 specific domains.

  16. The reliability evaluation of reclaimed water reused in power plant project

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Jia, Ru-sheng; Gao, Yu-lan; Wang, Wan-fen; Cao, Peng-qiang

    2017-12-01

    The reuse of reclaimed water has become an important measure for alleviating water shortages in many cities, but there is no unified way to evaluate such projects. Taking the Wanneng power plant project in Huai city as an example, this study analyzed the reliability of wastewater reuse in terms of reclaimed water quality, the water quality of the sewage plant, the city's current sewage quantity, and the forecast reclaimed water yield; in particular, a correction to the actual operating flow rate of the sewage plant was necessary. The results showed that, despite fluctuations in inlet water quality, the outlet water quality of the sewage treatment plant is basically stable and can meet the requirements for circulating cooling water, but suspended solids (SS) and total hardness in boiler water exceed the limits, so advanced treatment should be carried out. In addition, total sewage discharge will reach 13.91×10⁴ m³/d and 14.21×10⁴ m³/d, respectively, in the two planning-level years of the project. Both exceed the normal collection capacity of the sewage system (12.0×10⁴ m³/d), and the reclaimed water yield can reach 10.74×10⁴ m³/d, which is greater than the 8.25×10⁴ m³/d actually needed by the power plant, so wastewater reuse from this sewage plant is feasible and reliable for the power plant from an engineering standpoint.

  17. Breast cancer sentinel node scintigraphy: differences between imaging results 1 and 2 h after injection.

    PubMed

    Wondergem, Maurits; Hobbelink, Monique G G; Witkamp, Arjen J; van Hillegersberg, Richard; de Keizer, Bart

    2012-11-01

    Timing of image acquisition in breast cancer sentinel node scintigraphy remains a subject of debate. Therefore, the performance of our protocol, in which images are acquired 1 and 2 h after injection, was evaluated. The results of sentinel node scintigraphy 1 and 2 h after injection were compared with regard to the sentinel lymph nodes visualized. We studied 132 patients who were consecutively referred for sentinel lymph node biopsy. 99mTc-albumin nanocolloid (120 MBq) was injected peritumourally into patients with palpable tumours and intratumourally into patients with nonpalpable tumours. All scintigraphic images taken for the sentinel node procedure were evaluated. The number of sentinel nodes per anatomic localization and the interpretability of the images were scored. A total of 132 patients underwent sentinel node scintigraphy 1 h after injection. Of these, 117 patients also underwent sentinel node scintigraphy 2 h after injection. An axillary sentinel node was visualized in 79.5 and 95.7% of patients, respectively, 1 and 2 h after injection. In 20.5% of the patients the images acquired 1 h after injection did not show a sentinel node. Furthermore, in all procedures, the images 1 h after injection were of no added value over those acquired 2 h after injection. Scintigraphic imaging 2 h after a single peritumoural or intratumoural administration of about 120 MBq 99mTc-albumin nanocolloid yields an axillary sentinel node in over 95% of cases. Imaging 1 h after injection is of no additional value and can be omitted.

  18. Using normalized difference vegetation index (NDVI) to estimate sugarcane yield and yield components

    USDA-ARS?s Scientific Manuscript database

    Sugarcane (Saccharum spp.) yield and yield components are important traits for growers and scientists to evaluate and select cultivars. Collection of these yield data would be labor intensive and time consuming in the early selection stages of sugarcane breeding cultivar development programs with a ...

  19. Comparison of heavy metal loads in stormwater runoff from major and minor urban roads using pollutant yield rating curves.

    PubMed

    Davis, Brett; Birch, Gavin

    2010-08-01

    Trace metal export by stormwater runoff from a major road and local street in urban Sydney, Australia, is compared using pollutant yield rating curves derived from intensive sampling data. The event loads of copper, lead and zinc are well approximated by logarithmic relationships with respect to total event discharge owing to the reliable appearance of a first flush in pollutant mass loading from urban roads. Comparisons of the yield rating curves for these three metals show that copper and zinc export rates from the local street are comparable with that of the major road, while lead export from the local street is much higher, despite a 45-fold difference in traffic volume. The yield rating curve approach allows problematic environmental data to be presented in a simple yet meaningful manner with less information loss. Copyright 2010 Elsevier Ltd. All rights reserved.
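    A logarithmic yield rating curve of the form load = a + b·ln(discharge) can be fitted by ordinary least squares on the log-transformed discharge. A sketch with invented event data (not the Sydney measurements):

```python
import math

# Fit a logarithmic pollutant yield rating curve, load = a + b*ln(discharge),
# by ordinary least squares. Event data below are invented for illustration.
discharge = [2.0, 5.0, 12.0, 30.0, 80.0]   # total event discharge (m^3)
load = [1.2, 2.1, 2.9, 3.8, 4.8]           # event metal load (g)

x = [math.log(v) for v in discharge]
n = len(x)
xbar = sum(x) / n
ybar = sum(load) / n
# slope and intercept from the usual least-squares formulas
b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, load)) / \
    sum((xi - xbar) ** 2 for xi in x)
a = ybar - b * xbar
print(round(a, 3), round(b, 3))
```

    Fitted curves from two sites can then be compared directly across the range of event discharges, which is the comparison the abstract makes between the major road and the local street.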

  20. Hawaii Electric System Reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loose, Verne William; Silva Monroy, Cesar Augusto

    2012-08-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability, but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of the performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  1. Pocket Handbook on Reliability

    DTIC Science & Technology

    1975-09-01

    exponential distributions, Weibull distribution, estimating reliability, confidence intervals, reliability growth, O.C. curves, Bayesian analysis. ... An introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. LEWIS NERI, CHIEF ... includes one or both of the following objectives: a) prediction of the current system reliability, b) projection of the system reliability for some future

  2. Process yield improvements with process control terminal for varian serial ion implanters

    NASA Astrophysics Data System (ADS)

    Higashi, Harry; Soni, Ameeta; Martinez, Larry; Week, Ken

    Implant processes in a modern wafer production fab are extremely complex. There can be several types of misprocessing, i.e. wrong dose or species, double implants, and missed implants. Process Control Terminals (PCTs) for Varian 350Ds installed at Intel fabs were found to substantially reduce the number of misprocessing steps. This paper describes those misprocessing steps and their subsequent reduction with the use of PCTs. Reliable and simple process control for serial-process ion implanters has been in increasing demand. A well-designed process control terminal greatly increases device yield by monitoring all pertinent implanter functions and enabling process engineering personnel to set up process recipes for simple and accurate system operation. By programming user-selectable interlocks, implant errors are reduced, and those that occur are logged for further analysis and prevention. A process control terminal should also be compatible with office personal computers for greater flexibility in system use and data analysis. The net impact of a process control terminal is increased productivity and, consequently, higher device yield.

  3. Model uncertainty and multimodel inference in reliability estimation within a longitudinal framework.

    PubMed

    Alonso, Ariel; Laenen, Annouschka

    2013-05-01

    Laenen, Alonso, and Molenberghs (2007) and Laenen, Alonso, Molenberghs, and Vangeneugden (2009) proposed a method to assess the reliability of rating scales in a longitudinal context. The methodology is based on hierarchical linear models, and reliability coefficients are derived from the corresponding covariance matrices. However, finding a good parsimonious model to describe complex longitudinal data is a challenging task. Frequently, several models fit the data equally well, raising the problem of model selection uncertainty. When model uncertainty is high one may resort to model averaging, where inferences are based not on one but on an entire set of models. We explored the use of different model building strategies, including model averaging, in reliability estimation. We found that the approach introduced by Laenen et al. (2007, 2009) combined with some of these strategies may yield meaningful results in the presence of high model selection uncertainty and when all models are misspecified, in so far as some of them manage to capture the most salient features of the data. Nonetheless, when all models omit prominent regularities in the data, misleading results may be obtained. The main ideas are further illustrated on a case study in which the reliability of the Hamilton Anxiety Rating Scale is estimated. Importantly, the ambit of model selection uncertainty and model averaging transcends the specific setting studied in the paper and may be of interest in other areas of psychometrics. © 2012 The British Psychological Society.
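    Model averaging of the kind described above weights each candidate model's estimate rather than committing to a single "best" model; one common scheme uses Akaike weights, w_i ∝ exp(-Δ_i/2) where Δ_i is each model's AIC distance from the minimum. A sketch with hypothetical AIC values and per-model reliability estimates:

```python
import math

# Model averaging with Akaike weights: combine reliability estimates from
# several plausible covariance models. AIC values and per-model reliability
# estimates below are hypothetical.
models = [
    {"aic": 1012.4, "reliability": 0.78},
    {"aic": 1013.1, "reliability": 0.71},
    {"aic": 1015.9, "reliability": 0.83},
]

aic_min = min(m["aic"] for m in models)
raw = [math.exp(-0.5 * (m["aic"] - aic_min)) for m in models]
total = sum(raw)
weights = [r / total for r in raw]            # Akaike weights, sum to 1

# model-averaged reliability estimate
r_avg = sum(w * m["reliability"] for w, m in zip(weights, models))
print(round(r_avg, 3))
```

    When several models have similar AIC, as here, no single model dominates the weights, which is exactly the high-model-uncertainty situation in which the abstract recommends averaging.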

  4. Linkage design effect on the reliability of surface-micromachined microengines driving a load

    NASA Astrophysics Data System (ADS)

    Tanner, Danelle M.; Peterson, Kenneth A.; Irwin, Lloyd W.; Tangyunyong, Paiboon; Miller, William M.; Eaton, William P.; Smith, Norman F.; Rodgers, M. Steven

    1998-09-01

    The reliability of microengines is a function of the design of the mechanical linkage used to connect the electrostatic actuator to the drive. We have completed a series of reliability stress tests on surface-micromachined microengines driving an inertial load. In these experiments, we used microengines that had pin mechanisms with guides connecting the drive arms to the electrostatic actuators. Comparing these data to previous results using flexure linkages revealed that the pin linkage design was less reliable. The devices were stressed to failure at eight frequencies, both above and below the measured resonance frequency of the microengine. Significant amounts of wear debris were observed both around the hub and pin joint of the drive gear. Additionally, wear tracks were observed in the area where the moving shuttle rubbed against the guides of the pin linkage. At each frequency, we analyzed the statistical data, yielding a lifetime t50 (median cycles to failure) and σ, the shape parameter of the distribution. A model was developed to describe the failure data based on fundamental wear mechanisms and forces exhibited in mechanical resonant systems. The comparison to the model will be discussed.
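    A median-cycles t50 and shape parameter σ summary corresponds to a lognormal description of the cycles-to-failure data: t50 is the exponential of the mean log-lifetime and σ is the standard deviation of the log-lifetimes. A sketch with invented cycle counts (not the experiment's data):

```python
import math
import statistics

# Lognormal summary of cycles-to-failure data: t50 (median cycles) and
# sigma (shape parameter). The failure data are invented for illustration.
cycles = [2.1e5, 3.4e5, 4.0e5, 5.5e5, 7.2e5, 1.1e6]

logs = [math.log(c) for c in cycles]
t50 = math.exp(statistics.mean(logs))       # median cycles to failure
sigma = statistics.stdev(logs)              # lognormal shape parameter
print(f"t50={t50:.3g} sigma={sigma:.2f}")
```

    Repeating this summary at each drive frequency gives the (t50, σ) pairs that the wear model is then fit against.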

  5. Hawaii electric system reliability.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability, but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of the performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  6. Attachment of micro- and nano-particles on tipless cantilevers for colloidal probe microscopy.

    PubMed

    D'Sa, Dexter J; Chan, Hak-Kim; Chrzanowski, Wojciech

    2014-07-15

    Current colloidal probe preparation techniques face several challenges in the production of functional probes using particles ⩽5 μm. Challenges include: glue-encapsulated particles, glue-altered particle properties, improper particle or agglomerate attachment, and lengthy procedures. We present a method to rapidly and reproducibly produce functional micro- and nano-colloidal probes. In a six-step procedure, cantilevers mounted on a custom-designed 45° holder were used to approach and pick up a minimal amount of epoxy resin (viscosity of ∼14,000 cP), followed by a single micron/nano particle, on the apex of a tipless cantilever. The epoxy and particles were prepared on individual glass slides and subsequently affixed to a 10× or 40× optical microscope lens using another custom-designed holder. Scanning electron microscopy and comparative glue-colloidal probe measurements were used to confirm colloidal probe functionality. The method presented allowed rapid and reproducible production of functional colloidal probes (80% success). Single nano-particles were prominently affixed to the apex of the cantilever, unaffected by the epoxy. Nano-colloidal probes were used to conduct topographical, instantaneous force, and adhesive force mapping measurements in dry and liquid media, conveying their versatility and functionality in studying nano-colloidal systems. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Gold nanostar synthesis with a silver seed mediated growth method.

    PubMed

    Kereselidze, Zurab; Romero, Victor H; Peralta, Xomalin G; Santamaria, Fidel

    2012-01-15

    The physical, chemical and optical properties of nano-scale colloids depend on their material composition, size and shape. There is great interest in using nano-colloids for photo-thermal ablation, drug delivery and many other biomedical applications. Gold is particularly used because of its low toxicity. A property of metal nano-colloids is that they can have a strong surface plasmon resonance. The peak of the surface plasmon resonance mode depends on the structure and composition of the metal nano-colloids. Since the surface plasmon resonance mode is stimulated with light, there is a need to have the peak absorbance in the near infrared, where biological tissue transmissivity is maximal. We present a method to synthesize star-shaped colloidal gold, also known as star-shaped nanoparticles or nanostars. This method is based on a solution containing silver seeds that are used as the nucleating agent for anisotropic growth of gold colloids. Scanning electron microscopy (SEM) analysis of the resulting gold colloid showed that 70% of the nanostructures were nanostars. The other 30% of the particles were amorphous clusters of decahedra and rhomboids. The absorbance peak of the nanostars was detected to be in the near infrared (840 nm). Thus, our method produces gold nanostars suitable for biomedical applications, particularly for photo-thermal ablation.

  8. Degradation spectra and ionization yields of electrons in gases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inokuti, M.; Douthat, D.A.; Rau, A.R.P.

    1975-01-01

    Progress in the microscopic theory of electron degradation in gases by Platzman, Fano, and co-workers is outlined. The theory consists of (1) the cataloging of all major inelastic-collision cross sections for electrons (including the secondary-electron energy distribution in a single ionizing collision) and (2) the evaluation of the cumulative consequences of individual electron collisions for the electrons themselves as well as for the target molecules. For assessing data consistency and reliability and extrapolating the data to unexplored ranges of variables (such as electron energy), a series of plots devised by Platzman is very powerful. Electron degradation spectra were obtained through numerical solution of the Spencer-Fano equation for all electron energies down to the first ionization thresholds for a few examples such as He and Ne. The systematics of the solutions resulted in the recognition of approximate scaling properties of the degradation spectra for different initial electron energies and pointed to new methods of more efficient treatment. Systematics of the ionization yields and their dependence on the initial electron energy were also recognized. Finally, the Spencer-Fano equation for the degradation spectra and the Fowler equation for the ionization and other yields are tightly linked with each other by a set of variational principles. (52 references, 7 figures) (DLC)

  9. Reliable Design Versus Trust

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests FPGA internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  10. Validity and reliability of Persian version of Listening Styles Profile-Revised (LSP-R) in Iranian students.

    PubMed

    Fatehi, Zahra; Baradaran, Hamid Reza; Asadpour, Mohamad; Rezaeian, Mohsen

    2017-01-01

    Background: Individuals' listening styles differ based on their characters, professions and situations. This study aimed to assess the validity and reliability of the Listening Styles Profile-Revised (LSP-R) in Iranian students. Methods: After translation into Persian, the LSP-R was administered to a sample of 240 Persian-speaking medical and nursing students in Iran. Statistical analysis was performed to test the reliability and validity of the LSP-R. Results: The study revealed high internal consistency and good test-retest reliability for the Persian version of the questionnaire. The Cronbach's alpha coefficient was 0.72 and the intra-class correlation coefficient was 0.87. The means for the content validity index and the content validity ratio (CVR) were 0.90 and 0.83, respectively. Exploratory factor analysis (EFA) yielded a four-factor solution accounting for 60.8% of the observed variance. The majority of medical students (73%) as well as the majority of nursing students (70%) stated that their listening styles were task-oriented. Conclusion: Overall, the study findings suggest that the Persian version of the LSP-R is a valid and reliable instrument for assessing listening styles in the studied sample.
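    Internal-consistency figures like the Cronbach's alpha of 0.72 reported above can be computed directly from raw item scores. A minimal sketch in Python (the function name and sample data are illustrative, not taken from the study):

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score columns (one list per item).

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    item_var_sum = sum(statistics.variance(col) for col in items)
    return k / (k - 1) * (1 - item_var_sum / statistics.variance(totals))
```

    Two identical item columns give an alpha of exactly 1.0, while weakly related items drive the coefficient toward 0.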

  11. Trade-off between reservoir yield and evaporation losses as a function of lake morphology in semi-arid Brazil.

    PubMed

    Campos, José N B; Lima, Iran E; Studart, Ticiana M C; Nascimento, Luiz S V

    2016-05-31

    This study investigates the relationships between yield and evaporation as a function of lake morphology in semi-arid Brazil. First, a new methodology was proposed to classify the morphology of 40 reservoirs in the Ceará State, with storage capacities ranging from approximately 5 to 4500 hm3. Then, Monte Carlo simulations were conducted to study the effect of reservoir morphology (including real and simplified conical forms) on the water storage process at different reliability levels. The reservoirs were categorized as convex (60.0%), slightly convex (27.5%) or linear (12.5%). When the conical approximation was used instead of the real lake form, a trade-off occurred between reservoir yield and evaporation losses, with different trends for the convex, slightly convex and linear reservoirs. Using the conical approximation, the water yield prediction errors reached approximately 5% of the mean annual inflow, which is negligible for large reservoirs. However, for smaller reservoirs, this error became important. Therefore, this paper presents a new procedure for correcting the yield-evaporation relationships that were obtained by assuming a conical approximation rather than the real reservoir morphology. The combination of this correction with the Regulation Triangle Diagram is useful for rapidly and objectively predicting reservoir yield and evaporation losses in semi-arid environments.
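    The conical approximation referred to above replaces a reservoir's measured stage-storage curve with that of an inverted cone, for which lake surface area scales as storage to the 2/3 power; evaporation losses are proportional to that area. A sketch of the geometry (the side-slope parameter and volumes are illustrative, not values from the study):

```python
import math

def cone_area_from_storage(volume, side_slope):
    """Surface area of an inverted-cone reservoir holding `volume`.

    With water depth h and surface radius r = side_slope * h:
        V = (pi * side_slope**2 / 3) * h**3
        A = pi * (side_slope * h)**2
    so A scales as V**(2/3).
    """
    h = (3.0 * volume / (math.pi * side_slope ** 2)) ** (1.0 / 3.0)
    return math.pi * (side_slope * h) ** 2
```

    Monthly evaporation loss is then the evaporation depth times this area, which is how yield-evaporation trade-offs are simulated under the conical assumption.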

  12. Statistical modelling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1991-01-01

    During the six-month period from 1 April 1991 to 30 September 1991 the following research papers in statistical modeling of software reliability appeared: (1) A Nonparametric Software Reliability Growth Model; (2) On the Use and the Performance of Software Reliability Growth Models; (3) Research and Development Issues in Software Reliability Engineering; (4) Special Issues on Software; and (5) Software Reliability and Safety.

  13. Rate constants and mechanisms for the crystallization of Al nano-goethite under environmentally relevant conditions

    NASA Astrophysics Data System (ADS)

    Bazilevskaya, Ekaterina; Archibald, Douglas D.; Martínez, Carmen Enid

    2012-07-01

    , within error, for both 0 and 2 mol% Al nanoparticle suspensions. Thus, the presence of 2 mol% Al decreased the rate constants determined from analyses of infrared OH-stretching and OH-bending vibrations by 43-57%. We postulate that dissolution re-precipitation reactions are accelerated in aggregate microenvironments by locally increased supersaturation, yielding the dominant mechanism for transformation of ferrihydrite to goethite and goethite crystal growth when bulk ion concentrations are low. Although we did observe growth of a population of prismatic goethite single crystals by TEM, there was more substantial growth of a population of polycrystalline goethite needles that appeared to retain some defects from a preceding aggregation step that we detected with DLS. Since the presence of Al hinders the dissolution of ferrihydrite, it too reduces the rate of crystallization to goethite and its crystal growth. As exemplified in this nano-particle crystallization study, the combination of advanced spectral-curve-resolution algorithms and sensitive and quantitative infrared sampling techniques opens future opportunities for the quantification of mineral phase dynamics in nanocolloidal suspensions, which is important for many aspects of environmental studies.

  14. Sediment yield estimation in mountain catchments of the Camastra reservoir, southern Italy: a comparison among different empirical methods

    NASA Astrophysics Data System (ADS)

    Lazzari, Maurizio; Danese, Maria; Gioia, Dario; Piccarreta, Marco

    2013-04-01

    Sedimentary budget estimation is an important topic for both the scientific community and society, because it is crucial for understanding the dynamics of orogenic belts as well as many practical problems, such as soil conservation and sediment accumulation in reservoirs. Estimations of sediment yield or denudation rates in southern-central Italy are generally obtained by simple empirical relationships based on statistical regression between geomorphic parameters of the drainage network and the measured suspended sediment yield at the outlet of several drainage basins, or through the use of models based on sediment delivery ratios or on soil loss equations. In this work, we perform a study of catchment dynamics and an estimation of sediment yield for several mountain catchments of the central-western sector of the Basilicata region, southern Italy. Sediment yield estimation has been obtained through both an indirect estimation of suspended sediment yield based on the Tu index (mean annual suspended sediment yield; Ciccacci et al., 1980) and the application of the RUSLE (Renard et al., 1997) and USPED (Mitasova et al., 1996) empirical methods. The preliminary results indicate a marked difference between the RUSLE and USPED methods and the estimation based on the Tu index; a critical analysis of the results has been carried out, also considering the present-day spatial distribution of erosion, transport and depositional processes in relation to the maps obtained from the application of these different empirical methods. The studied catchments drain an artificial reservoir (the Camastra dam), where a detailed evaluation of the amount of historical sediment storage has been collected. Sediment yield estimations obtained by means of the empirical methods have been compared and checked against historical data of sediment accumulation measured in the artificial reservoir of the Camastra dam. The validation of such estimations of sediment yield at the scale of large catchments
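    RUSLE, one of the empirical methods named above, estimates mean annual soil loss as a simple product of factors. A sketch of that product (the factor values below are illustrative placeholders; real applications take units and factor tables from the RUSLE handbook):

```python
def rusle_soil_loss(r, k, ls, c, p):
    """RUSLE: A = R * K * LS * C * P.

    R: rainfall erosivity, K: soil erodibility, LS: slope length/steepness,
    C: cover management, P: support practice (the last three dimensionless).
    """
    return r * k * ls * c * p
```

    Distributed implementations such as USPED evaluate factors like LS per grid cell from a DEM rather than per catchment.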

  15. Angular distribution measurements of photo-neutron yields produced by 2.0 GeV electrons incident on thick targets.

    PubMed

    Lee, Hee-Seock; Ban, Syuichi; Sanami, Toshiya; Takahashi, Kazutoshi; Sato, Tatsuhiko; Shin, Kazuo; Chung, Chinwha

    2005-01-01

    A study of differential photo-neutron yields by irradiation with 2 GeV electrons has been carried out. In this extension of a previous study in which measurements were made at an angle of 90 degrees relative to incident electrons, the differential photo-neutron yield was obtained at two other angles, 48 degrees and 140 degrees, to study its angular characteristics. Photo-neutron spectra were measured using a pulsed beam time-of-flight method and a BC418 plastic scintillator. The reliable range of neutron energy measurement was 8-250 MeV. The neutron spectra were measured for 10 Xo-thick Cu, Sn, W and Pb targets. The angular distribution characteristics, together with the previous results for 90 degrees, are presented in the study. The experimental results are compared with Monte Carlo calculation results. The yields predicted by MCNPX 2.5 tend to underestimate the measured ones. The same trend holds for the comparison results using the EGS4 and PICA3 codes.
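    The time-of-flight technique mentioned above converts a measured flight time over a known path into neutron kinetic energy; at the 8-250 MeV energies reported, the relativistic form is required. A hedged sketch (the constants and function name are ours, not from the paper):

```python
import math

C = 299_792_458.0   # speed of light, m/s
M_N = 939.565       # neutron rest energy, MeV

def neutron_energy_mev(flight_path_m, flight_time_s):
    """Relativistic kinetic energy T = M_N * (gamma - 1) from time of flight."""
    beta = flight_path_m / (flight_time_s * C)
    if beta >= 1.0:
        raise ValueError("flight time too short: implies v >= c")
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return M_N * (gamma - 1.0)
```

    For example, a neutron covering a 10 m flight path at beta = 0.6 carries about 235 MeV of kinetic energy.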

  16. Identifying seedling root architectural traits associated with yield and yield components in wheat.

    PubMed

    Xie, Quan; Fernando, Kurukulasuriya M C; Mayes, Sean; Sparkes, Debbie L

    2017-05-01

    Plant roots growing underground are critical for soil resource acquisition, anchorage and plant-environment interactions. In wheat (Triticum aestivum), however, the target root traits to improve yield potential still remain largely unknown. This study aimed to identify traits of seedling root system architecture (RSA) associated with yield and yield components in 226 recombinant inbred lines (RILs) derived from a cross between the bread wheat Triticum aestivum 'Forno' (small, wide root system) and spelt Triticum spelta 'Oberkulmer' (large, narrow root system). A 'pouch and wick' high-throughput phenotyping pipeline was used to determine the RSA traits of 13-day-old RIL seedlings. Two field experiments and one glasshouse experiment were carried out to investigate the yield, yield components and phenology, followed by identification of quantitative trait loci (QTLs). There was substantial variation in RSA traits between genotypes. Seminal root number and total root length were both positively associated with grains m⁻², grains per spike, above-ground biomass m⁻² and grain yield. More seminal roots and longer total root length were also associated with delayed maturity and extended grain filling, likely to be a consequence of more grains being defined before anthesis. Additionally, the maximum width of the root system displayed positive relationships with spikes m⁻², grains m⁻² and grain yield. Ten RILs selected for the longest total roots exhibited the same effects on yield and phenology as described above, compared with the ten lines with the shortest total roots. Genetic analysis revealed 38 QTLs for the RSA, and QTL coincidence between the root and yield traits was frequently observed, indicating tightly linked genes or pleiotropy, which concurs with the results of phenotypic correlation analysis. Based on the results from the Forno × Oberkulmer population, it is proposed that vigorous early root growth, particularly more seminal roots and longer total

  17. Measurements of branching fraction ratios and CP-asymmetries in suppressed B{sup -}{yields}D({yields}K{sup +}{pi}{sup -})K{sup -} and B{sup -}{yields}D({yields}K{sup +}{pi}{sup -}){pi}{sup -} decays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aaltonen, T.; Brucken, E.; Devoto, F.

    2011-11-01

    We report the first reconstruction in hadron collisions of the suppressed decays B{sup -}{yields}D({yields}K{sup +}{pi}{sup -})K{sup -} and B{sup -}{yields}D({yields}K{sup +}{pi}{sup -}){pi}{sup -}, sensitive to the Cabibbo-Kobayashi-Maskawa phase {gamma}, using data from 7 fb{sup -1} of integrated luminosity collected by the CDF II detector at the Tevatron collider. We reconstruct a signal for the B{sup -}{yields}D({yields}K{sup +}{pi}{sup -})K{sup -} suppressed mode with a significance of 3.2 standard deviations, and measure the ratios of the suppressed to favored branching fractions R(K)=[22.0{+-}8.6(stat){+-}2.6(syst)]x10{sup -3}, R{sup +}(K)=[42.6{+-}13.7(stat){+-}2.8(syst)]x10{sup -3}, R{sup -}(K)=[3.8{+-}10.3(stat){+-}2.7(syst)]x10{sup -3} as well as the direct CP-violating asymmetry A(K)=-0.82{+-}0.44(stat){+-}0.09(syst) of this mode. Corresponding quantities for B{sup -}{yields}D({yields}K{sup +}{pi}{sup -}){pi}{sup -} decay are also reported.

  18. Facial disability index (FDI): Adaptation to Spanish, reliability and validity

    PubMed Central

    Gonzalez-Cardero, Eduardo; Cayuela, Aurelio; Acosta-Feria, Manuel; Gutierrez-Perez, Jose-Luis

    2012-01-01

    Objectives: To adapt to Spanish the facial disability index (FDI) described by VanSwearingen and Brach in 1995 and to assess its reliability and validity in patients with facial nerve paresis after parotidectomy. Study Design: The present study was conducted in two different stages: a) cross-cultural adaptation of the questionnaire and b) cross-sectional study of a control group of 79 Spanish-speaking patients who suffered facial paresis after superficial parotidectomy with facial nerve preservation. The cross-cultural adaptation process comprised the following stages: (I) initial translation, (II) synthesis of the translated document, (III) retro-translation, (IV) review by a board of experts, (V) pilot study of the pre-final draft and (VI) analysis of the pilot study and final draft. Results: The reliability and internal consistency of every one of the rating scales included in the FDI (Cronbach’s alpha coefficient) was 0.83 for the complete scale and 0.77 and 0.82 for the physical and the social well-being subscales. The analysis of the factorial validity of the main components of the adapted FDI yielded similar results to the original questionnaire. Bivariate correlations between FDI and House-Brackmann scale were positive. The variance percentage was calculated for all FDI components. Conclusions: The FDI questionnaire is a specific instrument for assessing facial neuromuscular dysfunction which becomes a useful tool in order to determine quality of life in patients with facial nerve paralysis. Spanish adapted FDI is equivalent to the original questionnaire and shows similar reliability and validity. The proven reproducibility, reliability and validity of this questionnaire make it a useful additional tool for evaluating the impact of facial nerve paralysis in Spanish-speaking patients. Key words: Parotidectomy, facial nerve paralysis, facial disability. PMID:22926474

  19. Bayesian methods in reliability

    NASA Astrophysics Data System (ADS)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.

  20. Reliability model generator specification

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C.; Mccann, Catherine

    1990-01-01

    The Reliability Model Generator (RMG), a program which produces reliability models from block diagrams for ASSIST, the interface for the reliability evaluation tool SURE, is described. An account is given of the motivation for RMG, and the implemented algorithms are discussed. The appendices contain the algorithms and two detailed traces of examples.

  1. Transit Reliability Information Program : PATCO-WMATA Propulsion System Reliability/Productivity Analysis

    DOT National Transportation Integrated Search

    1984-10-01

    The Transit Reliability Information Program (TRIP) is a government-initiated program to assist the transit industry in satisfying its need for transit reliability information. TRIP provides this assistance through the operation of a national data ban...

  2. Reliability and validity of two isometric squat tests.

    PubMed

    Blazevich, Anthony J; Gill, Nicholas; Newton, Robert U

    2002-05-01

    The purpose of the present study was first to examine the reliability of isometric squat (IS) and isometric forward hack squat (IFHS) tests to determine if repeated measures on the same subjects yielded reliable results. The second purpose was to examine the relation between isometric and dynamic measures of strength to assess validity. Fourteen male subjects performed maximal IS and IFHS tests on 2 occasions and 1 repetition maximum (1-RM) free-weight squat and forward hack squat (FHS) tests on 1 occasion. The 2 tests were found to be highly reliable (intraclass correlation coefficient [ICC](IS) = 0.97 and ICC(IFHS) = 1.00). There was a strong relation between average IS and 1-RM squat performance, and between IFHS and 1-RM FHS performance (r(squat) = 0.77, r(FHS) = 0.76; p < 0.01), but a weak relation between squat and FHS test performances (r < 0.55). There was also no difference between observed 1-RM values and those predicted by our regression equations. Errors in predicting 1-RM performance were in the order of 8.5% (standard error of the estimate [SEE] = 13.8 kg) and 7.3% (SEE = 19.4 kg) for IS and IFHS respectively. Correlations between isometric and 1-RM tests were not of sufficient size to indicate high validity of the isometric tests. Together the results suggest that IS and IFHS tests could detect small differences in multijoint isometric strength between subjects, or performance changes over time, and that the scores in the isometric tests are well related to 1-RM performance. However, there was a small error when predicting 1-RM performance from isometric performance, and these tests have not been shown to discriminate between small changes in dynamic strength. The weak relation between squat and FHS test performance can be attributed to differences in the movement patterns of the tests.
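    The 1-RM predictions above come from regressing dynamic strength on isometric scores, and the standard error of the estimate (SEE) is the root-mean-square residual of that regression. A minimal sketch of both computations (function name and sample numbers are ours, not the study's data):

```python
import math

def fit_and_see(x, y):
    """Least-squares line y = a + b*x plus the standard error of the estimate,
    SEE = sqrt(sum(residual^2) / (n - 2))."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    residuals = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    see = math.sqrt(sum(r * r for r in residuals) / (n - 2))
    return a, b, see
```

    The SEE is in the units of the predicted variable (kg here), which is why the abstract can quote it alongside a percentage error.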

  3. Interval Predictor Models with a Formal Characterization of Uncertainty and Reliability

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.

    2014-01-01

    This paper develops techniques for constructing empirical predictor models based on observations. By contrast to standard models, which yield a single predicted output at each value of the model's inputs, Interval Predictor Models (IPMs) yield an interval into which the unobserved output is predicted to fall. The IPMs proposed prescribe the output as an interval-valued function of the model's inputs, rendering a formal description of both the uncertainty in the model's parameters and the spread in the predicted output. Uncertainty is prescribed as a hyper-rectangular set in the space of the model's parameters. The propagation of this set through the empirical model yields a range of outputs of minimal spread containing all (or, depending on the formulation, most) of the observations. Optimization-based strategies for calculating IPMs and eliminating the effects of outliers are proposed. Outliers are identified by evaluating the extent to which they degrade the tightness of the prediction. This evaluation can be carried out while the IPM is calculated. When the data satisfy mild stochastic assumptions, and the optimization program used for calculating the IPM is convex (or its solution coincides with the solution to an auxiliary convex program), the model's reliability (that is, the probability that a future observation will fall within the predicted range of outputs) can be bounded rigorously by a non-asymptotic formula.
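    The contrast drawn above, a predicted interval rather than a single value, can be illustrated with a deliberately crude interval predictor: fit a central line, then widen it just enough to cover every observation. This is only a toy stand-in for the optimization-based IPMs of the paper:

```python
def interval_predictor(x, y):
    """Toy interval predictor: least-squares centre line, widened symmetrically
    by the largest training residual so every observation falls inside."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    radius = max(abs(yi - (a + b * xi)) for xi, yi in zip(x, y))

    def predict(xq):
        centre = a + b * xq
        return centre - radius, centre + radius

    return predict
```

    By construction no training point falls outside the returned interval; the paper's formulation instead minimizes the interval's spread subject to that coverage constraint and treats outliers explicitly.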

  4. Space Shuttle Propulsion System Reliability

    NASA Technical Reports Server (NTRS)

    Welzyn, Ken; VanHooser, Katherine; Moore, Dennis; Wood, David

    2011-01-01

    This session includes the following sessions: (1) External Tank (ET) System Reliability and Lessons, (2) Space Shuttle Main Engine (SSME), Reliability Validated by a Million Seconds of Testing, (3) Reusable Solid Rocket Motor (RSRM) Reliability via Process Control, and (4) Solid Rocket Booster (SRB) Reliability via Acceptance and Testing.

  5. Gene expression information improves reliability of receptor status in breast cancer patients

    PubMed Central

    Kenn, Michael; Schlangen, Karin; Castillo-Tong, Dan Cacsire; Singer, Christian F.; Cibena, Michael; Koelbl, Heinz; Schreiner, Wolfgang

    2017-01-01

    Immunohistochemical (IHC) determination of receptor status in breast cancer patients is frequently inaccurate. Since it directs the choice of systemic therapy, it is essential to increase its reliability. We increase the validity of IHC receptor expression by additionally considering gene expression (GE) measurements. Crisp therapeutic decisions are based on IHC estimates, even if they are only borderline reliable. We further improve decision quality by a responsibility function, defining a critical domain for gene expression. Refined normalization is devised to file any newly diagnosed patient into existing databases. Our approach renders receptor estimates more reliable by identifying patients with questionable receptor status. The approach is also more efficient since the rate of conclusive samples is increased. We have curated and evaluated gene expression data, together with clinical information, from 2880 breast cancer patients. Combining IHC with gene expression information yields a method that is both more reliable and more efficient than common practice to date. Several types of possibly suboptimal treatment allocations, based on IHC receptor status alone, are enumerated. A ‘therapy allocation check’ identifies patients possibly misclassified. Estrogen: false negative 8%, false positive 6%. Progesterone: false negative 14%, false positive 11%. HER2: false negative 2%, false positive 50%. Possible implications are discussed. We propose an ‘expression look-up-plot’, allowing for a significant potential to improve the quality of precision medicine. Methods are developed and exemplified here for breast cancer patients, but they may readily be transferred to diagnostic data relevant for therapeutic decisions in other fields of oncology. PMID:29100391

  6. Transferring Aviation Practices into Clinical Medicine for the Promotion of High Reliability.

    PubMed

    Powell-Dunford, Nicole; McPherson, Mark K; Pina, Joseph S; Gaydos, Steven J

    2017-05-01

    Aviation is a classic example of a high reliability organization (HRO)-an organization in which catastrophic events are expected to occur without control measures. As health care systems transition toward high reliability, aviation practices are increasingly transferred for clinical implementation. A PubMed search using the terms aviation, crew resource management, and patient safety was undertaken. Manuscripts authored by physician pilots and accident investigation regulations were analyzed. Subject matter experts involved in adoption of aviation practices into the medical field were interviewed. A PubMed search yielded 621 results with 22 relevant for inclusion. Improved clinical outcomes were noted in five research trials in which aviation practices were adopted, particularly with regard to checklist usage and crew resource-management training. Effectiveness of interventions was influenced by intensity of application, leadership involvement, and provision of staff training. The usefulness of incorporating mishap investigation techniques has not been established. Whereas aviation accident investigation is highly standardized, the investigation of medical error is characterized by variation. The adoption of aviation practices into clinical medicine facilitates an evolution toward high reliability. Evidence for the efficacy of the checklist and crew resource-management training is robust. Transference of aviation accident investigation practices is preliminary. A standardized, independent investigation process could facilitate the development of a safety culture commensurate with that achieved in the aviation industry.Powell-Dunford N, McPherson MK, Pina JS, Gaydos SJ. Transferring aviation practices into clinical medicine for the promotion of high reliability. Aerosp Med Hum Perform. 2017; 88(5):487-491.

  7. Reliability and smallest real difference of the ankle lunge test post ankle fracture.

    PubMed

    Simondson, David; Brock, Kim; Cotton, Susan

    2012-02-01

    This study aimed to determine the reliability and the smallest real difference of the Ankle Lunge test in an ankle fracture patient population. In the post-immobilisation stage of ankle fracture, ankle dorsiflexion is an important measure of progress and outcome. The Ankle Lunge test measures weight-bearing dorsiflexion, resulting in negative scores (knee to wall distance) and positive scores (toe to wall distance), of which only the latter has proven reliability, in normal subjects. A consecutive sample of ankle fracture patients with permission to commence weight bearing were recruited to the study. Three measurements of the Ankle Lunge test were performed by each of two raters, one senior and one junior physiotherapist. These occurred prior to therapy sessions in the second week after plaster removal. A standardised testing station was utilised and allowed for both knee-to-wall and toe-to-wall distance measurement. Data were collected from 10 individuals with ankle fracture, with an average age of 36 years (SD 14.8). Seventy-seven percent of observations were negative. Intra- and inter-rater reliability yielded intraclass correlation coefficients at or above 0.97, p < .001. There was a significant systematic bias towards improved scores during repeated measurement for one rater (p = .01). The smallest real difference was calculated as 13.8 mm. The Ankle Lunge test is a practical and reliable tool for measuring weight-bearing dorsiflexion post ankle fracture. Copyright © 2011 Elsevier Ltd. All rights reserved.
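    One common way to obtain a smallest real difference (SRD) such as the 13.8 mm above is from the standard error of measurement (SEM), computed from the between-subject SD and the ICC; the abstract does not state which formula the authors used, so treat this sketch as illustrative:

```python
import math

def smallest_real_difference(sd, icc, z=1.96):
    """SRD = z * sqrt(2) * SEM, with SEM = SD * sqrt(1 - ICC).

    The sqrt(2) accounts for measurement error in both test and retest;
    z = 1.96 gives the 95% threshold for a real change between two tests."""
    sem = sd * math.sqrt(1.0 - icc)
    return z * math.sqrt(2.0) * sem
```

    A difference between two sessions smaller than the SRD cannot be distinguished from measurement noise at the chosen confidence level.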

  8. Inventing the future of reliability: FERC's recent orders and the consolidation of reliability authority

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skees, J. Daniel

    2010-06-15

    The Energy Policy Act of 2005 established mandatory reliability standard enforcement under a system in which the Federal Energy Regulatory Commission and the Electric Reliability Organization would have their own spheres of responsibility and authority. Recent orders, however, reflect the Commission's frustration with the reliability standard drafting process and suggest that the Electric Reliability Organization's discretion is likely to receive less deference in the future. (author)

  9. Multi-scale modeling to relate Be surface temperatures, concentrations and molecular sputtering yields

    NASA Astrophysics Data System (ADS)

    Lasa, Ane; Safi, Elnaz; Nordlund, Kai

    2015-11-01

    Recent experiments and Molecular Dynamics (MD) simulations show erosion rates of Be exposed to deuterium (D) plasma varying with surface temperature and the correlated D concentration. Little is understood about how these three parameters relate for Be surfaces, despite this being essential for reliable prediction of impurity transport and plasma-facing material lifetime in current (JET) and future (ITER) devices. A multi-scale exercise is presented here to relate Be surface temperatures, concentrations and sputtering yields. The kinetic Monte Carlo (KMC) code MMonCa is used to estimate equilibrium D concentrations in Be at different temperatures. Then, mixed Be-D surfaces that correspond to the KMC profiles are generated in MD to calculate Be-D molecular erosion yields due to D irradiation. With this new database implemented in the 3D MC impurity transport code ERO, modeling scenarios studying wall erosion, such as RF-induced enhanced limiter erosion or the main-wall surface temperature scans run at JET, can be revisited with higher confidence. Work supported by U.S. DOE under Contract DE-AC05-00OR22725.

  10. Nitrate radical oxidation of γ-terpinene: hydroxy nitrate, total organic nitrate, and secondary organic aerosol yields

    NASA Astrophysics Data System (ADS)

    Slade, Jonathan H.; de Perre, Chloé; Lee, Linda; Shepson, Paul B.

    2017-07-01

    Polyolefinic monoterpenes represent a potentially important but understudied source of organic nitrates (ONs) and secondary organic aerosol (SOA) following oxidation due to their high reactivity and propensity for multi-stage chemistry. Recent modeling work suggests that the oxidation of polyolefinic γ-terpinene can be the dominant source of nighttime ON in a mixed forest environment. However, the ON yields, aerosol partitioning behavior, and SOA yields from γ-terpinene oxidation by the nitrate radical (NO3), an important nighttime oxidant, have not been determined experimentally. In this work, we present a comprehensive experimental investigation of the total (gas + particle) ON, hydroxy nitrate, and SOA yields following γ-terpinene oxidation by NO3. Under dry conditions, the hydroxy nitrate yield = 4(+1/-3) %, total ON yield = 14(+3/-2) %, and SOA yield ≤ 10 % under atmospherically relevant particle mass loadings, similar to those for α-pinene + NO3. Using a chemical box model, we show that the measured concentrations of NO2 and γ-terpinene hydroxy nitrates can be reliably simulated from α-pinene + NO3 chemistry. This suggests that NO3 addition to either of the two internal double bonds of γ-terpinene primarily decomposes forming a relatively volatile keto-aldehyde, reconciling the small SOA yield observed here and for other internal olefinic terpenes. Based on aerosol partitioning analysis and identification of speciated particle-phase ON applying high-resolution liquid chromatography-mass spectrometry, we estimate that a significant fraction of the particle-phase ON has the hydroxy nitrate moiety. This work greatly contributes to our understanding of ON and SOA formation from polyolefin monoterpene oxidation, which could be important in the northern continental US and the Midwest, where polyolefinic monoterpene emissions are greatest.

  11. Multi-approach assessment of the spatial distribution of the specific yield: application to the Crau plain aquifer, France

    NASA Astrophysics Data System (ADS)

    Seraphin, Pierre; Gonçalvès, Julio; Vallet-Coulomb, Christine; Champollion, Cédric

    2018-03-01

    Spatially distributed values of the specific yield, a fundamental parameter for transient groundwater mass balance calculations, were obtained by means of three independent methods for the Crau plain, France. In contrast to its traditional use to assess recharge based on a given specific yield, the water-table fluctuation (WTF) method, applied using major recharging events, gave a first set of reference values. Then, large infiltration processes recorded by monitored boreholes and caused by major precipitation events were interpreted in terms of specific yield by means of a one-dimensional vertical numerical model solving Richards' equations within the unsaturated zone. Finally, two gravity field campaigns, at low and high piezometric levels, were carried out to assess the groundwater mass variation and thus alternative specific yield values. The range obtained by the WTF method for this aquifer made of alluvial detrital material was 2.9-26%, in line with the scarce data available so far. The average spatial value of specific yield by the WTF method (9.1%) is consistent with the aquifer-scale value from the hydro-gravimetric approach. In this investigation, an estimate of the hitherto unknown spatial distribution of the specific yield over the Crau plain was obtained using the most reliable method (the WTF method). A groundwater mass balance calculation over the domain using this distribution yielded similar results to an independent quantification based on a stable isotope-mixing model. This agreement reinforces the relevance of such estimates, which can be used to build a more accurate transient hydrogeological model.
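
    The WTF method's core relation is recharge = Sy × water-table rise; the study inverts it to estimate the specific yield Sy from recharge events of known magnitude. A minimal sketch of that inversion, with hypothetical event numbers (not the study's data):

```python
def specific_yield_wtf(recharge_mm, head_rise_mm):
    """Water-table fluctuation (WTF) estimate of specific yield.

    Assumes all recharge from a discrete event goes into storage, so
    Sy = recharge / water-table rise (both in consistent length units).
    """
    if head_rise_mm <= 0:
        raise ValueError("head rise must be positive")
    return recharge_mm / head_rise_mm

# Hypothetical event: 45 mm of recharge raises the water table by 0.5 m.
sy = specific_yield_wtf(recharge_mm=45.0, head_rise_mm=500.0)
print(round(sy, 3))  # 0.09, i.e. 9% -- the order of magnitude reported above
```

    Mapping such estimates event-by-event and borehole-by-borehole is what yields the spatial distribution discussed in the abstract.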

  13. Plateletpheresis efficiency and mathematical correction of software-derived platelet yield prediction: A linear regression and ROC modeling approach.

    PubMed

    Jaime-Pérez, José Carlos; Jiménez-Castillo, Raúl Alberto; Vázquez-Hernández, Karina Elizabeth; Salazar-Riojas, Rosario; Méndez-Ramírez, Nereida; Gómez-Almaguer, David

    2017-10-01

    Advances in automated cell separators have improved the efficiency of plateletpheresis and the possibility of obtaining double products (DP). We assessed the accuracy of the cell processor's predicted platelet (PLT) yields with the goal of better predicting DP collections. This retrospective proof-of-concept study included 302 plateletpheresis procedures performed on a Trima Accel v6.0 at the apheresis unit of a hematology department. Donor variables, software-predicted yield and actual PLT yield were statistically evaluated. The software prediction was optimized by linear regression analysis, and its optimal cut-off for obtaining a DP was assessed by receiver operating characteristic (ROC) curve modeling. Three hundred and two plateletpheresis procedures were performed; on 271 (89.7%) occasions the donors were men and on 31 (10.3%) women. Pre-donation PLT count had the best direct correlation with actual PLT yield (r = 0.486, P < .001). Mean software-derived values differed significantly from actual PLT yields, 4.72 × 10^11 vs. 6.12 × 10^11, respectively (P < .001). The following equation was developed to adjust these values: actual PLT yield = 0.221 + (1.254 × theoretical platelet yield). The ROC curve model showed an optimal apheresis device software prediction cut-off of 4.65 × 10^11 to obtain a DP, with a sensitivity of 82.2%, specificity of 93.3%, and an area under the curve (AUC) of 0.909. The Trima Accel v6.0 software consistently underestimated PLT yields. A simple correction derived from linear regression analysis accurately corrected this underestimation, and ROC analysis identified a precise cut-off to reliably predict a DP. © 2016 Wiley Periodicals, Inc.
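
    The correction equation and ROC cutoff reported in the abstract can be applied directly; a small sketch (coefficients and cutoff are from the abstract, the function names are ours):

```python
def corrected_plt_yield(software_yield_e11):
    """Linear correction from the abstract (units of 10^11 platelets):
    actual PLT yield = 0.221 + 1.254 * software-predicted yield."""
    return 0.221 + 1.254 * software_yield_e11

def predicts_double_product(software_yield_e11, cutoff=4.65):
    """ROC-derived cutoff of 4.65 x 10^11 applied to the raw software
    prediction to decide whether a double product (DP) is expected."""
    return software_yield_e11 >= cutoff

# A software-predicted yield of 4.72 x 10^11 corrects to ~6.14 x 10^11,
# close to the mean actual yield of 6.12 x 10^11 reported above.
print(round(corrected_plt_yield(4.72), 2), predicts_double_product(4.72))
```

    Note that the cutoff is applied to the uncorrected software prediction, as in the ROC analysis of the study.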

  14. High yield neutron generators using the DD reaction

    NASA Astrophysics Data System (ADS)

    Vainionpaa, J. H.; Harris, J. L.; Piestrup, M. A.; Gary, C. K.; Williams, D. L.; Apodaca, M. D.; Cremer, J. T.; Ji, Qing; Ludewigt, B. A.; Jones, G.

    2013-04-01

    A product line of high-yield neutron generators has been developed at Adelphi Technology Inc. The generators use the D-D fusion reaction and are driven by an ion beam supplied by a microwave ion source. Yields of up to 5 × 10^9 n/s have been achieved, comparable to those obtained using the more efficient D-T reaction. The microwave-driven plasma uses electron cyclotron resonance (ECR) to produce a high plasma density for a high beam current and a high atomic ion fraction. These generators have an actively pumped vacuum system that allows operation at reduced pressure in the target chamber, increasing overall system reliability. Since no radioactive tritium is used, the generators can be easily serviced and components easily replaced, providing an essentially unlimited lifetime. The fast-neutron source size can be adjusted by selecting the aperture and target geometries according to customer specifications. Pulsed and continuous operation has been demonstrated, with minimum pulse lengths of 50 μs achieved. Since the generators are easily serviceable, they offer a long-lifetime neutron source for laboratories and commercial systems requiring continuous operation. Several of the generators have been enclosed in radiation shielding/moderator structures designed to customer specifications. These generators have proven useful for prompt gamma neutron activation analysis (PGNAA), neutron activation analysis (NAA) and fast neutron radiography. Thus, these generators make excellent fast, epithermal and thermal neutron sources for laboratories and industrial applications that require neutrons with safe operation, a small footprint, low cost and a small regulatory burden.

  15. General Aviation Aircraft Reliability Study

    NASA Technical Reports Server (NTRS)

    Pettit, Duane; Turnbull, Andrew; Roelant, Henk A. (Technical Monitor)

    2001-01-01

    This reliability study was performed in order to provide the aviation community with an estimate of Complex General Aviation (GA) Aircraft System reliability. To successfully improve the safety and reliability for the next generation of GA aircraft, a study of current GA aircraft attributes was prudent. This was accomplished by benchmarking the reliability of operational Complex GA Aircraft Systems. Specifically, Complex GA Aircraft System reliability was estimated using data obtained from the logbooks of a random sample of the Complex GA Aircraft population.

  16. Tactile Acuity Charts: A Reliable Measure of Spatial Acuity

    PubMed Central

    Bruns, Patrick; Camargo, Carlos J.; Campanella, Humberto; Esteve, Jaume; Dinse, Hubert R.; Röder, Brigitte

    2014-01-01

    For assessing tactile spatial resolution, it has recently been recommended to use tactile acuity charts, which follow the design principles of the Snellen letter charts for visual acuity and involve active touch. However, it is currently unknown whether acuity thresholds obtained with this newly developed psychophysical procedure are in accordance with established measures of tactile acuity that involve passive contact with fixed duration and control of contact force. Here we directly compared tactile acuity thresholds obtained with the acuity charts to traditional two-point and grating orientation thresholds in a group of young healthy adults. For this purpose, two types of charts, using either Braille-like dot patterns or embossed Landolt rings with different orientations, were adapted from previous studies. Measurements with the two types of charts were equivalent, but generally more reliable with the dot pattern chart. A comparison with the two-point and grating orientation task data showed that the test-retest reliability of the acuity chart measurements after one week was superior to that of the passive methods. Individual thresholds obtained with the acuity charts agreed reasonably with the grating orientation threshold, but less so with the two-point threshold, which yielded relatively distinct acuity estimates compared to the other methods. This potentially considerable amount of mismatch between different measures of tactile acuity suggests that tactile spatial resolution is a complex entity that should ideally be measured with different methods in parallel. The simple test procedure and high reliability of the acuity charts make them a promising complement and alternative to the traditional two-point and grating orientation thresholds. PMID:24504346

  17. Reliability and validity of the Brief Pain Inventory in individuals with chronic obstructive pulmonary disease.

    PubMed

    Chen, Y-W; HajGhanbari, B; Road, J D; Coxson, H O; Camp, P G; Reid, W D

    2018-06-08

    Pain is prevalent in chronic obstructive pulmonary disease (COPD), and the Brief Pain Inventory (BPI) appears to be a feasible questionnaire for assessing this symptom. However, the reliability and validity of the BPI have not been determined in individuals with COPD. This study aimed to determine the internal consistency, test-retest reliability and validity (construct, convergent, divergent and discriminant) of the BPI in individuals with COPD. To examine test-retest reliability, individuals with COPD were recruited from pulmonary rehabilitation programmes to complete the BPI twice, 1 week apart. To investigate validity, de-identified data were retrieved from two previous studies, including forced expiratory volume in 1 s, age, sex and data from four questionnaires: the BPI, short-form McGill Pain Questionnaire (SF-MPQ), 36-Item Short Form Survey (SF-36) and Community Healthy Activities Model Program for Seniors (CHAMPS) questionnaire. In total, 123 participants were included in the analyses (eligible data were retrieved from 86 participants, and an additional 37 participants were recruited). The BPI demonstrated excellent internal consistency and test-retest reliability. It also showed convergent validity with the SF-MPQ and divergent validity with the SF-36. The factor analysis yielded two factors of the BPI, demonstrating that the two domains of the BPI measure the intended constructs. The BPI can also discriminate pain levels among COPD patients with varied levels of quality of life (SF-36) and physical activity (CHAMPS). The BPI is a reliable and valid pain questionnaire that can be used to evaluate pain in COPD. This study formally established the reliability and validity of the BPI in individuals with COPD, which had not previously been determined in this patient group. The results provide strong evidence that assessment results from this pain questionnaire are reliable and valid. © 2018 European Pain Federation - EFIC®.
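
    The internal-consistency result reported above is conventionally computed as Cronbach's alpha. A minimal sketch with hypothetical item scores (not the study's data):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.

    item_scores: one list per item, each holding scores across respondents.
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score).
    """
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(item_scores)
    totals = [sum(col) for col in zip(*item_scores)]  # per-respondent totals
    return k / (k - 1) * (1 - sum(var(item) for item in item_scores) / var(totals))

# Three pain items scored 0-10 by five hypothetical respondents.
items = [[2, 4, 6, 8, 5],
         [3, 4, 7, 9, 5],
         [2, 5, 6, 8, 6]]
print(round(cronbach_alpha(items), 2))  # 0.98 -- "excellent" internal consistency
```

    Values above roughly 0.9 are conventionally described as excellent, matching the abstract's characterization.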

  18. Key Reliability Drivers of Liquid Propulsion Engines and A Reliability Model for Sensitivity Analysis

    NASA Technical Reports Server (NTRS)

    Huang, Zhao-Feng; Fint, Jeffry A.; Kuck, Frederick M.

    2005-01-01

    This paper addresses the in-flight reliability of a liquid propulsion engine system for a launch vehicle. We first establish a comprehensive list of system and subsystem reliability drivers for any liquid propulsion engine system. We then build a reliability model to parametrically analyze the impact of some reliability parameters, and present sensitivity analysis results for a selected subset of the key reliability drivers using the model. Reliability drivers identified include: number of engines for the liquid propulsion stage, single-engine total reliability, engine operation duration, engine thrust size, reusability, engine de-rating or up-rating, engine-out design (including engine-out switching reliability, catastrophic fraction, preventable failure fraction, unnecessary shutdown fraction), propellant-specific hazards, engine start and cutoff transient hazards, engine combustion cycles, vehicle and engine interface and interaction hazards, engine health management system, engine modification, engine ground start hold-down with launch commit criteria, engine altitude start (1 in. start), multiple altitude restart (less than 1 restart), component, subsystem and system design, manufacturing/ground operation support/pre- and post-flight checkouts and inspection, and extensiveness of the development program. We present sensitivity analysis results for the following subset of the drivers: number of engines for the propulsion stage, single-engine total reliability, engine operation duration, engine de-rating or up-rating requirements, engine-out design, catastrophic fraction, preventable failure fraction, unnecessary shutdown fraction, and engine health management system implementation (basic redlines and more advanced health management systems).
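
    Several of the drivers above (number of engines, single-engine reliability, catastrophic fraction, engine-out design) combine multinomially in a basic stage-reliability calculation. A toy sketch of such a model, with illustrative numbers; this is not the paper's actual model:

```python
from math import comb

def stage_reliability(n, r_engine, cat_frac, engines_out_tolerated=1):
    """Toy engine-out stage-reliability model (illustrative only).

    Each engine succeeds with probability r_engine. A failure is
    catastrophic (loss of stage) with probability cat_frac; otherwise it
    is benign, and the stage survives if no more than
    `engines_out_tolerated` engines fail benignly.
    """
    q = 1.0 - r_engine  # per-engine failure probability
    total = 0.0
    for k in range(engines_out_tolerated + 1):
        # exactly k benign failures, no catastrophic failures
        total += comb(n, k) * ((1 - cat_frac) * q) ** k * r_engine ** (n - k)
    return total

# 5 engines at 0.995 each, 20% of failures catastrophic, one engine-out allowed.
print(round(stage_reliability(5, 0.995, 0.2), 5))
```

    Sweeping n, r_engine or cat_frac in such a model is the kind of sensitivity analysis the paper reports, though its model includes many more drivers.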

  19. Yield surface evolution for columnar ice

    NASA Astrophysics Data System (ADS)

    Zhou, Zhiwei; Ma, Wei; Zhang, Shujuan; Mu, Yanhu; Zhao, Shunpin; Li, Guoyu

    A series of triaxial compression tests, capable of measuring the volumetric strain of the sample, was conducted on columnar ice. A new testing approach, probing the experimental yield surface of a single sample, was used to investigate the yield and hardening behaviors of columnar ice under complex stress states. Based on the characteristics of the volumetric strain, a new method of defining the multiaxial yield strengths of columnar ice is proposed. The experimental yield surface remains elliptical in the stress space of effective stress versus mean stress. The effects of temperature, loading rate and loading path on the initial yield surface and deformation properties of the columnar ice were also studied. Subsequent yield surfaces of the columnar ice were explored using uniaxial and hydrostatic paths. The evolution of the subsequent yield surface exhibits significant path-dependent characteristics. The multiaxial hardening law of the columnar ice was established experimentally, and a phenomenological yield criterion is presented for its multiaxial yield and hardening behaviors. Comparisons between the theoretical and measured results indicate that the current model gives a reasonable prediction of the multiaxial yield and post-yield properties of columnar ice subjected to different temperature, loading rate and path conditions.
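
    An elliptical yield surface in (mean stress p, effective stress q) space can be expressed as a quadratic yield function that is negative inside the surface and zero on it. A sketch with illustrative parameters, not values fitted to the paper's data:

```python
def elliptical_yield(p, q, p0, a_p, b_q):
    """Elliptical yield function in (mean stress p, effective stress q) space:
    f = ((p - p0)/a_p)^2 + (q/b_q)^2 - 1, so f < 0 inside the yield surface
    and f = 0 on it. Centre p0 and semi-axes a_p, b_q are illustrative
    fitting parameters, not the paper's calibration.
    """
    return ((p - p0) / a_p) ** 2 + (q / b_q) ** 2 - 1.0

# A stress state inside a hypothetical initial yield surface:
print(elliptical_yield(p=2.0, q=1.0, p0=3.0, a_p=4.0, b_q=3.0) < 0.0)  # True
```

    Hardening can then be modeled by letting p0, a_p and b_q evolve with accumulated plastic strain, which is the role of the path-dependent hardening law described above.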

  20. Detection of faults and software reliability analysis

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1987-01-01

    Multi-version or N-version programming is proposed as a method of providing fault tolerance in software. The approach requires the separate, independent preparation of multiple versions of a piece of software for some application. These versions are executed in parallel in the application environment; each receives identical inputs and each produces its version of the required outputs. The outputs are collected by a voter and, in principle, they should all be the same. In practice there may be some disagreement. If this occurs, the results of the majority are taken to be the correct output, and that is the output used by the system. A total of 27 programs were produced. Each of these programs was then subjected to one million randomly-generated test cases. The experiment yielded a number of programs containing faults that are useful for general studies of software reliability as well as studies of N-version programming. Fault tolerance through data diversity and analytic models of comparison testing are discussed.
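
    The voting scheme described above can be sketched as a simple majority voter. This is a toy sketch: the experiment's voter compared structured program outputs, whereas here each version just returns a single value:

```python
from collections import Counter

def majority_vote(outputs):
    """Collect the outputs of N independently developed versions and
    return the majority result; raise if no strict majority exists
    (a real voter would flag this as an unresolvable disagreement)."""
    value, count = Counter(outputs).most_common(1)[0]
    if count <= len(outputs) // 2:
        raise RuntimeError("no majority among versions")
    return value

# Three versions run on the same input; one contains a fault.
print(majority_vote([42, 42, 41]))  # 42
```

    With 27 versions, as in the experiment, the same voter takes the output produced by the largest agreeing subset, provided it is a strict majority.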

  1. Reliability and validity of the test of incremental respiratory endurance measures of inspiratory muscle performance in COPD.

    PubMed

    Formiga, Magno F; Roach, Kathryn E; Vital, Isabel; Urdaneta, Gisel; Balestrini, Kira; Calderon-Candelario, Rafael A; Campos, Michael A; Cahalin, Lawrence P

    2018-01-01

    The Test of Incremental Respiratory Endurance (TIRE) provides a comprehensive assessment of inspiratory muscle performance by measuring maximal inspiratory pressure (MIP) over time. The integration of MIP over inspiratory duration (ID) provides the sustained maximal inspiratory pressure (SMIP). Evidence on the reliability and validity of these measurements in COPD is not currently available. Therefore, we assessed the reliability, responsiveness and construct validity of the TIRE measures of inspiratory muscle performance in subjects with COPD. Test-retest reliability, known-groups and convergent validity assessments were implemented simultaneously in 81 male subjects with mild to very severe COPD. TIRE measures were obtained using the portable PrO2 device, following standard guidelines. All TIRE measures were found to be highly reliable, with SMIP demonstrating the strongest test-retest reliability with a nearly perfect intraclass correlation coefficient (ICC) of 0.99, while MIP and ID clustered closely together behind SMIP with ICC values of about 0.97. Our findings also demonstrated known-groups validity of all TIRE measures, with SMIP and ID yielding larger effect sizes when compared to MIP in distinguishing between subjects of different COPD status. Finally, our analyses confirmed convergent validity for both SMIP and ID, but not MIP. The TIRE measures of MIP, SMIP and ID have excellent test-retest reliability and demonstrated known-groups validity in subjects with COPD. SMIP and ID also demonstrated evidence of moderate convergent validity and appear to be more stable measures in this patient population than the traditional MIP.
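
    SMIP is described above as the integral of MIP over the inspiratory duration. A minimal numerical sketch, assuming a pressure trace sampled at fixed intervals (trace values and units are hypothetical):

```python
def smip(pressures_cmh2o, dt_s):
    """Sustained maximal inspiratory pressure: the integral of the
    inspiratory pressure trace over the inspiratory duration,
    approximated with the trapezoidal rule (units: cmH2O*s)."""
    total = 0.0
    for p0, p1 in zip(pressures_cmh2o, pressures_cmh2o[1:]):
        total += 0.5 * (p0 + p1) * dt_s
    return total

# A hypothetical pressure trace sampled every 0.5 s during one maximal inspiration.
trace = [0, 40, 60, 55, 45, 30, 10]
print(smip(trace, 0.5))  # 117.5 cmH2O*s
```

    In the TIRE, MIP is the peak of this trace and ID its duration, which is why SMIP integrates information from both.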

  2. Predicting paddlefish roe yields using an extension of the Beverton–Holt equilibrium yield-per-recruit model

    USGS Publications Warehouse

    Colvin, M.E.; Bettoli, Phillip William; Scholten, G.D.

    2013-01-01

    Equilibrium yield models predict the total biomass removed from an exploited stock; however, traditional yield models must be modified to simulate roe yields because a linear relationship between age (or length) and mature ovary weight does not typically exist. We extended the traditional Beverton-Holt equilibrium yield model to predict roe yields of Paddlefish Polyodon spathula in Kentucky Lake, Tennessee-Kentucky, as a function of varying conditional fishing mortality rates (10-70%), conditional natural mortality rates (cm; 9% and 18%), and four minimum size limits ranging from 864 to 1,016 mm eye-to-fork length. These results were then compared to a biomass-based yield assessment. Analysis of roe yields indicated the potential for growth overfishing at lower exploitation rates and smaller minimum length limits than were suggested by the biomass-based assessment. Patterns of biomass and roe yields in relation to exploitation rates were similar regardless of the simulated value of cm, thus indicating that the results were insensitive to changes in cm. Our results also suggested that higher minimum length limits would increase roe yield and reduce the potential for growth overfishing and recruitment overfishing at the simulated cm values. Biomass-based equilibrium yield assessments are commonly used to assess the effects of harvest on other caviar-based fisheries; however, our analysis demonstrates that such assessments likely underestimate the probability and severity of growth overfishing when roe is targeted. Therefore, equilibrium roe yield-per-recruit models should also be considered to guide the management process for caviar-producing fish species.
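
    A discrete-age Beverton-Holt-style yield-per-recruit calculation of the kind the paper extends can be sketched as follows. All parameter values are hypothetical; the paper's roe-yield extension replaces the weight-at-age schedule with mature ovary weight, which is the nonlinear relationship the abstract highlights:

```python
from math import exp

def yield_per_recruit(f, m, weights_at_age, age_at_entry):
    """Discrete-age yield-per-recruit sketch (assumes m > 0).

    f, m: instantaneous fishing / natural mortality rates.
    weights_at_age: mean weight at each age (index 0 = recruitment age).
    age_at_entry: first age vulnerable to the fishery (size-limit proxy).
    """
    n = 1.0    # cohort abundance per recruit
    ypr = 0.0
    for age, w in enumerate(weights_at_age):
        fa = f if age >= age_at_entry else 0.0
        z = fa + m
        # Baranov catch equation for this age class
        ypr += (fa / z) * (1.0 - exp(-z)) * n * w
        n *= exp(-z)  # survivors entering the next age class
    return ypr

weights = [0.5, 1.2, 2.0, 2.7, 3.2, 3.5]   # kg, hypothetical growth schedule
print(round(yield_per_recruit(f=0.3, m=0.1, weights_at_age=weights, age_at_entry=2), 3))
```

    Raising `age_at_entry` mimics a higher minimum length limit, which is how the simulations above explore growth overfishing.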

  3. On-line prediction of yield grade, longissimus muscle area, preliminary yield grade, adjusted preliminary yield grade, and marbling score using the MARC beef carcass image analysis system.

    PubMed

    Shackelford, S D; Wheeler, T L; Koohmaraie, M

    2003-01-01

    The present experiment was conducted to evaluate the ability of the U.S. Meat Animal Research Center's beef carcass image analysis system to predict calculated yield grade, longissimus muscle area, preliminary yield grade, adjusted preliminary yield grade, and marbling score under commercial beef processing conditions. In two commercial beef-processing facilities, image analysis was conducted on 800 carcasses on the beef-grading chain immediately after the conventional USDA beef quality and yield grades were applied. Carcasses were blocked by plant and observed calculated yield grade. The carcasses were then separated, with 400 carcasses assigned to a calibration data set that was used to develop regression equations, and the remaining 400 carcasses assigned to a prediction data set used to validate the regression equations. Prediction equations, which included image analysis variables and hot carcass weight, accounted for 90, 88, 90, 88, and 76% of the variation in calculated yield grade, longissimus muscle area, preliminary yield grade, adjusted preliminary yield grade, and marbling score, respectively, in the prediction data set. In comparison, the official USDA yield grade as applied by online graders accounted for 73% of the variation in calculated yield grade. The technology described herein could be used by the beef industry to more accurately determine beef yield grades; however, this system does not provide an accurate enough prediction of marbling score to be used without USDA grader interaction for USDA quality grading.

  4. How do cognitively impaired elderly patients define "testament": reliability and validity of the testament definition scale.

    PubMed

    Heinik, J; Werner, P; Lin, R

    1999-01-01

    The testament definition scale (TDS) is a specifically designed six-item scale aimed at measuring the respondent's capacity to define "testament." We assessed the reliability and validity of this new short scale in 31 community-dwelling cognitively impaired elderly patients. Interrater reliability for the six items ranged from .87 to .97. The interrater reliability for the total score was .77. Significant correlations were found between the TDS score and the Mini-Mental State Examination (MMSE) and the Cambridge Cognitive Examination scores (r = .71 and .72 respectively, p = .001). Criterion validity yielded significantly different means for subjects with MMSE scores of 24-30 and 0-23: mean 3.9 and 1.6 respectively (t(20) = 4.7, p = .001). Using a cutoff point of 0-2 vs. 3+, 79% of the subjects were correctly classified as severely cognitively impaired, with only 8.3% false positives, and a positive predictive value of 94%. Thus, TDS was found both reliable and valid. This scale, however, is not synonymous with testamentary capacity. The discussion deals with the methodological limitations of this study, and highlights the practical as well as the theoretical relevance of TDS. Future studies are warranted to elucidate the relationships between TDS and existing legal requirements of testamentary capacity.

  5. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensure reliable performance during the entire mission. The design of fasteners involves variables related to fabrication, manufacturing, the material behavior of the fasteners and joined parts, the structural geometry of the joined components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify reliability, and provides results of the analysis in terms of quantified reliability and the sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joined components. Based on the results, the paper also describes guidelines to improve reliability and verification testing.

  6. Reliability and validity of a nutrition and physical activity environmental self-assessment for child care

    PubMed Central

    Benjamin, Sara E; Neelon, Brian; Ball, Sarah C; Bangdiwala, Shrikant I; Ammerman, Alice S; Ward, Dianne S

    2007-01-01

    Background Few assessment instruments have examined the nutrition and physical activity environments in child care, and none are self-administered. Given the emerging focus on child care settings as a target for intervention, a valid and reliable measure of the nutrition and physical activity environment is needed. Methods To measure inter-rater reliability, 59 child care center directors and 109 staff completed the self-assessment concurrently, but independently. Three weeks later, a repeat self-assessment was completed by a sub-sample of 38 directors to assess test-retest reliability. To assess criterion validity, a researcher-administered environmental assessment was conducted at 69 centers and was compared to a self-assessment completed by the director. A weighted kappa test statistic and percent agreement were calculated to assess agreement for each question on the self-assessment. Results For inter-rater reliability, kappa statistics ranged from 0.20 to 1.00 across all questions. Test-retest reliability of the self-assessment yielded kappa statistics that ranged from 0.07 to 1.00. The inter-quartile kappa statistic ranges for inter-rater and test-retest reliability were 0.45 to 0.63 and 0.27 to 0.45, respectively. When percent agreement was calculated, questions ranged from 52.6% to 100% for inter-rater reliability and 34.3% to 100% for test-retest reliability. Kappa statistics for validity ranged from -0.01 to 0.79, with an inter-quartile range of 0.08 to 0.34. Percent agreement for validity ranged from 12.9% to 93.7%. Conclusion This study provides estimates of criterion validity, inter-rater reliability and test-retest reliability for an environmental nutrition and physical activity self-assessment instrument for child care. Results indicate that the self-assessment is a stable and reasonably accurate instrument for use with child care interventions. We therefore recommend the Nutrition and Physical Activity Self-Assessment for Child Care (NAP SACC

  7. Monitoring interannual variation in global crop yield using long-term AVHRR and MODIS observations

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoyang; Zhang, Qingyuan

    2016-04-01

    Advanced Very High Resolution Radiometer (AVHRR) and Moderate Resolution Imaging Spectroradiometer (MODIS) data have been extensively applied for crop yield prediction because of their daily temporal resolution and global coverage. This study investigated global crop yield using the daily two-band Enhanced Vegetation Index (EVI2) derived from AVHRR (1981-1999) and MODIS (2000-2013) observations at a spatial resolution of 0.05° (∼5 km). Specifically, the EVI2 temporal trajectory of crop growth was simulated using a hybrid piecewise logistic model (HPLM) for individual pixels, which was used to detect crop phenological metrics. The derived crop phenology was then applied to calculate crop greenness, defined as EVI2 amplitude and EVI2 integration during the annual crop growing season, which was further aggregated over the croplands of each country. The interannual variations in EVI2 amplitude and EVI2 integration were jointly correlated with the variation in cereal yield from 1982 to 2012 for individual countries using a stepwise regression model. The results show that the confidence level of the established regression models was higher than 90% (P value < 0.1) in most countries in the northern hemisphere, although it was relatively poor in the southern hemisphere (mainly in Africa). The error in the yield prediction was smaller in the Americas, Europe and East Asia than in Africa. In the 10 countries with the largest cereal production across the world, the prediction error was less than 9% during the past three decades. This suggests that crop phenology-controlled greenness from coarse-resolution satellite data has the capability of predicting national crop yield across the world, which could provide timely and reliable crop information for global agricultural trade and policymakers.
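
    The yield-prediction step pairs EVI2 greenness metrics with reported yields via regression. A single-predictor ordinary-least-squares sketch (the study used stepwise regression over both EVI2 amplitude and integration; all numbers below are hypothetical):

```python
def ols_fit(x, y):
    """Ordinary least squares for one predictor: y = a + b*x.
    A single-metric stand-in for the stepwise regression described
    above, which combined EVI2 amplitude and EVI2 integration."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = my - b * mx
    return a, b

# Hypothetical yearly EVI2 integration vs. national cereal yield (t/ha).
evi2 = [12.1, 13.4, 11.8, 14.0, 12.9]
yld  = [4.1,  4.6,  3.9,  4.9,  4.4]
a, b = ols_fit(evi2, yld)
predicted = a + b * 13.0   # yield estimate for a new season's greenness
```

    Fitting one such model per country and scoring it against the 1982-2012 yield series is what produces the country-level confidence levels and prediction errors reported above.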

  8. Climatic and technological ceilings for Chinese rice stagnation based on yield gaps and yield trend pattern analysis.

    PubMed

    Zhang, Tianyi; Yang, Xiaoguang; Wang, Hesong; Li, Yong; Ye, Qing

    2014-04-01

    Climatic or technological ceilings could cause yield stagnation. Thus, identifying the principal reasons for yield stagnation within the context of the local climate and socio-economic conditions is essential for informing regional agricultural policies. In this study, we identified the climatic and technological ceilings for seven rice-production regions in China based on yield gaps and on a yield trend pattern analysis for the period 1980-2010. The results indicate that 54.9% of the counties sampled have experienced yield stagnation since the 1980s. The potential yield ceilings in northern and eastern China decreased to a greater extent than in other regions due to the accompanying climate effects of increases in temperature and decreases in radiation. This may be associated with the yield stagnation and halts occurring in approximately 49.8-57.0% of the sampled counties in these areas. South-western China exhibited promising scope for yield improvement, showing the greatest yield gap (30.6%), although yields were stagnant in 58.4% of the sampled counties. This finding suggests that efforts to overcome the technological ceiling must be given priority so that the available exploitable yield gap can be realized. North-eastern China, however, represents a noteworthy exception. In the north-central area of this region, climate change has increased the yield potential ceiling, and this increase has been accompanied by the most rapid increase in actual yield: 1.02 ton ha(-1) per decade. Therefore, north-eastern China shows great potential for rice production, favoured by the current climate conditions and available technology level. Additional environmentally friendly economic incentives might be considered in this region. © 2013 John Wiley & Sons Ltd.

  9. Transit Reliability Information Program : Reliability Verification Demonstration Plan for Rapid Rail Vehicles

    DOT National Transportation Integrated Search

    1981-08-01

    The Transit Reliability Information Program (TRIP) is a government-initiated program to assist the transit industry in satisfying its need for transit reliability information. TRIP provides this assistance through the operation of a national Data Ban...

  10. Declining water yield from forested mountain watersheds in response to climate change and forest mesophication.

    PubMed

    Caldwell, Peter V; Miniat, Chelcy F; Elliott, Katherine J; Swank, Wayne T; Brantley, Steven T; Laseter, Stephanie H

    2016-09-01

    Climate change and forest disturbances are threatening the ability of forested mountain watersheds to provide the clean, reliable, and abundant fresh water necessary to support aquatic ecosystems and a growing human population. Here, we used 76 years of water yield, climate, and field plot vegetation measurements in six unmanaged reference watersheds in the southern Appalachian Mountains of North Carolina, USA, to determine whether water yield has changed over time, and to examine and attribute the causal mechanisms of change. We found that annual water yield increased in some watersheds from 1938 to the mid-1970s by as much as 55%, but this was followed by decreases of up to 22% by 2013. Changes in forest evapotranspiration were consistent with, but opposite in direction to, the changes in water yield, with decreases in evapotranspiration of up to 31% by the mid-1970s followed by increases of up to 29% until 2013. Vegetation survey data showed commensurate reductions in forest basal area until the mid-1970s and increases since that time, accompanied by a shift in dominance from xerophytic oak and hickory species to several mesophytic species (i.e., mesophication) that use relatively more water. These changes in forest structure and species composition may have decreased water yield by as much as 18% in a given year since the mid-1970s after accounting for climate. Our results suggest that changes in climate and forest structure and species composition in unmanaged forests brought about by disturbance and natural community dynamics over time can result in large changes in water supply. © 2016 John Wiley & Sons Ltd.

  11. Covariate-free and Covariate-dependent Reliability.

    PubMed

    Bentler, Peter M

    2016-12-01

    Classical test theory reliability coefficients are said to be population specific. Reliability generalization, a meta-analysis method, is the main procedure for evaluating the stability of reliability coefficients across populations. A new approach is developed to evaluate the degree of invariance of reliability coefficients to population characteristics. Factor or common variance of a reliability measure is partitioned into parts that are, and are not, influenced by control variables, resulting in a partition of reliability into a covariate-dependent and a covariate-free part. The approach can be implemented in a single sample and can be applied to a variety of reliability coefficients.

  12. Reliability of surface electromyography in the assessment of paraspinal muscle fatigue: an updated systematic review.

    PubMed

    Mohseni Bandpei, Mohammad A; Rahmani, Nahid; Majdoleslam, Basir; Abdollahi, Iraj; Ali, Shabnam Shah; Ahmad, Ashfaq

    2014-09-01

    The purpose of this study was to review the literature to determine whether surface electromyography (EMG) is a reliable tool to assess paraspinal muscle fatigue in healthy subjects and in patients with low back pain (LBP). A literature search for the period of 2000 to 2012 was performed, using PubMed, ProQuest, Science Direct, EMBASE, OVID, CINAHL, and MEDLINE databases. Electromyography, reliability, median frequency, paraspinal muscle, endurance, low back pain, and muscle fatigue were used as keywords. The literature search yielded 178 studies using the above keywords. Twelve articles were selected according to the inclusion criteria of the study. In 7 of the 12 studies, surface EMG was applied only in healthy subjects, and in 5 studies, the reliability of surface EMG was investigated in patients with LBP or in comparison with a control group. In all of these studies, median frequency was shown to be a reliable EMG parameter to assess paraspinal muscle fatigue. There was wide variation among studies in terms of methodology, surface EMG parameters, electrode location, procedure, and homogeneity of the study population. The results suggest that there is a convincing body of evidence to support the merit of surface EMG in the assessment of paraspinal muscle fatigue in healthy subjects and in patients with LBP. Copyright © 2014 National University of Health Sciences. Published by Elsevier Inc. All rights reserved.

  13. GT0 Explosion Sources for IMS Infrasound Calibration: Charge Design and Yield Estimation from Near-source Observations

    NASA Astrophysics Data System (ADS)

    Gitterman, Y.; Hofstetter, R.

    2014-03-01

    Three large-scale on-surface explosions were conducted by the Geophysical Institute of Israel (GII) at the Sayarim Military Range, Negev desert, Israel: about 82 tons of strong high explosives in August 2009, and two explosions of about 10 and 100 tons of ANFO explosives in January 2011. It was a collaborative effort between Israel, CTBTO, USA and several European countries, with the main goal to provide fully controlled ground truth (GT0) infrasound sources, monitored by extensive observations, for calibration of International Monitoring System (IMS) infrasound stations in Europe, Middle East and Asia. In all shots, the explosives were assembled like a pyramid/hemisphere on dry desert alluvium, with a complicated explosion design, different from the ideal homogenous hemisphere used in similar experiments in the past. Strong boosters and an upward charge detonation scheme were applied to provide more energy radiated to the atmosphere. Under these conditions the evaluation of the actual explosion yield, an important source parameter, is crucial for the GT0 calibration experiment. Audio-visual, air-shock and acoustic records were utilized for interpretation of observed unique blast effects, and for determination of blast wave parameters suited for yield estimation and the associated relationships. High-pressure gauges were deployed at 100-600 m to record air-blast properties, evaluate the efficiency of the charge design and energy generation, and provide a reliable estimation of the charge yield. The yield estimators, based on empirical scaled relations for well-known basic air-blast parameters—the peak pressure, impulse and positive phase duration, as well as on the crater dimensions and seismic magnitudes, were analyzed. A novel empirical scaled relationship for the little-known secondary shock delay was developed, consistent for broad ranges of ANFO charges and distances, which facilitates using this stable and reliable air-blast parameter as a new potential

  14. Multidisciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer, and fluid flow disciplines.

  15. Developing a Measure of General Academic Ability: An Application of Maximal Reliability and Optimal Linear Combination to High School Students' Scores

    ERIC Educational Resources Information Center

    Dimitrov, Dimiter M.; Raykov, Tenko; AL-Qataee, Abdullah Ali

    2015-01-01

    This article is concerned with developing a measure of general academic ability (GAA) for high school graduates who apply to colleges, as well as with the identification of optimal weights of the GAA indicators in a linear combination that yields a composite score with maximal reliability and maximal predictive validity, employing the framework of…

  16. Reliability of Pain Measurements Using Computerized Cuff Algometry: A DoloCuff Reliability and Agreement Study.

    PubMed

    Kvistgaard Olsen, Jack; Fener, Dilay Kesgin; Waehrens, Eva Elisabet; Wulf Christensen, Anton; Jespersen, Anders; Danneskiold-Samsøe, Bente; Bartels, Else Marie

    2017-07-01

    Computerized pneumatic cuff pressure algometry (CPA) using the DoloCuff is a new method for pain assessment. Intra- and inter-rater reliabilities have not yet been established. Our aim was to examine the inter- and intrarater reliabilities of DoloCuff measures in healthy subjects. Twenty healthy subjects (ages 20 to 29 years) were assessed three times at 24-hour intervals by two trained raters. Inter-rater reliability was established based on the first and second assessments, whereas intrarater reliability was based on the second and third assessments. Subjects were randomized 1:1 to first assessment at either rater 1 or rater 2. The variables of interest were pressure pain threshold (PT), pressure pain tolerance (PTol), and temporal summation index (TSI). Reliability was estimated by a two-way mixed intraclass correlation coefficient (ICC) absolute agreement analysis. Reliability was considered excellent if ICC > 0.75, fair to good if 0.4 < ICC < 0.75, and poor if ICC < 0.4. Bias and random errors between raters and assessments were evaluated using 95% confidence interval (CI) and Bland-Altman plots. Inter-rater reliability for PT, PTol, and TSI was 0.88 (95% CI: 0.69 to 0.95), 0.86 (95% CI: 0.65 to 0.95), and 0.81 (95% CI: 0.42 to 0.94), respectively. The intrarater reliability for PT, PTol, and TSI was 0.81 (95% CI: 0.53 to 0.92), 0.89 (95% CI: 0.74 to 0.96), and 0.75 (95% CI: 0.28 to 0.91), respectively. Inter-rater reliability was excellent for PT, PTol, and TSI. Similarly, the intrarater reliability for PT and PTol was excellent, while borderline excellent/good for TSI. Therefore, the DoloCuff can be used to obtain reliable measures of pressure pain parameters in healthy subjects. © 2016 World Institute of Pain.
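    For reference, the two-way absolute-agreement, single-measures ICC reported in this record (ICC(A,1) in McGraw and Wong's notation) can be computed from ANOVA mean squares. A minimal sketch, not the authors' code:

```python
def icc_absolute(data):
    """Two-way absolute-agreement, single-measures ICC (McGraw & Wong's ICC(A,1)).
    data: one row per subject, one column per rater."""
    n, k = len(data), len(data[0])                 # subjects, raters
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ms_r = ss_rows / (n - 1)                       # between-subjects mean square
    ms_c = ss_cols / (k - 1)                       # between-raters mean square
    ms_e = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))  # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# A constant offset between raters lowers absolute agreement even though
# the two sets of ratings are perfectly correlated:
icc = icc_absolute([[1, 2], [2, 3], [3, 4]])  # ~0.67
```

    The absolute-agreement form penalizes systematic rater bias, which is why it is the stricter and more appropriate choice for inter-rater studies like this one.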

  17. Reed canarygrass yield improvement

    USDA-ARS?s Scientific Manuscript database

    Reed canarygrass is well adapted to the northern USA. Eight cultivars and 72 accessions collected in rural landscapes from Iowa to New Hampshire were evaluated for yield. Accessions produced on average 7% higher biomass yield compared to existing cultivars. Naturalized populations of reed canarygras...

  18. Comprehensive proficiency-based inanimate training for robotic surgery: reliability, feasibility, and educational benefit.

    PubMed

    Arain, Nabeel A; Dulan, Genevieve; Hogg, Deborah C; Rege, Robert V; Powers, Cathryn E; Tesfay, Seifu T; Hynan, Linda S; Scott, Daniel J

    2012-10-01

    We previously developed a comprehensive proficiency-based robotic training curriculum demonstrating construct, content, and face validity. This study aimed to assess reliability, feasibility, and educational benefit associated with curricular implementation. Over an 11-month period, 55 residents, fellows, and faculty (robotic novices) from general surgery, urology, and gynecology were enrolled in a 2-month curriculum: online didactics, half-day hands-on tutorial, and self-practice using nine inanimate exercises. Each trainee completed a questionnaire and performed a single proctored repetition of each task before (pretest) and after (post-test) training. Tasks were scored for time and errors using modified FLS metrics. For inter-rater reliability (IRR), three trainees were scored by two raters and analyzed using intraclass correlation coefficients (ICC). Data from eight experts were analyzed using ICC and Cronbach's α to determine test-retest reliability and internal consistency, respectively. Educational benefit was assessed by comparing baseline (pretest) and final (post-test) trainee performance; comparisons used Wilcoxon signed-rank test. Of the 55 trainees that pretested, 53 (96 %) completed all curricular components in 9-17 h and reached proficiency after completing an average of 72 ± 28 repetitions over 5 ± 1 h. Trainees indicated minimal prior robotic experience and "poor comfort" with robotic skills at baseline (1.8 ± 0.9) compared to final testing (3.1 ± 0.8, p < 0.001). IRR data for the composite score revealed an ICC of 0.96 (p < 0.001). Test-retest reliability was 0.91 (p < 0.001) and internal consistency was 0.81. Performance improved significantly after training for all nine tasks and according to composite scores (548 ± 176 vs. 914 ± 81, p < 0.001), demonstrating educational benefit. This curriculum is associated with high reliability measures, demonstrated feasibility for a large cohort of trainees, and yielded significant educational

  19. Calculating system reliability with SRFYDO

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morzinski, Jerome; Anderson - Cook, Christine M; Klamann, Richard M

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
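    The series-system assumption has a simple consequence worth making concrete: with independent components, system reliability is the product of the component reliabilities. A sketch under an exponential (constant failure rate) aging model, which is an illustrative simplification rather than SRFYDO's Bayesian covariate model:

```python
import math

def component_reliability(age, failure_rate):
    """Exponential survival model: R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate * age)

def series_reliability(age, failure_rates):
    """A series system works only if every component works, so with
    independent components the component reliabilities multiply."""
    r = 1.0
    for lam in failure_rates:
        r *= component_reliability(age, lam)
    return r

# Two components at age 2 (arbitrary time units):
r_sys = series_reliability(2.0, [0.10, 0.05])  # exp(-0.3) ~ 0.741
```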

  20. Precision and reliability of periodically and quasiperiodically driven integrate-and-fire neurons.

    PubMed

    Tiesinga, P H E

    2002-04-01

    Neurons in the brain communicate via trains of all-or-none electric events known as spikes. How the brain encodes information using spikes-the neural code-remains elusive. Here the robustness against noise of stimulus-induced neural spike trains is studied in terms of attractors and bifurcations. The dynamics of model neurons converges after a transient onto an attractor yielding a reproducible sequence of spike times. At a bifurcation point the spike times on the attractor change discontinuously when a parameter is varied. Reliability, the stability of the attractor against noise, is reduced when the neuron operates close to a bifurcation point. We determined using analytical spike-time maps the attractor and bifurcation structure of an integrate-and-fire model neuron driven by a periodic or a quasiperiodic piecewise constant current and investigated the stability of attractors against noise. The integrate-and-fire model neuron became mode locked to the periodic current with a rational winding number p/q and produced p spikes per q cycles. There were q attractors. p:q mode-locking regions formed Arnold tongues. In the model, reliability was the highest during 1:1 mode locking when there was only one attractor, as was also observed in recent experiments. The quasiperiodically driven neuron mode locked to either one of the two drive periods, or to a linear combination of both of them. Mode-locking regions were organized in Arnold tongues and reliability was again highest when there was only one attractor. These results show that neuronal reliability in response to the rhythmic drive generated by synchronized networks of neurons is profoundly influenced by the location of the Arnold tongues in parameter space.
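    The mode-locking behaviour described here is straightforward to reproduce numerically. A minimal leaky integrate-and-fire sketch with a periodic piecewise-constant drive; parameters are illustrative, not those of the paper:

```python
def lif_spike_times(i_high, i_low, period, t_end, tau=1.0, v_th=1.0, dt=1e-4):
    """Euler-integrate a leaky integrate-and-fire neuron,
    dv/dt = (-v + I(t)) / tau, where I(t) = i_high during the first half
    of each drive cycle and i_low during the second.
    The voltage resets to 0 when it crosses threshold v_th."""
    v, t, spikes = 0.0, 0.0, []
    while t < t_end:
        i_t = i_high if (t % period) < period / 2 else i_low
        v += dt * (-v + i_t) / tau
        if v >= v_th:
            spikes.append(t)
            v = 0.0
        t += dt
    return spikes

# Counting spikes per drive cycle over a long run reveals p:q mode locking;
# sweeping (i_high, i_low) traces out the Arnold tongues.
spikes = lif_spike_times(i_high=2.0, i_low=0.5, period=1.0, t_end=20.0)
```

    With a constant suprathreshold current (i_high = i_low = I > v_th), the interspike interval has the closed form tau * ln(I / (I - v_th)), which is a useful check on the integrator.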

  1. Limits on reliable information flows through stochastic populations.

    PubMed

    Boczkowski, Lucas; Natale, Emanuele; Feinerman, Ofer; Korman, Amos

    2018-06-06

    Biological systems can share and collectively process information to yield emergent effects, despite inherent noise in communication. While man-made systems often employ intricate structural solutions to overcome noise, the structure of many biological systems is more amorphous. It is not well understood how communication noise may affect the computational repertoire of such groups. To approach this question we consider the basic collective task of rumor spreading, in which information from few knowledgeable sources must reliably flow into the rest of the population. We study the effect of communication noise on the ability of groups that lack stable structures to efficiently solve this task. We present an impossibility result which strongly restricts reliable rumor spreading in such groups. Namely, we prove that, in the presence of even moderate levels of noise that affect all facets of the communication, no scheme can significantly outperform the trivial one in which agents have to wait until directly interacting with the sources-a process which requires linear time in the population size. Our results imply that in order to achieve efficient rumor spread a system must exhibit either some degree of structural stability or, alternatively, some facet of the communication which is immune to noise. We then corroborate this claim by providing new analyses of experimental data regarding recruitment in Cataglyphis niger desert ants. Finally, in light of our theoretical results, we discuss strategies to overcome noise in other biological systems.

  2. Reliability of Measurement of Glenohumeral Internal Rotation, External Rotation, and Total Arc of Motion in 3 Test Positions

    PubMed Central

    Kevern, Mark A.; Beecher, Michael; Rao, Smita

    2014-01-01

    Context: Athletes who participate in throwing and racket sports consistently demonstrate adaptive changes in glenohumeral-joint internal and external rotation in the dominant arm. Measurements of these motions have demonstrated excellent intrarater and poor interrater reliability. Objective: To determine intrarater reliability, interrater reliability, and standard error of measurement for shoulder internal rotation, external rotation, and total arc of motion using an inclinometer in 3 testing procedures in National Collegiate Athletic Association Division I baseball and softball athletes. Design: Cross-sectional study. Setting: Athletic department. Patients or Other Participants: Thirty-eight players participated in the study. Shoulder internal rotation, external rotation, and total arc of motion were measured by 2 investigators in 3 test positions. The standard supine position was compared with a side-lying test position, as well as a supine test position without examiner overpressure. Results: Excellent intrarater reliability was noted for all 3 test positions and ranges of motion, with intraclass correlation coefficient values ranging from 0.93 to 0.99. Results for interrater reliability were less favorable. Reliability for internal rotation was highest in the side-lying position (0.68) and reliability for external rotation and total arc was highest in the supine-without-overpressure position (0.774 and 0.713, respectively). The supine-with-overpressure position yielded the lowest interrater reliability results in all positions. The side-lying position had the most consistent results, with very little variation among intraclass correlation coefficient values for the various test positions. Conclusions: The results of our study clearly indicate that the side-lying test procedure is of equal or greater value than the traditional supine-with-overpressure method. PMID:25188316

  3. [Reliability and validity studies of Turkish translation of Eysenck Personality Questionnaire Revised-Abbreviated].

    PubMed

    Karanci, A Nuray; Dirik, Gülay; Yorulmaz, Orçun

    2007-01-01

    The aim of the present study was to examine the reliability and the validity of the Turkish translation of the Eysenck Personality Questionnaire Revised-Abbreviated Form (EPQR-A) (Francis et al., 1992), which consists of 24 items that assess neuroticism, extraversion, psychoticism, and lying. The questionnaire was first translated into Turkish and then back translated. Subsequently, it was administered to 756 students from 4 different universities. The Fear Survey Inventory-III (FSI-III), Rosenberg Self-Esteem Scale (RSES), and Egna Minnen Betraffande Uppfostran (EMBU-C) were also administered in order to assess the questionnaire's validity. The internal consistency, test-retest reliability, and validity were subsequently evaluated. Factor analysis, similar to the original scale, yielded 4 factors: the neuroticism, extraversion, psychoticism, and lie scales. Kuder-Richardson alpha coefficients for the extraversion, neuroticism, psychoticism, and lie scales were 0.78, 0.65, 0.42, and 0.64, respectively, and the test-retest reliability of the scales was 0.84, 0.82, 0.69, and 0.69, respectively. The relationships between the EPQR-A, FSI-III, EMBU-C, and RSES were examined in order to evaluate the construct validity of the scale. Our findings support the construct validity of the questionnaire. To investigate gender differences in scores on the subscales, MANOVA was conducted. The results indicated that there was a gender difference only in the lie scale scores. Our findings largely supported the reliability and validity of the questionnaire in a Turkish student sample. The psychometric characteristics of the Turkish version of the EPQR-A were discussed in light of the relevant literature.
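    The Kuder-Richardson alpha reported above is KR-20, the form of coefficient alpha for dichotomous (e.g., yes/no) items. A minimal sketch of the formula; the toy data set is invented for illustration:

```python
def kr20(responses):
    """Kuder-Richardson formula 20 for dichotomous (0/1) items.
    responses: one row per respondent, one 0/1 entry per item."""
    n_items = len(responses[0])
    n_resp = len(responses)
    # Per-item proportion of 1-responses, and per-respondent total scores.
    p = [sum(row[j] for row in responses) / n_resp for j in range(n_items)]
    totals = [sum(row) for row in responses]
    mean_t = sum(totals) / n_resp
    var_t = sum((t - mean_t) ** 2 for t in totals) / n_resp  # population variance
    return (n_items / (n_items - 1)) * (1.0 - sum(q * (1 - q) for q in p) / var_t)

# A small Guttman-ordered toy data set (4 respondents, 3 items):
alpha = kr20([[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]])  # 0.75
```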

  4. High-Yield Synthesis of Stoichiometric Boron Nitride Nanostructures

    DOE PAGES

    Nocua, José E.; Piazza, Fabrice; Weiner, Brad R.; ...

    2009-01-01

    Boron nitride (BN) nanostructures are structural analogues of carbon nanostructures but have completely different bonding character and structural defects. They are chemically inert, electrically insulating, and potentially important in mechanical applications that include the strengthening of light structural materials. These applications require the reliable production of bulk amounts of pure BN nanostructures in order to be able to reinforce large quantities of structural materials, hence the need for the development of high-yield synthesis methods of pure BN nanostructures. Using borazine (B3N3H6) as the chemical precursor and the hot-filament chemical vapor deposition (HFCVD) technique, pure BN nanostructures with cross-sectional sizes ranging between 20 and 50 nm were obtained, including nanoparticles and nanofibers. Their crystalline structure was characterized by X-ray diffraction (XRD), their morphology and nanostructure were examined by scanning and transmission electron microscopy (SEM, TEM), and their chemical composition was studied by energy-dispersive X-ray spectroscopy (EDS), Fourier-transform infrared spectroscopy (FTIR), electron energy-loss spectroscopy (EELS), and X-ray photoelectron spectroscopy (XPS). Taken altogether, the results indicate that all the material obtained is stoichiometric nanostructured BN with hexagonal and rhombohedral crystalline structure.

  5. Proposed reliability cost model

    NASA Technical Reports Server (NTRS)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between technical disciplines on one hand, and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach depends upon the use of a series of subsystem-oriented CERs and, where possible, CTRs, in devising a suitable cost-effective policy.

  6. Chapter 15: Reliability of Wind Turbines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheng, Shuangwen; O'Connor, Ryan

    The global wind industry has witnessed exciting developments in recent years. The future will be even brighter with further reductions in capital and operation and maintenance costs, which can be accomplished with improved turbine reliability, especially when turbines are installed offshore. One opportunity for the industry to improve wind turbine reliability is through the exploration of reliability engineering life data analysis based on readily available data or maintenance records collected at typical wind plants. If adopted and conducted appropriately, these analyses can quickly save operation and maintenance costs in a potentially impactful manner. This chapter discusses wind turbine reliability by highlighting the methodology of reliability engineering life data analysis. It first briefly discusses fundamentals for wind turbine reliability and the current industry status. Then, the reliability engineering method for life analysis, including data collection, model development, and forecasting, is presented in detail and illustrated through two case studies. The chapter concludes with some remarks on potential opportunities to improve wind turbine reliability. An owner and operator's perspective is taken, and mechanical components are used to exemplify the potential benefits of reliability engineering analysis to improve wind turbine reliability and availability.
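    Reliability engineering life data analysis of the kind described usually begins with a Weibull fit to component failure times. A minimal median-rank-regression sketch for complete (uncensored) data, using Benard's approximation; this is an illustrative method, not the chapter's own tooling:

```python
import math

def weibull_fit_mrr(failure_times):
    """Fit a 2-parameter Weibull to complete failure data by median-rank
    regression: regress ln(-ln(1 - F_i)) on ln(t_i), where F_i is
    Benard's median-rank estimate (i - 0.3) / (n + 0.4) for rank i.
    Returns (shape beta, scale eta)."""
    t = sorted(failure_times)
    n = len(t)
    xs = [math.log(ti) for ti in t]
    ys = [math.log(-math.log(1.0 - (i + 1 - 0.3) / (n + 0.4))) for i in range(n)]
    mx, my = sum(xs) / n, sum(ys) / n
    beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    eta = math.exp(mx - my / beta)   # intercept gives the scale parameter
    return beta, eta

# Check: failure times placed at the median-rank quantiles of a Weibull with
# beta = 2 (wear-out) and eta = 3000 h are recovered exactly.
mr = [(i + 1 - 0.3) / (10 + 0.4) for i in range(10)]
times = [3000.0 * (-math.log(1.0 - m)) ** 0.5 for m in mr]
beta, eta = weibull_fit_mrr(times)
```

    A shape parameter above 1 indicates wear-out, below 1 infant mortality; real maintenance-record data would also require handling suspensions (censoring), which this sketch omits.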

  7. High yield neutron generators using the DD reaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vainionpaa, J. H.; Harris, J. L.; Piestrup, M. A.

    2013-04-19

    A product line of high yield neutron generators has been developed at Adelphi Technology Inc. The generators use the D-D fusion reaction and are driven by an ion beam supplied by a microwave ion source. Yields of up to 5 × 10^9 n/s have been achieved, which are comparable to those obtained using the more efficient D-T reaction. The microwave-driven plasma uses the electron cyclotron resonance (ECR) to produce a high plasma density for high current and high atomic ion species. These generators have an actively pumped vacuum system that allows operation at reduced pressure in the target chamber, increasing the overall system reliability. Since no radioactive tritium is used, the generators can be easily serviced, and components can be easily replaced, providing an essentially unlimited lifetime. The fast neutron source size can be adjusted by selecting the aperture and target geometries according to customer specifications. Pulsed and continuous operation has been demonstrated. Minimum pulse lengths of 50 μs have been achieved. Since the generators are easily serviceable, they offer a long-lifetime neutron generator for laboratories and commercial systems requiring continuous operation. Several of the generators have been enclosed in radiation shielding/moderator structures designed to customer specifications. These generators have proven to be useful for prompt gamma neutron activation analysis (PGNAA), neutron activation analysis (NAA), and fast neutron radiography. Thus these generators make excellent fast, epithermal, and thermal neutron sources for laboratories and industrial applications that require neutrons with safe operation, small footprint, low cost, and small regulatory burden.

  8. Software reliability models for critical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, H.; Pham, M.

    This report presents the results of the first phase of the ongoing EG&G Idaho, Inc. Software Reliability Research Program. The program is studying the existing software reliability models and proposes a state-of-the-art software reliability model that is relevant to the nuclear reactor control environment. This report consists of three parts: (1) summaries of the literature review of existing software reliability and fault-tolerant software reliability models and their related issues, (2) a proposed technique for software reliability enhancement, and (3) general discussion and future research. The development of this proposed state-of-the-art software reliability model will be performed in the second phase. 407 refs., 4 figs., 2 tabs.

  10. Software reliability perspectives

    NASA Technical Reports Server (NTRS)

    Wilson, Larry; Shen, Wenhui

    1987-01-01

    Software which is used in life-critical functions must be known to be highly reliable before installation. This requires a strong testing program to estimate the reliability, since neither formal methods, software engineering, nor fault-tolerant methods can guarantee perfection. Prior to final testing, software goes through a debugging period, and many models have been developed to try to estimate reliability from the debugging data. However, the existing models are poorly validated and often give poor performance. This paper emphasizes the fact that part of their failures can be attributed to the random nature of the debugging data given to these models as input, and it poses the problem of correcting this defect as an area of future research.

  11. Developing Reliable Life Support for Mars

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2017-01-01

    A human mission to Mars will require highly reliable life support systems. Mars life support systems may recycle water and oxygen using systems similar to those on the International Space Station (ISS). However, achieving sufficient reliability is less difficult for ISS than it will be for Mars. If an ISS system has a serious failure, it is possible to provide spare parts, or directly supply water or oxygen, or if necessary bring the crew back to Earth. Life support for Mars must be designed, tested, and improved as needed to achieve high demonstrated reliability. A quantitative reliability goal should be established and used to guide development. The designers should select reliable components and minimize interface and integration problems. In theory a system can achieve the component-limited reliability, but testing often reveals unexpected failures due to design mistakes or flawed components. Testing should extend long enough to detect any unexpected failure modes and to verify the expected reliability. Iterated redesign and retest may be required to achieve the reliability goal. If the reliability is less than required, it may be improved by providing spare components or redundant systems. The number of spares required to achieve a given reliability goal depends on the component failure rate. If the failure rate is underestimated, the number of spares will be insufficient and the system may fail. If the design is likely to have undiscovered design or component problems, it is advisable to use dissimilar redundancy, even though this multiplies the design and development cost. In the ideal case, a human-tended closed-system operational test should be conducted to gain confidence in operations, maintenance, and repair. The difficulty in achieving high reliability in unproven complex systems may require the use of simpler, more mature, intrinsically higher-reliability systems. The limitations of budget, schedule, and technology may suggest accepting lower and
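    The sparing argument in this abstract can be made concrete. A minimal sketch, not from the report: assuming component failures follow a Poisson process (an illustrative assumption, as is the function name), the smallest spare count whose cumulative Poisson probability meets the reliability goal can be found directly.

    ```python
    import math

    def spares_needed(expected_failures: float, reliability_goal: float) -> int:
        """Smallest spare count k such that P(failures <= k) >= goal,
        assuming failures follow a Poisson process (illustrative assumption)."""
        k = 0
        term = math.exp(-expected_failures)  # P(X = 0)
        cdf = term
        while cdf < reliability_goal:
            k += 1
            term *= expected_failures / k    # P(X = k) from P(X = k-1)
            cdf += term
        return k

    # One expected failure over the mission, 99% confidence of not running out:
    print(spares_needed(1.0, 0.99))  # -> 4
    ```

    The sketch also illustrates the abstract's warning: if the failure rate fed into `expected_failures` is underestimated, the computed spare count is too low and the system may fail.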

  12. Reliability and validity of the test of incremental respiratory endurance measures of inspiratory muscle performance in COPD

    PubMed Central

    Formiga, Magno F; Roach, Kathryn E; Vital, Isabel; Urdaneta, Gisel; Balestrini, Kira; Calderon-Candelario, Rafael A

    2018-01-01

    Purpose The Test of Incremental Respiratory Endurance (TIRE) provides a comprehensive assessment of inspiratory muscle performance by measuring maximal inspiratory pressure (MIP) over time. The integration of MIP over inspiratory duration (ID) provides the sustained maximal inspiratory pressure (SMIP). Evidence on the reliability and validity of these measurements in COPD is not currently available. Therefore, we assessed the reliability, responsiveness and construct validity of the TIRE measures of inspiratory muscle performance in subjects with COPD. Patients and methods Test–retest reliability, known-groups and convergent validity assessments were implemented simultaneously in 81 male subjects with mild to very severe COPD. TIRE measures were obtained using the portable PrO2 device, following standard guidelines. Results All TIRE measures were found to be highly reliable, with SMIP demonstrating the strongest test–retest reliability with a nearly perfect intraclass correlation coefficient (ICC) of 0.99, while MIP and ID clustered closely together behind SMIP with ICC values of about 0.97. Our findings also demonstrated known-groups validity of all TIRE measures, with SMIP and ID yielding larger effect sizes when compared to MIP in distinguishing between subjects of different COPD status. Finally, our analyses confirmed convergent validity for both SMIP and ID, but not MIP. Conclusion The TIRE measures of MIP, SMIP and ID have excellent test–retest reliability and demonstrated known-groups validity in subjects with COPD. SMIP and ID also demonstrated evidence of moderate convergent validity and appear to be more stable measures in this patient population than the traditional MIP. PMID:29805255
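    The test–retest ICCs reported above are computed from a subjects-by-sessions score table. A hedged sketch of the one-way random-effects ICC(1,1) formula (the TIRE study's exact ICC model is not stated in this abstract, and the data below are invented):

    ```python
    def icc_oneway(data):
        """One-way random-effects ICC(1,1): data is a list of per-subject
        session scores (each inner list = one subject's repeated measures)."""
        n, k = len(data), len(data[0])
        grand = sum(sum(row) for row in data) / (n * k)
        means = [sum(row) / k for row in data]
        # Between-subjects and within-subject mean squares.
        msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
        msw = sum((x - m) ** 2 for row, m in zip(data, means) for x in row) / (n * (k - 1))
        return (msb - msw) / (msb + (k - 1) * msw)

    scores = [[1, 2], [3, 4], [5, 6], [7, 8]]  # hypothetical test/retest scores
    print(round(icc_oneway(scores), 4))  # -> 0.9277
    ```

    High ICC values such as the 0.99 reported for SMIP arise when between-subject variance dominates within-subject (session-to-session) variance, as in this toy table.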

  13. Analysis of the trade-off between high crop yield and low yield instability at the global scale

    NASA Astrophysics Data System (ADS)

    Ben-Ari, Tamara; Makowski, David

    2016-10-01

    Yield dynamics of major crop species vary remarkably among continents. The worldwide distribution of cropland influences both the expected levels and the interannual variability of global yields. An expansion of cultivated land in the most productive areas could theoretically increase global production, but also increase global yield instability if the most productive regions are characterized by high interannual yield variability. In this letter, we use portfolio analysis to quantify the trade-off between the expected value and the interannual variance of global yield. We compute optimal frontiers for four crop species (maize, rice, soybean, and wheat) and show how the distribution of cropland among large world regions can be optimized to either increase expected global crop production or decrease its interannual variability. We also show that a preferential allocation of cropland to the most productive regions can increase global expected yield at the expense of yield stability. Theoretically, optimizing the distribution of a small fraction of total cultivated area can help find a good compromise between low instability and high crop yields at the global scale.
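    The portfolio calculation behind this trade-off is the classic mean–variance one: expected global yield is the allocation-weighted sum of regional mean yields, and instability is the weighted covariance. A minimal sketch with two invented regions (the numbers and function name are illustrative, not from the letter):

    ```python
    def portfolio_yield(weights, means, cov):
        """Expected value and variance of a cropland 'portfolio':
        E = w . mu,  Var = w' Sigma w."""
        expected = sum(w * m for w, m in zip(weights, means))
        variance = sum(
            weights[i] * weights[j] * cov[i][j]
            for i in range(len(weights))
            for j in range(len(weights))
        )
        return expected, variance

    # Two hypothetical regions: high-yield/volatile vs. low-yield/stable (t/ha).
    mu = [6.0, 4.0]
    sigma = [[1.0, 0.2], [0.2, 0.25]]
    print(portfolio_yield([0.5, 0.5], mu, sigma))  # balanced allocation
    print(portfolio_yield([1.0, 0.0], mu, sigma))  # all land in the productive region
    ```

    Concentrating cropland in the productive region raises the expected yield (6.0 vs. 5.0) but more than doubles the variance, which is exactly the stability penalty the letter quantifies with optimal frontiers.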

  14. Combinatorial Reliability and Repair

    DTIC Science & Technology

    1992-07-01

    Press, Oxford, 1987. [2] G. Gordon and L. Traldi, Generalized activities and the Tutte polynomial, Discrete Math. 85 (1990), 167-176. [3] A. B. Huseby, A... Chromatic polynomials and network reliability, Discrete Math. 67 (1987), 57-79. [7] A. Satayanarayana and R. K. Wood, A linear-time algorithm for computing K-terminal reliability in series-parallel networks, SIAM J. Comput. 14 (1985), 818-832. [8] L. Traldi, Generalized activities and K-terminal reliability, Discrete Math. 96 (1991), 131-149.

  15. Test-Retest Reliability of Memory Task fMRI in Alzheimer’s Disease Clinical Trials

    PubMed Central

    Atri, Alireza; O’Brien, Jacqueline L.; Sreenivasan, Aishwarya; Rastegar, Sarah; Salisbury, Sibyl; DeLuca, Amy N.; O’Keefe, Kelly M.; LaViolette, Peter S.; Rentz, Dorene M.; Locascio, Joseph J.; Sperling, Reisa A.

    2012-01-01

    Objective To examine feasibility and test-retest reliability of encoding-task functional MRI (fMRI) in mild Alzheimer’s disease (AD). Design Randomized, double-blind, placebo-controlled (RCT) study. Setting Memory clinical trials unit. Participants Twelve subjects with mild AD (MMSE 24.0±0.7, CDR 1), on >6 months stable donepezil, from the placebo arm of a larger 24-week (n=24, four scans on weeks 0, 6, 12, 24) study. Interventions Placebo and three face-name paired-associate encoding, block-design BOLD-fMRI scans in 12 weeks. Main Outcomes Whole-brain t-maps (p<0.001, 5 contiguous voxels) and hippocampal regions-of-interest (ROI) analyses of extent (EXT, % voxels active) and magnitude (MAG, % signal change) for Novel-greater-than-Repeated (N>R) face-name contrasts. Calculation of intraclass correlations (ICC) and power estimates for hippocampal ROIs. Results Task tolerability and data yield were high (95 of 96 scans yielded good-quality data). Whole-brain maps were stable. Right and left hippocampal ROI ICCs were 0.59–0.87 and 0.67–0.74, respectively. To detect 25–50% changes in 0–12 week hippocampal activity using L/R-EXT or R-MAG with 80% power (2-sided α=0.05) requires 14–51 subjects. Using L-MAG requires >125 subjects due to relatively small signal-to-variance ratios. Conclusions Encoding-task fMRI was successfully implemented in a single-site, 24-week, AD RCT. Week 0–12 whole-brain t-maps were stable, and test-retest reliability of hippocampal fMRI measures ranged from moderate to substantial. Right hippocampal MAG may be the most promising of these candidate measures in a leveraged context. These initial estimates of test-retest reliability and power justify evaluation of encoding-task fMRI as a potential biomarker for “signal-of-effect” in exploratory and proof-of-concept trials in mild AD. Validation of these results with larger sample sizes and assessment in multi-site studies is warranted. PMID:21555634

  16. MEMS reliability: coming of age

    NASA Astrophysics Data System (ADS)

    Douglass, Michael R.

    2008-02-01

    In today's high-volume semiconductor world, one could easily take reliability for granted. As the MOEMS/MEMS industry continues to establish itself as a viable alternative to conventional manufacturing in the macro world, reliability can be of high concern. Currently, there are several emerging market opportunities in which MOEMS/MEMS is gaining a foothold. Markets such as mobile media, consumer electronics, biomedical devices, and homeland security are all showing great interest in microfabricated products. At the same time, these markets are among the most demanding when it comes to reliability assurance. To be successful, each company developing a MOEMS/MEMS device must consider reliability on an equal footing with cost, performance and manufacturability. What can this maturing industry learn from the successful development of DLP technology, air bag accelerometers and inkjet printheads? This paper discusses some basic reliability principles which any MOEMS/MEMS device development must use. Examples from the commercially successful and highly reliable Digital Micromirror Device complement the discussion.

  17. Reliability of Wireless Sensor Networks

    PubMed Central

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2014-01-01

    Wireless Sensor Networks (WSNs) consist of hundreds or thousands of sensor nodes with limited processing, storage, and battery capabilities. There are several strategies to reduce the power consumption of WSN nodes (thereby increasing the network lifetime) and to increase the reliability of the network (by improving the WSN Quality of Service). However, there is an inherent conflict between power consumption and reliability: an increase in reliability usually leads to an increase in power consumption. For example, routing algorithms can send the same packet through different paths (multipath strategy), which is important for reliability, but this significantly increases the WSN power consumption. In this context, this paper proposes a model for evaluating the reliability of WSNs that considers the battery level as a key factor. Moreover, this model is based on routing algorithms used by WSNs. In order to evaluate the proposed models, three scenarios were considered to show the impact of power consumption on the reliability of WSNs. PMID:25157553
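    The reliability/consumption conflict described here can be illustrated with the standard parallel-paths formula. This is a sketch, not the paper's model: path independence and the linear energy cost are assumptions, and both function names are invented.

    ```python
    def multipath_delivery(p_single: float, n_paths: int) -> float:
        """Probability a packet arrives when the same copy is sent over
        n independent paths, each delivering with probability p_single."""
        return 1.0 - (1.0 - p_single) ** n_paths

    def energy_cost(per_path_cost: float, n_paths: int) -> float:
        """Energy grows linearly with the number of duplicate transmissions."""
        return per_path_cost * n_paths

    # Each extra path buys a smaller reliability gain at the same energy price.
    for n in (1, 2, 3):
        print(n, multipath_delivery(0.9, n), energy_cost(1.0, n))
    ```

    Going from one path to two raises delivery probability from 0.9 to 0.99 while doubling transmission energy; the third path adds far less reliability for the same extra cost, which is the diminishing trade-off the paper's battery-aware model captures.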

  18. Emulation applied to reliability analysis of reconfigurable, highly reliable, fault-tolerant computing systems

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques applied to the analysis of the reliability of highly reliable computer systems for future commercial aircraft are described. The lack of credible precision in reliability estimates obtained by analytical modeling techniques is first established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Next, the technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. Use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques. Finally an illustrative example is presented to demonstrate from actual use the promise of the proposed application of emulation.

  19. 2017 NREL Photovoltaic Reliability Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, Sarah

    NREL's Photovoltaic (PV) Reliability Workshop (PVRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology -- both critical goals for moving PV technologies deeper into the electricity marketplace.

  20. Testing for PV Reliability (Presentation)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, S.; Bansal, S.

    The DOE SUNSHOT workshop is seeking input from the community about PV reliability and how the DOE might address gaps in understanding. This presentation describes the types of testing that are needed for PV reliability and introduces a discussion to identify gaps in our understanding of PV reliability testing.

  1. Laser System Reliability

    DTIC Science & Technology

    1977-03-01

    system acquisition cycle since they provide necessary inputs to comparative analyses, cost/benefit trade-offs, and system simulations. In addition, the... Management Program from above performs the function of analyzing the system trade-offs with respect to reliability to determine a reliability goal... one encounters the problem of comparing present dollars with future dollars. In this analysis, we are trading off costs expended initially (or at

  2. Quantifying potential yield and water-limited yield of summer maize in the North China Plain

    NASA Astrophysics Data System (ADS)

    Jiang, Mingnuo; Liu, Chaoshun; Chen, Maosi

    2017-09-01

    The North China Plain is a major food-producing region in China, and climate change could pose a threat to food production in the region. Based on the China Meteorological Forcing Dataset, we simulated the growth of summer maize in the North China Plain from 1979 to 2015 with a regional implementation of the crop growth model WOFOST. The results showed that, after calibration and validation, the model can reflect the potential yield and water-limited yield of summer maize in the North China Plain. With the regional implementation of the model, combined with the reanalysis data, the model can better reproduce the regional history of summer maize yield in the North China Plain. The yield gap in southeastern Beijing, southern Tianjin, southern Hebei Province, and northwestern Shandong Province is significant, which means that water availability is the main limiting factor for summer maize yield in these regions.

  3. Reliability of EEG Measures of Interaction: A Paradigm Shift Is Needed to Fight the Reproducibility Crisis

    PubMed Central

    Höller, Yvonne; Uhl, Andreas; Bathke, Arne; Thomschewski, Aljoscha; Butz, Kevin; Nardone, Raffaele; Fell, Jürgen; Trinka, Eugen

    2017-01-01

    Measures of interaction (connectivity) of the EEG are at the forefront of current neuroscientific research. Unfortunately, test-retest reliability can be very low, depending on the measure and its estimation, the EEG-frequency of interest, the length of the signal, and the population under investigation. In addition, artifacts can hamper the continuity of the EEG signal, and in some clinical situations it is impractical to exclude artifacts. We aimed to examine factors that moderate test-retest reliability of measures of interaction. The study involved 40 patients with a range of neurological diseases and memory impairments (age median: 60; range 21–76; 40% female; 22 mild cognitive impairment, 5 subjective cognitive complaints, 13 temporal lobe epilepsy), and 20 healthy controls (age median: 61.5; range 23–74; 70% female). We calculated 14 measures of interaction based on the multivariate autoregressive model from two EEG-recordings separated by 2 weeks. We characterized test-retest reliability by correlating the measures between the two EEG-recordings for variations of data length, data discontinuity, artifact exclusion, model order, and frequency over all combinations of channels and all frequencies, individually for each subject, yielding a correlation coefficient for each participant. Excluding artifacts had strong effects on reliability of some measures, such as classical, real valued coherence (~0.1 before, ~0.9 after artifact exclusion). Full frequency directed transfer function was highly reliable and robust against artifacts. Variation of data length decreased reliability in relation to poor adjustment of model order and signal length. Variation of discontinuity had no effect, but reliabilities were different between model orders, frequency ranges, and patient groups depending on the measure. Pathology did not interact with variation of signal length or discontinuity. Our results emphasize the importance of documenting reliability, which may vary

  4. Assessing Variations in Areal Organization for the Intrinsic Brain: From Fingerprints to Reliability

    PubMed Central

    Xu, Ting; Opitz, Alexander; Craddock, R. Cameron; Wright, Margaret J.; Zuo, Xi-Nian; Milham, Michael P.

    2016-01-01

    Resting state fMRI (R-fMRI) is a powerful in-vivo tool for examining the functional architecture of the human brain. Recent studies have demonstrated the ability to characterize transitions between functionally distinct cortical areas through the mapping of gradients in intrinsic functional connectivity (iFC) profiles. To date, this novel approach has primarily been applied to iFC profiles averaged across groups of individuals, or in one case, a single individual scanned multiple times. Here, we used a publicly available R-fMRI dataset, in which 30 healthy participants were scanned 10 times (10 min per session), to investigate differences in full-brain transition profiles (i.e., gradient maps, edge maps) across individuals, and their reliability. 10-min R-fMRI scans were sufficient to achieve high accuracies in efforts to “fingerprint” individuals based upon full-brain transition profiles. Regarding test–retest reliability, the image-wise intraclass correlation coefficient (ICC) was moderate, and vertex-level ICC varied depending on region; larger durations of data yielded higher reliability scores universally. Initial application of gradient-based methodologies to a recently published dataset obtained from twins suggested inter-individual variation in areal profiles might have genetic and familial origins. Overall, these results illustrate the utility of gradient-based iFC approaches for studying inter-individual variation in brain function. PMID:27600846

  5. How reliable are Functional Movement Screening scores? A systematic review of rater reliability.

    PubMed

    Moran, Robert W; Schneiders, Anthony G; Major, Katherine M; Sullivan, S John

    2016-05-01

    Several physical assessment protocols to identify intrinsic risk factors for injury aetiology related to movement quality have been described. The Functional Movement Screen (FMS) is a standardised, field-expedient test battery intended to assess movement quality and has been used clinically in preparticipation screening and in sports injury research. To critically appraise and summarise research investigating the reliability of scores obtained using the FMS battery. Systematic literature review. Systematic search of Google Scholar, Scopus (including ScienceDirect and PubMed), EBSCO (including Academic Search Complete, AMED, CINAHL, Health Source: Nursing/Academic Edition), MEDLINE and SPORTDiscus. Studies meeting eligibility criteria were assessed by 2 reviewers for risk of bias using the Quality Appraisal of Reliability Studies checklist. Overall quality of evidence was determined using van Tulder's levels of evidence approach. 12 studies were appraised. Overall, there was a 'moderate' level of evidence in favour of 'acceptable' (intraclass correlation coefficient ≥0.6) inter-rater and intra-rater reliability for composite scores derived from live scoring. For inter-rater reliability of composite scores derived from video recordings there was 'conflicting' evidence, and 'limited' evidence for intra-rater reliability. For inter-rater reliability based on live scoring of individual subtests there was 'moderate' evidence of 'acceptable' reliability (κ≥0.4) for 4 subtests (Deep Squat, Shoulder Mobility, Active Straight-leg Raise, Trunk Stability Push-up) and 'conflicting' evidence for the remaining 3 (Hurdle Step, In-line Lunge, Rotary Stability). This review found 'moderate' evidence that raters can achieve acceptable levels of inter-rater and intra-rater reliability of composite FMS scores when using live ratings. Overall, there were few high-quality studies, and the quality of several studies was impacted by poor study reporting particularly in relation to

  6. Effectiveness of rabbit manure biofertilizer in barley crop yield.

    PubMed

    Islas-Valdez, Samira; Lucho-Constantino, Carlos A; Beltrán-Hernández, Rosa I; Gómez-Mercado, René; Vázquez-Rodríguez, Gabriela A; Herrera, Juan M; Jiménez-González, Angélica

    2017-11-01

    The quality of biofertilizers is usually assessed only in terms of the amount of nutrients that they supply to the crops and their lack of viable pathogens and phytotoxicity. The goal of this study was to determine the effectiveness of a liquid biofertilizer obtained from rabbit manure in terms of presence of pathogens, phytotoxicity, and its effect on the grain yield and other agronomic traits of barley (Hordeum vulgare L.). Environmental effects of the biofertilizer were also evaluated by following its influence on selected soil parameters. We applied the biofertilizer at five combinations of doses and timings each and in two application modes (foliar or direct soil application) within a randomized complete block design with three replicates and using a chemical fertilizer as control. The agronomic traits evaluated were plant height, root length, dry weight, and number of leaves and stems at three growth stages: tillering, jointing, and flowering. The effectiveness of the biofertilizer was significantly modified by the mode of application, the growth stage of the crop, and the dose of biofertilizer applied. The results showed that the foliar application of the biofertilizer at the tillering stage produced the highest increase in grain yield (59.7 %, p < 0.10). The use of the biofertilizer caused significant changes in soil, particularly concerning pH, EC, Ca, Zn, Mg, and Mn. It is our view that the production and use of biofertilizers are a reliable alternative to deal with a solid waste problem while food security is increased.

  7. 78 FR 21879 - Improving 9-1-1 Reliability; Reliability and Continuity of Communications Networks, Including...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-12

    ... maps? What are the public safety and homeland security implications of public disclosure of key network... 13-33] Improving 9-1-1 Reliability; Reliability and Continuity of Communications Networks, Including... improve the reliability and resiliency of the Nation's 9-1-1 networks. The Notice of Proposed Rulemaking...

  8. Fission yield measurements at IGISOL

    NASA Astrophysics Data System (ADS)

    Lantz, M.; Al-Adili, A.; Gorelov, D.; Jokinen, A.; Kolhinen, V. S.; Mattera, A.; Moore, I.; Penttilä, H.; Pomp, S.; Prokofiev, A. V.; Rakopoulos, V.; Rinta-Antila, S.; Simutkin, V.; Solders, A.

    2016-06-01

    The fission product yields are an important characteristic of the fission process. In fundamental physics, knowledge of the yield distributions is needed to better understand the fission process. For nuclear energy applications, good knowledge of neutron-induced fission-product yields is important for the safe and efficient operation of nuclear power plants. With the Ion Guide Isotope Separator On-Line (IGISOL) technique, products of nuclear reactions are stopped in a buffer gas and then extracted and separated by mass. Thanks to the high resolving power of the JYFLTRAP Penning trap at the University of Jyväskylä, fission products can be isobarically separated, making it possible to measure relative independent fission yields. In some cases it is even possible to resolve isomeric states from the ground state, permitting measurements of isomeric yield ratios. So far the reactions U(p,f) and Th(p,f) have been studied using the IGISOL-JYFLTRAP facility. Recently, a neutron converter target has been developed utilizing the Be(p,xn) reaction. We here present the IGISOL technique for fission yield measurements and some of the results from the measurements on proton-induced fission. We also present the development of the neutron converter target, the characterization of the neutron field, and the first tests with neutron-induced fission.

  9. Reliability and Validity of an Internet-based Questionnaire Measuring Lifetime Physical Activity

    PubMed Central

    De Vera, Mary A.; Ratzlaff, Charles; Doerfling, Paul; Kopec, Jacek

    2010-01-01

    Lifetime exposure to physical activity is an important construct for evaluating associations between physical activity and disease outcomes, given the long induction periods in many chronic diseases. The authors' objective in this study was to evaluate the measurement properties of the Lifetime Physical Activity Questionnaire (L-PAQ), a novel Internet-based, self-administered instrument measuring lifetime physical activity, among Canadian men and women in 2005–2006. Reliability was examined using a test-retest study. Validity was examined in a 2-part study consisting of 1) comparisons with previously validated instruments measuring similar constructs, the Lifetime Total Physical Activity Questionnaire (LT-PAQ) and the Chasan-Taber Physical Activity Questionnaire (CT-PAQ), and 2) a priori hypothesis tests of constructs measured by the L-PAQ. The L-PAQ demonstrated good reliability, with intraclass correlation coefficients ranging from 0.67 (household activity) to 0.89 (sports/recreation). Comparison between the L-PAQ and the LT-PAQ resulted in Spearman correlation coefficients ranging from 0.41 (total activity) to 0.71 (household activity); comparison between the L-PAQ and the CT-PAQ yielded coefficients of 0.58 (sports/recreation), 0.56 (household activity), and 0.50 (total activity). L-PAQ validity was further supported by observed relations between the L-PAQ and sociodemographic variables, consistent with a priori hypotheses. Overall, the L-PAQ is a useful instrument for assessing multiple domains of lifetime physical activity with acceptable reliability and validity. PMID:20876666
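    The Spearman coefficients reported above compare ranked scores from two instruments. A self-contained sketch of that computation (the data are invented; ties share their average rank, a common convention):

    ```python
    def _ranks(xs):
        """Average ranks (1-based), with tied values sharing their mean rank."""
        order = sorted(range(len(xs)), key=lambda i: xs[i])
        ranks = [0.0] * len(xs)
        i = 0
        while i < len(order):
            j = i
            while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # mean of ranks i+1 .. j+1
            for k in range(i, j + 1):
                ranks[order[k]] = avg
            i = j + 1
        return ranks

    def spearman(x, y):
        """Spearman rho = Pearson correlation of the two rank vectors."""
        rx, ry = _ranks(x), _ranks(y)
        n = len(x)
        mx, my = sum(rx) / n, sum(ry) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
        sx = sum((a - mx) ** 2 for a in rx) ** 0.5
        sy = sum((b - my) ** 2 for b in ry) ** 0.5
        return cov / (sx * sy)

    print(round(spearman([1, 2, 3, 4, 5], [2, 1, 4, 3, 5]), 10))  # -> 0.8
    ```

    Because only ranks enter the calculation, a coefficient like the 0.71 reported for household activity reflects monotone agreement between the two questionnaires rather than agreement on absolute activity amounts.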

  10. Reliability and validity of an internet-based questionnaire measuring lifetime physical activity.

    PubMed

    De Vera, Mary A; Ratzlaff, Charles; Doerfling, Paul; Kopec, Jacek

    2010-11-15

    Lifetime exposure to physical activity is an important construct for evaluating associations between physical activity and disease outcomes, given the long induction periods in many chronic diseases. The authors' objective in this study was to evaluate the measurement properties of the Lifetime Physical Activity Questionnaire (L-PAQ), a novel Internet-based, self-administered instrument measuring lifetime physical activity, among Canadian men and women in 2005-2006. Reliability was examined using a test-retest study. Validity was examined in a 2-part study consisting of 1) comparisons with previously validated instruments measuring similar constructs, the Lifetime Total Physical Activity Questionnaire (LT-PAQ) and the Chasan-Taber Physical Activity Questionnaire (CT-PAQ), and 2) a priori hypothesis tests of constructs measured by the L-PAQ. The L-PAQ demonstrated good reliability, with intraclass correlation coefficients ranging from 0.67 (household activity) to 0.89 (sports/recreation). Comparison between the L-PAQ and the LT-PAQ resulted in Spearman correlation coefficients ranging from 0.41 (total activity) to 0.71 (household activity); comparison between the L-PAQ and the CT-PAQ yielded coefficients of 0.58 (sports/recreation), 0.56 (household activity), and 0.50 (total activity). L-PAQ validity was further supported by observed relations between the L-PAQ and sociodemographic variables, consistent with a priori hypotheses. Overall, the L-PAQ is a useful instrument for assessing multiple domains of lifetime physical activity with acceptable reliability and validity.

  11. Yield model development project implementation plan

    NASA Technical Reports Server (NTRS)

    Ambroziak, R. A.

    1982-01-01

    Tasks remaining to be completed are summarized for the following major project elements: (1) evaluation of crop yield models; (2) crop yield model research and development; (3) data acquisition, processing, and storage; (4) related yield research: defining spectral and/or remote sensing data requirements, developing input for driving and testing crop growth/yield models, and real-time testing of wheat plant process models; and (5) project management and support.

  12. Extracting More Information from Passive Optical Tracking Observations for Reliable Orbit Element Generation

    NASA Astrophysics Data System (ADS)

    Bennett, J.; Gehly, S.

    2016-09-01

    This paper presents results from a preliminary method for extracting more orbital information from low-rate passive optical tracking data. An improvement in the accuracy of the observation data yields more accurate and reliable orbital elements. For several objects, the orbit propagated from the orbital element generated using the new data-processing method is compared with the orbit generated from the raw observation data. Optical tracking data collected by EOS Space Systems, located on Mount Stromlo, Australia, are fitted to provide a new orbital element. The element accuracy is determined by comparing the predicted orbit against subsequent tracking data, or against a reference orbit if available. The new method is shown to result in better orbit predictions, which has important implications for conjunction assessments and the Space Environment Research Centre space object catalogue. The focus is on obtaining reliable orbital solutions from sparse data. This work forms part of the collaborative effort of the Space Environment Management Cooperative Research Centre, which is developing new technologies and strategies to preserve the space environment (www.serc.org.au).

  13. A reliable multicast for XTP

    NASA Technical Reports Server (NTRS)

    Dempsey, Bert J.; Weaver, Alfred C.

    1990-01-01

    Multicast services needed for current distributed applications on LAN's fall generally into one of three categories: datagram, semi-reliable, and reliable. Transport layer multicast datagrams represent unreliable service in which the transmitting context 'fires and forgets'. XTP executes these semantics when the MULTI and NOERR mode bits are both set. Distributing sensor data and other applications in which application-level error recovery strategies are appropriate benefit from the efficiency in multidestination delivery offered by datagram service. Semi-reliable service refers to multicasting in which the control algorithms of the transport layer--error, flow, and rate control--are used in transferring the multicast distribution to the set of receiving contexts, the multicast group. The multicast defined in XTP provides semi-reliable service. Since, under a semi-reliable service, joining a multicast group means listening on the group address and entails no coordination with other members, a semi-reliable facility can be used for communication between a client and a server group as well as true peer-to-peer group communication. Resource location in a LAN is an important application domain. The term 'semi-reliable' refers to the fact that group membership changes go undetected. No attempt is made to assess the current membership of the group at any time--before, during, or after--the data transfer.

  14. Effect of the plate surface characteristics and gap height on yield stresses of a magnetorheological fluid

    NASA Astrophysics Data System (ADS)

    Jonkkari, I.; Kostamo, E.; Kostamo, J.; Syrjala, S.; Pietola, M.

    2012-07-01

    Effects of the plate material, surface roughness and measuring gap height on the static and dynamic yield stresses of a magnetorheological (MR) fluid were investigated with a commercial plate-plate magnetorheometer. Magnetic and non-magnetic plates with smooth (Ra ≈ 0.3 μm) and rough (Ra ≈ 10 μm) surface finishes were used. It was shown by Hall probe measurements and finite element simulations that the use of magnetic plates or higher gap heights increases the level of magnetic flux density and changes the shape of the radial flux density profile. The yield stress increase caused by these factors was determined and subtracted from the measured values in order to examine only the effect of the wall characteristics or the gap height. Roughening of the surfaces offered a significant increase in the yield stresses for non-magnetic plates. With magnetic plates the yield stresses were higher to start with, but roughening did not increase them further. A significant part of the difference in measured stresses between rough non-magnetic and magnetic plates was caused by changes in magnetic flux density rather than by better contact of the particles with the plate surfaces. In a similar manner, an increase in gap height from 0.25 to 1.00 mm can lead to an over 20% increase in measured stresses due to changes in the flux density profile. When these changes were compensated for, the dynamic yield stresses generally remained independent of the gap height, even in the cases where it was obvious that wall slip was present. This suggests that with MR fluids wall slip cannot be reliably detected by comparison of flow curves measured at different gap heights.

  15. Flood-tolerant rice reduces yield variability and raises expected yield, differentially benefitting socially disadvantaged groups

    PubMed Central

    Dar, Manzoor H.; de Janvry, Alain; Emerick, Kyle; Raitzer, David; Sadoulet, Elisabeth

    2013-01-01

    Approximately 30% of the cultivated rice area in India is prone to crop damage from prolonged flooding. We use a randomized field experiment in 128 villages of Orissa, India to show that Swarna-Sub1, a recently released submergence-tolerant rice variety, has significant positive impacts on rice yield when fields are submerged for 7 to 14 days, with no yield penalty without flooding. We estimate that Swarna-Sub1 offers an approximate 45% increase in yields over the current popular variety when fields are submerged for 10 days. We show additionally that low-lying areas prone to flooding tend to be more heavily occupied by people belonging to lower-caste social groups. Thus, a policy-relevant implication of our findings is that flood-tolerant rice can deliver both efficiency gains, through reduced yield variability and higher expected yield, and equity gains, in disproportionately benefiting the most marginal group of farmers. PMID:24263095

  16. System and Software Reliability (C103)

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

    Within the last decade, better reliability models (hardware, software, system) than those currently used have been theorized and developed, but not implemented in practice. Previous research on software reliability has shown that while some existing software reliability models are practical, they are not accurate enough. New paradigms of development (e.g., OO) have appeared, and associated reliability models have been proposed but not investigated. Hardware models have been extensively investigated but not integrated into a system framework. System reliability modeling is the weakest of the three. NASA engineers need better methods and tools to demonstrate that the products meet NASA requirements for reliability measurement. For the new models for the software component of the last decade, there is a great need to bring them into a form in which they can be used on software-intensive systems. The Statistical Modeling and Estimation of Reliability Functions for Systems (SMERFS'3) tool is an existing vehicle that may be used to incorporate these new modeling advances. Adapting some existing software reliability modeling changes to accommodate major changes in software development technology may also show substantial improvement in prediction accuracy. With some additional research, the next step is to identify and investigate system reliability. System reliability models could then be incorporated in a tool such as SMERFS'3. This tool with better models would greatly add value in assessing GSFC projects.

  17. Computer-Aided Reliability Estimation

    NASA Technical Reports Server (NTRS)

    Bavuso, S. J.; Stiffler, J. J.; Bryant, L. A.; Petersen, P. L.

    1986-01-01

    CARE III (Computer-Aided Reliability Estimation, Third Generation) helps estimate the reliability of complex, redundant, fault-tolerant systems. The program is specifically designed for the evaluation of fault-tolerant avionics systems; however, CARE III is general enough for use in the evaluation of other systems as well.

  18. Preparation of DNA from cytological material: effects of fixation, staining, and mounting medium on DNA yield and quality.

    PubMed

    Dejmek, Annika; Zendehrokh, Nooreldin; Tomaszewska, Malgorzata; Edsjö, Anders

    2013-07-01

    Personalized oncology requires molecular analysis of tumor cells. Several studies have demonstrated that cytological material is suitable for DNA analysis, but to the authors' knowledge there are no systematic studies comparing how the yield and quality of extracted DNA are affected by the various techniques used for the preparation of cytological material. DNA yield and quality were compared using cultured human lung cancer cells subjected to different preparation techniques used in routine cytology, including fixation, mounting medium, and staining. The results were compared with the outcome of epidermal growth factor receptor (EGFR) genotyping of 66 clinical cytological samples using the same DNA preparation protocol. All tested protocol combinations resulted in fragment lengths of at least 388 base pairs. The mounting agent EcoMount resulted in higher yields than traditional xylene-based medium. Spray and ethanol fixation resulted in both a higher yield and better DNA quality than air drying. In liquid-based cytology (LBC) methods, CytoLyt solution resulted in a 5-fold higher yield than CytoRich Red. Papanicolaou staining provided twice the yield of hematoxylin and eosin staining in both liquid-based preparations. Genotyping outcome and quality control values from the clinical EGFR genotyping demonstrated a sufficient amount and amplifiability of DNA in both spray-fixed and air-dried cytological samples. Reliable clinical genotyping can be performed using all tested methods. However, in the cell line experiments, spray- or ethanol-fixed, Papanicolaou-stained slides provided the best results in terms of yield and fragment length. In LBC, the DNA recovery efficiency of the preserving medium may differ considerably, which should be taken into consideration when introducing LBC. Cancer (Cancer Cytopathol) 2013;121:344-353. © 2013 American Cancer Society.

  19. Impact of Rating Scale Categories on Reliability and Fit Statistics of the Malay Spiritual Well-Being Scale using Rasch Analysis.

    PubMed

    Daher, Aqil Mohammad; Ahmad, Syed Hassan; Winn, Than; Selamat, Mohd Ikhsan

    2015-01-01

    Few studies have employed item response theory in examining reliability. We conducted this study to examine the effect of Rating Scale Categories (RSCs) on the reliability and fit statistics of the Malay Spiritual Well-Being Scale, employing the Rasch model. The Malay Spiritual Well-Being Scale (SWBS), with the original six and the newly structured three and four RSCs, was distributed randomly among three different samples of 50 participants each. The mean age of respondents in the three samples ranged between 36 and 39 years old. The majority was female in all samples, and Islam was the most prevalent religion among the respondents. The predominant race was Malay, followed by Chinese and Indian. The original six RSCs indicated better targeting of 0.99 and the smallest model error of 0.24. The infit mean square (Mnsq) and Z standard (Zstd) of the six RSCs were 1.1 and -0.1, respectively. The six RSCs achieved the highest person and item reliabilities of 0.86 and 0.85, respectively. These reliabilities yielded the highest person (2.46) and item (2.38) separation indices compared with the other RSCs. The person and item reliability and, to a lesser extent, the fit statistics were better with the six RSCs than with the four and three RSCs.
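
    The reported separation indices follow from the reliability coefficients via the standard Rasch relation G = sqrt(R / (1 - R)). A minimal check (a sketch; the small difference from the paper's 2.46 reflects rounding of the reported reliabilities):

```python
import math

def separation_index(reliability: float) -> float:
    """Rasch separation index G from a reliability coefficient R:
    G = sqrt(R / (1 - R))."""
    return math.sqrt(reliability / (1.0 - reliability))

# Person reliability 0.86 and item reliability 0.85 imply separation
# indices close to the reported 2.46 and 2.38.
print(round(separation_index(0.86), 2))  # 2.48 (paper reports 2.46)
print(round(separation_index(0.85), 2))  # 2.38
```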

  20. Validity and reliability of the Persian version of mobile phone addiction scale

    PubMed Central

    Mazaheri, Maryam Amidi; Karbasi, Mojtaba

    2014-01-01

    Background: With regard to the large number of mobile phone users, especially among college students in Iran, addiction to the mobile phone is attracting increasing concern. There is an urgent need for a reliable and valid instrument to measure this phenomenon. This study examines the validity and reliability of the Persian version of the mobile phone addiction scale (MPAIS) in college students. Materials and Methods: This methodological study was done at Isfahan University of Medical Sciences. One thousand one hundred and eighty students were selected by convenience sampling. The English version of the MPAI questionnaire was translated into Persian with the approach of Jones et al. (Challenges in language, culture, and modality: Translating English measures into American Sign Language. Nurs Res 2006; 55: 75-81). Its reliability was tested by Cronbach's alpha, and its dimensionality validity was evaluated using Pearson correlation coefficients with other measures of mobile phone use and the IAT. Construct validity was evaluated using exploratory subscale analysis. Results: A Cronbach's alpha of 0.86 was obtained for the total PMPAS; for subscale 1 (eight items) it was 0.84, for subscale 2 (five items) 0.81, and for subscale 3 (two items) 0.77. There were significantly positive correlations between the score of the PMPAS and the IAT (r = 0.453, P < 0.001) and other measures of mobile phone use. Principal component subscale analysis yielded a three-subscale structure, including inability to control craving, feeling anxious and lost, and mood improvement, accounting for 60.57% of total variance. The results of discriminant validity showed that all the items' correlations with the related subscale were greater than 0.5 and correlations with unrelated subscales were less than 0.5. Conclusion: Considering the lack of a valid and reliable questionnaire for measuring addiction to the mobile phone, the PMPAS could be a suitable instrument for measuring mobile phone addiction in future research. PMID:24778668
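
    Cronbach's alpha, the reliability coefficient used in this and several other records here, is computed from the item variances and the variance of the total score. A minimal stdlib sketch with toy data (the scores below are invented for illustration, not the study's data):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for k items scored by n respondents:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).
    `items` is a list of k lists, each holding n scores."""
    k, n = len(items), len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Three 5-point items answered by four respondents (toy data):
items = [[3, 4, 2, 5], [3, 5, 2, 4], [2, 4, 3, 5]]
print(round(cronbach_alpha(items), 2))  # 0.89
```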

  1. Validity and reliability of the Persian version of mobile phone addiction scale.

    PubMed

    Mazaheri, Maryam Amidi; Karbasi, Mojtaba

    2014-02-01

    With regard to the large number of mobile phone users, especially among college students in Iran, addiction to the mobile phone is attracting increasing concern. There is an urgent need for a reliable and valid instrument to measure this phenomenon. This study examines the validity and reliability of the Persian version of the mobile phone addiction scale (MPAIS) in college students. This methodological study was done at Isfahan University of Medical Sciences. One thousand one hundred and eighty students were selected by convenience sampling. The English version of the MPAI questionnaire was translated into Persian with the approach of Jones et al. (Challenges in language, culture, and modality: Translating English measures into American Sign Language. Nurs Res 2006; 55: 75-81). Its reliability was tested by Cronbach's alpha, and its dimensionality validity was evaluated using Pearson correlation coefficients with other measures of mobile phone use and the IAT. Construct validity was evaluated using exploratory subscale analysis. A Cronbach's alpha of 0.86 was obtained for the total PMPAS; for subscale 1 (eight items) it was 0.84, for subscale 2 (five items) 0.81, and for subscale 3 (two items) 0.77. There were significantly positive correlations between the score of the PMPAS and the IAT (r = 0.453, P < 0.001) and other measures of mobile phone use. Principal component subscale analysis yielded a three-subscale structure, including inability to control craving, feeling anxious and lost, and mood improvement, accounting for 60.57% of total variance. The results of discriminant validity showed that all the items' correlations with the related subscale were greater than 0.5 and correlations with unrelated subscales were less than 0.5. Considering the lack of a valid and reliable questionnaire for measuring addiction to the mobile phone, the PMPAS could be a suitable instrument for measuring mobile phone addiction in future research.

  2. Assessment of family functioning in Caucasian and Hispanic Americans: reliability, validity, and factor structure of the Family Assessment Device.

    PubMed

    Aarons, Gregory A; McDonald, Elizabeth J; Connelly, Cynthia D; Newton, Rae R

    2007-12-01

    The purpose of this study was to examine the factor structure, reliability, and validity of the Family Assessment Device (FAD) among a national sample of Caucasian and Hispanic American families receiving public sector mental health services. A confirmatory factor analysis conducted to test model fit yielded equivocal findings. With few exceptions, indices of model fit, reliability, and validity were poorer for Hispanic Americans compared with Caucasian Americans. Contrary to our expectation, an exploratory factor analysis did not result in a better fitting model of family functioning. Without stronger evidence supporting a reformulation of the FAD, we recommend against such a course of action. Findings highlight the need for additional research on the role of culture in measurement of family functioning.

  3. Software reliability report

    NASA Technical Reports Server (NTRS)

    Wilson, Larry

    1991-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Unfortunately, the models appear to be unable to account for the random nature of the data. If the same code is debugged multiple times and one of the models is used to make predictions, intolerable variance is observed in the resulting reliability predictions. It is believed that data replication can remove this variance in lab type situations and that it is less than scientific to talk about validating a software reliability model without considering replication. It is also believed that data replication may prove to be cost effective in the real world, thus the research centered on verification of the need for replication and on methodologies for generating replicated data in a cost effective manner. The context of the debugging graph was pursued by simulation and experimentation. Simulation was done for the Basic model and the Log-Poisson model. Reasonable values of the parameters were assigned and used to generate simulated data which is then processed by the models in order to determine limitations on their accuracy. These experiments exploit the existing software and program specimens which are in AIR-LAB to measure the performance of reliability models.
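
    The Basic model mentioned above is commonly formalized as a nonhomogeneous Poisson process with mean failure function mu(t) = a(1 - exp(-b t)). Replicated failure-time data of the kind the report argues for can be simulated cheaply by mapping unit-rate Poisson arrivals through the inverse of mu; a sketch with illustrative parameter values (a and b are assumptions, not taken from the report):

```python
import math
import random

def simulate_basic_model(a, b, seed=None):
    """One replication of failure times from an NHPP with mean function
    mu(t) = a * (1 - exp(-b * t)).  Unit-rate exponential arrivals s are
    mapped through the inverse, t = -ln(1 - s/a) / b, valid while s < a."""
    rng = random.Random(seed)
    times, s = [], 0.0
    while True:
        s += rng.expovariate(1.0)  # next unit-rate Poisson arrival
        if s >= a:                 # the model's expected total failures is a
            return times
        times.append(-math.log(1.0 - s / a) / b)

# Generate several replications for the same "program" to expose the
# prediction variance that a single debugging sample cannot reveal.
replications = [simulate_basic_model(a=50, b=0.1, seed=k) for k in range(5)]
print([len(r) for r in replications])
```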

  4. Reliability Generalization: Exploring Variation of Reliability Coefficients of MMPI Clinical Scales Scores.

    ERIC Educational Resources Information Center

    Vacha-Haase, Tammi; Kogan, Lori R.; Tani, Crystal R.; Woodall, Renee A.

    2001-01-01

    Used reliability generalization to explore the variance of scores on 10 Minnesota Multiphasic Personality Inventory (MMPI) clinical scales drawing on 1,972 articles in the literature on the MMPI. Results highlight the premise that scores, not tests, are reliable or unreliable, and they show that study characteristics do influence scores on the…

  5. Nut crop yield records show that budbreak-based chilling requirements may not reflect yield decline chill thresholds

    NASA Astrophysics Data System (ADS)

    Pope, Katherine S.; Dose, Volker; Da Silva, David; Brown, Patrick H.; DeJong, Theodore M.

    2015-06-01

    Warming winters due to climate change may critically affect temperate tree species. Insufficiently cold winters are thought to result in fewer viable flower buds and the subsequent development of fewer fruits or nuts, decreasing the yield of an orchard or fecundity of a species. The best existing approximation for a threshold of sufficient cold accumulation, the "chilling requirement" of a species or variety, has been quantified by manipulating or modeling the conditions that result in dormant bud breaking. However, the physiological processes that affect budbreak are not the same as those that determine yield. This study sought to test whether budbreak-based chilling thresholds can reasonably approximate the thresholds that affect yield, particularly regarding the potential impacts of climate change on temperate tree crop yields. County-wide yield records for almond (Prunus dulcis), pistachio (Pistacia vera), and walnut (Juglans regia) in the Central Valley of California were compared with 50 years of weather records. Bayesian nonparametric function estimation was used to model yield potentials at varying amounts of chill accumulation. In almonds, average yields occurred when chill accumulation was close to the budbreak-based chilling requirement. However, in the other two crops, pistachios and walnuts, the best previous estimate of the budbreak-based chilling requirements was 19-32 % higher than the chilling accumulations associated with average or above average yields. This research indicates that physiological processes beyond requirements for budbreak should be considered when estimating chill accumulation thresholds of yield decline and potential impacts of climate change.

  6. Reliability- and performance-based robust design optimization of MEMS structures considering technological uncertainties

    NASA Astrophysics Data System (ADS)

    Martowicz, Adam; Uhl, Tadeusz

    2012-10-01

    The paper discusses the applicability of a reliability- and performance-based multi-criteria robust design optimization technique for micro-electromechanical systems, considering their technological uncertainties. Micro-devices are now widely used, especially in the automotive industry, as they combine the mechanical structure and the electronic control circuit on one board. Their frequent use motivates the elaboration of virtual prototyping tools that can be applied in design optimization with the introduction of technological uncertainties and reliability. The authors present a procedure for the optimization of micro-devices, which is based on the theory of reliability-based robust design optimization. This takes into consideration the performance of a micro-device and its reliability assessed by means of uncertainty analysis. The procedure assumes that, for each checked design configuration, the assessment of uncertainty propagation is performed with the meta-modeling technique. The described procedure is illustrated with an example of the optimization carried out for a finite element model of a micro-mirror. The multi-physics approach allowed the introduction of several physical phenomena to correctly model the electrostatic actuation and the squeezing effect present between electrodes. The optimization was preceded by sensitivity analysis to establish the design and uncertain domains. The genetic algorithms fulfilled the defined optimization task effectively. The best discovered individuals are characterized by a minimized value of the multi-criteria objective function, simultaneously satisfying the constraint on material strength. The restriction of the maximum equivalent stresses was introduced with the conditionally formulated objective function with a penalty component. The results were successfully verified with a global uniform search through the input design domain.

  7. Signal verification can promote reliable signalling.

    PubMed

    Broom, Mark; Ruxton, Graeme D; Schaefer, H Martin

    2013-11-22

    The central question in communication theory is whether communication is reliable, and if so, which mechanisms select for reliability. The primary approach in the past has been to attribute reliability to strategic costs associated with signalling as predicted by the handicap principle. Yet, reliability can arise through other mechanisms, such as signal verification; but the theoretical understanding of such mechanisms has received relatively little attention. Here, we model whether verification can lead to reliability in repeated interactions that typically characterize mutualisms. Specifically, we model whether fruit consumers that discriminate among poor- and good-quality fruits within a population can select for reliable fruit signals. In our model, plants either signal or they do not; costs associated with signalling are fixed and independent of plant quality. We find parameter combinations where discriminating fruit consumers can select for signal reliability by abandoning unprofitable plants more quickly. This self-serving behaviour imposes costs upon plants as a by-product, rendering it unprofitable for unrewarding plants to signal. Thus, strategic costs to signalling are not a prerequisite for reliable communication. We expect verification to more generally explain signal reliability in repeated consumer-resource interactions that typify mutualisms but also in antagonistic interactions such as mimicry and aposematism.

  8. Integrated protocol for reliable and fast quantification and documentation of electrophoresis gels.

    PubMed

    Rehbein, Peter; Schwalbe, Harald

    2015-06-01

    Quantitative analysis of electrophoresis gels is an important part in molecular cloning, as well as in protein expression and purification. Parallel quantifications in yield and purity can be most conveniently obtained from densitometric analysis. This communication reports a comprehensive, reliable and simple protocol for gel quantification and documentation, applicable for single samples and with special features for protein expression screens. As major component of the protocol, the fully annotated code of a proprietary open source computer program for semi-automatic densitometric quantification of digitized electrophoresis gels is disclosed. The program ("GelQuant") is implemented for the C-based macro-language of the widespread integrated development environment of IGOR Pro. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. The reliability of manual reporting of clinical events in an anesthesia information management system (AIMS).

    PubMed

    Simpao, Allan F; Pruitt, Eric Y; Cook-Sather, Scott D; Gurnaney, Harshad G; Rehman, Mohamed A

    2012-12-01

    Manual incident reports significantly under-report adverse clinical events when compared with automated recordings of intraoperative data. Our goal was to determine the reliability of AIMS and CQI reports of adverse clinical events that had been witnessed and recorded by research assistants. The AIMS and CQI records of 995 patients aged 2-12 years were analyzed to determine if anesthesia providers had properly documented the emesis events that were observed and recorded by research assistants who were present in the operating room at the time of induction. Research assistants recorded eight cases of emesis during induction that were confirmed with the attending anesthesiologist at the time of induction. AIMS yielded a sensitivity of 38 % (95 % confidence interval [CI] 8.5-75.5 %), while the sensitivity of CQI reporting was 13 % (95 % CI 0.3-52.7 %). The low sensitivities of the AIMS and CQI reports suggest that user-reported AIMS and CQI data do not reliably include significant clinical events.
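
    The reported intervals are consistent with exact (Clopper-Pearson) binomial limits for 3 of 8 (AIMS) and 1 of 8 (CQI) detected events. A stdlib-only sketch that reproduces them by bisecting the binomial CDF (the use of the exact method is inferred from the reported numbers, not stated in the abstract):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def clopper_pearson(x, n, conf=0.95):
    """Point estimate plus exact (Clopper-Pearson) CI for x successes in
    n trials.  Each limit solves a tail-probability equation; since the
    binomial CDF is decreasing in p, bisection finds the root."""
    a = 1.0 - conf

    def bisect(f, target):  # f decreasing in p on (0, 1)
        lo, hi = 0.0, 1.0
        for _ in range(60):
            mid = (lo + hi) / 2
            if f(mid) > target:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    lower = 0.0 if x == 0 else bisect(lambda p: binom_cdf(x - 1, n, p), 1 - a / 2)
    upper = 1.0 if x == n else bisect(lambda p: binom_cdf(x, n, p), a / 2)
    return x / n, lower, upper

# AIMS recorded 3 of the 8 witnessed emesis events.
print([round(v, 3) for v in clopper_pearson(3, 8)])  # [0.375, 0.085, 0.755]
# CQI recorded 1 of 8: sensitivity 12.5% (reported as 13%, CI 0.3-52.7%).
print(clopper_pearson(1, 8))
```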

  10. The reliability of photoneutron cross sections for 90,91,92,94Zr

    NASA Astrophysics Data System (ADS)

    Varlamov, V. V.; Davydov, A. I.; Ishkhanov, B. S.; Orlin, V. N.

    2018-05-01

    Data on partial photoneutron reaction cross sections (γ,1n) and (γ,2n) for 90,91,92,94Zr obtained at Livermore (USA) and for 90Zr obtained at Saclay (France) were analyzed. The experimental data were obtained using quasimonoenergetic photon beams from the annihilation in flight of relativistic positrons. The method of photoneutron multiplicity sorting, based on measuring the neutron energy, was used to separate the partial reactions. The analysis applies objective physical criteria of data reliability. Large systematic uncertainties were found in the partial cross sections, since they do not satisfy those criteria. To obtain reliable cross sections of the partial (γ,1n) and (γ,2n) and total (γ,1n) + (γ,2n) reactions on 90,91,92,94Zr and the (γ,3n) reaction on 94Zr, the experimental-theoretical method was used. It is based on the experimental data for the neutron yield cross section, which is largely independent of the neutron multiplicity, and on theoretical equations of the combined photonucleon reaction model (CPNRM). The newly evaluated data are compared with the experimental ones, and the reasons for the noticeable disagreements between them are discussed.

  11. Inter-rater reliability for movement pattern analysis (MPA): measuring patterning of behaviors versus discrete behavior counts as indicators of decision-making style.

    PubMed

    Connors, Brenda L; Rende, Richard; Colton, Timothy J

    2014-01-01

    The unique yield of collecting observational data on human movement has received increasing attention in a number of domains, including the study of decision-making style. As such, interest has grown in the nuances of core methodological issues, including the best ways of assessing inter-rater reliability. In this paper we focus on one key topic - the distinction between establishing reliability for the patterning of behaviors as opposed to the computation of raw counts - and suggest that reliability for each be compared empirically rather than determined a priori. We illustrate by assessing inter-rater reliability for key outcome measures derived from movement pattern analysis (MPA), an observational methodology that records body movements as indicators of decision-making style with demonstrated predictive validity. While reliability ranged from moderate to good for raw counts of behaviors reflecting each of two Overall Factors generated within MPA (Assertion and Perspective), inter-rater reliability for patterning (proportional indicators of each factor) was significantly higher and excellent (ICC = 0.89). Furthermore, patterning, as compared to raw counts, provided better prediction of observable decision-making process assessed in the laboratory. These analyses support the utility of using an empirical approach to inform the consideration of measuring patterning versus discrete behavioral counts of behaviors when determining inter-rater reliability of observable behavior. They also speak to the substantial reliability that may be achieved via application of theoretically grounded observational systems such as MPA that reveal thinking and action motivations via visible movement patterns.

  12. Photovoltaic Reliability Workshop Lodging Information | Photovoltaic

    Science.gov Websites

    The 2018 Photovoltaic Reliability Workshop (PVRW) will be held Tuesday

  13. Nutrient retention values and cooking yield factors for three South African lamb and mutton cuts.

    PubMed

    van Heerden, Salomina M; Strydom, Phillip E

    2017-11-01

    Nutrient content of raw and cooked foods is important for the formulation of healthy diets. The retention of nutrients during cooking can be influenced by various factors, including animal age, carcass characteristics and cooking method, and these factors are often unique to specific countries. Here the effects of animal age (lamb and mutton) and carcass cut (shoulder, loin and leg), combined with cooking method (moist heat and dry heat), on yield and retention of selected nutrients of South African sheep carcasses were studied. Cooking yields and moisture retention were lower for lamb loin but higher for lamb leg. Energy and fat retention were higher for all cuts of mutton compared with lamb, while higher retention values for cholesterol were recorded for lamb. Mutton retained more iron (P = 0.10) and zinc, and also more vitamins B2, B6 and B12, than lamb. Shoulder cooked by the moist-heat method retained more magnesium, potassium and sodium. Incorporating these retention and yield values into the South African Medical Research Council's Food Composition Tables provides a reliable reference for all concerned with the nutrient content of food. It will also guide practitioners and primary industry to adjust animal production aimed at optimum nutrient content for specific diets. © 2017 Society of Chemical Industry.
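
    The yield and retention factors described here are conventionally defined as cooking yield = cooked weight / raw weight × 100 and true retention TR% = (nutrient per g cooked × cooked weight) / (nutrient per g raw × raw weight) × 100. A sketch with hypothetical numbers (the weights and iron concentrations below are invented for illustration, not the paper's data):

```python
def cooking_yield(raw_g, cooked_g):
    """Cooking yield (%): weight after cooking relative to raw weight."""
    return 100.0 * cooked_g / raw_g

def true_retention(cooked_conc, cooked_g, raw_conc, raw_g):
    """True retention (%) of a nutrient through cooking:
    TR = (Nc * Wc) / (Nr * Wr) * 100, where N is nutrient content per g
    and W is the weight of the cut."""
    return 100.0 * (cooked_conc * cooked_g) / (raw_conc * raw_g)

# Hypothetical: a 1000 g cut cooks down to 700 g; iron concentration
# rises from 0.018 mg/g (raw) to 0.024 mg/g (cooked).
print(cooking_yield(1000, 700))                           # 70.0
print(round(true_retention(0.024, 700, 0.018, 1000), 1))  # 93.3
```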

  14. Impact of distributed power electronics on the lifetime and reliability of PV systems: Impact of distributed power electronics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olalla, Carlos; Maksimovic, Dragan; Deline, Chris

    Here, this paper quantifies the impact of distributed power electronics in photovoltaic (PV) systems in terms of end-of-life energy-capture performance and reliability. The analysis is based on simulations of PV installations over system lifetime at various degradation rates. It is shown how module-level or submodule-level power converters can mitigate variations in cell degradation over time, effectively increasing the system lifespan by 5-10 years compared with the nominal 25-year lifetime. An important aspect typically overlooked when characterizing such improvements is the reliability of distributed power electronics, as power converter failures may not only diminish energy yield improvements but also adversely affect the overall system operation. Failure models are developed, and power electronics reliability is taken into account in this work, in order to provide a more comprehensive view of the opportunities and limitations offered by distributed power electronics in PV systems. Lastly, it is shown how a differential power-processing approach achieves the best mismatch mitigation performance and the least susceptibility to converter faults.
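
    The lifetime effect described above can be illustrated with a toy Monte Carlo: each module draws its own degradation rate, a plain series string is approximated as limited by its worst module, and distributed electronics let every module contribute independently. All parameters and the worst-module approximation are illustrative assumptions, not the paper's failure models:

```python
import random

def years_above_threshold(rates, distributed, threshold=0.8, max_years=60):
    """Years until a string of modules falls below `threshold` of nameplate.
    `rates` holds per-module fractional degradation per year.  Without
    distributed electronics the series string is approximated by its worst
    module; with it, module outputs contribute independently."""
    for year in range(1, max_years + 1):
        outputs = [max(0.0, 1.0 - r * year) for r in rates]
        power = sum(outputs) / len(outputs) if distributed else min(outputs)
        if power < threshold:
            return year
    return max_years

rng = random.Random(0)
# Assumed: ~0.8%/yr mean degradation with module-to-module spread.
trials = [[max(0.001, rng.gauss(0.008, 0.003)) for _ in range(10)]
          for _ in range(500)]
plain = sum(years_above_threshold(r, False) for r in trials) / len(trials)
dpe = sum(years_above_threshold(r, True) for r in trials) / len(trials)
print(plain, dpe)  # distributed electronics yields the longer average life
```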

  15. Impact of distributed power electronics on the lifetime and reliability of PV systems: Impact of distributed power electronics

    DOE PAGES

    Olalla, Carlos; Maksimovic, Dragan; Deline, Chris; ...

    2017-04-26

    Here, this paper quantifies the impact of distributed power electronics in photovoltaic (PV) systems in terms of end-of-life energy-capture performance and reliability. The analysis is based on simulations of PV installations over system lifetime at various degradation rates. It is shown how module-level or submodule-level power converters can mitigate variations in cell degradation over time, effectively increasing the system lifespan by 5-10 years compared with the nominal 25-year lifetime. An important aspect typically overlooked when characterizing such improvements is the reliability of distributed power electronics, as power converter failures may not only diminish energy yield improvements but also adversely affect the overall system operation. Failure models are developed, and power electronics reliability is taken into account in this work, in order to provide a more comprehensive view of the opportunities and limitations offered by distributed power electronics in PV systems. Lastly, it is shown how a differential power-processing approach achieves the best mismatch mitigation performance and the least susceptibility to converter faults.

  16. Modeling of unit operating considerations in generating-capacity reliability evaluation. Volume 1. Mathematical models, computing methods, and results. Final report. [GENESIS, OPCON and OPPLAN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, A.D.; Ayoub, A.K.; Singh, C.

    1982-07-01

    Existing methods for generating-capacity reliability evaluation do not explicitly recognize a number of operating considerations which may have important effects on system reliability performance. Thus, current methods may yield estimates of system reliability which differ appreciably from actual observed reliability. Further, current methods offer no means of accurately studying or evaluating alternatives which may differ in one or more operating considerations. Operating considerations which are considered to be important in generating-capacity reliability evaluation include: unit duty cycles as influenced by load cycle shape, reliability performance of other units, unit commitment policy, and operating reserve policy; unit start-up failures distinct from unit running failures; unit start-up times; and unit outage postponability and the management of postponable outages. A detailed Monte Carlo simulation computer model called GENESIS and two analytical models called OPCON and OPPLAN have been developed which are capable of incorporating the effects of many operating considerations, including those noted above. These computer models have been used to study a variety of actual and synthetic systems and are available from EPRI. The new models are shown to produce system reliability indices which differ appreciably from index values computed using traditional models which do not recognize operating considerations.
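    A minimal sketch of the Monte Carlo approach to generating-capacity reliability (not the GENESIS model itself; the unit fleet, forced outage rates, and load are invented for illustration, and operating considerations such as start-up failures are omitted):

```python
import random

random.seed(1)

# Fleet of units: (capacity in MW, forced outage rate) -- illustrative values.
UNITS = [(200, 0.05)] * 4 + [(100, 0.08)] * 6
PEAK_LOAD = 900.0  # MW, assumed constant for this toy example

def available_capacity():
    """Sample each unit's up/down state and sum the available capacity."""
    return sum(cap for cap, foro in UNITS if random.random() > foro)

# Estimate loss-of-load probability by repeated sampling of unit availability.
TRIALS = 20000
shortfalls = sum(1 for _ in range(TRIALS) if available_capacity() < PEAK_LOAD)
lolp = shortfalls / TRIALS
print(f"estimated LOLP: {lolp:.4f}")
```

    The full models in this report additionally capture duty cycles, commitment policy, and postponable outages, which is precisely what a state-sampling sketch like this one misses.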

  17. Evaluation of Mucociliary Clearance by Three Dimension Micro-CT-SPECT in Guinea Pig: Role of Bitter Taste Agonists.

    PubMed

    Ortiz, Jose Luis; Ortiz, Amparo; Milara, Javier; Armengot, Miguel; Sanz, Celia; Compañ, Desamparados; Morcillo, Esteban; Cortijo, Julio

    2016-01-01

    Different imaging techniques have been used to analyze mucociliary clearance (MCC) in humans, but MCC analysis in small animals using in vivo imaging has not been well defined. Bitter taste receptor (T2R) agonists increase ciliary beat frequency (CBF) and cause bronchodilation, but their effects in vivo are not well understood. This work analyzes in vivo nasal and bronchial MCC in guinea pigs using three-dimensional (3D) micro-CT-SPECT imaging and evaluates the effect of T2R agonists. Intranasal macroaggregates of albumin-technetium-99m (MAA-Tc99m) and lung-nebulized Tc99m albumin nanocolloids were used to analyze the effect of T2R agonists on nasal and bronchial MCC, respectively, using 3D micro-CT-SPECT in the guinea pig. MAA-Tc99m showed a nasal mucociliary transport rate of 0.36 mm/min that increased in the presence of a T2R agonist to 0.66 mm/min. Tc99m albumin nanocolloids were homogeneously distributed in the lung and cleared in a time-dependent manner through the bronchi and trachea. T2R agonists increased bronchial MCC of Tc99m albumin nanocolloids. T2R agonists also increased CBF in human nasal ciliated cells in vitro and induced bronchodilation in human bronchi ex vivo. In summary, T2R agonists increase MCC in vivo as assessed by 3D micro-CT-SPECT analysis.

  18. Process gg → h₀ → γγ in the Lee-Wick standard model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krauss, F.; Underwood, T. E. J.; Zwicky, R.

    2008-01-01

    The process gg → h₀ → γγ is studied in the Lee-Wick extension of the standard model (LWSM) proposed by Grinstein, O'Connell, and Wise. In this model, negative-norm partners for each SM field are introduced with the aim of cancelling quadratic divergences in the Higgs mass. All sectors of the model relevant to gg → h₀ → γγ are diagonalized, and results are commented on from the perspective of both the Lee-Wick and higher-derivative formalisms. Deviations from the SM rate for gg → h₀ are found to be of the order of 15%-5% for Lee-Wick masses in the range 500-1000 GeV. Effects on the rate for h₀ → γγ are smaller, of the order of 5%-1% for Lee-Wick masses in the same range. These comparatively small changes may well provide a means of distinguishing the LWSM from other models such as universal extra dimensions, where same-spin partners to standard model fields also appear. Corrections to determinations of Cabibbo-Kobayashi-Maskawa (CKM) elements |V_t(b,s,d)| are also considered and are shown to be positive, allowing the possibility of measuring a CKM element larger than unity, a characteristic signature of the ghostlike nature of the Lee-Wick fields.

  19. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.
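    As one concrete way of quantifying software reliability of the kind this working paper pursues, failure data can be fitted to a reliability growth model; below is a sketch of the standard Goel-Okumoto NHPP model (a common choice in this literature, not necessarily the model used in the GCS experiment; parameter values are assumed):

```python
import math

def expected_failures(t, a, b):
    """Goel-Okumoto mean value function: expected cumulative failures by time t,
    where a is the total expected fault content and b the detection rate."""
    return a * (1.0 - math.exp(-b * t))

def reliability(x, t, a, b):
    """Probability of no failure in the interval (t, t+x] under the NHPP model."""
    return math.exp(-(expected_failures(t + x, a, b) - expected_failures(t, a, b)))

# Assumed parameters for illustration: 120 expected faults, detection rate 0.05.
a, b = 120.0, 0.05
print(f"expected failures by t=40 : {expected_failures(40, a, b):.1f}")
print(f"R(x=1 | t=40)             : {reliability(1, 40, a, b):.3f}")
```

    In practice a and b would be estimated (e.g., by maximum likelihood) from the simulated failure data the experiment collects, and the fitted curve then supports the inferences about reliability growth under random testing and debugging.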

  20. Growth and yield models for central hardwoods

    Treesearch

    Martin E. Dale; Donald E. Hilt

    1989-01-01

    Over the last 20 years computers have become an efficient tool to estimate growth and yield. Computerized yield estimates vary from simple approximation or interpolation of traditional normal yield tables to highly sophisticated programs that simulate the growth and yield of each individual tree.

  1. Estimating oak growth and yield

    Treesearch

    Martin E. Dale; Donald E. Hilt

    1989-01-01

    Yields from upland oak stands vary widely from stand to stand due to differences in age, site quality, species composition, and stand structure. Cutting history and other past disturbances such as grazing or fire also affect yields.

  2. Fission yield and criticality excursion code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanchard, A.

    2000-06-30

    The ANSI/ANS-8.3 standard allows a maximum yield not exceeding 2 x 10¹⁵ fissions to be used in calculations of where a criticality alarm system is required to be effective. It is common practice to use this allowance or to develop some other yield based on past criticality accident history or excursion experiments. The literature on the subject of yields discusses maximum yields larger and somewhat smaller than the ANS-8.3 permissive value. The ability to model criticality excursions and vary the various parameters to determine a credible maximum yield for operation-specific cases has been available for some time but is not in common use by criticality safety specialists. The topic of yields for various solutions, metals, oxide powders, etc. in various geometries and containers has been published on by laboratory specialists or university staff and students for many decades, but this work has not been available to practitioners. The need for best-estimate calculations of fission yields with a well-validated criticality excursion code has long been recognized, but no coordinated effort has been made so far to develop a generalized and well-validated excursion code for different types of systems. In this paper, the current practices used to estimate fission yields are summarized, along with their shortcomings for 12-Rad zone (at SRS) and Criticality Alarm System (CAS) calculations. Finally, the need for a user-friendly excursion code is reemphasized.

  3. Climate Variability and Sugarcane Yield in Louisiana.

    NASA Astrophysics Data System (ADS)

    Greenland, David

    2005-11-01

    This paper seeks to understand the role that climate variability has on annual yield of sugarcane in Louisiana. Unique features of sugarcane growth in Louisiana and nonclimatic, yield-influencing factors make this goal an interesting and challenging one. Several methods of seeking and establishing the relations between yield and climate variables are employed. First, yield climate relations were investigated at a single research station where crop variety and growing conditions could be held constant and yield relations could be established between a predominant older crop variety and a newer one. Interviews with crop experts and a literature survey were used to identify potential climatic factors that control yield. A statistical analysis was performed using statewide yield data from the American Sugar Cane League from 1963 to 2002 and a climate database. Yield values for later years were adjusted downward to form an adjusted yield dataset. The climate database was principally constructed from daily and monthly values of maximum and minimum temperature and daily and monthly total precipitation for six cooperative weather-reporting stations representative of the area of sugarcane production. The influence of 74 different, though not independent, climate-related variables on sugarcane yield was investigated. The fact that a climate signal exists is demonstrated by comparing mean values of the climate variables corresponding to the upper and lower third of adjusted yield values. Most of these mean-value differences show an intuitively plausible difference between the high- and low-yield years. The difference between means of the climate variables for years corresponding to the upper and lower third of annual yield values for 13 of the variables is statistically significant at or above the 90% level. A correlation matrix was used to identify the variables that had the largest influence on annual yield. Four variables [called here critical climatic variables (CCV

  4. Applicability and Limitations of Reliability Allocation Methods

    NASA Technical Reports Server (NTRS)

    Cruz, Jose A.

    2016-01-01

    The reliability allocation process may be described as the process of assigning reliability requirements to individual components within a system to attain the specified system reliability. For large systems, the allocation process is often performed at different stages of system design, often beginning at the conceptual stage. As the system design develops and more information about components and the operating environment becomes available, different allocation methods can be considered. Reliability allocation methods are usually divided into two categories: weighting factors and optimal reliability allocation. When properly applied, these methods can produce reasonable approximations. Reliability allocation techniques have limitations and implied assumptions that need to be understood by system engineers. Applying reliability allocation techniques without understanding their limitations and assumptions can produce unrealistic results. This report addresses weighting factors and optimal reliability allocation techniques, and identifies the applicability and limitations of each reliability allocation technique.
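    Two of the allocation techniques in the weighting-factor category can be sketched as follows (an equal-apportionment rule and an ARINC-style failure-rate weighting; all numeric targets and predicted rates are illustrative, not from the report):

```python
def equal_apportionment(r_sys, n):
    """Allocate a series-system reliability target equally: each of the n
    components must achieve R_i = R_sys ** (1/n)."""
    return r_sys ** (1.0 / n)

def arinc_allocation(lambda_target, predicted_rates):
    """ARINC-style weighting: apportion the allowed system failure rate in
    proportion to each component's predicted share of the failure rate."""
    total = sum(predicted_rates)
    return [lambda_target * r / total for r in predicted_rates]

# Equal apportionment: 0.99 system target across 4 series components.
r_i = equal_apportionment(0.99, 4)
print(f"per-component target: {r_i:.5f}")  # ~0.99749

# ARINC weighting: tighten a predicted 1e-4 /h system rate to a 8e-5 /h target.
allocs = arinc_allocation(8e-5, [3e-5, 5e-5, 2e-5])
print(["%.1e" % x for x in allocs])
```

    Both rules assume a series structure and independent failures, which is exactly the kind of implied assumption the report cautions system engineers to check before applying an allocation.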

  5. Climate Effects on Corn Yield in Missouri.

    NASA Astrophysics Data System (ADS)

    Hu, Qi; Buyanovsky, Gregory

    2003-11-01

    Understanding climate effects on crop yield has been a continuous endeavor aiming at improving farming technology and management strategy, minimizing negative climate effects, and maximizing positive climate effects on yield. Many studies have examined climate effects on corn yield in different regions of the United States. However, most of those studies used yield and climate records that were shorter than 10 years and were for different years and localities. Although results of those studies showed various influences of climate on corn yield, they could be time specific and have been difficult to use for deriving a comprehensive understanding of climate effects on corn yield. In this study, climate effects on corn yield in central Missouri are examined using unique long-term (1895 1998) datasets of both corn yield and climate. Major results show that the climate effects on corn yield can only be explained by within-season variations in rainfall and temperature and cannot be distinguished by average growing-season conditions. Moreover, the growing-season distributions of rainfall and temperature for high-yield years are characterized by less rainfall and warmer temperature in the planting period, a rapid increase in rainfall, and more rainfall and warmer temperatures during germination and emergence. More rainfall and cooler-than-average temperatures are key features in the anthesis and kernel-filling periods from June through August, followed by less rainfall and warmer temperatures during the September and early October ripening time. Opposite variations in rainfall and temperature in the growing season correspond to low yield. Potential applications of these results in understanding how climate change may affect corn yield in the region also are discussed.

  6. Probability interpretations of intraclass reliabilities.

    PubMed

    Ellis, Jules L

    2013-11-20

    Research where many organizations are rated by different samples of individuals such as clients, patients, or employees frequently uses reliabilities computed from intraclass correlations. Consumers of statistical information, such as patients and policy makers, may not have sufficient background for deciding which levels of reliability are acceptable. It is shown that the reliability is related to various probabilities that may be easier to understand, for example, the proportion of organizations that will be classed significantly above (or below) the mean and the probability that an organization is classed correctly given that it is classed significantly above (or below) the mean. One can view these probabilities as the amount of information of the classification and the correctness of the classification. These probabilities have an inverse relationship: given a reliability, one can 'buy' correctness at the cost of informativeness and conversely. This article discusses how this can be used to make judgments about the required level of reliabilities. Copyright © 2013 John Wiley & Sons, Ltd.
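    The probability interpretations described in this abstract can be reproduced with a small simulation (a sketch under the classical-test-theory assumption that an observed score is the true organization effect plus independent noise, with total variance 1; the reliability value and significance criterion are assumed for illustration):

```python
import math
import random

random.seed(7)

rho = 0.8        # assumed reliability of the organization scores
z_crit = 1.645   # one-sided 5% criterion on the standardized observed score

N = 100_000
informative = correct = 0
for _ in range(N):
    true = random.gauss(0.0, math.sqrt(rho))           # true organization effect
    obs = true + random.gauss(0.0, math.sqrt(1 - rho)) # observed score, var = 1
    if obs > z_crit:                                   # classed significantly above mean
        informative += 1
        if true > 0.0:                                 # classification is correct
            correct += 1

print(f"P(classed significantly above mean)  = {informative / N:.3f}")
print(f"P(truly above mean | classed above)  = {correct / informative:.3f}")
```

    Rerunning with a lower rho shows the trade-off the article describes: correctness can be "bought" at the cost of informativeness by tightening z_crit, and conversely.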

  7. [Predicting the impact of climate change in the next 40 years on the yield of maize in China].

    PubMed

    Ma, Yu-ping; Sun, Lin-li; E, You-hao; Wu, Wei

    2015-01-01

    provinces. Assessments of the future change of maize yield in China based on the different methods were not consistent. Further evaluation needs to consider the change of maize variety and scientific and technological progress, and to enhance the reliability of evaluation models.

  8. Improving the yield from fermentative hydrogen production.

    PubMed

    Kraemer, Jeremy T; Bagley, David M

    2007-05-01

    Efforts to increase H₂ yields from fermentative H₂ production include heat treatment of the inoculum, dissolved gas removal, and varying the organic loading rate. Although heat treatment kills methanogens and selects for spore-forming bacteria, the available evidence indicates that H₂ yields are not maximized compared to bromoethanesulfonate, iodopropane, or perchloric acid pre-treatments, and that spore-forming acetogens are not killed. Operational controls (low pH, short solids retention time) can replace heat treatment. Gas sparging increases H₂ yields compared to un-sparged reactors, but no relationship exists between the sparging rate and the H₂ yield. Lower sparging rates may improve the H₂ yield with less energy input and product dilution. The reasons why sparging improves H₂ yields are unknown, but recent measurements of dissolved H₂ concentrations during sparging suggest that the assumption of decreased inhibition of the H₂-producing enzymes is unlikely to hold. Significant disagreement exists over the effect of organic loading rate (OLR); some studies show that relatively higher OLRs improve the H₂ yield while others show the opposite. Discovering the reasons for higher H₂ yields during dissolved gas removal and changes in OLR will help improve H₂ yields.

  9. Electron-induced electron yields of uncharged insulating materials

    NASA Astrophysics Data System (ADS)

    Hoffmann, Ryan Carl

    Presented here are electron-induced electron yield measurements from high-resistivity, high-yield materials to support a model for the yield of uncharged insulators. These measurements are made using a low-fluence, pulsed electron beam and charge neutralization to minimize charge accumulation. They show charging induced changes in the total yield, as much as 75%, even for incident electron fluences of <3 fC/mm2, when compared to an uncharged yield. The evolution of the yield as charge accumulates in the material is described in terms of electron recapture, based on the extended Chung and Everhart model of the electron emission spectrum and the dual dynamic layer model for internal charge distribution. This model is used to explain charge-induced total yield modification measured in high-yield ceramics, and to provide a method for determining electron yield of uncharged, highly insulating, high-yield materials. A sequence of materials with progressively greater charge susceptibility is presented. This series starts with low-yield Kapton derivative called CP1, then considers a moderate-yield material, Kapton HN, and ends with a high-yield ceramic, polycrystalline aluminum oxide. Applicability of conductivity (both radiation induced conductivity (RIC) and dark current conductivity) to the yield is addressed. Relevance of these results to spacecraft charging is also discussed.

  10. Orbiter Autoland reliability analysis

    NASA Technical Reports Server (NTRS)

    Welch, D. Phillip

    1993-01-01

    The Space Shuttle Orbiter is the only space reentry vehicle in which the crew is seated upright. This position presents some physiological effects requiring countermeasures to prevent a crewmember from becoming incapacitated, and it also introduces a potential need for an automated vehicle landing capability. Autoland is a primary procedure that was identified as a requirement for landing following an extended-duration Orbiter mission. This report documents the results of the reliability analysis performed on the hardware required for an automated landing. A reliability block diagram was used to evaluate system reliability. The analysis considers the manual and automated landing modes currently available on the Orbiter. (Autoland is presently a backup system only.) Results of this study indicate a +/- 36 percent probability of successfully extending a nominal mission to 30 days. Enough variations were evaluated to verify that the reliability could be altered with mission planning and procedures. If the crew is modeled as being fully capable after 30 days, the probability of a successful manual landing is comparable to that of Autoland, because much of the hardware is used for both manual and automated landing modes. The analysis indicates that the reliability for the manual mode is limited by the hardware and depends greatly on crew capability. Crew capability for a successful landing after 30 days has not yet been determined.
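    A reliability block diagram of the kind used in this analysis reduces to products of series and parallel block reliabilities; a minimal sketch (the block structure and reliability values are illustrative, not Orbiter data):

```python
def series(*rs):
    """Series blocks: all must work for the system to work."""
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(*rs):
    """Parallel (redundant) blocks: at least one must work."""
    q = 1.0
    for r in rs:
        q *= (1.0 - r)
    return 1.0 - q

# Toy landing chain: triple-redundant computers in series with a sensor
# block and an actuator block (hypothetical values).
r_system = series(parallel(0.995, 0.995, 0.995), 0.999, 0.998)
print(f"system reliability: {r_system:.6f}")
```

    With redundancy, the computer block contributes almost nothing to the failure probability; the series sensor and actuator blocks dominate, which mirrors the report's finding that the manual mode is limited by the shared hardware.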

  11. Ethiopian Wheat Yield and Yield Gap Estimation: A Spatial Small Area Integrated Data Approach

    NASA Astrophysics Data System (ADS)

    Mann, M.; Warner, J.

    2015-12-01

    Despite the collection of routine annual agricultural surveys and significant advances in GIS and remote sensing products, little econometric research has been undertaken in predicting developing nations' agricultural yields. In this paper, we explore the determinants of wheat output per hectare in Ethiopia during the 2011-2013 Meher crop seasons, aggregated to the woreda administrative area. Using a panel data approach combining national agricultural field surveys with relevant GIS and remote sensing products, the model explains nearly 40% of the total variation in wheat output per hectare across the country. The model also identifies specific contributors to wheat yields that include farm management techniques (e.g., area planted, improved seed, fertilizer, irrigation), weather (e.g., rainfall), water availability (vegetation and moisture deficit indexes) and policy intervention. Our findings suggest that woredas produce between 9.8 and 86.5% of their potential wheat output per hectare given their altitude, weather conditions, terrain, and plant health. At the median, Amhara, Oromiya, SNNP, and Tigray produce 48.6, 51.5, 49.7, and 61.3% of their local attainable yields, respectively. This research has a broad range of applications, especially from a public policy perspective: identifying causes of yield fluctuations, remotely evaluating larger agricultural intervention packages, and analyzing relative yield potential. Overall, the combination of field surveys with spatial data can be used to identify management priorities for improving production at a variety of administrative levels.

  12. Yields of Bacterial Cells from Hydrocarbons

    PubMed Central

    Wodzinski, Richard S.; Johnson, Marvin J.

    1968-01-01

    A strain of Nocardia and one of Pseudomonas, both isolated on pristane (2,6,10,14-tetramethylpentadecane), gave cell yields of approximately 100% on n-octadecane and pristane. Both organisms grew more rapidly on the n-octadecane than on the pristane. A mixed culture, isolated on 3-methylheptane, whose two components were identified as species of Pseudomonas and of Nocardia, gave approximately 100% cell yields and grew with generation times of about 5 hr on n-heptane, n-octane, and 2-methylheptane. The generation time on 3-methylheptane was 8.6 hr and the cell yield was only 79%. A strain of Pseudomonas isolated from naphthalene enrichments and one from phenanthrene enrichments both gave a cell yield of 50% on naphthalene. The phenanthrene isolate gave a cell yield of 40% on phenanthrene. A Nocardia species isolated on benzene gave a 79% cell yield on benzene. The generation times of the bacteria isolated on aromatic hydrocarbons were related to the solubility of the aromatic hydrocarbons on which they were grown; the more insoluble hydrocarbons gave slower growth. PMID:5726161

  13. Reliability, validity and minimal detectable change of the Mini-BESTest in Greek participants with chronic stroke.

    PubMed

    Lampropoulou, Sofia I; Billis, Evdokia; Gedikoglou, Ingrid A; Michailidou, Christina; Nowicky, Alexander V; Skrinou, Dimitra; Michailidi, Fotini; Chandrinou, Danae; Meligkoni, Margarita

    2018-02-23

    This study aimed to investigate the psychometric characteristics of reliability, validity and ability to detect change of a newly developed balance assessment tool, the Mini-BESTest, in Greek patients with stroke. A prospective, observational design study with test-retest measures was conducted. A convenience sample of 21 Greek patients with chronic stroke (14 male, 7 female; age of 63 ± 16 years) was recruited. Two independent examiners administered the scale for the inter-rater reliability, twice within 10 days for the test-retest reliability. Bland-Altman analysis for repeated measures assessed the absolute reliability, and the Standard Error of Measurement (SEM) and the Minimum Detectable Change at the 95% confidence interval (MDC95) were established. The Greek Mini-BESTest (Mini-BESTest-GR) was correlated with the Greek Berg Balance Scale (BBS-GR) for assessing the concurrent validity, and with the Timed Up and Go (TUG), the Functional Reach Test (FRT) and the Greek Falls Efficacy Scale-International (FES-I-GR) for the convergent validity. The Mini-BESTest-GR demonstrated excellent inter-rater reliability (ICC (95% CI) = 0.997 (0.995-0.999), SEM = 0.46), with the scores of the two raters within the limits of agreement (mean dif = -0.143 ± 0.727, p > 0.05), and test-retest reliability (ICC (95% CI) = 0.966 (0.926-0.988), SEM = 1.53). Additionally, the Mini-BESTest-GR yielded very strong to moderate correlations with the BBS-GR (r = 0.924, p < 0.001), TUG (r = -0.823, p < 0.001), FES-I-GR (r = -0.734, p < 0.001) and FRT (r = 0.689, p < 0.001). The MDC95 was 4.25 points. The exceptionally high reliability and the equally good validity of the Mini-BESTest-GR strongly support its utility in Greek people with chronic stroke. Its ability to identify clinically meaningful changes and falls risk needs further investigation.
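    The reported minimal detectable change follows from the standard psychometric formulas SEM = SD x sqrt(1 - ICC) and MDC95 = 1.96 x sqrt(2) x SEM; a quick check against the abstract's own numbers:

```python
import math

def sem(sd, icc):
    """Standard error of measurement from the score SD and reliability (ICC)."""
    return sd * math.sqrt(1.0 - icc)

def mdc95(sem_value):
    """Minimal detectable change at 95% confidence for a test-retest design;
    the sqrt(2) accounts for measurement error in both test and retest."""
    return 1.96 * math.sqrt(2.0) * sem_value

# Using the test-retest SEM reported in the abstract (1.53 points):
print(f"MDC95 = {mdc95(1.53):.2f} points")  # ~4.24, consistent with the reported 4.25
```

    The small discrepancy (4.24 vs. 4.25) is rounding in the reported SEM; the formula itself is the conventional one for MDC at the 95% level.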

  14. Managing Reliability in the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dellin, T.A.

    1998-11-23

    The rapid pace of change at the end of the 20th Century should continue unabated well into the 21st Century. The driver will be the marketplace imperative of "faster, better, cheaper." This imperative has already stimulated a revolution in engineering design and manufacturing. In contrast, to date, reliability engineering has not undergone a similar level of change. It is critical that we implement a corresponding revolution in reliability engineering as we enter the new millennium. If we are still using 20th Century reliability approaches in the 21st Century, then reliability issues will be the limiting factor in faster, better, and cheaper. At the heart of this reliability revolution will be a science-based approach to reliability engineering. Science-based reliability will enable building-in reliability, application-specific products, virtual qualification, and predictive maintenance. The purpose of this paper is to stimulate a dialogue on the future of reliability engineering. We will try to gaze into the crystal ball and predict some key issues that will drive reliability programs in the new millennium. In the 21st Century, we will demand more of our reliability programs. We will need the ability to make accurate reliability predictions that will enable optimizing cost, performance, and time-to-market to meet the needs of every market segment. We will require that all of these new capabilities be in place prior to the start of a product development cycle. The management of reliability programs will be driven by quantifiable metrics of the value added to the organization's business objectives.

  15. Multi-Disciplinary System Reliability Analysis

    NASA Technical Reports Server (NTRS)

    Mahadevan, Sankaran; Han, Song

    1997-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. To achieve this objective, the mechanical equivalence between system behavior models in different disciplines is investigated. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  16. Reliability and Failure in NASA Missions: Blunders, Normal Accidents, High Reliability, Bad Luck

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2015-01-01

    NASA emphasizes crew safety and system reliability but several unfortunate failures have occurred. The Apollo 1 fire was mistakenly unanticipated. After that tragedy, the Apollo program gave much more attention to safety. The Challenger accident revealed that NASA had neglected safety and that management underestimated the high risk of shuttle. Probabilistic Risk Assessment was adopted to provide more accurate failure probabilities for shuttle and other missions. NASA's "faster, better, cheaper" initiative and government procurement reform led to deliberately dismantling traditional reliability engineering. The Columbia tragedy and Mars mission failures followed. Failures can be attributed to blunders, normal accidents, or bad luck. Achieving high reliability is difficult but possible.

  17. Evaluation Applied to Reliability Analysis of Reconfigurable, Highly Reliable, Fault-Tolerant, Computing Systems for Avionics

    NASA Technical Reports Server (NTRS)

    Migneault, G. E.

    1979-01-01

    Emulation techniques are proposed as a solution to a difficulty arising in the analysis of the reliability of highly reliable computer systems for future commercial aircraft. The difficulty, viz., the lack of credible precision in reliability estimates obtained by analytical modeling techniques, is established. The difficulty is shown to be an unavoidable consequence of: (1) a high reliability requirement so demanding as to make system evaluation by use testing infeasible, (2) a complex system design technique, fault tolerance, (3) system reliability dominated by errors due to flaws in the system definition, and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. The technique of emulation is described, indicating how its input is a simple description of the logical structure of a system and its output is the consequent behavior. The use of emulation techniques is discussed for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques.

  18. Score Reliability: A Retrospective Look Back at 12 Years of Reliability Generalization Studies

    ERIC Educational Resources Information Center

    Vacha-Haase, Tammi; Thompson, Bruce

    2011-01-01

    The present study was conducted to characterize (a) the features of the thousands of primary reports synthesized in 47 reliability generalization (RG) measurement meta-analysis studies and (b) typical methodological practice within the RG literature to date. With respect to the treatment of score reliability in the literature, in an astounding…

  19. Crop status evaluations and yield predictions

    NASA Technical Reports Server (NTRS)

    Haun, J. R.

    1976-01-01

    One phase of the large area crop inventory project is presented. Wheat yield models based on the input of environmental variables potentially obtainable through the use of space remote sensing were developed and demonstrated. By the use of a unique method for visually qualifying daily plant development and subsequent multifactor computer analyses, it was possible to develop practical models for predicting crop development and yield. Development of wheat yield prediction models was based on the discovery that morphological changes in plants are detected and quantified on a daily basis, and that this change during a portion of the season was proportional to yield.

  20. Calculation of the total electron excitation cross section in the Born approximation using Slater wave functions for the Li (2s yields 2p), Li (2s yields 3p), Na (3s yields 4p), Mg (3p yields 4s), Ca (4s yields 4p) and K (4s yields 4p) excitations. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Simsic, P. L.

    1974-01-01

    Excitation of neutral atoms by inelastic scattering of incident electrons in gaseous nebulae was investigated using Slater wave functions to describe the initial and final states of the atom. Total cross sections using the Born approximation are calculated for Li(2s yields 2p), Na(3s yields 4p), and K(4s yields 4p). The intensity of emitted radiation from gaseous nebulae is also calculated, and a Maxwellian distribution is employed to average over the electron kinetic energies.

  1. Parametric Mass Reliability Study

    NASA Technical Reports Server (NTRS)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass-such as computer housings, pump casings, and the silicon board of PCBs-typically are the most reliable. Meanwhile components that tend to fail the earliest-such as seals or gaskets-typically have a small mass. To better understand the problem, my project is to create a parametric model that relates both the mass of ORUs to reliability, as well as the mass of ORU subcomponents to reliability.

  2. Spatial sensitivity of grassland yields to weather variations in Austria and its implications for the future☆

    PubMed Central

    Neuwirth, Christian; Hofer, Barbara

    2013-01-01

    Projected climate change may increasingly constitute a risk to yield reliability in the east of the country. That in turn, requires consideration in agricultural development plans and a quantification of these impacts from a social-economic perspective. PMID:25843990

  3. The Malay Version of the Perceived Stress Scale (PSS)-10 is a Reliable and Valid Measure for Stress among Nurses in Malaysia.

    PubMed

    Sandhu, Sukhvinder Singh; Ismail, Noor Hassim; Rampal, Krishna Gopal

    2015-11-01

    The Perceived Stress Scale-10 (PSS-10) is widely used to assess stress perception. The aim of this study was to translate the original PSS-10 into Malay and assess the reliability and validity of the Malay version among nurses. The Malay version of the PSS-10 was distributed among 229 nurses from four government hospitals in Selangor State. Test-retest reliability and concurrent validity were assessed with 25 nurses using the Malay version of the Depression Anxiety Stress Scales (DASS-21). Cronbach's alpha, confirmatory factor analysis (CFA), intraclass correlation coefficient and Pearson's r correlation coefficient were used to determine the psychometric properties of the Malay PSS-10. Exploratory factor analysis yielded two factor components with eigenvalues of 3.37 and 2.10, respectively. Together, the two factors accounted for 54.6% of the variance. CFA yielded a two-factor structure with satisfactory goodness-of-fit indices [χ²/df = 2.43; comparative fit index (CFI) = 0.92, goodness-of-fit index (GFI) = 0.94; standardised root mean square residual (SRMR) = 0.07 and root mean square error of approximation (RMSEA) = 0.08 (90% CI = 0.07-0.09)]. The Cronbach's alpha coefficient for the total items was 0.63 (0.82 for factor 1 and 0.72 for factor 2). The intraclass correlation coefficient (ICC) was 0.81 (95% CI: 0.62-0.91) for test-retest reliability testing after seven days. The total score and the negative component of the PSS-10 correlated significantly with the stress component of the DASS-21: (r = 0.61, P < 0.001) and (r = 0.56, P < 0.004), respectively. The Malay version of the PSS-10 demonstrated a satisfactory level of validity and reliability to assess stress perception. Therefore, this questionnaire is valid in assessing stress perception among nurses in Malaysia.
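    Cronbach's alpha, the internal consistency index reported above, is computed directly from the per-item variances and the variance of the total score. A minimal sketch in Python with NumPy (toy data, not the PSS-10 responses):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Toy data: 5 respondents answering a 3-item scale
scores = np.array([
    [3, 4, 3],
    [2, 2, 3],
    [4, 5, 4],
    [1, 2, 1],
    [5, 4, 5],
])
print(round(cronbach_alpha(scores), 3))
```

    Values near 1 indicate that the items move together; the 0.63 total-scale alpha reported above sits below the conventional 0.7 rule of thumb, which is why the per-factor alphas matter.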

  4. Results of a Demonstration Assessment of Passive System Reliability Utilizing the Reliability Method for Passive Systems (RMPS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia

    2015-04-26

    Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.

  5. Breeding cassava for higher yield

    USDA-ARS?s Scientific Manuscript database

    Cassava is a root crop grown for food and for starch production. Breeding progress is slowed by asexual production and high levels of heterozygosity. Germplasm resources are rich and accessible to breeders through genebanks worldwide. Breeding objectives include high root yield, yield stability, dis...

  6. The Assumption of a Reliable Instrument and Other Pitfalls to Avoid When Considering the Reliability of Data

    PubMed Central

    Nimon, Kim; Zientek, Linda Reichwein; Henson, Robin K.

    2012-01-01

    The purpose of this article is to help researchers avoid common pitfalls associated with reliability including incorrectly assuming that (a) measurement error always attenuates observed score correlations, (b) different sources of measurement error originate from the same source, and (c) reliability is a function of instrumentation. To accomplish our purpose, we first describe what reliability is and why researchers should care about it with focus on its impact on effect sizes. Second, we review how reliability is assessed with comment on the consequences of cumulative measurement error. Third, we consider how researchers can use reliability generalization as a prescriptive method when designing their research studies to form hypotheses about whether or not reliability estimates will be acceptable given their sample and testing conditions. Finally, we discuss options that researchers may consider when faced with analyzing unreliable data. PMID:22518107

  7. Reliability and validity in a nutshell.

    PubMed

    Bannigan, Katrina; Watson, Roger

    2009-12-01

    To explore and explain the different concepts of reliability and validity as they are related to measurement instruments in social science and health care. There are different concepts contained in the terms reliability and validity and these are often explained poorly and there is often confusion between them. To develop some clarity about reliability and validity a conceptual framework was built based on the existing literature. The concepts of reliability, validity and utility are explored and explained. Reliability contains the concepts of internal consistency and stability and equivalence. Validity contains the concepts of content, face, criterion, concurrent, predictive, construct, convergent (and divergent), factorial and discriminant. In addition, for clinical practice and research, it is essential to establish the utility of a measurement instrument. To use measurement instruments appropriately in clinical practice, the extent to which they are reliable, valid and usable must be established.

  8. Electrical service reliability: the customer perspective

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samsa, M.E.; Hub, K.A.; Krohm, G.C.

    1978-09-01

    Electric-utility-system reliability criteria have traditionally been established as a matter of utility policy or through long-term engineering practice, generally with no supportive customer cost/benefit analysis as justification. This report presents results of an initial study of the customer perspective toward electric-utility-system reliability, based on critical review of over 20 previous and ongoing efforts to quantify the customer's value of reliable electric service. A possible structure of customer classifications is suggested as a reasonable level of disaggregation for further investigation of customer value, and these groups are characterized in terms of their electricity use patterns. The values that customers assign to reliability are discussed in terms of internal and external cost components. A list of options for effecting changes in customer service reliability is set forth, and some of the many policy issues that could alter customer-service reliability are identified.

  9. Creating Highly Reliable Accountable Care Organizations.

    PubMed

    Vogus, Timothy J; Singer, Sara J

    2016-12-01

    Accountable Care Organizations' (ACOs) pursuit of the triple aim of higher quality, lower cost, and improved population health has met with mixed results. To improve the design and implementation of ACOs we look to organizations that manage similarly complex, dynamic, and tightly coupled conditions while sustaining exceptional performance known as high-reliability organizations. We describe the key processes through which organizations achieve reliability, the leadership and organizational practices that enable it, and the role that professionals can play when charged with enacting it. Specifically, we present concrete practices and processes from health care organizations pursuing high-reliability and from early ACOs to illustrate how the triple aim may be met by cultivating mindful organizing, practicing reliability-enhancing leadership, and identifying and supporting reliability professionals. We conclude by proposing a set of research questions to advance the study of ACOs and high-reliability research. © The Author(s) 2016.

  10. Impact of Device Scaling on Deep Sub-micron Transistor Reliability: A Study of Reliability Trends using SRAM

    NASA Technical Reports Server (NTRS)

    White, Mark; Huang, Bing; Qin, Jin; Gur, Zvi; Talmor, Michael; Chen, Yuan; Heidecker, Jason; Nguyen, Duc; Bernstein, Joseph

    2005-01-01

    As microelectronics are scaled into the deep sub-micron regime, users of advanced-technology CMOS, particularly in high-reliability applications, should reassess how scaling effects impact long-term reliability. An experiment-based reliability study of industrial-grade SRAMs, consisting of three different technology nodes, is proposed to substantiate current acceleration models for temperature and voltage life-stress relationships. This reliability study utilizes step-stress techniques to evaluate memory technologies (0.25 μm, 0.15 μm, and 0.13 μm) embedded in many of today's high-reliability space/aerospace applications. Two acceleration modeling approaches are presented to relate experimental FIT calculations to manufacturers' qualification data.

  11. Growth and yield of shortleaf pine

    Treesearch

    Paul A. Murphy

    1986-01-01

    A survey of available growth and yield information for shortleaf pine (Pinus echinata Mill.) is given. The kinds of studies and data sources that produce this information are also evaluated, and an example of how a growth and yield model can be used to answer management questions is illustrated. Guidelines are given for using growth and yield models, and needs for...

  12. A generalized approach to wheat yield forecasting using earth observations: Data considerations, application and relevance

    NASA Astrophysics Data System (ADS)

    Becker-Reshef, Inbal

    In recent years there has been a dramatic increase in the demand for timely, comprehensive global agricultural intelligence. The issue of food security has rapidly risen to the top of government agendas around the world as the recent lack of food access led to unprecedented food prices, hunger, poverty, and civil conflict. Timely information on global crop production is indispensable for combating the growing stress on the world's crop production, for stabilizing food prices, developing effective agricultural policies, and for coordinating responses to regional food shortages. Earth Observations (EO) data offer a practical means for generating such information as they provide global, timely, cost-effective, and synoptic information on crop condition and distribution. Their utility for crop production forecasting has long been recognized and demonstrated across a wide range of scales and geographic regions. Nevertheless it is widely acknowledged that EO data could be better utilized within the operational monitoring systems and thus there is a critical need for research focused on developing practical robust methods for agricultural monitoring. Within this context this dissertation focused on advancing EO-based methods for crop yield forecasting and on demonstrating the potential relevance for adopting EO-based crop forecasts for providing timely reliable agricultural intelligence. This thesis made contributions to this field by developing and testing a robust EO-based method for wheat production forecasting at state to national scales using available and easily accessible data. The model was developed in Kansas (KS) using coarse resolution normalized difference vegetation index (NDVI) time series data in conjunction with out-of-season wheat masks and was directly applied in Ukraine to assess its transferability. The model estimated yields within 7% in KS and 10% in Ukraine of final estimates 6 weeks prior to harvest. The relevance of adopting such methods to

  13. Inter-rater reliability for movement pattern analysis (MPA): measuring patterning of behaviors versus discrete behavior counts as indicators of decision-making style

    PubMed Central

    Connors, Brenda L.; Rende, Richard; Colton, Timothy J.

    2014-01-01

    The unique yield of collecting observational data on human movement has received increasing attention in a number of domains, including the study of decision-making style. As such, interest has grown in the nuances of core methodological issues, including the best ways of assessing inter-rater reliability. In this paper we focus on one key topic – the distinction between establishing reliability for the patterning of behaviors as opposed to the computation of raw counts – and suggest that reliability for each be compared empirically rather than determined a priori. We illustrate by assessing inter-rater reliability for key outcome measures derived from movement pattern analysis (MPA), an observational methodology that records body movements as indicators of decision-making style with demonstrated predictive validity. While reliability ranged from moderate to good for raw counts of behaviors reflecting each of two Overall Factors generated within MPA (Assertion and Perspective), inter-rater reliability for patterning (proportional indicators of each factor) was significantly higher and excellent (ICC = 0.89). Furthermore, patterning, as compared to raw counts, provided better prediction of observable decision-making process assessed in the laboratory. These analyses support the utility of using an empirical approach to inform the consideration of measuring patterning versus discrete behavioral counts of behaviors when determining inter-rater reliability of observable behavior. They also speak to the substantial reliability that may be achieved via application of theoretically grounded observational systems such as MPA that reveal thinking and action motivations via visible movement patterns. PMID:24999336

  14. Validity and reliability of Chinese version of Adult Carer Quality of Life questionnaire (AC-QoL) in family caregivers of stroke survivors

    PubMed Central

    Li, Yingshuang; Ding, Chunge

    2017-01-01

    The Adult Carer Quality of Life questionnaire (AC-QoL) is a reliable and valid instrument used to assess the quality of life (QoL) of adult family caregivers. We explored the psychometric properties of a Chinese version of the AC-QoL and tested its reliability and validity in 409 Chinese stroke caregivers. We used item-total correlation and extreme group comparison for item analysis. To evaluate reliability, we used a test-retest approach, the intraclass correlation coefficient (ICC), Cronbach's alpha and a model-based internal consistency index; to evaluate validity, we used scale content validity, confirmatory factor analysis (CFA) and exploratory factor analysis (EFA) via principal component analysis with varimax rotation. The CFA did not confirm the original factor model, and our EFA yielded a 31-item measure with a five-factor model. In conclusion, although some items performed differently between the original English version and our Chinese version, the translated AC-QoL is a reliable and valid tool for assessing the quality of life of stroke caregivers in mainland China. It is a comprehensive measure for understanding caregivers and has the potential to serve as a screening tool for caregiver QoL. PMID:29131845

  15. Argentina soybean yield model

    NASA Technical Reports Server (NTRS)

    Callis, S. L.; Sakamoto, C.

    1984-01-01

    A model based on multiple regression was developed to estimate soybean yields for the country of Argentina. A meteorological data set was obtained for the country by averaging data for stations within the soybean growing area. Predictor variables for the model were derived from monthly total precipitation and monthly average temperature. A trend variable was included for the years 1969 to 1978 since an increasing trend in yields due to technology was observed between these years.
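    A yield model of this form, multiple regression on monthly weather predictors plus a linear technology trend, can be sketched with ordinary least squares. The numbers below are synthetic stand-ins, not the Argentine dataset, and the yields are generated from known coefficients so the fit recovers them exactly:

```python
import numpy as np

# Hypothetical toy data: yearly growing-season precipitation (mm),
# average temperature (°C), and a linear technology-trend index.
years  = np.arange(1969, 1979)
precip = np.array([410., 380., 450., 300., 420., 390., 360., 440., 310., 400.])
temp   = np.array([21.5, 22.0, 21.0, 23.5, 21.8, 22.2, 22.8, 21.2, 23.0, 21.9])
trend  = years - years[0]

# Synthetic yields built from known effects (intercept 5.0, +0.02 per mm,
# -0.30 per °C, +0.15 per year of technology trend)
yields = 5.0 + 0.02 * precip - 0.30 * temp + 0.15 * trend

# Ordinary least squares via the normal equations (numpy lstsq)
X = np.column_stack([np.ones_like(precip), precip, temp, trend])
coef, *_ = np.linalg.lstsq(X, yields, rcond=None)
print(coef)  # recovers the intercept and the three effects
```

    With real data the fit is of course noisy, and the trend term absorbs the technology-driven yield increase the abstract describes for 1969-1978.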

  16. The effect on reliability and sensitivity to level of training of combining analytic and holistic rating scales for assessing communication skills in an internal medicine resident OSCE.

    PubMed

    Daniels, Vijay John; Harley, Dwight

    2017-07-01

    Although previous research has compared checklists to rating scales for assessing communication, the purpose of this study was to compare the effect on reliability and sensitivity to level of training of an analytic, a holistic, and a combined analytic-holistic rating scale in assessing communication skills. The University of Alberta Internal Medicine Residency runs OSCEs for postgraduate year (PGY) 1 and 2 residents and another for PGY-4 residents. Communication stations were scored with an analytic scale (empathy, non-verbal skills, verbal skills, and coherence subscales) and a holistic scale. Authors analyzed reliability of individual and combined scales using generalizability theory and evaluated each scale's sensitivity to level of training. For analytic, holistic, and combined scales, 12, 12, and 11 stations respectively yielded a Phi of 0.8 for the PGY-1,2 cohort, and 16, 16, and 14 stations yielded a Phi of 0.8 for the PGY-4 cohort. PGY-4 residents scored higher on the combined scale, the analytic rating scale, and the non-verbal and coherence subscales. A combined analytic-holistic rating scale increased score reliability and was sensitive to level of training. Given increased validity evidence, OSCE developers should consider combining analytic and holistic scales when assessing communication skills. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Reliability of abstracting performance measures: results of the cardiac rehabilitation referral and reliability (CR3) project.

    PubMed

    Thomas, Randal J; Chiu, Jensen S; Goff, David C; King, Marjorie; Lahr, Brian; Lichtman, Steven W; Lui, Karen; Pack, Quinn R; Shahriary, Melanie

    2014-01-01

    Assessment of the reliability of performance measure (PM) abstraction is an important step in PM validation. Reliability has not been previously assessed for abstracting PMs for the referral of patients to cardiac rehabilitation (CR) and secondary prevention (SP) programs. To help validate these PMs, we carried out a multicenter assessment of their reliability. Hospitals and clinical practices from around the United States were invited to participate in the Cardiac Rehabilitation Referral Reliability (CR3) Project. Twenty-nine hospitals and 23 outpatient centers expressed interest in participating. Seven hospitals and 6 outpatient centers met participation criteria and submitted completed data. Site coordinators identified 35 patients whose charts were reviewed by 2 site abstractors twice, 1 week apart. Percent agreement and the Cohen κ statistic were used to describe intra- and interabstractor reliability for patient eligibility for CR/SP, patient exceptions for CR/SP referral, and documented referral to CR/SP. Results were obtained from within-site data, as well as from pooled data of all inpatient and all outpatient sites. We found that intra-abstractor reliability reflected excellent repeatability (≥ 90% agreement; κ ≥ 0.75) for ratings of CR/SP eligibility, exceptions, and referral, both from pooled and site-specific analyses of inpatient and outpatient data. Similarly, the interabstractor agreement from pooled analysis ranged from good to excellent for the 3 items, although with slightly lower measures of reliability. Abstraction of PMs for CR/SP referral has high reliability, supporting the use of these PMs in quality improvement initiatives aimed at increasing CR/SP delivery to patients with cardiovascular disease.
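    The Cohen κ statistic used in the CR3 analyses corrects raw percent agreement for the agreement expected by chance. A minimal sketch in Python (toy ratings, not the CR3 data):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    n = len(rater_a)
    # Observed agreement: fraction of items rated identically
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement: product of each rater's marginal category rates
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_exp = sum(freq_a[c] / n * freq_b[c] / n for c in freq_a)
    return (p_obs - p_exp) / (1 - p_exp)

# Two abstractors rating 10 charts as CR/SP-eligible (1) or not (0)
a = [1, 1, 1, 1, 0, 0, 0, 0, 1, 0]
b = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1]
print(cohens_kappa(a, b))
```

    The κ ≥ 0.75 threshold the study uses for "excellent" repeatability is a common convention; κ of 0 means agreement no better than chance.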

  18. Dielectrophoresis and its application to biomedical diagnostics platforms

    NASA Astrophysics Data System (ADS)

    Basuray, Sagnik

    Novel pathogen diagnostics and field-deployable devices for detecting pathogen growth have been a focus of current scientific research. Microfluidics and nanofluidics have recently been at the forefront of the development of these devices because of their inherent advantages of large surface-to-volume ratio and small diffusion times. With the advancement of soft lithographic techniques, the devices can be easily adapted for medical systems and bio-diagnostic devices to study mechanistic pathways of bio-molecules and bio-chemical reactions and to serve as delivery modules for drugs. However, the lack of better sensors, other than optics, to detect low bio-particle numbers in real samples has made the instruments bulky, expensive and not suitable for field use. Thus there is an urgent need to develop label-free, portable, inexpensive, rapid diagnostic devices. To achieve a viable device, researchers in these fields have been using dielectrophoresis as the mechanism of choice for a variety of tasks, from particle manipulation, to delivery, to movement of the particles through the fluid. However, the exact physical mechanism is not fully understood, either for dielectrophoresis of colloidal assemblies or for dielectrophoresis of single bio-particles and charged nano-colloids. In this thesis, I present a theory for charged nano-colloid dielectrophoresis that takes into account the surface charge and Debye double-layer effects. The exact mechanism of the origin of the Stern layer, through the surface conductance effect of a nano-colloid, to form a collapsed diffuse layer that renders a nano-colloid conductive at sub-optical frequency has been formulated. This effect is utilized to optimize a nano-colloid assay to detect DNA hybridization. The collapsed diffuse layer kinetics with a thick diffuse layer is solved, using spherical harmonics of the Bessel solution of the Poisson equation, to give a modified Clausius-Mossotti factor that accounts for the size dependent

  19. Developing an automated database for monitoring ultrasound- and computed tomography-guided procedure complications and diagnostic yield.

    PubMed

    Itri, Jason N; Jones, Lisa P; Kim, Woojin; Boonn, William W; Kolansky, Ana S; Hilton, Susan; Zafar, Hanna M

    2014-04-01

    Monitoring complications and diagnostic yield for image-guided procedures is an important component of maintaining high quality patient care promoted by professional societies in radiology and accreditation organizations such as the American College of Radiology (ACR) and Joint Commission. These outcome metrics can be used as part of a comprehensive quality assurance/quality improvement program to reduce variation in clinical practice, provide opportunities to engage in practice quality improvement, and contribute to developing national benchmarks and standards. The purpose of this article is to describe the development and successful implementation of an automated web-based software application to monitor procedural outcomes for US- and CT-guided procedures in an academic radiology department. The open source tools PHP: Hypertext Preprocessor (PHP) and MySQL were used to extract relevant procedural information from the Radiology Information System (RIS), auto-populate the procedure log database, and develop a user interface that generates real-time reports of complication rates and diagnostic yield by site and by operator. Utilizing structured radiology report templates resulted in significantly improved accuracy of information auto-populated from radiology reports, as well as greater compliance with manual data entry. An automated web-based procedure log database is an effective tool to reliably track complication rates and diagnostic yield for US- and CT-guided procedures performed in a radiology department.
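    The two outcome metrics such a database reports, complication rate and diagnostic yield, reduce to simple ratios over the procedure log. A minimal sketch in Python (toy records; the field names are illustrative, not the actual RIS schema):

```python
# Toy procedure-log records; field names are hypothetical.
log = [
    {"modality": "US", "operator": "A", "complication": False, "diagnostic": True},
    {"modality": "US", "operator": "A", "complication": True,  "diagnostic": True},
    {"modality": "CT", "operator": "B", "complication": False, "diagnostic": False},
    {"modality": "CT", "operator": "B", "complication": False, "diagnostic": True},
]

def rate(records, field):
    """Fraction of records where a boolean field is True."""
    return sum(r[field] for r in records) / len(records)

us = [r for r in log if r["modality"] == "US"]
print(rate(us, "complication"))   # complication rate for US-guided procedures
print(rate(log, "diagnostic"))    # overall diagnostic yield
```

    Grouping the same ratios by site or operator gives the real-time reports the article describes; the structured report templates matter because these ratios are only as reliable as the auto-populated fields.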

  20. Best Linear Unbiased Prediction (BLUP) for regional yield trials: a comparison to additive main effects and multiplicative interaction (AMMI) analysis.

    PubMed

    Piepho, H P

    1994-11-01

    Multilocation trials are often used to analyse the adaptability of genotypes in different environments and to find for each environment the genotype that is best adapted; i.e. that is highest yielding in that environment. For this purpose, it is of interest to obtain a reliable estimate of the mean yield of a cultivar in a given environment. This article compares two different statistical estimation procedures for this task: the Additive Main Effects and Multiplicative Interaction (AMMI) analysis and Best Linear Unbiased Prediction (BLUP). A modification of a cross validation procedure commonly used with AMMI is suggested for trials that are laid out as a randomized complete block design. The use of these procedures is exemplified using five faba bean datasets from German registration trials. BLUP was found to outperform AMMI in four of the five faba bean datasets.
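    The intuition for why BLUP can beat a raw cell mean is shrinkage: each genotype mean is pulled toward the grand mean by a factor set by the variance components, trading a little bias for less variance. A minimal numeric sketch (all means and variance components below are hypothetical and assumed known, whereas in practice they are REML-estimated):

```python
import numpy as np

def blup_genotype_means(cell_means, n_reps, var_g, var_e):
    """Shrink raw genotype means toward the grand mean.

    BLUP of a random genotype effect given a cell mean ybar_i is
    mu + k * (ybar_i - mu), with shrinkage k = var_g / (var_g + var_e / n).
    """
    mu = np.mean(cell_means)
    k = var_g / (var_g + var_e / n_reps)
    return mu + k * (cell_means - mu)

# Raw means of 4 hypothetical faba bean lines, 3 replicates each
means = np.array([48.0, 52.0, 55.0, 45.0])
print(blup_genotype_means(means, n_reps=3, var_g=4.0, var_e=12.0))
```

    With a large genetic variance or many replicates k approaches 1 and BLUP approaches the raw mean; with noisy data k shrinks hard toward the grand mean.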

  1. 40 CFR 75.42 - Reliability criteria.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Reliability criteria. 75.42 Section 75...) CONTINUOUS EMISSION MONITORING Alternative Monitoring Systems § 75.42 Reliability criteria. To demonstrate reliability equal to or better than the continuous emission monitoring system, the owner or operator shall...

  2. How Reliable Are Informal Reading Inventories?

    ERIC Educational Resources Information Center

    Spector, Janet E.

    2005-01-01

    Informal Reading Inventories (IRI) are often recommended as instructionally relevant measures of reading. However, they have also been criticized for inattention to technical quality. Examination of reliability evidence in nine recently revised IRIs revealed that fewer than half report reliability. Several appear to have sufficient reliability for…

  3. The Reliability of Psychiatric Diagnosis Revisited

    PubMed Central

    Rankin, Eric; France, Cheryl; El-Missiry, Ahmed; John, Collin

    2006-01-01

    Background: The authors reviewed the topic of reliability of psychiatric diagnosis from the turn of the 20th century to present. The objectives of this paper are to explore the reasons of unreliability of psychiatric diagnosis and propose ways to improve the reliability of psychiatric diagnosis. Method: The authors reviewed the literature on the concept of reliability of psychiatric diagnosis with emphasis on the impact of interviewing skills, use of diagnostic criteria, and structured interviews on the reliability of psychiatric diagnosis. Results: Causes of diagnostic unreliability are attributed to the patient, the clinician and psychiatric nomenclature. The reliability of psychiatric diagnosis can be enhanced by using diagnostic criteria, defining psychiatric symptoms and structuring the interviews. Conclusions: The authors propose the acronym ‘DR.SED,' which stands for diagnostic criteria, reference definitions, structuring the interview, clinical experience, and data. The authors recommend that clinicians use the DR.SED paradigm to improve the reliability of psychiatric diagnoses. PMID:21103149

  4. Test Reliability at the Individual Level

    PubMed Central

    Hu, Yueqin; Nesselroade, John R.; Erbacher, Monica K.; Boker, Steven M.; Burt, S. Alexandra; Keel, Pamela K.; Neale, Michael C.; Sisk, Cheryl L.; Klump, Kelly

    2016-01-01

    Reliability has a long history as one of the key psychometric properties of a test. However, a given test might not measure people equally reliably. Test scores from some individuals may have considerably greater error than others. This study proposed two approaches using intraindividual variation to estimate test reliability for each person. A simulation study suggested that the parallel tests approach and the structural equation modeling approach recovered the simulated reliability coefficients. Then in an empirical study, where forty-five females were measured daily on the Positive and Negative Affect Schedule (PANAS) for 45 consecutive days, separate estimates of reliability were generated for each person. Results showed that reliability estimates of the PANAS varied substantially from person to person. The methods provided in this article apply to tests measuring changeable attributes and require repeated measures across time on each individual. This article also provides a set of parallel forms of PANAS. PMID:28936107

  5. Measurement of impulsive choice in rats: Same and alternate form test-retest reliability and temporal tracking

    PubMed Central

    Peterson, Jennifer R.; Hill, Catherine C.; Kirkpatrick, Kimberly

    2016-01-01

    Impulsive choice is typically measured by presenting smaller-sooner (SS) versus larger-later (LL) rewards, with biases towards the SS indicating impulsivity. The current study tested rats on different impulsive choice procedures with LL delay manipulations to assess same-form and alternate-form test-retest reliability. In the systematic-GE procedure (Green & Estle, 2003), the LL delay increased after several sessions of training; in the systematic-ER procedure (Evenden & Ryan, 1996), the delay increased within each session; and in the adjusting-M procedure (Mazur, 1987), the delay changed after each block of trials within a session based on each rat’s choices in the previous block. In addition to measuring choice behavior, we also assessed temporal tracking of the LL delays using the median times of responding during LL trials. The two systematic procedures yielded similar results in both choice and temporal tracking measures following extensive training, whereas the adjusting procedure resulted in relatively more impulsive choices and poorer temporal tracking. Overall, the three procedures produced acceptable same form test-retest reliability over time, but the adjusting procedure did not show significant alternate form test-retest reliability with the other two procedures. The results suggest that systematic procedures may supply better measurements of impulsive choice in rats. PMID:25490901

  6. Metabolomic prediction of yield in hybrid rice.

    PubMed

    Xu, Shizhong; Xu, Yang; Gong, Liang; Zhang, Qifa

    2016-10-01

    Rice (Oryza sativa) provides a staple food source for more than 50% of the world's population. An increase in yield can significantly contribute to global food security. Hybrid breeding can potentially help to meet this goal because hybrid rice often shows a considerable increase in yield when compared with pure-bred cultivars. We recently developed a marker-guided prediction method for hybrid yield and showed a substantial increase in yield through genomic hybrid breeding. We now have transcriptomic and metabolomic data as potential resources for prediction. Using six prediction methods, including least absolute shrinkage and selection operator (LASSO), best linear unbiased prediction (BLUP), stochastic search variable selection, partial least squares, and support vector machines using the radial basis function and polynomial kernel function, we found that the predictability of hybrid yield can be further increased using these omic data. LASSO and BLUP are the most efficient methods for yield prediction. For high heritability traits, genomic data remain the most efficient predictors. When metabolomic data are used, the predictability of hybrid yield is almost doubled compared with genomic prediction. Of the 21 945 potential hybrids derived from 210 recombinant inbred lines, selection of the top 10 hybrids predicted from metabolites would lead to a ~30% increase in yield. We hypothesize that each metabolite represents a biologically built-in genetic network for yield; thus, using metabolites for prediction is equivalent to using information integrated from these hidden genetic networks for yield prediction. © 2016 The Authors The Plant Journal © 2016 John Wiley & Sons Ltd.
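    The abstract compares shrinkage predictors such as LASSO and BLUP for predicting hybrid yield from omic features. As a rough, hedged illustration (not the authors' implementation), ridge regression captures the flavor of a BLUP-style shrinkage predictor; the function name, penalty value, and data shapes here are illustrative assumptions:

```python
import numpy as np

def ridge_blup_predict(X_train, y_train, X_test, lam=1.0):
    """Ridge regression as a stand-in for the BLUP-style shrinkage
    predictors discussed above: predict hybrid yield from omic features
    (markers, transcripts, or metabolites). `lam` controls shrinkage;
    this is a sketch, not the study's actual pipeline."""
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    n, p = X_train.shape
    # Solve the ridge normal equations: (X'X + lam*I) beta = X'y.
    beta = np.linalg.solve(X_train.T @ X_train + lam * np.eye(p),
                           X_train.T @ y_train)
    return np.asarray(X_test, dtype=float) @ beta
```

    In a breeding application, one would fit on phenotyped hybrids and rank the remaining untested hybrids by predicted yield before selecting the top candidates.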

  7. Redefining yield gaps at various spatial scales

    NASA Astrophysics Data System (ADS)

    Meng, K.; Fishman, R.; Norstrom, A. V.; Diekert, F. K.; Engstrom, G.; Gars, J.; McCarney, G. R.; Sjostedt, M.

    2013-12-01

    Recent research has highlighted the prevalence of 'yield gaps' around the world and the importance of closing them for global food security. However, the traditional concept of the yield gap - defined as the difference between observed and optimal yield under biophysical conditions - omits relevant socio-economic and ecological constraints and thus offers limited guidance on potential policy interventions. This paper proposes alternative definitions of yield gaps by incorporating rich, high resolution, national and sub-national agricultural datasets. We examine feasible efforts to 'close yield gaps' at various spatial scales and across different socio-economic and ecological domains.

  8. Multisite Reliability of Cognitive BOLD Data

    PubMed Central

    Brown, Gregory G.; Mathalon, Daniel H.; Stern, Hal; Ford, Judith; Mueller, Bryon; Greve, Douglas N.; McCarthy, Gregory; Voyvodic, Jim; Glover, Gary; Diaz, Michele; Yetter, Elizabeth; Burak Ozyurt, I.; Jorgensen, Kasper W.; Wible, Cynthia G.; Turner, Jessica A.; Thompson, Wesley K.; Potkin, Steven G.

    2010-01-01

    Investigators perform multi-site functional magnetic resonance imaging studies to increase statistical power, to enhance generalizability, and to improve the likelihood of sampling relevant subgroups. Yet undesired site variation in imaging methods could offset these potential advantages. We used variance components analysis to investigate sources of variation in the blood oxygen level dependent (BOLD) signal across four 3T magnets in voxelwise and region of interest (ROI) analyses. Eighteen participants traveled to four magnet sites to complete eight runs of a working memory task involving emotional or neutral distraction. Person variance was more than 10 times larger than site variance for five of six ROIs studied. Person-by-site interactions, however, contributed sizable unwanted variance to the total. Averaging over runs increased between-site reliability, with many voxels showing good to excellent between-site reliability when eight runs were averaged and regions of interest showing fair to good reliability. Between-site reliability depended on the specific functional contrast analyzed in addition to the number of runs averaged. Although median effect size was correlated with between-site reliability, dissociations were observed for many voxels. Brain regions where the pooled effect size was large but between-site reliability was poor were associated with reduced individual differences. Brain regions where the pooled effect size was small but between-site reliability was excellent were associated with a balance of participants who displayed consistently positive or consistently negative BOLD responses. Although between-site reliability of BOLD data can be good to excellent, acquiring highly reliable data requires robust activation paradigms, ongoing quality assurance, and careful experimental control. PMID:20932915
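    The variance components decomposition described above can be sketched for the simplest complete design, one measurement per person-site cell; this is a generic two-way ANOVA sketch (the actual study also modeled person-by-site interaction across repeated runs), with all data shapes illustrative:

```python
import numpy as np

def variance_components(table):
    """Two-way (person x site) random-effects variance components from a
    complete table with one measurement per person-site cell -- a minimal
    sketch of the kind of decomposition used in multisite reliability
    studies. Returns (person, site, residual) variances, clipped at zero."""
    x = np.asarray(table, dtype=float)
    n, k = x.shape                  # n persons, k sites
    grand = x.mean()
    # Mean squares for persons (rows) and sites (columns).
    msp = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)
    mss = n * np.sum((x.mean(axis=0) - grand) ** 2) / (k - 1)
    # Residual sum of squares is what remains of the total.
    sse = np.sum((x - grand) ** 2) - (n - 1) * msp - (k - 1) * mss
    mse = sse / ((n - 1) * (k - 1))
    var_person = max((msp - mse) / k, 0.0)
    var_site = max((mss - mse) / n, 0.0)
    return var_person, var_site, max(mse, 0.0)
```

    With person effects much larger than site effects, the estimated person variance dominates, mirroring the ">10 times larger" pattern the abstract reports.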

  9. Argentina wheat yield model

    NASA Technical Reports Server (NTRS)

    Callis, S. L.; Sakamoto, C.

    1984-01-01

    Five models based on multiple regression were developed to estimate wheat yields for the five wheat growing provinces of Argentina. Meteorological data sets were obtained for each province by averaging data for stations within each province. Predictor variables for the models were derived from monthly total precipitation, average monthly mean temperature, and average monthly maximum temperature. Buenos Aires was the only province for which a trend variable was included because of increasing trend in yield due to technology from 1950 to 1963.

  10. Argentina corn yield model

    NASA Technical Reports Server (NTRS)

    Callis, S. L.; Sakamoto, C.

    1984-01-01

    A model based on multiple regression was developed to estimate corn yields for the country of Argentina. A meteorological data set was obtained for the country by averaging data for stations within the corn-growing area. Predictor variables for the model were derived from monthly total precipitation, average monthly mean temperature, and average monthly maximum temperature. A trend variable was included for the years 1965 to 1980 since an increasing trend in yields due to technology was observed between these years.
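    The two records above describe yield models fitted by multiple regression on monthly weather predictors plus a technology trend. A minimal sketch of that kind of model, under the assumption of ordinary least squares (the specific predictor set and function names here are illustrative, not the published models):

```python
import numpy as np

def fit_yield_model(precip, temp_mean, temp_max, trend, yields):
    """Fit a multiple-regression yield model of the kind described above.
    Each predictor is a 1-D array with one entry per year; `trend` is a
    simple year index capturing technology-driven yield increases."""
    X = np.column_stack([np.ones_like(np.asarray(yields, dtype=float)),
                         precip, temp_mean, temp_max, trend])
    coefs, *_ = np.linalg.lstsq(X, np.asarray(yields, dtype=float),
                                rcond=None)
    return coefs

def predict_yield(coefs, precip, temp_mean, temp_max, trend):
    """Predicted yield for one year of (averaged) station data."""
    return (coefs[0] + coefs[1] * precip + coefs[2] * temp_mean
            + coefs[3] * temp_max + coefs[4] * trend)
```

    The trend column plays the role of the technology variable included for Buenos Aires wheat (1950-1963) and Argentina corn (1965-1980).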

  11. As reliable as the sun

    NASA Astrophysics Data System (ADS)

    Leijtens, J. A. P.

    2017-11-01

    Fortunately there is almost nothing as reliable as the sun, which can consequently be utilized as a very reliable source of spacecraft power. In order to harvest this power, the solar panels have to be pointed towards the sun as accurately and reliably as possible. To this end, sunsensors are available on almost every satellite to support vital sun-pointing capability throughout the mission, even in the deployment and safe-mode phases of the satellite's life. Given the criticality of the application, one would expect that after more than 50 years of sun sensor utilisation, such sensors would be fully matured and optimised. In actual fact though, the majority of sunsensors employed are still coarse sunsensors, which have a proven extreme reliability but present major issues regarding albedo sensitivity and pointing accuracy.

  12. 78 FR 38311 - Reliability Technical Conference Agenda

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-26

    ... issues related to the reliability of the Bulk-Power System. The agenda for this conference is attached... Reliability Technical Docket No. AD13-6-000 Conference. North American Electric Docket No. RC11-6-004 Reliability Corporation. North American Electric Docket No. RR13-2-000 Reliability Corporation. Not...

  13. Evidence for Ni-56 yields Co-56 yields Fe-56 decay in type Ia supernovae

    NASA Technical Reports Server (NTRS)

    Kuchner, Marc J.; Kirshner, Robert P.; Pinto, Philip A.; Leibundgut, Bruno

    1994-01-01

    In the prevailing picture of Type Ia supernovae (SN Ia), their explosive burning produces Ni-56, and the radioactive decay chain Ni-56 yields Co-56 yields Fe-56 powers the subsequent emission. We test a central feature of this theory by measuring the relative strengths of a (Co III) emission feature near 5900 A and a (Fe III) emission feature near 4700 A. We measure 38 spectra from 13 SN Ia ranging from 48 to 310 days after maximum light. When we compare the observations with a simple multilevel calculation, we find that the observed Fe/Co flux ratio evolves as expected when the Fe-56/Co-56 abundance ratio follows from Ni-56 yields Co-56 yields Fe-56 decay. From this agreement, we conclude that the cobalt and iron atoms we observe through SN Ia emission lines are produced by the radioactive decay of Ni-56, just as predicted by a wide range of models for SN Ia explosions.

  14. Test-re-test reliability and inter-rater reliability of a digital pelvic inclinometer in young, healthy males and females.

    PubMed

    Beardsley, Chris; Egerton, Tim; Skinner, Brendon

    2016-01-01

    Objective. The purpose of this study was to investigate the reliability of a digital pelvic inclinometer (DPI) for measuring sagittal plane pelvic tilt in 18 young, healthy males and females. Method. The inter-rater reliability and test-re-test reliabilities of the DPI for measuring pelvic tilt in standing on both the right and left sides of the pelvis were measured by two raters carrying out two rating sessions of the same subjects, three weeks apart. Results. For measuring pelvic tilt, inter-rater reliability was designated as good on both sides (ICC = 0.81-0.88), test-re-test reliability within a single rating session was designated as good on both sides (ICC = 0.88-0.95), and test-re-test reliability between two rating sessions was designated as moderate on the left side (ICC = 0.65) and good on the right side (ICC = 0.85). Conclusion. Inter-rater reliability and test-re-test reliability within a single rating session of the DPI in measuring pelvic tilt were both good, while test-re-test reliability between rating sessions was moderate-to-good. Caution is required regarding the interpretation of the test-re-test reliability within a single rating session, as the raters were not blinded. Further research is required to establish validity.
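    The ICC values reported above come from the intraclass correlation family; the variant for a two-way random-effects design with absolute agreement and a single measurement, ICC(2,1), can be sketched as follows (a generic formula-level sketch, not the authors' software):

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measurement. `ratings` is an (n_subjects, k_raters) array."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    # Mean squares for rows (subjects), columns (raters), and residual.
    msr = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)
    msc = n * np.sum((x.mean(axis=0) - grand) ** 2) / (k - 1)
    sse = np.sum((x - grand) ** 2) - (n - 1) * msr - (k - 1) * msc
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

    Because ICC(2,1) measures absolute agreement, a rater with a constant offset lowers the coefficient even when the rank ordering of subjects is perfect.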

  15. Fluorescence guided surgery and tracer-dose, fact or fiction?

    PubMed

    KleinJan, Gijs H; Bunschoten, Anton; van den Berg, Nynke S; Olmos, Renato A Valdès; Klop, W Martin C; Horenblas, Simon; van der Poel, Henk G; Wester, Hans-Jürgen; van Leeuwen, Fijs W B

    2016-09-01

    Fluorescence guidance is an upcoming methodology to improve surgical accuracy. Challenging herein is the identification of the minimum dose at which the tracer can be detected with a clinical-grade fluorescence camera. Using a hybrid tracer such as indocyanine green (ICG)-(99m)Tc-nanocolloid, it has become possible to determine the accumulation of tracer and correlate this to intraoperative fluorescence-based identification rates. In the current study, we determined the lower detection limit of tracer at which intraoperative fluorescence guidance was still feasible. Size exclusion chromatography (SEC) provided a laboratory set-up to analyze the chemical content and to simulate the migratory behavior of ICG-nanocolloid in tissue. Tracer accumulation and intraoperative fluorescence detection findings were derived from a retrospective analysis of 20 head-and-neck melanoma patients, 40 penile and 20 prostate cancer patients scheduled for sentinel node (SN) biopsy using ICG-(99m)Tc-nanocolloid. In these patients, following tracer injection, single photon emission computed tomography fused with computed tomography (SPECT/CT) was used to identify the SN(s). The percentage injected dose (% ID), the amount of ICG (in nmol), and the concentration of ICG in the SNs (in μM) was assessed for SNs detected on SPECT/CT and correlated with the intraoperative fluorescence imaging findings. SEC determined that in the hybrid tracer formulation, 41 % (standard deviation: 12 %) of ICG was present in nanocolloid-bound form. In the SNs detected using fluorescence guidance a median of 0.88 % ID was present, compared to a median of 0.25 % ID in the non-fluorescent SNs (p-value < 0.001). The % ID values could be correlated to the amount ICG in a SN (range: 0.003-10.8 nmol) and the concentration of ICG in a SN (range: 0.006-64.6 μM). The ability to provide intraoperative fluorescence guidance is dependent on the amount and concentration of the fluorescent dye accumulated in the

  16. 18 CFR 39.11 - Reliability reports.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Electric Reliability Organization shall conduct assessments of the adequacy of the Bulk-Power System in... assessments as determined by the Commission of the reliability of the Bulk-Power System in North America and... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Reliability reports. 39...

  17. PV Reliability Workshop | Photovoltaic Research | NREL

    Science.gov Websites

    Laboratory. NREL hosts an annual Photovoltaic Reliability Workshop (PVRW) so that solar technology experts Photovoltaic Reliability Workshop (PVRW) will be held Tuesday, February 27, to Thursday, March 1, at the workshop. 2017 Workshop The 2017 Photovoltaic Reliability Workshop (PVRW) was Tuesday, February 28, to

  18. 2016 NREL Photovoltaic Module Reliability Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, Sarah

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology - both critical goals for moving PV technologies deeper into the electricity marketplace.

  19. 2015 NREL Photovoltaic Module Reliability Workshops

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurtz, Sarah

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.

  20. Normalized Rotational Multiple Yield Surface Framework (NRMYSF) stress-strain curve prediction method based on small strain triaxial test data on undisturbed Auckland residual clay soils

    NASA Astrophysics Data System (ADS)

    Noor, M. J. Md; Ibrahim, A.; Rahman, A. S. A.

    2018-04-01

    Small strain triaxial test measurement is considered significantly more accurate than external strain measurement using the conventional method, which is prone to systematic errors. Three submersible miniature linear variable differential transducers (LVDTs) were mounted on yokes clamped directly onto the soil sample, spaced equally at 120° from one another. The setup, using a 0.4 N resolution load cell and a 16-bit AD converter, was capable of consistently resolving displacements of less than 1 µm and measuring axial strains ranging from less than 0.001% to 2.5%. Further analysis of the small strain local measurement data was performed using the new Normalized Rotational Multiple Yield Surface Framework (NRMYSF) method and compared with the existing Rotational Multiple Yield Surface Framework (RMYSF) prediction method. The prediction of shear strength based on the combined intrinsic curvilinear shear strength envelope using small strain triaxial test data confirmed the significant improvement and reliability of the measurement and analysis methods. Moreover, the NRMYSF method shows excellent data prediction and a significant improvement toward more reliable prediction of soil strength, which can reduce the cost and time of experimental laboratory testing.

  1. Global Agriculture Yields and Conflict under Future Climate

    NASA Astrophysics Data System (ADS)

    Rising, J.; Cane, M. A.

    2013-12-01

    Aspects of climate have been shown to correlate significantly with conflict. We investigate a possible pathway for these effects through changes in agriculture yields, as predicted by field crop models (FAO's AquaCrop and DSSAT). Using satellite and station weather data, and surveyed data for soil and management, we simulate major crop yields across all countries between 1961 and 2008, and compare these to FAO and USDA reported yields. Correlations vary by country and by crop, from approximately .8 to -.5. Some of this range in crop model performance is explained by crop varieties, data quality, and other natural, economic, and political features. We also quantify the ability of AquaCrop and DSSAT to simulate yields under past cycles of ENSO as a proxy for their performance under changes in climate. We then describe two statistical models which relate crop yields to conflict events from the UCDP/PRIO Armed Conflict dataset. The first relates several preceding years of predicted yields of the major grain in each country to any conflict involving that country. The second uses the GREG ethnic group maps to identify differences in predicted yields between neighboring regions. By using variation in predicted yields to explain conflict, rather than actual yields, we can identify the exogenous effects of weather on conflict. Finally, we apply precipitation and temperature time-series under IPCC's A1B scenario to the statistical models. This allows us to estimate the scale of the impact of future yields on future conflict. [Figure captions: Centroids of the major growing regions for each country's primary crop, based on USDA FAS consumption. Correlations between simulated yields and reported yields, for AquaCrop and DSSAT, under the assumption that no irrigation, fertilization, or pest control is used; reported yields are the average of FAO yields and USDA FAS yields, where both are available.]

  2. A reliability analysis tool for SpaceWire network

    NASA Astrophysics Data System (ADS)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power, and fault protection. High reliability is a vital issue for spacecraft, so it is very important to analyze and improve the reliability performance of the SpaceWire network. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. Based on the functional division of the distributed network, a task-based reliability analysis method is proposed: the reliability analysis of every task yields the system reliability matrix, and the reliability of the network system can be deduced by integrating all of the reliability indexes in the matrix. With this method, we developed a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and the multi-path-task reliability are also implemented. Using this tool, we analyzed several cases on typical architectures, and the analytic results indicate that a redundant architecture has better reliability performance than a basic one. In practice, a dual redundancy scheme has been adopted for some key units to improve the reliability of the system or task. This reliability analysis tool therefore has a direct influence on both task division and topology selection in the design phase of a SpaceWire network system.
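    The task-based, multi-path reliability calculation described above reduces, in its simplest form, to series-parallel combination: a task routed over one path succeeds only if every unit on the path works, and a task with redundant paths fails only if every path fails. A minimal sketch under that assumption (this is the generic series-parallel model, not the tool's exact algorithm):

```python
from functools import reduce

def path_reliability(unit_reliabilities):
    """Reliability of a task routed over a single path: the series
    product of its units' reliabilities."""
    return reduce(lambda a, b: a * b, unit_reliabilities, 1.0)

def task_reliability(paths):
    """Reliability of a task with redundant (parallel) paths: the task
    fails only if every path fails. `paths` is a list of lists of unit
    reliabilities, one inner list per redundant route."""
    fail_all = 1.0
    for units in paths:
        fail_all *= 1.0 - path_reliability(units)
    return 1.0 - fail_all
```

    Duplicating a path always raises the task reliability, which is consistent with the paper's finding that the redundant architecture outperforms the basic one.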

  3. Transit ridership, reliability, and retention.

    DOT National Transportation Integrated Search

    2008-10-01

    This project explores two major components that affect transit ridership: travel time reliability and rider : retention. It has been recognized that transit travel time reliability may have a significant impact on : attractiveness of transit to many ...

  4. WHO Study on the reliability and validity of the alcohol and drug use disorder instruments: overview of methods and results.

    PubMed

    Ustün, B; Compton, W; Mager, D; Babor, T; Baiyewu, O; Chatterji, S; Cottler, L; Göğüş, A; Mavreas, V; Peters, L; Pull, C; Saunders, J; Smeets, R; Stipec, M R; Vrasti, R; Hasin, D; Room, R; Van den Brink, W; Regier, D; Blaine, J; Grant, B F; Sartorius, N

    1997-09-25

    The WHO Study on the reliability and validity of the alcohol and drug use disorder instruments is an international study which has taken place in centres in ten countries, aiming to test the reliability and validity of three diagnostic instruments for alcohol and drug use disorders: the Composite International Diagnostic Interview (CIDI), the Schedules for Clinical Assessment in Neuropsychiatry (SCAN) and a special version of the Alcohol Use Disorder and Associated Disabilities Interview Schedule-alcohol/drug-revised (AUDADIS-ADR). The purpose of the reliability and validity (R&V) study is to further develop the alcohol and drug sections of these instruments so that a range of substance-related diagnoses can be made in a systematic, consistent, and reliable way. The study focuses on new criteria proposed in the tenth revision of the International Classification of Diseases (ICD-10) and the fourth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) for dependence, harmful use and abuse categories for alcohol and psychoactive substance use disorders. A systematic study including a scientifically rigorous measure of reliability (i.e. 1 week test-retest reliability) and validity (i.e. comparison between clinical and non-clinical measures) has been undertaken. Results have yielded useful information on reliability and validity of these instruments at the diagnosis, criteria and question level. Overall the diagnostic concordance coefficients (kappa, κ) were very good for dependence disorders (0.7-0.9), but were somewhat lower for the abuse and harmful use categories. The comparisons among instruments and independent clinical evaluations and debriefing interviews gave important information about possible sources of unreliability, and provided useful clues on the applicability and consistency of nosological concepts across cultures.
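    The concordance coefficients reported above are kappa statistics, which correct raw agreement for the agreement expected by chance. A minimal sketch of Cohen's kappa for two diagnostic occasions (e.g. 1-week test-retest), generic rather than the study's exact computation:

```python
def cohens_kappa(ratings_a, ratings_b):
    """Chance-corrected agreement between two lists of categorical
    diagnoses of equal length: kappa = (p_obs - p_exp) / (1 - p_exp)."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    # Observed proportion of exact agreement.
    p_obs = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Agreement expected by chance from the marginal frequencies.
    p_exp = sum((ratings_a.count(c) / n) * (ratings_b.count(c) / n)
                for c in categories)
    return (p_obs - p_exp) / (1 - p_exp)
```

    On the usual rule-of-thumb scale, the 0.7-0.9 range quoted for dependence diagnoses counts as substantial to almost-perfect agreement.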

  5. Aerospace reliability applied to biomedicine.

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.; Vargo, D. J.

    1972-01-01

    An analysis is presented that indicates that the reliability and quality assurance methodology selected by NASA to minimize failures in aerospace equipment can be applied directly to biomedical devices to improve hospital equipment reliability. The Space Electric Rocket Test project is used as an example of NASA application of reliability and quality assurance (R&QA) methods. By analogy a comparison is made to show how these same methods can be used in the development of transducers, instrumentation, and complex systems for use in medicine.

  6. Using Geostatistical Data Fusion Techniques and MODIS Data to Upscale Simulated Wheat Yield

    NASA Astrophysics Data System (ADS)

    Castrignano, A.; Buttafuoco, G.; Matese, A.; Toscano, P.

    2014-12-01

    Population growth increases food demand. Assessing food demand and predicting the actual supply for a given location are critical components of strategic food security planning at the regional scale. Crop yield can be simulated using crop models because it is site-specific and determined by weather, management, length of growing season and soil properties. Crop models require reliable location-specific data that are not generally available. Obtaining these data at a large number of locations is time-consuming, costly and sometimes simply not feasible. An upscaling method to extend coverage of sparse estimates of crop yield to an appropriate extrapolation domain is required. This work is aimed at investigating the applicability of a geostatistical data fusion approach for merging remote sensing data with the predictions of a simulation model of wheat growth and production using ground-based data. The study area is the Capitanata plain (4000 km2) located in the Apulia Region, mostly cropped with durum wheat. The MODIS EVI/NDVI data products for the Capitanata plain were downloaded from the Land Processes Distributed Active Archive Center (LP DAAC) for the whole crop cycle of durum wheat. Phenological development, biomass growth and grain quantity of durum wheat were simulated by the Delphi system, based on a crop simulation model linked to a database including soil properties, agronomical and meteorological data. Multicollocated cokriging was used to integrate secondary exhaustive information (multi-spectral MODIS data) with the primary variable (sparsely distributed biomass/yield model predictions of durum wheat). The model estimates were strongly spatially correlated with the radiance data (red and NIR bands), and the data fusion approach proved to be quite suitable and flexible for integrating data of different types and supports.

  7. Strip-based immunoassay for the simultaneous detection of the neonicotinoid insecticides imidacloprid and thiamethoxam in agricultural products

    USDA-ARS?s Scientific Manuscript database

    A semiquantitative strip immunoassay was developed for the rapid detection of imidacloprid and thiamethoxam in agricultural products using specific nanocolloidal gold-labeled monoclonal antibodies. The conjugates of imidacloprid-BSA and thiamethoxam-BSA and goat anti-mouse IgG were coated on the ni...

  8. Yield of Unthinned Yellow-Poplar

    Treesearch

    Donald E. Beck; Lino Della-Bianca

    1970-01-01

    Cubic-foot and board-foot yields of unthinned yellow-poplar (Liriodendron Tulipiferi L.) stands are described in relation to stand age, site index, and number of trees per acre. The yield tables are based on analysis of diameter distributions and height-diameter relationships obtained from 141 natural, unthinned yellow-poplar stands in the...

  9. Scaled CMOS Technology Reliability Users Guide

    NASA Technical Reports Server (NTRS)

    White, Mark

    2010-01-01

    The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect for the high-reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability on commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (β) = 1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is
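    The two soft error populations described above can be expressed with Weibull distributions: a random early population with shape β = 1 (constant hazard, i.e. exponential) mixed with a wear-out population with β > 1 (increasing failure rate). A minimal sketch in that spirit, with all parameter values purely illustrative:

```python
import math

def weibull_cdf(t, beta, eta):
    """Cumulative failure probability of a Weibull distribution with
    shape `beta` and characteristic life `eta`."""
    return 1.0 - math.exp(-((t / eta) ** beta))

def mixed_failure_cdf(t, p_early, beta_early, eta_early,
                      beta_main, eta_main):
    """Mixed-population failure model: a fraction `p_early` of bits fails
    as a random (beta = 1) early population, the remainder as a wear-out
    population with increasing failure rate (beta > 1). A sketch, not the
    paper's fitted acceleration model."""
    return (p_early * weibull_cdf(t, beta_early, eta_early)
            + (1.0 - p_early) * weibull_cdf(t, beta_main, eta_main))
```

    Fitting the two sub-populations separately is what lets the early, randomly distributed weak bits be distinguished from the main breakdown population on a Weibull plot.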

  10. Genotypic Variation in Yield, Yield Components, Root Morphology and Architecture, in Soybean in Relation to Water and Phosphorus Supply

    PubMed Central

    He, Jin; Jin, Yi; Du, Yan-Lei; Wang, Tao; Turner, Neil C.; Yang, Ru-Ping; Siddique, Kadambot H. M.; Li, Feng-Min

    2017-01-01

    Water shortage and low phosphorus (P) availability limit yields in soybean. Roots play important roles in water-limited and P-deficient environment, but the underlying mechanisms are largely unknown. In this study we determined the responses of four soybean [Glycine max (L.) Merr.] genotypes [Huandsedadou (HD), Bailudou (BLD), Jindou 21 (J21), and Zhonghuang 30 (ZH)] to three P levels [applied 0 (P0), 60 (P60), and 120 (P120) mg P kg-1 dry soil to the upper 0.4 m of the soil profile] and two water treatment [well-watered (WW) and water-stressed (WS)] with special reference to root morphology and architecture, we compared yield and its components, root morphology and root architecture to find out which variety and/or what kind of root architecture had high grain yield under P and drought stress. The results showed that water stress and low P, respectively, significantly reduced grain yield by 60 and 40%, daily water use by 66 and 31%, P accumulation by 40 and 80%, and N accumulation by 39 and 65%. The cultivar ZH with the lowest daily water use had the highest grain yield at P60 and P120 under drought. Increased root length was positively associated with N and P accumulation in both the WW and WS treatments, but not with grain yield under water and P deficits. However, in the WS treatment, high adventitious and lateral root densities were associated with high N and P uptake per unit root length which in turn was significantly and positively associated with grain yield. Our results suggest that (1) genetic variation of grain yield, daily water use, P and N accumulation, and root morphology and architecture were observed among the soybean cultivars and ZH had the best yield performance under P and water limited conditions; (2) water has a major influence on nutrient uptake and grain yield, while additional P supply can modestly increase yields under drought in some soybean genotypes; (3) while conserved water use plays an important role in grain yield under drought

  11. Genotypic Variation in Yield, Yield Components, Root Morphology and Architecture, in Soybean in Relation to Water and Phosphorus Supply.

    PubMed

    He, Jin; Jin, Yi; Du, Yan-Lei; Wang, Tao; Turner, Neil C; Yang, Ru-Ping; Siddique, Kadambot H M; Li, Feng-Min

    2017-01-01

    Water shortage and low phosphorus (P) availability limit yields in soybean. Roots play important roles in water-limited and P-deficient environments, but the underlying mechanisms are largely unknown. In this study we determined the responses of four soybean [Glycine max (L.) Merr.] genotypes [Huandsedadou (HD), Bailudou (BLD), Jindou 21 (J21), and Zhonghuang 30 (ZH)] to three P levels [0 (P0), 60 (P60), and 120 (P120) mg P kg-1 dry soil applied to the upper 0.4 m of the soil profile] and two water treatments [well-watered (WW) and water-stressed (WS)], with special reference to root morphology and architecture. We compared yield and its components, root morphology, and root architecture to find out which variety and/or what kind of root architecture gave high grain yield under P and drought stress. The results showed that water stress and low P significantly reduced grain yield by 60 and 40%, respectively, daily water use by 66 and 31%, P accumulation by 40 and 80%, and N accumulation by 39 and 65%. The cultivar ZH, with the lowest daily water use, had the highest grain yield at P60 and P120 under drought. Increased root length was positively associated with N and P accumulation in both the WW and WS treatments, but not with grain yield under water and P deficits. However, in the WS treatment, high adventitious and lateral root densities were associated with high N and P uptake per unit root length, which in turn was significantly and positively associated with grain yield. Our results suggest that (1) genetic variation in grain yield, daily water use, P and N accumulation, and root morphology and architecture was observed among the soybean cultivars, and ZH had the best yield performance under P- and water-limited conditions; (2) water has a major influence on nutrient uptake and grain yield, while additional P supply can modestly increase yields under drought in some soybean genotypes; (3) while conserved water use plays an important role in grain yield under drought

  12. Water limits to closing yield gaps

    NASA Astrophysics Data System (ADS)

    Davis, Kyle Frankel; Rulli, Maria Cristina; Garrassino, Francesco; Chiarelli, Davide; Seveso, Antonio; D'Odorico, Paolo

    2017-01-01

    Agricultural intensification is often seen as a suitable approach to meet the growing demand for agricultural products and improve food security. It typically entails the use of fertilizers, new cultivars, irrigation, and other modern technology. In regions of the world affected by seasonal or chronic water scarcity, yield gap closure is strongly dependent on irrigation (blue water). Global yield gap assessments have often ignored whether the water required to close the yield gap is locally available. Here we perform a gridded global analysis (10 km resolution) of the blue water consumption that is needed annually to close the yield gap worldwide and evaluate the associated pressure on renewable freshwater resources. We find that, to close the yield gap, human appropriation of freshwater resources for irrigation would have to increase by at least 146%. Most study countries would experience at least a doubling in blue water requirement, with 71% of the additional blue water being required by only four crops: maize, rice, soybeans, and wheat. Further, in some countries (e.g., Algeria, Morocco, Syria, Tunisia, and Yemen) the total volume of blue water required for yield gap closure would exceed sustainable levels of freshwater consumption (i.e., 40% of total renewable surface and groundwater resources).

  13. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. 
A framework for performing reliability based design optimization under epistemic uncertainty is also developed.

  14. Travel reliability inventory for Chicago.

    DOT National Transportation Integrated Search

    2013-04-01

    The overarching goal of this research project is to enable state DOTs to document and monitor the reliability performance of their highway networks. To this end, a computer tool, TRIC, was developed to produce travel reliability inventories from ...

  15. Estimation of rice yield affected by drought and relation between rice yield and TVDI

    NASA Astrophysics Data System (ADS)

    Hongo, C.; Tamura, E.; Sigit, G.

    2016-12-01

    Impact of climate change is seen not only in food production but also in food security and the sustainable development of society. Adaptation to climate change is a pressing issue throughout the world for reducing these risks through plans and strategies for food security and sustainable development. As a key adaptation to climate change, agricultural insurance is expected to play an important role in stabilizing agricultural production by compensating the losses caused by climate change. As such an adaptation, the Government of Indonesia has launched an agricultural insurance program for damage to rice from drought, flood, and pests and diseases. The Government started a pilot project in 2013, and this year the pilot project has been extended to 22 provinces. Against this background, we conducted research on the development of a new damage assessment method for rice using remote sensing data, which could be used for evaluating the damage ratio caused by drought in West Java, Indonesia. For assessment of the damage ratio, estimation of rice yield is key. As a result of our study, rice yield affected by drought in the dry season could be estimated at the 1% significance level using SPOT 7 data taken in 2015, and the validation result was 0.8 t/ha. The decrease ratio in rice yield for each individual paddy field was then calculated using the estimated result and the average yield of the past 10 years. In addition, the TVDI (Temperature Vegetation Dryness Index) calculated from Landsat 8 data in the heading season indicated dryness in the low-yield area. The result suggests that rice yield was affected by irrigation water shortage around the heading season as a result of decreased precipitation due to El Nino. Our study makes clear that remote sensing data can be promising for assessing the damage ratio of rice production precisely, quickly, and quantitatively, and that it can be incorporated into insurance procedures.
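    The decrease ratio described above compares a drought-year yield estimate with the average yield of the past 10 years. A minimal sketch of that calculation, using hypothetical yields in t/ha; the function name and values are illustrative, not from the study:

```python
def yield_decrease_ratio(estimated_yield, reference_yields):
    """Fractional yield reduction relative to the mean of reference (non-drought) years."""
    reference = sum(reference_yields) / len(reference_yields)
    return (reference - estimated_yield) / reference

# Hypothetical values (t/ha): 10 past-year yields vs. a drought-year estimate of 4.2 t/ha.
ratio = yield_decrease_ratio(4.2, [6.0, 5.8, 6.2, 6.1, 5.9, 6.0, 6.3, 5.7, 6.0, 6.0])
print(round(ratio, 3))
```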

  16. A reliability study on brain activation during active and passive arm movements supported by an MRI-compatible robot.

    PubMed

    Estévez, Natalia; Yu, Ningbo; Brügger, Mike; Villiger, Michael; Hepp-Reymond, Marie-Claude; Riener, Robert; Kollias, Spyros

    2014-11-01

    In neurorehabilitation, longitudinal assessment of arm movement related brain function in patients with motor disability is challenging due to variability in task performance. MRI-compatible robots monitor and control task performance, yielding more reliable evaluation of brain function over time. The main goals of the present study were first to define the brain network activated while performing active and passive elbow movements with an MRI-compatible arm robot (MaRIA) in healthy subjects, and second to test the reproducibility of this activation over time. For the fMRI analysis two models were compared. In model 1 movement onset and duration were included, whereas in model 2 force and range of motion were added to the analysis. Reliability of brain activation was tested with several statistical approaches applied on individual and group activation maps and on summary statistics. The activated network included mainly the primary motor cortex, primary and secondary somatosensory cortex, superior and inferior parietal cortex, medial and lateral premotor regions, and subcortical structures. Reliability analyses revealed robust activation for active movements with both fMRI models and all the statistical methods used. Imposed passive movements also elicited mainly robust brain activation for individual and group activation maps, and reliability was improved by including additional force and range of motion using model 2. These findings demonstrate that the use of robotic devices, such as MaRIA, can be useful to reliably assess arm movement related brain activation in longitudinal studies and may contribute in studies evaluating therapies and brain plasticity following injury in the nervous system.

  17. Postpartum body condition score and results from the first test day milk as predictors of disease, fertility, yield, and culling in commercial dairy herds.

    PubMed

    Heuer, C; Schukken, Y H; Dobbelaar, P

    1999-02-01

    The study used field data from a regular herd health service to investigate the relationships between body condition scores or first test day milk data and disease incidence, milk yield, fertility, and culling. Path model analysis with adjustment for time at risk was applied to delineate the time sequence of events. Milk fever occurred more often in fat cows, and endometritis occurred between calving and 20 d of lactation more often in thin cows. Fat cows were less likely to conceive at first service than were cows in normal condition. Fat body condition postpartum, higher first test day milk yield, and a fat to protein ratio of > 1.5 increased body condition loss. Fat or thin condition or condition loss was not related to other lactation diseases, fertility parameters, milk yield, or culling. First test day milk yield was 1.3 kg higher after milk fever and was 7.1 kg lower after displaced abomasum. Higher first test day milk yield directly increased the risk of ovarian cyst and lameness, increased 100-d milk yield, and reduced the risk of culling and indirectly decreased reproductive performance. Cows with a fat to protein ratio of > 1.5 had higher risks for ketosis, displaced abomasum, ovarian cyst, lameness, and mastitis. Those cows produced more milk but showed poor reproductive performance. Given this type of herd health data, we concluded that the first test day milk yield and the fat to protein ratio were more reliable indicators of disease, fertility, and milk yield than was body condition score or loss of body condition score.

  18. Reliability assessments in qualitative health promotion research.

    PubMed

    Cook, Kay E

    2012-03-01

    This article contributes to the debate about the use of reliability assessments in qualitative research in general, and health promotion research in particular. In this article, I examine the use of reliability assessments in qualitative health promotion research in response to health promotion researchers' commonly held misconception that reliability assessments improve the rigor of qualitative research. All qualitative articles published in the journal Health Promotion International from 2003 to 2009 employing reliability assessments were examined. In total, 31.3% (20/64) of the articles employed some form of reliability assessment. The use of reliability assessments increased over the study period, ranging from <20% in 2003/2004 to 50% and above in 2008/2009, while at the same time the total number of qualitative articles decreased. The articles were then classified into four types of reliability assessment: verification of thematic codes, the use of inter-rater reliability statistics, congruence in team coding, and congruence in coding across sites. The merits of each type are discussed, with the subsequent discussion focusing on the deductive nature of reliable thematic coding, the limited depth of immediately verifiable data, and the usefulness of such studies to health promotion and the advancement of the qualitative paradigm.

  19. Learned helplessness: validity and reliability of depressive-like states in mice.

    PubMed

    Chourbaji, S; Zacher, C; Sanchis-Segura, C; Dormann, C; Vollmayr, B; Gass, P

    2005-12-01

    The learned helplessness paradigm is a depression model in which animals are exposed to unpredictable and uncontrollable stress, e.g. electroshocks, and subsequently develop coping deficits for aversive but escapable situations (J.B. Overmier, M.E. Seligman, Effects of inescapable shock upon subsequent escape and avoidance responding, J. Comp. Physiol. Psychol. 63 (1967) 28-33). It represents a model with good similarity to the symptoms of depression, and with construct and predictive validity, in rats. Despite an increased need to investigate emotional, in particular depression-like, behaviors in transgenic mice, so far only a few studies have been published using the learned helplessness paradigm. One reason may be the fact that, in contrast to rats (B. Vollmayr, F.A. Henn, Learned helplessness in the rat: improvements in validity and reliability, Brain Res. Brain Res. Protoc. 8 (2001) 1-7), there is no generally accepted learned helplessness protocol available for mice. This prompted us to develop a reliable helplessness procedure in C57BL/6N mice, to exclude possible artifacts, and to establish a protocol that yields a consistent fraction of helpless mice following shock exposure. Furthermore, we validated this protocol pharmacologically using the tricyclic antidepressant imipramine. Here, we present a mouse model with good face and predictive validity that can be used for transgenic, behavioral, and pharmacological studies.

  20. Reliability analysis of component of affination centrifugal 1 machine by using reliability engineering

    NASA Astrophysics Data System (ADS)

    Sembiring, N.; Ginting, E.; Darnello, T.

    2017-12-01

    Problems have appeared in a company that produces refined sugar: the production floor has not reached the required availability level for its critical machines because they often suffer damage (breakdowns). This results in sudden losses of production time and production opportunities. The problem can be addressed with the Reliability Engineering method, in which a statistical approach to historical failure data is used to identify the pattern of the failure distribution. The method provides values for the reliability, failure rate, and availability of a machine over the scheduled maintenance interval. Distribution tests on the time-between-failure (MTTF) data gave a lognormal distribution for the flexible hose component and a Weibull distribution for the teflon cone lifting component, while distribution tests on the mean time to repair (MTTR) data gave an exponential distribution for the flexible hose component and a Weibull distribution for the teflon cone lifting component. For the flexible hose component on a replacement schedule of every 720 hours, the obtained reliability was 0.2451 and the availability 0.9960; for the critical teflon cone lifting component on a replacement schedule of every 1944 hours, the obtained reliability was 0.4083 and the availability 0.9927.
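    The reliability and availability figures above follow from the fitted failure and repair distributions. A minimal sketch of the two quantities, assuming a two-parameter Weibull time-to-failure model; the shape, scale, MTTF, and MTTR values below are illustrative, not the paper's fitted parameters:

```python
import math

def weibull_reliability(t, shape, scale):
    """R(t) = exp(-(t/scale)**shape): probability the component survives to time t."""
    return math.exp(-((t / scale) ** shape))

def steady_state_availability(mttf, mttr):
    """Fraction of time the machine is up: MTTF / (MTTF + MTTR)."""
    return mttf / (mttf + mttr)

# Illustrative parameters only (NOT the paper's fitted values).
r = weibull_reliability(t=720.0, shape=1.5, scale=1500.0)
a = steady_state_availability(mttf=1800.0, mttr=12.0)
print(round(r, 4), round(a, 4))
```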

  1. Vulnerability of Agriculture to Climate Change as Revealed by Relationships between Simulated Crop Yield and Climate Change Indices

    NASA Astrophysics Data System (ADS)

    King, A. W.; Absar, S. M.; Nair, S.; Preston, B. L.

    2012-12-01

    vulnerability analysis. They also contribute to considerations of adaptation, focusing attention on adapting to increased variability in yield rather than just reductions in yield. For example, in the face of increased variability or reduced reliability, hedging and risk-spreading strategies may be more important than technological innovations such as drought-resistant crops or other optimization strategies. Our findings also have implications for the choice and application of climate extreme indices, for the demands on models used to project climate change, and for the development of next-generation integrated assessment models (IAMs) that incorporate the agricultural sector, and especially adaptation within that sector, in energy and broader markets.

  2. Climate-Driven Crop Yield and Yield Variability and Climate Change Impacts on the U.S. Great Plains Agricultural Production.

    PubMed

    Kukal, Meetpal S; Irmak, Suat

    2018-02-22

    Climate variability and trends affect global crop yields and are highly dependent on location, crop type, and irrigation. The U.S. Great Plains, due to its significance in national food production, evident climate variability, and extensive irrigation, is an ideal region for investigating climate impacts on food production. This paper evaluates climate impacts on maize, sorghum, and soybean yields and the effect of irrigation for individual counties in this region, employing extensive crop yield and climate datasets from 1968-2013. Variability in crop yields was a quarter of the regional average yields, with a quarter of this variability explained by climate variability; temperature and precipitation explained this variability singly or in combination at different locations. The observed temperature trend was beneficial for maize yields but detrimental for sorghum and soybean yields, whereas the observed precipitation trend was beneficial for all three crops. Irrigated yields demonstrated considerably greater robustness against climate impacts than their non-irrigated counterparts, making irrigation an effective mitigation strategy. The information, data, and maps provided can serve as an assessment guide for planners, managers, and policy- and decision-makers to prioritize agricultural resilience efforts and resource allocation or re-allocation in regions that exhibit risk from climate variability.

  3. Interaction Between Phosphorus and Zinc on the Biomass Yield and Yield Attributes of the Medicinal Plant Stevia (Stevia rebaudiana)

    PubMed Central

    Das, Kuntal; Dang, Raman; Shivananda, T. N.; Sur, Pintu

    2005-01-01

    A greenhouse experiment was conducted at the Indian Institute of Horticultural Research (IIHR), Bangalore to study the interaction effect between phosphorus (P) and zinc (Zn) on the yield and yield attributes of the medicinal plant stevia. The results show that the yield and yield attributes were significantly affected by the different treatments. The total yield in terms of biomass production increased significantly with the application of Zn and P in different combinations and methods, being highest (23.34 g fresh biomass) in the treatment where Zn was applied as both soil (10 kg ZnSO4/ha) and foliar spray (0.2% ZnSO4). The results also indicated that the different yield attributes, viz. height, total number of branches, and number of leaves per plant, varied with treatments, being highest in the treatment where Zn was applied as both soil and foliar spray without the application of P. The results further indicated that the yield and yield attributes of stevia decreased in the treatment where Zn was applied as both soil and foliar spray along with P, suggesting an antagonistic effect between Zn and P. PMID:15915292

  4. An Evaluation Method of Equipment Reliability Configuration Management

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Feng, Weijia; Zhang, Wei; Li, Yuan

    2018-01-01

    At present, many equipment development companies are aware of the great significance of reliability in equipment development. However, due to the lack of an effective management evaluation method, it is very difficult for an equipment development company to manage its own reliability work. An evaluation method for equipment reliability configuration management determines the reliability management capabilities of an equipment development company. Reliability is not only designed in, but also achieved through management. This paper evaluates reliability management capabilities using the reliability configuration capability maturity model (RCM-CMM) evaluation method.

  5. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    PubMed

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
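    The ERA Toolbox itself is Matlab, but the G-theory calculation it performs can be illustrated in miniature. A one-facet (persons x trials) sketch of a relative G coefficient estimated from ANOVA mean squares; the function name and example data are hypothetical, and real ERP designs involve more facets than this:

```python
import numpy as np

def g_coefficient(scores):
    """Relative G (generalizability) coefficient for a persons x trials score matrix.

    Variance components follow from the expected mean squares of a two-way
    random-effects ANOVA without replication:
        sigma2_pt = MS_residual
        sigma2_p  = (MS_persons - MS_residual) / n_trials
    and the relative coefficient is sigma2_p / (sigma2_p + sigma2_pt / n_trials).
    """
    scores = np.asarray(scores, dtype=float)
    n_p, n_t = scores.shape
    grand = scores.mean()
    ss_p = n_t * ((scores.mean(axis=1) - grand) ** 2).sum()
    ss_t = n_p * ((scores.mean(axis=0) - grand) ** 2).sum()
    ss_res = ((scores - grand) ** 2).sum() - ss_p - ss_t
    ms_p = ss_p / (n_p - 1)
    ms_res = ss_res / ((n_p - 1) * (n_t - 1))
    var_p = max((ms_p - ms_res) / n_t, 0.0)  # negative variance estimates clamped to 0
    return var_p / (var_p + ms_res / n_t)
```

Perfectly consistent persons (e.g. rows [1,1,1], [2,2,2], [3,3,3]) give a coefficient of 1.0; persons who differ only by trial noise give 0.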

  6. Rx for low cash yields.

    PubMed

    Tobe, Chris

    2003-10-01

    Certain strategies can offer not-for-profit hospitals potentially greater investment yields while maintaining stability and principal safety. Treasury inflation-indexed securities can offer good returns, low volatility, and inflation protection. "Enhanced cash" strategies offer liquidity and help to preserve capital. Stable value "wrappers" allow hospitals to pursue higher-yielding fixed-income securities without an increase in volatility.

  7. [Reliability and Validity of the Behavioral Check List for Preschool Children to Measure Attention Deficit Hyperactivity Behaviors].

    PubMed

    Tsuno, Kanami; Yoshimasu, Kouichi; Hayashi, Takashi; Tatsuta, Nozomi; Ito, Yuki; Kamijima, Michihiro; Nakai, Kunihiko

    2018-01-01

    Nowadays, attention deficit hyperactivity (ADH) problems are commonly observed among school-age children. However, questionnaires specific to ADH behaviors among preschool children are very few. The aim of this study was to investigate the reliability and validity of the 25-item Behavioral Check List (BCL), which was developed from interviews of parents of children diagnosed with attention-deficit/hyperactivity disorder (ADHD) and measures ADH behaviors at preschool age. We recruited 22 teachers from 10 nurseries/kindergartens in Miyagi Prefecture, Japan. A total of 138 preschool children were assessed using the BCL. To investigate inter-rater reliability, two teachers from each facility assessed seven to twenty children in their class, and intraclass correlation coefficients (ICCs) were calculated. The teachers additionally answered the 1½-5 Caregiver-Teacher Report Form (C-TRF) to investigate the criterion validity of the BCL. To investigate structural validity, exploratory factor analysis with promax rotation and confirmatory factor analysis were performed. The internal consistency reliability of the BCL was good (α = 0.92), and correlation analyses also confirmed its criterion validity. Although exploratory factor analysis of the BCL yielded a five-factor model with a factor structure different from the original, the results were similar to the original six factors. The ICCs of the BCL were 0.38-0.99, which was not high enough for inter-rater reliability in some facilities; however, this could likely be improved by giving raters adequate instructions when using the BCL. The present study showed acceptable levels of reliability and validity of the BCL among Japanese preschool children.

  8. Person Reliability

    ERIC Educational Resources Information Center

    Lumsden, James

    1977-01-01

    Person changes can be of three kinds: developmental trends, swells, and tremors. Person unreliability in the tremor sense (momentary fluctuations) can be estimated from person characteristic curves. Average person reliability for groups can be compared from item characteristic curves. (Author)

  9. Coppice Sycamore Yields Through 9 Years

    Treesearch

    Harvey E. Kennedy

    1980-01-01

    Cutting cycle and spacing did not significantly affect sycamore dry-weight yields from ages 5-9 years (1974-1978). Longer cutting cycles usually did give higher yields. Dry-weight yields ranged from 2886 lb per acre (3233 kg/ha) per year in the 1-year, 4x5 ft (1.2 x 1.5 m) spacing to 4541 lb (5088 kg/ha) in the 4-year, 4x5 ft spacing. Survival averaged 67 percent...

  10. A comprehensively quantitative method of evaluating the impact of drought on crop yield using daily multi-scale SPEI and crop growth process model.

    PubMed

    Wang, Qianfeng; Wu, Jianjun; Li, Xiaohan; Zhou, Hongkui; Yang, Jianhua; Geng, Guangpo; An, Xueli; Liu, Leizhen; Tang, Zhenghong

    2017-04-01

    The quantitative evaluation of the impact of drought on crop yield is one of the most important aspects of agricultural water resource management. To assess the impact of drought on wheat yield, the Environmental Policy Integrated Climate (EPIC) crop growth model and the daily Standardized Precipitation Evapotranspiration Index (SPEI), which is based on daily meteorological data, are adopted in the Huang Huai Hai Plain. The winter wheat crop yields were estimated at 28 stations, after calibrating the cultivar coefficients against the experimental site data, and SPEI data were taken 11 times across the growing season from 1981 to 2010. The relationship between estimated yield and multi-scale SPEI was analyzed, and the optimum time scale of SPEI for monitoring drought during the crop growth period was determined. The reference yield was determined by averaging the yields of numerous non-drought years. From this, we propose a comprehensive quantitative method for predicting the impact of drought on wheat yields that combines the daily multi-scale SPEI and a crop growth process model. This method was tested in the Huang Huai Hai Plain. The results suggested that the calibrated EPIC model was a good predictor of crop yield in the Huang Huai Hai Plain, with a low RMSE (15.4%) between estimated and observed yield at six agrometeorological stations. The soil moisture at planting time was affected by the precipitation and evapotranspiration during the previous 90 days (about 3 months) in the Huang Huai Hai Plain. SPEI G90 was therefore adopted as the optimum time scale SPEI to identify the drought and non-drought years, and it identified a drought year in 2000. The water deficit in the year 2000 was significant, and the rate of crop yield reduction did not completely correspond with the volume of the water deficit. Our proposed comprehensive method, which quantitatively evaluates the impact of drought on crop yield, is reliable. The results of this study further our
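    The RMSE figure above compares estimated and observed yields at the validation stations. A relative RMSE of that kind can be sketched as follows; the station yields below are hypothetical, and the paper's exact validation formula is not given in the abstract:

```python
import math

def relative_rmse(estimated, observed):
    """RMSE between estimated and observed yields, as a fraction of the mean observed yield."""
    n = len(estimated)
    rmse = math.sqrt(sum((e - o) ** 2 for e, o in zip(estimated, observed)) / n)
    return rmse / (sum(observed) / n)

# Hypothetical station yields in t/ha (not the paper's data).
estimated = [5.1, 4.8, 6.0, 5.5, 4.9, 5.7]
observed = [5.4, 4.6, 6.3, 5.2, 5.1, 5.9]
print(round(relative_rmse(estimated, observed), 3))
```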

  11. Atmospheric Fluorescence Yield

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Christl, M. J.; Fountain, W. F.; Gregory, J. C.; Martens, K.; Sokolsky, P.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    Several existing and planned experiments estimate the energies of ultra-high energy cosmic rays from air showers using the atmospheric fluorescence from these showers. Accurate knowledge of the conversion from atmospheric fluorescence to energy loss by ionizing particles in the atmosphere is key to this technique. In this paper we discuss a small balloon-borne instrument to make the first in situ measurements versus altitude of the atmospheric fluorescence yield. The instrument can also be used in the lab to investigate the dependence of the fluorescence yield in air on temperature, pressure and the concentrations of other gases that present in the atmosphere. The results can be used to explore environmental effects on and improve the accuracy of cosmic ray energy measurements for existing ground-based experiments and future space-based experiments.

  12. Reliability and validity of a brief sleep questionnaire for children in Japan.

    PubMed

    Okada, Masakazu; Kitamura, Shingo; Iwadare, Yoshitaka; Tachimori, Hisateru; Kamei, Yuichi; Higuchi, Shigekazu; Mishima, Kazuo

    2017-09-15

    There is a dearth of sleep questionnaires with few items and confirmed reliability and validity that can be used for the early detection of sleep problems in children. The aim of this study was to develop a questionnaire with few items and assess its reliability and validity in both children at high risk of sleep disorders and a community population. Data for analysis were derived from two populations targeted by the Children's Sleep Habits Questionnaire (CSHQ): 178 children attending elementary school and 432 children who visited a pediatric psychiatric hospital (aged 6-12 years). The new questionnaire was constructed as a subset of the CSHQ. The newly developed short version of the sleep questionnaire for children (19 items) had an acceptable internal consistency (0.65). Using the cutoff value of the CSHQ, the total score of the new questionnaire was confirmed to have discriminant validity (27.2 ± 3.9 vs. 22.0 ± 2.1, p < 0.001) and yielded a sensitivity of 0.83 and specificity of 0.78 by receiver operator characteristic curve analysis. Total score of the new questionnaire was significantly correlated with total score (r = 0.81, p < 0.001) and each subscale score (r = 0.29-0.65, p < 0.001) of the CSHQ. The new questionnaire demonstrated an adequate reliability and validity in both high-risk children and a community population, as well as similar screening ability to the CSHQ. It could thus be a convenient instrument to detect sleep problems in children.
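    The sensitivity and specificity above come from dichotomizing the questionnaire score at the CSHQ-derived cutoff. A generic sketch of how the two quantities follow from a 2x2 confusion table; the counts below are hypothetical, chosen only for illustration:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical confusion counts at a screening cutoff (not the study's data).
sens, spec = sensitivity_specificity(tp=83, fn=17, tn=78, fp=22)
print(sens, spec)
```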

  13. Reliability and Maintainability (RAM) Training

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Packard, Michael H. (Editor)

    2000-01-01

    The theme of this manual is failure physics: the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low-cost, reliable products. In a broader sense the manual should do more. It should underscore the urgent need for mature attitudes toward reliability. Five of the chapters were originally presented as a classroom course to over 1000 Martin Marietta engineers and technicians. Another four chapters and three appendixes have been added. We begin with a view of reliability from the years 1940 to 2000. Chapter 2 starts the training material with a review of mathematics and a description of what elements contribute to product failures. The remaining chapters elucidate basic reliability theory and the disciplines that allow us to control and eliminate failures.

  14. An experiment in software reliability

    NASA Technical Reports Server (NTRS)

    Dunham, J. R.; Pierce, J. L.

    1986-01-01

    The results of a software reliability experiment conducted in a controlled laboratory setting are reported. The experiment was undertaken to gather data on software failures and is one in a series of experiments being pursued by the Fault Tolerant Systems Branch of NASA Langley Research Center to find a means of credibly performing reliability evaluations of flight control software. The experiment tests a small sample of implementations of radar tracking software having ultra-reliability requirements and uses n-version programming for error detection, and repetitive run modeling for failure and fault rate estimation. The experiment results agree with those of Nagel and Skrivan in that the program error rates suggest an approximate log-linear pattern and the individual faults occurred with significantly different error rates. Additional analysis of the experimental data raises new questions concerning the phenomenon of interacting faults. This phenomenon may provide one explanation for software reliability decay.

  15. Making statistical inferences about software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1988-01-01

    Failure times of software undergoing random debugging can be modelled as order statistics of independent but nonidentically distributed exponential random variables. Using this model, inferences can be made about current reliability and, if debugging continues, about future reliability. This model also shows the difficulty inherent in statistical verification of very highly reliable software such as that used by digital avionics in commercial aircraft.
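    The model in this abstract lends itself to a small simulation. The sketch below is illustrative only (the function name and the fault rates are invented, not from the paper): it draws one exponential detection time per latent fault and sorts them, which is exactly the order-statistics construction described.

```python
import random

def simulate_debugging(rates, seed=0):
    """Failure times of software under random debugging, modelled as
    order statistics of independent, non-identically distributed
    exponential variables: fault i is detected at an Exp(rates[i])
    time, and debugging observes the detection times in sorted order."""
    rng = random.Random(seed)
    detection_times = [rng.expovariate(r) for r in rates]
    return sorted(detection_times)

# Four latent faults; faults with small rates are hard to hit and tend
# to surface last, which is why reliability grows as debugging proceeds.
times = simulate_debugging([1.0, 0.5, 0.25, 0.1])
```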

  16. Reliability and Availability Evaluation Program Manual.

    DTIC Science & Technology

    1982-11-01

    research and development. The manual's purpose was to provide a practical method for making reliability measurements, measurements directly related to... Research, Development, Test and Evaluation. RMA: Reliability, Maintainability and Availability. R&R: Repair and Refurbishment, Repair and Replacement, etc... phenomena such as mechanical wear and chemical deterioration. Maintenance should... A number of researchers in the reliability field...

  17. Assessing Sediment Yield and the Effect of Best Management Practices on Sediment Yield Reduction for Tutuila Island, American Samoa

    NASA Astrophysics Data System (ADS)

    Leta, O. T.; Dulai, H.; El-Kadi, A. I.

    2017-12-01

    Upland soil erosion and sedimentation are the main threats to riparian and coastal reef ecosystems on Pacific islands, where, because the watersheds are small and steep, the residence time of rainfall runoff and its suspended load is short. Fagaalu Bay, on the island of Tutuila (American Samoa), has been identified as a priority watershed due to degraded coral reef condition and reduced stream water quality caused by heavy anthropogenic activity, which delivers high nutrient and sediment loads to the receiving water bodies. This study aimed to estimate the sediment yield to the Fagaalu stream and assess the impact of Best Management Practices (BMPs) on sediment yield reduction. For this, the Soil and Water Assessment Tool (SWAT) model was applied, calibrated, and validated for both daily streamflow and sediment load simulation. The model also estimated the sediment yield contributions from the existing land use types of Fagaalu and identified erosion-prone areas for introducing BMP scenarios in the watershed. Three BMP scenarios (stone bunds, retention ponds, and filter strips) were then applied to the bare (quarry area), agricultural, and shrub land use types. The bare land with quarry activity yielded the highest annual average sediment yield, 133 tons per hectare (t ha-1), followed by agriculture (26.1 t ha-1), while the lowest sediment yield, 0.2 t ha-1, was estimated for the forested part of the watershed. Additionally, the bare land area (2 ha) contributed approximately 65% of the sediment yield of the 207 ha watershed, which averages 4.0 t ha-1. The latter signifies the strong influence of anthropogenic activity on sediment yield. The BMP scenarios generally reduced the sediment yield delivered to the coastal reef of Fagaalu watershed, with stone bunds on the quarry area showing the greatest reduction of the three scenarios.
This study provides an estimate

  18. Brazil soybean yield covariance model

    NASA Technical Reports Server (NTRS)

    Callis, S. L.; Sakamoto, C.

    1984-01-01

    A model based on multiple regression was developed to estimate soybean yields for the seven soybean-growing states of Brazil. Meteorological data from these seven states were pooled, and the years 1975 to 1980 were used for model development since there was no technological trend in yields during those years. Predictor variables were derived from monthly total precipitation and monthly average temperature.
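    As a hedged illustration of the approach described (the abstract does not give the actual predictor set or data, so every number below is synthetic), a pooled multiple regression on monthly weather variables can be fit by ordinary least squares:

```python
import numpy as np

# Hypothetical example: regress state-level soybean yield (t/ha) on
# monthly total precipitation and monthly mean temperature.  The
# predictors and coefficients are illustrative, not the paper's series.
rng = np.random.default_rng(0)
n_years = 6                           # 1975-1980 fit window
X = np.column_stack([
    np.ones(n_years),                 # intercept
    rng.uniform(80, 200, n_years),    # e.g. January precipitation (mm)
    rng.uniform(20, 28, n_years),     # e.g. January mean temp (deg C)
])
y = 1.5 + 0.004 * X[:, 1] - 0.02 * X[:, 2] + rng.normal(0, 0.05, n_years)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares
predicted = X @ beta
```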

  19. Reliability and Cost Impacts for Attritable Systems

    DTIC Science & Technology

    2017-03-23

    and cost risk metrics to convey the value of reliability and reparability trades. Investigation of the benefit of trading system reparability...illustrates the benefit that reliability engineering can have on total cost. 2.3.1 Contexts of System Reliability. Hogge (2012) identifies two distinct...reliability and reparability trades. Investigation of the benefit of trading system reparability shows a marked increase in cost risk. Yet, trades in

  20. Probabilistic Structural Analysis and Reliability Using NESSUS With Implemented Material Strength Degradation Model

    NASA Technical Reports Server (NTRS)

    Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    This project included both research and education objectives. The goal of this project was to advance innovative research and education in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI) along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the culmination of work performed using the probabilistic finite element program NESSUS and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis encompassing the effects of high temperature and high-cycle fatigue yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness. The reliability did not change significantly for a change in distribution type except for a change in

  1. Mapping quantitative trait loci with additive effects and additive x additive epistatic interactions for biomass yield, grain yield, and straw yield using a doubled haploid population of wheat (Triticum aestivum L.).

    PubMed

    Li, Z K; Jiang, X L; Peng, T; Shi, C L; Han, S X; Tian, B; Zhu, Z L; Tian, J C

    2014-02-28

    Biomass yield is one of the most important traits for wheat (Triticum aestivum L.)-breeding programs. Increasing the yield of the aerial parts of wheat varieties will be an integral component of future wheat improvement; however, little is known regarding the genetic control of aerial part yield. A doubled haploid population, comprising 168 lines derived from a cross between two winter wheat cultivars, 'Huapei 3' (HP3) and 'Yumai 57' (YM57), was investigated. Quantitative trait loci (QTL) for total biomass yield, grain yield, and straw yield were determined for additive effects and additive x additive epistatic interactions using the QTLNetwork 2.0 software based on the mixed-linear model. Thirteen QTL were determined to have significant additive effects for the three yield traits, of which six also exhibited epistatic effects. Eleven significant additive x additive interactions were detected, of which seven occurred between QTL showing epistatic effects only, two occurred between QTL showing epistatic effects and additive effects, and two occurred between QTL with additive effects. These QTL explained 1.20 to 10.87% of the total phenotypic variation. The QTL with an allele originating from YM57 on chromosome 4B and another QTL contributed by HP3 alleles on chromosome 4D were simultaneously detected on the same or adjacent chromosome intervals for the three traits in two environments. Most of the repeatedly detected QTL across environments were not significant (P > 0.05). These results have implications for selection strategies in wheat biomass yield and for increasing the yield of the aerial part of wheat.

  2. Manual muscle testing and hand-held dynamometry in people with inflammatory myopathy: An intra- and interrater reliability and validity study

    PubMed Central

    Baschung Pfister, Pierrette; Sterkele, Iris; Maurer, Britta; de Bie, Rob A.; Knols, Ruud H.

    2018-01-01

    Manual muscle testing (MMT) and hand-held dynamometry (HHD) are commonly used in people with inflammatory myopathy (IM), but their clinimetric properties have not yet been sufficiently studied. To evaluate the reliability and validity of MMT and HHD, maximum isometric strength was measured in eight muscle groups across three measurement events. To evaluate the reliability of HHD, intra-class correlation coefficients (ICC), standard errors of measurement (SEM) and smallest detectable changes (SDC) were calculated. To measure the reliability of MMT, linear Cohen's kappa was computed for single muscle groups and ICC for the total score. Additionally, correlations between MMT8 and HHD were evaluated with Spearman correlation coefficients. Fifty people with myositis (56±14 years, 76% female) were included in the study. Intra- and interrater reliability of HHD yielded excellent ICCs (0.75–0.97) for all muscle groups, except for interrater reliability of ankle extension (0.61). The corresponding SEMs% ranged from 8 to 28% and the SDCs% from 23 to 65%. The MMT8 total score revealed excellent intra- and interrater reliability (ICC>0.9). Intrarater reliability of single muscle groups was substantial for shoulder and hip abduction, elbow and neck flexion, and hip extension (0.64–0.69); moderate for wrist (0.53) and knee extension (0.49); and fair for ankle extension (0.35). Interrater reliability was moderate for neck flexion (0.54) and hip abduction (0.44); fair for shoulder abduction, elbow flexion, wrist and ankle extension (0.20–0.33); and slight for knee extension (0.08). Correlations between the two tests were low for wrist, knee, ankle, and hip extension; moderate for elbow flexion, neck flexion and hip abduction; and good for shoulder abduction. In conclusion, the MMT8 total score is a reliable assessment of general muscle weakness in people with myositis, but not of single muscle groups. In contrast, our results confirm that HHD can be recommended to evaluate
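    The reliability statistics named above follow standard formulas: SEM = SD × √(1 − ICC) and SDC95 = 1.96 × √2 × SEM. A minimal sketch (the SD and ICC values below are hypothetical, not the study's data):

```python
import math

def sem(sd, icc):
    """Standard error of measurement from the between-subject SD of
    the measure and a reliability coefficient (ICC)."""
    return sd * math.sqrt(1.0 - icc)

def sdc95(sem_value):
    """Smallest detectable change at 95% confidence: the change a
    single subject must exceed to be considered real measurement-wise."""
    return 1.96 * math.sqrt(2.0) * sem_value

# Hypothetical strength data: SD = 40 N across subjects, ICC = 0.91.
s = sem(40.0, 0.91)   # about 12 N
change = sdc95(s)     # about 33 N
```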

  3. The design organization test: further demonstration of reliability and validity as a brief measure of visuospatial ability.

    PubMed

    Killgore, William D S; Gogel, Hannah

    2014-01-01

    Neuropsychological assessments are frequently time-consuming and fatiguing for patients. Brief screening evaluations may reduce test duration and allow more efficient use of time by permitting greater attention toward neuropsychological domains showing probable deficits. The Design Organization Test (DOT) was initially developed as a 2-min paper-and-pencil alternative for the Block Design (BD) subtest of the Wechsler scales. Although initially validated for clinical neurologic patients, we sought to further establish the reliability and validity of this test in a healthy, more diverse population. Two alternate versions of the DOT and the Wechsler Abbreviated Scale of Intelligence (WASI) were administered to 61 healthy adult participants. The DOT showed high alternate forms reliability (r = .90-.92), and the two versions yielded equivalent levels of performance. The DOT was highly correlated with BD (r = .76-.79) and was significantly correlated with all subscales of the WASI. The DOT proved useful when used in lieu of BD in the calculation of WASI IQ scores. Findings support the reliability and validity of the DOT as a measure of visuospatial ability and suggest its potential worth as an efficient estimate of intellectual functioning in situations where lengthier tests may be inappropriate or unfeasible.

  4. Reliability Growth in Space Life Support Systems

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.

    2014-01-01

    A hardware system's failure rate often increases over time due to wear and aging, but not always. Some systems instead show reliability growth, a decreasing failure rate with time, due to effective failure analysis and remedial hardware upgrades. Reliability grows when failure causes are removed by improved design. A mathematical reliability growth model allows the reliability growth rate to be computed from the failure data. The space shuttle was extensively maintained, refurbished, and upgraded after each flight and it experienced significant reliability growth during its operational life. In contrast, the International Space Station (ISS) is much more difficult to maintain and upgrade and its failure rate has been constant over time. The ISS Carbon Dioxide Removal Assembly (CDRA) reliability has slightly decreased. Failures on ISS and with the ISS CDRA continue to be a challenge.
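    The abstract does not name the growth model used; a common choice for this kind of analysis is the Crow-AMSAA (power-law NHPP) model, in which cumulative failures follow N(t) = λt^β and β < 1 indicates reliability growth. A sketch of its standard maximum-likelihood fit, with invented failure times:

```python
import math

def crow_amsaa_mle(failure_times, total_time):
    """Fit N(t) = lam * t**beta to failure-truncated data.
    beta < 1: decreasing failure intensity (reliability growth);
    beta > 1: wear-out; beta = 1: constant failure rate."""
    n = len(failure_times)
    beta = n / sum(math.log(total_time / t) for t in failure_times)
    lam = n / total_time ** beta
    return lam, beta

# Invented data: failures bunched early in the observation window,
# the signature of a program whose fixes are removing failure causes.
lam, beta = crow_amsaa_mle([5.0, 12.0, 30.0, 80.0, 200.0], total_time=400.0)
```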

  5. Ultra Reliability Workshop Introduction

    NASA Technical Reports Server (NTRS)

    Shapiro, Andrew A.

    2006-01-01

    This plan is the accumulation of substantial work by a large number of individuals. The Ultra-Reliability team consists of representatives from each center who have agreed to champion the program and be the focal point for their center. A number of individuals from NASA, government agencies (including the military), universities, industry and non-governmental organizations also contributed significantly to this effort. Most of their names may be found on the Ultra-Reliability PBMA website.

  6. Problematics of Reliability of Road Rollers

    NASA Astrophysics Data System (ADS)

    Stawowiak, Michał; Kuczaj, Mariusz

    2018-06-01

    This article refers to the reliability of road rollers used in a selected roadworks company. Information on the method of road rollers service and how the service affects the reliability of these rollers is presented. Attention was paid to the process of the implemented maintenance plan with regard to the machine's operational time. The reliability of road rollers was analyzed by determining and interpreting readiness coefficients.

  7. MEMS Reliability Assurance Activities at JPL

    NASA Technical Reports Server (NTRS)

    Kayali, S.; Lawton, R.; Stark, B.

    2000-01-01

    An overview of Microelectromechanical Systems (MEMS) reliability assurance and qualification activities at JPL is presented along with a discussion of characterization of MEMS structures implemented on single crystal silicon, polycrystalline silicon, CMOS, and LIGA processes. Additionally, common failure modes and mechanisms affecting MEMS structures, including radiation effects, are discussed. Common reliability and qualification practices contained in the MEMS Reliability Assurance Guideline are also presented.

  8. Structural reliability assessment capability in NESSUS

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.

    1992-01-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  9. Structural reliability assessment capability in NESSUS

    NASA Astrophysics Data System (ADS)

    Millwater, H.; Wu, Y.-T.

    1992-07-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessed. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  10. Validity and Reliability of an on-Court Fitness Test for Assessing and Monitoring Aerobic Fitness in Squash.

    PubMed

    James, Carl Alexander; Vallejo, Florencio Tenllado; Kantebeen, Melvin; Farra, Saro

    2018-02-14

    Current on-court assessments of aerobic fitness in squash are not designed to yield a wealth of physiological data. Moreover, tests may require complex computer equipment or involve simulated racket strokes, which are difficult to standardize at high intensities. This study investigated the validity and reliability of a squash-specific fitness test which can yield both a standalone performance score, as well as pertinent physiological markers such as V̇O2max, the lactate turnpoint and oxygen cost, in a sport-specific environment. Eight national squash players completed three tests in a counter-balanced order; an incremental laboratory treadmill test (LAB) and two on-court fitness tests (ST) that involved repeated shuttle runs at increasing speeds. V̇O2max during ST was agreeable with LAB (Typical error [TE]=3.3 mL·kg⁻¹·min⁻¹, r=0.79). The mean bias between LAB and ST was 2.5 mL·kg⁻¹·min⁻¹. There were no differences in maximum heart rate, post exercise blood lactate concentration or end of test RPE between LAB and ST (p>0.05). The ST was highly reliable, with 74 (10) laps completed in ST1 and 75 (12) laps in ST2 (mean bias=1 lap, TE=3 laps, r=0.97). Physiological markers were also reliable, including V̇O2max (TE=1.5 mL·kg⁻¹·min⁻¹, r=0.95), the lap number at 4 mmol (TE=4 laps, r=0.77) and average V̇O2 across the first 4 stages (TE=0.94 mL·kg⁻¹·min⁻¹, r=0.95). We observed good agreement between LAB and ST for assessing V̇O2max and between both on-court trials for assessing test performance and selected physiological markers. Consequently, we recommend this test for monitoring training adaptations and prescribing individualized training in elite squash players.
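    The typical error (TE) and mean bias reported above are conventionally computed from paired trials as the SD of the between-trial differences divided by √2, and the mean difference, respectively. A minimal sketch with invented lap counts (not the study's data):

```python
import math
from statistics import mean, stdev

def typical_error(trial1, trial2):
    """Typical error of measurement: SD of the between-trial
    differences divided by sqrt(2) (Hopkins' convention)."""
    diffs = [b - a for a, b in zip(trial1, trial2)]
    return stdev(diffs) / math.sqrt(2.0)

def mean_bias(trial1, trial2):
    """Systematic shift between trial 1 and trial 2."""
    return mean(b - a for a, b in zip(trial1, trial2))

# Invented lap counts for three players over two on-court tests.
te = typical_error([70, 74, 78], [72, 75, 80])
bias = mean_bias([70, 74, 78], [72, 75, 80])
```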

  11. 46 CFR 169.619 - Reliability.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Reliability. 169.619 Section 169.619 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) NAUTICAL SCHOOLS SAILING SCHOOL VESSELS Machinery and Electrical Steering Systems § 169.619 Reliability. (a) Except where the OCMI judges it impracticable, the...

  12. 46 CFR 169.619 - Reliability.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 7 2011-10-01 2011-10-01 false Reliability. 169.619 Section 169.619 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) NAUTICAL SCHOOLS SAILING SCHOOL VESSELS Machinery and Electrical Steering Systems § 169.619 Reliability. (a) Except where the OCMI judges it impracticable, the...

  13. QTL mapping of root traits in phosphorus-deficient soils reveals important genomic regions for improving NDVI and grain yield in barley.

    PubMed

    Gong, Xue; McDonald, Glenn

    2017-09-01

    Major QTLs for root rhizosheath size are not correlated with grain yield or yield response to phosphorus. Important QTLs were found to improve phosphorus efficiency. Root traits are important for phosphorus (P) acquisition, but they are often difficult to characterize and their breeding values are seldom assessed under field conditions. This has shed doubts on using seedling-based criteria of root traits to select and breed for P efficiency. Eight root traits were assessed under controlled conditions in a barley doubled-haploid population in soils differing in P levels. The population was also phenotyped for grain yield, normalized difference vegetation index (NDVI), grain P uptake and P utilization efficiency at maturity (PutE GY ) under field conditions. Several quantitative traits loci (QTLs) from the root screening and the field trials were co-incident. QTLs for root rhizosheath size and root diameter explained the highest phenotypic variation in comparison to QTLs for other root traits. Shared QTLs were found between root diameter and grain yield, and total root length and PutE GY . A common major QTL for rhizosheath size and NDVI was mapped to the HvMATE gene marker on chromosome 4H. Collocations between major QTLs for NDVI and grain yield were detected on chromosomes 6H and 7H. When results from BIP and MET were combined, QTLs detected for grain yield were also those QTLs found for NDVI. QTLs qGY5H, qGY6H and qGY7Hb on 7H were robust QTLs in improving P efficiency. A selection of multiple loci may be needed to optimize the breeding outcomes due to the QTL x Environment interaction. We suggest that rhizosheath size alone is not a reliable trait to predict P efficiency or grain yield.

  14. Test-retest reliability of memory task functional magnetic resonance imaging in Alzheimer disease clinical trials.

    PubMed

    Atri, Alireza; O'Brien, Jacqueline L; Sreenivasan, Aishwarya; Rastegar, Sarah; Salisbury, Sibyl; DeLuca, Amy N; O'Keefe, Kelly M; LaViolette, Peter S; Rentz, Dorene M; Locascio, Joseph J; Sperling, Reisa A

    2011-05-01

    To examine the feasibility and test-retest reliability of encoding-task functional magnetic resonance imaging (fMRI) in mild Alzheimer disease (AD). Randomized, double-blind, placebo-controlled study. Memory clinical trials unit. We studied 12 patients with mild AD (mean [SEM] Mini-Mental State Examination score, 24.0 [0.7]; mean Clinical Dementia Rating score, 1.0) who had been taking donepezil hydrochloride for more than 6 months from the placebo arm of a larger 24-week study (n = 24, 4 scans on weeks 0, 6, 12, and 24, respectively). Placebo and 3 face-name, paired-associate encoding, block-design blood oxygenation level-dependent fMRI scans in 12 weeks. We performed whole-brain t maps (P < .001, 5 contiguous voxels) and hippocampal regions-of-interest analyses of extent (percentage of active voxels) and magnitude (percentage of signal change) for novel-greater-than-repeated face-name contrasts. We also calculated intraclass correlation coefficients and power estimates for hippocampal regions of interest. Task tolerability and data yield were high (95 of 96 scans yielded favorable-quality data). Whole-brain maps were stable. Right and left hippocampal regions-of-interest intraclass correlation coefficients were 0.59 to 0.87 and 0.67 to 0.74, respectively. To detect 25.0% to 50.0% changes in week-0 to week-12 hippocampal activity using left-right extent or right magnitude with 80.0% power (2-sided α = .05) requires 14 to 51 patients. Using left magnitude requires 125 patients because of relatively small signal to variance ratios. Encoding-task fMRI was successfully implemented in a single-site, 24-week, AD randomized controlled trial. Week 0 to 12 whole-brain t maps were stable, and test-retest reliability of hippocampal fMRI measures ranged from moderate to substantial. Right hippocampal magnitude may be the most promising of these candidate measures in a leveraged context. These initial estimates of test-retest reliability and power justify evaluation of

  15. The Berg Balance Scale has high intra- and inter-rater reliability but absolute reliability varies across the scale: a systematic review.

    PubMed

    Downs, Stephen; Marquez, Jodie; Chiarelli, Pauline

    2013-06-01

    What is the intra-rater and inter-rater relative reliability of the Berg Balance Scale? What is the absolute reliability of the Berg Balance Scale? Does the absolute reliability of the Berg Balance Scale vary across the scale? Systematic review with meta-analysis of reliability studies. Any clinical population that has undergone assessment with the Berg Balance Scale. Relative intra-rater reliability, relative inter-rater reliability, and absolute reliability. Eleven studies involving 668 participants were included in the review. The relative intra-rater reliability of the Berg Balance Scale was high, with a pooled estimate of 0.98 (95% CI 0.97 to 0.99). Relative inter-rater reliability was also high, with a pooled estimate of 0.97 (95% CI 0.96 to 0.98). A ceiling effect of the Berg Balance Scale was evident for some participants. In the analysis of absolute reliability, all of the relevant studies had an average score of 20 or above on the 0 to 56 point Berg Balance Scale. The absolute reliability across this part of the scale, as measured by the minimal detectable change with 95% confidence, varied between 2.8 points and 6.6 points. The Berg Balance Scale has a higher absolute reliability when close to 56 points due to the ceiling effect. We identified no data that estimated the absolute reliability of the Berg Balance Scale among participants with a mean score below 20 out of 56. The Berg Balance Scale has acceptable reliability, although it might not detect modest, clinically important changes in balance in individual subjects. The review was only able to comment on the absolute reliability of the Berg Balance Scale among people with moderately poor to normal balance. Copyright © 2013 Australian Physiotherapy Association. All rights reserved.

  16. Partitioning potential fish yields from the Great Lakes

    USGS Publications Warehouse

    Loftus, D.H.; Olver, C.H.; Brown, Edward H.; Colby, P.J.; Hartman, Wilbur L.; Schupp, D.H.

    1987-01-01

    We proposed and implemented procedures for partitioning future fish yields from the Great Lakes into taxonomic components. These projections are intended as guidelines for Great Lakes resource managers and scientists. Attainment of projected yields depends on restoration of stable fish communities containing some large piscivores that will use prey efficiently, continuation of control of the sea lamprey (Petromyzon marinus), and restoration of high-quality fish habitat. Because Great Lakes fish communities were harmonic before their collapse, we used their historic yield properties as part of the basis for projecting potential yields of rehabilitated communities. This use is qualified, however, because of possible inaccuracies in the wholly commercial yield data, the presence now of greatly expanded sport fisheries that affect yield composition and magnitude, and some possibly irreversible changes since the 1950s in the various fish communities themselves. We predict that total yields from Lakes Superior, Huron, and Ontario will be increased through rehabilitation, while those from Lakes Michigan and Erie will decline. Salmonines and coregonines will dominate future yields from the upper lakes. The Lake Erie fishery will continue to yield mostly rainbow smelt (Osmerus mordax), but the relative importance of percids, especially of walleye (Stizostedion vitreum vitreum) will increase. In Lake Ontario, yields of salmonines will be increased. Managers will have to apply the most rigorous management strictures to major predator species.

  17. Visual conspicuity: a new simple standard, its reliability, validity and applicability.

    PubMed

    Wertheim, A H

    2010-03-01

    A general standard for quantifying conspicuity is described. It derives from a simple and easy method to quantitatively measure the visual conspicuity of an object. The method stems from the theoretical view that the conspicuity of an object is not a property of that object, but describes the degree to which the object is perceptually embedded in, i.e. laterally masked by, its visual environment. First, three variations of a simple method to measure the strength of such lateral masking are described and empirical evidence for its reliability and its validity is presented, as are several tests of predictions concerning the effects of viewing distance and ambient light. It is then shown how this method yields a conspicuity standard, expressed as a number, which can be made part of a rule of law, and which can be used to test whether or not, and to what extent, the conspicuity of a particular object, e.g. a traffic sign, meets a predetermined criterion. An additional feature is that, when used under different ambient light conditions, the method may also yield an index of the amount of visual clutter in the environment. Taken together, the evidence illustrates the method's applicability in both the laboratory and in real-life situations. STATEMENT OF RELEVANCE: This paper concerns a proposal for a new method to measure visual conspicuity, yielding a numerical index that can be used in a rule of law. It is of importance to ergonomists and human factors specialists who are asked to measure the conspicuity of an object, such as a traffic or rail-road sign, or any other object. The new method is simple and circumvents the need to perform elaborate (search) experiments and thus has great relevance as a simple tool for applied research.

  18. OP-Yield Version 1.00 user's guide

    Treesearch

    Martin W. Ritchie; Jianwei Zhang

    2018-01-01

    OP-Yield is a Microsoft Excel™ spreadsheet with 14 specified user inputs to derive custom yield estimates using the original Oliver and Powers (1978) functions as the foundation. It presents yields for ponderosa pine (Pinus ponderosa Lawson & C. Lawson) plantations in northern California. The basic model forms for dominant and...

  19. Absolute quantum yield measurement of powder samples.

    PubMed

    Moreno, Luis A

    2012-05-12

    Measurement of fluorescence quantum yield has become an important tool in the search for new solutions in the development, evaluation, quality control and research of illumination, AV equipment, organic EL material, films, filters and fluorescent probes for bio-industry. Quantum yield is calculated as the ratio of the number of photons absorbed, to the number of photons emitted by a material. The higher the quantum yield, the better the efficiency of the fluorescent material. For the measurements featured in this video, we will use the Hitachi F-7000 fluorescence spectrophotometer equipped with the Quantum Yield measuring accessory and Report Generator program. All the information provided applies to this system. Measurement of quantum yield in powder samples is performed following these steps: 1. Generation of instrument correction factors for the excitation and emission monochromators. This is an important requirement for the correct measurement of quantum yield. It has been performed in advance for the full measurement range of the instrument and will not be shown in this video due to time limitations. 2. Measurement of integrating sphere correction factors. The purpose of this step is to take into consideration reflectivity characteristics of the integrating sphere used for the measurements. 3. Reference and Sample measurement using direct excitation and indirect excitation. 4. Quantum Yield calculation using Direct and Indirect excitation. Direct excitation is when the sample is facing directly the excitation beam, which would be the normal measurement setup. However, because we use an integrating sphere, a portion of the emitted photons resulting from the sample fluorescence are reflected by the integrating sphere and will re-excite the sample, so we need to take into consideration indirect excitation. 
This is accomplished by measuring the sample placed in the port facing the emission monochromator, calculating the indirect quantum yield, and correcting the direct quantum yield accordingly.
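    The ratio at the heart of the procedure above can be sketched numerically. This is a toy illustration, not Hitachi's algorithm: the photon counts are invented, and absorption is approximated as the drop in scattered excitation light relative to a blank.

```python
# Quantum yield as the ratio of emitted to absorbed photons.
# All photon counts below are hypothetical integrated spectra, not instrument data.

def quantum_yield(absorbed_photons, emitted_photons):
    """QY = photons emitted / photons absorbed."""
    return emitted_photons / absorbed_photons

# Direct excitation: absorption estimated as blank minus sample excitation counts.
blank_excitation = 1_000_000   # scattered excitation photons, empty sphere port
sample_excitation = 600_000    # scattered excitation photons with sample in beam
sample_emission = 300_000      # integrated fluorescence photons

absorbed = blank_excitation - sample_excitation   # 400,000 photons absorbed
qy_direct = quantum_yield(absorbed, sample_emission)
print(qy_direct)  # 0.75
```

    The indirect-excitation correction would then adjust `qy_direct` using a second measurement with the sample out of the direct beam, as the abstract describes.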

  20. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 2: HARP tutorial

    NASA Technical Reports Server (NTRS)

    Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Bavuso, Salvatore J.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. The Hybrid Automated Reliability Predictor (HARP) tutorial provides insight into HARP modeling techniques and the interactive textual prompting input language via a step-by-step explanation and demonstration of HARP's fault occurrence/repair model and the fault/error handling models. Example applications are worked in their entirety and the HARP tabular output data are presented for each. Simple models are presented at first with each succeeding example demonstrating greater modeling power and complexity. This document is not intended to present the theoretical and mathematical basis for HARP.

  1. Integrating Reliability Analysis with a Performance Tool

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Ulrey, Michael

    1995-01-01

    A large number of commercial simulation tools support performance-oriented studies of complex computer and communication systems. Reliability of these systems, when desired, must be obtained by remodeling the system in a different tool. This has obvious drawbacks: (1) substantial extra effort is required to create the reliability model; (2) through modeling error the reliability model may not reflect precisely the same system as the performance model; (3) as the performance model evolves one must continuously reevaluate the validity of assumptions made in that model. In this paper we describe an approach, and a tool that implements this approach, for integrating a reliability analysis engine into a production-quality, simulation-based performance modeling tool, and for modeling within such an integrated tool. The integrated tool allows one to use the same modeling formalisms to conduct both performance and reliability studies. We describe how the reliability analysis engine is integrated into the performance tool, describe the extensions made to the performance tool to support the reliability analysis, and consider the tool's performance.

  2. Interobserver Reliability of the Berlin ARDS Definition and Strategies to Improve the Reliability of ARDS Diagnosis.

    PubMed

    Sjoding, Michael W; Hofer, Timothy P; Co, Ivan; Courey, Anthony; Cooke, Colin R; Iwashyna, Theodore J

    2018-02-01

    Failure to reliably diagnose ARDS may be a major driver of negative clinical trials and of underrecognition and undertreatment in clinical practice. We sought to examine the interobserver reliability of the Berlin ARDS definition and examine strategies for improving the reliability of ARDS diagnosis. Two hundred five patients with hypoxic respiratory failure from four ICUs were reviewed independently by three clinicians, who evaluated whether patients had ARDS, the diagnostic confidence of the reviewers, whether patients met individual ARDS criteria, and the time when criteria were met. Interobserver reliability of an ARDS diagnosis was "moderate" (kappa = 0.50; 95% CI, 0.40-0.59). Sixty-seven percent of diagnostic disagreements between clinicians reviewing the same patient were explained by differences in how chest imaging studies were interpreted, with other ARDS criteria contributing less (identification of ARDS risk factor, 15%; cardiac edema/volume overload exclusion, 7%). Combining the independent reviews of three clinicians can increase reliability to "substantial" (kappa = 0.75; 95% CI, 0.68-0.80). When a clinician diagnosed ARDS with "high confidence," all other clinicians agreed with the diagnosis in 72% of reviews. There was close agreement between clinicians about the time when a patient met all ARDS criteria if ARDS developed within the first 48 hours of hospitalization (median difference, 5 hours). The reliability of the Berlin ARDS definition is moderate, driven primarily by differences in chest imaging interpretation. Combining independent reviews by multiple clinicians or improving methods to identify bilateral infiltrates on chest imaging are important strategies for improving the reliability of ARDS diagnosis. Copyright © 2017 American College of Chest Physicians. All rights reserved.

  3. Criteria for Yielding of Dispersion-Strengthened Alloys

    NASA Technical Reports Server (NTRS)

    Ansell, G. S.; Lenel, F. V.

    1960-01-01

    A dislocation model is presented in order to account for the yield behavior of alloys with a finely dispersed second phase. The criterion for yielding used in the model is that appreciable yielding occurs in these alloys when the shear stress due to piled-up groups of dislocations is sufficient to fracture or plastically deform the dispersed second-phase particles, relieving the back stress on the dislocation sources. Equations derived on the basis of this model predict that the yield stress of the alloys varies as the reciprocal square root of the mean free path between dispersed particles. Experimental data are presented for several SAP-type alloys, precipitation-hardened alloys, and steels, which are in good agreement with the yield strength variation as a function of dispersion spacing predicted by this theoretical treatment.
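    The predicted spacing dependence can be illustrated with a minimal numeric sketch; the friction stress (50 MPa) and constant k (200 MPa·µm^0.5) below are hypothetical values, not taken from the paper.

```python
import math

def yield_stress(mean_free_path_um, sigma_0, k):
    """Model prediction: yield stress varies as the reciprocal
    square root of the mean free path between dispersed particles."""
    return sigma_0 + k / math.sqrt(mean_free_path_um)

# Hypothetical constants: friction stress 50 MPa, k = 200 MPa*um^0.5.
# Reducing the particle spacing from 4 um to 1 um raises the strength:
print(yield_stress(4.0, 50.0, 200.0))  # 150.0 MPa
print(yield_stress(1.0, 50.0, 200.0))  # 250.0 MPa
```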

  4. 75 FR 72664 - System Personnel Training Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-26

    ...Under section 215 of the Federal Power Act, the Commission approves two Personnel Performance, Training and Qualifications (PER) Reliability Standards, PER-004-2 (Reliability Coordination--Staffing) and PER-005-1 (System Personnel Training), submitted to the Commission for approval by the North American Electric Reliability Corporation, the Electric Reliability Organization certified by the Commission. The approved Reliability Standards require reliability coordinators, balancing authorities, and transmission operators to establish a training program for their system operators, verify each of their system operators' capability to perform tasks, and provide emergency operations training to every system operator. The Commission also approves NERC's proposal to retire two existing PER Reliability Standards that are replaced by the standards approved in this Final Rule.

  5. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  6. A Comparison of Reliability and Construct Validity between the Original and Revised Versions of the Rosenberg Self-Esteem Scale

    PubMed Central

    Nahathai, Wongpakaran

    2012-01-01

    Objective The Rosenberg Self-Esteem Scale (RSES) is a widely used instrument that has been tested for reliability and validity in many settings; however, some negatively worded items appear to have caused it to show low reliability in a number of studies. In this study, we revised one negative item that had produced the worst outcome in previous studies in terms of the structure of the scale, then re-analyzed the new version for its reliability and construct validity, comparing it to the original version with respect to fit indices. Methods In total, 851 students from Chiang Mai University (mean age: 19.51±1.7, 57% of whom were female) participated in this study. Of these, 664 students completed the Thai version of the original RSES, containing five positively worded and five negatively worded items, while 187 students used the revised version, containing six positively worded and four negatively worded items. Confirmatory factor analysis was applied, using a uni-dimensional model with method effects and a correlated uniqueness approach. Results The revised version showed the same level of reliability (good) as the original, but yielded a better model fit. The revised RSES demonstrated excellent fit statistics, with χ2=29.19 (df=19, n=187, p=0.063), GFI=0.970, TFI=0.969, NFI=0.964, CFI=0.987, SRMR=0.040 and RMSEA=0.054. Conclusion The revised version of the Thai RSES demonstrated an equivalent level of reliability but better construct validity when compared to the original. PMID:22396685

  7. A comparison of reliability and construct validity between the original and revised versions of the Rosenberg Self-Esteem Scale.

    PubMed

    Wongpakaran, Tinakon; Tinakon, Wongpakaran; Wongpakaran, Nahathai; Nahathai, Wongpakaran

    2012-03-01

    The Rosenberg Self-Esteem Scale (RSES) is a widely used instrument that has been tested for reliability and validity in many settings; however, some negatively worded items appear to have caused it to show low reliability in a number of studies. In this study, we revised one negative item that had produced the worst outcome in previous studies in terms of the structure of the scale, then re-analyzed the new version for its reliability and construct validity, comparing it to the original version with respect to fit indices. In total, 851 students from Chiang Mai University (mean age: 19.51±1.7, 57% of whom were female) participated in this study. Of these, 664 students completed the Thai version of the original RSES, containing five positively worded and five negatively worded items, while 187 students used the revised version, containing six positively worded and four negatively worded items. Confirmatory factor analysis was applied, using a uni-dimensional model with method effects and a correlated uniqueness approach. The revised version showed the same level of reliability (good) as the original, but yielded a better model fit. The revised RSES demonstrated excellent fit statistics, with χ²=29.19 (df=19, n=187, p=0.063), GFI=0.970, TFI=0.969, NFI=0.964, CFI=0.987, SRMR=0.040 and RMSEA=0.054. The revised version of the Thai RSES demonstrated an equivalent level of reliability but better construct validity when compared to the original.
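    As a consistency check, the reported RMSEA can be recomputed from the abstract's χ², df, and n with the standard formula RMSEA = sqrt(max(χ² − df, 0) / (df·(n − 1))).

```python
import math

def rmsea(chi2, df, n):
    """Root mean square error of approximation from a chi-square fit statistic."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Values reported for the revised Thai RSES: chi2 = 29.19, df = 19, n = 187
value = rmsea(29.19, 19, 187)
print(round(value, 3))  # 0.054, matching the reported RMSEA
```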

  8. Reliability and validity of a Japanese version of the Cambridge depersonalization scale as a screening instrument for depersonalization disorder.

    PubMed

    Sugiura, Miyuki; Hirosawa, Masataka; Tanaka, Sumio; Nishi, Yasunobu; Yamada, Yasuyuki; Mizuno, Motoki

    2009-06-01

    The Cambridge Depersonalization Scale (CDS) is an instrument that has demonstrated reliability and validity in some countries for use in detecting depersonalization disorder under clinical conditions, but not yet in Japan under non-psychiatric conditions. The purposes of this study were to develop a Japanese version of the CDS (J-CDS) and to examine its reliability and validity as an instrument for screening depersonalization disorder under non-clinical conditions. The CDS was translated from English into Japanese and then back-translated into English by a native English-speaking American. After developing the J-CDS, we examined its reliability and validity. Questionnaires composed of the J-CDS, the Dissociative Experience Scale (DES), the Zung self-rating scale and the Maudsley Obsessional-Compulsive Inventory were administered to 59 participants (12 patients with depersonalization disorder, 11 individuals who had recovered from depersonalization and 36 healthy controls). Cronbach's alpha and split-half reliability were 0.94 and 0.93, respectively. The J-CDS score in the depersonalization group was significantly higher than in the healthy control group. The J-CDS score was significantly correlated with scores of total DES, and DES-depersonalization. The best compromise between the true positive and false negative rate was at a cut-off point of 60, yielding a sensitivity of 1.00 and a specificity of 0.96. In this study, the J-CDS showed good reliability and validity. The best cut-off point, when using it to distinguish individuals with depersonalization disorder from individuals without psychiatric disorders, is 60 points.
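    The cut-off analysis described above reduces to computing sensitivity and specificity at a threshold; a minimal sketch on invented J-CDS-style scores (not the study's data):

```python
def sensitivity_specificity(scores, labels, cutoff):
    """labels: True = has the disorder. Predict positive when score >= cutoff."""
    tp = sum(1 for s, y in zip(scores, labels) if y and s >= cutoff)
    fn = sum(1 for s, y in zip(scores, labels) if y and s < cutoff)
    tn = sum(1 for s, y in zip(scores, labels) if not y and s < cutoff)
    fp = sum(1 for s, y in zip(scores, labels) if not y and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical totals: patients tend to score at or above 60, controls below.
scores = [180, 95, 72, 61, 40, 25, 12, 58]
labels = [True, True, True, True, False, False, False, False]
sens, spec = sensitivity_specificity(scores, labels, cutoff=60)
print(sens, spec)  # 1.0 1.0 on this toy data
```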

  9. Flight control electronics reliability/maintenance study

    NASA Technical Reports Server (NTRS)

    Dade, W. W.; Edwards, R. H.; Katt, G. T.; Mcclellan, K. L.; Shomber, H. A.

    1977-01-01

    Collection and analysis of data are reported that concern the reliability and maintenance experience of flight control system electronics currently in use on passenger-carrying jet aircraft. Two airlines' B-747 airplane fleets were analyzed to assess the component reliability, system functional reliability, and achieved availability of the CAT II configuration flight control system. Also assessed were the costs generated by this system in the categories of spare equipment, schedule irregularity, and line and shop maintenance. The results indicate that although there is a marked difference in the geographic location and route pattern between the airlines studied, there is a close similarity in the reliability and the maintenance costs associated with the flight control electronics.

  10. The Validity of Reliability Assessments.

    ERIC Educational Resources Information Center

    Basch, Charles E.; Gold, Robert S.

    1985-01-01

    Reliability guides research design and is used as a standard for judging the credibility of findings and inferences. Using data gathered in a school health education curriculum evaluation as an example, possible errors in hypothesis testing are examined. Appropriateness of internal consistency as a measure of reliability is discussed and…

  11. NASA Applications and Lessons Learned in Reliability Engineering

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Fuller, Raymond P.

    2011-01-01

    Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision-making processes. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses several reliability engineering applications used by NASA in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics-based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of the case studies discussed are reliability-based life-limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for the Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbopump development, the impact of ET foam reliability on Space Shuttle system risk, and reliability-based Space Shuttle upgrades for safety. Special attention is given in this paper to the physics-based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware, including their potential use in a research and technology development environment.

  12. Reliability evaluation methodology for NASA applications

    NASA Technical Reports Server (NTRS)

    Taneja, Vidya S.

    1992-01-01

    Liquid rocket engine technology has been characterized by the development of complex systems containing large numbers of subsystems, components, and parts. The trend toward even larger and more complex systems is continuing. Liquid rocket engineers have focused mainly on performance-driven designs to increase the payload delivery of a launch vehicle for a given mission. In other words, although the failure of a single inexpensive part or component may cause the failure of the system, reliability in general has not been considered one of the system parameters, like cost or performance. Until now, quantification of reliability has not been a consideration during system design and development in the liquid rocket industry. Engineers and managers have long been aware that the reliability of a system increases during development, but no serious attempts have been made to quantify it. As a result, a method to quantify reliability during design and development is needed. This includes the application of probabilistic models that utilize both engineering analysis and test data. Classical methods require the use of operating data for reliability demonstration. In contrast, the method described in this paper is based on similarity, analysis, and testing combined with Bayesian statistical analysis.

  13. Test-retest reliability of cognitive EEG

    NASA Technical Reports Server (NTRS)

    McEvoy, L. K.; Smith, M. E.; Gevins, A.

    2000-01-01

    OBJECTIVE: Task-related EEG is sensitive to changes in cognitive state produced by increased task difficulty and by transient impairment. If task-related EEG has high test-retest reliability, it could be used as part of a clinical test to assess changes in cognitive function. The aim of this study was to determine the reliability of the EEG recorded during the performance of a working memory (WM) task and a psychomotor vigilance task (PVT). METHODS: EEG was recorded while subjects rested quietly and while they performed the tasks. Within-session (test-retest interval of approximately 1 h) and between-session (test-retest interval of approximately 7 days) reliability was calculated for four EEG components: frontal midline theta at Fz, posterior theta at Pz, and slow and fast alpha at Pz. RESULTS: Task-related EEG was highly reliable within and between sessions (r > 0.9 for all components in the WM task, and r > 0.8 for all components in the PVT). Resting EEG also showed high reliability, although the magnitude of the correlation was somewhat smaller than that of the task-related EEG (r > 0.7 for all four components). CONCLUSIONS: These results suggest that under appropriate conditions, task-related EEG has sufficient retest reliability for use in assessing clinical changes in cognitive status.

  14. The Asphaltenes

    NASA Astrophysics Data System (ADS)

    Mullins, Oliver C.

    2011-07-01

    Asphaltenes, the most aromatic of the heaviest components of crude oil, are critical to all aspects of petroleum utilization, including reservoir characterization, production, transportation, refining, upgrading, paving, and coating materials. The asphaltenes, which are solid, have or impart crucial and often deleterious attributes in fluids such as high viscosity, emulsion stability, low distillate yields, and inopportune phase separation. Nevertheless, fundamental uncertainties had precluded a first-principles approach to asphaltenes until now. Recently, asphaltene science has undergone a renaissance; many basic molecular and nanocolloidal properties have been resolved and codified in the modified Yen model (also known as the Yen-Mullins model), thereby enabling predictive asphaltene science. Advances in analytical chemistry, especially mass spectrometry, enable the identification of tens of thousands of distinct chemical species in crude oils and asphaltenes. These and other powerful advances in asphaltene science fall under the banner of petroleomics, which incorporates predictive petroleum science and provides a framework for future developments.

  15. Rapid, Reliable Shape Setting of Superelastic Nitinol for Prototyping Robots

    PubMed Central

    Gilbert, Hunter B.; Webster, Robert J.

    2016-01-01

    Shape setting Nitinol tubes and wires in a typical laboratory setting for use in superelastic robots is challenging. Obtaining samples that remain superelastic and exhibit desired precurvatures currently requires many iterations, which is time consuming and consumes a substantial amount of Nitinol. To provide a more accurate and reliable method of shape setting, in this paper we propose an electrical technique that uses Joule heating to attain the necessary shape setting temperatures. The resulting high power heating prevents unintended aging of the material and yields consistent and accurate results for the rapid creation of prototypes. We present a complete algorithm and system together with an experimental analysis of temperature regulation. We experimentally validate the approach on Nitinol tubes that are shape set into planar curves. We also demonstrate the feasibility of creating general space curves by shape setting a helical tube. The system demonstrates a mean absolute temperature error of 10°C. PMID:27648473

  16. Rapid, Reliable Shape Setting of Superelastic Nitinol for Prototyping Robots.

    PubMed

    Gilbert, Hunter B; Webster, Robert J

    Shape setting Nitinol tubes and wires in a typical laboratory setting for use in superelastic robots is challenging. Obtaining samples that remain superelastic and exhibit desired precurvatures currently requires many iterations, which is time consuming and consumes a substantial amount of Nitinol. To provide a more accurate and reliable method of shape setting, in this paper we propose an electrical technique that uses Joule heating to attain the necessary shape setting temperatures. The resulting high power heating prevents unintended aging of the material and yields consistent and accurate results for the rapid creation of prototypes. We present a complete algorithm and system together with an experimental analysis of temperature regulation. We experimentally validate the approach on Nitinol tubes that are shape set into planar curves. We also demonstrate the feasibility of creating general space curves by shape setting a helical tube. The system demonstrates a mean absolute temperature error of 10°C.

  17. Interrater reliability: the kappa statistic.

    PubMed

    McHugh, Mary L

    2012-01-01

    The kappa statistic is frequently used to test interrater reliability. The importance of rater reliability lies in the fact that it represents the extent to which the data collected in the study are correct representations of the variables measured. Measurement of the extent to which data collectors (raters) assign the same score to the same variable is called interrater reliability. While there have been a variety of methods to measure interrater reliability, traditionally it was measured as percent agreement, calculated as the number of agreement scores divided by the total number of scores. In 1960, Jacob Cohen critiqued the use of percent agreement due to its inability to account for chance agreement. He introduced Cohen's kappa, developed to account for the possibility that raters actually guess on at least some variables due to uncertainty. Like most correlation statistics, the kappa can range from -1 to +1. While the kappa is one of the most commonly used statistics to test interrater reliability, it has limitations. Judgments about what level of kappa should be acceptable for health research are questioned. Cohen's suggested interpretation may be too lenient for health-related studies because it implies that a score as low as 0.41 might be acceptable. Kappa and percent agreement are compared, and levels for both kappa and percent agreement that should be demanded in healthcare studies are suggested.
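    The chance-corrected agreement described above is κ = (p_o − p_e) / (1 − p_e), where p_o is observed agreement and p_e is the agreement expected by chance from the raters' marginal frequencies. A sketch for two raters from a contingency table; the counts are illustrative:

```python
def cohens_kappa(table):
    """table[i][j]: count of items rater A scored category i and rater B scored j."""
    k = len(table)
    n = sum(sum(row) for row in table)
    p_o = sum(table[i][i] for i in range(k)) / n               # observed agreement
    row = [sum(r) / n for r in table]                          # rater A marginals
    col = [sum(table[i][j] for i in range(k)) / n
           for j in range(k)]                                  # rater B marginals
    p_e = sum(a * b for a, b in zip(row, col))                 # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Illustrative 2x2 table: 45 + 15 agreements out of 70 paired ratings.
table = [[45, 5],
         [5, 15]]
print(round(cohens_kappa(table), 2))  # 0.65
```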

  18. Isometric and isokinetic muscle strength in the upper extremity can be reliably measured in persons with chronic stroke.

    PubMed

    Ekstrand, Elisabeth; Lexell, Jan; Brogårdh, Christina

    2015-09-01

    To evaluate the test-retest reliability of isometric and isokinetic muscle strength measurements in the upper extremity after stroke. A test-retest design. Forty-five persons with mild to moderate paresis in the upper extremity > 6 months post-stroke. Isometric arm strength (shoulder abduction, elbow flexion), isokinetic arm strength (elbow extension/flexion) and isometric grip strength were measured with electronic dynamometers. Reliability was evaluated with intra-class correlation coefficients (ICC), changes in the mean, standard error of measurements (SEM) and smallest real differences (SRD). Reliability was high (ICCs: 0.92-0.97). The absolute and relative (%) SEM ranged from 2.7 Nm (5.6%) to 3.0 Nm (9.4%) for isometric arm strength, 2.6 Nm (7.4%) to 2.9 Nm (12.6%) for isokinetic arm strength, and 22.3 N (7.6%) to 26.4 N (9.2%) for grip strength. The absolute and relative (%) SRD ranged from 7.5 Nm (15.5%) to 8.4 Nm (26.1%) for isometric arm strength, 7.1 Nm (20.6%) to 8.0 Nm (34.8%) for isokinetic arm strength, and 61.8 N (21.0%) to 73.3 N (25.6%) for grip strength. Muscle strength in the upper extremity can be reliably measured in persons with chronic stroke. Isometric measurements yield smaller measurement errors than isokinetic measurements and might be preferred, but the choice depends on the research question.
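    The SEM and SRD figures above follow the standard relations SEM = SD·sqrt(1 − ICC) and SRD = 1.96·sqrt(2)·SEM; a sketch with hypothetical numbers (not the study's raw data):

```python
import math

def sem(sd, icc):
    """Standard error of measurement from between-subject SD and test-retest ICC."""
    return sd * math.sqrt(1 - icc)

def srd(sem_value):
    """Smallest real difference at the 95% confidence level."""
    return 1.96 * math.sqrt(2) * sem_value

# Hypothetical: between-subject SD of 15 Nm and a test-retest ICC of 0.96.
s = sem(15.0, 0.96)
print(round(s, 1), round(srd(s), 1))  # 3.0 8.3
```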

  19. Thermal Management and Packaging Reliability (Text Version) |

    Science.gov Websites

    Transportation Research | NREL. Text version of "Boosting Thermal Management & Reliability of Vehicle Power Electronics," describing how NREL's thermal management and packaging reliability research improves vehicle power electronics.

  20. 76 FR 71011 - Reliability Technical Conference Agenda

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-16

    ... Reliability Technical Conference. Docket No. AD12-1-000 North American Electric Docket No. RC11-6-000... Chief Executive Officer, North American Electric Reliability Corporation (NERC) Kevin Burke, Chairman... and Reliability, American Public Power Association (APPA); NERC Standards Committee Chairman Deborah...

  1. Comprehensive Design Reliability Activities for Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Whitley, M. R.; Knight, K. C.

    2000-01-01

    This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on the mechanical design reliability of propulsion systems. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and the sources listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.

  2. NASCOM network: Ground communications reliability report

    NASA Technical Reports Server (NTRS)

    1973-01-01

    A reliability performance analysis of the NASCOM Network circuits is reported. A network performance narrative summary is presented, including significant changes in circuit configurations, current figures, and trends in each trouble category, with notable circuit totals specified. Lost-time and interruption tables are submitted, listing the circuits affected by outages and showing their totals by category. A special analysis of circuits with low reliabilities is developed, with tables depicting the performance and graphs of the individual reliabilities.

  3. Reliability Correction for Functional Connectivity: Theory and Implementation

    PubMed Central

    Mueller, Sophia; Wang, Danhong; Fox, Michael D.; Pan, Ruiqi; Lu, Jie; Li, Kuncheng; Sun, Wei; Buckner, Randy L.; Liu, Hesheng

    2016-01-01

    Network properties can be estimated using functional connectivity MRI (fcMRI). However, regional variation of the fMRI signal causes systematic biases in network estimates including correlation attenuation in regions of low measurement reliability. Here we computed the spatial distribution of fcMRI reliability using longitudinal fcMRI datasets and demonstrated how pre-estimated reliability maps can correct for correlation attenuation. As a test case of reliability-based attenuation correction we estimated properties of the default network, where reliability was significantly lower than average in the medial temporal lobe and higher in the posterior medial cortex, heterogeneity that impacts estimation of the network. Accounting for this bias using attenuation correction revealed that the medial temporal lobe’s contribution to the default network is typically underestimated. To render this approach useful to a greater number of datasets, we demonstrate that test-retest reliability maps derived from repeated runs within a single scanning session can be used as a surrogate for multi-session reliability mapping. Using data segments with different scan lengths between 1 and 30 min, we found that test-retest reliability of connectivity estimates increases with scan length while the spatial distribution of reliability is relatively stable even at short scan lengths. Finally, analyses of tertiary data revealed that reliability distribution is influenced by age, neuropsychiatric status and scanner type, suggesting that reliability correction may be especially important when studying between-group differences. Collectively, these results illustrate that reliability-based attenuation correction is an easily implemented strategy that mitigates certain features of fMRI signal nonuniformity. PMID:26493163
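    The attenuation correction applied here descends from Spearman's classical formula: an observed correlation between two noisy measures is shrunk by the square root of the product of their reliabilities, so dividing by that factor disattenuates it. A generic sketch, not the authors' fcMRI implementation:

```python
import math

def disattenuate(r_observed, rel_x, rel_y):
    """Spearman correction for attenuation: r_true ~= r_obs / sqrt(rel_x * rel_y)."""
    return r_observed / math.sqrt(rel_x * rel_y)

# Hypothetical: a correlation of 0.40 measured in a region with reliability 0.50,
# against a seed region with reliability 0.80, disattenuates to:
print(round(disattenuate(0.40, 0.50, 0.80), 2))  # 0.63
```

    This is why, as the abstract notes, connectivity from low-reliability regions such as the medial temporal lobe is systematically underestimated before correction.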

  4. Reliability and Validity of Survey Instruments to Measure Work-Related Fatigue in the Emergency Medical Services Setting: A Systematic Review.

    PubMed

    Patterson, P Daniel; Weaver, Matthew D; Fabio, Anthony; Teasley, Ellen M; Renn, Megan L; Curtis, Brett R; Matthews, Margaret E; Kroemer, Andrew J; Xun, Xiaoshuang; Bizhanova, Zhadyra; Weiss, Patricia M; Sequeira, Denisse J; Coppler, Patrick J; Lang, Eddy S; Higgins, J Stephen

    2018-02-15

    This study sought to systematically search the literature to identify reliable and valid survey instruments for fatigue measurement in the Emergency Medical Services (EMS) occupational setting. A systematic review design was used, searching six databases and one website. The research question guiding the search was developed a priori and registered with the PROSPERO database of systematic reviews: "Are there reliable and valid instruments for measuring fatigue among EMS personnel?" (2016:CRD42016040097). The primary outcome of interest was criterion-related validity. Important outcomes of interest included reliability (e.g., internal consistency) and indicators of sensitivity and specificity. Members of the research team independently screened records from the databases. Full-text articles were evaluated by adapting the Bolster and Rourke system for categorizing findings of systematic reviews, and the data abstracted from the body of literature were rated as favorable, unfavorable, mixed/inconclusive, or no impact. The Grading of Recommendations, Assessment, Development and Evaluation (GRADE) methodology was used to evaluate the quality of evidence. The search strategy yielded 1,257 unique records. Thirty-four unique experimental and non-experimental studies were determined relevant following full-text review. Nineteen studies reported on the reliability and/or validity of ten different fatigue survey instruments. Eighteen different studies evaluated the reliability and/or validity of four different sleepiness survey instruments. None of the retained studies reported sensitivity or specificity. Evidence quality was rated as very low across all outcomes. In this systematic review, limited evidence of the reliability and validity of 14 different survey instruments to assess the fatigue and/or sleepiness status of EMS personnel and related shift-worker groups was identified.

  5. Universal first-order reliability concept applied to semistatic structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1994-01-01

    A reliability design concept was developed for semistatic structures which combines the prevailing deterministic method with the first-order reliability method. The proposed method surmounts deterministic deficiencies in providing uniformly reliable structures and improved safety audits. It supports risk analyses and a reliability selection criterion. The method provides a reliability design factor, derived from the reliability criterion, which is analogous to the current safety factor for sizing structures and verifying reliability response. The universal first-order reliability method should also be applicable to the semistatic structures of air and surface vehicles.
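
    The first-order reliability index at the core of such methods can be sketched as follows: for independent, normally distributed strength R and stress S, the index is beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2), and the failure probability is the standard normal tail Phi(-beta). The numerical values below are hypothetical, not from the report:

    ```python
    from math import sqrt
    from statistics import NormalDist

    def form_beta(mu_R, sigma_R, mu_S, sigma_S):
        """First-order reliability index for independent normal strength R and stress S."""
        return (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)

    # Hypothetical margins: strength 60 +/- 5, applied stress 40 +/- 4 (same units)
    beta = form_beta(mu_R=60.0, sigma_R=5.0, mu_S=40.0, sigma_S=4.0)
    p_fail = NormalDist().cdf(-beta)   # probability that stress exceeds strength
    print(round(beta, 2), p_fail)
    ```

    A deterministic safety factor (here mu_R/mu_S = 1.5) says nothing about scatter; beta folds the variances in, which is the uniformity the abstract claims over the deterministic approach.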

  6. Universal first-order reliability concept applied to semistatic structures

    NASA Astrophysics Data System (ADS)

    Verderaime, V.

    1994-07-01

    A reliability design concept was developed for semistatic structures which combines the prevailing deterministic method with the first-order reliability method. The proposed method surmounts deterministic deficiencies in providing uniformly reliable structures and improved safety audits. It supports risk analyses and a reliability selection criterion. The method provides a reliability design factor, derived from the reliability criterion, which is analogous to the current safety factor for sizing structures and verifying reliability response. The universal first-order reliability method should also be applicable to the semistatic structures of air and surface vehicles.

  7. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Electric Reliability... CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.3 Electric Reliability Organization certification. (a) Any...

  8. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Electric Reliability... CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.3 Electric Reliability Organization certification. (a) Any...

  9. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Electric Reliability... CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.3 Electric Reliability Organization certification. (a) Any...

  10. 18 CFR 39.3 - Electric Reliability Organization certification.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Electric Reliability... CERTIFICATION OF THE ELECTRIC RELIABILITY ORGANIZATION; AND PROCEDURES FOR THE ESTABLISHMENT, APPROVAL, AND ENFORCEMENT OF ELECTRIC RELIABILITY STANDARDS § 39.3 Electric Reliability Organization certification. (a) Any...

  11. Nanoparticle chemisorption printing technique for conductive silver patterning with submicron resolution

    PubMed Central

    Yamada, Toshikazu; Fukuhara, Katsuo; Matsuoka, Ken; Minemawari, Hiromi; Tsutsumi, Jun'ya; Fukuda, Nobuko; Aoshima, Keisuke; Arai, Shunto; Makita, Yuichi; Kubo, Hitoshi; Enomoto, Takao; Togashi, Takanari; Kurihara, Masato; Hasegawa, Tatsuo

    2016-01-01

    Silver nanocolloid, a dense suspension of ligand-encapsulated silver nanoparticles, is an important material for printing-based device production technologies. However, printed conductive patterns of sufficiently high quality and resolution for industrial products have not yet been achieved, as the use of conventional printing techniques is severely limiting. Here we report a printing technique to manufacture ultrafine conductive patterns utilizing the exclusive chemisorption phenomenon of weakly encapsulated silver nanoparticles on a photoactivated surface. The process includes masked irradiation of vacuum ultraviolet light on an amorphous perfluorinated polymer layer to photoactivate the surface with pendant carboxylate groups, and subsequent coating of alkylamine-encapsulated silver nanocolloids, which causes amine–carboxylate conversion to trigger the spontaneous formation of a self-fused solid silver layer. The technique can produce silver patterns of submicron fineness adhered strongly to substrates, thus enabling manufacture of flexible transparent conductive sheets. This printing technique could replace conventional vacuum- and photolithography-based device processing. PMID:27091238

  12. A Remote Sensing-Derived Corn Yield Assessment Model

    NASA Astrophysics Data System (ADS)

    Shrestha, Ranjay Man

    Agricultural studies and food security have become critical research topics due to continuous growth in human population and simultaneous shrinkage in agricultural land. In spite of modern technological advancements to improve agricultural productivity, more studies on crop yield assessments and food productivity are still necessary to fulfill the constantly increasing food demands. Besides human activities, natural disasters such as flood and drought, along with rapid climate changes, also inflict adverse effects on food productivity. Understanding the impact of these disasters on crop yield and making early impact estimations could help planning for any national or international food crisis. Similarly, the United States Department of Agriculture (USDA) Risk Management Agency (RMA) insurance management utilizes appropriately estimated crop yield and damage assessment information to sustain farmers' practice through timely and proper compensations. Through the County Agricultural Production Survey (CAPS), the USDA National Agricultural Statistical Service (NASS) uses traditional methods of field interviews and farmer-reported survey data to perform annual crop condition monitoring and production estimations at the regional and state levels. As these manual approaches to yield estimation are highly inefficient and produce very limited samples to represent the entire area, NASS requires supplemental spatial data that provide continuous and timely information on crop production and annual yield. Compared to traditional methods, remote sensing data and products offer wider spatial extent, more accurate location information, higher temporal resolution and data distribution, and lower data cost--thus providing a complementary option for estimation of crop yield information. 
Remote sensing derived vegetation indices such as Normalized Difference Vegetation Index (NDVI) provide measurable statistics of potential crop growth based on the spectral reflectance and could
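
    NDVI is computed per pixel from the red and near-infrared reflectance bands as (NIR - red) / (NIR + red). A minimal sketch, using hypothetical reflectance values:

    ```python
    def ndvi(nir, red):
        """Normalized Difference Vegetation Index from NIR and red reflectance (0-1)."""
        return (nir - red) / (nir + red)

    # Hypothetical surface reflectances for two crop conditions
    print(round(ndvi(0.50, 0.08), 2))  # dense, vigorous canopy → 0.72
    print(round(ndvi(0.30, 0.20), 2))  # sparse or stressed cover → 0.2
    ```

    Healthy vegetation absorbs red light and strongly reflects NIR, so higher NDVI tracks greater photosynthetic activity, which is why seasonal NDVI profiles serve as a proxy for potential crop growth in yield models like the one described here.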

  13. Interrater Reliability in Large-Scale Assessments--Can Teachers Score National Tests Reliably without External Controls?

    ERIC Educational Resources Information Center

    Pantzare, Anna Lind

    2015-01-01

    In most large-scale assessment systems a set of rather expensive external quality controls are implemented in order to guarantee the quality of interrater reliability. This study empirically examines if teachers' ratings of national tests in mathematics can be reliable without using monitoring, training, or other methods of external quality…

  14. Examining the roles that changing harvested areas, closing yield-gaps, and increasing yield ceilings have had on crop production

    NASA Astrophysics Data System (ADS)

    Johnston, M.; Ray, D. K.; Mueller, N. D.; Foley, J. A.

    2011-12-01

    With an increasing and increasingly affluent population, there has been tremendous effort to examine strategies for sustainably increasing agricultural production to meet this surging global demand. Before developing new solutions from scratch, though, we believe it is important to consult our recent agricultural history to see where and how agricultural production changes have already taken place. By utilizing the newly created temporal M3 cropland datasets, we can for the first time examine gridded agricultural yields and area, both spatially and temporally. This research explores the historical drivers of agricultural production changes, from 1965-2005. The results will be presented spatially at the global-level (5-min resolution), as well as at the individual country-level. The primary research components of this study are presented below, including the general methodology utilized in each phase and preliminary results for soybean where available. The complete assessment will cover maize, wheat, rice, soybean, and sugarcane, and will include country-specific analysis for over 200 countries, states, territories and protectorates. Phase 1: The first component of our research isolates changes in agricultural production due to variation in planting decisions (harvested area) from changes in production due to intensification efforts (yield). We examine area/yield changes at the pixel-level over 5-year time-steps to determine how much each component has contributed to overall changes in production. Our results include both spatial patterns of changes in production, as well as spatial maps illustrating to what degree the production change is attributed to area and/or yield. Together, these maps illustrate where, why, and by how much agricultural production has changed over time. Phase 2: In the second phase of our research we attempt to determine the impact that area and yield changes have had on agricultural production at the country-level. 
We calculate a production
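
    The Phase 1 attribution of a production change, where production P = area × yield, can be sketched with a first-difference decomposition. This is a standard identity, not necessarily the authors' exact method, and the soybean numbers below are hypothetical:

    ```python
    def production_change(area0, yield0, area1, yield1):
        """Split a change in production P = area * yield into area, yield,
        and interaction contributions (first-difference decomposition)."""
        dA, dY = area1 - area0, yield1 - yield0
        return {
            "total":       area1 * yield1 - area0 * yield0,
            "area":        dA * yield0,   # contribution of planting decisions
            "yield":       dY * area0,    # contribution of intensification
            "interaction": dA * dY,       # joint effect of both changing
        }

    # Hypothetical soybean grid cell over one 5-year time-step (ha, t/ha)
    parts = production_change(area0=100.0, yield0=2.0, area1=120.0, yield1=2.5)
    print(parts)  # area, yield, and interaction terms sum to the total
    ```

    Applying this at the pixel level over successive 5-year steps gives exactly the kind of maps the abstract describes: where production changed, and how much of the change is attributable to area versus yield.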

  15. Designing magnetic systems for reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heitzenroeder, P.J.

    1991-01-01

    Designing a magnetic system is an iterative process in which the requirements are set, a design is developed, materials and manufacturing processes are defined, interrelationships with the various elements of the system are established, engineering analyses are performed, and fault modes and effects are studied. Reliability requires that all elements of the design process, from the seemingly most straightforward, such as utilities connection design and implementation, to the most sophisticated, such as advanced finite element analyses, receive a balanced and appropriate level of attention. D.B. Montgomery's study of magnet failures has shown that the predominance of magnet failures tend not to be in the most intensively engineered areas, but are associated with insulation, leads, and unanticipated conditions. TFTR, JET, JT-60, and PBX are all major tokamaks which have suffered loss of reliability due to water leaks. Similarly, the majority of causes of loss of magnet reliability at PPPL have not been in the sophisticated areas of the design but are due to difficulties associated with coolant connections, bus connections, and external structural connections. Looking toward the future, the major next devices such as BPX and ITER are more costly and complex than any of their predecessors and are pressing the bounds of operating levels, materials, and fabrication. Emphasis on reliability is a must as the fusion program enters a phase where there are fewer, but very costly, devices with the goal of reaching a reactor prototype stage in the next two or three decades. This paper reviews some of the magnet reliability issues which PPPL has faced over the years, the lessons learned from them, and the magnet design and fabrication practices which have been found to contribute to magnet reliability.

  16. Airborne and ground-based remote sensing for the estimation of evapotranspiration and yield of bean, potato, and sugar beet crops

    NASA Astrophysics Data System (ADS)

    Jayanthi, Harikishan

    compared with the actual yields extracted from the ground. The remote sensing-derived yields compared well with the actual yields sampled on the ground. This research has highlighted the importance of the date of spectral emergence, the need to know the duration for which the crops stand on the ground, and the need to identify critical periods of time when multispectral coverages are essential for reliable tuber yield estimation.

  17. Power Electronics Packaging Reliability | Transportation Research | NREL

    Science.gov Websites

    High-temperature bonded interface materials are a key enabling technology for compact, lightweight, low-cost, and reliable power electronics packaging.

  18. Reliability and validity: Part II.

    PubMed

    Davis, Debora Winders

    2004-01-01

    Determining measurement reliability and validity involves complex processes. There is usually room for argument about most instruments. It is important that the researcher clearly describes the processes upon which she made the decision to use a particular instrument, and presents the evidence available showing that the instrument is reliable and valid for the current purposes. In some cases, the researcher may need to conduct pilot studies to obtain evidence upon which to decide whether the instrument is valid for a new population or a different setting. In all cases, the researcher must present a clear and complete explanation for the choices she has made regarding reliability and validity. The consumer must then judge the degree to which the researcher has provided adequate and theoretically sound rationale. Although I have tried to touch on most of the important concepts related to measurement reliability and validity, it is beyond the scope of this column to be exhaustive. There are textbooks devoted entirely to specific measurement issues if readers require more in-depth knowledge.

  19. Photovoltaic performance and reliability workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kroposki, B

    1996-10-01

    This proceedings is the compilation of papers presented at the ninth PV Performance and Reliability Workshop held at the Sheraton Denver West Hotel on September 4--6, 1996. This year's workshop included presentations from 25 speakers and had over 100 attendees. All of the presentations that were given are included in this proceedings. Topics of the papers included: defining service lifetime and developing models for PV module lifetime; examining and determining failure and degradation mechanisms in PV modules; combining IEEE/IEC/UL testing procedures; AC module performance and reliability testing; inverter reliability/qualification testing; standardization of utility interconnect requirements for PV systems; activities needed to separate variables by testing individual components of PV systems (e.g., cells, modules, batteries, inverters, charge controllers) for individual reliability and then testing them in actual system configurations; more results reported from field experience on modules, inverters, batteries, and charge controllers from field-deployed PV systems; and system certification and standardized testing for stand-alone and grid-tied systems.

  20. Reliability and validity of three pain provocation tests used for the diagnosis of chronic proximal hamstring tendinopathy.

    PubMed

    Cacchio, Angelo; Borra, Fabrizio; Severini, Gabriele; Foglia, Andrea; Musarra, Frank; Taddio, Nicola; De Paulis, Fosco

    2012-09-01

    The clinical assessment of chronic proximal hamstring tendinopathy (PHT) in athletes is a challenge to sports medicine. To be able to compare the results of research and treatments, the methods used to diagnose and evaluate PHT must be clearly defined and reproducible. To assess the reliability and validity of three pain provocation tests used for the diagnosis of PHT. Ninety-two athletes with (N=46) and without (N=46) PHT were examined by one physician and two physiotherapists, who were trained in the examination techniques before the study. The examiners were blinded to the symptoms and identity of the athletes. The three pain provocation tests examined were the Puranen-Orava, bent-knee stretch and modified bent-knee stretch tests. Intraclass correlation coefficients (ICCs) based on the repeated measures analysis of variance were used to analyse the intraexaminer and interexaminer reliability, while sensitivity, specificity, predictive values and likelihood ratios were used to determine the validity of the three tests. The ICC values in all three tests revealed a high correlation (range 0.82 to 0.88) for the interexaminer reliability and a high-to-very high correlation (range 0.87 to 0.93) for the intraexaminer reliability. All three tests displayed a moderate-to-high validity, with the highest degree of validity being yielded by the modified bent-knee stretch test. All three pain provocation tests proved to be of potential value in assessing chronic PHT in athletes. However, we recommend that they be used in conjunction with other objective measures, such as MRI.
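
    The validity statistics reported (sensitivity, specificity, likelihood ratios) follow directly from a 2x2 table of test results against the reference diagnosis. A minimal sketch, using hypothetical counts rather than the study's data:

    ```python
    def diagnostic_stats(tp, fp, fn, tn):
        """Sensitivity, specificity, and likelihood ratios from a 2x2 table
        of test outcome (rows) versus reference diagnosis (columns)."""
        sens = tp / (tp + fn)   # proportion of true cases the test detects
        spec = tn / (tn + fp)   # proportion of non-cases the test clears
        return {
            "sensitivity": sens,
            "specificity": spec,
            "LR+": sens / (1 - spec),   # positive likelihood ratio
            "LR-": (1 - sens) / spec,   # negative likelihood ratio
        }

    # Hypothetical results for one provocation test vs. the clinical reference
    stats = diagnostic_stats(tp=40, fp=5, fn=6, tn=41)
    print({k: round(v, 2) for k, v in stats.items()})
    ```

    An LR+ well above 1 and an LR- well below 1 together indicate a test that meaningfully shifts the diagnostic odds, which is the sense in which the modified bent-knee stretch test showed the highest validity.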